Poster
Gated Recurrent Neural Networks with Weighted Time-Delay Feedback
N. Benjamin Erichson · Soon Hoe Lim · Aditya Guntuboyina
Abstract:
In this paper, we present a novel approach to modeling long-term dependencies in sequential data by introducing a gated recurrent unit (GRU) with a weighted time-delay feedback mechanism. Our proposed model, named τ-GRU, is a discretized version of a continuous-time formulation of a recurrent unit, where the dynamics are governed by delay differential equations (DDEs). We prove the existence and uniqueness of solutions for the continuous-time model and show that the proposed feedback mechanism can significantly improve the modeling of long-term dependencies. Our empirical results indicate that τ-GRU outperforms state-of-the-art recurrent units and gated recurrent architectures on a range of tasks, achieving faster convergence and better generalization.
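To make the idea concrete, the sketch below shows one plausible way a GRU-style cell could be augmented with a weighted time-delay feedback term: the current update additionally receives the hidden state from a fixed number of steps in the past, scaled by a learned weight. This is only an illustrative assumption of the general mechanism described in the abstract; the class name DelayFeedbackGRUCell, the scalar weight alpha, the fixed integer delay, and the use of a standard GRUCell for the gating are all hypothetical choices, not the paper's exact τ-GRU equations or discretization of the underlying DDE.

```python
# Hypothetical sketch: a GRU cell with weighted time-delay feedback.
# The gate structure, the scalar weight `alpha`, and how the delayed
# state enters the update are illustrative assumptions, not the
# published tau-GRU formulation.
import torch
import torch.nn as nn


class DelayFeedbackGRUCell(nn.Module):
    """Standard GRU gating plus a learned weight on a delayed hidden state."""

    def __init__(self, input_size: int, hidden_size: int, delay: int = 10):
        super().__init__()
        self.delay = delay
        self.cell = nn.GRUCell(input_size, hidden_size)
        # Learnable scalar controlling how strongly h_{t - delay} feeds back
        # into the current state (an assumption; a gate or matrix-valued
        # weight would also fit the description in the abstract).
        self.alpha = nn.Parameter(torch.tensor(0.1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (seq_len, batch, input_size)
        seq_len, batch, _ = x.shape
        h = x.new_zeros(batch, self.cell.hidden_size)
        history = [h]  # buffer of past hidden states for the delay term
        outputs = []
        for t in range(seq_len):
            h_new = self.cell(x[t], h)
            # Weighted time-delay feedback: blend in the hidden state from
            # `delay` steps ago (fall back to the initial state early on).
            h_delayed = history[-self.delay] if len(history) >= self.delay else history[0]
            h = h_new + self.alpha * h_delayed
            history.append(h)
            outputs.append(h)
        return torch.stack(outputs)


if __name__ == "__main__":
    model = DelayFeedbackGRUCell(input_size=8, hidden_size=32, delay=5)
    seq = torch.randn(20, 4, 8)  # (seq_len, batch, features)
    out = model(seq)
    print(out.shape)  # torch.Size([20, 4, 32])
```

In this reading, the delayed term gives the cell a direct path to information from many steps back, which is the intuition behind using delay differential equations rather than ordinary ones in the continuous-time formulation.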