
A Technical Note on the Architectural Effects on Maximum Dependency Lengths of Recurrent Neural Networks


Abstract: This work proposes a methodology for determining the maximum dependency length of a recurrent neural network (RNN), and then studies the effects of architectural changes, including the number and neuron count of layers, on the maximum dependency lengths of traditional RNN, gated recurrent unit (GRU), and long short-term memory (LSTM) models.
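The excerpt does not describe the paper's actual measurement procedure, so the sketch below shows only one plausible way to probe a recurrent model's maximum dependency length: train it to recall a token seen `lag` steps earlier and increase `lag` until recall accuracy collapses. All names (`make_batch`, `probe_max_dependency_length`), the toy task, and the hyperparameters are illustrative assumptions, not the authors' method.

```python
# Hypothetical probe (assumption, not the paper's procedure): measure how far
# back an RNN/GRU/LSTM can carry information by training it to recall the
# first token of a sequence after `lag` distractor steps.
import torch
import torch.nn as nn

VOCAB, HIDDEN, LAYERS = 8, 32, 1  # assumed toy sizes

def make_batch(lag, batch=64):
    """Sequences whose first token must be recalled after `lag` distractor steps."""
    target = torch.randint(0, VOCAB, (batch,))
    noise = torch.randint(0, VOCAB, (batch, lag))
    seq = torch.cat([target.unsqueeze(1), noise], dim=1)        # (batch, lag + 1)
    return nn.functional.one_hot(seq, VOCAB).float(), target

class Recaller(nn.Module):
    def __init__(self, cell="RNN"):
        super().__init__()
        rnn_cls = {"RNN": nn.RNN, "GRU": nn.GRU, "LSTM": nn.LSTM}[cell]
        self.rnn = rnn_cls(VOCAB, HIDDEN, num_layers=LAYERS, batch_first=True)
        self.head = nn.Linear(HIDDEN, VOCAB)

    def forward(self, x):
        out, _ = self.rnn(x)
        return self.head(out[:, -1])  # predict the recalled token from the final state

def accuracy_at_lag(cell, lag, steps=500):
    model = Recaller(cell)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(steps):
        x, y = make_batch(lag)
        loss = nn.functional.cross_entropy(model(x), y)
        opt.zero_grad(); loss.backward(); opt.step()
    x, y = make_batch(lag, batch=512)
    return (model(x).argmax(dim=1) == y).float().mean().item()

def probe_max_dependency_length(cell, threshold=0.9, max_lag=64):
    """Largest lag at which recall accuracy stays above `threshold`."""
    longest = 0
    for lag in range(1, max_lag + 1):
        if accuracy_at_lag(cell, lag) >= threshold:
            longest = lag
        else:
            break
    return longest

if __name__ == "__main__":
    for cell in ("RNN", "GRU", "LSTM"):
        print(cell, probe_max_dependency_length(cell))
```

Under this framing, architectural comparisons like those in the paper would correspond to sweeping `LAYERS` and `HIDDEN` for each cell type and recording how the measured length changes; the specific sweep and threshold here are assumptions.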

Authors: Jonathan S. Kent and 1 other author
Submitted: [v1] Fri, 19 Jul 2024 23:00:38 UTC (588 KB)
