Bibliography
- Balduzzi, D. and Ghifary, M. (2016). Strongly-Typed Recurrent Neural Networks. In: Proceedings of The 33rd International Conference on Machine Learning, Vol. 48 of Proceedings of Machine Learning Research, edited by Balcan, M. F. and Weinberger, K. Q. (PMLR, New York, New York, USA); pp. 1292–1300.
- Chang, B.; Chen, M.; Haber, E. and Chi, E. H. (2019). AntisymmetricRNN: A Dynamical System View on Recurrent Neural Networks. In: International Conference on Learning Representations.
- Gers, F. A.; Schraudolph, N. N. and Schmidhuber, J. (2002). Learning precise timing with LSTM recurrent networks. Journal of Machine Learning Research 3, 115–143.
- Jozefowicz, R.; Zaremba, W. and Sutskever, I. (2015). An Empirical Exploration of Recurrent Network Architectures. In: Proceedings of the 32nd International Conference on Machine Learning, Vol. 37 of Proceedings of Machine Learning Research, edited by Bach, F. and Blei, D. (PMLR, Lille, France); pp. 2342–2350.
- Krause, B.; Murray, I.; Renals, S. and Lu, L. (2017). Multiplicative LSTM for sequence modelling, arXiv:1609.07959.
- Kusupati, A.; Singh, M.; Bhatia, K.; Kumar, A.; Jain, P. and Varma, M. (2018). FastGRNN: A Fast, Accurate, Stable and Tiny Kilobyte Sized Gated Recurrent Neural Network. In: Advances in Neural Information Processing Systems, Vol. 31, edited by Bengio, S.; Wallach, H.; Larochelle, H.; Grauman, K.; Cesa-Bianchi, N. and Garnett, R. (Curran Associates, Inc.).
- Landi, F.; Baraldi, L.; Cornia, M. and Cucchiara, R. (2021). Working Memory Connections for LSTM. Neural Networks 144, 334–341.
- Laurent, T. and von Brecht, J. (2017). A recurrent neural network without chaos. In: International Conference on Learning Representations.
- Lee, K.; Levy, O. and Zettlemoyer, L. (2017). Recurrent Additive Networks, arXiv:1705.07393.
- Li, S.; Li, W.; Cook, C.; Zhu, C. and Gao, Y. (2018). Independently Recurrent Neural Network (IndRNN): Building A Longer and Deeper RNN. In: 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (IEEE); pp. 5457–5466.
- Mikolov, T.; Joulin, A.; Chopra, S.; Mathieu, M. and Ranzato, M. (2014). Learning longer memory in recurrent neural networks, arXiv:1412.7753.
- Mujika, A.; Meier, F. and Steger, A. (2017). Fast-Slow Recurrent Neural Networks. In: Advances in Neural Information Processing Systems, Vol. 30, edited by Guyon, I.; Luxburg, U. V.; Bengio, S.; Wallach, H.; Fergus, R.; Vishwanathan, S. and Garnett, R. (Curran Associates, Inc.).
- Ravanelli, M.; Brakel, P.; Omologo, M. and Bengio, Y. (2018). Light Gated Recurrent Units for Speech Recognition. IEEE Transactions on Emerging Topics in Computational Intelligence 2, 92–102.
- Rusch, T. K. and Mishra, S. (2021). Coupled Oscillatory Recurrent Neural Network (coRNN): An accurate and (gradient) stable architecture for learning long time dependencies. In: International Conference on Learning Representations.
- Rusch, T. K.; Mishra, S.; Erichson, N. B. and Mahoney, M. W. (2022). Long Expressive Memory for Sequence Modeling. In: International Conference on Learning Representations.
- Rusch, T. K. and Mishra, S. (2021). UnICORNN: A recurrent model for learning very long time dependencies. In: Proceedings of the 38th International Conference on Machine Learning, Vol. 139 of Proceedings of Machine Learning Research, edited by Meila, M. and Zhang, T. (PMLR); pp. 9168–9178.
- Sutskever, I.; Martens, J. and Hinton, G. E. (2011). Generating Text with Recurrent Neural Networks. In: Proceedings of the 28th International Conference on Machine Learning (ICML); pp. 1017–1024.
- Turkoglu, M. O.; D’Aronco, S.; Wegner, J. and Schindler, K. (2021). Gating Revisited: Deep Multi-layer RNNs That Can Be Trained. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1–1.
- Vecoven, N.; Ernst, D. and Drion, G. (2021). A bio-inspired bistable recurrent cell allows for long-lasting memory. PLOS ONE 16, e0252676.
- van der Westhuizen, J. and Lasenby, J. (2018). The unreasonable effectiveness of the forget gate, arXiv:1804.04849.
- Ye, H.; Zhang, Y.; Liu, H.; Li, X.; Chang, J. and Zheng, H. (2024). Light Recurrent Unit: Towards an Interpretable Recurrent Neural Network for Modeling Long-Range Dependency. Electronics 13, 3204.
- Zhang, B.; Xiong, D.; Su, J.; Lin, Q. and Zhang, H. (2018). Simplifying Neural Machine Translation with Addition-Subtraction Twin-Gated Recurrent Networks. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing (Association for Computational Linguistics); pp. 4273–4283.
- Zhou, G.-B.; Wu, J.; Zhang, C.-L. and Zhou, Z.-H. (2016). Minimal gated unit for recurrent neural networks. International Journal of Automation and Computing 13, 226–234.
- Zilly, J. G.; Srivastava, R. K.; Koutník, J. and Schmidhuber, J. (2017). Recurrent Highway Networks. In: Proceedings of the 34th International Conference on Machine Learning, Vol. 70 of Proceedings of Machine Learning Research, edited by Precup, D. and Teh, Y. W. (PMLR); pp. 4189–4198.
- Zoph, B. and Le, Q. (2017). Neural Architecture Search with Reinforcement Learning. In: International Conference on Learning Representations.