Roadmap
This page documents planned work for RecurrentLayers.jl. Future work for this library includes additional cells such as:
- FastRNNs and FastGRUs (current focus)
- Unitary recurrent neural networks
- Modern recurrent neural networks such as LRU and minLSTM/minGRU
- Quasi-recurrent neural networks
Additionally, some cell-independent architectures are planned. These expand the capabilities of recurrent architectures and could, in principle, wrap any cell.
An implementation of these would ideally look like, for example, `FastSlow(RNNCell, input_size => hidden_size)`. More details on this soon!