Things I find interesting.
The initial plan was to have two layers of abstraction (inspired by TensorFlow): one layer for the recurrent cell, and one for the time unrolling using this cell. See this presentation for more details. However, after writing the deep learning module, we concluded that writing the entire RNNLayer as a single class would be better suited. The cell technique was still kept in the Forward() and Backward() functions to maintain modularity.
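To illustrate the design, here is a simplified, self-contained sketch of that pattern: the layer owns the time-unrolling loop in Forward(), while the per-step logic is factored into a private cell function (mirroring the CellForward()/CellBackward() split in the TMVA sources). The class and types here are illustrative placeholders, not the actual TMVA implementation.

```cpp
// Simplified sketch of the "cell inside the layer" pattern described above.
// RNNLayer, Matrix_t, and CellForward() are illustrative placeholders, not the
// actual TMVA types; they only show how the cell logic stays factored out.
#include <cstddef>
#include <vector>

using Matrix_t = std::vector<std::vector<double>>; // placeholder matrix type

class RNNLayer {
public:
   // Forward() owns the time unrolling: it walks the input sequence and
   // delegates each step to the cell function, so the cell abstraction
   // survives without a separate cell class.
   void Forward(const std::vector<Matrix_t> &input)
   {
      for (size_t t = 0; t < input.size(); ++t)
         CellForward(input[t]);
   }

private:
   Matrix_t fState; // hidden state carried across time steps

   // One recurrent step: conceptually state = f(W * x_t + U * state + b).
   void CellForward(const Matrix_t & /*xt*/)
   {
      // update fState from the input and the previous state (details omitted)
   }
};
```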
At https://github.com/tmvadnn/tmva-dnn-tutorial and in the TestFullRNN test you can find some examples using the recurrent networks. The repository is still under construction, and we will be adding better examples.
Full code can be found here. Here is an example of an RNN->Reshape->DenseLayer network learning the identity function: an input of dimensionality two is stored as a state of size 3 in the RNN, then reconstructed by the DenseLayer.
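Below is a minimal sketch of how such a network might be assembled with the TMVA::DNN::TDeepNet API. The layer-adding methods (AddBasicRNNLayer, AddReshapeLayer, AddDenseLayer) follow the TMVA DNN headers, but the TDeepNet constructor argument order, the chosen sizes, and the reference architecture are assumptions that may differ between ROOT versions; treat this as illustrative, not the exact tutorial code.

```cpp
// Illustrative sketch (not the exact tutorial code): an RNN->Reshape->DenseLayer
// network for the identity task. Method names follow the TMVA DNN headers; the
// TDeepNet constructor argument order is an assumption and may vary by version.
#include "TMVA/DNN/Architectures/Reference.h"
#include "TMVA/DNN/DeepNet.h"
#include "TMVA/DNN/Functions.h"

using namespace TMVA::DNN;
using Architecture_t = TReference<double>;
using Net_t          = TDeepNet<Architecture_t>;

int main()
{
   size_t batchSize = 2, timeSteps = 5;
   size_t inputSize = 2;                       // input of dimensionality two
   size_t stateSize = 3;                       // stored as a state of size 3
   size_t outputSize = timeSteps * inputSize;  // reconstruct the full input

   // Batch/input geometry plus a mean-squared-error loss for the
   // identity regression task (argument order assumed from the headers).
   Net_t net(batchSize, 1, timeSteps, inputSize, 0, 0, 0,
             ELossFunction::kMeanSquaredError, EInitialization::kGauss);

   net.AddBasicRNNLayer(stateSize, inputSize, timeSteps);  // unrolled RNN
   net.AddReshapeLayer(1, 1, timeSteps * stateSize, true); // flatten time x state
   net.AddDenseLayer(outputSize,
                     EActivationFunction::kIdentity);      // reconstruct input

   net.Initialize(); // initialize weights; the training loop is omitted here
   return 0;
}
```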