Recurrent Neural Networks (RNNs) add an interesting twist to basic neural networks. You might have noticed another key difference between Figure 1 and Figure 3. In the former, multiple different weights are applied to the different parts of an input item to generate a hidden-layer neuron, which in turn is transformed using further weights to produce an output. In Figure 3, by contrast, we seem to be applying the same weights over and over again to different items in the input series.
Also, depending on the application, if sensitivity to immediate and nearby neighbors is higher than to inputs that come from further away, a variant that looks only a limited distance into the future/past can be modeled.
A recurrent neural network parses the inputs in a sequential fashion.
RNNs learn in a similar way during training; in addition, they remember things learned from prior input(s) while generating output(s). RNNs can take one or more input vectors and produce one or more output vectors, and the output(s) are influenced not just by weights applied to the inputs, as in a regular NN, but also by a “hidden” state vector representing the context based on prior input(s)/output(s).
So, the same input could produce a different output depending on previous inputs in the series.
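To make this concrete, here is a minimal sketch of a single recurrent step in NumPy. The names `W`, `U`, `b`, and `rnn_step` are illustrative assumptions, not from any library; the point is that the same weights are reused at every step, while the hidden state carries the history forward, so the same input can yield different outputs.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))   # input -> hidden weights, reused at every step
U = rng.normal(size=(4, 4))   # hidden -> hidden (recurrent) weights
b = np.zeros(4)               # bias

def rnn_step(x, h):
    """One recurrent step: new hidden state from current input and prior state."""
    return np.tanh(W @ x + U @ h + b)

x = np.ones(3)                # the same input vector...
h0 = np.zeros(4)
h_after_a = rnn_step(x, rnn_step(np.array([1., 0., 0.]), h0))
h_after_b = rnn_step(x, rnn_step(np.array([0., 1., 0.]), h0))
# ...produces different hidden states (hence outputs) given different histories.
print(np.allclose(h_after_a, h_after_b))
```

Feeding the identical `x` after two different first inputs leaves the network in two different states, which is exactly the "same input, different output" behavior described above.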
Note: Basic feed-forward networks “remember” things too, but they remember only what they learned during training.
For example, an image classifier learns what a “1” looks like during training and then uses that knowledge to classify things in production.
A recursive neural network is similar to the extent that the transitions are repeatedly applied to inputs, but not necessarily in a sequential fashion.
Recursive Neural Networks are a more general form of Recurrent Neural Networks. They parse through input nodes, combining child nodes into parent nodes and combining those with other child/parent nodes to create a tree-like structure.
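A toy sketch of that tree-structured composition, under assumed names (`compose`, `W`, `b` are not from any library): a single shared weight matrix repeatedly merges two child vectors into one parent vector, bottom-up.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
# Shared composition weights, applied at EVERY internal node of the tree.
W = rng.normal(size=(d, 2 * d)) / np.sqrt(2 * d)
b = np.zeros(d)

def compose(left, right):
    """Combine two child vectors into a parent vector with the same weights."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Leaf vectors for a 4-word phrase, combined as the tree ((w1 w2) (w3 w4)):
w1, w2, w3, w4 = (rng.normal(size=d) for _ in range(4))
root = compose(compose(w1, w2), compose(w3, w4))
print(root.shape)
```

Note that the order of application follows the tree, not the left-to-right sequence; a recurrent network is the special case where the tree degenerates into a chain.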
This paper by Pascanu et al. explores this in detail and establishes that, in general, deep RNNs perform better than shallow RNNs.
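One common way to make an RNN "deep" is to stack recurrent layers, with each layer's sequence of hidden states serving as the input sequence to the next. The sketch below assumes hypothetical helpers (`make_layer`, `run_stack`); it is not the specific architecture from the paper, just the stacking idea.

```python
import numpy as np

rng = np.random.default_rng(2)

def make_layer(in_dim, hid_dim):
    """Random weights for one recurrent layer (illustrative initialization)."""
    return (rng.normal(size=(hid_dim, in_dim)) * 0.1,
            rng.normal(size=(hid_dim, hid_dim)) * 0.1,
            np.zeros(hid_dim))

def run_stack(layers, xs):
    """Each layer's hidden-state sequence feeds the layer above it."""
    seq = xs
    for W, U, b in layers:
        h, out = np.zeros(U.shape[0]), []
        for x in seq:
            h = np.tanh(W @ x + U @ h + b)
            out.append(h)
        seq = out
    return seq

layers = [make_layer(3, 5), make_layer(5, 5)]   # a two-layer ("deep") RNN
xs = [rng.normal(size=3) for _ in range(6)]
top = run_stack(layers, xs)
print(len(top), top[0].shape)
```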
Sometimes it’s not just about learning from the past to predict the future, but we also need to look into the future to fix the past.
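This is the intuition behind bidirectional RNNs: one pass reads the sequence past-to-future, a second pass reads it future-to-past, and each position combines both views. A minimal sketch, with assumed names (`scan`, `Wf`/`Uf` for the forward pass, `Wb`/`Ub` for the backward pass):

```python
import numpy as np

rng = np.random.default_rng(3)
d_in, d_h = 3, 4
Wf, Uf = rng.normal(size=(d_h, d_in)) * 0.1, rng.normal(size=(d_h, d_h)) * 0.1
Wb, Ub = rng.normal(size=(d_h, d_in)) * 0.1, rng.normal(size=(d_h, d_h)) * 0.1

def scan(W, U, seq):
    """Run a simple recurrence over a sequence, returning all hidden states."""
    h, out = np.zeros(d_h), []
    for x in seq:
        h = np.tanh(W @ x + U @ h)
        out.append(h)
    return out

xs = [rng.normal(size=d_in) for _ in range(5)]
fwd = scan(Wf, Uf, xs)               # past -> future
bwd = scan(Wb, Ub, xs[::-1])[::-1]   # future -> past, re-aligned to positions
# Each position now "looks into the future" as well as the past.
states = [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]
print(len(states), states[0].shape)
```

Concatenating the two directions doubles the state size per position, which is why bidirectional layers in practice expose a combined hidden dimension.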