Bidirectional Recurrent Neural Networks - rugbyprof/5443-Data-Mining GitHub Wiki

Bidirectional recurrent neural networks are trained to predict in both the positive and negative time directions simultaneously. They do this by connecting two hidden layers that process the sequence in opposite directions to the same output layer, so the output at each time step can draw on information from both past and future states. In handwriting recognition, for example, performance can be improved by knowledge of the letters located before and after the current letter.

https://en.wikipedia.org/wiki/Bidirectional_recurrent_neural_networks
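Below is a minimal NumPy sketch (not from the wiki or the Wikipedia article) of a bidirectional RNN forward pass, showing the structure described above: one hidden layer runs over the sequence forward in time, a second runs backward, and the two hidden states at each time step are concatenated and fed to a shared output layer. All parameter names, shapes, and the tanh activation are illustrative assumptions.

```python
import numpy as np

def birnn_forward(X, Wf, Uf, bf, Wb, Ub, bb, V, c):
    """X: (T, input_dim) input sequence.
    (Wf, Uf, bf): forward-direction RNN parameters.
    (Wb, Ub, bb): backward-direction RNN parameters.
    (V, c): output layer applied to the concatenated hidden states."""
    T, _ = X.shape
    H = Wf.shape[0]                      # hidden size
    h_fwd = np.zeros((T, H))             # hidden states, past -> future
    h_bwd = np.zeros((T, H))             # hidden states, future -> past

    h = np.zeros(H)
    for t in range(T):                   # positive time direction
        h = np.tanh(Wf @ X[t] + Uf @ h + bf)
        h_fwd[t] = h

    h = np.zeros(H)
    for t in reversed(range(T)):         # negative time direction
        h = np.tanh(Wb @ X[t] + Ub @ h + bb)
        h_bwd[t] = h

    # Each output sees both past (h_fwd) and future (h_bwd) context.
    H_cat = np.concatenate([h_fwd, h_bwd], axis=1)   # (T, 2H)
    logits = H_cat @ V + c                           # (T, n_classes)
    return logits

# Tiny usage example with random parameters
rng = np.random.default_rng(0)
T, D, H, C = 5, 3, 4, 2                  # sequence length, input dim, hidden size, classes
X = rng.normal(size=(T, D))
params = dict(
    Wf=rng.normal(size=(H, D)), Uf=rng.normal(size=(H, H)), bf=np.zeros(H),
    Wb=rng.normal(size=(H, D)), Ub=rng.normal(size=(H, H)), bb=np.zeros(H),
    V=rng.normal(size=(2 * H, C)), c=np.zeros(C),
)
print(birnn_forward(X, **params).shape)  # (5, 2): one prediction per time step
```

In practice, deep learning frameworks provide this as a wrapper around an existing recurrent layer (for example, `tf.keras.layers.Bidirectional` in Keras), rather than requiring the two directions to be written by hand.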