Thursday, February 17, 2011

[NN] A guide to recurrent neural networks and backpropagation

This document is a rather simple introduction to recurrent neural networks, giving a basic understanding of RNNs.

For modeling time series, we can use either feedforward networks or RNNs:

1) Adopt temporal windows on the input features for a feedforward NN, i.e. the tapped delay line memory described in this paper (see the sketch after this list).


2) An RNN is inherently capable of modeling the temporal information in the input sequence due to its recurrent connections.

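As a concrete illustration of option 1, here is a minimal sketch of building tapped-delay-line inputs in Python; the window length and the toy feature dimensions are my own assumptions, not values from the paper.

```python
import numpy as np

def tapped_delay_line(x, n_taps=3):
    """Stack each frame with its n_taps - 1 predecessors.

    x: (T, d) array of T frames with d features each.
    Returns: (T - n_taps + 1, n_taps * d) windowed inputs.
    """
    T, d = x.shape
    return np.stack([x[t - n_taps + 1:t + 1].ravel()
                     for t in range(n_taps - 1, T)])

x = np.random.randn(100, 13)           # e.g. 100 frames of 13-dim features (assumed)
X_win = tapped_delay_line(x, n_taps=3)
print(X_win.shape)                     # (98, 39): each row spans 3 consecutive frames
```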

One major concern is that the simple RNN only makes use of previous input information, which may not be sufficient, especially for speech signals.

To incorporate more temporal information, a more complex system is required; unfolding the recurrence through time makes this structure explicit, as sketched below:
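
Here is a minimal sketch of the unfolded computation for a simple RNN: the same weight matrices are reused at every time step, and the hidden state carries the history forward. All dimensions and the initialization scale are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_hidden, T = 13, 32, 100                           # assumed toy dimensions
W_x = 0.1 * rng.standard_normal((n_hidden, d))         # input-to-hidden weights
W_h = 0.1 * rng.standard_normal((n_hidden, n_hidden))  # hidden-to-hidden weights
b = np.zeros(n_hidden)

x = rng.standard_normal((T, d))                # a toy input sequence
h = np.zeros(n_hidden)                         # initial hidden state
states = []
for t in range(T):                             # one "copy" of the net per step
    h = np.tanh(W_x @ x[t] + W_h @ h + b)      # h_t = tanh(W_x x_t + W_h h_{t-1} + b)
    states.append(h)                           # note: h_t depends only on x[0..t]
```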


Will this more complex system beat a feedforward NN of similar model complexity?

Another issue is that the RNN only utilizes historical information in the time series, while with windowing in a feedforward NN we can actually employ both past and future information to predict the current frame.
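
For contrast with the causal window above, a centered window lets the feedforward NN see both past and future frames around the current one; the context size here is again an illustrative assumption.

```python
import numpy as np

def centered_window(x, context=2):
    """Stack each frame with `context` frames on each side."""
    T, d = x.shape
    return np.stack([x[t - context:t + context + 1].ravel()
                     for t in range(context, T - context)])

x = np.random.randn(100, 13)
X_ctx = centered_window(x)     # (96, 65): each row spans 5 consecutive frames
```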

However, a feedforward NN processes the current input and the context information with the same set of model parameters, which may not be the best approach.

Maybe the first thing to try is simply to experiment with an RNN and see whether it gives any promising results.

 

10.1.1.3.9311.pdf (137 KB)

