Recurrent Neural Networks

Traditional feed-forward neural networks quickly lose information about previous inputs and suffer from a problem known as vanishing gradients. To overcome this, Recurrent Neural Networks, as the word "recurrent" implies, contain loops that allow information to persist from one step to the next. They have been applied to speech recognition, language modeling, machine translation, image captioning, and more.
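The "loop" can be made concrete with a minimal sketch of a vanilla RNN step in NumPy. The function name `rnn_step`, the tanh activation, and the tiny random weights are illustrative assumptions, not from the text; the point is only that the same hidden state `h` is fed back in at every step:

```python
import numpy as np

def rnn_step(x, h_prev, W_xh, W_hh, b_h):
    """One vanilla RNN step: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b)."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4

# Small random weights (hypothetical initialization for illustration).
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b_h = np.zeros(hidden_size)

# Process a sequence of 5 inputs, carrying the hidden state forward --
# this recurrence is the "loop" that lets information persist.
h = np.zeros(hidden_size)
for x in rng.standard_normal((5, input_size)):
    h = rnn_step(x, h, W_xh, W_hh, b_h)

print(h.shape)
```

Because tanh squashes activations into (-1, 1) and the weights are multiplied repeatedly across steps, gradients flowing back through many such steps can shrink toward zero, which is the vanishing-gradient issue the paragraph above refers to.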

Much of the impressive work with RNNs has been achieved using a special kind of RNN called the LSTM (Long Short-Term Memory) network.