# Excellent Tutorial on Sequence Learning using Recurrent Neural Networks

An excellent tutorial explaining Recurrent Neural Networks (RNNs), which hold great promise for learning general sequences and have applications in text analysis, handwriting recognition, and even machine translation.

By Gregory Piatetsky.

While the feats of Deep Learning have been gathering much attention, there have also been breakthroughs in a related technology, Recurrent Neural Networks (RNNs). RNNs hold great promise for learning general sequences, and have applications in text analysis, handwriting recognition, and even machine translation.

*An RNN learning to paint house numbers (image: Andrej Karpathy)*

See the fantastic post by Andrej Karpathy, "The Unreasonable Effectiveness of Recurrent Neural Networks," in which he uses RNNs to do amazing things like paint the house numbers in the image above, or generate text in the style of Paul Graham, Shakespeare, and even LaTeX.

See below an excellent tutorial, **"General Sequence Learning using Recurrent Neural Networks"**, by Alec Radford, Head of Research at indico, who led a workshop on general sequence learning using recurrent neural networks at Next.ML in San Francisco, Feb 2015.

Alec introduces RNNs, sketches how to implement them, and covers the tricks necessary to make them work well. He then investigates using RNNs as general text classification and regression models, examining where they succeed and where they fail compared to more traditional text analysis models.
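To make the core idea concrete, here is a minimal NumPy sketch of the vanilla RNN recurrence that such implementations build on: at each time step the hidden state mixes the current input with the previous hidden state. This is a generic illustration, not the workshop's actual implementation, and all the variable names are placeholders.

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Run a vanilla RNN over a sequence of input vectors.

    Each step applies: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b_h),
    so the hidden state carries information forward through the sequence.
    """
    hidden_size = W_hh.shape[0]
    h = np.zeros(hidden_size)       # initial hidden state
    states = []
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return states                   # one hidden state per time step

# Toy example: a sequence of 4 inputs of dimension 3, hidden size 5.
rng = np.random.default_rng(0)
xs = [rng.standard_normal(3) for _ in range(4)]
W_xh = rng.standard_normal((5, 3)) * 0.1   # input-to-hidden weights
W_hh = rng.standard_normal((5, 5)) * 0.1   # hidden-to-hidden weights
b_h = np.zeros(5)
states = rnn_forward(xs, W_xh, W_hh, b_h)
```

In practice the final (or pooled) hidden state is fed to a classifier or regressor, and the weights are trained by backpropagation through time rather than set randomly as here.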

Finally, he presents a simple Python and Theano library for training RNNs with a scikit-learn style interface, and shows how to use it through several hands-on tutorials on real-world text datasets.
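A scikit-learn style interface means the model is driven through `fit(X, y)` and `predict(X)` calls. The sketch below is a hypothetical stand-in for that convention, not the API of the library shown in the tutorial: the class name is invented, and internally it just averages per-sequence features and fits a logistic regression rather than a real RNN.

```python
import numpy as np

class TinySequenceClassifier:
    """Hypothetical scikit-learn-style sequence classifier.

    Illustrates the fit/predict convention only: it reduces each
    sequence of vectors to its mean and trains a logistic regression
    on that summary by plain gradient descent.
    """

    def __init__(self, lr=0.1, epochs=200):
        self.lr = lr
        self.epochs = epochs

    def _featurize(self, X):
        # Collapse each variable-length sequence to a fixed-size vector.
        return np.array([np.mean(seq, axis=0) for seq in X])

    def fit(self, X, y):
        feats = self._featurize(X)
        y = np.asarray(y, dtype=float)
        self.w = np.zeros(feats.shape[1])
        self.b = 0.0
        for _ in range(self.epochs):
            p = 1.0 / (1.0 + np.exp(-(feats @ self.w + self.b)))
            grad = p - y                       # logistic-loss gradient
            self.w -= self.lr * feats.T @ grad / len(y)
            self.b -= self.lr * grad.mean()
        return self

    def predict(self, X):
        feats = self._featurize(X)
        return (feats @ self.w + self.b > 0).astype(int)

# Toy data: two sequences of 2-D vectors with opposite-sign means.
X = [
    [np.array([1.0, 2.0]), np.array([2.0, 1.0])],
    [np.array([-1.0, -2.0]), np.array([-2.0, -1.0])],
]
y = [1, 0]
clf = TinySequenceClassifier().fit(X, y)
preds = clf.predict(X)
```

The point of the convention is that swapping in a real RNN-backed model changes only the internals; the calling code, and anything built around scikit-learn's estimator protocol, stays the same.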

**Related:**

- Top /r/MachineLearning Posts, May: Unreasonable Effectiveness of Recurrent Neural Networks, Time-Lapse Mining
- Deep Learning RNNaissance, an insightful, comprehensive, and entertaining overview
- Top KDnuggets tweets, Jan 14-15: 10 FB likes predicts personality better than a co-worker; A Deep Dive into Recurrent Neural Nets