What are “Recurrent Neural Networks”, and how do they differ from “Convolutional Neural Networks”?

Author: Sameer Nigam

CNNs and RNNs are among the most important algorithms in the neural network family, but they differ in how they process data and in the problems they solve.

So talking about their differences:

CNNs are used to solve classification and regression problems, while RNNs are used to process sequential information.

CNNs are used for 2D image data; RNNs are used for sequential data (for example, time-stamped sales data, sequences of text, heartbeat data, etc.).

So what does “sequence” mean?

Example of Sequences:

1) Time Series Data

2) Sentences

3) Audio

4) Car Trajectories

5) Music

A human can easily predict the next element of the sequence [1, 2, 3, 4, 5, 6]; the most obvious answer is 7. A recurrent neural network does the same thing: it predicts the future by learning from the history.

To do this, we tell the neuron about the previous outputs of the network, and the simplest way to do that is to feed the output back in as an input.
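This "feed the output back in" idea can be sketched in a few lines of NumPy. This is a minimal illustration, not a trained network: the weight names, sizes, and random values are all made up for the example.

```python
import numpy as np

# Minimal sketch of one recurrent step (sizes and weights are illustrative):
# the new hidden state depends on the current input AND the previous state,
# which is exactly the feedback loop described above.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
W_x = rng.normal(size=(n_hidden, n_in))      # input-to-hidden weights
W_h = rng.normal(size=(n_hidden, n_hidden))  # hidden-to-hidden (recurrent) weights
b = np.zeros(n_hidden)

def rnn_step(x, h_prev):
    """One time step: combine the current input with the previous hidden state."""
    return np.tanh(W_x @ x + W_h @ h_prev + b)

# Run the cell over a short sequence, carrying the state forward each step.
h = np.zeros(n_hidden)
for x in [np.ones(n_in), np.zeros(n_in), np.ones(n_in)]:
    h = rnn_step(x, h)

print(h.shape)  # (4,)
```

The important part is the loop: `h` is both an output of one step and an input to the next, so each new state carries a trace of everything seen so far.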

Take the image below as a visual explanation.

A simple neural network looks like this:

But a recurrent neural network works like this:

And a full recurrent neural network works like this:

Cells whose output is a function of inputs from previous time steps are known as memory cells.

RNNs are also flexible in their inputs and outputs, handling both sequences and single vector values.

An artificial neural network with three neurons.

A recurrent neural network with three neurons.

How are RNNs flexible in their inputs and outputs?
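One way to see this flexibility is that the same recurrent cell produces a hidden state at every step, and we can choose what to keep: all of the states (sequence in, sequence out) or only the last one (sequence in, single vector out). The sketch below uses untrained, randomly initialized weights purely to show the shapes; every name in it is illustrative.

```python
import numpy as np

# Hedged sketch of input/output flexibility (shapes only, untrained weights):
# the same recurrent loop can yield one output per step or just the final state.
rng = np.random.default_rng(1)
n_in, n_hidden = 2, 5
W_x = rng.normal(size=(n_hidden, n_in))
W_h = rng.normal(size=(n_hidden, n_hidden))

def run_rnn(xs):
    """Run the cell over a whole sequence, collecting the state at each step."""
    h = np.zeros(n_hidden)
    outputs = []
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h)
        outputs.append(h)
    return outputs

seq = [rng.normal(size=n_in) for _ in range(6)]
outs = run_rnn(seq)

print(len(outs))       # sequence-to-sequence: one output per input step
print(outs[-1].shape)  # sequence-to-vector: keep only the last state
```

A sequence-to-sequence use would read all of `outs` (e.g., labeling every word in a sentence), while a sequence-to-vector use would keep only `outs[-1]` (e.g., classifying the whole sentence).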

Major Disadvantages of RNNs

1) A basic RNN only remembers the most recent output; it would be much better if it could track a longer history.

2) Another major issue is “Vanishing Gradient”.
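A tiny numeric illustration of the idea behind the vanishing gradient: when backpropagating through many time steps, the gradient is multiplied by a per-step factor, and if that factor is below 1 the product shrinks toward zero. The 0.5 here is a made-up stand-in for such a factor, not a real Jacobian.

```python
# Repeated multiplication by a factor below 1, as happens when a gradient is
# propagated back through many time steps, drives the gradient toward zero.
grad = 1.0
for step in range(50):
    grad *= 0.5  # illustrative stand-in for a per-step gradient factor < 1

print(grad)  # ~8.9e-16: effectively zero after 50 steps
```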

I will talk about the vanishing gradient in more detail in my next blog.

Comment below what you think.
