Neural Networks

Last summer, my family and I visited Russia. Even though none of us could read Russian, we had no trouble finding our way around, all thanks to Google’s real-time translation, which translates Russian signboards into English. This is just one of the many applications of Neural Networks; others include facial recognition, music composition, and forecasting.

Neural Networks form the base of Deep Learning, a subfield of Machine Learning where the algorithms are inspired by the structure of the human brain.

Neural Networks are algorithms modelled on the way the brain works; they build predictive models by learning the patterns in historical data.

Neural Networks take in data, train themselves to recognise the patterns in it, and then predict the outputs for a new set of similar data.

Let’s understand how this is done.  

Neural Networks are made up of small interconnected processing elements called nodes or neurons. Each node performs a small part of the overall task.

The most common type of Neural Network is the Multilayer Perceptron, in which the nodes are organised in layers.

  • The first layer is called the Input layer.
  • The last layer is called the Output layer.
  • The layers between the Input and Output layers are called Hidden layers.
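To make the layered structure concrete, here is a minimal sketch in Python using NumPy. The layer sizes (3 inputs, 4 hidden nodes, 1 output) are purely illustrative assumptions; any sizes would do.

```python
import numpy as np

# Illustrative layer sizes: 3 input features, one hidden layer of 4 nodes,
# and a single output node.
layer_sizes = [3, 4, 1]

rng = np.random.default_rng(0)
# Each pair of adjacent layers is connected by a weight matrix, and every
# layer after the input also gets a bias vector.
weights = [rng.standard_normal((m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

for i, (w, b) in enumerate(zip(weights, biases), start=1):
    print(f"layer {i}: weight matrix {w.shape}, bias vector {b.shape}")
```

The shapes show how information flows: a 3-value input is transformed into 4 hidden values, which are then combined into 1 output.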

The input layer receives the values of independent variables as input.

The nodes of the hidden layer take the inputs from the input layer, process them, and pass them on to the output layer. Here is what happens within a single node or neuron.

A node receives values from several nodes in the previous layer. Each value is multiplied by a unique weight, and the results are added together along with a small value called the bias. This total is passed through a function called an activation function, and the result leaves the node as its output. This flow of information through the network is called forward propagation.
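The computation inside one node can be sketched in a few lines of Python. The input values, weights, and bias below are made-up numbers chosen just for illustration, and the sigmoid is one common choice of activation function (not the only one).

```python
import numpy as np

def sigmoid(z):
    # Activation function: squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

inputs = np.array([0.5, -1.2, 3.0])   # values arriving from the previous layer
weights = np.array([0.4, 0.7, -0.2])  # one unique weight per incoming value
bias = 0.1                            # small value added to the weighted sum

total = np.dot(inputs, weights) + bias  # multiply by weights, add bias
output = sigmoid(total)                 # pass the total through the activation
print(output)
```

Whatever the inputs, the sigmoid keeps the node's output between 0 and 1 before it is passed on to the next layer.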

This continues until the information reaches the output layer and leaves it as a prediction for the dependent variable.

The network then compares the prediction with the actual value of the dependent variable. If the two do not match, the error is fed back through the network; this is called backward propagation. Based on this error, the network adjusts all the weights and repeats the process. These iterations continue until the neural network produces accurate predictions for most of the observations.
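The forward-compare-adjust loop described above can be sketched as a toy training run. This example assumes a small 2-4-1 network learning the XOR function with sigmoid activations and squared-error loss; the dataset, learning rate, and epoch count are all illustrative choices, not a definitive recipe.

```python
import numpy as np

rng = np.random.default_rng(42)
# Toy dataset: the XOR function (output is 1 when exactly one input is 1).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights and biases for a 2-input, 4-hidden-node, 1-output network.
W1, b1 = rng.standard_normal((2, 4)), np.zeros((1, 4))
W2, b2 = rng.standard_normal((4, 1)), np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # learning rate: how big each weight adjustment is
for epoch in range(5000):
    # Forward propagation: inputs flow layer by layer to a prediction.
    h = sigmoid(X @ W1 + b1)
    pred = sigmoid(h @ W2 + b2)

    # Backward propagation: the prediction error flows back through the
    # network, yielding an adjustment for every weight and bias.
    err = pred - y
    d2 = err * pred * (1 - pred)
    d1 = (d2 @ W2.T) * h * (1 - h)

    # Adjust all the weights slightly, then repeat the whole process.
    W2 -= lr * (h.T @ d2)
    b2 -= lr * d2.sum(axis=0)
    W1 -= lr * (X.T @ d1)
    b1 -= lr * d1.sum(axis=0)

print(((pred - y) ** 2).mean())  # mean squared error after training
```

After enough iterations, the predictions land close to the true XOR outputs, which is exactly the "repeat until accurate" behaviour described above.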

Once this is achieved, we are left with a trained neural network model that can be applied to a new set of data to make predictions.

Neural networks may take hours or even months to train, but that time is a reasonable trade-off given the range of problems they can solve.

Let us look at some of the prime applications of neural networks.

Facial Recognition: Cameras on smartphones these days can estimate a person’s age based on their facial features.

Forecasting: Neural Networks can be trained to recognise patterns and predict events such as rainfall or a rise in stock prices with considerable accuracy.

Music Composition: Neural Networks can even learn the patterns in music and train themselves enough to compose fresh pieces.

Final Thoughts

With Deep Learning and Neural Networks, we are still taking baby steps. How far can we go in replicating the human brain? We will have to wait a few more years for a definite answer.