Deep Learning Introduction
Input Layer
The input layer is where we feed input to the network. Each input has some influence on predicting the output. However, no computation is performed in the input layer; it is just used for passing information from the outside world to the network.
Biological and artificial neurons
A neuron can be defined as the basic computational unit of the human brain. Neurons are connected to one another through structures called synapses. They are responsible for receiving input from the external environment and from our sensory organs, for sending motor instructions to our muscles, and for performing other activities.
Exploring activation functions
An activation function, also known as a transfer function, plays a vital role in neural networks. It is used to introduce non-linearity in neural networks.
Hidden layer
Any layer between the input layer and the output layer is called a hidden layer. The hidden layer is responsible for deriving complex relationships between input and output. That is, the hidden layer identifies the patterns in the dataset.
Output layer
As the name suggests, the output layer emits the output. The number of neurons in the output layer is based on the type of problem we want our network to solve.
Backward pass
Backward pass implies backpropagating from the output layer to the input layer.
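The backward pass can be sketched for a single sigmoid neuron trained with a squared-error loss. This is a minimal illustration with made-up values, not the text's worked example: the chain rule carries the error from the loss back to the weight.

```python
# Minimal backward-pass sketch: one weight, one sigmoid neuron,
# squared-error loss. Input, target, and learning rate are illustrative.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

x, w, target, lr = 1.5, 0.8, 1.0, 0.1

for step in range(50):
    # forward pass
    y = sigmoid(w * x)
    # backward pass: chain rule from the loss back to the weight
    dloss_dy = y - target          # derivative of 0.5 * (y - target)^2
    dy_dz = y * (1.0 - y)          # derivative of the sigmoid
    dz_dw = x                      # derivative of the weighted sum
    w -= lr * dloss_dy * dy_dz * dz_dw
```

After the loop, the prediction sigmoid(w * x) has moved closer to the target, which is all backpropagation plus gradient descent promises.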
What is deep learning?
Deep learning is just a modern name for artificial neural networks with many layers. Deep learning is a subset of machine learning. Some of the interesting applications include automatically generating captions for images, adding sound to silent movies, converting black-and-white images to color images, generating text, and many more. Google Translate, the recommendation engines of Netflix, Amazon, and Spotify, and self-driving cars are some of the applications powered by deep learning.
Forward pass
Forward pass implies forward propagating from the input layer to the output layer.
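The forward pass for a single neuron can be sketched as a weighted sum of the inputs plus a bias, passed through an activation function. The input values, weights, and bias below are illustrative, not from the text:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward_pass(inputs, weights, bias):
    """One neuron: weighted sum of inputs plus bias, then an activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

# Example with made-up values: z = 0.5*1.0 + (-0.25)*2.0 + 0.1 = 0.1
y = forward_pass([1.0, 2.0], [0.5, -0.25], 0.1)
```

Stacking this computation layer by layer, from the input layer to the output layer, is exactly what "forward propagating" means.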
Number of iterations
The number of iterations implies the number of passes, where one pass = one forward pass + one backward pass.
The sigmoid function
The sigmoid function is one of the most commonly used activation functions. It scales the value between 0 and 1. The sigmoid function can be defined as follows: sigmoid(x) = 1 / (1 + e^(-x)).
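The definition above translates directly into a few lines of Python, a minimal sketch rather than a library implementation:

```python
import math

def sigmoid(x):
    """Squash x into the (0, 1) range: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))
```

For example, sigmoid(0) is exactly 0.5, large negative inputs approach 0, and large positive inputs approach 1.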
The softmax function
The softmax function is basically a generalization of the sigmoid function. It is usually applied to the final layer of the network while performing multi-class classification tasks. It gives the probability of each class being the output, and thus the softmax values always sum to 1.
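The softmax of a list of scores can be sketched as follows; subtracting the maximum score before exponentiating is a standard trick for numerical stability and does not change the result:

```python
import math

def softmax(scores):
    """Turn raw scores into probabilities that sum to 1."""
    m = max(scores)                             # for numerical stability
    exps = [math.exp(s - m) for s in scores]    # exponentiate each score
    total = sum(exps)
    return [e / total for e in exps]            # normalize to sum to 1

probs = softmax([2.0, 1.0, 0.1])
```

The largest score always receives the largest probability, which is why softmax is a natural fit for picking one class out of many.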
ANN and its layers
While neurons are really cool, we cannot just use a single neuron to perform complex tasks. This is the reason our brain has billions of neurons, stacked in layers, forming a network. Similarly, artificial neurons are arranged in layers. Each and every layer will be connected in such a way that information is passed from one layer to another. A typical ANN consists of the following layers:
· Input layer
· Hidden layer
· Output layer
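The layer structure can be sketched in plain Python: each layer takes the previous layer's outputs, applies a weighted sum plus bias per neuron, and passes the result through an activation. The layer sizes and random weights below are purely illustrative:

```python
import math
import random

random.seed(0)  # reproducible illustrative weights

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """Compute the activations of one fully connected layer."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Illustrative network: 2 inputs -> 3 hidden neurons -> 1 output neuron
w_hidden = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
b_hidden = [0.0, 0.0, 0.0]
w_out = [[random.uniform(-1, 1) for _ in range(3)]]
b_out = [0.0]

hidden = layer([0.5, -0.2], w_hidden, b_hidden)  # hidden layer
output = layer(hidden, w_out, b_out)             # output layer
```

The input layer itself appears only as the list of input values, matching the earlier point that no computation is performed there.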
The Rectified Linear Unit function
The Rectified Linear Unit (ReLU) function is another one of the most commonly used activation functions. It outputs a value from 0 to infinity. It is basically a piecewise function and can be expressed as follows: f(x) = max(0, x); that is, it returns x when x is positive and 0 otherwise.
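The piecewise definition is a one-liner in Python:

```python
def relu(x):
    """Return x when positive, otherwise 0."""
    return max(0.0, x)
```

So negative inputs are clipped to 0 while positive inputs pass through unchanged.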
Batch size
The batch size specifies the number of training samples we use in one forward pass and one backward pass.
Epoch
The epoch specifies the number of times the neural network sees our whole training data. So, we can say one epoch is equal to one forward pass and one backward pass over all the training examples.
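The relationship between batch size, iterations, and epochs is simple arithmetic. Assuming an illustrative dataset of 1,000 samples and a batch size of 100 (neither figure is from the text):

```python
# Illustrative numbers: 1,000 training samples, batches of 100
num_samples = 1000
batch_size = 100

# One epoch = enough iterations to see every sample once
iterations_per_epoch = num_samples // batch_size
```

Here each epoch consists of 10 iterations, and each iteration is one forward pass plus one backward pass over a single batch.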