Basic Concepts of Artificial Neural Networks

Most of us have heard about “Deep Learning”, the emerging technology whose applications include speech recognition, computer vision, natural language processing, and more. Deep Learning is a field dedicated to creating machines that can learn and predict the way we humans do. In short, DL’s main motive is to create machines that learn progressively. But did you know that Artificial Neural Networks are the key components of Deep Learning? Understanding the basics of this core component of DL is necessary to quick-start your journey in Deep Learning.

What Are Artificial Neural Networks?

We can infer from the name “Artificial Neural Networks” that they are formed by a network, i.e., a collection and connection of artificial neurons. Neurons are the smallest parts of a neural network, comparable to atoms in matter.

Components of Deep Learning

The structural representation looks quite similar to the human brain, as it is inspired by the brain’s structure and function. Since the functioning of an ANN resembles that of biological neurons, let’s take a brief look at them and at how they process data.

The brain’s basic building blocks are neurons. It is estimated that the human brain consists of about 100 billion interconnected neurons, or nerve cells. Input to this network comes from the sense organs; the data is processed in the neurons, and the response or output of one neuron is sent to the next through the axon of the transmitting cell to the dendrites of the receiving neuron. The final output is generally an action of the motor organs.

Let’s zoom into the signal processing within one neuron. A neuron’s inputs come from neighboring neurons or from itself. There can be multiple input signals, each scaled by the weight given to that input; the weight reflects the strength, or importance, of the respective axon-to-dendrite connection path along which the signal flows. All the weighted inputs (input signals multiplied by their respective weights) are summed up and compared to a threshold value, and the output is triggered from the neuron. This output acts as the input for other neurons. The process is exactly the same in artificial neurons, so let us move directly to the mathematical understanding.

Mathematical Representation of Artificial Neurons:

Artificial Neuron

Output = f(W.S), where f() is the activation function and W.S = x₁w₁ + x₂w₂ + … + xₙwₙ is the weighted sum of the inputs.

This is for one neuron; many such neurons are interconnected to form an Artificial Neural Network, and output is passed on from one neuron to the others.
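As a concrete illustration, here is a minimal Python sketch of a single artificial neuron (the function name, numbers, and threshold are illustrative): it computes the weighted sum of its inputs and compares it against a threshold.

```python
import numpy as np

def neuron_output(inputs, weights, threshold=0.5):
    # W.S: each input multiplied by its weight, then summed
    weighted_sum = np.dot(inputs, weights)
    # Step activation: fire 1 if the weighted sum exceeds the threshold
    return 1 if weighted_sum > threshold else 0

print(neuron_output([1.0, 0.5], [0.4, 0.6]))  # 0.4 + 0.3 = 0.7 > 0.5 -> 1
```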

What is an activation function, and why is it needed?

As mentioned previously, the weighted sum of the inputs is passed through a function called the step function or activation function. For example, to determine whether a student has passed an exam, calculating the percentage alone is not enough; we still have to compare it with a cutoff value to conclude. This comparison can be thought of as an activation function. Real-world problems are much more complex than this case. Moreover, a weighted sum is always a linear value, and building a model with neurons capable of learning only a linear pattern between input and output won’t help with real-world problems, which are nonlinear in nature. To avoid this, we apply the activation function to the weighted sum.
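The exam example above can be sketched in a few lines of Python (the cutoff and marks are illustrative):

```python
def exam_result(marks_obtained, total_marks, cutoff=40.0):
    # The percentage plays the role of the weighted sum
    percentage = 100.0 * marks_obtained / total_marks
    # The cutoff comparison plays the role of the activation function
    return "pass" if percentage >= cutoff else "fail"

print(exam_result(55, 100))  # pass
print(exam_result(30, 100))  # fail
```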

Various Activation functions used in Neural networks:

The activation functions below are the most frequently used.

Linear Functions:

Linear functions are very simple in nature.

1. Step or Threshold Function:

Output = f(x) = 0 if x ≤ T, 1 if x > T

2. Signum Function:

Output = f(x) = +1 if x > T, −1 if x ≤ T

Here ‘T’ represents the threshold value and ‘x’ is the weighted sum.
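These two linear activation functions can be written directly from the definitions above (the threshold T defaults to 0 here for illustration):

```python
def step(x, T=0.0):
    # Step/threshold: 1 above the threshold, 0 at or below it
    return 1 if x > T else 0

def signum(x, T=0.0):
    # Signum: +1 above the threshold, -1 at or below it
    return 1 if x > T else -1

print(step(0.7, T=0.5), signum(-0.2))  # 1 -1
```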

Non Linear Functions:

1. Sigmoid function: f(x) = 1/(1 + e⁻ˣ), which squashes any input into the range (0, 1).

2. Tanh function: f(x) = tanh(x), which squashes any input into the range (−1, 1).

3. Rectified Linear Unit function (ReLU): f(x) = max(0, x).

There is also a smooth approximation of ReLU called Softplus: f(x) = ln(1 + eˣ).

Let’s look at the graphical representations of all these non-linear functions.
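A minimal NumPy sketch of these four non-linear functions, following their standard definitions:

```python
import numpy as np

def sigmoid(x):
    # Squashes any input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes any input into the range (-1, 1)
    return np.tanh(x)

def relu(x):
    # Passes positive inputs through, zeroes out negatives
    return np.maximum(0.0, x)

def softplus(x):
    # Smooth approximation of ReLU: ln(1 + e^x)
    return np.log1p(np.exp(x))

print(sigmoid(0.0), relu(-3.0), relu(2.0))  # 0.5 0.0 2.0
```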

Neurons in the network may or may not have the same activation function. Neurons in the same layer often share an activation function, but this is not mandatory.

Arrangement of Neurons in Neural Networks:

Neurons, the simplest components of a neural network, are arranged layer by layer. The more complex the problem (or the pattern between input and output), and the more features there are, the more layers the network needs.

A network in which data flows in a single direction, from the input layer to the output layer, is a Feedforward Neural Network.

The outputs of neurons can also be used as feedback to other neurons in the network. This feedback is used to self-adjust the weights on the connections (the lines between sending and receiving neurons); such networks are called Backpropagation Neural Networks.
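A feedforward pass can be sketched in a few lines of Python; the layer sizes and random weights below are purely illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def feed_forward(x, layers):
    # layers: list of (weight_matrix, bias_vector) pairs, one per layer
    a = np.asarray(x, dtype=float)
    for W, b in layers:
        a = sigmoid(W @ a + b)  # each layer: weighted sum, then activation
    return a

# 2 inputs -> 3 hidden neurons -> 1 output, with illustrative random weights
rng = np.random.default_rng(0)
layers = [(rng.normal(size=(3, 2)), np.zeros(3)),
          (rng.normal(size=(1, 3)), np.zeros(1))]
print(feed_forward([0.5, -0.2], layers))  # a single value in (0, 1)
```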

A Complete Blueprint of How Artificial Neural Networks Work:

Consider building a predictive model whose task is to predict the output for input data unseen during training. The first and foremost step in constructing this model is to obtain as much data as necessary to train it. We feed this data to the model to train the neural network; the model learns the pattern between input and output by slowly adjusting the weights on the connections, and thus learns to predict the output. This is similar to how humans learn a spoken, conversational language: we start with simple and frequently used words, correct errors such as pronunciation, and later move on to simple sentences, complicated words, basic rules of sentence formation, and so on.
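The idea of slowly adjusting weights can be illustrated with the classic perceptron learning rule on a tiny dataset (the OR pattern; the learning rate and epoch count are illustrative):

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1])          # target output: the OR pattern
w = np.zeros(2); b = 0.0; lr = 0.1  # weights, bias, learning rate

for _ in range(20):                 # pass over the data a few times
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        err = target - pred         # 0 when correct, +/-1 when wrong
        w += lr * err * xi          # nudge the weights toward the target
        b += lr * err

print([1 if xi @ w + b > 0 else 0 for xi in X])  # [0, 1, 1, 1]
```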

“The beauty of Deep Learning in solving complex real-world cases comes not from highly complicated techniques, but from the simple technique of breaking the problem down into much smaller chunks and using simple neurons to solve them. Each neuron acts individually and also contributes to the complete network. It is like a jigsaw puzzle solved by individual neurons, each finding the correct location for a single piece; combining all the individual findings gives the final output.”

Pros & Cons of Artificial Neural Networks:

As Artificial Neural Networks are the main components of Deep Learning, their pros and cons impact Deep Learning directly.

As with everything, ANNs also have a few disadvantages.
