In the beginning, there was (and still is) the neuron. A neuron is the basic building block of artificial neural networks, which form the foundation of Deep Learning today.
A neuron is, basically, a weighted sum of all its inputs (X1, X2 and the bias B in Fig 1) passed through an activation function F (i.e. a non-linear transformation). By combining a whole bunch of these neurons, you get an artificial neural network (Fig 2), which you can train to perform various tasks (e.g. estimating test results, classifying things into categories, etc.).
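That weighted-sum-then-activation idea fits in a few lines of code. Here's a minimal sketch of a single neuron, assuming a sigmoid as the activation function F (other choices like ReLU or tanh work the same way); the input values, weights, and bias are just illustrative:

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs plus bias,
    passed through a sigmoid activation (one common choice for F)."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))  # sigmoid squashes z into (0, 1)

# Example: two inputs (X1, X2) with made-up weights and bias
output = neuron([0.5, -1.0], [0.8, 0.2], bias=0.1)
print(output)  # ≈ 0.574
```

A full network is just layers of these neurons, where each layer's outputs become the next layer's inputs.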
So, there you go! This is the beginning of neural networks and Deep Learning as we know them today. I hope to share more about how ANNs work in upcoming posts.