When designing a neural network, we begin by initializing the weights with random values. There is no guarantee that these initial weights are correct; in fact, the error they produce can be enormous. One way to train the model so that the weights improve is called backpropagation.
The backpropagation algorithm searches for the minimum of the error function in weight space using a technique called gradient descent (also known as the delta rule). The weights that minimize the error function are then considered a solution to the learning problem.
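To make these two steps concrete, here is a minimal NumPy sketch of the idea: the weights start at random values (so the initial error is typically large), and repeated gradient descent steps move them toward values that reduce the error. The network shape, the XOR dataset, the learning rate, and the epoch count are all illustrative assumptions chosen for this sketch, not prescribed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR inputs and targets (illustrative choice).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Random initialization: these starting weights are arbitrary,
# so the error of the untrained network is usually large.
W1 = rng.normal(0, 1, (2, 4))   # input -> hidden
b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1))   # hidden -> output
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # learning rate: step size for gradient descent (assumed value)

for epoch in range(10000):
    # Forward pass through the two layers.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Error function: half the sum of squared errors over the batch.
    error = 0.5 * np.sum((out - y) ** 2)

    # Backward pass: deltas give the gradient of the error
    # with respect to each layer's pre-activation.
    d_out = (out - y) * out * (1 - out)   # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)    # hidden-layer delta

    # Gradient descent step: move each weight against its gradient.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(f"final error: {error:.4f}")
print("predictions:", out.round(3).ravel())
```

Running the sketch shows the error shrinking from its large initial value as the weights descend toward a minimum of the error function; with a different random seed or hyperparameters, the exact trajectory (and occasionally the final minimum) will differ.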