Gradient descent is a first-order iterative optimization algorithm for finding the minimum of a function. In machine learning, we use gradient descent to minimize the cost function, which gives us the best set of parameters for our model.
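The update rule can be sketched in a few lines. This is a minimal illustration, not tied to any particular library; the example function f(x) = (x - 3)^2, the learning rate, and the step count are all illustrative choices.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient to approach a minimum."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # update rule: x <- x - lr * f'(x)
    return x

# f(x) = (x - 3)^2 has gradient f'(x) = 2 * (x - 3) and its minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Starting from x0 = 0, the iterate converges toward x = 3, the minimizer of the example function.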
1. On the basis of data ingestion
In full batch gradient descent, we use the whole dataset at once to compute the gradient, whereas in stochastic gradient descent we compute the gradient from a single sample (or a small mini-batch) at a time.
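The contrast can be sketched on a tiny least-squares problem fitting y = w * x. The dataset, learning rate, and epoch counts below are assumptions made purely for illustration.

```python
import random

# Toy data lying exactly on y = 2 * x, so the true weight is w = 2.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

def batch_gd(epochs=200, lr=0.01):
    """Full batch: average the gradient over the entire dataset each step."""
    w = 0.0
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
    return w

def sgd(epochs=200, lr=0.01, seed=0):
    """Stochastic: update from one randomly chosen sample each step."""
    w = 0.0
    rng = random.Random(seed)
    for _ in range(epochs):
        x, y = rng.choice(list(zip(xs, ys)))
        w -= lr * 2 * (w * x - y) * x
    return w
```

Both variants converge to w = 2 on this noiseless data; the stochastic version takes noisier individual steps but each update is much cheaper, which is what makes it attractive on large datasets.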
2. On the basis of differentiation techniques