# Introduction to Gradient Descent

Gradient Descent is an optimization technique used throughout machine learning and deep learning to train a wide range of learning algorithms.

A gradient is the slope of a function: the rate at which the output changes as one of its inputs changes.

Mathematically, it can be described as the vector of partial derivatives of a function with respect to its inputs. The larger the gradient, the steeper the slope.
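To make the idea concrete, here is a minimal sketch of estimating a gradient numerically with a central finite difference; the function `f`, the point `x`, and the step size `h` are illustrative choices, not part of the original text.

```python
def numerical_gradient(f, x, h=1e-6):
    """Approximate df/dx at x with a central difference."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Example: f(x) = x**2 has derivative 2x, so the slope at x = 3 is about 6.
f = lambda x: x ** 2
grad = numerical_gradient(f, 3.0)
print(round(grad, 4))  # close to 6.0
```

Notice the gradient grows as x moves away from the minimum at 0, matching the statement that a larger gradient means a steeper slope.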

Gradient Descent works best on convex cost functions, where any local minimum is also the global minimum.

Gradient Descent can be described as an iterative method for finding the parameter values of a function that minimize the cost function as much as possible.

The parameters are first initialized to some values, and from there Gradient Descent runs iteratively: at each step it uses calculus to compute the gradient of the cost function and moves the parameters in the opposite direction, gradually approaching the minimum possible value of the cost function.
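The iterative procedure above can be sketched as follows. This is a minimal illustration, not a production implementation: the convex cost function `cost(w) = (w - 3)**2` (whose minimum is at `w = 3`), the starting point, the learning rate, and the step count are all assumed for the example.

```python
def gradient_descent(grad, w0, learning_rate=0.1, steps=100):
    """Repeatedly move w against the gradient of the cost function."""
    w = w0
    for _ in range(steps):
        w -= learning_rate * grad(w)
    return w

# Gradient of cost(w) = (w - 3)**2 is 2 * (w - 3).
w_opt = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_opt, 4))  # converges toward 3.0
```

Because the cost here is convex, the updates shrink as the gradient approaches zero and the parameter settles at the global minimum; the learning rate controls how large each step is.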