Gradient descent is an optimization algorithm for finding a local minimum of a differentiable function. It starts from an initial set of parameters (weights and a bias) and repeatedly takes steps proportional to the negative gradient of the cost function at the current point. Because the gradient describes how the cost changes for parameters close to the current ones, each step moves in the direction that reduces the cost. Repeating this update thousands of times steadily drives the cost function toward a minimum.
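The update loop described above can be sketched in a few lines. This is a minimal illustration, assuming a mean-squared-error cost for simple linear regression; the names (`gradient_descent`, `learning_rate`, `n_steps`) and the example data are illustrative, not from the original text.

```python
def gradient_descent(xs, ys, learning_rate=0.1, n_steps=5000):
    # Start with arbitrary parameters: a weight and a bias.
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(n_steps):
        # Gradients of the mean-squared-error cost
        # J(w, b) = (1/n) * sum((w*x + b - y)^2)
        dw = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        db = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        # Step proportionally along the negative gradient,
        # which reduces the cost at each iteration.
        w -= learning_rate * dw
        b -= learning_rate * db
    return w, b

# Toy data generated from the line y = 2x + 1;
# the loop recovers parameters close to w = 2, b = 1.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]
w, b = gradient_descent(xs, ys)
```

The learning rate controls the step size: too small and convergence is slow, too large and the updates can overshoot the minimum and diverge.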