What is the difference between back-propagation and gradient descent?
Back-propagation and gradient descent:
Back-propagation is the algorithm for computing the gradient of the loss with respect to each weight in the network, by applying the chain rule backward through the layers.
Gradient descent is a first-order iterative optimization algorithm that uses those gradients to update the weights and move toward a local minimum of a differentiable function.
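The relationship can be sketched in a few lines. This is a hypothetical minimal example (not from the source): a one-parameter model `y = w * x` fit to a single data point, where `backprop` computes the gradient via the chain rule and `gradient_descent` uses that gradient to update the weight.

```python
def backprop(w, x, y):
    """Back-propagation: apply the chain rule to the loss L = (w*x - y)**2."""
    pred = w * x               # forward pass
    dL_dpred = 2 * (pred - y)  # dL/dpred
    dL_dw = dL_dpred * x       # chain rule: dpred/dw = x
    return dL_dw

def gradient_descent(w, x, y, lr=0.1, steps=100):
    """Gradient descent: repeatedly step opposite the gradient."""
    for _ in range(steps):
        w -= lr * backprop(w, x, y)
    return w

# Fit w so that w * 2.0 ≈ 6.0; the minimizer is w = 3.0.
w = gradient_descent(w=0.0, x=2.0, y=6.0)
print(round(w, 3))  # → 3.0
```

In short: back-propagation supplies the gradient; gradient descent consumes it to take the update step.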