OLA Interview Question | Gradient Descent

Question

Do gradient descent methods always converge to the same point?


Answer (1)

  1. Not always. Where gradient descent ends up depends on the point from which you start descending the curve.
     The learning rate also plays a vital role, since it sets the step size of each descent update: too large a rate can overshoot or diverge, while too small a rate converges very slowly.
     On a non-convex loss surface you can also get stuck in a local minimum, as the sketch below illustrates. (For a convex function with a suitable learning rate, gradient descent does converge to the unique global minimum.)
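A minimal sketch of this behavior (not from the original answer): plain gradient descent on the non-convex function f(x) = x^4 - 3x^2 + x, which has two minima. The starting points and learning rate chosen here are illustrative assumptions.

```python
def grad(x):
    """Derivative of f(x) = x**4 - 3*x**2 + x."""
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.01, steps=2000):
    # Repeatedly step downhill; the learning rate scales each step.
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Different starting points land in different minima:
print(gradient_descent(-2.0))  # ~ -1.30, the global minimum
print(gradient_descent(2.0))   # ~  1.13, a shallower local minimum
```

In the same sketch, raising lr to around 0.3 makes the updates overshoot and diverge, which shows why the learning rate matters as much as the starting point.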

