Data Science | Machine learning
Question
Why should the learning rate be small in gradient descent?
Answer ( 1 )
Gradient descent is a technique for minimizing the loss: at each step, the parameters are updated in the direction opposite the gradient so that the loss decreases, ideally until it reaches the global minimum, the point where the loss is lowest. If the learning rate is small, each update takes a small step and the parameters can settle into the minimum. If the learning rate is too high, an update can overshoot the minimum, causing the loss to oscillate around it or even diverge. Too small a learning rate, however, makes convergence very slow, so in practice the learning rate is a trade-off between speed and stability.
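To make the overshooting effect concrete, here is a minimal sketch (my own illustrative example, not from the question) that minimizes the simple convex function f(x) = x², whose gradient is 2x. With a small learning rate the iterate converges toward the minimum at 0; with a learning rate above 1 each step overshoots and |x| grows instead of shrinking:

```python
def gradient_descent(lr, start=5.0, steps=50):
    """Minimize f(x) = x**2 using gradient descent with learning rate `lr`."""
    x = start
    for _ in range(steps):
        x -= lr * 2 * x  # step opposite the gradient f'(x) = 2x
    return x

x_small = gradient_descent(lr=0.1)  # small steps: x shrinks toward 0
x_large = gradient_descent(lr=1.1)  # too large: each step overshoots, |x| grows
print(abs(x_small))  # close to 0 (converged)
print(abs(x_large))  # very large (diverged)
```

With lr = 0.1 the update multiplies x by (1 − 0.2) = 0.8 each step, so x shrinks geometrically; with lr = 1.1 it multiplies x by (1 − 2.2) = −1.2, so x flips sign and grows in magnitude every step, the divergence described in the answer.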