Answer (1)

  1. Gradient descent is a technique for minimizing a loss function so that it can reach the global minimum, the point where the loss is lowest. It works by repeatedly taking steps against the gradient of the loss, with the step size controlled by the learning rate. If the learning rate is small enough, the updates converge toward the global minimum; if the learning rate is too high, each step can overshoot and miss the global minimum entirely (see the sketch below).
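To make the effect of the learning rate concrete, here is a minimal sketch of plain gradient descent on the toy loss f(x) = x^2, whose global minimum sits at x = 0. The loss function, starting point, and the two learning rates are assumptions chosen purely for illustration, not part of the original answer.

    # Minimal gradient descent sketch on the toy loss f(x) = x^2.
    # The gradient of x^2 is 2x, so each update is x <- x - lr * 2x.
    # (f, x0, steps, and the learning rates below are illustrative assumptions.)

    def gradient_descent(lr, x0=5.0, steps=20):
        """Run plain gradient descent on f(x) = x^2 and return the final x."""
        x = x0
        for _ in range(steps):
            grad = 2 * x        # derivative of x^2 at the current point
            x = x - lr * grad   # step against the gradient, scaled by lr
        return x

    # A small learning rate converges toward the global minimum at x = 0 ...
    print(gradient_descent(lr=0.1))   # ~0.06 after 20 steps

    # ... while a large one overshoots on every step and diverges.
    print(gradient_descent(lr=1.1))   # |x| keeps growing; the minimum is missed

On this quadratic, each update multiplies x by (1 - 2*lr): with lr = 0.1 that factor is 0.8, so x shrinks toward zero, while with lr = 1.1 it is -1.2, so every step jumps past the minimum and lands farther away than before.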

