Answer:

False.

Answer:

False, I think.

Explanation:

Gradient descent is more likely to reach a local minimum than the global minimum. Because the algorithm simply follows the slope downhill from wherever it begins, different starting points will generally lead to different local minima (i.e., the lowest point closest to the starting point). In addition, if alpha (the learning rate) is too large, gradient descent may overshoot the minimum, fail to converge, and may even diverge.
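A minimal sketch of both effects, assuming an illustrative non-convex function f(x) = x^4 - 3x^2 + x with two local minima (the function, starting points, and learning rates are my own choices, not from the question):

```python
def grad(x):
    # Derivative of the assumed example function f(x) = x**4 - 3*x**2 + x
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, alpha, steps=200):
    x = x0
    for _ in range(steps):
        x -= alpha * grad(x)      # standard update: step against the gradient
        if abs(x) > 1e6:          # treat runaway values as divergence
            return float("inf")
    return x

# Different starting points settle into different local minima.
print(gradient_descent(x0=-2.0, alpha=0.01))  # ends near the left minimum (~ -1.30)
print(gradient_descent(x0= 2.0, alpha=0.01))  # ends near the right minimum (~ 1.13)

# A learning rate that is too large overshoots and diverges.
print(gradient_descent(x0=2.0, alpha=1.0))    # inf (diverged)
```

Running it shows the two small-alpha runs converging to different minima depending only on the starting point, while the large-alpha run blows up, which is exactly why the statement in the question is false.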
