bree7246
  • 19-08-2021
  • Computers and Technology
answered

In the Gradient Descent algorithm, we are more likely to reach the global minimum, if the learning rate is selected to be a large value.

a. True
b. False

Answer:

oddkevin9
  • 19-08-2021
False.
swan2414
  • 23-08-2021

Answer:

False, I think.

Explanation:

Gradient descent is more likely to reach a local minimum: different starting points will generally lead to different local minima (the lowest point nearest each starting point). And if alpha (the learning rate) is too large, gradient descent may fail to converge at all, because each step can overshoot the minimum; it may even diverge.
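The overshooting effect is easy to see on a toy problem. Here is a minimal sketch (my own illustration, not from the question) of gradient descent on f(x) = x^2, whose gradient is 2x and whose global minimum is at x = 0. A small learning rate converges toward 0; a too-large one makes each step overshoot, so the iterate grows without bound.

```python
def gradient_descent(x0, learning_rate, steps=50):
    """Plain gradient descent on f(x) = x**2, starting from x0."""
    x = x0
    for _ in range(steps):
        grad = 2 * x                  # derivative of x**2
        x = x - learning_rate * grad  # each update: x <- x - alpha * f'(x)
    return x

# Small learning rate: each step shrinks x by a factor (1 - 2*0.1) = 0.8.
small = gradient_descent(x0=5.0, learning_rate=0.1)

# Large learning rate: each step multiplies x by (1 - 2*1.1) = -1.2,
# so |x| grows every iteration and the method diverges.
large = gradient_descent(x0=5.0, learning_rate=1.1)

print(abs(small))  # very close to 0 (converged)
print(abs(large))  # enormous (diverged)
```

The cutoff for this particular function is learning_rate = 1.0: below it the update factor |1 - 2*alpha| is less than 1 and the iterates contract toward the minimum; above it they expand away from it.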

