Learning Rate

From llamawiki.ai
== Overview ==

The learning rate determines the step size taken at each update during the [[stochastic gradient descent]] process. A high learning rate lets the model learn faster, at the cost of possibly overshooting the minimum of the [[Loss function|loss function]]. A low learning rate lets the model converge more reliably, but training is slower.
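The trade-off above can be sketched with plain gradient descent on a toy objective. This is an illustrative example, not code from the article: the function f(x) = x² and the specific learning-rate values are assumptions chosen to make the overshooting and slow-convergence behaviors visible.

```python
# Minimal sketch: gradient descent on f(x) = x^2 (gradient 2*x),
# illustrating how the learning rate scales each update step.
# Toy function and values chosen for illustration only.

def gradient_descent(lr, x0=5.0, steps=20):
    """Run `steps` updates x <- x - lr * f'(x) and return the final x."""
    x = x0
    for _ in range(steps):
        grad = 2 * x       # derivative of x^2
        x = x - lr * grad  # the learning rate scales the step size
    return x

# A moderate learning rate converges toward the minimum at x = 0:
# each step multiplies x by (1 - 2*0.1) = 0.8.
print(gradient_descent(lr=0.1))

# A learning rate that is too high overshoots the minimum: with lr = 1.1,
# each step multiplies x by (1 - 2*1.1) = -1.2, so |x| grows every step.
print(gradient_descent(lr=1.1))

# A very low learning rate still converges, just more slowly:
# after the same 20 steps, x is much farther from 0 than with lr = 0.1.
print(gradient_descent(lr=0.01))
```

On this quadratic the update has a closed form, x ← (1 − 2·lr)·x, so any lr above 1.0 makes the factor's magnitude exceed 1 and the iterates diverge; real loss surfaces are not this tidy, but the same step-size intuition applies.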