LearningRate
class LearningRate(learning_rate)
Bases: Generator
Represents a learning rate. An instance of this class is stored as an attribute of GradientDescentState. Note that GradientDescent also accepts a learning rate, which can be a float, a list, an array, or a function returning a generator; it is used to create the generator consumed during the optimization process. This class wraps Generator so that the last yielded value can also be accessed.
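The wrapping described above can be illustrated with a minimal, self-contained sketch. The class and helper names below (`LastValueGenerator`, `constant`) are hypothetical stand-ins, not part of the library's API; the sketch only shows the idea of delegating to an inner generator while recording the last yielded value:

```python
from typing import Iterator, Optional


class LastValueGenerator:
    """Hypothetical sketch of a generator wrapper that remembers the
    last yielded value, similar in spirit to LearningRate."""

    def __init__(self, generator: Iterator[float]) -> None:
        self._gen = generator
        self._current: Optional[float] = None

    def __iter__(self):
        return self

    def __next__(self) -> float:
        # Delegate to the wrapped generator and record the yielded value.
        self._current = next(self._gen)
        return self._current

    @property
    def current(self) -> Optional[float]:
        """The most recently yielded learning rate."""
        return self._current


def constant(eta: float) -> Iterator[float]:
    # Hypothetical helper: an endless stream of a fixed learning rate.
    while True:
        yield eta


lr = LastValueGenerator(constant(0.01))
next(lr)
print(lr.current)  # 0.01
```

An optimizer can then pull values with `next(lr)` inside its update loop while other code inspects `lr.current` without advancing the iteration.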
Parameters
learning_rate (Union[float, List[float], ndarray, Callable[[], Iterator]]) – Used to create a generator to iterate on.
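Each of the accepted input types must ultimately be turned into an iterator of learning rates. The helper below (`to_generator`, a hypothetical name, not the library's actual implementation) sketches one way to normalize them; the `ndarray` case is omitted to keep the example standard-library only, but it would be handled like the list case:

```python
from itertools import repeat
from typing import Callable, Iterator, List, Union


def to_generator(
    learning_rate: Union[float, List[float], Callable[[], Iterator[float]]]
) -> Iterator[float]:
    # Hypothetical helper: normalize the accepted input types
    # into an iterator of learning rates.
    if callable(learning_rate):
        return learning_rate()               # factory returning a generator
    if isinstance(learning_rate, (int, float)):
        return repeat(float(learning_rate))  # constant rate, repeated forever
    return iter(learning_rate)               # list of explicit values


print(next(to_generator(0.1)))                  # 0.1
print(list(to_generator([0.1, 0.05, 0.01])))    # [0.1, 0.05, 0.01]
print(next(to_generator(lambda: iter([0.2]))))  # 0.2
```

Note that a plain float yields an endless constant stream, while a list is finite and will raise StopIteration once exhausted.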
Methods
close
LearningRate.close()
Raise GeneratorExit inside generator.
send
LearningRate.send(value)
Send a value into the generator. Return next yielded value or raise StopIteration.
throw
LearningRate.throw(typ, val=None, tb=None)
Raise an exception in the generator. Return next yielded value or raise StopIteration.
Attributes
current
Returns the current value of the learning rate.