Represents a learning rate. An instance of this class is an attribute of
GradientDescentState. Note that
GradientDescent also accepts a learning rate, which can be a float, a list, an array, or a function returning a generator; it is used to create the generator consumed during the optimization process. This class wraps
Generator so that the last yielded value can also be accessed.
learning_rate (float | list[float] | np.ndarray | Callable[[], Generator[float, None, None]]) – Used to create a generator to iterate on.
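The idea described above can be sketched as a small generator wrapper that normalizes the accepted input types and remembers the last yielded value. This is an illustrative sketch, not the actual Qiskit source; the internal names (`_gen`, `_constant`) are assumptions.

```python
from typing import Callable, Generator, Union

import numpy as np


class LearningRateSketch:
    """Illustrative wrapper: iterates like a generator but keeps the last value."""

    def __init__(
        self,
        learning_rate: Union[
            float, list, np.ndarray, Callable[[], Generator[float, None, None]]
        ],
    ):
        if callable(learning_rate):
            # A function returning a generator: call it to obtain the generator.
            self._gen = learning_rate()
        elif isinstance(learning_rate, (int, float)):
            # A scalar: yield the same value forever.
            self._gen = self._constant(float(learning_rate))
        else:
            # A list or array: yield its entries in order.
            self._gen = (float(x) for x in learning_rate)
        self._current = None

    @staticmethod
    def _constant(value: float) -> Generator[float, None, None]:
        while True:
            yield value

    def __iter__(self):
        return self

    def __next__(self) -> float:
        # Record the value as it is yielded so it can be read back later.
        self._current = next(self._gen)
        return self._current

    @property
    def current(self):
        """The last value yielded by the wrapped generator."""
        return self._current
```

For example, `lr = LearningRateSketch([0.1, 0.05])` yields `0.1` on the first `next(lr)`, after which `lr.current` also reads `0.1`.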
Returns the current value of the learning rate.
close()
Raise GeneratorExit inside generator.
send(value)
Send a value into the generator. Return next yielded value or raise StopIteration.
throw(typ, val=None, tb=None)
Raise an exception in the generator. Return next yielded value or raise StopIteration.
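The `send`, `throw`, and `close` methods follow the standard Python generator protocol, which a plain generator illustrates (the `stream` generator below is a hypothetical example, not part of the library):

```python
def stream():
    """Yields a value that can be replaced via send() or reset via throw()."""
    value = 0.1
    while True:
        try:
            sent = yield value
            if sent is not None:
                value = sent  # send(x) resumes the yield with x
        except RuntimeError:
            value = 0.0  # throw() delivers the exception at the yield point


g = stream()
next(g)                  # prime the generator; yields 0.1
g.send(0.5)              # send a value in; returns the next yielded value, 0.5
g.throw(RuntimeError)    # raise RuntimeError at the yield; returns 0.0
g.close()                # raise GeneratorExit inside the generator
```

After `close()`, any further `next()` or `send()` raises StopIteration.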