Class Optimizer
Defined in tensorflow/python/keras/optimizers.py.
Abstract optimizer base class.
All Keras optimizers support the following keyword arguments:
clipnorm: float >= 0. Gradients will be clipped when their L2 norm exceeds this value.
clipvalue: float >= 0. Gradients will be clipped when their absolute value exceeds this value.
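Both arguments can be passed to any built-in optimizer. A minimal sketch, using SGD as a concrete subclass (the learning-rate and threshold values are illustrative):

from tensorflow import keras

# Clip gradients when their L2 norm exceeds 1.0.
sgd = keras.optimizers.SGD(lr=0.01, clipnorm=1.0)

# Clip each gradient element to the range [-0.5, 0.5].
sgd = keras.optimizers.SGD(lr=0.01, clipvalue=0.5)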
__init__
__init__(**kwargs)
Initialize self. See help(type(self)) for accurate signature.
Methods
tf.keras.optimizers.Optimizer.from_config
@classmethod
from_config(
    cls,
    config
)

Creates an optimizer instance from a configuration dictionary. This is the inverse of get_config: an equivalent optimizer can be re-instantiated from the dictionary that get_config returns.
tf.keras.optimizers.Optimizer.get_config
get_config()

Returns the optimizer's configuration as a Python dictionary of its hyperparameters (including clipnorm and clipvalue when they are set).
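A minimal sketch of the round trip between get_config and from_config, again using SGD as a concrete subclass:

from tensorflow import keras

opt = keras.optimizers.SGD(lr=0.01, momentum=0.9, clipnorm=1.0)
config = opt.get_config()                             # plain Python dict
restored = keras.optimizers.SGD.from_config(config)   # equivalent optimizer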
tf.keras.optimizers.Optimizer.get_gradients
get_gradients(
    loss,
    params
)

Returns gradients of loss with respect to params.

Arguments:
loss: Loss tensor.
params: List of variables.
Returns:
List of gradient tensors.
Raises:
ValueError: In case any gradient cannot be computed (e.g. if gradient function not implemented).
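A minimal sketch in TF 1.x graph mode; the variable and loss stand in for a real model:

import tensorflow as tf
from tensorflow import keras

w = keras.backend.variable([1.0, 2.0])
loss = tf.reduce_sum(tf.square(w))

opt = keras.optimizers.SGD(lr=0.1, clipnorm=1.0)
grads = opt.get_gradients(loss, [w])  # one gradient tensor per variable,
                                      # with clipnorm/clipvalue already applied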
tf.keras.optimizers.Optimizer.get_updates
get_updates(
    loss,
    params
)

Returns the list of update operations that apply one optimization step to params in order to minimize loss. Subclasses implement their actual update rule by overriding this method.
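A minimal sketch of running the returned operations directly in a TF 1.x session (normally Keras runs them for you during model training):

import tensorflow as tf
from tensorflow import keras

w = keras.backend.variable([1.0, 2.0])
loss = tf.reduce_sum(tf.square(w))
opt = keras.optimizers.SGD(lr=0.1)

updates = opt.get_updates(loss, [w])  # assign ops for one training step
with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    sess.run(updates)                 # applies one SGD step to w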
tf.keras.optimizers.Optimizer.get_weights
get_weights()
Returns the current value of the weights of the optimizer.
Returns:
A list of numpy arrays.
tf.keras.optimizers.Optimizer.set_weights
set_weights(weights)
Sets the weights of the optimizer from Numpy arrays.
Should only be called after computing the gradients (otherwise the optimizer has no weights).
Arguments:
weights: a list of Numpy arrays. The number of arrays and their shapes must match those of the optimizer's weights (i.e. it should match the output of get_weights).
Raises:
ValueError: in case of incompatible weight shapes.
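A minimal sketch of transferring optimizer state with get_weights and set_weights, assuming TF 1.x graph mode; get_updates is called first on each optimizer so that its weight variables exist:

import tensorflow as tf
from tensorflow import keras

w = keras.backend.variable([1.0, 2.0])
loss = tf.reduce_sum(tf.square(w))

opt = keras.optimizers.SGD(lr=0.1, momentum=0.9)
opt.get_updates(loss, [w])        # creates the optimizer's weight variables

state = opt.get_weights()         # list of numpy arrays

new_opt = keras.optimizers.SGD(lr=0.1, momentum=0.9)
new_opt.get_updates(loss, [w])    # build matching weights first
new_opt.set_weights(state)        # restore the saved state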