tf.contrib.estimator.clip_gradients_by_norm(
    optimizer,
    clip_norm
)
Returns an optimizer which clips gradients before applying them.
Example:
optimizer = tf.train.ProximalAdagradOptimizer(
    learning_rate=0.1,
    l1_regularization_strength=0.001)
optimizer = tf.contrib.estimator.clip_gradients_by_norm(
    optimizer, clip_norm)
estimator = tf.estimator.DNNClassifier(
    feature_columns=[...],
    hidden_units=[1024, 512, 256],
    optimizer=optimizer)
Args:
optimizer: A tf.Optimizer object to apply gradients.
clip_norm: A 0-D (scalar) Tensor > 0. The clipping ratio.
Returns:
A tf.Optimizer.