Class ScipyOptimizerInterface

Inherits From: `ExternalOptimizerInterface`

Defined in `tensorflow/contrib/opt/python/training/external_optimizer.py`.

Wrapper allowing `scipy.optimize.minimize` to operate a `tf.Session`.
Example:

```python
import tensorflow as tf
from tensorflow.contrib.opt import ScipyOptimizerInterface

vector = tf.Variable([7., 7.], name='vector')
# Make vector norm as small as possible.
loss = tf.reduce_sum(tf.square(vector))
optimizer = ScipyOptimizerInterface(loss, options={'maxiter': 100})

with tf.Session() as session:
  session.run(tf.global_variables_initializer())
  optimizer.minimize(session)
  # The value of vector should now be [0., 0.].
```
Example with simple bound constraints:

```python
import numpy as np

vector = tf.Variable([7., 7.], name='vector')
# Make vector norm as small as possible.
loss = tf.reduce_sum(tf.square(vector))
optimizer = ScipyOptimizerInterface(
    loss, var_to_bounds={vector: ([1, 2], np.infty)})

with tf.Session() as session:
  session.run(tf.global_variables_initializer())
  optimizer.minimize(session)
  # The value of vector should now be [1., 2.].
```
Example with more complicated constraints:

```python
vector = tf.Variable([7., 7.], name='vector')
# Make vector norm as small as possible.
loss = tf.reduce_sum(tf.square(vector))
# Ensure the vector's y component equals 1.
equalities = [vector[1] - 1.]
# Ensure the vector's x component is >= 1.
inequalities = [vector[0] - 1.]
# The default SciPy optimization algorithm, L-BFGS-B, does not support
# general constraints, so we use SLSQP instead.
optimizer = ScipyOptimizerInterface(
    loss, equalities=equalities, inequalities=inequalities, method='SLSQP')

with tf.Session() as session:
  session.run(tf.global_variables_initializer())
  optimizer.minimize(session)
  # The value of vector should now be [1., 1.].
```
__init__

```python
__init__(
    loss,
    var_list=None,
    equalities=None,
    inequalities=None,
    var_to_bounds=None,
    **optimizer_kwargs
)
```
Initialize a new interface instance.
Args:

- `loss`: A scalar `Tensor` to be minimized.
- `var_list`: Optional `list` of `Variable` objects to update to minimize `loss`. Defaults to the list of variables collected in the graph under the key `GraphKeys.TRAINABLE_VARIABLES`.
- `equalities`: Optional `list` of equality constraint scalar `Tensor`s to be held equal to zero.
- `inequalities`: Optional `list` of inequality constraint scalar `Tensor`s to be held nonnegative.
- `var_to_bounds`: Optional `dict` where each key is an optimization `Variable` and each corresponding value is a length-2 tuple of `(low, high)` bounds. Although enforcing this kind of simple constraint could be accomplished with the `inequalities` arg, not all optimization algorithms support general inequality constraints, e.g. L-BFGS-B. Both `low` and `high` can either be numbers or anything convertible to a NumPy array that can be broadcast to the shape of `var` (using `np.broadcast_to`). To indicate that there is no bound, use `None` (or `+/- np.infty`). For example, if `var` is a 3x2 matrix, then any of the following corresponding `bounds` could be supplied (see the sketch after this list):
  - `(0, np.infty)`: Each element of `var` held nonnegative.
  - `(-np.infty, [1, 2])`: First column less than 1, second column less than 2.
  - `(-np.infty, [[1], [2], [3]])`: First row less than 1, second row less than 2, etc.
  - `(-np.infty, [[1, 2], [3, 4], [5, 6]])`: Entry `var[0, 0]` less than 1, `var[0, 1]` less than 2, etc.
- `**optimizer_kwargs`: Other subclass-specific keyword arguments.
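A minimal sketch of the bounds broadcasting described above, using a hypothetical 3x2 variable `matrix` (the variable, its initial value, and its loss are illustrative, not part of the API):

```python
import tensorflow as tf

# Hypothetical 3x2 variable whose entries the loss pulls toward 5.
matrix = tf.Variable(tf.fill([3, 2], 7.), name='matrix')
loss = tf.reduce_sum(tf.square(matrix - 5.))

# The scalar low bound 0 broadcasts to every entry; the high bound [1, 2]
# broadcasts across the three rows, capping the first column at 1 and the
# second column at 2.
optimizer = ScipyOptimizerInterface(
    loss, var_to_bounds={matrix: (0, [1, 2])})

with tf.Session() as session:
  session.run(tf.global_variables_initializer())
  optimizer.minimize(session)
  # The value of matrix should now be [[1., 2.], [1., 2.], [1., 2.]].
```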
Methods

tf.contrib.opt.ScipyOptimizerInterface.minimize

```python
minimize(
    session=None,
    feed_dict=None,
    fetches=None,
    step_callback=None,
    loss_callback=None,
    **run_kwargs
)
```

Minimize a scalar `Tensor`.
Variables subject to optimization are updated in-place at the end of optimization.
Note that this method does not just return a minimization `Op`, unlike `Optimizer.minimize()`; instead it actually performs minimization by executing commands to control a `Session`.
Args:

- `session`: A `Session` instance.
- `feed_dict`: A feed dict to be passed to calls to `session.run`.
- `fetches`: A list of `Tensor`s to fetch and supply to `loss_callback` as positional arguments.
- `step_callback`: A function to be called at each optimization step; arguments are the current values of all optimization variables flattened into a single vector.
- `loss_callback`: A function to be called every time the loss and gradients are computed, with evaluated fetches supplied as positional arguments (see the sketch after this list).
- `**run_kwargs`: kwargs to pass to `session.run`.
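A minimal sketch of `fetches` and the two callbacks, based only on the argument descriptions above (the callback names and bodies are illustrative):

```python
import tensorflow as tf
from tensorflow.contrib.opt import ScipyOptimizerInterface

vector = tf.Variable([7., 7.], name='vector')
loss = tf.reduce_sum(tf.square(vector))
optimizer = ScipyOptimizerInterface(loss, options={'maxiter': 100})

def loss_callback(loss_value, vector_value):
  # Receives the evaluated `fetches` as positional arguments each time the
  # loss and gradients are computed.
  print('loss: %g, vector: %s' % (loss_value, vector_value))

def step_callback(packed_values):
  # Receives the current values of all optimization variables, flattened
  # into a single vector, once per optimization step.
  print('variables after step: %s' % packed_values)

with tf.Session() as session:
  session.run(tf.global_variables_initializer())
  optimizer.minimize(session,
                     fetches=[loss, vector],
                     loss_callback=loss_callback,
                     step_callback=step_callback)
```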