tf.contrib.constrained_optimization.AdditiveSwapRegretOptimizer

Class AdditiveSwapRegretOptimizer

Defined in tensorflow/contrib/constrained_optimization/python/swap_regret_optimizer.py.

A ConstrainedOptimizer based on swap-regret minimization.

This ConstrainedOptimizer uses the given tf.train.Optimizers to jointly minimize over the model parameters, and maximize over the constraint/objective weight matrix (the analogue of Lagrange multipliers), with the latter maximization using additive updates and an algorithm that minimizes swap regret.

For more specifics, please refer to:

Cotter, Jiang and Sridharan. "Two-Player Games for Efficient Non-Convex Constrained Optimization". https://arxiv.org/abs/1804.06500

The formulation used by this optimizer can be found in Definition 2, and is discussed in Section 4. It is most similar to Algorithm 2 in Section 4, with the differences being that it uses tf.train.Optimizers, instead of SGD, for the "inner" updates, and performs additive (instead of multiplicative) updates of the stochastic matrix.
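The "additive update of a stochastic matrix" idea can be pictured as gradient steps on a column-stochastic weight matrix, with each column re-projected onto the probability simplex, and the weights over the objective and constraints read off as a fixed point of that matrix. The following numpy sketch illustrates only that idea; the function names, the choice of Euclidean simplex projection, and the power iteration are illustrative assumptions, not the library's implementation:

```python
import numpy as np

def project_columns_to_simplex(m):
    """Euclidean projection of each column onto the probability simplex,
    so the result is a column-stochastic matrix."""
    out = np.empty_like(m)
    for j in range(m.shape[1]):
        v = np.sort(m[:, j])[::-1]                      # sort descending
        css = np.cumsum(v) - 1.0
        rho = np.nonzero(v - css / np.arange(1, len(v) + 1) > 0)[0][-1]
        theta = css[rho] / (rho + 1.0)
        out[:, j] = np.maximum(m[:, j] - theta, 0.0)
    return out

def stationary_distribution(m, iters=100):
    """Fixed point of a column-stochastic matrix: the distribution over
    the objective and the constraints that the outer player plays."""
    d = np.full(m.shape[0], 1.0 / m.shape[0])
    for _ in range(iters):
        d = m @ d
    return d

def additive_swap_regret_update(m, gradient, step_size):
    """One additive outer step: gradient ascent on the matrix, then
    re-projection so every column remains a distribution."""
    return project_columns_to_simplex(m + step_size * gradient)
```

This contrasts with the multiplicative variant, which would scale matrix entries by exponentiated gradients instead of adding and re-projecting.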

__init__

__init__(
    optimizer,
    constraint_optimizer=None
)

Constructs a new AdditiveSwapRegretOptimizer.

Args:

  • optimizer: tf.train.Optimizer, used to optimize the objective and proxy_constraints portion of ConstrainedMinimizationProblem. If constraint_optimizer is not provided, this will also be used to optimize the Lagrange multiplier analogues.
  • constraint_optimizer: optional tf.train.Optimizer, used to optimize the Lagrange multiplier analogues.

Returns:

A new AdditiveSwapRegretOptimizer.

Properties

constraint_optimizer

Returns the tf.train.Optimizer used for the matrix.

optimizer

Returns the tf.train.Optimizer used for optimization.

Methods

tf.contrib.constrained_optimization.AdditiveSwapRegretOptimizer.minimize

minimize(
    minimization_problem,
    unconstrained_steps=None,
    global_step=None,
    var_list=None,
    gate_gradients=tf.train.Optimizer.GATE_OP,
    aggregation_method=None,
    colocate_gradients_with_ops=False,
    name=None,
    grad_loss=None
)

Returns an Operation for minimizing the constrained problem.

This method combines the functionality of minimize_unconstrained and minimize_constrained. If global_step < unconstrained_steps, it will perform an unconstrained update, and if global_step >= unconstrained_steps, it will perform a constrained update.

The reason for this functionality is that it may be best to initialize the constrained optimizer with an approximate optimum of the unconstrained problem.

Args:

  • minimization_problem: ConstrainedMinimizationProblem, the problem to optimize.
  • unconstrained_steps: optional int, the number of steps for which unconstrained updates should be performed, before transitioning to constrained updates.
  • global_step: as in tf.train.Optimizer's minimize method.
  • var_list: as in tf.train.Optimizer's minimize method.
  • gate_gradients: as in tf.train.Optimizer's minimize method.
  • aggregation_method: as in tf.train.Optimizer's minimize method.
  • colocate_gradients_with_ops: as in tf.train.Optimizer's minimize method.
  • name: as in tf.train.Optimizer's minimize method.
  • grad_loss: as in tf.train.Optimizer's minimize method.

Returns:

Operation, the train_op.

Raises:

  • ValueError: If unconstrained_steps is provided, but global_step is not.
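The dispatch between the two update types can be sketched in plain Python as follows. `choose_update` is a hypothetical helper name, and the real minimize method builds this branch as a graph conditional rather than branching in Python:

```python
def choose_update(global_step, unconstrained_steps):
    """Return which kind of update minimize() would perform at this step.

    Illustrative sketch only: mirrors the documented behavior
    (warm-start with unconstrained updates, then switch) and the
    documented ValueError.
    """
    if unconstrained_steps is None:
        # No warm-start phase requested: always perform constrained updates.
        return "constrained"
    if global_step is None:
        # Matches the Raises clause above.
        raise ValueError("unconstrained_steps requires global_step")
    if global_step < unconstrained_steps:
        return "unconstrained"
    return "constrained"
```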

tf.contrib.constrained_optimization.AdditiveSwapRegretOptimizer.minimize_constrained

minimize_constrained(
    minimization_problem,
    global_step=None,
    var_list=None,
    gate_gradients=tf.train.Optimizer.GATE_OP,
    aggregation_method=None,
    colocate_gradients_with_ops=False,
    name=None,
    grad_loss=None
)

Returns an Operation for minimizing the constrained problem.

Unlike minimize_unconstrained, this function attempts to find a solution that minimizes the objective portion of the minimization problem while satisfying the constraints portion.

Args:

  • minimization_problem: ConstrainedMinimizationProblem, the problem to optimize.
  • global_step: as in tf.train.Optimizer's minimize method.
  • var_list: as in tf.train.Optimizer's minimize method.
  • gate_gradients: as in tf.train.Optimizer's minimize method.
  • aggregation_method: as in tf.train.Optimizer's minimize method.
  • colocate_gradients_with_ops: as in tf.train.Optimizer's minimize method.
  • name: as in tf.train.Optimizer's minimize method.
  • grad_loss: as in tf.train.Optimizer's minimize method.

Returns:

Operation, the train_op.

tf.contrib.constrained_optimization.AdditiveSwapRegretOptimizer.minimize_unconstrained

minimize_unconstrained(
    minimization_problem,
    global_step=None,
    var_list=None,
    gate_gradients=tf.train.Optimizer.GATE_OP,
    aggregation_method=None,
    colocate_gradients_with_ops=False,
    name=None,
    grad_loss=None
)

Returns an Operation for minimizing the unconstrained problem.

Unlike minimize_constrained, this function entirely ignores the constraints (and proxy_constraints) portion of the minimization problem, and minimizes only the objective.

Args:

  • minimization_problem: ConstrainedMinimizationProblem, the problem to optimize.
  • global_step: as in tf.train.Optimizer's minimize method.
  • var_list: as in tf.train.Optimizer's minimize method.
  • gate_gradients: as in tf.train.Optimizer's minimize method.
  • aggregation_method: as in tf.train.Optimizer's minimize method.
  • colocate_gradients_with_ops: as in tf.train.Optimizer's minimize method.
  • name: as in tf.train.Optimizer's minimize method.
  • grad_loss: as in tf.train.Optimizer's minimize method.

Returns:

Operation, the train_op.