OpFromGraph
This page describes theano.OpFromGraph, an Op that allows one to encapsulate a Theano graph in an op.

This can be used to encapsulate some functionality in one block. It is useful for scaling Theano compilation of large graphs when the encapsulated functionality is reused many times with different inputs. Because of this encapsulation, it can make the Theano compilation phase faster for graphs with many nodes.

Using this for small graphs is not recommended, as it disables optimizations between what is inside the encapsulation and what is outside of it.
class theano.compile.builders.OpFromGraph(inputs, outputs, **kwargs)

    This creates an Op from lists of input and output variables.
    The signature is similar to theano.function(), and the resulting Op's perform will do the same operation as:

        orig_function(inputs, outputs, **kwargs)
- TODO:
- example for a multi-layer MLP. Where?
- __hash__ and __eq__, otherwise Ops won't merge; try gof.opt.is_same_graph_with_merge(op1.new_outputs, op2.new_outputs)
- c_code() to remove the double overhead?
- optimization to unfold it and work inplace on inputs
- grad(): make it support DisconnectedType and the new interface
- check how it works with updates.
- add a test with a constant as input or inside the inner graph.
- Add support for the GPU? Probably just needs an optimization to remove the transfer.
- Add support to pickle this Op.
- Add support/tests with a random generator.
Notes
- We support shared variables in the inner graph. This is automatic and invisible to the user. They can be used as inputs to the node or inside the inner graph.
- We support unused inputs. This is needed for the grad.
Examples
Example 1:
from theano import function, OpFromGraph, tensor

x, y, z = tensor.scalars('xyz')
e = x + y * z
op = OpFromGraph([x, y, z], [e])
# op behaves like a normal theano op
e2 = op(x, y, z) + op(z, y, x)
fn = function([x, y, z], [e2])
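Since the op computes e = x + y * z at each call site, fn here simply evaluates (x + y * z) + (z + y * x). A NumPy-only sketch of that arithmetic (it checks the expected values without requiring Theano; e2_ref is a hypothetical helper name):

```python
def e2_ref(x, y, z):
    # Mirrors the graph above: op(x, y, z) + op(z, y, x),
    # where op computes e = x + y * z.
    return (x + y * z) + (z + y * x)

# For x=1, y=2, z=3: (1 + 2*3) + (3 + 2*1) = 7 + 5 = 12
print(e2_ref(1.0, 2.0, 3.0))  # → 12.0
```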
Example 2 with shared variable:
import numpy
import theano
from theano import config, function, OpFromGraph, tensor

x, y, z = tensor.scalars('xyz')
s = theano.shared(numpy.random.rand(2, 2).astype(config.floatX))
e = x + y * z + s
op = OpFromGraph([x, y, z], [e])
# op behaves like a normal theano op
e2 = op(x, y, z) + op(z, y, x)
fn = function([x, y, z], [e2])
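Because the shared variable s has shape (2, 2), the scalar part of e broadcasts against it, so fn returns a (2, 2) array equal to 12 plus twice the shared value (for inputs 1, 2, 3). A NumPy-only sketch of that arithmetic, with s_val standing in for the shared value (names here are illustrative, not part of the API):

```python
import numpy as np

rng = np.random.RandomState(0)
s_val = rng.rand(2, 2)  # stands in for the shared variable s

def e2_ref(x, y, z):
    # op computes e = x + y * z + s; the scalar part broadcasts over s,
    # and s is added once per call site: (7 + s) + (5 + s) = 12 + 2*s.
    return (x + y * z + s_val) + (z + y * x + s_val)

out = e2_ref(1.0, 2.0, 3.0)
print(out.shape)  # → (2, 2)
```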