graph – Interface for the Theano graph

Reference

Node classes (Apply, Variable) and expression graph algorithms.
To read about what Theano graphs are from a user perspective, have a look at graph.html.
class theano.gof.graph.Apply(op, inputs, outputs)

An Apply instance is a node in an expression graph which represents the application of an Op to some input Variable nodes, producing some output Variable nodes.
This class is typically instantiated by an Op's make_node() method, which is in turn called by that Op's __call__() method.
An Apply instance serves as a simple structure with three important attributes:
- inputs: a list of Variable nodes that represent the arguments of the expression,
- outputs: a list of Variable nodes that represent the outputs of the expression, and
- op: an Op instance that determines the nature of the expression being applied.
The driver compile.function uses Apply’s inputs attribute together with Variable’s owner attribute to search the expression graph and determine which inputs are necessary to compute the function’s outputs.
A Linker uses the Apply instance’s op field to compute the variables.
By analogy with the Python language, an Apply instance is Theano's version of a function call (or expression instance), whereas an Op is Theano's version of a function definition.
Parameters:
- op (Op instance)
- inputs (list of Variable instances)
- outputs (list of Variable instances)
Notes
The owner field of each output in the outputs list will be set to self.
If an output element has an owner that is neither None nor self, then a ValueError exception will be raised.
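For illustration, here is a minimal sketch (assuming theano.tensor is available to build a small expression) of how an Apply node can be inspected through its op, inputs and outputs attributes:

import theano.tensor as T

x = T.dscalar('x')
y = T.dscalar('y')
z = x + y                 # building the expression creates an Apply node

node = z.owner            # the Apply instance that produced z
print(node.op)            # the Op being applied (an elementwise addition)
print(node.inputs)        # [x, y]
print(node.outputs)       # [z]; the owner field of z points back to node
assert z.owner is node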
clone()

Duplicate this Apply instance with inputs = self.inputs.

Returns: A new Apply instance (or subclass instance) with new outputs.
Return type: object

Notes
Tags are copied from self to the returned instance.
clone_with_new_inputs(inputs, strict=True)

Duplicate this Apply instance in a new graph.

Parameters:
- inputs – List of Variable instances to use as inputs.
- strict (bool) – If True, the type fields of all the inputs must be equal to the current ones (or compatible, for instance Tensor / CudaNdarray of the same dtype and broadcastable patterns, in which case they will be converted into current Type), and returned outputs are guaranteed to have the same types as self.outputs. If False, then there’s no guarantee that the clone’s outputs will have the same types as self.outputs, and cloning may not even be possible (it depends on the Op).
Returns: An Apply instance with the same op but different outputs.
Return type: object
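As a brief sketch, a typical use of clone_with_new_inputs is to rebuild the same operation on substitute inputs of matching types (the variable names below are illustrative only):

import theano.tensor as T

x = T.dvector('x')
y = T.dvector('y')
z = x + y

a = T.dvector('a')        # same type as x
# Duplicate the Apply node that computes z, replacing x with a.
new_node = z.owner.clone_with_new_inputs([a, y], strict=True)
new_z = new_node.outputs[0]   # same op, fresh output Variable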
default_output()

Returns the default output for this node.

Returns: An element of self.outputs, typically self.outputs[0].
Return type: Variable instance

Notes
May raise AttributeError if self.op.default_output is out of range, or if there are multiple outputs and self.op.default_output does not exist.
nin
Property: Number of inputs.

nout
Property: Number of outputs.

out
Alias for self.default_output().

params_type
The Type to use for this node's params.

run_params()
Returns the params for the node, or NoParams if no params is set.
class theano.gof.graph.Constant(type, data, name=None)

A Constant is a Variable with a value field that cannot be changed at runtime.

Constant nodes enable numerous optimizations: constant inlining in C code, constant folding, etc.

Notes
The data provided to the constructor is filtered by the Constant's type field (i.e., validated and converted by that Type).
WRITEME
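A small sketch of how a Constant behaves in practice, assuming theano.tensor.constant (which builds a TensorConstant, a Constant subclass):

import theano.tensor as T

c = T.constant(2.5, name='c')   # a Constant Variable
print(c.data)                   # 2.5, the value fixed at construction time
assert c.owner is None          # Constants are graph roots
d = c + T.dscalar('b')          # Constants can be used like any other Variable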
clone()

Clone this object, but do not clone the data, in order to lower the memory requirement. The data is assumed to never change.

value
Read-only access to the data.
class theano.gof.graph.Node

A Node in a Theano graph.

Graphs contain two kinds of Nodes: Variable and Apply. Edges in the graph are not explicitly represented. Instead each Node keeps track of its parents via Variable.owner / Apply.inputs and of its children via Variable.clients / Apply.outputs.
get_parents()

Return a list of the parents of this node. Should return a copy, i.e., modifying the return value should not modify the graph structure.
class theano.gof.graph.Variable(type, owner=None, index=None, name=None)

A Variable is a node in an expression graph that represents a variable.
The inputs and outputs of every Apply (theano.gof.Apply) are Variable instances. The input and output arguments to create a function are also Variable instances. A Variable is like a strongly-typed variable in some other languages; each Variable contains a reference to a Type instance that defines the kind of value the Variable can take in a computation.
A Variable is a container for four important attributes:
- type: a Type instance defining the kind of value this Variable can have,
- owner: either None (for graph roots) or the Apply instance of which self is an output,
- index: the integer such that owner.outputs[index] is this Variable (ignored if owner is None),
- name: a string to use in pretty-printing and debugging.
There are a few kinds of Variables to be aware of: A Variable which is the output of a symbolic computation has a reference to the Apply instance to which it belongs (property: owner) and to its own position in the owner's output list (property: index).
- Variable (this base type) is typically the output of a symbolic computation.
- Constant: a subclass which adds a default and un-replaceable value, and requires that owner is None.
- TensorVariable: a subclass of Variable that represents a numpy.ndarray object.
- TensorSharedVariable: the shared version of TensorVariable.
- SparseVariable: a subclass of Variable that represents a scipy.sparse.{csc,csr}_matrix object.
- CudaNdarrayVariable: a subclass of Variable that represents our object on the GPU; it is a subset of numpy.ndarray.
- RandomVariable.
A Variable which is the output of a symbolic computation will have an owner not equal to None.
Using the Variables' owner field and the Apply nodes' inputs fields, one can navigate a graph from an output all the way to the inputs. The opposite direction is not possible until a FunctionGraph has annotated the Variables with the clients field; i.e., before the compilation process has begun, a Variable does not know which Apply nodes take it as input.
Parameters:
- type (a Type instance) – The type governs the kind of data that can be associated with this variable.
- owner (None or Apply instance) – The Apply instance which computes the value for this variable.
- index (None or int) – The position of this Variable in owner.outputs.
- name (None or str) – A string for pretty-printing and debugging.
Examples
import theano
from theano import tensor

a = tensor.constant(1.5)        # declare a symbolic constant
b = tensor.fscalar()            # declare a symbolic floating-point scalar
c = a + b                       # create a simple expression

f = theano.function([b], [c])   # this works because a has a value associated with it already
assert 4.0 == f(2.5)            # bind 2.5 to an internal copy of b and evaluate an internal c

theano.function([a], [c])       # compilation error because b (required by c) is undefined
theano.function([a, b], [c])    # compilation error because a is constant, it can't be an input

d = tensor.value(1.5)           # create a value similar to the constant 'a'
e = d + b
theano.function([d, b], [e])    # this works. d's default value of 1.5 is ignored.
The Python variables a, b and c all refer to instances of type Variable. The Variable referred to by a is also an instance of Constant.

compile.function uses each Apply instance's inputs attribute together with each Variable's owner field to determine which inputs are necessary to compute the function's outputs.
clone()

Return a new Variable like self.

Returns: A new Variable instance (or subclass instance) with no owner or index.
Return type: Variable instance

Notes
Tags are copied to the returned instance.
Name is copied to the returned instance.
eval(inputs_to_values=None)

Evaluates this variable.

Parameters: inputs_to_values – A dictionary mapping Theano Variables to values.

Examples

>>> import theano.tensor as T
>>> x = T.dscalar('x')
>>> y = T.dscalar('y')
>>> z = x + y
>>> z.eval({x: 16.3, y: 12.1})
array(28.4)
We passed eval() a dictionary mapping symbolic Theano variables to the values to substitute for them, and it returned the numerical value of the expression.

Notes

eval will be slow the first time you call it on a variable: it needs to call function() to compile the expression behind the scenes. Subsequent calls to eval() on that same variable will be fast, because the variable caches the compiled function. This way of computing has more overhead than a normal Theano function, so don't use it too much in real scripts.
theano.gof.graph.ancestors(variable_list, blockers=None)

Return the variables that contribute to those in variable_list (inclusive).

Parameters: variable_list (list of Variable instances) – Output Variable instances from which to search backward through owners.
Returns: All contributing Variables (including those in variable_list), in the order found by a left-recursive depth-first search started at the nodes in variable_list.
Return type: list of Variable instances
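A short sketch of what ancestors returns for a small expression (the comment describes the expected content of the result):

import theano.tensor as T
from theano.gof import graph

x = T.dscalar('x')
y = T.dscalar('y')
z = (x + y) * 2

# All Variables that contribute to z: x, y, the constant 2,
# the intermediate sum (x + y), and z itself.
print(graph.ancestors([z]))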
theano.gof.graph.as_string(i, o, leaf_formatter=str, node_formatter=default_node_formatter)

WRITEME

Returns: A string representation of the subgraph between i and o. If the same op is used by several other ops, the first occurrence will be marked as *n -> description and all subsequent occurrences will be marked as *n, where n is an id number (ids are attributed in an unspecified order and only exist for viewing convenience).
Return type: str
theano.gof.graph.clone(i, o, copy_inputs=True)

Copies the subgraph contained between i and o.

Parameters:
- i (list) – Input Variables.
- o (list) – Output Variables.
- copy_inputs (bool) – If True, the inputs will be copied (defaults to True).
Returns: The inputs and outputs of that copy.
Return type: object
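A minimal sketch of cloning a subgraph between given inputs and outputs:

import theano.tensor as T
from theano.gof import graph

x = T.dscalar('x')
y = T.dscalar('y')
z = x + y

new_inputs, new_outputs = graph.clone([x, y], [z])
# new_outputs[0] computes the same expression as z, but is a distinct
# Variable built on copies of x and y (since copy_inputs defaults to True).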
theano.gof.graph.clone_get_equiv(inputs, outputs, copy_inputs_and_orphans=True, memo=None)

Return a dictionary that maps from Variable and Apply nodes in the original graph to a new node (a clone) in a new graph.

This function works by recursively cloning inputs... rebuilding a directed graph from the bottom (inputs) up to eventually building new outputs.

Parameters:
- inputs (a list of Variables)
- outputs (a list of Variables)
- copy_inputs_and_orphans (bool) – True means to create the cloned graph from new input and constant nodes (the bottom of a feed-upward graph). False means to clone a graph that is rooted at the original input nodes.
- memo (None or dict) – Optionally start with a partly-filled dictionary for the return value. If a dictionary is passed, this function will work in-place on that dictionary and return it.
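A short sketch showing how the returned dictionary maps original nodes to their clones:

import theano.tensor as T
from theano.gof import graph

x = T.dscalar('x')
y = T.dscalar('y')
z = x + y

equiv = graph.clone_get_equiv([x, y], [z])
new_z = equiv[z]          # the cloned counterpart of z
new_x = equiv[x]          # the cloned counterpart of x (a new input node)
assert new_z is not z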
theano.gof.graph.general_toposort(r_out, deps, debug_print=False, compute_deps_cache=None, deps_cache=None)

WRITEME

Parameters:
- deps – A Python function that takes a node as input and returns its dependencies.
- compute_deps_cache (optional) – If provided, deps_cache should also be provided. This is a function like deps, but that also caches its results in a dict passed as deps_cache.
- deps_cache (dict) – Must be used with compute_deps_cache.
Notes
deps(i) should behave like a pure function (no funny business with internal state).
deps(i) will be cached by this function (to be fast).
The order of the return value list is determined by the order of nodes returned by the deps() function.
Either deps should be provided, or deps can be None and the caller must provide compute_deps_cache and deps_cache. The second option removes a Python function call, and allows for more specialized code, so it can be faster.
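general_toposort is not restricted to Theano graphs: any hashable nodes together with a deps function will do. A toy sketch (the node names and dependency dict are purely illustrative):

from theano.gof import graph

# A toy dependency structure: each node maps to the nodes it depends on.
dependencies = {'a': [], 'b': ['a'], 'c': ['a', 'b']}

order = graph.general_toposort(['c'], deps=lambda n: dependencies[n])
print(order)   # a valid topological order, e.g. ['a', 'b', 'c']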
theano.gof.graph.inputs(variable_list, blockers=None)

Return the inputs required to compute the given Variables.

Parameters: variable_list (list of Variable instances) – Output Variable instances from which to search backward through owners.
Returns: Input nodes with no owner, in the order found by a left-recursive depth-first search started at the nodes in variable_list.
Return type: list of Variable instances
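A small sketch contrasting inputs with ancestors: only owner-less Variables are returned.

import theano.tensor as T
from theano.gof import graph

x = T.dscalar('x')
y = T.dscalar('y')
z = (x + y) * 2

# Variables with no owner that z depends on: x, y and the constant 2.
print(graph.inputs([z]))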
theano.gof.graph.io_connection_pattern(inputs, outputs)

Returns the connection pattern of a subgraph defined by the given inputs and outputs.
theano.gof.graph.io_toposort(inputs, outputs, orderings=None)

WRITEME

Parameters:
- inputs (list or tuple of Variable instances)
- outputs (list or tuple of Variable instances)
- orderings (dict) – Key: Apply instance. Value: list of Apply instances. It is important that the value be a container with a deterministic iteration order. No sets allowed!
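A minimal sketch of using io_toposort to walk the Apply nodes of a small graph in a computable order:

import theano.tensor as T
from theano.gof import graph

x = T.dscalar('x')
y = T.dscalar('y')
z = (x + y) * x

# Apply nodes between the inputs and the outputs, ordered so that every
# node appears after the nodes it depends on.
for node in graph.io_toposort([x, y], [z]):
    print(node.op)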
theano.gof.graph.is_same_graph(var1, var2, givens=None, debug=False)

Return True iff Variables var1 and var2 perform the same computation.
By ‘performing the same computation’, we mean that they must share the same graph, so that for instance this function will return False when comparing (x * (y * z)) with ((x * y) * z).
The current implementation is not efficient since, when possible, it verifies equality by calling two different functions that are expected to return the same output. The goal is to verify this assumption, to eventually get rid of one of them in the future.
Parameters:
- var1 – The first Variable to compare.
- var2 – The second Variable to compare.
- givens – Similar to the givens argument of theano.function, it can be used to perform substitutions in the computational graph of var1 and var2. This argument is associated to neither var1 nor var2: substitutions may affect both graphs if the substituted variable is present in both.
- debug (bool) – If True, then an exception is raised when we are in a situation where the equal_computations implementation cannot be called. This parameter is intended to be used in tests only, to make sure we properly test both implementations.
Examples
var1     var2     givens    output
x + 1    x + 1    {}        True
x + 1    y + 1    {}        False
x + 1    y + 1    {x: y}    True
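The same cases as the table above, written out as a short sketch:

import theano.tensor as T
from theano.gof.graph import is_same_graph

x = T.dscalar('x')
y = T.dscalar('y')

assert is_same_graph(x + 1, x + 1)
assert not is_same_graph(x + 1, y + 1)
assert is_same_graph(x + 1, y + 1, givens={x: y})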
theano.gof.graph.list_of_nodes(inputs, outputs)

Return the Apply nodes of the graph between inputs and outputs.
theano.gof.graph.op_as_string(i, op, leaf_formatter=str, node_formatter=default_node_formatter)

WRITEME
theano.gof.graph.ops(i, o)

WRITEME

Parameters:
- i (list) – Input Variables.
- o (list) – Output Variables.
Returns: The set of ops that are contained within the subgraph that lies between i and o, including the owners of the Variables in o and intermediary ops between i and o, but not the owners of the Variables in i.
Return type: object
theano.gof.graph.orphans(i, o)

WRITEME

Parameters:
- i (list) – Input Variables.
- o (list) – Output Variables.
Returns: The set of Variables which one or more Variables in o depend on but are neither in i nor in the subgraph that lies between i and o.
Return type: object
Examples
orphans([x], [(x+y).out]) => [y]
theano.gof.graph.stack_search(start, expand, mode='bfs', build_inv=False)

Search through a graph, either breadth- or depth-first.

Parameters:
- start (deque) – Search from these nodes.
- expand (callable) – When we get to a node, add expand(node) to the list of nodes to visit. This function should return a list, or None.
Returns: The list of nodes in order of traversal.
Return type: list of Variable or Apply instances (depends on expand)
Notes
A node will appear at most once in the return value, even if it appears multiple times in the start parameter.
Postcondition: every element of start is transferred to the returned list.
Postcondition: start is empty.
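A sketch of driving stack_search by hand, using an expand function that walks from a Variable to the inputs of its owner (this is essentially how ancestors is built on top of stack_search):

from collections import deque
import theano.tensor as T
from theano.gof import graph

x = T.dscalar('x')
y = T.dscalar('y')
z = (x + y) * x

# Expand a Variable into the inputs of the Apply node that produced it;
# graph roots (owner is None) return None and are not expanded further.
def expand(r):
    if r.owner is not None:
        return r.owner.inputs

visited = graph.stack_search(deque([z]), expand, mode='dfs')
# visited contains z and every Variable reachable from it, each at most once.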
theano.gof.graph.variables(i, o)

WRITEME

Parameters:
- i (list) – Input Variables.
- o (list) – Output Variables.
Returns: The set of Variables that are involved in the subgraph that lies between i and o. This includes i, o, orphans(i, o) and all values of all intermediary steps from i to o.
Return type: object
theano.gof.graph.variables_and_orphans(i, o)

WRITEME
theano.gof.graph.view_roots(r)

Utility function that returns the leaves of a search through consecutive view_map()s.

WRITEME