symbolic_pymc package

Submodules

symbolic_pymc.dispatch module

symbolic_pymc.dispatch.car_MetaVariable(x)

Return the operator/head/CAR of a meta variable.

symbolic_pymc.dispatch.cdr_MetaVariable(x)

Return the arguments/tail/CDR of a meta variable.

See cdr_MetaSymbol

symbolic_pymc.dispatch.unify_MetaSymbol(u, v, s)

symbolic_pymc.meta module

class symbolic_pymc.meta.MetaOp(*args, **kwargs)

Bases: symbolic_pymc.meta.MetaSymbol

A meta object that represents a MetaVariable-producing operator.

Also, make sure to override output_meta_types so that it returns the expected meta variable type(s), if they aren’t the default: MetaTensorVariable.

In some cases, operators hold their own inputs and outputs (e.g. TensorFlow), and, in others, an intermediary “application” node holds that information. This class leaves those details up to the implementation.

abstract output_meta_types(inputs=None)

Return the types of meta variables this Op is expected to produce given the inputs.
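A toy sketch of this contract in plain Python (these classes are hypothetical stand-ins, not the library’s backend implementations):

>>> from abc import ABC, abstractmethod
>>> class ToyMetaVariable:
...     """A stand-in for a backend-specific meta variable type."""
>>> class ToyMetaOp(ABC):
...     @abstractmethod
...     def output_meta_types(self, inputs=None):
...         """Return the meta variable types produced for the given inputs."""
>>> class ToyAddOp(ToyMetaOp):
...     def output_meta_types(self, inputs=None):
...         # A real backend would inspect the inputs (e.g. their dtypes) here.
...         return (ToyMetaVariable,)
>>> assert ToyAddOp().output_meta_types() == (ToyMetaVariable,)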

reset()

exception symbolic_pymc.meta.MetaReificationError

Bases: Exception

An exception type for errors encountered during the creation of base objects from meta objects.

class symbolic_pymc.meta.MetaSymbol(obj=None)

Bases: object

Meta objects for unification and such.

TODO: Should MetaSymbol.obj be an abstract property and a weakref?

abstract property base

Return the underlying (e.g. a theano/tensorflow) base type/rator for this meta object.

classmethod base_subclasses()

Return all meta symbols with valid, implemented bases (i.e. base property is a type object).

classmethod is_meta(obj)
property obj
property rands

Get a tuple of the meta object’s operator parameters (i.e. “rands”).
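For example, assuming the Theano backend is installed and that importing symbolic_pymc.theano.meta registers it for metatize (the import path and the registration behavior are assumptions, not documented here), a quick sketch of is_meta and rands:

>>> import theano.tensor as tt
>>> import symbolic_pymc.theano.meta  # assumed to register the Theano backend
>>> from symbolic_pymc.meta import MetaSymbol, metatize
>>> x_mt = metatize(tt.vector("x"))
>>> assert MetaSymbol.is_meta(x_mt)  # meta objects are recognized by is_meta
>>> rands = x_mt.rands  # the operator parameters used to construct this meta object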

reify()

Attempt to create a concrete base object from this meta object.

During the process, dependent objects will need to be reified, which may result in updates to the object(s) being reified.

For instance, if a meta tensor’s parent operator is fully reifiable to a base object, then the meta tensor’s dtype and shape may be fixed: e.g. a tensor corresponding to the output of a sum of two float64 scalars is necessarily a float64 scalar.

This function will set any unspecified properties (e.g. the dtype and shape values in the previous example), mutating the object in-place when possible. When full reification succeeds, it returns the base object; otherwise, it returns a [refined/partially reified] meta object, which may be an instance of a subclass when partial reification narrows the type.
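A minimal sketch of reification, again assuming the Theano backend is installed and registered for metatize (both assumptions):

>>> import theano.tensor as tt
>>> import symbolic_pymc.theano.meta  # assumed to register the Theano backend
>>> from symbolic_pymc.meta import metatize
>>> x_mt = metatize(tt.vector("x"))  # meta object wrapping an existing base variable
>>> x_reified = x_mt.reify()  # fully reifiable here, so the base object is returned
>>> assert x_reified is x_mt.obj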

reset()

class symbolic_pymc.meta.MetaSymbolType

Bases: abc.ABCMeta

class symbolic_pymc.meta.MetaVariable(obj=None)

Bases: symbolic_pymc.meta.MetaSymbol

abstract property base_arguments

Return the base-level arguments.

These arguments, used in conjunction with the callable self.base_operator, should reproduce this variable.

abstract property base_operator

Return a meta object representing a base-level operator.

It should be callable with all the inputs necessary to reproduce this variable, as given by self.base_arguments.

reset()

symbolic_pymc.meta.disable_auto_reification()

Stop meta objects from automatically reifying themselves in order to determine unspecified properties.

symbolic_pymc.meta.enable_lvar_defaults(*types)

Use logic variables instead of guessed/inferred values during meta object creation.

This is useful for handling unexpected values, created by default or behind the scenes, in backend base objects (e.g. default names, TF NodeDef attributes, etc.). By using logic variables instead, it’s much easier to create meta object “patterns” when certain types of exactness aren’t necessary.

types: collection of str

String names for the value types that should default to logic variables. Currently allowed values are “names” and “node_attrs” (for TensorFlow).
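A sketch of both helpers, under the assumptions that each is used as a context manager (this page only documents their effects) and that the Theano backend is installed and registered for metatize:

>>> import theano.tensor as tt
>>> import symbolic_pymc.theano.meta  # assumed to register the Theano backend
>>> from symbolic_pymc.meta import disable_auto_reification, enable_lvar_defaults, metatize
>>> with enable_lvar_defaults("names"):
...     # Names default to logic variables, so this meta object can serve as a
...     # "pattern" that matches structurally equivalent graphs regardless of naming.
...     pattern_mt = metatize(tt.vector("x") + tt.vector("y"))
>>> with disable_auto_reification():
...     # Unspecified properties are left alone instead of being filled in by
...     # automatically reifying the underlying base objects.
...     z_mt = metatize(tt.vector("z"))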

symbolic_pymc.meta.meta_reify_iter(rands)

Recursively reify the elements of an iterable and return a boolean indicating whether any of them could not be reified.

symbolic_pymc.meta.metatize(obj)

Convert an object to its base type and then to the corresponding meta object.

symbolic_pymc.utils module

class symbolic_pymc.utils.HashableNDArray

Bases: numpy.ndarray, collections.abc.Hashable

A subclass of NumPy’s ndarray that uses tostring for hashing and array_equal for equality testing.

>>> import numpy as np
>>> from symbolic_pymc.utils import HashableNDArray
>>> x = np.r_[1, 2, 3]
>>> x_new = x.view(HashableNDArray)
>>> assert hash(x_new) == hash(x.tostring())
>>> assert x_new == np.r_[1, 2, 3]
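Continuing the example above (an illustrative addition, not part of the original doctest), this combination of hashing and equality makes the arrays usable as set members or dict keys:

>>> s = {x_new, np.r_[1, 2, 3].view(HashableNDArray)}
>>> assert len(s) == 1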

class symbolic_pymc.utils.UnequalMetaParts(path, reason, objects)

Bases: tuple

Create new instance of UnequalMetaParts(path, reason, objects)

property objects

Alias for field number 2

property path

Alias for field number 0

property reason

Alias for field number 1

symbolic_pymc.utils.eq_lvar(x, y)

Perform an equality check that considers all logic variables equal.

symbolic_pymc.utils.lvar_ignore_ne(x, y)

symbolic_pymc.utils.meta_diff(x, y, pdb=False, ne_fn=operator.ne, cmp_types=True, path=identity)

Traverse meta objects and return information about the first pair of elements that are not equal.

Returns a UnequalMetaParts object containing the object path, reason for being unequal, and the unequal object pair; otherwise, None.
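A sketch, again assuming the Theano backend is installed and registered for metatize (both assumptions):

>>> import theano.tensor as tt
>>> import symbolic_pymc.theano.meta  # assumed to register the Theano backend
>>> from symbolic_pymc.meta import metatize
>>> from symbolic_pymc.utils import meta_diff
>>> x_mt, y_mt = metatize(tt.vector("x")), metatize(tt.vector("y"))
>>> assert meta_diff(x_mt, x_mt) is None  # identical meta objects: no difference
>>> diff = meta_diff(x_mt, y_mt)  # only the variable names differ
>>> unequal = (diff.path, diff.reason, diff.objects)  # where, why, and which parts differed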

symbolic_pymc.utils.meta_diff_seq(x, y, loc, path, is_map=False, **kwargs)

Module contents