altair.FacetEncodingFieldDef

class altair.FacetEncodingFieldDef(type=Undefined, aggregate=Undefined, align=Undefined, bin=Undefined, bounds=Undefined, center=Undefined, columns=Undefined, field=Undefined, header=Undefined, sort=Undefined, spacing=Undefined, timeUnit=Undefined, title=Undefined, **kwds)

FacetEncodingFieldDef schema wrapper

Mapping(required=[type])
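
A minimal construction sketch (assuming altair is importable; the "site" field name is illustrative). The wrapper can be built directly, though it is more commonly created for you by the facet encoding channel (alt.Facet) on a chart:

    import altair as alt

    # Build the schema wrapper directly and inspect its dictionary form
    fdef = alt.FacetEncodingFieldDef(type="nominal", field="site", columns=2)
    fdef.to_dict()  # -> {'columns': 2, 'field': 'site', 'type': 'nominal'} (key order may vary)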

Attributes
type : StandardType

The encoded field’s type of measurement ( "quantitative", "temporal", "ordinal", or "nominal" ). It can also be a "geojson" type for encoding ‘geoshape’.

Note:

  • Data values for a temporal field can be either a date-time string (e.g., "2015-03-07 12:32:17", "17:01", "2015-03-16", "2015" ) or a timestamp number (e.g., 1552199579097 ).

  • Data type describes the semantics of the data rather than the primitive data types (number, string, etc.). The same primitive data type can have different types of measurement. For example, numeric data can represent quantitative, ordinal, or nominal data.

  • When using with bin, the type property can be either "quantitative" (for using a linear bin scale) or "ordinal" (for using an ordinal bin scale).

  • When using with timeUnit, the type property can be either "temporal" (for using a temporal scale) or "ordinal" (for using an ordinal scale).

  • When using with aggregate, the type property refers to the post-aggregation data type. For example, we can calculate count distinct of a categorical field "cat" using {"aggregate": "distinct", "field": "cat", "type": "quantitative"}. The "type" of the aggregate output is "quantitative".

  • Secondary channels (e.g., x2, y2, xError, yError ) do not have type as they have exactly the same type as their primary channels (e.g., x, y ).

See also: type documentation.
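
A short sketch of two equivalent ways to supply the type (the field name is illustrative; the second form uses Altair's ":Q"/":N"/":O"/":T" shorthand suffixes on the facet channel class):

    import altair as alt

    # Explicit type on the schema wrapper
    explicit = alt.FacetEncodingFieldDef(type="quantitative", field="yield")
    # Equivalent shorthand via the facet channel class
    shorthand = alt.Facet("yield:Q")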

aggregate : Aggregate

Aggregation function for the field (e.g., "mean", "sum", "median", "min", "max", "count" ).

Default value: undefined (None)

See also: aggregate documentation.
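
A brief sketch; note that field may be omitted when aggregate is "count" (see the field notes below):

    import altair as alt

    # The type describes the post-aggregation data
    counted = alt.FacetEncodingFieldDef(type="quantitative", aggregate="count")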

align : anyOf(LayoutAlign, RowColLayoutAlign)

The alignment to apply to grid rows and columns. The supported string values are "all", "each", and "none".

  • For "none", a flow layout will be used, in which adjacent subviews are simply placed one after the other.

  • For "each", subviews will be aligned into a clean grid structure, but each row or column may be of variable size.

  • For "all", subviews will be aligned and each row or column will be sized identically based on the maximum observed size. String values for this property will be applied to both grid rows and columns.

Alternatively, an object value of the form {"row": string, "column": string} can be used to supply different alignments for rows and columns.

Default value: "all".
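
A sketch of both forms, assuming the RowColLayoutAlign wrapper named in the signature above is available in the altair namespace (a plain {"row": ..., "column": ...} dict as described above should also be accepted):

    import altair as alt

    # Single string: applies to both rows and columns
    alt.FacetEncodingFieldDef(type="nominal", field="site", align="each")
    # Per-direction object form
    alt.FacetEncodingFieldDef(
        type="nominal", field="site",
        align=alt.RowColLayoutAlign(row="each", column="all"),
    )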

bin : anyOf(boolean, BinParams, None)

A flag for binning a quantitative field, an object defining binning parameters, or indicating that the data for x or y channel are binned before they are imported into Vega-Lite ( "binned" ).

If true, default binning parameters will be applied.

If "binned", this indicates that the data for the x (or y ) channel are already binned. You can map the bin-start field to x (or y ) and the bin-end field to x2 (or y2 ). The scale and axis will be formatted similar to binning in Vega-Lite. To adjust the axis ticks based on the bin step, you can also set the axis’s tickMinStep property.

Default value: false

See also: bin documentation.
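
A sketch of the boolean and parameterized forms (alt.BinParams is the BinParams wrapper referenced above; the maxbins value is illustrative):

    import altair as alt

    # Default binning parameters
    alt.FacetEncodingFieldDef(type="quantitative", field="yield", bin=True)
    # Customized binning
    alt.FacetEncodingFieldDef(
        type="quantitative", field="yield", bin=alt.BinParams(maxbins=4),
    )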

bounds : enum('full', 'flush')

The bounds calculation method to use for determining the extent of a sub-plot. One of full (the default) or flush.

  • If set to full, the entire calculated bounds (including axes, title, and legend) will be used.

  • If set to flush, only the specified width and height values for the sub-view will be used. The flush setting can be useful when attempting to place sub-plots without axes or legends into a uniform grid structure.

Default value: "full"

center : anyOf(boolean, RowColboolean)

Boolean flag indicating if subviews should be centered relative to their respective rows or columns.

An object value of the form {"row": boolean, "column": boolean} can be used to supply different centering values for rows and columns.

Default value: false
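
A combined sketch of the two layout flags above, bounds and center (the field name is illustrative):

    import altair as alt

    # Flush sub-view bounds with centered sub-views
    alt.FacetEncodingFieldDef(
        type="nominal", field="site", bounds="flush", center=True,
    )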

columns : float

The number of columns to include in the view composition layout.

Default value: undefined – An infinite number of columns (a single row) will be assumed. This is equivalent to hconcat (for concat ) and to using the column channel (for facet and repeat ).

Note:

  1. This property is only for:

  • the general (wrappable) concat operator (not hconcat / vconcat )

  • the facet and repeat operator with one field/repetition definition (without row/column nesting)

  2. Setting the columns to 1 is equivalent to vconcat (for concat ) and to using the row channel (for facet and repeat ).
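
A sketch of a wrapped facet whose grid wraps after two columns (the DataFrame and its column names are illustrative):

    import altair as alt
    import pandas as pd

    source = pd.DataFrame({"site": list("ABCD"), "yield": [1, 2, 3, 4]})
    chart = alt.Chart(source).mark_bar().encode(
        x="yield:Q",
        facet=alt.Facet("site:N", columns=2),  # wraps into a 2-column grid
    )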

field : Field

Required. A string defining the name of the field from which to pull a data value or an object defining iterated values from the repeat operator.

See also: field documentation.

Notes:

1) Dots ( . ) and brackets ( [ and ] ) can be used to access nested objects (e.g., "field": "foo.bar" and "field": "foo['bar']" ). If field names contain dots or brackets but are not nested, you can use \ to escape dots and brackets (e.g., "a\.b" and "a\[0\]" ). See more details about escaping in the field documentation.

2) field is not required if aggregate is count.
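
A sketch of nested and escaped field names (the names are illustrative):

    import altair as alt

    # Access a nested property with dot notation
    alt.FacetEncodingFieldDef(type="nominal", field="geo.state")
    # A field literally named "a.b": escape the dot
    alt.FacetEncodingFieldDef(type="nominal", field=r"a\.b")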

header : Header

An object defining properties of a facet’s header.
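
A sketch using the Header wrapper (the specific label/title properties shown are assumptions drawn from the Header schema):

    import altair as alt

    alt.FacetEncodingFieldDef(
        type="nominal", field="site",
        header=alt.Header(labelFontSize=12, titleFontSize=14),
    )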

sort : anyOf(SortArray, SortOrder, EncodingSortField, None)

Sort order for the encoded field.

For continuous fields (quantitative or temporal), sort can be either "ascending" or "descending".

For discrete fields, sort can be one of the following:

  • "ascending" or "descending" – for sorting by the values’ natural order in JavaScript.

  • A sort field definition for sorting by another field.

  • An array specifying the field values in preferred order. In this case, the sort order will obey the values in the array, followed by any unspecified values in their original order. For a discrete time field, values in the sort array can be date-time definition objects. In addition, for time units "month" and "day", the values can be the month or day names (case insensitive) or their 3-letter initials (e.g., "Mon", "Tue" ).

  • null indicating no sort.

Default value: "ascending"

Note: null is not supported for row and column.
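
A sketch of three of the discrete-field sort forms listed above (field names and values are illustrative):

    import altair as alt

    # Sort by natural order
    alt.FacetEncodingFieldDef(type="nominal", field="day", sort="descending")
    # Sort by an explicit value array
    alt.FacetEncodingFieldDef(
        type="nominal", field="day", sort=["Mon", "Tue", "Wed", "Thu", "Fri"],
    )
    # Sort by another field's aggregate
    alt.FacetEncodingFieldDef(
        type="nominal", field="site",
        sort=alt.EncodingSortField(field="yield", op="mean", order="descending"),
    )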

spacing : anyOf(float, RowColnumber)

The spacing in pixels between sub-views of the composition operator. An object of the form {"row": number, "column": number} can be used to set different spacing values for rows and columns.

Default value: Depends on "spacing" property of the view composition configuration ( 20 by default)
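
A brief sketch of the single-number form (the per-direction object form described above works analogously):

    import altair as alt

    # 10 pixels between sub-views in both directions
    alt.FacetEncodingFieldDef(type="nominal", field="site", spacing=10)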

timeUnit : anyOf(TimeUnit, TimeUnitParams)

Time unit (e.g., year, yearmonth, month, hours ) for a temporal field, or a temporal field that gets cast as ordinal.

Default value: undefined (None)

See also: timeUnit documentation.
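
A sketch that facets a temporal field by year (the field name is illustrative; other time units such as "yearmonth" work the same way):

    import altair as alt

    alt.FacetEncodingFieldDef(type="temporal", field="date", timeUnit="year")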

title : anyOf(Text, None)

A title for the field. If null, the title will be removed.

Default value: derived from the field’s name and transformation function ( aggregate, bin and timeUnit ). If the field has an aggregate function, the function is displayed as part of the title (e.g., "Sum of Profit" ). If the field is binned or has a time unit applied, the applied function is shown in parentheses (e.g., "Profit (binned)", "Transaction Date (year-month)" ). Otherwise, the title is simply the field name.

Notes:

1) You can customize the default field title format by providing the fieldTitle property in the config or fieldTitle function via the compile function’s options.

2) If both the field definition's title and an axis, header, or legend title are defined, the axis/header/legend title will be used.
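
A sketch of overriding and removing the derived title (the field name is illustrative):

    import altair as alt

    # Custom title
    alt.FacetEncodingFieldDef(type="nominal", field="site", title="Field site")
    # Remove the title (None maps to JSON null)
    alt.FacetEncodingFieldDef(type="nominal", field="site", title=None)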

__init__(self, type=Undefined, aggregate=Undefined, align=Undefined, bin=Undefined, bounds=Undefined, center=Undefined, columns=Undefined, field=Undefined, header=Undefined, sort=Undefined, spacing=Undefined, timeUnit=Undefined, title=Undefined, **kwds)

Initialize self. See help(type(self)) for accurate signature.

Methods

__init__(self[, type, aggregate, align, …])

Initialize self.

copy(self[, deep, ignore])

Return a copy of the object

from_dict(dct[, validate, _wrapper_classes])

Construct class from a dictionary representation

from_json(json_string[, validate])

Instantiate the object from a valid JSON string

resolve_references([schema])

Resolve references in the context of this object’s schema or root schema.

to_dict(self[, validate, ignore, context])

Return a dictionary representation of the object

to_json(self[, validate, ignore, context, …])

Emit the JSON representation for this object as a string.

validate(instance[, schema])

Validate the instance against the class schema in the context of the rootschema.

validate_property(name, value[, schema])

Validate a property against property schema in the context of the rootschema

copy(self, deep=True, ignore=())

Return a copy of the object

Parameters
deep : boolean or list, optional

If True (default) then return a deep copy of all dict, list, and SchemaBase objects within the object structure. If False, then only copy the top object. If a list or iterable, then only copy the listed attributes.

ignore : list, optional

A list of keys for which the contents should not be copied, but only stored by reference.
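
A brief sketch of deep versus shallow copies:

    import altair as alt

    fdef = alt.FacetEncodingFieldDef(type="nominal", field="site")
    deep_copy = fdef.copy()               # deep copy (default)
    shallow_copy = fdef.copy(deep=False)  # copy only the top-level object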

classmethod from_dict(dct, validate=True, _wrapper_classes=None)

Construct class from a dictionary representation

Parameters
dct : dictionary

The dict from which to construct the class

validate : boolean

If True (default), then validate the input against the schema.

_wrapper_classes : list (optional)

The set of SchemaBase classes to use when constructing wrappers of the dict inputs. If not specified, the result of cls._default_wrapper_classes will be used.

Returns
obj : Schema object

The wrapped schema

Raises
jsonschema.ValidationError :

if validate=True and dct does not conform to the schema
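
A brief sketch (the input dict is illustrative):

    import altair as alt

    # Invalid input raises jsonschema.ValidationError while validate=True (the default)
    fdef = alt.FacetEncodingFieldDef.from_dict({"field": "site", "type": "nominal"})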

classmethod from_json(json_string, validate=True, **kwargs)

Instantiate the object from a valid JSON string

Parameters
json_string : string

The string containing a valid JSON chart specification.

validate : boolean

If True (default), then validate the input against the schema.

**kwargs :

Additional keyword arguments are passed to json.loads

Returns
chart : Chart object

The altair Chart object built from the specification.

classmethod resolve_references(schema=None)

Resolve references in the context of this object’s schema or root schema.

to_dict(self, validate=True, ignore=None, context=None)

Return a dictionary representation of the object

Parameters
validate : boolean or string

If True (default), then validate the output dictionary against the schema. If “deep” then recursively validate all objects in the spec. This takes much more time, but it results in friendlier tracebacks for large objects.

ignore : list

A list of keys to ignore. This will not be passed to child to_dict function calls.

context : dict (optional)

A context dictionary that will be passed to all child to_dict function calls

Returns
dct : dictionary

The dictionary representation of this object

Raises
jsonschema.ValidationError :

if validate=True and the dict does not conform to the schema

to_json(self, validate=True, ignore=[], context={}, indent=2, sort_keys=True, **kwargs)

Emit the JSON representation for this object as a string.

Parameters
validate : boolean or string

If True (default), then validate the output dictionary against the schema. If “deep” then recursively validate all objects in the spec. This takes much more time, but it results in friendlier tracebacks for large objects.

ignore : list

A list of keys to ignore. This will not be passed to child to_dict function calls.

context : dict (optional)

A context dictionary that will be passed to all child to_dict function calls

indent : integer, default 2

the number of spaces of indentation to use

sort_keys : boolean, default True

if True, sort keys in the output

**kwargs

Additional keyword arguments are passed to json.dumps()

Returns
spec : string

The JSON specification of the chart object.
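
A brief round-trip sketch using the defaults (indent=2, sort_keys=True):

    import altair as alt

    fdef = alt.FacetEncodingFieldDef(type="nominal", field="site")
    json_str = fdef.to_json()
    restored = alt.FacetEncodingFieldDef.from_json(json_str)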

classmethod validate(instance, schema=None)

Validate the instance against the class schema in the context of the rootschema.

classmethod validate_property(name, value, schema=None)

Validate a property against property schema in the context of the rootschema