tf.contrib.layers.embedding_column(
sparse_id_column,
dimension,
combiner='mean',
initializer=None,
ckpt_to_load_from=None,
tensor_name_in_ckpt=None,
max_norm=None,
trainable=True
)
Defined in tensorflow/contrib/layers/python/layers/feature_column.py.
Creates an _EmbeddingColumn for feeding sparse data into a DNN.
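Conceptually, the column maps each sparse id in an example to a row of a dense embedding table and reduces those rows with the combiner. A minimal pure-Python sketch of that lookup-and-combine step (hypothetical helper, not the TensorFlow implementation, assuming unit weights per id):

```python
import math

def embedding_lookup(table, sparse_ids, combiner="mean"):
    # Each sparse id selects one row of the embedding table.
    rows = [table[i] for i in sparse_ids]
    # Element-wise sum across the selected rows.
    summed = [sum(col) for col in zip(*rows)]
    if combiner == "sum":
        return summed                                   # no normalization
    if combiner == "mean":
        return [v / len(rows) for v in summed]          # l1-style normalization
    if combiner == "sqrtn":
        return [v / math.sqrt(len(rows)) for v in summed]  # l2-style normalization
    raise ValueError("combiner must be 'mean', 'sqrtn' or 'sum'")

table = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # 3 ids, dimension=2
print(embedding_lookup(table, [0, 2], "mean"))  # [3.0, 4.0]
```

In TensorFlow the same reduction is performed by tf.embedding_lookup_sparse over the trained embedding variable.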
Args:
sparse_id_column: A _SparseColumn which is created by, for example, sparse_column_with_* or crossed_column functions. Note that combiner defined in sparse_id_column is ignored.
dimension: An integer specifying the dimension of the embedding.
combiner: A string specifying how to reduce if there are multiple entries in a single row. Currently "mean", "sqrtn" and "sum" are supported, with "mean" the default. "sqrtn" often achieves good accuracy, in particular with bag-of-words columns. Each of these can be thought of as an example-level normalization on the column:
- "sum": do not normalize
- "mean": do l1 normalization
- "sqrtn": do l2 normalization
For more information: tf.embedding_lookup_sparse.
initializer: A variable initializer function to be used in embedding variable initialization. If not specified, defaults to tf.truncated_normal_initializer with mean 0.0 and standard deviation 1/sqrt(sparse_id_column.length).
ckpt_to_load_from: (Optional). String representing checkpoint name/pattern to restore the column weights. Required if tensor_name_in_ckpt is not None.
tensor_name_in_ckpt: (Optional). Name of the Tensor in the provided checkpoint from which to restore the column weights. Required if ckpt_to_load_from is not None.
max_norm: (Optional). If not None, embedding values are l2-normalized to the value of max_norm.
trainable: (Optional). Should the embedding be trainable. Default is True.
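The default initializer scales its spread by the vocabulary size: for a sparse_id_column of length 100, the truncated normal's standard deviation is 1/sqrt(100) = 0.1. A small sketch of that stated default (illustrative helper name, not TF code):

```python
import math

def default_embedding_stddev(vocab_length):
    # Stddev of the default truncated-normal initializer described above:
    # mean 0.0, stddev 1/sqrt(sparse_id_column.length).
    return 1.0 / math.sqrt(vocab_length)

print(default_embedding_stddev(100))  # 0.1
```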
Returns:
An _EmbeddingColumn.