tf.compat.v1.nn.rnn_cell.LSTMCell


Long short-term memory unit (LSTM) recurrent network cell.

tf.compat.v1.nn.rnn_cell.LSTMCell(
    num_units, use_peepholes=False, cell_clip=None, initializer=None, num_proj=None,
    proj_clip=None, num_unit_shards=None, num_proj_shards=None, forget_bias=1.0,
    state_is_tuple=True, activation=None, reuse=None, name=None, dtype=None,
    **kwargs
)

The default non-peephole implementation is based on:

https://pdfs.semanticscholar.org/1154/0131eae85b2e11d53df7f1360eeb6476e7f4.pdf

Felix Gers, Jürgen Schmidhuber, and Fred Cummins. "Learning to forget: Continual prediction with LSTM." IET, 850-855, 1999.

The peephole implementation is based on:

https://research.google.com/pubs/archive/43905.pdf

Haşim Sak, Andrew Senior, and Françoise Beaufays. "Long short-term memory recurrent neural network architectures for large scale acoustic modeling." INTERSPEECH, 2014.

The class uses optional peephole connections, optional cell clipping, and an optional projection layer.

Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU, or tf.contrib.rnn.LSTMBlockCell and tf.contrib.rnn.LSTMBlockFusedCell for better performance on CPU.
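
As a quick illustration of the constructor, here is a minimal usage sketch (the tensor sizes and zero-filled inputs are arbitrary example choices, not from this page). It builds a peephole cell with a projection layer and unrolls it with tf.compat.v1.nn.dynamic_rnn; with num_proj set, the outputs and the h part of the state have size num_proj, while the cell state c keeps size num_units.

import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

# Arbitrary example sizes: batch of 4 sequences, 10 steps, 8 input features.
inputs = tf.placeholder(tf.float32, [4, 10, 8])

# Peephole connections enabled; 16 units projected down to 12 outputs.
cell = tf.nn.rnn_cell.LSTMCell(num_units=16, use_peepholes=True, num_proj=12)

outputs, state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    out, (c, h) = sess.run(
        [outputs, state],
        feed_dict={inputs: np.zeros((4, 10, 8), np.float32)})
    print(out.shape)         # (4, 10, 12): outputs use the projected size
    print(c.shape, h.shape)  # (4, 16) (4, 12): c keeps num_units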

Args:

num_units: int, The number of units in the LSTM cell.
use_peepholes: bool, set True to enable diagonal/peephole connections.
cell_clip: (optional) A float value, if provided the cell state is clipped by this value prior to the cell output activation.
initializer: (optional) The initializer to use for the weight and projection matrices.
num_proj: (optional) int, The output dimensionality for the projection matrices. If None, no projection is performed.
proj_clip: (optional) A float value. If num_proj > 0 and proj_clip is provided, then the projected values are clipped elementwise to within [-proj_clip, proj_clip].
num_unit_shards: Deprecated, will be removed by Jan. 2017. Use a variable_scope partitioner instead.
num_proj_shards: Deprecated, will be removed by Jan. 2017. Use a variable_scope partitioner instead.
forget_bias: Biases of the forget gate are initialized by default to 1 in order to reduce the scale of forgetting at the beginning of the training. Must be set manually to 0.0 when restoring from CudnnLSTM-trained checkpoints.
state_is_tuple: If True, accepted and returned states are 2-tuples of the c_state and m_state. If False, they are concatenated along the column axis. The latter behavior will soon be deprecated.
activation: Activation function of the inner states. Default: tanh. It can also be a string from within the Keras activation function names.
reuse: (optional) Python boolean describing whether to reuse variables in an existing scope. If not True, and the existing scope already has the given variables, an error is raised.
name: String, the name of the layer. Layers with the same name will share weights, but to avoid mistakes we require reuse=True in such cases.
dtype: Default dtype of the layer (default of None means use the type of the first input). Required when build is called before call.
**kwargs: Dict, keyword named properties for common layer attributes, like trainable, when constructing the cell from configs of get_config().

Attributes:

output_size: Integer or TensorShape: size of outputs produced by this cell.
state_size: size(s) of state(s) used by this cell. Can be an Integer, a TensorShape, or a (possibly nested) tuple of Integers or TensorShapes.

Methods

get_initial_state


get_initial_state(
    inputs=None, batch_size=None, dtype=None
)
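
The page gives no further description, but here is a small sketch of typical use (the sizes are arbitrary assumptions): the method returns the cell's zero-filled starting state, inferring batch_size and dtype from inputs when that tensor is supplied.

import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

cell = tf.nn.rnn_cell.LSTMCell(num_units=16)

# Batch size and dtype can be inferred from a batched inputs tensor...
inputs = tf.placeholder(tf.float32, [4, 8])
state = cell.get_initial_state(inputs=inputs)

# ...or supplied explicitly instead.
state = cell.get_initial_state(batch_size=4, dtype=tf.float32)

# With the default state_is_tuple=True, state is an LSTMStateTuple of
# zero tensors, each of shape (4, 16).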

zero_state


zero_state(
    batch_size, dtype
)

Return zero-filled state tensor(s).

Args:

batch_size: int, float, or unit Tensor representing the batch size.
dtype: the data type to use for the state.

Returns:

If state_size is an int or TensorShape, then the return value is an N-D tensor of shape [batch_size, state_size] filled with zeros.

If state_size is a nested list or tuple, then the return value is a nested list or tuple (of the same structure) of 2-D tensors with shape [batch_size, s] for each s in state_size.
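
For example (a minimal sketch; num_units=16 and batch_size=4 are arbitrary values): LSTMCell's default state_size is an LSTMStateTuple of two sizes, so zero_state returns a matching tuple of 2-D zero tensors.

import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()

cell = tf.nn.rnn_cell.LSTMCell(num_units=16)

# state_size is LSTMStateTuple(c=16, h=16), so the zero state is a
# same-structured tuple of [batch_size, s] tensors.
state = cell.zero_state(batch_size=4, dtype=tf.float32)
print(state.c.shape, state.h.shape)  # (4, 16) (4, 16)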