tf.compat.v1.nn.static_bidirectional_rnn


Creates a bidirectional recurrent neural network. (deprecated)

tf.compat.v1.nn.static_bidirectional_rnn(
    cell_fw, cell_bw, inputs, initial_state_fw=None, initial_state_bw=None,
    dtype=None, sequence_length=None, scope=None
)

Warning: THIS FUNCTION IS DEPRECATED. It will be removed in a future version. Instructions for updating: Please use keras.layers.Bidirectional(keras.layers.RNN(cell, unroll=True)), which is equivalent to this API.
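A minimal sketch of the suggested Keras replacement, assuming an LSTMCell and an input of shape [batch, time, features]; the unit count and tensor shapes below are illustrative, not part of the original API. return_sequences=True is added so the wrapped layer emits per-timestep outputs, matching the length-T output list of this function.

import tensorflow as tf

# Illustrative sizes; not part of the original documentation.
cell = tf.keras.layers.LSTMCell(64)

# Bidirectional runs the unrolled RNN forward and backward and concatenates
# the outputs along the feature axis (merge_mode='concat' is the default).
bidi = tf.keras.layers.Bidirectional(
    tf.keras.layers.RNN(cell, unroll=True, return_sequences=True))

x = tf.random.normal([8, 10, 32])  # [batch, time, features]
y = bidi(x)
print(y.shape)  # (8, 10, 128): forward and backward outputs depth-concatenated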

Similar to the unidirectional case (see tf.compat.v1.nn.static_rnn), but builds independent forward and backward RNNs whose final forward and backward outputs are depth-concatenated, so the output has the format [time][batch][cell_fw.output_size + cell_bw.output_size]. The input_size of the forward and backward cells must match. The initial state for both directions defaults to zero (but can be set optionally), and no intermediate states are ever returned: the network is fully unrolled for the given (passed-in) sequence length(s), or completely unrolled if no lengths are given.
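For example, a minimal graph-mode sketch; the sizes below are illustrative assumptions, not part of the API:

import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Illustrative sizes, not part of the API.
time_steps, batch_size, input_size, num_units = 5, 4, 8, 16

# A length-T list of [batch_size, input_size] tensors, as the API expects.
inputs = [
    tf.compat.v1.placeholder(tf.float32, [batch_size, input_size])
    for _ in range(time_steps)
]

cell_fw = tf.compat.v1.nn.rnn_cell.LSTMCell(num_units)
cell_bw = tf.compat.v1.nn.rnn_cell.LSTMCell(num_units)

outputs, state_fw, state_bw = tf.compat.v1.nn.static_bidirectional_rnn(
    cell_fw, cell_bw, inputs, dtype=tf.float32)

# Each output is [batch_size, cell_fw.output_size + cell_bw.output_size] = [4, 32].
print(len(outputs), outputs[0].shape)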

Args:

cell_fw: An instance of RNNCell, to be used for the forward direction.
cell_bw: An instance of RNNCell, to be used for the backward direction.
inputs: A length T list of inputs, each a Tensor of shape [batch_size, input_size], or a nested tuple of such elements.
initial_state_fw: (optional) An initial state for the forward RNN. This must be a tensor of appropriate type and shape [batch_size, cell_fw.state_size]. If cell_fw.state_size is a tuple, this should be a tuple of tensors having shapes [batch_size, s] for s in cell_fw.state_size.
initial_state_bw: (optional) Same as for initial_state_fw, but using the corresponding properties of cell_bw.
dtype: (optional) The data type for the initial state. Required if either of the initial states is not provided.
sequence_length: (optional) An int32/int64 vector, size [batch_size], containing the actual lengths for each of the sequences.
scope: VariableScope for the created subgraph; defaults to "bidirectional_rnn".

Returns:

A tuple (outputs, output_state_fw, output_state_bw) where:

outputs is a length T list of outputs (one for each input), which are depth-concatenated forward and backward outputs.
output_state_fw is the final state of the forward rnn.
output_state_bw is the final state of the backward rnn.

Raises:

TypeError: If cell_fw or cell_bw is not an instance of RNNCell.
ValueError: If inputs is None or an empty list.