Bidirectional wrapper for RNNs
bidirectional(
  object,
  layer,
  merge_mode = "concat",
  weights = NULL,
  backward_layer = NULL,
  ...
)
object: What to compose the new Layer instance with. Typically a
Sequential model or a Tensor (e.g., as returned by layer_input()).
The return value depends on object. If object is:
- missing or NULL, the Layer instance is returned.
- a Sequential model, the model with an additional layer is returned.
- a Tensor, the output tensor from layer_instance(object) is returned.
layer: An RNN layer instance, such as layer_lstm() or
layer_gru(). It could also be a keras$layers$Layer instance that
meets the following criteria:
1. Be a sequence-processing layer (accepts 3D+ inputs).
2. Have go_backwards, return_sequences and return_state attributes
   (with the same semantics as for the RNN class).
3. Have an input_spec attribute.
4. Implement serialization via get_config() and from_config(). Note
   that the recommended way to create new RNN layers is to write a custom RNN
   cell and use it with layer_rnn(), instead of subclassing
   keras$layers$Layer directly.
5. When return_sequences = TRUE, the output of the masked timestep will
   be zero regardless of the layer's original zero_output_for_mask value.
merge_mode: Mode by which outputs of the forward and backward RNNs will
be combined. One of "sum", "mul", "concat", "ave" or NULL. If
NULL, the outputs will not be combined; they will be returned as a list.
Default value is "concat".
weights: Split and propagated to the initial_weights attribute on the
forward and backward layer.
backward_layer: Optional keras$layers$RNN or keras$layers$Layer
instance to be used to handle backwards input processing. If
backward_layer is not provided, the layer instance passed as the layer
argument will be used to generate the backward layer automatically. Note
that the provided backward_layer should have properties matching
those of the layer argument; in particular, it should have the same values
for stateful, return_states, return_sequences, etc. In addition,
backward_layer and layer should have different go_backwards argument
values. A ValueError will be raised if these requirements are not met.
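As a sketch of how an explicit backward layer can be supplied (assuming the keras R package is attached; the unit counts and input shape here are illustrative, not prescribed):

```r
library(keras)

# The two layers must agree on return_sequences (and stateful, etc.),
# but must have opposite go_backwards values.
forward_lstm  <- layer_lstm(units = 10, return_sequences = TRUE)
backward_lstm <- layer_lstm(units = 10, return_sequences = TRUE,
                            go_backwards = TRUE)

model <- keras_model_sequential() %>%
  bidirectional(forward_lstm,
                backward_layer = backward_lstm,
                input_shape = c(5, 10)) %>%
  layer_dense(units = 4)
```

Passing, say, a backward layer with return_sequences = FALSE here would raise the ValueError described above.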
...: Standard layer arguments.
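A minimal end-to-end sketch of the common case, where only the forward layer is given and the backward layer is generated automatically (shapes and unit counts are illustrative assumptions):

```r
library(keras)

# Bidirectional LSTM over sequences of 100 timesteps with 16 features.
# With merge_mode = "concat" (the default), the forward and backward
# outputs are concatenated, so the first layer emits 2 * 32 = 64 units.
model <- keras_model_sequential() %>%
  bidirectional(layer_lstm(units = 32, return_sequences = TRUE),
                merge_mode = "concat",
                input_shape = c(100, 16)) %>%
  bidirectional(layer_lstm(units = 16)) %>%
  layer_dense(units = 1, activation = "sigmoid")
```

Setting merge_mode = NULL instead would return the forward and backward outputs as a list rather than a single merged tensor.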