Alpha Dropout is a variant of dropout that keeps the mean and variance of its inputs at their original values, in order to preserve the self-normalizing property even after dropout is applied.
Usage

layer_alpha_dropout(
  object,
  rate,
  noise_shape = NULL,
  seed = NULL,
  input_shape = NULL,
  batch_input_shape = NULL,
  batch_size = NULL,
  dtype = NULL,
  name = NULL,
  trainable = NULL,
  weights = NULL
)
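For orientation, here is a minimal sketch of composing the layer into a sequential model; the unit counts, rate, and seed below are illustrative assumptions, not values taken from this page.

library(keras)

# Sketch: insert alpha dropout between dense layers (unit counts,
# rate, and seed are illustrative assumptions).
model <- keras_model_sequential() %>%
  layer_dense(units = 64, input_shape = c(784)) %>%
  layer_alpha_dropout(rate = 0.1, seed = 42) %>%
  layer_dense(units = 10, activation = "softmax")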
Arguments

object: What to compose the new Layer instance with. Typically a keras Model, a Sequential model, or a Tensor.
rate: float, drop probability (as with layer_dropout()). The multiplicative noise will have standard deviation sqrt(rate / (1 - rate)).
noise_shape: Noise shape.
seed: An integer to use as a random seed.
input_shape: Dimensionality of the input (integer), not including the samples axis. This argument is required when using this layer as the first layer in a model.
batch_input_shape: Shapes, including the batch size. For instance, batch_input_shape = list(NULL, 32) indicates batches of an arbitrary number of 32-dimensional vectors.
batch_size: Fixed batch size for the layer.
dtype: The data type expected by the input, as a string (float32, float64, int32...).
name: An optional name string for the layer. Should be unique within a model (do not reuse the same name twice). It will be autogenerated if not provided.
trainable: Whether the layer weights will be updated during training.
weights: Initial weights for the layer.
Details

Alpha Dropout fits well with Scaled Exponential Linear Units (SELU) by randomly setting activations to the negative saturation value.
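The sketch below illustrates that pairing, assuming the usual self-normalizing recipe of SELU activations with lecun_normal initialization; the layer sizes and rate are illustrative assumptions.

library(keras)

# Sketch of the SELU + alpha dropout pairing. lecun_normal
# initialization is the usual companion to SELU in self-normalizing
# networks; sizes and rate are illustrative assumptions.
model <- keras_model_sequential() %>%
  layer_dense(units = 128, activation = "selu",
              kernel_initializer = "lecun_normal",
              input_shape = c(20)) %>%
  layer_alpha_dropout(rate = 0.05) %>%
  layer_dense(units = 1)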
Input shape

Arbitrary. Use the keyword argument input_shape (a list of integers, not including the samples axis) when using this layer as the first layer in a model.
Output shape

Same shape as input.
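As a quick check of this shape behavior, the following sketch (the 32-dimensional input is an illustrative assumption) builds a one-layer functional model and inspects its output shape.

library(keras)

# The output shape matches the input shape.
inp <- layer_input(shape = c(32))
out <- layer_alpha_dropout(inp, rate = 0.1)
model <- keras_model(inp, out)
model$output_shape  # list(NULL, 32): same shape as the input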