Adadelta optimizer as described in ADADELTA: An Adaptive Learning Rate Method.
optimizer_adadelta(
  learning_rate = 1,
  rho = 0.95,
  epsilon = NULL,
  decay = 0,
  clipnorm = NULL,
  clipvalue = NULL,
  ...
)
learning_rate
float >= 0. Learning rate.

rho
float >= 0. Decay factor.

epsilon
float >= 0. Fuzz factor. If NULL, defaults to k_epsilon().

decay
float >= 0. Learning rate decay over each update.

clipnorm
Gradients will be clipped when their L2 norm exceeds this value.

clipvalue
Gradients will be clipped when their absolute value exceeds this value.

...
Unused, present only for backwards compatibility.
It is recommended to leave the parameters of this optimizer at their default values.
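A minimal sketch of typical use: the optimizer is passed to compile(). The model definition, layer sizes, and the clipnorm value here are illustrative assumptions, not part of this page; with the defaults recommended above you would simply call optimizer_adadelta() with no arguments.

library(keras)

# Illustrative model; input shape and layer sizes are assumptions.
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(10)) %>%
  layer_dense(units = 1)

# Defaults are recommended; clipnorm = 1 is shown only to illustrate
# clipping gradients whose L2 norm exceeds 1.
model %>% compile(
  loss = "mse",
  optimizer = optimizer_adadelta(clipnorm = 1)
)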