Fits the model on data yielded batch-by-batch by a generator. The generator is run in parallel to the model, for efficiency. For instance, this allows you to do real-time data augmentation on images on the CPU in parallel to training your model on the GPU.
fit_generator(object, generator, steps_per_epoch, epochs = 1,
  verbose = getOption("keras.fit_verbose", default = 1),
  callbacks = NULL,
  view_metrics = getOption("keras.view_metrics", default = "auto"),
  validation_data = NULL, validation_steps = NULL,
  class_weight = NULL, max_queue_size = 10, initial_epoch = 0)
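A minimal sketch of a typical call, assuming a compiled `model` and a hypothetical image directory "images/train"; it uses `image_data_generator()` and `flow_images_from_directory()` from keras to do the CPU-side augmentation described above:

```r
library(keras)

# Generator that reads images from disk and augments them on the fly
train_gen <- flow_images_from_directory(
  "images/train",                       # hypothetical path
  generator = image_data_generator(
    rescale = 1/255,
    horizontal_flip = TRUE
  ),
  target_size = c(150, 150),
  batch_size = 32
)

# Train on batches yielded by the generator; 100 batches make one epoch
history <- fit_generator(
  model,
  generator = train_gen,
  steps_per_epoch = 100,
  epochs = 10
)
```

The returned `history` object can be plotted with `plot(history)` to inspect the per-epoch metrics.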
Arguments:

object: Keras model object
generator: A generator (e.g. one produced by flow_images_from_directory(), or a custom R generator function). The output of the generator must be a list of one of these forms:

- (inputs, targets)
- (inputs, targets, sample_weights)

This list (a single output of the generator) makes a single batch. Therefore, all arrays in this list must have the same length (equal to the size of this batch). Different batches may have different sizes. For example, the last batch of the epoch is commonly smaller than the others if the size of the dataset is not divisible by the batch size. The generator is expected to loop over its data indefinitely. An epoch finishes when steps_per_epoch batches have been seen by the model.
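As a hedged illustration of the contract above, a custom R generator can be written as a closure that returns one (inputs, targets) list per call; `x` and `y` here are assumed in-memory data, named for illustration only:

```r
# Returns a generator function that yields random batches forever,
# satisfying fit_generator()'s requirement to loop indefinitely.
sampling_generator <- function(x, y, batch_size = 32) {
  function() {
    rows <- sample(seq_len(nrow(x)), batch_size, replace = TRUE)
    # One batch: inputs and targets must have the same length
    list(x[rows, , drop = FALSE], y[rows])
  }
}
```

Because batches are sampled with replacement, `steps_per_epoch` defines the epoch boundary rather than one pass over the data.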
steps_per_epoch: Total number of steps (batches of samples) to yield from the generator before declaring one epoch finished and starting the next epoch. It should typically be equal to the number of samples in your dataset divided by the batch size.
epochs: Integer. Number of epochs to train the model. An epoch is an iteration over the entire data provided, as defined by steps_per_epoch.
verbose: Verbosity mode (0 = silent, 1 = progress bar, 2 = one line per epoch).
callbacks: List of callbacks to apply during training.
view_metrics: View realtime plot of training metrics (by epoch). The default ("auto") will display the plot when running within RStudio, metrics were specified during model compile(), epochs > 1, and verbose > 0. Set the global keras.view_metrics option to establish a different default.
validation_data: This can be either: a generator for the validation data, or a list of the form (inputs, targets) or (inputs, targets, sample_weights) on which to evaluate the loss and any model metrics at the end of each epoch.
validation_steps: Only relevant if validation_data is a generator. Total number of steps (batches of samples) to yield from the validation generator before stopping at the end of every epoch.
class_weight: Optional named list mapping class indices (integer) to a weight (float) value, used for weighting the loss function (during training only). This can be useful to tell the model to "pay more attention" to samples from an under-represented class.
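A hedged sketch of the class-weight mapping: the names of the list are the (0-based) class indices as strings, and the values are the loss multipliers. The weights and generator name here are illustrative assumptions:

```r
# Make samples of class 1 count five times as much in the loss as
# samples of class 0 (e.g. when class 1 is under-represented).
history <- fit_generator(
  model,
  generator = train_gen,        # hypothetical training generator
  steps_per_epoch = 100,
  epochs = 10,
  class_weight = list("0" = 1, "1" = 5)
)
```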
max_queue_size: Maximum size for the generator queue. If unspecified, max_queue_size will default to 10.
initial_epoch: Epoch at which to start training (useful for resuming a previous training run).
Value: Training history object (invisibly).
Other model functions: