Calculates how often predictions match binary labels

metric_binary_accuracy(
  y_true,
  y_pred,
  threshold = 0.5,
  ...,
  name = "binary_accuracy",
  dtype = NULL
)

Arguments

y_true

Tensor of true targets.

y_pred

Tensor of predicted targets.

threshold

(Optional) Float representing the threshold for deciding whether prediction values are 1 or 0.

...

Passed on to the underlying metric; used for forward and backward compatibility.

name

(Optional) string name of the metric instance.

dtype

(Optional) data type of the metric result.

Value

If y_true and y_pred are missing, a (subclassed) Metric instance is returned. The Metric object can be passed directly to compile(metrics = ) or used as a standalone object. See ?Metric for example usage.

Alternatively, if called with y_true and y_pred arguments, then the computed case-wise values for the mini-batch are returned directly.
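For illustration, a minimal sketch of both calling conventions (the values below are made up, and plain R vectors/matrices are assumed to be converted to tensors automatically; `model` in the second part is assumed to be an already-defined keras model):

library(keras3)

# Direct call with y_true and y_pred: case-wise values for the mini-batch
y_true <- rbind(1, 1, 0, 0)
y_pred <- rbind(0.98, 0.7, 0.4, 0.6)
metric_binary_accuracy(y_true, y_pred, threshold = 0.5)
# expected case-wise values: 1, 1, 1, 0

# Without y_true/y_pred: a Metric instance, e.g. passed to compile()
model |> compile(
  optimizer = "adam",
  loss = "binary_crossentropy",
  metrics = list(metric_binary_accuracy(threshold = 0.5))
)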

Details

This metric creates two local variables, total and count, that are used to compute the frequency with which y_pred matches y_true. This frequency is ultimately returned as binary accuracy: an idempotent operation that simply divides total by count.

If sample_weight is NULL, weights default to 1. Use sample_weight of 0 to mask values.
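To illustrate the total/count accumulation and sample_weight masking, a hedged sketch using the standard Metric methods update_state(), result(), and reset_state() (the values shown are illustrative):

library(keras3)

m <- metric_binary_accuracy()

# 3 of 4 predictions fall on the correct side of the 0.5 threshold
m$update_state(rbind(1, 1, 0, 0), rbind(0.98, 0.7, 0.4, 0.6))
m$result()  # total / count = 3 / 4 = 0.75

# A sample_weight of 0 masks cases: they add to neither total nor count
m$reset_state()
m$update_state(rbind(1, 1, 0, 0), rbind(0.98, 0.7, 0.4, 0.6),
               sample_weight = c(1, 1, 0, 0))
m$result()  # 2 / 2 = 1.0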

See also