# larq.metrics¶

We add metrics specific to extremely quantized networks using a scope rather than through the `metrics` parameter of `model.compile()`, where most common metrics reside. This is because metrics like `flip_ratio` need access to a layer's kernel or activations, not just the `y_true` and `y_pred` that Keras passes to metrics defined in the usual way.

## scope¶

```python
scope(metrics=[])
```

A context manager to set the training metrics to be used in layers.

Example

```python
with larq.metrics.scope(["flip_ratio"]):
    model = tf.keras.models.Sequential(
        [larq.layers.QuantDense(3, kernel_quantizer="ste_sign", input_shape=(32,))]
    )
    model.compile(loss="mse", optimizer="sgd")
```


Arguments

• `metrics`: Iterable of metrics to add to layers defined inside this context. Currently only the `flip_ratio` metric is available.

## get_training_metrics¶

```python
get_training_metrics()
```

Retrieves a live reference to the training metrics in the current scope.

Updating and clearing training metrics using `larq.metrics.scope` is preferred, but `get_training_metrics` can be used to access them directly.

Example

```python
get_training_metrics().clear()
```
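Conceptually, `scope` and `get_training_metrics` form a context-managed registry: entering the scope extends the set of active metric names, and leaving it restores the previous state. The following is a minimal, self-contained sketch of that pattern; it is an illustration only, not larq's actual implementation (the names `_training_metrics`, `scope`, and `get_training_metrics` here are stand-ins defined locally).

```python
import contextlib

# Module-level registry of active training metric names (illustrative only).
_training_metrics = set()


@contextlib.contextmanager
def scope(metrics=[]):
    """Temporarily add `metrics` to the registry, restoring it on exit."""
    backup = set(_training_metrics)
    _training_metrics.update(metrics)
    try:
        yield _training_metrics
    finally:
        # Restore the registry exactly as it was before entering the scope.
        _training_metrics.clear()
        _training_metrics.update(backup)


def get_training_metrics():
    """Return a live reference: mutations affect the current scope directly."""
    return _training_metrics
```

Because `get_training_metrics` returns the registry itself rather than a copy, calling `.clear()` on the result empties the current scope in place, which is why the documentation recommends managing metrics through `scope` instead.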