Version: 0.5.x

metrics

Utility functions for measuring accuracy and performance.

Module: aisp.utils.metrics
Import: from aisp.utils import metrics

Functions

accuracy_score

def accuracy_score(
y_true: Union[npt.NDArray, list],
y_pred: Union[npt.NDArray, list]
) -> float:
...

Calculate the accuracy score based on true and predicted labels.

Parameters

| Name | Type | Default | Description |
| --- | --- | --- | --- |
| `y_true` | `Union[npt.NDArray, list]` | - | Ground truth (correct) labels. Expected to be of the same length as `y_pred`. |
| `y_pred` | `Union[npt.NDArray, list]` | - | Predicted labels. Expected to be of the same length as `y_true`. |

Returns

| Type | Description |
| --- | --- |
| `float` | The ratio of correct predictions to the total number of predictions. |

Raises

| Exception | Description |
| --- | --- |
| `ValueError` | If `y_true` or `y_pred` are empty or if they do not have the same length. |
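
Example

The behavior documented above can be sketched with a minimal pure-Python reimplementation. This is an illustration of the documented contract (ratio of correct predictions, with `ValueError` on empty or mismatched inputs), not the library's actual source; in practice you would import the real function with `from aisp.utils.metrics import accuracy_score`.

```python
from typing import Sequence, Union


def accuracy_score(y_true: Sequence, y_pred: Sequence) -> float:
    """Return the ratio of correct predictions to total predictions.

    Mirrors the documented contract: raises ValueError if either
    input is empty or if the lengths differ.
    """
    if len(y_true) == 0 or len(y_pred) == 0:
        raise ValueError("y_true and y_pred must not be empty.")
    if len(y_true) != len(y_pred):
        raise ValueError("y_true and y_pred must have the same length.")
    # Count element-wise matches and normalize by the number of samples.
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)


print(accuracy_score([0, 1, 1, 0], [0, 1, 0, 0]))  # → 0.75
```

String labels work equally well, since only element-wise equality is required: `accuracy_score(["spam", "ham"], ["spam", "spam"])` yields `0.5`.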