Metrics
The metrics file provides utilities to measure, analyze, and compare the performance of the package's algorithms in a standardized way.
```python
def accuracy_score(
    y_true: Union[npt.NDArray, list],
    y_pred: Union[npt.NDArray, list]
) -> float
```
Calculates classification accuracy from lists of true labels and predicted labels.
Parameters:
- y_true (`Union[npt.NDArray, list]`): Ground truth (correct) labels. Expected to be the same length as `y_pred`.
- y_pred (`Union[npt.NDArray, list]`): Predicted labels. Expected to be the same length as `y_true`.
Returns:
- Accuracy (`float`): The ratio of correct predictions to the total number of predictions.
Raises:
- `ValueError`: If `y_true` or `y_pred` is empty, or if they do not have the same length.
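The behavior above can be sketched as follows. This is a minimal illustration consistent with the documented signature, validation, and return value; the package's actual implementation may differ:

```python
import numpy as np
import numpy.typing as npt
from typing import Union

def accuracy_score(
    y_true: Union[npt.NDArray, list],
    y_pred: Union[npt.NDArray, list]
) -> float:
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    # Raise ValueError on empty or mismatched inputs, as documented.
    if y_true.size == 0 or y_pred.size == 0:
        raise ValueError("y_true and y_pred must not be empty.")
    if y_true.shape != y_pred.shape:
        raise ValueError("y_true and y_pred must have the same length.")
    # Ratio of correct predictions to the total number of predictions.
    return float(np.mean(y_true == y_pred))

accuracy_score([0, 1, 1, 0], [0, 1, 0, 0])  # → 0.75
```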