save_metrics#

usencrypt.ai.metrics.save_metrics(y_true, y_pred, file_path, metrics, labels=None)#

Computes the selected classification performance metrics and saves them to a JSON file. Supported metrics include accuracy, precision, recall, F1 score, and the confusion matrix.

Parameters
  • y_true (list or numpy.ndarray) – The ground-truth labels.

  • y_pred (list or numpy.ndarray) – The predicted labels.

  • file_path (str) – Path for the resulting JSON file.

  • metrics (list) – The performance metrics to compute and save (e.g., 'accuracy', 'f1', 'confusion_matrix').

  • labels (list or numpy.ndarray) – The set of all unique class labels in the dataset. Defaults to None, in which case the labels are inferred from the data.

Returns

A dictionary containing the computed performance metrics, identical to what is written to the specified file path.

Return type

dict

Examples
>>> import numpy as np
>>> import usencrypt as ue
>>> y_true = np.random.randint(0, 2, size=(10,))
>>> y_true
array([0, 0, 1, 0, 1, 1, 0, 0, 1, 0])
>>> y_pred = np.random.randint(0, 2, size=(10,))
>>> y_pred
array([0, 1, 0, 1, 1, 0, 0, 0, 1, 0])
>>> metrics = ue.ai.metrics.save_metrics(y_true, y_pred, './saved_metrics.json', metrics=['confusion_matrix', 'accuracy', 'f1'], labels=[0,1])
Performance metrics saved successfully. (Saved in: './saved_metrics.json')
>>> metrics
{'confusion matrix': [[4, 2], [2, 2]], 'accuracy': 0.6, 'f1 score': [0.6666666666666666, 0.5]}
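For reference, the values returned in the example above can be reproduced with plain NumPy. The sketch below is a generic reference computation, not the library's internal implementation: the confusion matrix has true labels as rows and predicted labels as columns, accuracy is the fraction of matching predictions, and F1 is reported per class.

```python
import numpy as np

# Same data as in the example above.
y_true = np.array([0, 0, 1, 0, 1, 1, 0, 0, 1, 0])
y_pred = np.array([0, 1, 0, 1, 1, 0, 0, 0, 1, 0])
labels = [0, 1]

# Confusion matrix: rows are true labels, columns are predicted labels.
cm = [[int(np.sum((y_true == t) & (y_pred == p))) for p in labels]
      for t in labels]

# Accuracy: fraction of predictions matching the ground truth.
accuracy = float(np.mean(y_true == y_pred))

# Per-class F1: harmonic mean of precision and recall for each label.
f1 = []
for c in labels:
    tp = np.sum((y_pred == c) & (y_true == c))
    fp = np.sum((y_pred == c) & (y_true != c))
    fn = np.sum((y_pred != c) & (y_true == c))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1.append(float(2 * precision * recall / (precision + recall)))

print(cm)        # [[4, 2], [2, 2]]
print(accuracy)  # 0.6
print(f1)        # [0.6666666666666666, 0.5]
```

This matches the dictionary shown above: 4 true negatives, 2 false positives, 2 false negatives, and 2 true positives for label 1, giving an accuracy of 0.6 and per-class F1 scores of roughly 0.667 and 0.5.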