precision_score#

class usencrypt.ai.metrics.precision_score(y_true, y_pred, labels=None, pos_label=1, average='binary')#

Computes the precision score, calculated as the ratio \(\frac{\text{TP}}{(\text{TP} + \text{FP})}\), where \(\text{TP}\) is the number of true positives and \(\text{FP}\) is the number of false positives.
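
For example, a model that makes five positive predictions of which two are correct has \(\text{TP} = 2\) and \(\text{FP} = 3\), giving a precision of \(\frac{2}{2 + 3} = 0.4\).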

Parameters
  • y_true (list or numpy.ndarray) – The ground-truth labels.

  • y_pred (list or numpy.ndarray) – The predicted labels.

  • labels (list or numpy.ndarray) – The set of labels to include when average != 'binary'. By default, all labels in y_true and y_pred are used in their order of appearance. Defaults to None.

  • pos_label (str, usencrypt.cipher.String, int, or usencrypt.cipher.Int) – The class to report if average = 'binary' and the data is binary. Otherwise, this parameter is ignored. Defaults to 1.

  • average ({'micro', 'macro', 'weighted', 'binary'} or None) – Determines the type of averaging performed on the data (see the sketch after this list). If None, the scores for each class are returned. Defaults to 'binary'.

    • 'micro': Calculates the precision score globally by counting the total true positives, false negatives, and false positives. For multi-class, single-label data, this is equivalent to accuracy.

    • 'macro': Calculates the precision score for each class and takes their unweighted mean.

    • 'weighted': Calculates the precision score for each class and takes their mean weighted by the support (the number of instances of each label).

    • 'binary': Only reports the precision score of the class denoted by pos_label, provided that y_true and y_pred are binary.
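
As a rough illustration of how these modes differ, here is a minimal sketch using hypothetical per-class counts (not the library's internals): 'micro' pools the true-positive and false-positive counts across classes before dividing, while 'macro' averages the per-class ratios.

>>> import numpy as np
>>> # Hypothetical per-class true-positive and false-positive counts,
>>> # used only to illustrate the averaging modes.
>>> tp = np.array([2, 1, 1])
>>> fp = np.array([2, 2, 2])
>>> per_class = tp / (tp + fp)               # one precision score per class
>>> float(tp.sum() / (tp.sum() + fp.sum()))  # 'micro': pool the counts, then divide
0.4
>>> round(float(per_class.mean()), 4)        # 'macro': unweighted mean of per-class scores
0.3889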

Returns

Returns the precision score of the positive class for binary classification, or an array of the precision scores of each class for multi-class classification. If average is None, the precision scores for each unique class are returned in a numpy.ndarray of shape (num_unique_classes,).

Return type

float, usencrypt.cipher.Float, or numpy.ndarray of float or usencrypt.cipher.Float.

Examples

For binary classification, the precision score can be computed as follows:

>>> import numpy as np
>>> import usencrypt as ue
>>> y_true = np.random.randint(0, 2, size=(10,))
>>> y_true
array([0, 1, 0, 1, 1, 0, 0, 0, 1, 0])
>>> y_pred = np.random.randint(0, 2, size=(10,))
>>> y_pred
array([0, 1, 1, 1, 0, 0, 0, 1, 0, 1])
>>> precision = ue.ai.metrics.precision_score(y_true, y_pred)
>>> precision
0.4
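
Since pos_label defaults to 1, the score above is the precision of class 1. Passing pos_label=0 reports the precision of class 0 instead; with the arrays above, three of the five predictions of class 0 are correct (the output shown assumes the same plain-float return as above):

>>> precision = ue.ai.metrics.precision_score(y_true, y_pred, pos_label=0)
>>> precision
0.6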

Further, setting the average parameter to None in multi-class classification results in a numpy.ndarray containing the precision scores for each unique class:

>>> import numpy as np
>>> import usencrypt as ue
>>> y_true = np.random.randint(0, 3, size=(10,))
>>> y_true
array([0, 1, 0, 2, 0, 0, 2, 0, 1, 2])
>>> y_pred = np.random.randint(0, 3, size=(10,))
>>> y_pred
array([2, 1, 1, 2, 0, 0, 0, 1, 2, 0])
>>> precision = ue.ai.metrics.precision_score(y_true, y_pred, average=None)
>>> precision
array([0.5, 0.3333333333333333, 0.3333333333333333])
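
The 'weighted' average described above is the support-weighted mean of this per-class array. A quick numpy check, assuming the scores are plain floats as shown (classes appear in the order 0, 1, 2):

>>> support = np.array([(y_true == c).sum() for c in np.unique(y_true)])
>>> support
array([5, 2, 3])
>>> round(float(np.average(precision, weights=support)), 4)
0.4167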