relu

usencrypt.ai.activations.relu(x, max_value=None)

Applies a rectified linear unit (ReLU) activation, defined as:

\[g(x) = \max(0, x)\]
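When max_value is set, the output is additionally clamped from above (a saturating ReLU; this follows from the parameter description below):

\[g(x) = \min(\max(0, x), \text{max\_value})\]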
Parameters
  • x (list or numpy.ndarray of float or usencrypt.cipher.Float) – The input list, matrix, or tensor.

  • max_value (float, optional) – The saturation threshold: the largest value the function will return. Defaults to None, in which case the output is not bounded above.

Returns

The input transformed by the ReLU activation, with the same shape as x.

Return type

numpy.ndarray of float or usencrypt.cipher.Float

Examples
>>> import numpy as np
>>> import usencrypt as ue
>>> x = np.random.randn(3, 4)
>>> x
array([[-0.00355442,  0.03237752, -0.48492764,  0.04233708],
       [ 0.20589401, -0.04723594, -0.06618799, -0.41012695],
       [ 0.31960923,  0.37602457, -0.23324309, -0.2232066 ]])
>>> a = ue.ai.activations.relu(x)
>>> a
array([[0.        , 0.03237752, 0.        , 0.04233708],
       [0.20589401, 0.        , 0.        , 0.        ],
       [0.31960923, 0.37602457, 0.        , 0.        ]])