Activations

The following activation functions are supported in USEncrypt.ai:

relu

Applies a rectified linear unit (ReLU) activation, defined as:

ReLU(x) = max(0, x)
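A minimal NumPy sketch of this behavior (illustrative only, not USEncrypt.ai's API; the function name is an assumption):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negative entries are clamped to zero,
    # non-negative entries pass through unchanged.
    return np.maximum(0, x)

y = relu(np.array([-2.0, 0.0, 3.0]))
# → [0.0, 0.0, 3.0]
```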

sigmoid

Applies a sigmoid activation, formally defined as:

sigmoid(x) = 1 / (1 + exp(-x))
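A minimal NumPy sketch of the sigmoid (illustrative only, not USEncrypt.ai's API):

```python
import numpy as np

def sigmoid(x):
    # Maps any real input into (0, 1); sigmoid(0) == 0.5.
    return 1.0 / (1.0 + np.exp(-x))

y = sigmoid(np.array([-1.0, 0.0, 1.0]))
```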

softmax

Applies a softmax activation to the input, transforming each row into a probability distribution (non-negative entries that sum to 1), defined as:

softmax(x)_i = exp(x_i) / Σ_j exp(x_j)
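A minimal NumPy sketch of a row-wise softmax (illustrative only, not USEncrypt.ai's API). Subtracting the row maximum before exponentiating is a standard numerical-stability trick and does not change the result:

```python
import numpy as np

def softmax(x, axis=-1):
    # Shift by the row max so exp() never overflows; the shift cancels
    # out in the ratio, leaving the same probabilities.
    z = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=axis, keepdims=True)

p = softmax(np.array([[1.0, 2.0, 3.0],
                      [0.0, 0.0, 0.0]]))
# Each row of p sums to 1; a uniform row maps to uniform probabilities.
```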

tanh

Applies a hyperbolic tangent (tanh) activation, defined as:

tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
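A minimal NumPy sketch of tanh from its exponential definition (illustrative only, not USEncrypt.ai's API; in practice `np.tanh` computes the same thing directly):

```python
import numpy as np

def tanh(x):
    # Ratio of exponentials; maps any real input into (-1, 1),
    # with tanh(0) == 0.
    e_pos, e_neg = np.exp(x), np.exp(-x)
    return (e_pos - e_neg) / (e_pos + e_neg)

y = tanh(np.array([0.0, 1.0]))
```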