ReluLayer

class usencrypt.ai.nn.layers.ReluLayer

Rectified Linear Unit (ReLU) activation function wrapped as a layer for use in neural networks. ReLU maps each input element x to max(0, x), passing positive values through unchanged and zeroing out negative ones.
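
The function itself is easy to state directly. The following NumPy sketch is illustrative only and is not part of the usencrypt API:

>>> import numpy as np
>>> def relu(x):
...     # Element-wise max(0, x): negative entries become 0.
...     return np.maximum(0, x)
>>> relu(np.array([-2.0, 0.0, 3.0]))
array([0., 0., 3.])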

Parameters

name (str, optional) – The layer’s string identifier.
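
A layer can be given an explicit identifier at construction time via the documented name parameter; the identifier "relu_1" below is arbitrary:

>>> import usencrypt as ue
>>> layer = ue.ai.nn.layers.ReluLayer(name="relu_1")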

Inheritance

[Inheritance diagram of ReluLayer]

Examples

Like all layers, the ReLU layer can be added to the top of a neural network's architecture stack as follows:

>>> import usencrypt as ue
>>> net = ue.ai.nn.NeuralNetwork()
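>>> # Stack the ReLU activation directly on top of the fully connected layer.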
>>> net.add(ue.ai.nn.layers.FCLayer(input_size=4, output_size=3))
>>> net.add(ue.ai.nn.layers.ReluLayer())
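
The same pattern extends to deeper architectures, with a ReluLayer typically placed after each layer with learnable weights. Continuing the stack above (the layer sizes here are illustrative):

>>> net.add(ue.ai.nn.layers.FCLayer(input_size=3, output_size=2))
>>> net.add(ue.ai.nn.layers.ReluLayer())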