The Layers API

USEncrypt.ai provides a set of layers that can be composed into arbitrary architectures for the NeuralNetwork model. These include:

BatchNormLayer

Applies batch normalization to the input.

FCLayer

Fully-connected layer for neural networks, defined by the linear operation \(y = W^T X + b\), where \(W\) is the weights matrix, \(X\) is the features matrix, and \(b\) is the bias vector (see the sketch after this list).

DropoutLayer

Applies dropout to the input.

ReluLayer

Rectified Linear Unit (ReLU) activation function in layer format for neural networks.

SigmoidLayer

Sigmoid activation function in layer format for neural networks.

SoftmaxLayer

Softmax activation function in layer format for neural networks.

TanhLayer

Hyperbolic tangent activation function in layer format for neural networks.
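To make the layer definitions concrete, the NumPy sketch below evaluates the FCLayer operation \(y = W^T X + b\) and then applies each activation element-wise. The shapes and the one-column-per-sample layout are illustrative assumptions, not part of USEncrypt.ai's API.

```python
import numpy as np

# Illustrative shapes (assumptions): X holds one sample per column,
# matching the y = W^T X + b convention above.
rng = np.random.default_rng(0)
n_features, n_units, n_samples = 4, 3, 2

X = rng.standard_normal((n_features, n_samples))  # features matrix
W = rng.standard_normal((n_features, n_units))    # weights matrix
b = rng.standard_normal((n_units, 1))             # bias vector

# FCLayer forward pass: y = W^T X + b, shape (n_units, n_samples)
y = W.T @ X + b

# Activation layers apply an element-wise function to y:
relu = np.maximum(0.0, y)           # ReluLayer
sigmoid = 1.0 / (1.0 + np.exp(-y))  # SigmoidLayer
tanh = np.tanh(y)                   # TanhLayer

# SoftmaxLayer normalizes each column into a probability distribution;
# subtracting the column max first keeps the exponentials stable.
shifted = y - y.max(axis=0)
softmax = np.exp(shifted) / np.exp(shifted).sum(axis=0)
```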

These can be added to a NeuralNetwork model using its add() function.
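As a usage illustration, the snippet below stacks several layers into a small classifier with add(). This is a minimal sketch assuming a Keras-style sequential workflow; the import path usencrypt.ai, the layers namespace, and the constructor parameters shown are assumptions, since only NeuralNetwork and add() are documented above.

```python
import usencrypt.ai as ai  # hypothetical import path

# Build a small classifier by stacking layers with add();
# the constructor arguments below are illustrative assumptions.
model = ai.NeuralNetwork()
model.add(ai.layers.FCLayer(input_size=16, output_size=8))
model.add(ai.layers.BatchNormLayer())
model.add(ai.layers.ReluLayer())
model.add(ai.layers.DropoutLayer(rate=0.5))
model.add(ai.layers.FCLayer(input_size=8, output_size=3))
model.add(ai.layers.SoftmaxLayer())
```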