# Layers

The following layers are supported in USEncrypt.ai:

| Layer | Description |
| --- | --- |
| `Layer` | Base class for neural network layers. |
| `ActivationLayer` | Base class for neural network activation layers. |
| `BatchNormLayer` | Applies batch normalization to the input. |
| `DropoutLayer` | Applies dropout to the input. |
| `FCLayer` | Fully-connected layer for neural networks, defined by the linear operation $y = W^T X + b$, where $W$ is the weights matrix, $X$ is the features matrix, and $b$ is the bias vector. |
| `ReluLayer` | Rectified Linear Unit (ReLU) activation function in layer format for neural networks. |
| `SigmoidLayer` | Sigmoid activation function in layer format for neural networks. |
| `SoftmaxLayer` | Softmax activation function in layer format for neural networks. |
| `TanhLayer` | Hyperbolic tangent activation function in layer format for neural networks. |
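To make the operations behind these layers concrete, here is a minimal NumPy sketch of the fully-connected forward pass $y = W^T X + b$ and the listed activation functions. This is an illustration of the underlying math only, not USEncrypt.ai's API; all function names below are hypothetical.

```python
import numpy as np

# Hypothetical illustration of the layer math above (not USEncrypt.ai code).

def fc_forward(X, W, b):
    """Fully-connected layer: y = W^T X + b, with samples as columns of X."""
    return W.T @ X + b

def relu(x):
    # ReLU: elementwise max(0, x)
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid: squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes inputs into (-1, 1)
    return np.tanh(x)

def softmax(x, axis=0):
    # Softmax over each column: stable via max-subtraction
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Example: 3 input features, 2 outputs, batch of 4 samples (one per column).
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))   # features matrix
W = rng.standard_normal((3, 2))   # weights matrix
b = rng.standard_normal((2, 1))   # bias vector (broadcast across samples)

y = fc_forward(X, W, b)           # shape (2, 4)
probs = softmax(y)                # each column sums to 1
```

With the column-per-sample convention shown, chaining layers is just function composition, e.g. `softmax(fc_forward(relu(fc_forward(X, W1, b1)), W2, b2))`.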