BatchNormLayer#

class usencrypt.ai.nn.layers.BatchNormLayer(axis=-1, momentum=0.99, epsilon=0.001, center=True, scale=True, beta_initializer='zeros', gamma_initializer='ones', _config=None)#

Applies batch normalization to the input.

Parameters
  • axis (int) – The axis along which to normalize the input. Defaults to -1.

  • momentum (float) – Momentum value used for the moving average. Defaults to 0.99.

  • epsilon (float) – Small value added to the variance to avoid division by zero. Defaults to 1e-3.

  • center (bool) – If True, the beta offset tensor is added to the normalized output. If False, beta is ignored.

  • scale (bool) – If True, the normalized output is multiplied by the gamma scale tensor. If False, gamma is ignored.

  • beta_initializer (str) – Initializer for the beta parameter.

  • gamma_initializer (str) – Initializer for the gamma parameter.

Variables
  • axis (int) – The axis along which to normalize the input. Defaults to -1.

  • momentum (float) – Momentum value used for the moving average. Defaults to 0.99.

  • epsilon (float) – Small value added to the variance to avoid division by zero. Defaults to 1e-3.

  • center (bool) – If True, the beta offset tensor is added to the normalized output. If False, beta is ignored.

  • scale (bool) – If True, the normalized output is multiplied by the gamma scale tensor. If False, gamma is ignored.

  • beta_initializer (str) – Initializer for the beta parameter.

  • gamma_initializer (str) – Initializer for the gamma parameter.

  • name (str) – The layer’s string identifier.

  • params (dict) – Dictionary containing all trainable parameters.

  • params_prime (dict) – Dictionary containing all parameter gradients.
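
The roles of axis, epsilon, gamma (scale), and beta (center) can be sketched in plain NumPy. This is an illustration of standard batch normalization only, not of usencrypt's encrypted implementation:

```python
import numpy as np

def batch_norm(x, gamma, beta, axis=-1, epsilon=1e-3):
    """Plain-NumPy sketch of standard batch normalization (illustration only)."""
    # Reduce over every axis except the one being normalized,
    # mirroring the layer's `axis` parameter.
    reduce_axes = tuple(i for i in range(x.ndim) if i != axis % x.ndim)
    mean = x.mean(axis=reduce_axes, keepdims=True)
    var = x.var(axis=reduce_axes, keepdims=True)
    # epsilon keeps the denominator away from zero.
    x_hat = (x - mean) / np.sqrt(var + epsilon)
    # gamma scales (scale=True) and beta offsets (center=True) the result.
    return gamma * x_hat + beta

x = np.random.randn(8, 4)                              # batch of 8, 4 features
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))  # per-feature mean ~0, variance ~1
```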

Inheritance

usencrypt.ai.nn.layers.Layer

Examples

Like all layers, the batch normalization layer can be added to the top of a neural network architecture stack as follows:

>>> import usencrypt as ue
>>> net = ue.ai.nn.NeuralNetwork()
>>> net.add(ue.ai.nn.layers.FCLayer(4, 3))
>>> net.add(ue.ai.nn.layers.BatchNormLayer())
>>> net.add(ue.ai.nn.layers.SoftmaxLayer())
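
The momentum parameter governs the moving averages that batch-norm layers typically maintain during training for use at inference time. A minimal sketch of that standard bookkeeping (usencrypt's internal update may differ):

```python
def update_moving(moving_mean, moving_var, batch_mean, batch_var, momentum=0.99):
    """Exponential moving average update used by typical batch-norm layers.

    A momentum close to 1 weights the running statistics heavily, so each
    new batch nudges them only slightly.
    """
    new_mean = momentum * moving_mean + (1 - momentum) * batch_mean
    new_var = momentum * moving_var + (1 - momentum) * batch_var
    return new_mean, new_var

# With momentum=0.9, a batch mean of 1.0 moves the running mean from 0.0 to 0.1.
m, v = update_moving(0.0, 1.0, 1.0, 2.0, momentum=0.9)
```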