SGD#

class usencrypt.ai.optimizers.SGD(learning_rate=0.01, momentum=0.0, _config=None)#

Stochastic gradient descent optimizer with optional momentum.

Parameters
  • learning_rate (float) – The step size applied to each parameter update. Defaults to 0.01.

  • momentum (float) – Momentum factor that accelerates updates along persistent gradient directions and dampens oscillations. Defaults to 0.0.
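
The classic SGD-with-momentum update these parameters control can be sketched in plain Python (an illustrative sketch of the standard rule, not the library's internal implementation):

>>> def sgd_momentum_step(w, grad, velocity, learning_rate=0.01, momentum=0.0):
...     # Velocity accumulates a decaying average of past gradients;
...     # the parameter then moves along the accumulated velocity.
...     velocity = momentum * velocity - learning_rate * grad
...     return w + velocity, velocity
...
>>> # Minimize f(w) = w**2 (gradient 2*w) starting from w = 1.0.
>>> w, v = 1.0, 0.0
>>> for _ in range(100):
...     w, v = sgd_momentum_step(w, 2.0 * w, v, learning_rate=0.1, momentum=0.9)

With momentum=0.0 this reduces to vanilla gradient descent; values near 0.9 are a common starting point.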

Examples

A usencrypt.ai.nn.NeuralNetwork object can be configured to use the SGD optimizer through the network’s compile() function:

>>> import usencrypt as ue
>>> net = ue.ai.nn.NeuralNetwork()
>>> net.add(ue.ai.nn.layers.FCLayer(4, 5))
>>> net.add(ue.ai.nn.layers.ReluLayer())
>>> net.add(ue.ai.nn.layers.FCLayer(5, 3))
>>> net.add(ue.ai.nn.layers.SoftmaxLayer())
>>> loss = ue.ai.losses.CategoricalCrossEntropy()
>>> optimizer = ue.ai.optimizers.SGD(learning_rate=0.001, momentum=0.9)
>>> net.compile(loss, optimizer)