compile(optimizer='sgd')

Net.compile(optimizer='sgd', options=None, random_state=None)

Compile the neural network with Stochastic Gradient Descent (SGD) optimizer.

See also

For documentation of the remaining parameters, see Net.compile.

Options:

learning_rate : float, default 0.01

The learning rate used to update the weights during training.

momentum : float, default 0

The momentum used to smooth the gradient updates.
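A minimal usage sketch follows. It assumes Python, that Net can be constructed without arguments, and that options is passed as a dict with the keys documented above; none of these details are confirmed by this page, so treat them as illustrative only.

    # Hypothetical sketch: the import path, Net() construction, and the
    # dict form of `options` are assumptions for illustration.
    from mynet import Net

    net = Net()

    # Compile with SGD. A non-zero momentum smooths the gradient updates
    # by carrying over a fraction of the previous update:
    #   v_t = momentum * v_{t-1} - learning_rate * grad
    #   w_t = w_{t-1} + v_t
    net.compile(
        optimizer='sgd',
        options={'learning_rate': 0.01, 'momentum': 0.9},
        random_state=0,
    )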