compile(optimizer='adagrad')

Net.compile(optimizer='adagrad', options=None, random_state=None)

Compile the neural network with the Adaptive Gradient (Adagrad) optimizer.

See also

For documentation of the remaining parameters, see Net.compile.
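
For background, Adagrad adapts a per-parameter learning rate by dividing each update by the square root of the accumulated squared gradients, so frequently updated parameters receive smaller steps over time. The sketch below illustrates the standard update rule in NumPy; it is a minimal illustration of the optimizer's behaviour, not this library's internal implementation, and the eps constant is an assumed numerical-stability term.

    import numpy as np

    def adagrad_step(w, grad, accum, learning_rate=0.01, eps=1e-8):
        # Accumulate the squared gradient for each parameter.
        accum = accum + grad ** 2
        # Scale the step by the inverse square root of the accumulated history.
        w = w - learning_rate * grad / (np.sqrt(accum) + eps)
        return w, accum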

Options:
learning_rate : float, default=0.01
    The learning rate used to update the weights during training.
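
Examples

A minimal usage sketch, assuming Net has already been constructed elsewhere and that options accepts a dictionary of the settings listed above (both are assumptions; only the signature shown here is documented):

    net = Net()  # hypothetical constructor call; see the Net class documentation
    net.compile(
        optimizer='adagrad',
        options={'learning_rate': 0.01},  # assumed dict form of the learning_rate option
        random_state=0,
    )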