compile(optimizer='adam')

Net.compile(optimizer='adam', options=None, random_state=None)

Compile the neural network with Adaptive Moment Estimation (Adam) optimizer.

See also

For documentation of the remaining parameters, see Net.compile.

Options:

learning_rate : float, default 0.001

The learning rate used to update the weights during training.

b1 : float, default 0.9

Exponential decay rate for the first moment estimate.

b2 : float, default 0.999

Exponential decay rate for the second moment estimate.
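To illustrate what these options control, the following is a minimal NumPy sketch of the Adam update rule (Kingma and Ba's Adaptive Moment Estimation), not the library's internal implementation; the `eps` term is a standard numerical-stability constant assumed here, not a parameter documented above.

```python
import numpy as np

def adam_step(w, grad, m, v, t, learning_rate=0.001, b1=0.9, b2=0.999,
              eps=1e-8):
    """One Adam update; returns the new weights and moment estimates.

    b1 and b2 are the exponential decay rates for the first and second
    moment estimates, matching the options documented above.
    """
    m = b1 * m + (1 - b1) * grad          # first moment: running mean of gradients
    v = b2 * v + (1 - b2) * grad ** 2     # second moment: running mean of squared gradients
    m_hat = m / (1 - b1 ** t)             # bias correction for initialization at zero
    v_hat = v / (1 - b2 ** t)
    w = w - learning_rate * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Example: minimize f(w) = w**2, whose gradient is 2*w, starting at w = 1.0.
w, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t)
```

Because Adam normalizes each step by the second-moment estimate, the effective step size stays close to `learning_rate` regardless of the raw gradient scale, which is why the default of 0.001 works across many problems.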