smash.Net.compile
- Net.compile(optimizer='adam', options=None, random_state=None)
Compile the network and set optimizer.
- Parameters:
- optimizer : str, default ‘adam’
Name of the optimizer. Should be one of:
‘sgd’
‘adam’
‘adagrad’
‘rmsprop’
- options : dict or None, default None
A dictionary of optimizer options.
Hint
The available options differ between optimizers; refer to the documentation of each optimizer (‘sgd’, ‘adam’, ‘adagrad’, ‘rmsprop’) for its specific options.
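As a small sketch, the options argument is a plain dict whose keys depend on the chosen optimizer. The ‘sgd’ keys below are taken from the Examples section of this page; option keys for the other optimizers are not listed here and would need to be checked in their own documentation.

```python
# Options dict for the 'sgd' optimizer, using the keys shown in the
# Examples section below (other optimizers accept different keys).
sgd_options = {"learning_rate": 0.009, "momentum": 0.001}

# The dict is then passed to compile, e.g.:
# net.compile(optimizer='sgd', options=sgd_options)
```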
- random_state : int or None, default None
Random seed used to initialize weights.
Note
If not given, the weights are initialized with an unpredictable random seed, so results will differ between runs.
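The effect of random_state can be illustrated with plain NumPy (an analogy only, not the smash internals): a fixed seed reproduces the same initial weight matrix every time, while different seeds give different weights.

```python
import numpy as np

# Analogy only: reproducing the same initial weight matrix from a fixed
# seed, as random_state does for Net.compile (smash internals assumed).
def init_weights(seed):
    rng = np.random.default_rng(seed)
    # A (6, 16) dense weight matrix, matching the example layer shape.
    return rng.standard_normal((6, 16))

same_a = init_weights(11)
same_b = init_weights(11)
other = init_weights(42)

assert (same_a == same_b).all()   # same seed -> identical weights
assert not (same_a == other).all()  # different seed -> different weights
```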
Examples
>>> net = smash.Net()
Define graph
>>> net.add(layer="dense", options={"input_shape": (6,), "neurons": 16})
>>> net.add(layer="activation", options={"name": "relu"})
Compile the network
>>> net.compile(optimizer='sgd', options={'learning_rate': 0.009, 'momentum': 0.001})
>>> net
+--------------------------------------------------------+
| Layer Type          Input/Output Shape  Num Parameters |
+--------------------------------------------------------+
| Dense               (6,)/(16,)          112            |
| Activation (ReLU)   (16,)/(16,)         0              |
+--------------------------------------------------------+
Total parameters: 112
Trainable parameters: 112
Optimizer: (sgd, lr=0.009)
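The 112 parameters reported for the Dense layer can be checked by hand: a dense layer with 6 inputs and 16 neurons has 6 × 16 weights, plus 16 biases (one bias per neuron is an assumption, but it is implied by the reported total).

```python
# Parameter count for the Dense layer above: one weight per
# input/neuron pair, plus (assumed) one bias per neuron.
input_size, neurons = 6, 16
n_params = input_size * neurons + neurons
print(n_params)  # -> 112, matching the summary table
```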