smash.factory.Net.set_trainable

Net.set_trainable(trainable)[source]

Method to train or freeze the weights and biases of the network's layers.

Parameters:
trainable : ListLike

List of booleans whose length equals the total number of layers in the network.

Note

Dropout, activation, and scaling functions are non-parametric layers, meaning they do not have any learnable weights or biases. Therefore, it is not necessary to set these layers as trainable since they do not involve any weight updates during training.
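To illustrate what freezing a layer means in general (this is a minimal sketch of a gradient-descent update, not smash's internal implementation; the names `weights`, `trainable`, and `sgd_step` are hypothetical), a training step simply skips the parameter update for layers whose trainable flag is False:

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights of two dense layers and a per-layer trainable flag
# (illustrative only; smash manages this internally).
weights = [rng.normal(size=(8, 32)), rng.normal(size=(32, 16))]
trainable = [True, False]  # freeze the second layer

def sgd_step(weights, grads, trainable, lr=0.01):
    """Update only the weights of trainable layers."""
    return [
        w - lr * g if t else w
        for w, g, t in zip(weights, grads, trainable)
    ]

grads = [np.ones_like(w) for w in weights]
new_weights = sgd_step(weights, grads, trainable)

# The frozen layer's weights are unchanged.
assert np.allclose(new_weights[1], weights[1])
# The trainable layer's weights moved by -lr * grad.
assert np.allclose(new_weights[0], weights[0] - 0.01)
```

This is why, in the total/trainable parameter counts shown below, only the parametric (dense) layers contribute.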

Examples

>>> net = smash.factory.Net()
>>> net.add(layer="dense", options={"input_shape": (8,), "neurons": 32})
>>> net.add(layer="activation", options={"name": "relu"})
>>> net.add(layer="dense", options={"neurons": 16})
>>> net.add(layer="activation", options={"name": "relu"})
>>> net
+-------------------------------------------------------+
| Layer Type         Input/Output Shape  Num Parameters |
+-------------------------------------------------------+
| Dense              (8,)/(32,)          288            |
| Activation (ReLU)  (32,)/(32,)         0              |
| Dense              (32,)/(16,)         528            |
| Activation (ReLU)  (16,)/(16,)         0              |
+-------------------------------------------------------+
Total parameters: 816
Trainable parameters: 816

Freeze the parameters in the second dense layer:

>>> net.set_trainable([True, False, False, False])
>>> net
+-------------------------------------------------------+
| Layer Type         Input/Output Shape  Num Parameters |
+-------------------------------------------------------+
| Dense              (8,)/(32,)          288            |
| Activation (ReLU)  (32,)/(32,)         0              |
| Dense              (32,)/(16,)         528            |
| Activation (ReLU)  (16,)/(16,)         0              |
+-------------------------------------------------------+
Total parameters: 816
Trainable parameters: 288