smash.factory.Net.add_dense

Net.add_dense(neurons, input_shape=None, activation=None, kernel_initializer='glorot_uniform', bias_initializer='zeros')

Add a fully-connected layer to the neural network.

This method adds a dense layer to the neural network graph but does not initialize its weight and bias values.

Parameters:
neurons : int

The number of neurons in the layer.

input_shape : int, tuple, list, or None, default None

The expected input shape of the layer. It must be specified if this is the first layer in the network.

activation : str or None, default None

If specified, an activation layer is added after the dense layer. Should be one of

  • 'relu' : Rectified Linear Unit

  • 'sigmoid' : Sigmoid

  • 'selu' : Scaled Exponential Linear Unit

  • 'elu' : Exponential Linear Unit

  • 'softmax' : Softmax

  • 'leakyrelu' : Leaky Rectified Linear Unit

  • 'tanh' : Hyperbolic Tangent

  • 'softplus' : Softplus

  • 'silu' : Sigmoid Linear Unit

kernel_initializer : str, default 'glorot_uniform'

Kernel initialization method. Should be one of 'uniform', 'glorot_uniform', 'he_uniform', 'normal', 'glorot_normal', 'he_normal', 'zeros'.

bias_initializer : str, default 'zeros'

Bias initialization method. Should be one of 'uniform', 'glorot_uniform', 'he_uniform', 'normal', 'glorot_normal', 'he_normal', 'zeros'.

Examples

>>> from smash.factory import Net
>>> net = Net()
>>> net.add_dense(128, input_shape=12, activation="relu")
>>> net.add_dense(32, activation="sigmoid")
>>> net
+----------------------------------------------------------+
| Layer Type            Input/Output Shape  Num Parameters |
+----------------------------------------------------------+
| Dense                 (12,)/(128,)        1664           |
| Activation (ReLU)     (128,)/(128,)       0              |
| Dense                 (128,)/(32,)        4128           |
| Activation (Sigmoid)  (32,)/(32,)         0              |
+----------------------------------------------------------+
Total parameters: 5792
Trainable parameters: 5792
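
The initializer arguments can be set per layer. As a further sketch, continuing the network above (this call is illustrative and not part of the original example, but uses only options listed in the parameter descriptions), a 10-neuron softmax output layer with He-normal kernel initialization:

>>> net.add_dense(10, activation="softmax", kernel_initializer="he_normal")

Following the parameter-count pattern visible in the summary above (inputs × neurons + biases), this layer would add 32 × 10 + 10 = 330 trainable parameters.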