smash.factory.Net.add_conv2d

Net.add_conv2d(filters, filter_shape, input_shape=None, activation=None, kernel_initializer='glorot_uniform', bias_initializer='zeros')

Add a 2D convolutional layer (with 'same' padding and a stride of one) to the neural network.

This method adds a 2D convolutional layer into the neural network graph but does not initialize its weight and bias values.

Parameters:
filters : int

The number of filters in the convolutional layer.

filter_shape : int or tuple

The size of the convolutional window.

input_shape : tuple, list, or None, default None

The expected input shape of the layer. It must be specified if this is the first layer in the network.

activation : str or None, default None

Add an activation layer following the current layer if specified. Should be one of:

  • 'relu' : Rectified Linear Unit

  • 'sigmoid' : Sigmoid

  • 'selu' : Scaled Exponential Linear Unit

  • 'elu' : Exponential Linear Unit

  • 'softmax' : Softmax

  • 'leakyrelu' : Leaky Rectified Linear Unit

  • 'tanh' : Hyperbolic Tangent

  • 'softplus' : Softplus

  • 'silu' : Sigmoid Linear Unit

kernel_initializer : str, default 'glorot_uniform'

Kernel initialization method. Should be one of 'uniform', 'glorot_uniform', 'he_uniform', 'normal', 'glorot_normal', 'he_normal', 'zeros'.

bias_initializer : str, default 'zeros'

Bias initialization method. Should be one of 'uniform', 'glorot_uniform', 'he_uniform', 'normal', 'glorot_normal', 'he_normal', 'zeros'.
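As a point of reference for choosing between these methods, the snippet below computes the conventional Glorot (Xavier) and He uniform sampling bounds for a convolutional kernel. This is a sketch of the textbook definitions, not a guaranteed reproduction of smash's internal implementation, which may differ in details such as fan computation:

```python
import math

def glorot_uniform_bound(fan_in, fan_out):
    """Conventional Glorot uniform bound: weights drawn from
    U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))."""
    return math.sqrt(6.0 / (fan_in + fan_out))

def he_uniform_bound(fan_in):
    """Conventional He uniform bound: limit = sqrt(6 / fan_in),
    typically preferred with ReLU-family activations."""
    return math.sqrt(6.0 / fan_in)

# For a Conv2D kernel with filter_shape=(8, 6), 3 input channels
# and 128 filters, the usual fan computation is:
fan_in = 8 * 6 * 3     # receptive field size * input channels
fan_out = 8 * 6 * 128  # receptive field size * output channels
print(glorot_uniform_bound(fan_in, fan_out))
print(he_uniform_bound(fan_in))
```

As a rule of thumb, 'glorot_*' methods suit sigmoid/tanh-like activations while 'he_*' methods suit ReLU-like ones.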

Examples

>>> from smash.factory import Net
>>> net = Net()
>>> net.add_conv2d(128, filter_shape=(8, 6), input_shape=(56, 50, 3), activation="relu")
>>> net.add_conv2d(32, filter_shape=(8, 6), activation="leakyrelu")
>>> net
+---------------------------------------------------------------------+
| Layer Type              Input/Output Shape           Num Parameters |
+---------------------------------------------------------------------+
| Conv2D                  (56, 50, 3)/(56, 50, 128)    18560          |
| Activation (ReLU)       (56, 50, 128)/(56, 50, 128)  0              |
| Conv2D                  (56, 50, 128)/(56, 50, 32)   196640         |
| Activation (LeakyReLU)  (56, 50, 32)/(56, 50, 32)    0              |
+---------------------------------------------------------------------+
Total parameters: 215200
Trainable parameters: 215200
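The parameter counts in the table above follow the standard Conv2D formula: one (height x width x input_channels) kernel plus one bias per filter. A quick check in plain Python (independent of smash):

```python
def conv2d_params(filters, filter_shape, in_channels):
    """Standard Conv2D parameter count: each filter has a
    (kh * kw * in_channels) kernel plus one bias term."""
    kh, kw = filter_shape
    return (kh * kw * in_channels + 1) * filters

first = conv2d_params(128, (8, 6), 3)    # (8*6*3 + 1) * 128 = 18560
second = conv2d_params(32, (8, 6), 128)  # (8*6*128 + 1) * 32 = 196640
print(first, second, first + second)     # 18560 196640 215200
```

Note that with 'same' padding and a stride of one, the spatial dimensions (56, 50) are preserved through each layer; only the channel dimension changes.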