add(layer='activation')

Net.add(layer, options)

Activation layer that applies a specified activation function to the input.

See also

For documentation of the remaining parameters, see Net.add.

Options:
name : str

The name of the activation function to use. Must be one of:

  • 'relu' : Rectified Linear Unit

  • 'sigmoid' : Sigmoid

  • 'selu' : Scaled Exponential Linear Unit

  • 'elu' : Exponential Linear Unit

  • 'softmax' : Softmax

  • 'leaky_relu' : Leaky Rectified Linear Unit

  • 'tanh' : Hyperbolic Tangent

  • 'softplus' : Softplus
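
Example (a minimal usage sketch; the import path and the preceding 'dense' layer with its units option are assumptions for illustration only, while the activation layer follows the signature documented above):

  from mylib import Net  # hypothetical import path

  net = Net()
  net.add(layer='dense', units=64)          # hypothetical preceding layer
  net.add(layer='activation', name='relu')  # ReLU applied to the dense output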