smash.Model.nn_parameters
property Model.nn_parameters
The weight and bias of the parameterization neural network.
The neural network is used in hybrid model structures to correct internal fluxes.
- Returns:
    - nn_parameters : NN_ParametersDT
      It returns a Fortran derived type containing the weight and bias of the parameterization neural network.
See also
Model.get_nn_parameters_weight : Get the weight of the parameterization neural network.
Model.get_nn_parameters_bias : Get the bias of the parameterization neural network.
Model.set_nn_parameters_weight : Set the values of the weight in the parameterization neural network.
Model.set_nn_parameters_bias : Set the values of the bias in the parameterization neural network.
Examples
>>> import smash
>>> from smash.factory import load_dataset
>>> setup, mesh = load_dataset("cance")
Set the hydrological module to 'gr4_mlp' (hybrid hydrological model with multilayer perceptron)
>>> setup["hydrological_module"] = "gr4_mlp"
>>> model = smash.Model(setup, mesh)
By default, the weight and bias of the parameterization neural network are set to zero. Access their values with the getter methods get_nn_parameters_weight or get_nn_parameters_bias
>>> model.get_nn_parameters_bias()
[array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.], dtype=float32), array([0., 0., 0., 0.], dtype=float32)]
The output contains a list of weight or bias values for trainable layers.
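Both getters return one array per trainable layer, so with the default structure both lists have two entries. A minimal sketch, with hypothetical variable names
>>> weight_list = model.get_nn_parameters_weight()
>>> bias_list = model.get_nn_parameters_bias()
>>> len(weight_list), len(bias_list)
(2, 2)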
Set random values with the setter methods set_nn_parameters_weight or set_nn_parameters_bias using available initializers
>>> model.set_nn_parameters_bias(initializer="uniform", random_state=0)
>>> model.get_nn_parameters_bias()
[array([ 0.09762701, 0.43037874, 0.20552675, 0.08976637, -0.1526904 , 0.29178822, -0.12482557, 0.78354603, 0.92732555, -0.23311697, 0.5834501 , 0.05778984, 0.13608912, 0.85119325, -0.85792786, -0.8257414 ], dtype=float32), array([-0.9595632 , 0.6652397 , 0.5563135 , 0.74002427], dtype=float32)]
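The weight setter works the same way. A sketch, assuming set_nn_parameters_weight accepts the same initializer and random_state keyword arguments as the bias setter (see its own documentation page)
>>> model.set_nn_parameters_weight(initializer="uniform", random_state=0)
>>> model.set_nn_parameters_bias(initializer="zeros")  # assuming "zeros" is among the available initializers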
If you are using IPython, tab completion allows you to visualize all the attributes and methods
>>> model.nn_parameters.<TAB>
model.nn_parameters.bias_1        model.nn_parameters.from_handle(
model.nn_parameters.bias_2        model.nn_parameters.weight_1
model.nn_parameters.bias_3        model.nn_parameters.weight_2
model.nn_parameters.copy()        model.nn_parameters.weight_3
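Each attribute can also be accessed directly. As a sketch, assuming bias_1 and bias_2 correspond to the first and second trainable layers, their sizes match the getter output above
>>> model.nn_parameters.bias_1.size, model.nn_parameters.bias_2.size
(16, 4)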
Note
Not all layer weights and biases are used in the neural network. The default network only uses 2 layers, which means that weight_3 and bias_3 are not used and are empty arrays in this case
>>> model.nn_parameters.weight_3.size, model.nn_parameters.bias_3.size
(0, 0)
To set another neural network structure
>>> setup["hidden_neuron"] = (32, 16) >>> model_2 = smash.Model(setup, mesh)
In this case, the number of layers is 3 instead of 2
>>> weights = model_2.get_nn_parameters_weight()
>>> len(weights)
3
>>> weights[2].size
64
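The 64 elements are consistent with hidden_neuron[1] = 16 feeding an output layer of 4 neurons, as suggested by the last bias array in the default configuration. A sketch of that check, assuming the output layer size is fixed by the hydrological module rather than by hidden_neuron
>>> biases = model_2.get_nn_parameters_bias()
>>> biases[2].size  # assumed: 16 * 4 = 64 weight elements connect to 4 output neurons
4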