smash.default_optimize_options
- smash.default_optimize_options(model, mapping='uniform', optimizer=None)
Default optimization options of the Model.
- Parameters:
  - model : Model
    Primary data structure of the hydrological model smash.
  - mapping : str, default 'uniform'
    Type of mapping. Should be one of:
    'uniform', 'distributed', 'multi-linear', 'multi-power', 'ann'
    Hint
    See the Mapping section.
  - optimizer : str or None, default None
    Name of optimizer. Should be one of:
    'sbs' (only for 'uniform' mapping)
    'nelder-mead' (only for 'uniform' mapping)
    'powell' (only for 'uniform' mapping)
    'lbfgsb' (for all mappings except 'ann')
    'adam' (for all mappings)
    'adagrad' (for all mappings)
    'rmsprop' (for all mappings)
    'sgd' (for all mappings)
    Note
    If not given, a default optimizer will be set as follows:
    'sbs' for mapping = 'uniform'
    'lbfgsb' for mapping = 'distributed', 'multi-linear' or 'multi-power'
    'adam' for mapping = 'ann'
    Hint
    See the Optimization Algorithms section.
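The mapping-to-optimizer fallback described in the note above can be sketched in plain Python. This is an illustrative, hypothetical helper (`resolve_optimizer` is not part of the smash API); only the mapping names and their defaults come from the documentation.

```python
# Default optimizer per mapping, as stated in the note above.
DEFAULT_OPTIMIZER = {
    "uniform": "sbs",
    "distributed": "lbfgsb",
    "multi-linear": "lbfgsb",
    "multi-power": "lbfgsb",
    "ann": "adam",
}

def resolve_optimizer(mapping, optimizer=None):
    """Return the optimizer to use, falling back to the mapping's default.

    Hypothetical sketch, not part of smash.
    """
    if optimizer is not None:
        return optimizer  # an explicit choice always wins
    if mapping not in DEFAULT_OPTIMIZER:
        raise ValueError(f"Unknown mapping '{mapping}'")
    return DEFAULT_OPTIMIZER[mapping]

print(resolve_optimizer("uniform"))     # sbs
print(resolve_optimizer("ann"))         # adam
print(resolve_optimizer("ann", "sgd"))  # sgd
```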
- Returns:
  - optimize_options : dict[str, Any]
    Dictionary containing optimization options for fine-tuning the optimization process. The specific keys returned depend on the chosen mapping and optimizer. This dictionary can be passed directly to the optimize_options argument of the optimize (or Model.optimize) method.
Examples
>>> import smash
>>> from smash.factory import load_dataset
>>> setup, mesh = load_dataset("cance")
>>> model = smash.Model(setup, mesh)
Get the default optimization options for the 'uniform' mapping
>>> opt_u = smash.default_optimize_options(model, mapping="uniform")
>>> opt_u
{
    'parameters': ['cp', 'ct', 'kexc', 'llr'],
    'bounds': {
        'cp': (1e-06, 1000.0),
        'ct': (1e-06, 1000.0),
        'kexc': (-50, 50),
        'llr': (1e-06, 1000.0)
    },
    'control_tfm': 'sbs',
    'termination_crit': {'maxiter': 50},
}
Pass this dictionary directly to the optimize_options argument of the optimize (or Model.optimize) method. This is equivalent to setting optimize_options to None (the default value)
>>> model_u = smash.optimize(model, mapping="uniform", optimize_options=opt_u)
</> Optimize
    At iterate     0    nfg =   1    J = 6.95010e-01    ddx = 0.64
    At iterate     1    nfg =  30    J = 9.84107e-02    ddx = 0.64
    At iterate     2    nfg =  59    J = 4.54087e-02    ddx = 0.32
    At iterate     3    nfg =  88    J = 3.81818e-02    ddx = 0.16
    At iterate     4    nfg = 117    J = 3.73617e-02    ddx = 0.08
    At iterate     5    nfg = 150    J = 3.70873e-02    ddx = 0.02
    At iterate     6    nfg = 183    J = 3.68004e-02    ddx = 0.02
    At iterate     7    nfg = 216    J = 3.67635e-02    ddx = 0.01
    At iterate     8    nfg = 240    J = 3.67277e-02    ddx = 0.01
    CONVERGENCE: DDX < 0.01
Customize the optimization options by removing 'kexc' from the optimized parameters
>>> opt_u["parameters"].remove("kexc")
>>> opt_u
{
    'parameters': ['cp', 'ct', 'llr'],
    'bounds': {
        'cp': (1e-06, 1000.0),
        'ct': (1e-06, 1000.0),
        'kexc': (-50, 50),
        'llr': (1e-06, 1000.0)
    },
    'control_tfm': 'sbs',
    'termination_crit': {'maxiter': 50},
}
Run the optimization method
>>> model_u = smash.optimize(model, mapping="uniform", optimize_options=opt_u)
ValueError: Unknown, non optimized, or unbounded parameter 'kexc' in bounds optimize_options. Choices: ['cp', 'ct', 'llr']
An error is raised because we defined bounds for the non-optimized parameter 'kexc'. Remove 'kexc' from bounds as well
>>> opt_u["bounds"].pop("kexc")
(-50, 50)
Note
The built-in dictionary method pop returns the value associated with the removed key
>>> opt_u
{
    'parameters': ['cp', 'ct', 'llr'],
    'bounds': {
        'cp': (1e-06, 1000.0),
        'ct': (1e-06, 1000.0),
        'llr': (1e-06, 1000.0)
    },
    'control_tfm': 'sbs',
    'termination_crit': {'maxiter': 50},
}
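Because removing a parameter without also removing its bound raises a ValueError, the two steps can be combined in a small convenience helper. This is a sketch, not part of the smash API; it operates on the plain dictionary returned by default_optimize_options.

```python
def drop_parameter(optimize_options, name):
    """Remove `name` from both 'parameters' and 'bounds' so they stay consistent.

    Hypothetical helper, not part of smash.
    """
    if name in optimize_options["parameters"]:
        optimize_options["parameters"].remove(name)
    # pop with a default avoids a KeyError if the bound is already gone
    optimize_options["bounds"].pop(name, None)
    return optimize_options

# Example on a dictionary shaped like the one shown above.
opt = {
    "parameters": ["cp", "ct", "kexc", "llr"],
    "bounds": {
        "cp": (1e-06, 1000.0),
        "ct": (1e-06, 1000.0),
        "kexc": (-50, 50),
        "llr": (1e-06, 1000.0),
    },
}
drop_parameter(opt, "kexc")
print(opt["parameters"])        # ['cp', 'ct', 'llr']
print("kexc" in opt["bounds"])  # False
```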
Run the optimization again to see the differences resulting from the change in the control vector
>>> model_u = smash.optimize(model, mapping="uniform", optimize_options=opt_u)
</> Optimize
    At iterate     0    nfg =   1    J = 6.95010e-01    ddx = 0.64
    At iterate     1    nfg =  17    J = 1.28863e-01    ddx = 0.64
    At iterate     2    nfg =  32    J = 6.94838e-02    ddx = 0.32
    At iterate     3    nfg =  49    J = 4.50720e-02    ddx = 0.16
    At iterate     4    nfg =  65    J = 4.40468e-02    ddx = 0.08
    At iterate     5    nfg =  84    J = 4.35278e-02    ddx = 0.04
    At iterate     6    nfg = 102    J = 4.26906e-02    ddx = 0.02
    At iterate     7    nfg = 122    J = 4.26645e-02    ddx = 0.01
    At iterate     8    nfg = 140    J = 4.26062e-02    ddx = 0.01
    CONVERGENCE: DDX < 0.01
Get the default optimization options for a different mapping
>>> opt_ann = smash.default_optimize_options(model, mapping="ann")
>>> opt_ann
{
    'parameters': ['cp', 'ct', 'kexc', 'llr'],
    'bounds': {
        'cp': (1e-06, 1000.0),
        'ct': (1e-06, 1000.0),
        'kexc': (-50, 50),
        'llr': (1e-06, 1000.0)
    },
    'net':
    +----------------------------------------------------------+
    | Layer Type            Input/Output Shape  Num Parameters |
    +----------------------------------------------------------+
    | Dense                 (2,)/(18,)          54             |
    | Activation (ReLU)     (18,)/(18,)         0              |
    | Dense                 (18,)/(9,)          171            |
    | Activation (ReLU)     (9,)/(9,)           0              |
    | Dense                 (9,)/(4,)           40             |
    | Activation (Sigmoid)  (4,)/(4,)           0              |
    | Scale (MinMaxScale)   (4,)/(4,)           0              |
    +----------------------------------------------------------+
    Total parameters: 265
    Trainable parameters: 265,
    'learning_rate': 0.003,
    'random_state': None,
    'termination_crit': {'maxiter': 200, 'early_stopping': 0}
}
Again, customize the optimization options and optimize the Model
>>> opt_ann["learning_rate"] = 0.006
>>> opt_ann["termination_crit"]["maxiter"] = 50
>>> opt_ann["termination_crit"]["early_stopping"] = 5
>>> opt_ann["random_state"] = 21
>>> model.optimize(mapping="ann", optimize_options=opt_ann)
</> Optimize
    At iterate     0    nfg =   1    J = 1.22206e+00    |proj g| = 2.09135e-04
    At iterate     1    nfg =   2    J = 1.21931e+00    |proj g| = 2.39937e-04
    ...
    At iterate    40    nfg =  41    J = 5.21514e-02    |proj g| = 1.31863e-02
    At iterate    41    nfg =  42    J = 5.12064e-02    |proj g| = 3.74748e-03
    At iterate    42    nfg =  43    J = 5.79208e-02    |proj g| = 5.08674e-03
    At iterate    43    nfg =  44    J = 6.38050e-02    |proj g| = 1.01001e-02
    At iterate    44    nfg =  45    J = 6.57343e-02    |proj g| = 1.33649e-02
    At iterate    45    nfg =  46    J = 6.45393e-02    |proj g| = 1.56155e-02
    At iterate    46    nfg =  47    J = 6.33092e-02    |proj g| = 1.72698e-02
    EARLY STOPPING: NO IMPROVEMENT for 5 CONSECUTIVE ITERATIONS
    Revert to iteration 41 with J = 5.12064e-02 due to early stopping
The training process was terminated after 46 iterations because the loss did not drop below the minimum reached at iteration 41 for 5 consecutive iterations. The optimal parameters are therefore those recorded at iteration 41.
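The early-stopping bookkeeping described above can be sketched in plain Python. This is illustrative only (smash implements it internally): track the best loss seen so far, stop once it has not improved for `patience` consecutive iterates, and revert to the best iterate.

```python
def early_stopping_index(losses, patience=5):
    """Return the index of the best loss once training stalls for `patience`
    consecutive iterates, or None if no early stop is triggered.

    Illustrative sketch of the behavior described above, not smash's code.
    """
    best_i = 0
    for i, j in enumerate(losses):
        if j < losses[best_i]:
            best_i = i  # new minimum: reset the stall counter implicitly
        elif i - best_i >= patience:
            return best_i  # revert to the best iterate seen so far
    return None

# Loss improves until index 3 (J = 0.1), then fails to improve for
# 5 consecutive iterates, so training reverts to iterate 3.
losses = [1.0, 0.5, 0.2, 0.1, 0.3, 0.4, 0.35, 0.2, 0.15]
print(early_stopping_index(losses, patience=5))  # 3
```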