smash.default_bayesian_optimize_options
- smash.default_bayesian_optimize_options(model, mapping='uniform', optimizer=None)
Default bayesian optimization options of Model.
- Parameters:
- model : Model
Primary data structure of the hydrological model smash.
- mapping : str, default 'uniform'
Type of mapping. Should be one of
- 'uniform'
- 'distributed'
- 'multi-linear'
- 'multi-power'
Hint
See the Mapping section.
- optimizer : str or None, default None
Name of optimizer. Should be one of
- 'sbs' (only for 'uniform' mapping)
- 'nelder-mead' (only for 'uniform' mapping)
- 'powell' (only for 'uniform' mapping)
- 'lbfgsb' (for all mappings)
- 'adam' (for all mappings)
- 'adagrad' (for all mappings)
- 'rmsprop' (for all mappings)
- 'sgd' (for all mappings)
Note
If not given, a default optimizer will be set as follows:
- 'sbs' for mapping = 'uniform'
- 'lbfgsb' for mapping = 'distributed', 'multi-linear', 'multi-power'
Hint
See the Optimization Algorithms section.
- Returns:
- optimize_options : dict[str, Any]
Dictionary containing optimization options for fine-tuning the optimization process. The specific keys returned depend on the chosen mapping and optimizer. This dictionary can be directly passed to the optimize_options argument of the
bayesian_optimize (or Model.bayesian_optimize) method.
Examples
>>> import smash
>>> from smash.factory import load_dataset
>>> setup, mesh = load_dataset("cance")
>>> model = smash.Model(setup, mesh)
Get the default bayesian optimization options for 'uniform' mapping

>>> opt_u = smash.default_bayesian_optimize_options(model, mapping="uniform")
>>> opt_u
{
    'parameters': ['cp', 'ct', 'kexc', 'llr', 'sg0', 'sg1'],
    'bounds': {
        'cp': (1e-06, 1000.0),
        'ct': (1e-06, 1000.0),
        'kexc': (-50, 50),
        'llr': (1e-06, 1000.0),
        'sg0': (1e-06, 1000.0),
        'sg1': (1e-06, 10.0)
    },
    'control_tfm': 'sbs',
    'termination_crit': {'maxiter': 50},
}
Directly pass this dictionary to the optimize_options argument of the bayesian_optimize (or Model.bayesian_optimize) method. This is equivalent to setting optimize_options to None (the default value)

>>> model_u = smash.bayesian_optimize(model, mapping="uniform", optimize_options=opt_u)
</> Bayesian Optimize
    At iterate     0    nfg =    1    J = 7.70491e+01    ddx = 0.64
    At iterate     1    nfg =   68    J = 2.58460e+00    ddx = 0.64
    At iterate     2    nfg =  135    J = 2.32432e+00    ddx = 0.32
    At iterate     3    nfg =  202    J = 2.30413e+00    ddx = 0.08
    At iterate     4    nfg =  269    J = 2.26219e+00    ddx = 0.08
    At iterate     5    nfg =  343    J = 2.26025e+00    ddx = 0.01
    At iterate     6    nfg =  416    J = 2.25822e+00    ddx = 0.01
    CONVERGENCE: DDX < 0.01
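Since optimize_options defaults to None, the same run can be launched without building the dictionary at all. A minimal sketch; per the equivalence stated above, this should reproduce the run shown here, so the log is not repeated

>>> model_u = smash.bayesian_optimize(model, mapping="uniform")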
Get the default bayesian optimization options for a different mapping
>>> opt_ml = smash.default_bayesian_optimize_options(model, mapping="multi-linear")
>>> opt_ml
{
    'parameters': ['cp', 'ct', 'kexc', 'llr', 'sg0', 'sg1'],
    'bounds': {
        'cp': (1e-06, 1000.0),
        'ct': (1e-06, 1000.0),
        'kexc': (-50, 50),
        'llr': (1e-06, 1000.0),
        'sg0': (1e-06, 1000.0),
        'sg1': (1e-06, 10.0)
    },
    'control_tfm': 'normalize',
    'descriptor': {
        'cp': array(['slope', 'dd'], dtype='<U5'),
        'ct': array(['slope', 'dd'], dtype='<U5'),
        'kexc': array(['slope', 'dd'], dtype='<U5'),
        'llr': array(['slope', 'dd'], dtype='<U5')
    },
    'termination_crit': {'maxiter': 100, 'factr': 1000000.0, 'pgtol': 1e-12},
}
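The optimizer can also be chosen explicitly instead of relying on the mapping default ('lbfgsb' here), for instance 'adam', which is available for all mappings. A minimal sketch; the returned keys depend on the chosen optimizer, so the output is not shown

>>> opt_ml_adam = smash.default_bayesian_optimize_options(model, mapping="multi-linear", optimizer="adam")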
Customize the bayesian optimization options and optimize the Model
>>> opt_ml["bounds"]["cp"] = (1, 2000) >>> opt_ml["bounds"]["sg0"] = (1e-3, 100) >>> opt_ml["descriptor"]["cp"] = "slope" >>> opt_ml["termination_crit"]["maxiter"] = 10 >>> model.bayesian_optimize(mapping="multi-linear", optimize_options=opt_ml) </> Bayesian Optimize At iterate 0 nfg = 1 J = 7.70491e+01 |proj g| = 1.05147e+04 At iterate 1 nfg = 2 J = 6.69437e+00 |proj g| = 2.15263e+02 At iterate 2 nfg = 3 J = 6.52716e+00 |proj g| = 2.03207e+02 At iterate 3 nfg = 4 J = 5.08876e+00 |proj g| = 6.83760e+01 At iterate 4 nfg = 5 J = 4.73664e+00 |proj g| = 4.19148e+01 At iterate 5 nfg = 6 J = 4.42125e+00 |proj g| = 1.94103e+01 At iterate 6 nfg = 7 J = 4.28494e+00 |proj g| = 9.39774e+00 At iterate 7 nfg = 8 J = 4.19646e+00 |proj g| = 4.74194e+00 At iterate 8 nfg = 9 J = 4.13953e+00 |proj g| = 1.74698e+00 At iterate 9 nfg = 10 J = 4.09997e+00 |proj g| = 1.04288e+00 At iterate 10 nfg = 11 J = 4.02741e+00 |proj g| = 4.41394e+00 STOP: TOTAL NO. of ITERATIONS REACHED LIMIT
The optimization process was terminated after 10 iterations, the maximum number of iterations we defined.
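As noted in the Returns section, the same customized dictionary can also be passed to the module-level bayesian_optimize function. A minimal sketch, with model_ml receiving the optimized Model as in the 'uniform' example above; the optimization log is omitted

>>> model_ml = smash.bayesian_optimize(model, mapping="multi-linear", optimize_options=opt_ml)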