smash.default_bayesian_optimize_options

smash.default_bayesian_optimize_options(model, mapping='uniform', optimizer=None)
Default bayesian optimization options of the Model.
- Parameters:
    - model : Model
        Primary data structure of the hydrological model smash.
    - mapping : str, default 'uniform'
        Type of mapping. Should be one of 'uniform', 'distributed', 'multi-linear' or 'multi-polynomial'.
        Hint: See the Mapping section.
    - optimizer : str or None, default None
        Name of optimizer. Should be one of
        - 'sbs' ('uniform' mapping only)
        - 'lbfgsb' ('uniform', 'distributed', 'multi-linear' or 'multi-polynomial' mapping only)
        Note: If not given, a default optimizer will be set depending on the optimization mapping:
        - mapping = 'uniform'; optimizer = 'sbs'
        - mapping = 'distributed', 'multi-linear', or 'multi-polynomial'; optimizer = 'lbfgsb'
        Hint: See the Optimization Algorithm section.
- Returns:
    - optimize_options : dict[str, Any]
        Dictionary containing optimization options for fine-tuning the optimization process. The specific keys returned depend on the chosen mapping and optimizer. This dictionary can be passed directly to the optimize_options argument of bayesian_optimize (or Model.bayesian_optimize).
Examples

>>> import smash
>>> from smash.factory import load_dataset
>>> setup, mesh = load_dataset("cance")
>>> model = smash.Model(setup, mesh)
Get the default bayesian optimization options for 'uniform' mapping

>>> opt_u = smash.default_bayesian_optimize_options(model, mapping="uniform")
>>> opt_u
{
    'parameters': ['cp', 'ct', 'kexc', 'llr', 'sg0', 'sg1'],
    'bounds': {
        'cp': (1e-06, 1000.0),
        'ct': (1e-06, 1000.0),
        'kexc': (-50, 50),
        'llr': (1e-06, 1000.0),
        'sg0': (1e-06, 1000.0),
        'sg1': (1e-06, 10.0)
    },
    'control_tfm': 'sbs',
    'termination_crit': {'maxiter': 50},
}
Directly pass this dictionary to the optimize_options argument of bayesian_optimize (or Model.bayesian_optimize). It is equivalent to setting optimize_options to None (which is the default value)

>>> model_u = smash.bayesian_optimize(model, mapping="uniform", optimize_options=opt_u)
</> Bayesian Optimize
    At iterate     0    nfg =   1    J = 77.049133    ddx = 0.64
    At iterate     1    nfg =  68    J =  2.584603    ddx = 0.64
    At iterate     2    nfg = 135    J =  2.324317    ddx = 0.32
    At iterate     3    nfg = 202    J =  2.304130    ddx = 0.08
    At iterate     4    nfg = 269    J =  2.262191    ddx = 0.08
    At iterate     5    nfg = 343    J =  2.260251    ddx = 0.01
    At iterate     6    nfg = 416    J =  2.258220    ddx = 0.00
    CONVERGENCE: DDX < 0.01
Get the default bayesian optimization options for a different mapping

>>> opt_ml = smash.default_bayesian_optimize_options(model, mapping="multi-linear")
>>> opt_ml
{
    'parameters': ['cp', 'ct', 'kexc', 'llr', 'sg0', 'sg1'],
    'bounds': {
        'cp': (1e-06, 1000.0),
        'ct': (1e-06, 1000.0),
        'kexc': (-50, 50),
        'llr': (1e-06, 1000.0),
        'sg0': (1e-06, 1000.0),
        'sg1': (1e-06, 10.0)
    },
    'control_tfm': 'normalize',
    'descriptor': {
        'cp': array(['slope', 'dd'], dtype='<U5'),
        'ct': array(['slope', 'dd'], dtype='<U5'),
        'kexc': array(['slope', 'dd'], dtype='<U5'),
        'llr': array(['slope', 'dd'], dtype='<U5')
    },
    'termination_crit': {'maxiter': 100, 'factr': 1000000.0, 'pgtol': 1e-12},
}
Customize the bayesian optimize options and optimize the Model

>>> opt_ml["bounds"]["cp"] = (1, 2000)
>>> opt_ml["bounds"]["sg0"] = (1e-3, 100)
>>> opt_ml["descriptor"]["cp"] = "slope"
>>> opt_ml["termination_crit"]["maxiter"] = 10
>>> model.bayesian_optimize(mapping="multi-linear", optimize_options=opt_ml)
</> Bayesian Optimize
    At iterate     0    nfg =  1    J = 77.049095    |proj g| = 147.958771
    At iterate     1    nfg =  2    J =  6.694370    |proj g| =   4.301311
    At iterate     2    nfg =  3    J =  6.527157    |proj g| =   3.754591
    At iterate     3    nfg =  4    J =  5.088758    |proj g| =   2.261085
    At iterate     4    nfg =  5    J =  4.736641    |proj g| =   1.637916
    At iterate     5    nfg =  6    J =  4.421250    |proj g| =   1.300914
    At iterate     6    nfg =  7    J =  4.284939    |proj g| =   1.345769
    At iterate     7    nfg =  8    J =  4.196455    |proj g| =   1.179024
    At iterate     8    nfg =  9    J =  4.139528    |proj g| =   1.159659
    At iterate     9    nfg = 10    J =  4.099973    |proj g| =   1.042880
    At iterate    10    nfg = 11    J =  4.027408    |proj g| =   1.142047
    STOP: TOTAL NO. OF ITERATION EXCEEDS LIMIT
The optimization process terminated after 10 iterations, the maximum number of iterations we set.
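Since the returned options are a plain nested dictionary, the customization step above is ordinary dict mutation. Here is a minimal self-contained sketch of that pattern; the dict literal below merely mirrors a subset of the example output and is not produced by smash:

```python
# Illustrative stand-in for the dict returned by
# smash.default_bayesian_optimize_options (subset of keys from the example).
opt = {
    "parameters": ["cp", "ct", "kexc", "llr", "sg0", "sg1"],
    "bounds": {"cp": (1e-06, 1000.0), "sg0": (1e-06, 1000.0)},
    "termination_crit": {"maxiter": 100},
}

# Mutate the nested entries before passing the dict to the optimize_options
# argument: widen the 'cp' bounds, tighten 'sg0', and cap the iteration count.
opt["bounds"]["cp"] = (1, 2000)
opt["bounds"]["sg0"] = (1e-3, 100)
opt["termination_crit"]["maxiter"] = 10

print(opt["termination_crit"])  # {'maxiter': 10}
```

Only the keys you change are affected; the rest of the defaults are kept as returned.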