Tuning a scikit-learn estimator

The code below shows how to use HEBO to tune a scikit-learn estimator.

[2]:
# Copyright (C) 2020. Huawei Technologies Co., Ltd. All rights reserved.

# This program is free software; you can redistribute it and/or modify it under
# the terms of the MIT license.

# This program is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
# PARTICULAR PURPOSE. See the MIT License for more details.

from sklearn.datasets import load_boston
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from hebo.sklearn_tuner import sklearn_tuner

space_cfg = [
        {'name' : 'max_depth',        'type' : 'int', 'lb' : 1, 'ub' : 20},
        {'name' : 'min_samples_leaf', 'type' : 'num', 'lb' : 1e-4, 'ub' : 0.5},
        {'name' : 'max_features',     'type' : 'cat', 'categories' : ['auto', 'sqrt', 'log2']},
        {'name' : 'bootstrap',        'type' : 'bool'},
        {'name' : 'min_impurity_decrease', 'type' : 'pow', 'lb' : 1e-4, 'ub' : 1.0},
        ]
# NOTE: load_boston was removed in scikit-learn 1.2; run this with an older
# release or substitute another regression dataset such as load_diabetes.
X, y   = load_boston(return_X_y = True)
result = sklearn_tuner(RandomForestRegressor, space_cfg, X, y, metric = r2_score, max_iter = 16)
Iter 0, best metric: 0.492082
Iter 1, best metric: 0.5526
Iter 2, best metric: 0.5526
Iter 3, best metric: 0.5526
Iter 4, best metric: 0.5526
Iter 5, best metric: 0.715855
Iter 6, best metric: 0.862657
Iter 7, best metric: 0.862657
Iter 8, best metric: 0.862657
Iter 9, best metric: 0.862657
Iter 10, best metric: 0.869785
Iter 11, best metric: 0.882127
Iter 12, best metric: 0.882127
Iter 13, best metric: 0.882127
Iter 14, best metric: 0.882127
Iter 15, best metric: 0.882127
[2]:
result
[2]:
{'max_depth': 15,
 'min_samples_leaf': 0.00011814573477638075,
 'max_features': 'log2',
 'bootstrap': False,
 'min_impurity_decrease': 0.00010743041070558209}
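Once `sklearn_tuner` returns the best configuration, you can unpack it straight into the estimator's constructor to fit a final model. A minimal sketch (the `best_params` dict mirrors the `result` shown above, and `load_diabetes` is substituted for `load_boston`, which has been removed from recent scikit-learn releases):

```python
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Hypothetical tuned configuration, same shape as the `result` dict above
best_params = {'max_depth': 15,
               'min_samples_leaf': 0.00011814573477638075,
               'max_features': 'log2',
               'bootstrap': False,
               'min_impurity_decrease': 0.00010743041070558209}

X, y = load_diabetes(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

# Unpack the tuned hyperparameters into the estimator constructor
model = RandomForestRegressor(**best_params, random_state=42)
model.fit(X_tr, y_tr)
print(r2_score(y_te, model.predict(X_te)))
```

Because `sklearn_tuner` returns plain Python values keyed by parameter name, the same `**best_params` pattern works for any scikit-learn estimator you tune this way.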