000_xgboost_ python - smart1004/doc GitHub Wiki
References:
- https://www.kaggle.com/omarito/gridsearchcv-xgbregressor-0-556-lb
- https://www.analyticsvidhya.com/blog/2016/03/complete-guide-parameter-tuning-xgboost-with-codes-python/
- https://medium.com/@gautam.karmakar/xgboost-model-to-win-kaggle-e12b35cd1aad

from datacleaner import autoclean  # data prep used in the omarito kernel
{'colsample_bytree': [0.6],
 'colsample_bylevel': [0.7, 0.8, 0.9],
 'subsample': [0.7],
 'objective': ['reg:linear'],
 'min_child_weight': [4],
 'gamma': [0],
 'max_depth': [3],
 'learning_rate': [0.01, 0.1],
 'n_estimators': [90, 100],
 'random_state': [2019],
 'n_jobs': [-1]}

{'colsample_bytree': [0.6],
 'colsample_bylevel': [0.9, 1],
 'subsample': [0.7],
 'objective': ['reg:linear'],
 'min_child_weight': [4],
 'gamma': [0],
 'max_depth': [3],
 'learning_rate': [0.1],
 'n_estimators': [100, 120, 150],
 'random_state': [2019],
 'n_jobs': [-1]}
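Either dict above can be passed straight to GridSearchCV. A minimal runnable sketch on synthetic data; the reduced grid below keeps only the keys that sklearn's GradientBoostingRegressor also accepts, so the example still runs if xgboost is not installed (the fallback import is an assumption for portability, not part of the original kernels):

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV

try:
    from xgboost import XGBRegressor
    est = XGBRegressor(objective='reg:squarederror', random_state=2019)
except ImportError:  # fallback so the sketch runs without xgboost
    from sklearn.ensemble import GradientBoostingRegressor as XGBRegressor
    est = XGBRegressor(random_state=2019)

# Reduced grid: only keys common to both estimators; the fuller grids
# above add xgboost-only knobs (colsample_*, min_child_weight, gamma).
param_grid = {'learning_rate': [0.01, 0.1],
              'n_estimators': [90, 100],
              'max_depth': [3],
              'subsample': [0.7]}

X, y = make_regression(n_samples=200, n_features=10, noise=0.1, random_state=2019)
search = GridSearchCV(est, param_grid, cv=3,
                      scoring='neg_mean_squared_error', n_jobs=-1)
search.fit(X, y)
print(search.best_params_)
```

`best_params_` holds the winning combination; `search.best_estimator_` is already refit on the full data and ready for prediction.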
A parameter grid for XGBRegressor + GridSearchCV:
params = {'min_child_weight': [4, 5],
          'gamma': [i/10.0 for i in range(3, 6)],
          'subsample': [i/10.0 for i in range(6, 11)],
          'colsample_bytree': [i/10.0 for i in range(6, 11)],
          'max_depth': [2, 3, 4]}
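This grid is large: 2 × 3 × 5 × 5 × 3 = 450 combinations, so a 5-fold GridSearchCV means 2,250 fits. sklearn's ParameterGrid confirms the count, and ParameterSampler gives a cheap way to try a random subset first (a sketch; no xgboost needed):

```python
from sklearn.model_selection import ParameterGrid, ParameterSampler

params = {'min_child_weight': [4, 5],
          'gamma': [i / 10.0 for i in range(3, 6)],
          'subsample': [i / 10.0 for i in range(6, 11)],
          'colsample_bytree': [i / 10.0 for i in range(6, 11)],
          'max_depth': [2, 3, 4]}

# total number of candidate settings in the grid
print(len(ParameterGrid(params)))  # → 450

# sample 10 of them for a quick randomized first pass
candidates = list(ParameterSampler(params, n_iter=10, random_state=2019))
print(candidates[0])
```

The same `params` dict also works with RandomizedSearchCV(..., param_distributions=params, n_iter=10) if you want the sampling and the CV fitting in one step.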
from xgboost import XGBRegressor

xgb1 = XGBRegressor()
parameters = {'nthread': [4],  # when using hyperthreading, xgboost may become slower
              'objective': ['reg:linear'],  # renamed 'reg:squarederror' in newer xgboost
              'learning_rate': [.03, .05, .07],  # the so-called eta value
              'max_depth': [5, 6, 7],
              'min_child_weight': [4],
              'silent': [1],  # deprecated in newer xgboost; use verbosity instead
              'subsample': [0.7],
              'colsample_bytree': [0.7],
              'n_estimators': [500]}
https://www.kaggle.com/jayatou/xgbregressor-with-gridsearchcv