Tuning sequence for XGBoost:
1. Choose a learning rate and some default parameters, then find the optimal n_estimators (steps 1 and 2 are sketched in code after this list)
2. Tune max_depth, min_child_weight
3. Tune gamma
4. Tune subsample & colsample_bytree
5. Tune reg_alpha
6. Reduce learning rate
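A minimal sketch of steps 1 and 2, assuming a binary classification task; the toy data from make_classification, the starting values (eta=0.1, 5-fold CV, AUC scoring), and the search grids are illustrative placeholders, not prescriptions:

```python
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV

# Toy data; substitute your own X, y.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Step 1: fix a moderate learning rate and default parameters, then
# let xgb.cv with early stopping pick n_estimators.
params = {
    "objective": "binary:logistic",
    "eta": 0.1,              # learning rate (assumed starting value)
    "max_depth": 5,
    "min_child_weight": 1,
    "subsample": 0.8,
    "colsample_bytree": 0.8,
    "eval_metric": "auc",
}
cv_results = xgb.cv(
    params,
    xgb.DMatrix(X, label=y),
    num_boost_round=1000,
    nfold=5,
    early_stopping_rounds=50,
    seed=42,
)
n_estimators = len(cv_results)  # best round found by early stopping
print("n_estimators:", n_estimators)

# Step 2: grid-search max_depth and min_child_weight with
# n_estimators held fixed at the value found above.
grid = GridSearchCV(
    xgb.XGBClassifier(learning_rate=0.1, n_estimators=n_estimators,
                      subsample=0.8, colsample_bytree=0.8, random_state=42),
    param_grid={"max_depth": [3, 5, 7, 9],
                "min_child_weight": [1, 3, 5]},
    scoring="roc_auc",
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_)
```

Steps 3 to 5 repeat the same GridSearchCV pattern for gamma, subsample/colsample_bytree, and reg_alpha; step 6 lowers the learning rate and re-fits n_estimators.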
Tuning sequence for Random Forests (a code sketch follows the list):
1. max_features
2. n_estimators
3. min_samples_leaf
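A minimal sketch of this sequence, assuming one GridSearchCV pass per parameter in the listed order, with each best value carried forward into the next pass; the grids and toy data are assumptions for illustration:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Toy data; substitute your own X, y.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Tune in the order above: max_features, then n_estimators, then
# min_samples_leaf. Each step reuses the best values found so far.
best = {}
for name, values in [
    ("max_features", ["sqrt", "log2", 0.5]),
    ("n_estimators", [100, 300, 500]),
    ("min_samples_leaf", [1, 3, 5, 10]),
]:
    grid = GridSearchCV(
        RandomForestClassifier(random_state=42, **best),
        param_grid={name: values},
        scoring="roc_auc",
        cv=5,
    )
    grid.fit(X, y)
    best[name] = grid.best_params_[name]
    print(name, "->", best[name])

print("final params:", best)
```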
Notes on XGBoost's regularization parameters:
- lambda [default=1, alias: reg_lambda]
    - L2 regularization term on weights (analogous to Ridge regression); increasing this value makes the model more conservative.
    - This handles part of XGBoost's regularization. Though many data scientists don't use it often, it should be explored to reduce overfitting.
- alpha [default=0, alias: reg_alpha]
    - L1 regularization term on weights (analogous to Lasso regression); increasing this value makes the model more conservative.
    - Can be used in case of very high dimensionality so that the algorithm runs faster.
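A sketch of sweeping both penalties with GridSearchCV; the value grids and the fixed learning_rate/n_estimators are placeholder assumptions, and in practice you would hold the earlier tuned parameters fixed here:

```python
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV

# Toy data; substitute your own X, y.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

# Sweep the L1 (reg_alpha) and L2 (reg_lambda) penalties; larger
# values shrink leaf weights and make the model more conservative.
grid = GridSearchCV(
    xgb.XGBClassifier(learning_rate=0.1, n_estimators=200, random_state=42),
    param_grid={
        "reg_alpha": [0, 1e-3, 0.1, 1, 10],
        "reg_lambda": [0.1, 1, 10],
    },
    scoring="roc_auc",
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_)
```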