  1. How can I implement incremental training for xgboost?

    Jun 28, 2016 · If I train with two iterations I get an AUC of 0.66 and 0.68 for the successive iterations. Then when training the next minibatch with the exact same data I get the exact …
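
    A minimal sketch of continued training with the native API, on made-up minibatches: passing the previous booster through xgb_model makes xgb.train add new trees on top of the existing ones instead of starting from scratch.

    ```python
    import numpy as np
    import xgboost as xgb

    rng = np.random.default_rng(0)
    X1, y1 = rng.random((500, 10)), rng.integers(0, 2, 500)
    X2, y2 = rng.random((500, 10)), rng.integers(0, 2, 500)

    params = {"objective": "binary:logistic", "eval_metric": "auc"}

    # First minibatch: train a fresh booster.
    booster = xgb.train(params, xgb.DMatrix(X1, label=y1), num_boost_round=10)

    # Next minibatch: xgb_model continues training from the existing booster,
    # appending new trees rather than refitting from zero.
    booster = xgb.train(params, xgb.DMatrix(X2, label=y2),
                        num_boost_round=10, xgb_model=booster)
    ```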

  2. python - XGBoost CV and best iteration - Stack Overflow

    Nov 9, 2016 · I am using XGBoost cv to find the optimal number of rounds for my model. I would be very grateful if someone could confirm (or refute) that the optimal number of rounds is: estop = …
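
    A rough sketch of the pattern in question, on synthetic data: when xgb.cv runs with early_stopping_rounds, the history it returns is truncated at the best iteration, so its length gives the round count.

    ```python
    import numpy as np
    import xgboost as xgb

    X = np.random.rand(600, 10)
    y = np.random.randint(0, 2, 600)
    dtrain = xgb.DMatrix(X, label=y)

    cv_results = xgb.cv(
        {"objective": "binary:logistic", "eval_metric": "auc"},
        dtrain,
        num_boost_round=1000,
        nfold=5,
        early_stopping_rounds=50,
    )
    # When early stopping fires, the returned frame stops at the best
    # iteration, so its length is the optimal number of boosting rounds.
    best_rounds = len(cv_results)
    print(best_rounds)
    ```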

  3. The loss function and evaluation metric of XGBoost

    Nov 29, 2018 · I am confused now about the loss functions used in XGBoost. Here is how I feel confused: we have objective, which is the loss function that needs to …
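
    A small illustration of the distinction, on made-up data: objective is the training loss the gradients are computed from, while eval_metric only scores the evaluation sets.

    ```python
    import numpy as np
    import xgboost as xgb

    X = np.random.rand(600, 8)
    y = np.random.randint(0, 2, 600)
    dtrain = xgb.DMatrix(X[:500], label=y[:500])
    dvalid = xgb.DMatrix(X[500:], label=y[500:])

    params = {
        # objective: the training loss the trees are fitted against
        "objective": "binary:logistic",
        # eval_metric: only scores the watchlist (and drives early stopping);
        # it does not change the gradients
        "eval_metric": ["logloss", "auc"],
    }
    booster = xgb.train(params, dtrain, num_boost_round=50,
                        evals=[(dvalid, "valid")])
    ```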

  4. XGBOOST: sample_Weights vs scale_pos_weight - Stack Overflow

    Jan 3, 2018 · @milad-shahidi's answer covers what should happen, but empirically I've found XGBoost doesn't always conform to theory: I'd advise treating the two parameters as …
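
    A hedged sketch contrasting the two parameters on synthetic imbalanced labels; the negative-to-positive ratio used for the weights is just one common choice, not the only sensible one.

    ```python
    import numpy as np
    from xgboost import XGBClassifier

    rng = np.random.default_rng(0)
    X = rng.random((1000, 5))
    y = rng.binomial(1, 0.1, 1000)          # imbalanced labels

    # Option A: one global ratio applied to every positive example.
    ratio = (y == 0).sum() / (y == 1).sum()
    clf_a = XGBClassifier(scale_pos_weight=ratio).fit(X, y)

    # Option B: an explicit per-row weight vector, which can encode the same
    # ratio or anything more fine-grained.
    weights = np.where(y == 1, ratio, 1.0)
    clf_b = XGBClassifier().fit(X, y, sample_weight=weights)
    ```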

  5. How to get feature importance in xgboost? - Stack Overflow

    Jun 4, 2016 · The scikit-learn-like API of XGBoost returns gain importance, while get_fscore returns the weight type. Permutation-based importance: perm_importance = …
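
    A short sketch, on random data, of the three importance flavours mentioned in the snippet: the wrapper's gain-based feature_importances_, the booster's split-count scores, and scikit-learn's permutation importance.

    ```python
    import numpy as np
    from sklearn.inspection import permutation_importance
    from xgboost import XGBClassifier

    X = np.random.rand(500, 8)
    y = np.random.randint(0, 2, 500)
    model = XGBClassifier().fit(X, y)

    # Gain importance from the scikit-learn-style wrapper.
    print(model.feature_importances_)

    # Split-count ("weight") importance from the underlying booster.
    print(model.get_booster().get_score(importance_type="weight"))

    # Model-agnostic permutation importance.
    perm = permutation_importance(model, X, y, n_repeats=10, random_state=0)
    print(perm.importances_mean)
    ```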

  6. multioutput regression by xgboost - Stack Overflow

    Apr 28, 2023 · The 2.0.0 xgboost release supports multi-target trees with vector-leaf outputs, meaning xgboost can now build multi-output trees where the size of a leaf equals the number of …
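
    A minimal sketch of the vector-leaf mode, assuming xgboost >= 2.0; the data and shapes are invented for illustration.

    ```python
    import numpy as np
    from xgboost import XGBRegressor

    X = np.random.rand(200, 10)
    Y = np.random.rand(200, 3)               # three targets per row

    # multi_strategy="multi_output_tree" grows trees whose leaves hold a
    # vector of size 3; requires xgboost >= 2.0 and tree_method="hist".
    model = XGBRegressor(tree_method="hist",
                         multi_strategy="multi_output_tree")
    model.fit(X, Y)
    print(model.predict(X).shape)             # (200, 3)
    ```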

  7. XGBoost Categorical Variables: Dummification vs encoding

    Dec 14, 2015 · As of version 1.3.0, XGBoost has experimental support for categorical features. From the docs: 1.8.7 Categorical Data. Other than users performing encoding, …
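
    A small sketch of the native categorical path, assuming a version with enable_categorical support and a pandas category column; the toy frame is made up for illustration.

    ```python
    import pandas as pd
    from xgboost import XGBClassifier

    df = pd.DataFrame({
        "color": pd.Categorical(["red", "green", "blue", "green",
                                 "red", "blue", "green", "red"]),
        "size": [1.0, 2.5, 3.1, 0.7, 1.8, 2.2, 0.9, 3.0],
    })
    y = [0, 1, 1, 0, 1, 0, 1, 0]

    # enable_categorical lets XGBoost split natively on the category codes
    # instead of requiring one-hot or label encoding up front.
    model = XGBClassifier(tree_method="hist", enable_categorical=True)
    model.fit(df, y)
    ```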

  8. How to get the list of features which are actually used by XGBoost

    Apr 17, 2023 · The correct approach would be to traverse the XGBoost tree data structure and collect the node split indices (which correspond to column indices in your training dataset). If your …
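
    A rough sketch of that idea on a synthetic booster: features that never appear in a split are absent from get_score, and the same set can be read off the flattened tree table.

    ```python
    import numpy as np
    import xgboost as xgb

    X = np.random.rand(300, 20)
    y = np.random.randint(0, 2, 300)
    booster = xgb.train({"objective": "binary:logistic"},
                        xgb.DMatrix(X, label=y), num_boost_round=20)

    # Only features used in at least one split show up in get_score.
    used = set(booster.get_score(importance_type="weight").keys())

    # Same information via trees_to_dataframe: internal nodes carry the
    # split feature name, leaf nodes are labelled "Leaf".
    tree_df = booster.trees_to_dataframe()
    used_alt = set(tree_df.loc[tree_df["Feature"] != "Leaf", "Feature"])
    print(sorted(used), sorted(used_alt))
    ```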

  9. python - Feature importance 'gain' in XGBoost - Stack Overflow

    I wonder if xgboost also uses this approach, using information gain or accuracy as stated in the citation above. I've tried to dig into the code of xgboost and found this method (already cut …
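
    For reference, a small sketch (on synthetic data) of how the different importance types are requested from a trained booster; "gain" is the average loss reduction contributed by the splits that use the feature.

    ```python
    import numpy as np
    import xgboost as xgb

    X = np.random.rand(300, 10)
    y = np.random.randint(0, 2, 300)
    booster = xgb.train({"objective": "binary:logistic"},
                        xgb.DMatrix(X, label=y), num_boost_round=20)

    # "weight": number of times a feature is used in a split
    # "gain":   average loss reduction from splits on the feature
    # "cover":  average number of samples affected by those splits
    for imp_type in ("weight", "gain", "cover"):
        print(imp_type, booster.get_score(importance_type=imp_type))
    ```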

  10. scikit learn - XGBoost: Early stopping on default metric, not ...

    Feb 17, 2020 · I am using XGBoost 0.90. I wish to train an XGBoost regression model, with Python, using a built-in learning objective with early stopping on a built-in evaluation metric. …
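
    A hedged sketch of the 0.90-era fit() signature the question refers to, on synthetic data; newer releases move early_stopping_rounds to the estimator constructor. With eval_metric left unset, early stopping monitors the objective's default metric (rmse for reg:squarederror).

    ```python
    import numpy as np
    from xgboost import XGBRegressor

    X = np.random.rand(500, 6)
    y = np.random.rand(500)
    X_tr, X_va, y_tr, y_va = X[:400], X[400:], y[:400], y[400:]

    # No eval_metric given, so early stopping uses the objective's default
    # metric (rmse for reg:squarederror).
    model = XGBRegressor(objective="reg:squarederror", n_estimators=1000)
    model.fit(X_tr, y_tr, eval_set=[(X_va, y_va)], early_stopping_rounds=20)
    print(model.best_iteration)
    ```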
