What is XGBoost and what specific technical improvements does it make over standard gradient boosting?

Data Science with Python — Hard

Key points

  • Second-order objective: XGBoost expands the loss to second order (using both gradients and Hessians) and adds explicit L1/L2 regularization on leaf weights, which yields closed-form optimal leaf values and split gains
  • Speed: column (feature) subsampling, cache-aware computation over pre-sorted column blocks, and approximate split finding with a weighted quantile sketch make training fast and parallelizable
  • Sparsity-aware split finding: each split learns a default direction for missing or zero entries, so sparse data is handled natively rather than requiring imputation
  • Together, these features make XGBoost both more strongly regularized (less prone to overfitting) and substantially faster than standard gradient boosting
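The second-order point above can be made concrete. Below is a minimal sketch (not the library's actual code) of the closed-form leaf weight and split gain that the second-order Taylor expansion implies; `lam` and `gamma` stand for the λ (L2) and γ (complexity) penalties from the XGBoost paper, and the example gradients/Hessians are made up for illustration.

```python
def leaf_weight(grads, hess, lam=1.0):
    """Closed-form optimal leaf weight: w* = -G / (H + lambda),
    where G and H are the summed gradients and Hessians in the leaf."""
    G, H = sum(grads), sum(hess)
    return -G / (H + lam)

def split_gain(g_left, h_left, g_right, h_right, lam=1.0, gamma=0.0):
    """Reduction in the regularized objective from making this split:
    0.5 * (G_L^2/(H_L+lam) + G_R^2/(H_R+lam) - G^2/(H+lam)) - gamma."""
    def score(g, h):
        G, H = sum(g), sum(h)
        return G * G / (H + lam)
    return 0.5 * (score(g_left, h_left)
                  + score(g_right, h_right)
                  - score(g_left + g_right, h_left + h_right)) - gamma

# Toy logistic-loss example: g_i = p_i - y_i, h_i = p_i * (1 - p_i)
g = [0.5, -0.5, 0.3]
h = [0.25, 0.25, 0.21]
print(leaf_weight(g, h))
print(split_gain(g[:1], h[:1], g[1:], h[1:]))
```

Because the objective is quadratic in the leaf weight, the best weight and the gain of any candidate split are exact formulas in `G`, `H`, and λ; this is what lets XGBoost score splits cheaply instead of fitting trees to residuals as in standard gradient boosting.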
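For reference, the improvements listed above map onto real hyperparameters in the xgboost package's native API. The sketch below only builds the parameter dictionary (xgboost itself is not imported); the specific values shown are illustrative defaults, not recommendations.

```python
# Parameter dict as it would be passed to xgboost.train(params, dtrain, ...).
params = {
    "objective": "binary:logistic",
    "lambda": 1.0,            # L2 regularization on leaf weights
    "alpha": 0.0,             # L1 regularization on leaf weights
    "gamma": 0.0,             # minimum loss reduction required to make a split
    "colsample_bytree": 0.8,  # column subsampling: fraction of features per tree
    "tree_method": "hist",    # histogram-based approximate split finding
    "max_depth": 6,
    "eta": 0.3,               # learning rate (shrinkage)
}
print(sorted(params))
```

Note that sparsity handling needs no flag at all: missing values in the training matrix automatically trigger the learned default split direction.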
