An optimized, general-purpose gradient boosting library. The library is parallelized, and it also provides an optimized distributed version.
It implements machine learning algorithms under the gradient boosting framework, including the Generalized Linear Model (GLM) and Gradient Boosted Decision Trees (GBDT). XGBoost can also run distributed and scale to terascale data.
XGBoost is part of the Distributed Machine Learning Common projects.
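For a quick taste of the library, here is a minimal sketch of training a model through the Python package; the synthetic data and parameter values are purely illustrative, not recommendations.

```python
import numpy as np
import xgboost as xgb

# Toy binary-classification data (illustrative only).
X = np.random.rand(100, 10)
y = np.random.randint(2, size=100)

# DMatrix is XGBoost's internal data structure, optimized for
# memory efficiency and training speed.
dtrain = xgb.DMatrix(X, label=y)

# Train a small gradient boosted tree ensemble.
params = {"objective": "binary:logistic", "max_depth": 3, "eta": 0.1}
bst = xgb.train(params, dtrain, num_boost_round=10)

print(bst.predict(dtrain)[:5])  # predicted probabilities
```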
- What's New
- Version
- Documentation
- Build Instructions
- Features
- Distributed XGBoost
- Use Cases
- Bug Reporting
- Contributing to XGBoost
- Committers and Contributors
- License
- XGBoost in GraphLab Create
## What's New
- XGBoost helped Vlad Mironov and Alexander Guschin win the CERN LHCb experiment Flavour of Physics competition. Check out the interview from Kaggle.
- XGBoost helped Mario Filho, Josef Feigl, Lucas, and Gilberto win the Caterpillar Tube Pricing competition. Check out the interview from Kaggle.
- XGBoost helped Halla Yang win the Recruit Coupon Purchase Prediction Challenge. Check out the interview from Kaggle.
- XGBoost helped Owen Zhang win the Avito Context Ad Click competition. Check out the interview from Kaggle.
- XGBoost helped Chenglong Chen win the Kaggle CrowdFlower competition. Check out the winning solution.
- XGBoost 0.4 released; see CHANGES.md.
- XGBoost helped three champion teams win the WWW2015 Microsoft Malware Classification Challenge (BIG 2015). Check out the winning solution.
- External memory version.
## Version
- Current version: xgboost-0.4
  - Change log
  - This version is compatible with 0.3x versions.
## Features
- Easily accessible through the CLI, Python, R, and Julia.
- It's fast! See benchm-ml for benchmark numbers comparing XGBoost, H2O, Spark, and R.
- Memory efficient: handles sparse matrices and supports external memory (see the sketch after this list).
- Accurate predictions, used extensively by data scientists and Kagglers.
- The distributed version runs on Hadoop (YARN), MPI, SGE, etc., and scales to billions of examples.
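As a rough illustration of the memory-efficiency points above, the sketch below feeds a SciPy sparse matrix straight into a DMatrix and opens a libsvm-format file in external-memory mode; the file name `train.libsvm` and the cache prefix `dtrain.cache` are placeholders, not files shipped with the library.

```python
import numpy as np
import scipy.sparse as sp
import xgboost as xgb

# Sparse input: DMatrix accepts SciPy CSR/CSC matrices directly,
# so sparse features never have to be densified.
X_sparse = sp.rand(1000, 50, density=0.05, format="csr")
y = np.random.randint(2, size=1000)
dtrain = xgb.DMatrix(X_sparse, label=y)

# External memory: appending '#' and a cache prefix to a libsvm
# file path tells XGBoost to stream the data from disk rather than
# loading it all into RAM ('train.libsvm' is a placeholder path).
dtrain_ext = xgb.DMatrix("train.libsvm#dtrain.cache")
```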
## Bug Reporting
- To report bugs, please use the xgboost/issues page.
- For general questions, or to share your experience using XGBoost, please use the XGBoost User Group.
## Contributing to XGBoost
XGBoost has been developed and is used by a group of active community members. Everyone is more than welcome to contribute; it is a way to make the project better and more accessible to more users.
- Check out the Feature Wish List to see what can be improved, or open an issue if you want something.
- Contribute to the documentation and examples to share your experience with other users.
- Please add your name to CONTRIBUTORS.md after your patch has been merged.
## License
© Contributors, 2015. Licensed under an Apache-2.0 license.
## XGBoost in GraphLab Create
- XGBoost is adopted as part of the boosted tree toolkit in GraphLab Create (GLC). GraphLab Create is a powerful Python toolkit that lets you do data manipulation, graph processing, hyper-parameter search, and visualization of terabyte-scale data in one framework. Try GraphLab Create.
- Jay Gu wrote a nice blog post about using the GLC boosted tree to solve the Kaggle bike sharing challenge.