Software Alternatives & Reviews
Table of contents
  1. Videos
  2. Social Mentions
  3. Comments

XGBoost

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable.

XGBoost Reviews and details

Screenshots and images

  • XGBoost Landing page (2023-07-30)

Badges

Promote XGBoost. You can add any of these badges on your website.
SaaSHub badge

Videos

XGBoost Part 3: Mathematical Details

XGBoost: A Scalable Tree Boosting System (June 02, 2016)

Free Udemy Course - CatBoost vs XGBoost - Classification and Regression Modeling with Python

Social recommendations and mentions

We have tracked the following product recommendations or mentions on various public social media platforms and blogs. They can help you see what people think about XGBoost and what they use it for.
  • CS Internship Questions
    By the way, most of the time XGBoost works just as well for projects, would not recommend applying deep learning to every single problem you come across, it's something Stanford CS really likes to showcase when it's well known (1) that sometimes "smaller"/less complex models can perform just as well or have their own interpretive advantages and (2) it is well known within ML and DS communities that deep learning... Source: almost 2 years ago

Do you know an article comparing XGBoost to other products?
Suggest a link to a post with product alternatives.


Generic XGBoost discussion


This is an informative page about XGBoost. You can review and discuss the product here. The primary details have not been verified within the last quarter and may be outdated. If you think we are missing something, please use this page to comment or suggest changes. All reviews and comments are highly encouraged and appreciated, as they help everyone in the community make an informed choice. Please be kind and objective when evaluating a product and sharing your opinion.