CatBoost is a fast, scalable, high-performance open-source library for gradient boosting on decision trees

Get started

CatBoost on GPU talk at GTC 2018

March 26, 2018

Vasily Ershov, CatBoost lead developer, will speak at GTC 2018 about the fastest gradient boosting implementation on GPU.

He'll give a brief overview of the problems that can be solved with CatBoost, discuss challenges and key optimizations in the most significant computation blocks, and describe how to efficiently build histograms in shared memory when constructing decision trees, and how to avoid atomic operations during this step. He'll also present benchmarks showing that our GPU implementation is five to 40 times faster than the CPU one. Finally, he'll talk about performance comparisons against the GPU implementations of gradient boosting in other open-source libraries.
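The histogram trick mentioned above can be illustrated in miniature: instead of every worker atomically updating one shared histogram, each worker fills a private copy that is merged in a single reduction at the end. Here is a minimal Python sketch of that privatization idea, not CatBoost's actual CUDA kernel; the function name, bin count, and worker count are illustrative assumptions:

```python
from concurrent.futures import ThreadPoolExecutor

def build_histogram_privatized(bin_indices, gradients, n_bins, n_workers=4):
    """Sum gradients per feature bin without atomic updates: each worker
    fills a private histogram (analogous to a per-thread copy in GPU
    shared memory), then the copies are merged in one reduction pass."""
    chunk = (len(bin_indices) + n_workers - 1) // n_workers

    def worker(w):
        # Private histogram: only this worker writes to it -> no contention.
        local = [0.0] * n_bins
        for i in range(w * chunk, min((w + 1) * chunk, len(bin_indices))):
            local[bin_indices[i]] += gradients[i]
        return local

    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(worker, range(n_workers)))
    # Final reduction merges the private copies into one histogram.
    return [sum(col) for col in zip(*partials)]

bins = [i % 4 for i in range(12)]   # bins 0..3, three samples each
grads = [1.0] * 12
print(build_histogram_privatized(bins, grads, n_bins=4))  # [3.0, 3.0, 3.0, 3.0]
```

On a GPU the same trade-off applies at kernel scale: private copies cost extra shared memory but remove the serialization that contended atomic adds would introduce.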

The picture compares GPU learning speed between CatBoost, XGBoost and LightGBM on the Epsilon dataset. Note that the XGBoost GPU implementation doesn't support V100 cards.

The talk is scheduled for March 27, 1:00 PM, Room 231. Don't miss the chance to hear about a best-in-class gradient boosting implementation on GPU and ask questions.

Latest News

CatBoost papers on NeurIPS 2018

In December 2018, at the NeurIPS conference in Montreal, the Yandex team presented two papers related to CatBoost, an open-source machine learning library developed by Yandex.

0.10.x and 0.9.x releases review

The CatBoost team continues to make many improvements and speedups. What new and interesting features have we added in our two latest releases, and why is it worth trying CatBoost now? We'll discuss that in this post.