CatBoost is a fast, scalable, high-performance open-source library for gradient boosting on decision trees

Get started

News

CatBoost Enables Fast Gradient Boosting on Decision Trees Using GPUs

December 18, 2018

Gradient boosting benefits from training on huge datasets, and the technique is efficiently accelerated on GPUs. Read the details in this post.

Read the full version

CatBoost papers on NeurIPS 2018

December 17, 2018

In December 2018, at the NeurIPS conference in Montreal, the Yandex team presented two papers related to CatBoost, an open-source machine learning library developed by Yandex.

Read the full version

0.10.x and 0.9.x releases review

November 2, 2018

The CatBoost team continues to make many improvements and speedups. What new and interesting features have we added in our two latest releases, and why is it worth trying CatBoost now? We discuss it in this post.

Read the full version

New ways to explore your data

April 20, 2018

A superb new tool for exploring feature importance, a new algorithm for finding the most influential training samples, the ability to save your model as C++ or Python code, and more. Check the CatBoost v0.8 details inside!

Read the full version

CatBoost on GPU talk at GTC 2018

March 26, 2018

Come and listen to our talk about the fastest GPU implementation of gradient boosting at GTC 2018 Silicon Valley! GTC takes place on March 26-29 and is an excellent opportunity to learn more about CatBoost performance on GPU.

Read the full version

Best in class inference and a ton of speedups

January 31, 2018

The new version of CatBoost has the industry's fastest inference implementation: 35 times faster than open-source alternatives and fully production-ready. Furthermore, the 0.6 release contains many speedups and improvements. Find out more inside.

Read the full version

Contacts