CatBoost is an open-source gradient boosting library with support for categorical features

Get started

News

New ways to explore your data

April 20, 2018

A superb new tool for exploring feature importance, a new algorithm for finding the most influential training samples, the ability to save your model as C++ or Python code, and more. Check out the CatBoost v0.8 details inside!

Read the full version
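For reference, here is a minimal sketch of how these capabilities look from the Python package (the toy dataset, parameter values, and file names are purely illustrative):

```python
from catboost import CatBoostClassifier, Pool

# Illustrative toy data: two numeric features and one categorical feature.
train_data = [[1, 4, "a"], [2, 5, "b"], [3, 6, "a"], [4, 7, "b"]]
train_labels = [0, 0, 1, 1]
train_pool = Pool(train_data, train_labels, cat_features=[2])

model = CatBoostClassifier(iterations=50, verbose=False)
model.fit(train_pool)

# Explore which features the model relies on most.
print(model.get_feature_importance(train_pool))

# Find the training samples that most influenced predictions on a pool.
indices, scores = model.get_object_importance(train_pool, train_pool)

# Export the trained model as standalone Python or C++ code.
model.save_model("model.py", format="python")
model.save_model("model.cpp", format="cpp")
```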

CatBoost on GPU talk at GTC 2018

March 26, 2018

Come and listen to our talk about the fastest implementation of gradient boosting on GPU at GTC 2018 Silicon Valley! GTC will take place on March 26-29 and will provide an excellent opportunity to learn more about CatBoost performance on GPU.

Read the full version

Best-in-class inference and a ton of speedups

January 31, 2018

The new version of CatBoost has the industry's fastest inference implementation. It is 35 times faster than open-source alternatives and completely production-ready. Furthermore, the 0.6 release contains a lot of speedups and other improvements. Find out more inside.

Read the full version
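As a rough illustration, this is what training a model and applying it for inference looks like from the Python package (a sketch with made-up data; the speedup figures above come from the release benchmarks, not from this snippet):

```python
from catboost import CatBoostRegressor

# Illustrative toy data.
X = [[1, 10], [2, 20], [3, 30], [4, 40]]
y = [1.0, 2.0, 3.0, 4.0]

model = CatBoostRegressor(iterations=100, verbose=False)
model.fit(X, y)

# Save the model in CatBoost's binary format and load it back,
# as a production inference service would.
model.save_model("model.cbm")
serving_model = CatBoostRegressor()
serving_model.load_model("model.cbm")

# Batch inference; thread_count controls CPU parallelism.
print(serving_model.predict([[5, 50], [6, 60]], thread_count=4))
```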

Extremely fast learning on GPU has arrived!

November 2, 2017

CatBoost version 0.3 brings efficient support for distributed training on GPU! One server with 8 GPUs can process as much data as a few hundred CPU servers, and much faster. Even with a single GPU you will get up to a 40x speedup in training. Check out our benchmarks inside and download the new version on GitHub.

Read the full version
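For orientation, this is how GPU training is requested in the current Python package (a sketch; the exact parameter names may have differed in the 0.3 release, and the toy data is illustrative since real GPU benchmarks use much larger datasets):

```python
from catboost import CatBoostClassifier

# Illustrative toy data.
X = [[0.1, 3], [0.2, 1], [0.3, 7], [0.4, 5]]
y = [0, 1, 0, 1]

# task_type="GPU" switches training to the GPU implementation;
# devices selects which GPUs to use (here, GPUs 0 through 7 on one server).
model = CatBoostClassifier(
    iterations=1000,
    task_type="GPU",
    devices="0-7",
)
model.fit(X, y, verbose=False)
```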

Version 0.2 released

September 14, 2017

We are proud to release CatBoost version 0.2. Speed, stability, quality, and a ton of other improvements are already published on GitHub. Find the full list of improvements below.

Read the full version

CatBoost at ICML 2017

July 20, 2017

Come and meet us at the 2017 ICML conference in Sydney! The 34th International Conference on Machine Learning will take place on August 6-11 and will provide an excellent opportunity to get a demo of CatBoost in action.

Read the full version