
CatBoost - Training a Regression Model on GPU

Python
Supervised Learning

Training a regression model using CatBoost on a GPU.

import catboost as cb

# Initialise the regressor with an RMSE loss function,
# configured to train on GPU
model = cb.CatBoostRegressor(iterations=10000,
                             learning_rate=0.05,
                             depth=10,
                             min_data_in_leaf=5,
                             border_count=64,
                             l2_leaf_reg=6,
                             loss_function='RMSE',
                             eval_metric='RMSE',
                             task_type='GPU')

# Create a CatBoost Pool for the evaluation set.
# cat_indices is a list of the positions of the categorical
# columns in the X dataframes.
eval_dataset = cb.Pool(X_val, y_val, cat_features=cat_indices)

# Fit the model, stopping early if the eval metric has not
# improved for 50 rounds
model.fit(X_train,
          y_train,
          cat_features=cat_indices,
          eval_set=eval_dataset,
          early_stopping_rounds=50,
          silent=False)
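The list of categorical feature indices can be built from the column names. A minimal sketch (the column names here are hypothetical, standing in for whatever the X dataframes contain):

```python
# Hypothetical feature columns of the X dataframes
columns = ["age", "income", "city", "segment"]

# Columns assumed to hold categorical values
categorical_cols = ["city", "segment"]

# Positional indices that CatBoost expects in cat_features
cat_indices = [columns.index(c) for c in categorical_cols]
print(cat_indices)  # [2, 3]
```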

By detro - Last Updated Feb. 17, 2022, 5:08 p.m.
