CatBoost - Training a Regression Model on GPU

Python

Training a CatBoost regression model on the GPU, with a validation set used for early stopping.

    import catboost as cb

    # Initialise the regressor with an RMSE loss function and train on the GPU.
    # min_data_in_leaf is only supported with the Depthwise or Lossguide grow
    # policies, so grow_policy is set explicitly here.
    model = cb.CatBoostRegressor(iterations=10000,
                                 learning_rate=0.05,
                                 depth=10,
                                 grow_policy='Depthwise',
                                 min_data_in_leaf=5,
                                 border_count=64,
                                 l2_leaf_reg=6,
                                 loss_function='RMSE',
                                 eval_metric='RMSE',
                                 task_type='GPU')

    # Create a CatBoost Pool for the evaluation set.
    # cat_indicies is a list of the column indices of the categorical
    # features in the X dataframes.
    eval_dataset = cb.Pool(X_val, y_val, cat_features=cat_indicies)

    # Fit the model, evaluating on eval_dataset and stopping early if the
    # validation RMSE does not improve for 50 rounds.
    model.fit(X_train,
              y_train,
              cat_features=cat_indicies,
              eval_set=eval_dataset,
              early_stopping_rounds=50,
              silent=False)
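
The snippet assumes cat_indicies has already been defined. A minimal sketch of one way to build it, assuming X_train is a pandas DataFrame whose categorical columns are stored with the object or category dtype:

    # Hypothetical construction of cat_indicies: positions of the
    # categorical columns in X_train (assumes object/category dtypes).
    cat_cols = X_train.select_dtypes(include=["object", "category"]).columns
    cat_indicies = [X_train.columns.get_loc(col) for col in cat_cols]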
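
After fitting, the model can be checked against the validation set. A short sketch, assuming numpy is available and y_val is numeric:

    import numpy as np

    # Predict on the validation set, compute RMSE by hand, and report the
    # best iteration found by early stopping.
    val_preds = model.predict(X_val)
    val_rmse = np.sqrt(np.mean((np.asarray(y_val) - val_preds) ** 2))
    print(f"Best iteration: {model.get_best_iteration()}, validation RMSE: {val_rmse:.4f}")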