How to Train a CatBoost Classifier with Grid Search Hyperparameter Tuning

Python

CatBoostClassifier Python example with hyperparameter tuning. In this code snippet we train a classification model using CatBoost. We initialize the model, then use grid search to find the best parameter values from the candidate lists defined in the grid dictionary. Finally, the model is fit with those parameters assigned.
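The snippet assumes that X_train, y_train, X_val, y_val and the categorical feature indices already exist. As a rough illustration only, assuming the data lives in a pandas DataFrame loaded from a hypothetical data.csv with a target column, they might be prepared like this:

import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical input file and target column name -- substitute your own data
df = pd.read_csv('data.csv')
X = df.drop(columns=['target'])
y = df['target']

# Hold out a validation set for early stopping and evaluation
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

# Positional indices of the categorical columns, as expected by cb.Pool
categorical_indices = [X.columns.get_loc(col)
                       for col in X.select_dtypes(include=['object', 'category']).columns]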

import catboost as cb

# Create CatBoost Pool datasets, flagging which columns are categorical
train_dataset = cb.Pool(X_train, y_train, cat_features=categorical_indices)
eval_dataset = cb.Pool(X_val, y_val, cat_features=categorical_indices)

model = cb.CatBoostClassifier(iterations=1000,
                              loss_function='Logloss',
                              eval_metric='Accuracy')

# Declare the parameters to tune and the values to try
grid = {'learning_rate': [0.03, 0.1],
        'depth': [4, 6, 10],
        'l2_leaf_reg': [1, 3, 5]}

# Search the grid for the best parameter combination
model.grid_search(grid, train_dataset, plot=True)

# Fit the model, stopping early if the eval metric hasn't improved within 50 iterations
model.fit(train_dataset,
          eval_set=eval_dataset,
          early_stopping_rounds=50,
          plot=True,
          silent=False)
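Once the model is fit, it can help to confirm which parameters the search settled on and how the model does on the held-out data. A minimal sketch using CatBoost's built-in inspection methods (grid_search also returns a dict with the best 'params' and the 'cv_results' if you capture its return value):

# Parameters currently set on the model (grid_search assigns the best ones it found)
print(model.get_params())

# Best eval-metric values and the iteration they occurred at during fit
print(model.get_best_score())
print(model.get_best_iteration())

# Class predictions and probabilities for the validation features
val_preds = model.predict(X_val)
val_probs = model.predict_proba(X_val)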