LGBMRegressor - Training a LightGBM Regression Model

Python

This code snippet covers three steps: initialising and fitting the model, plotting feature importances, and evaluating performance on the test data.

With LightGBM we can expect faster training than most other gradient-boosting implementations, with accuracy comparable to other popular regression algorithms.

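The snippet below assumes that `X_train`, `X_test`, `y_train` and `y_test` already exist. For a self-contained run, one way to create them is from a synthetic dataset (the dataset and split below are illustrative only; substitute your own data):

```python
import pandas as pd
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

# Synthetic regression data: 1000 rows, 8 numeric features (illustrative only)
X, y = make_regression(n_samples=1000, n_features=8, noise=10.0, random_state=101)
X = pd.DataFrame(X, columns=[f'feature_{i}' for i in range(X.shape[1])])

# Hold out 20% of the rows for evaluation
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=101)
```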
import lightgbm as lgb
import matplotlib.pyplot as plt
from sklearn.metrics import (mean_absolute_error,
                             mean_absolute_percentage_error,
                             max_error,
                             explained_variance_score,
                             root_mean_squared_error)

# Assumes X_train/X_test are pandas DataFrames and y_train/y_test are array-like

# Step 1: Initialise and fit the LightGBM regression model
model = lgb.LGBMRegressor(objective='regression',
                          n_estimators=1000,
                          max_depth=4,
                          learning_rate=0.1,
                          min_child_samples=1,
                          colsample_bytree=0.9,
                          subsample=0.9,  # only takes effect if subsample_freq > 0
                          random_state=101)
model.fit(X_train, y_train)

# Save the trained booster to a file
model.booster_.save_model('lgb_regressor.model')

# Step 2: Plot feature importances
features = X_train.columns
importance_values = model.feature_importances_

plt.barh(y=range(len(features)),
         width=importance_values,
         tick_label=features)
plt.xlabel('Importance (number of splits)')
plt.show()

# Step 3: Make predictions for the test data & evaluate performance
y_pred = model.predict(X_test)
print('RMSE:', root_mean_squared_error(y_test, y_pred))  # scikit-learn >= 1.4
print('MAE:', mean_absolute_error(y_test, y_pred))
print('MAPE:', mean_absolute_percentage_error(y_test, y_pred))
print('Max Error:', max_error(y_test, y_pred))
print('Explained Variance Score:', explained_variance_score(y_test, y_pred))