XGBRegressor - Training a Regression Model With XGBoost
Python
Training an XGBoost regression model using the scikit-learn API.
from xgboost import XGBRegressor
from sklearn.metrics import (mean_squared_error, mean_absolute_error,
                             max_error, explained_variance_score,
                             mean_absolute_percentage_error)
import matplotlib.pyplot as plt

# Step 1: Initialise and fit an XGBoost regression model
# (X_train, y_train, X_test and y_test are assumed to be defined already)
model = XGBRegressor(objective='reg:squarederror',
                     n_estimators=1000,
                     max_depth=4,
                     learning_rate=0.1,
                     min_child_weight=1,
                     colsample_bytree=0.9,
                     subsample=0.9,
                     n_jobs=-1,
                     random_state=101)
model.fit(X_train, y_train)

# Persist the fitted model to disk
model.save_model('xgb_regressor.model')

# Step 2: Plot feature importances
features = X_train.columns
importance_values = model.feature_importances_

plt.barh(y=range(len(features)),
         width=importance_values,
         tick_label=features)
plt.show()

# Step 3: Make predictions for the test data and evaluate performance
y_pred = model.predict(X_test)
# Taking the square root of the MSE avoids the `squared=False` argument,
# which is deprecated in recent scikit-learn releases
print('RMSE:', mean_squared_error(y_test, y_pred) ** 0.5)
print('MAE:', mean_absolute_error(y_test, y_pred))
print('MAPE:', mean_absolute_percentage_error(y_test, y_pred))
print('Max Error:', max_error(y_test, y_pred))
print('Explained Variance Score:', explained_variance_score(y_test, y_pred))