How to evaluate a regression model's performance

In this article, you will learn how to evaluate and understand the performance of your regression models to make informed decisions on your next steps.

Once a model has been successfully trained, it is evaluated on a held-out test portion of the data. If the model performs well on the test portion, this suggests that it generalizes well to unseen data and is not overfitted.
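As an illustrative sketch of this idea (using synthetic data and a simple linear fit, not the Engine's internal training process), hold-out evaluation looks like this: the model is fitted on the training portion only, then scored on the unseen test portion.

```python
# Illustrative sketch: fit on a training portion, evaluate on a held-out
# test portion. Synthetic data; the Engine performs this split for you.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=200)
y = 3.0 * X + 5.0 + rng.normal(0, 1.0, size=200)  # linear relation + noise

# Hold out the last 25% of rows as the test portion
split = int(len(X) * 0.75)
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

# Fit a straight line on the training portion only
slope, intercept = np.polyfit(X_train, y_train, deg=1)

# Score on the unseen test portion: a small error here suggests
# the model generalizes rather than memorizes
pred = slope * X_test + intercept
test_mae = np.mean(np.abs(pred - y_test))
print(f"Test MAE: {test_mae:.3f}")
```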

1. Navigate to the model details page

From the model leaderboard, click on a model's name to go to that model's details page.

model leaderboard on the AI & Analytics Engine


2. Read and understand the performance summary

The model’s performance on the latest test portion of the training dataset is summarized in this tab. Users can see the following:

  1. Prediction quality, which is based on metrics that our team of data scientists has selected as appropriate for regression problems

  2. Prediction error, which is the average difference between the predicted and actual target values

  3. Percentage error, which is the prediction error as a percentage of the actual target values, averaged across all predictions

Model performance summary
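The two error figures above can be sketched in code. Note this is a rough illustration of the concepts (absolute-error averages, with the percentage figure computed MAPE-style); the Engine's exact formulas may differ.

```python
# Sketch of the two error figures in the performance summary.
# Assumed formulas: mean absolute error and mean absolute percentage error.
import numpy as np

actual = np.array([100.0, 250.0, 80.0, 400.0])
predicted = np.array([110.0, 240.0, 90.0, 380.0])

# Prediction error: average difference between predicted and actual values
prediction_error = np.mean(np.abs(predicted - actual))

# Percentage error: each error as a percentage of the actual value,
# averaged across all predictions
percentage_error = np.mean(np.abs(predicted - actual) / np.abs(actual)) * 100

print(prediction_error)   # 12.5
print(percentage_error)   # 7.875
```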

Tip: To learn which metrics are used to compute prediction quality, see this article.

For a detailed explanation of prediction and percentage errors, see this article.

Depending on whether the performance is satisfactory, users can follow the Engine’s suggestions on what to do next. For a more detailed report on the model’s performance, go to the Performance tab.

detailed report on model performance

3. Read the detailed report on the Performance tab

To view the detailed report in the Performance tab, users can do one of the following:

  • Click on the Performance tab; or

detailed report on the model performance

  • Click on “View detailed report” below the performance summary card.

detailed report on model performance

On this page, users can:

a. Select the test portion to see the model’s performance

Using the dropdown, users can select a test portion. All of the data in the Performance tab will be updated to reflect the model’s performance on the selected test portion.

b. Look at the performance summary

Users can quickly look at the prediction quality, prediction error, and percentage error.

model performance summary

c. Use the prediction error breakdown table

Click on the Prediction error breakdown card to expand the section and view the prediction error breakdown table.

To analyze the prediction and percentage errors of the model, users can:

  1. Add breakpoints to view specific ranges

  2. Edit and delete the added breakpoints

Note: Each breakpoint value must fall within the range of the actual target values in the selected test portion.

Users can also see the number of rows whose actual target values fall within each specified range.

prediction error breakdown table
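A breakdown of this kind can be sketched as follows. The breakpoint values here are illustrative, and the per-range statistic is assumed to be a mean absolute error; the Engine's table may use different formulas.

```python
# Sketch of a prediction error breakdown by ranges of the actual target,
# analogous to adding breakpoints in the Engine's breakdown table.
import numpy as np

actual = np.array([12.0, 45.0, 78.0, 30.0, 95.0, 60.0, 22.0, 88.0])
predicted = np.array([15.0, 40.0, 80.0, 33.0, 90.0, 65.0, 20.0, 85.0])

# Breakpoints must lie within the range of the actual target values
breakpoints = [40.0, 70.0]
edges = [actual.min()] + breakpoints + [actual.max()]

rows = []
for i, (lo, hi) in enumerate(zip(edges[:-1], edges[1:])):
    # The last range is closed on the right so the maximum value is included
    in_range = (actual >= lo) & ((actual <= hi) if i == len(edges) - 2 else (actual < hi))
    n_rows = int(in_range.sum())
    mean_abs_err = float(np.mean(np.abs(predicted[in_range] - actual[in_range])))
    rows.append((lo, hi, n_rows, mean_abs_err))
    print(f"{lo:g} to {hi:g}: {n_rows} rows, mean absolute error {mean_abs_err:.2f}")
```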

d. Explore the detailed view of the model's performance

Click on the detailed view card to expand the section and view the relevant charts and metrics used by data scientists to evaluate model performance.

Users can switch between the tabs to see:

  • Predicted vs. actual target values

  • Residuals vs. predicted values

  • Residual histogram

model performance summary and detailed view
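The quantities behind these three chart tabs can be sketched as follows. Plotting calls are omitted; the arrays below are what each chart would display (residuals are assumed here to be actual minus predicted).

```python
# Sketch of the data behind the three detailed-view chart tabs.
import numpy as np

actual = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
predicted = np.array([12.0, 18.0, 33.0, 39.0, 52.0])

# Predicted vs. actual: scatter of (actual, predicted) pairs;
# a perfect model falls on the diagonal y = x
pairs = list(zip(actual, predicted))

# Residuals vs. predicted: residual = actual - predicted, plotted against
# the predicted value; a good model shows no pattern around zero
residuals = actual - predicted

# Residual histogram: distribution of residuals, ideally centered at zero
counts, bin_edges = np.histogram(residuals, bins=3)
print(residuals)       # [-2.  2. -3.  1. -2.]
print(counts.sum())    # 5
```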