Sorry if my terminology is incorrect. I am trying to calculate the average prediction error, expressed as a percentage. For example, I should be able to say that the predicted values are, on average, 20% different from the original values. More precisely, given the following original and predicted values:

Original Predicted
1000     1200 -> 20% error
100      70 -> 30% error

what is the average accuracy? Is it 100% - (20% + 30%)/2 = 75%? What if I have an error that is more than 100%, as in the following case:

Original    Predicted
1000        2500 -> 150% error
100         -120 -> 220% error

What is the average accuracy in this case? Is it 100% - (150% + 220%)/2 = -85%? It does not make sense for accuracy to be a negative value, right? And what about categorical values? How do I find the average accuracy of the prediction there?

1 Answer

This chapter in a free online forecasting textbook should be very helpful to you. It describes the Mean Percentage Error (MPE) and the Mean Absolute Percentage Error (MAPE) that you discuss above. The MAPE in particular is commonly used, although it has its problems: it is bounded below by zero but unbounded above (so your accuracy, defined as 1 - MAPE, can indeed be negative), it is undefined if any actual is zero, and, more seriously, if you optimize forecasts based on MAPE, you will get biased forecasts.
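
For reference, here are the standard definitions, with $A_t$ the actuals and $F_t$ the forecasts over $n$ observations:

$$\text{MPE} = \frac{100\%}{n}\sum_{t=1}^{n}\frac{A_t - F_t}{A_t}, \qquad \text{MAPE} = \frac{100\%}{n}\sum_{t=1}^{n}\left|\frac{A_t - F_t}{A_t}\right|$$

A minimal Python sketch (the helper mape is my own naming) reproduces the numbers from your examples and shows how a 1 - MAPE "accuracy" can go negative:

    def mape(actuals, forecasts):
        """Mean Absolute Percentage Error, as a fraction; undefined if any actual is zero."""
        return sum(abs((a - f) / a) for a, f in zip(actuals, forecasts)) / len(actuals)

    print(1 - mape([1000, 100], [1200, 70]))    # 0.75  -> the 75% from your first example
    print(1 - mape([1000, 100], [2500, -120]))  # -0.85 -> negative "accuracy" in your second example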

For categorical values, you can look at misclassification rates. In the case of a binary classification, you can look at KPIs like sensitivity and specificity.
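
For illustration, here is a minimal sketch (the labels are hypothetical, not from your data) of how the misclassification rate, sensitivity, and specificity fall out of the counts of true/false positives and negatives:

    # Hypothetical ground-truth labels and predictions for a binary problem
    actual    = [1, 1, 0, 0, 1, 0, 1, 0]
    predicted = [1, 0, 0, 1, 1, 0, 1, 0]

    tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))  # true positives
    tn = sum(a == 0 and p == 0 for a, p in zip(actual, predicted))  # true negatives
    fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))  # false positives
    fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))  # false negatives

    misclassification_rate = (fp + fn) / len(actual)  # 0.25
    sensitivity = tp / (tp + fn)                      # 0.75, true positive rate
    specificity = tn / (tn + fp)                      # 0.75, true negative rate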

In any case, you should choose the error measure(s) that most closely reflect your loss function, which in turn depends on what you want to use the forecast for.
