Sorry if my terminology is incorrect. I am trying to calculate the average prediction error, expressed as a percentage. For example, I should be able to say that the predicted values are on average 20% different from the original values. More precisely, given the following original and predicted values:
Original   Predicted
1000       1200    -> 20% error
100        70      -> 30% error
What is the average accuracy? Is it 100% - (20% + 30%)/2 = 75%?
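To make the computation concrete, here is a small sketch of what I am doing (assuming the error is the absolute difference divided by the original value; the `percentage_error` function is just my own helper):

```python
def percentage_error(original, predicted):
    # absolute difference relative to the original value, in percent
    return abs(predicted - original) / abs(original) * 100

original = [1000, 100]
predicted = [1200, 70]

errors = [percentage_error(o, p) for o, p in zip(original, predicted)]
mean_error = sum(errors) / len(errors)   # (20 + 30) / 2 = 25
accuracy = 100 - mean_error              # 100 - 25 = 75
print(errors, mean_error, accuracy)      # [20.0, 30.0] 25.0 75.0
```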
What if I have an error that is more than 100%? Like in the following case:
Original   Predicted
1000       2500    -> 150% error
100        -120    -> 220% error
What is the average accuracy in this case? Is it 100% - (150% + 220%)/2 = -85%? It does not make sense for accuracy to be a negative value, right?
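The same sketch applied to the second example is what produces the negative number that confuses me (again assuming absolute percentage error relative to the original value):

```python
original = [1000, 100]
predicted = [2500, -120]

errors = [abs(p - o) / abs(o) * 100 for o, p in zip(original, predicted)]
mean_error = sum(errors) / len(errors)   # (150 + 220) / 2 = 185
accuracy = 100 - mean_error              # 100 - 185 = -85
print(errors, mean_error, accuracy)      # [150.0, 220.0] 185.0 -85.0
```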
What about categorical values? How would I measure the average accuracy of the predictions in that case?
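For the categorical case, the only approach I can think of is counting exact matches, something like the following (the labels are made up, just for illustration):

```python
original = ["cat", "dog", "dog", "bird"]
predicted = ["cat", "dog", "bird", "bird"]

matches = sum(o == p for o, p in zip(original, predicted))
accuracy = matches / len(original) * 100   # 3 out of 4 correct
print(accuracy)                            # 75.0
```

Is that the right way to think about it, or is there a standard measure I should be using instead?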