Accuracy, F1 Score, Precision and Recall in Machine Learning

You must have come across terms like Accuracy, F1 Score, Confusion Matrix, Precision and Recall while evaluating the performance of your machine learning models. In this article, I will take you through what Accuracy, F1 Score, Confusion Matrix, Precision and Recall mean in Machine Learning.

Introduction to Accuracy, F1 Score, Confusion Matrix, Precision and Recall

After training a machine learning model, let’s say a classification model with class labels 0 and 1, the next step is to make predictions on the test data. To find out how well our model works on the test data, we usually print a confusion matrix.
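
Here is a minimal sketch of that step in Python, assuming a scikit-learn style workflow with a toy dataset and a logistic regression classifier (the dataset and model here are illustrative assumptions, not from the article):

  from sklearn.datasets import make_classification
  from sklearn.linear_model import LogisticRegression
  from sklearn.model_selection import train_test_split
  from sklearn.metrics import confusion_matrix

  # Toy binary classification data with class labels 0 and 1
  X, y = make_classification(n_samples=500, random_state=42)
  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

  # Fit a simple classifier and predict on the test data
  model = LogisticRegression().fit(X_train, y_train)
  y_pred = model.predict(X_test)

  # scikit-learn arranges the matrix as [[TN, FP], [FN, TP]]
  print(confusion_matrix(y_test, y_pred))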


The confusion matrix generally looks like the figure shown below:

[Figure: confusion matrix of Actual values vs Predicted values]
  1. TP means True Positive
  2. FP means False Positive
  3. TN means True Negative
  4. FN means False Negative

As you can see in the figure above, the confusion matrix is laid out along two axes: Actual values and Predicted values. These four counts are the building blocks of all the other performance evaluation metrics, like Accuracy, F1 Score, Precision and Recall.

Accuracy is the ratio of correct predictions (both positive and negative) to the total number of predictions. Accuracy = (True Positive + True Negative) / (True Positive + False Positive + True Negative + False Negative).
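
As a quick worked example with hypothetical counts (TP = 40, FP = 5, TN = 45, FN = 10, chosen only for illustration):

  # Hypothetical confusion matrix counts, for illustration only
  TP, FP, TN, FN = 40, 5, 45, 10
  accuracy = (TP + TN) / (TP + FP + TN + FN)
  print(accuracy)  # 0.85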

Precision for class 1 answers the question: out of all the values predicted as class 1, how many actually belong to class 1? Precision = TP / (TP + FP).
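
With the same hypothetical counts as above:

  # Hypothetical counts, as above
  TP, FP = 40, 5
  precision = TP / (TP + FP)
  print(precision)  # ≈ 0.889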

Recall for class 1 answers the question: out of all the values that actually belong to class 1, how many are predicted as class 1? Recall = TP / (TP + FN).
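
Again with the same hypothetical counts:

  # Hypothetical counts, as above
  TP, FN = 40, 10
  recall = TP / (TP + FN)
  print(recall)  # 0.8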

There is a trade-off between precision and recall: if one increases, the other tends to decrease. Also, accuracy alone is sometimes not a good evaluation measure, especially on imbalanced datasets, which is why we take precision and recall into consideration.

To capture the combined effect of precision and recall, we use the F1 score, which is the harmonic mean of precision and recall. F1 score = 2 / (1 / Precision + 1 / Recall) = 2 × Precision × Recall / (Precision + Recall).
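
Plugging in the hypothetical precision and recall computed above:

  # Hypothetical precision and recall values from the earlier example
  precision, recall = 40 / 45, 40 / 50
  f1 = 2 / (1 / precision + 1 / recall)  # same as 2 * precision * recall / (precision + recall)
  print(f1)  # ≈ 0.842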

I hope you liked this article on the performance evaluation metrics of a machine learning model. Feel free to ask your valuable questions in the comments section below.

