Confusion Matrix and Conditional Probability

Understanding the confusion matrix and its relation to conditional probability.
Machine Learning
Probability
Published June 26, 2025

Let’s start with a quick overview of the confusion matrix. A confusion matrix is a table often used to describe the performance of a classification model: it summarizes the model’s predictions and compares them to the actual outcomes. Consider a simple binary classification problem with two classes, positive (1) and negative (0). The confusion matrix for this problem looks like this:

|                     | Predicted Positive (1) | Predicted Negative (0) |
|---------------------|------------------------|------------------------|
| Actual Positive (1) | True Positive (TP)     | False Negative (FN)    |
| Actual Negative (0) | False Positive (FP)    | True Negative (TN)     |
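
To make this concrete, here is a minimal sketch of how the matrix can be computed in Python with scikit-learn. The `y_true` and `y_pred` arrays are made-up example labels, and `labels=[1, 0]` is passed so the rows and columns follow the same positive-first ordering as the table above (scikit-learn’s default would put class 0 first).

```python
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0]   # actual outcomes (made-up example)
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # model predictions (made-up example)

# labels=[1, 0] orders rows/columns as Positive first, matching the table above:
# [[TP, FN],
#  [FP, TN]]
cm = confusion_matrix(y_true, y_pred, labels=[1, 0])
tp, fn, fp, tn = cm.ravel()

print(cm)
print(f"TP={tp}, FN={fn}, FP={fp}, TN={tn}")
```

With these example labels the output is `TP=3, FN=1, FP=1, TN=3`, i.e. three positives correctly identified, one positive missed, one negative falsely flagged, and three negatives correctly rejected.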

To measure the performance of the model, we can calculate several metrics based on the confusion matrix: