What is a confusion matrix?
A. A matrix used to encode labels
B. A performance summary of a regression model
C. A table that shows correct and incorrect predictions of a classification model
D. A matrix used to confuse the model
Analysis & Theory
A confusion matrix is a table used to evaluate the performance of a classification model by showing true vs. predicted labels.
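As an illustrative sketch (the label lists below are made up for demonstration), a binary confusion matrix can be built by simply counting the four outcome types:

```python
# Made-up true and predicted labels for a binary classifier.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Count each of the four outcome types.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

# Rows = actual class, columns = predicted class, negative class first,
# matching scikit-learn's layout: [[TN, FP], [FN, TP]].
matrix = [[tn, fp], [fn, tp]]
print(matrix)  # [[3, 1], [1, 3]]
```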
In a binary confusion matrix, what does the top-left cell represent?
Analysis & Theory
The top-left cell usually represents True Negatives — when the model correctly predicts the negative class.
What does the bottom-right cell in a confusion matrix represent?
Analysis & Theory
The bottom-right cell represents True Positives — correctly predicted positive class cases.
Which value from the confusion matrix indicates false alarms?
Analysis & Theory
False Positives are incorrect positive predictions — model predicts positive, but it's actually negative.
Which metric is calculated as TP / (TP + FP)?
Analysis & Theory
Precision = True Positives / (True Positives + False Positives) — it measures how many predicted positives are actual positives.
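With illustrative counts (the numbers below are arbitrary), precision is a one-line computation:

```python
# Precision = TP / (TP + FP): of all positive predictions, how many were right.
# Example counts are made up: 8 correct positive predictions, 2 false alarms.
tp, fp = 8, 2
precision = tp / (tp + fp)
print(precision)  # 0.8
```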
What does a high number of false negatives indicate?
A. Model often misses actual positives
B. Model overpredicts positives
C. Model is perfectly accurate
Analysis & Theory
False negatives are cases where the model fails to identify actual positive instances.
Which metric is best when false negatives are more important than false positives?
Analysis & Theory
Recall is critical when missing a positive (false negative) is costly — such as in medical diagnosis.
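A minimal sketch with made-up counts shows how recall is computed from the matrix cells:

```python
# Recall = TP / (TP + FN): of all actual positives, how many were found.
# Illustrative counts: 9 positives caught, 3 positives missed.
tp, fn = 9, 3
recall = tp / (tp + fn)
print(recall)  # 0.75
```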
What is the formula for accuracy using a confusion matrix?
A. (TP + TN) / (TP + TN + FP + FN)
Analysis & Theory
Accuracy measures the proportion of total correct predictions: (TP + TN) / Total samples.
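The formula above can be checked with arbitrary example counts:

```python
# Accuracy over all four confusion-matrix cells (counts are illustrative).
tp, tn, fp, fn = 50, 35, 10, 5
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)  # 0.85
```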
What does the F1 score represent?
A. Average of sensitivity and specificity
B. Average of accuracy and recall
C. Harmonic mean of precision and recall
D. Geometric mean of all matrix values
Analysis & Theory
F1 Score balances precision and recall using the harmonic mean: 2 * (Precision * Recall) / (Precision + Recall).
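Plugging illustrative precision and recall values into the harmonic-mean formula:

```python
# F1 as the harmonic mean of precision and recall (values are made up).
precision, recall = 0.8, 0.6
f1 = 2 * (precision * recall) / (precision + recall)
print(round(f1, 4))  # 0.6857
```

Note that the harmonic mean punishes imbalance: if either precision or recall is near zero, F1 is near zero regardless of the other value.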
Which Python library provides a function to generate a confusion matrix?
Analysis & Theory
`sklearn.metrics.confusion_matrix()` is used to create a confusion matrix from true and predicted labels.
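A short usage sketch, assuming scikit-learn is installed and using made-up label lists:

```python
from sklearn.metrics import confusion_matrix

# Illustrative true and predicted labels.
y_true = [0, 1, 1, 0, 1, 0]
y_pred = [0, 1, 0, 0, 1, 1]

cm = confusion_matrix(y_true, y_pred)
# In the binary case the layout is [[TN, FP], [FN, TP]],
# so ravel() unpacks the four counts in that order.
tn, fp, fn, tp = cm.ravel()
print(cm.tolist())  # [[2, 1], [1, 2]]
```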