Calcimator

Confusion Matrix Analyzer

Analyze classification model performance from confusion matrix values. Calculate precision, recall, F1 score, MCC, accuracy, specificity, and more.

Inputs: true positives (TP), false positives (FP), true negatives (TN), false negatives (FN)

Results

Accuracy: 0.90
Precision: 0.83
Recall (Sensitivity): 0.91
F1 Score: 0.87
Matthews Correlation Coeff.: 0.79
Specificity: 0.89
False Positive Rate: 0.11
Balanced Accuracy: 0.90
Prevalence: 0.38
Total Samples: 145
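The metrics above all follow directly from the four confusion-matrix counts. A minimal sketch, using hypothetical counts (TP=50, FP=10, TN=80, FN=5) chosen to be consistent with the example figures shown, since the actual inputs are not displayed:

```python
import math

# Hypothetical counts, assumed for illustration; they reproduce the
# example results above but are not taken from the page's inputs.
tp, fp, tn, fn = 50, 10, 80, 5

total = tp + fp + tn + fn
accuracy = (tp + tn) / total
precision = tp / (tp + fp)
recall = tp / (tp + fn)                 # sensitivity / true positive rate
specificity = tn / (tn + fp)
fpr = fp / (fp + tn)                    # false positive rate = 1 - specificity
f1 = 2 * precision * recall / (precision + recall)
balanced_accuracy = (recall + specificity) / 2
prevalence = (tp + fn) / total
mcc = (tp * tn - fp * fn) / math.sqrt(
    (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
)

print(f"Accuracy {accuracy:.2f}, Precision {precision:.2f}, Recall {recall:.2f}")
print(f"F1 {f1:.2f}, MCC {mcc:.2f}, Specificity {specificity:.2f}")
```

Note that MCC and balanced accuracy use all four cells, which is why they are more informative than plain accuracy on imbalanced data (here prevalence is 0.38).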
How to Use This Calculator
  1. Enter the true positive, false positive, true negative, and false negative counts from your model.
  2. Review accuracy, precision, recall (sensitivity), and F1 score.
  3. Check specificity and AUC estimate for balanced evaluation.
  4. Identify whether false positives or false negatives are more costly for your use case.
  5. Adjust the classification threshold to trade off precision vs. recall based on business requirements.