Confusion Matrix Analyzer
Analyze classification model performance from confusion matrix counts. Calculate accuracy, precision, recall, F1 score, specificity, Matthews correlation coefficient (MCC), and more.
Inputs
True positives (TP), false positives (FP), true negatives (TN), and false negatives (FN).
Results (example)
- Accuracy: 0.90
- Precision: 0.83
- Recall (Sensitivity): 0.91
- F1 Score: 0.87
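As a sketch of how these metrics follow from the four counts, the function below derives them directly from TP, FP, TN, and FN. The specific counts in the usage example (TP=91, FP=19, TN=161, FN=9) are hypothetical, chosen only so the output roughly matches the example results above:

```python
def confusion_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Derive standard classification metrics from confusion-matrix counts."""
    total = tp + fp + tn + fn
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0          # a.k.a. sensitivity
    specificity = tn / (tn + fp) if tn + fp else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    # Matthews correlation coefficient: stays informative on skewed classes.
    denom = ((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)) ** 0.5
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return {
        "accuracy": (tp + tn) / total,
        "precision": precision,
        "recall": recall,
        "specificity": specificity,
        "f1": f1,
        "mcc": mcc,
        # Single-threshold AUC estimate (balanced accuracy); a true AUC
        # requires scores evaluated across many thresholds.
        "auc_estimate": (recall + specificity) / 2,
    }

# Hypothetical counts that roughly reproduce the example results above.
m = confusion_metrics(tp=91, fp=19, tn=161, fn=9)
print({k: round(v, 2) for k, v in m.items()})
```

With these counts, accuracy comes out to 0.90, precision 0.83, recall 0.91, and F1 0.87, matching the example display within rounding.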
How to Use This Calculator
- Enter the true positive, false positive, true negative, and false negative counts from your model.
- Review accuracy, precision, recall (sensitivity), and F1 score.
- Check specificity and the AUC estimate for a more balanced evaluation, especially on imbalanced classes.
- Identify whether false positives or false negatives are more costly for your use case.
- Adjust the classification threshold to trade off precision vs. recall based on business requirements.
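The threshold trade-off in the last step can be sketched with a toy example: lowering the threshold turns more borderline scores into positive predictions, which raises recall at the cost of precision. The scores and labels below are made up for illustration:

```python
def precision_recall_at(scores, labels, threshold):
    """Compute precision and recall after thresholding predicted scores."""
    preds = [1 if s >= threshold else 0 for s in scores]
    tp = sum(p == 1 and y == 1 for p, y in zip(preds, labels))
    fp = sum(p == 1 and y == 0 for p, y in zip(preds, labels))
    fn = sum(p == 0 and y == 1 for p, y in zip(preds, labels))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Made-up model scores and ground-truth labels.
scores = [0.9, 0.8, 0.7, 0.4, 0.35, 0.2]
labels = [1,   1,   0,   1,   0,    0]

for t in (0.5, 0.3):
    p, r = precision_recall_at(scores, labels, t)
    print(f"threshold={t}: precision={p:.2f}, recall={r:.2f}")
```

Dropping the threshold from 0.5 to 0.3 catches the positive scored at 0.4 (recall rises to 1.0) but also admits another false positive (precision falls). Which direction to move depends on whether false positives or false negatives cost more in your use case.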
Related Calculators
A/B Test Calculator
Determine statistical significance of A/B test results. Calculate z-score, p-value, conversion lift, and whether your variant beats the control at your chosen confidence level.
Feature Scaling Calculator
Scale raw feature values using min-max normalization, z-score standardization, robust scaling, and max-abs normalization for machine learning preprocessing.
Clustering Quality Score Calculator
Evaluate clustering quality using silhouette score, Calinski-Harabasz index, Davies-Bouldin index, and Dunn index from cluster distance metrics.
Six Sigma Level Calculator
Calculate sigma level, DPMO, defect rate, and process yield from defects, opportunities, and units. Includes performance level assessment and COPQ estimate.