Calcimator

Model Performance Metrics Calculator

Calculate accuracy, precision, recall, F1 score, MCC, AUC, and other classification performance metrics.

Inputs

Results

Accuracy

87.5%

Precision

89.47%

Recall (Sensitivity)

85%

F1 Score

87.18%

Matthews Correlation Coefficient

0.7509

AUC (Area Under Curve)

87.5%
How to Use This Calculator
  1. Enter the confusion matrix values: true positives, false positives, true negatives, false negatives.
  2. Review the derived metrics: accuracy, precision, recall, F1 score, and specificity.
  3. Set the positive class prevalence to see how threshold changes affect the precision-recall tradeoff.
  4. For imbalanced datasets, prioritize F1 score or AUC-ROC over raw accuracy.
  5. Compare metrics across models to select the best performer on your evaluation criteria.
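The steps above can be sketched in code. This is a minimal example, assuming the confusion-matrix counts TP=17, FP=2, TN=18, FN=3 (chosen so the derived metrics match the results shown above); substitute your own counts.

```python
# Classification metrics from a 2x2 confusion matrix.
# Counts are assumed example values: TP=17, FP=2, TN=18, FN=3.
from math import sqrt

def classification_metrics(tp, fp, tn, fn):
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                  # sensitivity
    specificity = tn / (tn + fp)
    f1 = 2 * precision * recall / (precision + recall)
    mcc = (tp * tn - fp * fn) / sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
    )
    # With a single threshold, AUC reduces to balanced accuracy.
    auc = (recall + specificity) / 2
    return {"accuracy": accuracy, "precision": precision, "recall": recall,
            "specificity": specificity, "f1": f1, "mcc": mcc, "auc": auc}

m = classification_metrics(tp=17, fp=2, tn=18, fn=3)
print(f"Accuracy:  {m['accuracy']:.2%}")   # 87.50%
print(f"Precision: {m['precision']:.2%}")  # 89.47%
print(f"Recall:    {m['recall']:.2%}")     # 85.00%
print(f"F1 score:  {m['f1']:.2%}")         # 87.18%
print(f"MCC:       {m['mcc']:.4f}")        # 0.7509
```

Note that MCC is reported on a [-1, 1] scale rather than as a percentage, which makes it robust to class imbalance.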

Formula

F1 = 2 Γ— (Precision Γ— Recall) / (Precision + Recall)
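Because F1 is the harmonic mean of precision and recall, it is pulled toward the smaller of the two. A quick check using the precision and recall from the example results above:

```python
# F1 as the harmonic mean of precision and recall.
# Values match the example results above: precision = 17/19, recall = 17/20.
precision, recall = 17 / 19, 17 / 20
f1 = 2 * precision * recall / (precision + recall)
print(f"{f1:.2%}")  # 87.18%
```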

Related Calculators
