Calcimator

Model Performance Metrics Calculator

Calculate accuracy, precision, recall, F1 score, MCC, AUC, and other classification performance metrics.

Inputs

Results

Accuracy

87.5%

Precision

89.47%

Recall (Sensitivity)

85%

F1 Score

87.18%

Matthews Correlation Coefficient

75.09

AUC (Area Under Curve)

87.5


How to Use This Calculator
  1. Start by filling in the input fields below. Results update instantly as you type, so you can experiment with different values to see how they affect the outcome.
  2. True Positives (TP) — correctly predicted positive cases. Minimum value: 0 (default: 85).
  3. True Negatives (TN) — correctly predicted negative cases. Minimum value: 0 (default: 90).
  4. False Positives (FP) — negative cases incorrectly predicted as positive. Minimum value: 0 (default: 10).
  5. False Negatives (FN) — positive cases incorrectly predicted as negative. Minimum value: 0 (default: 15).
  6. Once all inputs are set, review your results in the Results panel. Here's what each output means:
  7. Accuracy — the share of all predictions that are correct, shown as a percentage.
  8. Precision — the share of predicted positives that are actually positive, shown as a percentage.
  9. Recall (Sensitivity) — the share of actual positives that are correctly identified, shown as a percentage.
  10. F1 Score — the harmonic mean of precision and recall, shown as a percentage.
  11. Matthews Correlation Coefficient — a balanced measure that uses all four confusion-matrix counts, shown as a numeric value.
  12. AUC (Area Under Curve) — shown as a numeric value.
  13. Explore the related calculators below if you need deeper analysis or want to approach this topic from a different angle.
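The outputs above can be reproduced from the four inputs alone. The sketch below is a hypothetical reconstruction (not the calculator's actual source) using the default inputs TP=85, TN=90, FP=10, FN=15; in particular, treating the AUC as the single-threshold balanced accuracy (TPR + TNR) / 2 is an assumption about how this calculator arrives at its AUC figure.

```python
# Hypothetical sketch of the metrics this calculator derives from a
# confusion matrix, using the default inputs (TP=85, TN=90, FP=10, FN=15).
from math import sqrt

def classification_metrics(tp, tn, fp, fn):
    total = tp + tn + fp + fn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)            # sensitivity / true positive rate
    specificity = tn / (tn + fp)       # true negative rate
    f1 = 2 * precision * recall / (precision + recall)
    mcc = (tp * tn - fp * fn) / sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
    )
    # At a single decision threshold the ROC curve has one interior point,
    # and the trapezoidal AUC reduces to (TPR + TNR) / 2 -- assumed here.
    auc = (recall + specificity) / 2
    return accuracy, precision, recall, f1, mcc, auc

acc, prec, rec, f1, mcc, auc = classification_metrics(85, 90, 10, 15)
print(f"Accuracy:  {acc:.2%}")   # 87.50%
print(f"Precision: {prec:.2%}")  # 89.47%
print(f"Recall:    {rec:.2%}")   # 85.00%
print(f"F1 Score:  {f1:.2%}")    # 87.18%
print(f"MCC:       {mcc:.4f}")   # 0.7509
print(f"AUC:       {auc:.2%}")   # 87.50%
```

Note that the MCC ranges from −1 to +1 as a correlation; the panel's 75.09 corresponds to an MCC of about 0.7509 displayed on a 0–100 scale.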

Formula

Accuracy = (TP + TN) / (TP + TN + FP + FN)

Precision = TP / (TP + FP)

Recall = TP / (TP + FN)

F1 = 2 × (Precision × Recall) / (Precision + Recall)

MCC = (TP × TN − FP × FN) / √((TP + FP)(TP + FN)(TN + FP)(TN + FN))
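As a quick sanity check, plugging the rounded precision (89.47%) and recall (85%) shown in the Results panel into the F1 formula reproduces the displayed F1 score:

```python
# Verify the displayed F1 score from the rounded precision and recall
# shown in the Results panel (89.47% and 85%).
precision = 0.8947
recall = 0.85
f1 = 2 * (precision * recall) / (precision + recall)
print(f"F1 Score: {f1:.2%}")  # 87.18%
```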

Related Calculators
