
Inter-Rater Reliability Calculator

Calculate Cohen's kappa coefficient to measure agreement between two raters beyond what would be expected by chance. Get an interpretation of your reliability score.
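
Cohen's kappa compares the observed agreement p_o with the agreement p_e that would be expected by chance alone:

    \kappa = \frac{p_o - p_e}{1 - p_e}

A kappa of 1 indicates perfect agreement, 0 indicates agreement no better than chance, and negative values indicate agreement worse than chance.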

Inputs

  - Observed agreement (%)
  - Expected agreement (%)
Results (example)

  - Cohen's Kappa: 0.6
  - Interpretation: Moderate
  - Observed Agreement: 80%
  - Expected Agreement: 50%
  - Agreements: 80
  - Disagreements: 20
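
Plugging the example values into the formula above:

    \kappa = \frac{0.80 - 0.50}{1 - 0.50} = \frac{0.30}{0.50} = 0.6

On the interpretation scale commonly attributed to Landis and Koch, 0.6 sits at the top of the "moderate" band (0.41–0.60), just below "substantial" (0.61–0.80).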
How to Use This Calculator
  1. Enter the number of raters and the number of items rated.
  2. Select the rating scale type (nominal, ordinal, or interval).
  3. Enter the percentage of agreements observed across all rater pairs.
  4. Review the Cohen's kappa (or ICC) coefficient and its interpretation; a computation sketch follows this list.
  5. If kappa is below 0.60, schedule a rater calibration session and re-rate a subset of items.
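
The page does not show how kappa is derived from raw ratings, so here is a minimal pure-Python sketch for the two-rater nominal case; the helper names cohens_kappa and interpret are illustrative, not the calculator's own code. It reproduces the example above (80 agreements, 20 disagreements, 50/50 label marginals for both raters).

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        # Cohen's kappa for two raters labelling the same items on a nominal scale.
        n = len(rater_a)
        # Observed agreement p_o: fraction of items both raters labelled identically.
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Expected agreement p_e: probability of coincidental agreement given
        # each rater's own marginal label frequencies.
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
        return (p_o - p_e) / (1 - p_e)

    def interpret(kappa):
        # One widely cited scale (Landis and Koch, 1977).
        kappa = round(kappa, 10)  # guard band edges against float rounding
        for upper, label in [(0.00, "Poor"), (0.20, "Slight"), (0.40, "Fair"),
                             (0.60, "Moderate"), (0.80, "Substantial")]:
            if kappa <= upper:
                return label
        return "Almost perfect"

    # Reproduce the example above: 100 items, 80 agreements, 20 disagreements,
    # each rater using "yes" and "no" 50 times each, so p_e = 0.5.
    rater_a = ["yes"] * 50 + ["no"] * 50
    rater_b = ["yes"] * 40 + ["no"] * 50 + ["yes"] * 10
    k = cohens_kappa(rater_a, rater_b)
    print(k, interpret(k))  # 0.6 Moderate (up to float rounding)

Note that Cohen's kappa estimates chance agreement from each rater's own marginal frequencies; this is what distinguishes it from Scott's pi, which pools the marginals, and from Fleiss' kappa, which generalizes to more than two raters.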