Cohen’s Kappa Calculator

Cohen’s kappa measures the level of agreement between two raters or judges who each classify items into mutually exclusive categories, while accounting for the agreement that would be expected by chance. Cohen’s kappa is calculated as:
κ = (po – pe) / (1 – pe)
where:
  • po: The relative observed agreement among the raters
  • pe: The hypothetical probability of chance agreement, based on each rater’s category frequencies
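The formula above can be sketched in a few lines of Python. This is a minimal illustration, not the calculator’s own code; the function name cohens_kappa is our own:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Compute Cohen's kappa for two raters' category labels."""
    assert len(rater1) == len(rater2) and len(rater1) > 0
    n = len(rater1)
    # po: relative observed agreement (fraction of items both raters label the same)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n
    # pe: chance agreement, from each rater's marginal category frequencies
    c1, c2 = Counter(rater1), Counter(rater2)
    pe = sum((c1[cat] / n) * (c2[cat] / n) for cat in set(c1) | set(c2))
    return (po - pe) / (1 - pe)

# Example: two raters labelling five items
# cohens_kappa(["a", "a", "b", "b", "a"], ["a", "a", "b", "a", "a"])
# → po = 0.8, pe = 0.56, so κ = (0.8 − 0.56) / (1 − 0.56) ≈ 0.5455
```

Note that κ equals 1 when the raters agree on every item and 0 when the observed agreement is exactly what chance would predict.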
To find Cohen’s kappa between two raters, simply fill in the boxes below and then click the “Calculate” button.

