Kappa is a statistic that assesses a measurement system based on its degree of agreement with a known standard, to see whether it performs better than simply guessing at the right answer (usually for pass/fail decisions).

If you flipped a coin and guessed heads or tails, you would be right about 50% of the time by chance alone.

- Kappa = 0 means that you did no better than random chance (in the coin example, 50% correct)
- Kappa < 0 means that you did worse than random chance (less than 50% correct)
- Kappa > 0 means that you did better than random chance (more than 50% correct)
- Kappa = 1 means perfect agreement (correct 100% of the time)

The formula for calculating kappa is:

Kappa = (P_{o} − P_{e}) / (1 − P_{e})

Where P_{o} is the observed proportion of agreement (the proportion of correct answers), and P_{e} is the proportion of agreement expected by chance.

In the coin toss study, P_{o} = 62 heads out of 100 = 0.62, and P_{e} = 0.50 (a 50% chance of heads), so Kappa = (0.62 − 0.50) / (1 − 0.50) = 0.24 — modestly better than chance.
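As a quick illustration, the calculation above can be sketched in a few lines of Python (the function name `kappa` is just for this example):

```python
def kappa(p_o, p_e):
    """Kappa: agreement beyond what chance alone would produce.

    p_o: observed proportion of agreement (proportion of correct answers)
    p_e: proportion of agreement expected by chance
    """
    return (p_o - p_e) / (1 - p_e)

# Coin toss study from the text: 62 correct out of 100,
# with a 50% chance of being right by guessing alone.
print(kappa(0.62, 0.50))  # approximately 0.24
```

Note how the formula rescales the result: a score equal to chance (p_o = p_e) gives 0, and a perfect score (p_o = 1) gives 1.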

