StatsMiniBlog: Kappa

After a short pause while brain cells were diverted elsewhere, we’re returning with the critically acclaimed (well, slightly positively tweeted) StatsMiniBlog series.

(As an aside – do let me know via comments, Facebook or Twitter if there’s an issue you’d like to see covered)

Kappa (κ) is a measure of agreement, usually between two observers of a dichotomous outcome, although there are variants for multiple observers. It gives you a measure of how much of the agreement you see is ‘beyond chance’.

The idea is that if you got two observers to report on the colour of balls emerging from a bag containing 5 blue and 5 yellow, and they didn’t look at them but just guessed, you would expect them to agree with each other by chance alone: 0.5 × 0.5 = 0.25 of the time both saying ‘blue’, plus 0.25 of the time both saying ‘yellow’, giving 0.5 (50%).

Anything over this is “agreement beyond chance”.
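Put slightly more formally (the notation here is mine, not the original post’s), kappa compares the observed agreement, p_o, with the agreement expected by chance, p_e:

\[
\kappa = \frac{p_o - p_e}{1 - p_e}
\]

So if the two observers agreed on 8 of 10 balls (p_o = 0.8) when chance alone would give p_e = 0.5, kappa = (0.8 − 0.5) / (1 − 0.5) = 0.6.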

The value given to kappa reflects the degree of agreement: 1 is perfect agreement, 0 means agreement no better than chance (negative values mean agreement worse than chance), and values in between are usually interpreted as follows:

Kappa (κ)      Strength of agreement
< 0.20         Poor
0.21 – 0.40    Fair
0.41 – 0.60    Moderate
0.61 – 0.80    Good
0.81 – 1.00    Excellent
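For anyone who wants to see the sums worked through, here is a minimal sketch in Python of the same calculation for two raters of a dichotomous outcome (the function name and example data are illustrative, not from the post):

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters rating the same items."""
    n = len(ratings_a)
    # Observed agreement: proportion of items the two raters label identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: for each category, multiply the raters' marginal
    # proportions, then sum over categories.
    categories = set(ratings_a) | set(ratings_b)
    p_e = sum(
        (ratings_a.count(c) / n) * (ratings_b.count(c) / n)
        for c in categories
    )
    return (p_o - p_e) / (1 - p_e)

# Two observers reporting ball colours, agreeing on 8 of the 10 balls.
rater_1 = ["blue"] * 5 + ["yellow"] * 5
rater_2 = ["blue"] * 4 + ["yellow"] * 5 + ["blue"]
print(round(cohens_kappa(rater_1, rater_2), 2))  # 0.6

With observed agreement of 0.8 and chance agreement of 0.5 this gives κ = 0.6, which the table above would call moderate.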


How good you ‘need’ this to be depends on what you’re using it for. In the same way that there is no “right” wine to go with a meal, there is no “correct” kappa value.

– Archi

