Abstract
Studies in applied behavior analysis have used percentage agreement as an index of interobserver reliability. Because percentage agreement is inflated by chance agreements, kappa has been recommended as a preferred alternative. In this article, we present a procedure to estimate maximum and minimum kappa from percentage agreement with limited information. The procedure will facilitate comparisons between studies and the reassessment of the reliabilities of previous studies.
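The core relationship can be illustrated with a short sketch. This is not the article's exact estimation procedure (which is not reproduced here); it only shows, under stated assumptions, how bounds on chance agreement translate into maximum and minimum kappa. The example chance-agreement range (0.50 to 0.80) is hypothetical.

```python
# Cohen's kappa corrects observed percentage agreement p_o for
# chance agreement p_e:
#     kappa = (p_o - p_e) / (1 - p_e)
# For fixed p_o < 1, kappa decreases monotonically in p_e, so bounds
# on p_e yield bounds on kappa.

def kappa(p_o, p_e):
    """Cohen's kappa from observed (p_o) and chance (p_e) agreement."""
    if p_e >= 1.0:
        raise ValueError("chance agreement must be < 1")
    return (p_o - p_e) / (1.0 - p_e)

def kappa_bounds(p_o, p_e_min, p_e_max):
    """Maximum and minimum kappa for an assumed range of chance agreement.

    Because kappa decreases in p_e (for p_o < 1), the smallest plausible
    p_e gives the maximum kappa and the largest gives the minimum.
    """
    return kappa(p_o, p_e_min), kappa(p_o, p_e_max)

# Hypothetical example: 90% observed agreement; assume chance agreement
# lies between 0.50 (two equiprobable categories) and 0.80.
k_max, k_min = kappa_bounds(0.90, 0.50, 0.80)
print(k_max, k_min)  # 0.8 0.5
```

Note how the same 90% agreement figure is compatible with kappa values from 0.5 to 0.8, which is exactly why reporting percentage agreement alone can overstate reliability.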
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 375-378 |
| Number of pages | 4 |
| Journal | Behavioral Assessment |
| Volume | 6 |
| Issue number | 4 |
| State | Published - Sep 1984 |
All Science Journal Classification (ASJC) codes
- Psychiatry and Mental health