Determining Intercoder Agreement for a Collocation Identification Task

Conference contribution


Publication Details

Author(s): Krenn B, Evert S, Zinsmeister H
Publishing place: Vienna, Austria
Publication year: 2004
Conference Proceedings Title: Proceedings of KONVENS 2004
Page range: 89-96
Language: English


Abstract

In this paper, we describe an alternative to the kappa statistic for measuring intercoder agreement. We present a model based on the assumption that the observed surface agreement can be divided into (unknown amounts of) true agreement and chance agreement. This model leads to confidence interval estimates for the proportion of true agreement, which turn out to be comparable to confidence intervals for the kappa value. Thus we arrive at a meaningful alternative to the kappa statistic. We apply our approach to measuring intercoder agreement in a collocation annotation task, where human annotators were asked to classify PP-verb combinations extracted from a German text corpus as collocational versus non-collocational. Such a manual classification is essential for the evaluation of computational collocation extraction tools.
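For orientation, the sketch below (in Python, not taken from the paper) computes the two quantities the model starts from: the observed surface agreement of two annotators on a binary collocational / non-collocational labelling task, and the standard Cohen's kappa derived from it. The function name and the toy annotations are invented for illustration; the paper's confidence-interval estimator for the proportion of true agreement is not reproduced here.

    from collections import Counter

    def cohen_kappa(labels_a, labels_b):
        """Cohen's kappa for two annotators labelling the same items."""
        assert len(labels_a) == len(labels_b)
        n = len(labels_a)

        # Observed surface agreement: proportion of identically labelled items.
        p_observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n

        # Expected chance agreement, assuming independent annotators with the
        # observed marginal label distributions.
        freq_a = Counter(labels_a)
        freq_b = Counter(labels_b)
        p_chance = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2

        return (p_observed - p_chance) / (1 - p_chance)

    # Hypothetical annotations of six PP-verb combinations.
    ann1 = ["coll", "coll", "non", "non", "coll", "non"]
    ann2 = ["coll", "non", "non", "non", "coll", "coll"]
    print(cohen_kappa(ann1, ann2))  # kappa = 1/3 for the toy data

Where kappa rescales the difference between observed and chance agreement, the model described in the abstract instead treats the observed agreement as a mixture of true and chance agreement and estimates a confidence interval for the true-agreement proportion.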



FAU Authors / FAU Editors

Evert, Stefan Prof. Dr.
Lehrstuhl für Korpus- und Computerlinguistik


External institutions with authors

Universität Hamburg


How to cite

APA:
Krenn, B., Evert, S., & Zinsmeister, H. (2004). Determining Intercoder Agreement for a Collocation Identification Task. In Proceedings of KONVENS 2004 (pp. 89-96). Vienna, Austria.

MLA:
Krenn, Brigitte, Stefan Evert, and Heike Zinsmeister. "Determining Intercoder Agreement for a Collocation Identification Task." Proceedings of KONVENS 2004, Vienna, Austria, 2004. 89-96.

BibTeX: 
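@inproceedings{krenn2004intercoder,
  author    = {Krenn, Brigitte and Evert, Stefan and Zinsmeister, Heike},
  title     = {Determining Intercoder Agreement for a Collocation Identification Task},
  booktitle = {Proceedings of KONVENS 2004},
  year      = {2004},
  pages     = {89--96},
  address   = {Vienna, Austria}
}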
