Determining Intercoder Agreement for a Collocation Identification Task

Krenn B, Evert S, Zinsmeister H (2004)


Publication Language: English

Publication Type: Conference Contribution

Publication year: 2004

City/Town: Vienna, Austria

Page Range: 89-96

Conference Proceedings Title: Proceedings of KONVENS 2004

URI: http://purl.org/stefan.evert/PUB/KrennEvertZinsmeister2004.pdf

Abstract


In this paper, we describe an alternative to the kappa statistic for measuring intercoder agreement. We present a model based on the assumption that the observed surface agreement can be divided into (unknown amounts of) true agreement and chance agreement. This model leads to confidence interval estimates for the proportion of true agreement, which turn out to be comparable to confidence intervals for the kappa value. Thus we arrive at a meaningful alternative to the kappa statistic. We apply our approach to measuring intercoder agreement in a collocation annotation task, where human annotators were asked to classify PP-verb combinations extracted from a German text corpus as collocational versus non-collocational. Such a manual classification is essential for the evaluation of computational collocation extraction tools.
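The paper's own contribution is a model-based confidence interval for the proportion of true agreement; the details are in the full text linked above. As a generic point of comparison only, the sketch below computes Cohen's kappa for a binary annotation task (collocational vs. non-collocational) together with a percentile-bootstrap confidence interval. The data and function names are invented for illustration; this is not the authors' estimator.

```python
import random

def cohen_kappa(pairs):
    """Cohen's kappa for two annotators' binary labels (1 = collocational)."""
    n = len(pairs)
    p_o = sum(a == b for a, b in pairs) / n    # observed agreement
    p_a = sum(a for a, _ in pairs) / n         # annotator A: rate of label 1
    p_b = sum(b for _, b in pairs) / n         # annotator B: rate of label 1
    p_e = p_a * p_b + (1 - p_a) * (1 - p_b)    # expected chance agreement
    return (p_o - p_e) / (1 - p_e)

def bootstrap_ci(pairs, stat=cohen_kappa, reps=10000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for an agreement statistic."""
    rng = random.Random(seed)
    n = len(pairs)
    stats = sorted(
        stat([pairs[rng.randrange(n)] for _ in range(n)]) for _ in range(reps)
    )
    return stats[int(reps * alpha / 2)], stats[int(reps * (1 - alpha / 2)) - 1]

# Invented toy data: (annotator A, annotator B) judgements for 100 PP-verb pairs.
pairs = [(1, 1)] * 60 + [(0, 0)] * 25 + [(1, 0)] * 10 + [(0, 1)] * 5
print("kappa =", cohen_kappa(pairs))
print("95% CI:", bootstrap_ci(pairs))
```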


How to cite

APA:

Krenn, B., Evert, S., & Zinsmeister, H. (2004). Determining Intercoder Agreement for a Collocation Identification Task. In Proceedings of KONVENS 2004 (pp. 89-96). Vienna, Austria.

MLA:

Krenn, Brigitte, Stefan Evert, and Heike Zinsmeister. "Determining Intercoder Agreement for a Collocation Identification Task." Proceedings of KONVENS 2004, Vienna, Austria, 2004. 89-96.
