PDF) Weighted Krippendorff's alpha is a more reliable metrics for multi- coders ordinal annotations: experimental studies on emotion, opinion and coreference annotation | JY Jya - Academia.edu
Measuring Inter-coder Agreement - ATLAS.ti
On the usage of Kappa to evaluate agreement on coding tasks
Reliability Coefficient, Krippendorff's Alpha for the Coding Scheme... | Download Table
Krippendorff's Alpha Ratings | Real Statistics Using Excel
Krippendorff's Alpha Overview | Real Statistics Using Excel
Testing the Difference of Correlated Agreement Coefficients for Statistical Significance - Kilem L. Gwet, 2016