Measuring Agreement of Multivariate Discrete Survival Times Using a Modified Weighted Kappa Coefficient
Summary measures of agreement and association between many raters' ordinal classifications (ScienceDirect)
Cohen's kappa with three categories of variable (Cross Validated)
An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters (Symmetry)
Weighted Kappa in R: Best Reference (Datanovia)
Analysis of the Weighted Kappa and Its Maximum with Markov Moves (SpringerLink)
Kappa Coefficient: an overview (ScienceDirect Topics)
Cohen's Kappa: Learn It, Use It, Judge It (KNIME)
Pairwise classifications of two observers who rated teacher 7 on 35... (ResearchGate figure)
Inter-Annotator Agreement (IAA): Pair-wise Cohen kappa and group Fleiss'… (Louis de Bruijn, Towards Data Science)
Interrater agreement of two adverse drug reaction causality assessment methods: a randomised comparison of the Liverpool Adverse Drug Reaction Causality Assessment Tool and the World Health Organization-Uppsala Monitoring Centre system (PLOS)
Weighted Cohen's Kappa (Real Statistics Using Excel)
Cohen's linearly weighted kappa is a weighted average of 2x2 kappas
Mean Pairwise Interrater Percentage Agreement and Kappa Scores (±... (ResearchGate table)
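The resources above all revolve around Cohen's weighted kappa for two raters' ordinal classifications. As a minimal, self-contained sketch of the linearly weighted variant (the function name and sample ratings are illustrative, not taken from any of the listed sources): chance-corrected agreement is computed as 1 minus the ratio of weighted observed disagreement to weighted chance-expected disagreement, with the weight for cells (i, j) equal to the category distance |i - j|.

```python
def linear_weighted_kappa(ratings_a, ratings_b, categories):
    """Linearly weighted Cohen's kappa for two raters on ordinal categories.

    kappa_w = 1 - (sum w_ij * O_ij) / (sum w_ij * E_ij),
    where O is the observed joint count matrix, E the chance-expected
    counts from the marginals, and w_ij = |i - j| (linear weights).
    """
    n = len(ratings_a)
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}

    # Observed k x k contingency table of joint classifications.
    obs = [[0] * k for _ in range(k)]
    for x, y in zip(ratings_a, ratings_b):
        obs[idx[x]][idx[y]] += 1

    # Marginal totals for each rater.
    row = [sum(obs[i]) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    observed = expected = 0.0
    for i in range(k):
        for j in range(k):
            w = abs(i - j)                      # linear disagreement weight
            observed += w * obs[i][j]           # weighted observed disagreement
            expected += w * row[i] * col[j] / n # weighted chance disagreement
    return 1.0 - observed / expected


# Perfect agreement yields kappa = 1; a single one-step disagreement
# out of four ratings yields a value between 0 and 1.
print(linear_weighted_kappa([0, 1, 2], [0, 1, 2], [0, 1, 2]))        # 1.0
print(linear_weighted_kappa([0, 1, 2, 1], [0, 1, 2, 0], [0, 1, 2]))
```

With quadratic weights (w = (i - j)^2) the same skeleton gives the quadratically weighted kappa, which several of the listed references compare against the linear form.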