Table 1 Inter-rater agreement (percentage agreement) and reliability (kappa coefficients) on whether the property was evaluated in an article (COSMIN step 1)

From: Inter-rater agreement and reliability of the COSMIN (COnsensus-based Standards for the selection of health status Measurement Instruments) Checklist

Measurement property      Percentage agreement   Intraclass kappa^a
Internal consistency      94                     0.66
Reliability               94                     0.77
Measurement error         94                     0.02^b
Content validity          84                     0.29
Structural validity       86                     0.48
Hypotheses testing        87                     0.29
Cross-cultural validity   95                     0.66^b
Criterion validity        93                     0.23^b
Responsiveness            96                     0.81
Interpretability          86                     0.02^b
^a Number of ratings on the 75 articles = 263. ^b Items with low dispersal, i.e. more than 75% of the raters who responded to an item rated the same response category. Bold indicates kappa > 0.70 or percentage agreement > 80%.
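Two statistics appear in this table: percentage agreement, the share of rating pairs in which raters gave the same response, and kappa, which corrects that agreement for the agreement expected by chance. The sketch below illustrates both in the simplest setting of two raters and a binary rating (was the property evaluated: yes or no). The ratings are hypothetical, and the table reports an intraclass kappa pooled over many raters (263 ratings on 75 articles), so this is a minimal illustration of the metrics rather than a reproduction of the study's analysis.

```python
# Minimal sketch: percentage agreement and Cohen's kappa for two raters.
# The ratings below are hypothetical; the study pooled ratings from
# multiple raters per article using an intraclass (multi-rater) kappa.
from collections import Counter

def percentage_agreement(rater1, rater2):
    """Percent of items on which the two raters gave the same rating."""
    matches = sum(a == b for a, b in zip(rater1, rater2))
    return 100.0 * matches / len(rater1)

def cohens_kappa(rater1, rater2):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n  # observed agreement
    c1, c2 = Counter(rater1), Counter(rater2)
    # Agreement expected if each rater chose categories independently,
    # at their own marginal rates.
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in set(rater1) | set(rater2))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: did each of 10 articles evaluate "reliability"?
r1 = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
r2 = ["yes", "yes", "no", "no",  "no", "yes", "yes", "no", "yes", "no"]
print(percentage_agreement(r1, r2))   # 80.0
print(round(cohens_kappa(r1, r2), 2)) # 0.6

# Low-dispersal case (cf. footnote b): when nearly all responses fall in
# one category, chance agreement p_e is high and kappa collapses even
# though raw agreement stays high.
r3 = ["no"] * 9 + ["yes"]
r4 = ["no"] * 10
print(percentage_agreement(r3, r4))   # 90.0
print(round(cohens_kappa(r3, r4), 2)) # 0.0
```

The second example shows why footnote b matters for reading the table: rows such as measurement error and interpretability combine high agreement (94% and 86%) with a kappa of only 0.02, because most raters gave the same response category and little agreement beyond chance remained to be measured.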