Gianinazzi, Micòl E. and Rueegg, Corina S. and Zimmermann, Karin and Kuehni, Claudia E. and Michel, Gisela and Swiss Paediatric Oncology Group (2015) Intra-rater and inter-rater reliability of a medical record abstraction study on transition of care after childhood cancer. PLoS ONE, 10 (5). e0124290.
Official URL: https://edoc.unibas.ch/63387/
Abstract
The abstraction of data from medical records is a widespread practice in epidemiological research, yet studies using this means of data collection rarely report reliability. Within the Transition after Childhood Cancer Study (TaCC), which is based on medical record abstraction, we conducted a second independent abstraction of data with the aim of assessing a) the intra-rater reliability of one rater at two time points; b) possible learning effects between these two time points compared with a gold standard; and c) inter-rater reliability.

Within the TaCC study we conducted a systematic medical record abstraction in the 9 Swiss clinics with pediatric oncology wards. In a second phase we selected a subsample of medical records in 3 clinics for a second independent abstraction. We then assessed intra-rater reliability at two time points, the learning effect over time (comparing each rater at two time points with a gold standard), and the inter-rater reliability of a selected number of variables. We calculated percentage agreement and Cohen's kappa.

For the assessment of intra-rater reliability we included 154 records (80 for rater 1; 74 for rater 2); for inter-rater reliability we could include 70 records. Intra-rater reliability was substantial to excellent (Cohen's kappa 0.6-0.8), with an observed percentage agreement of 75%-95%. Learning effects were observed in all variables. Inter-rater reliability was substantial to excellent (Cohen's kappa 0.70-0.83), with high agreement ranging from 86% to 100%.

Our study showed that data abstracted from medical records are reliable. Investigating intra-rater and inter-rater reliability gives confidence in conclusions drawn from the abstracted data and increases data quality by minimizing systematic errors.
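The two reliability measures reported in the abstract, percentage agreement and Cohen's kappa, are straightforward to compute from paired ratings. Below is a minimal illustrative sketch (not the study's actual analysis code; the function name and example data are invented for demonstration):

```python
from collections import Counter


def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters labelling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    percentage agreement and p_e is the agreement expected by chance
    from each rater's marginal label frequencies.
    """
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: fraction of items where both raters agree.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of the raters' marginal frequencies,
    # summed over all labels.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[lab] * freq_b.get(lab, 0) for lab in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)


# Hypothetical abstraction of one binary variable by two raters:
rater1 = ["yes", "yes", "no", "no", "yes", "no"]
rater2 = ["yes", "yes", "no", "yes", "yes", "no"]
agreement = sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)
kappa = cohens_kappa(rater1, rater2)
```

Because kappa discounts chance agreement, it is lower than raw percentage agreement whenever the raters could agree by luck alone, which is why the study reports both figures.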
Faculties and Departments: 03 Faculty of Medicine > Departement Public Health > Institut für Pflegewissenschaft
UniBasel Contributors: Zimmermann, Karin
Item Type: Article, refereed
Article Subtype: Research Article
ISSN: 1932-6203
Note: Publication type according to Uni Basel Research Database: Journal article
Last Modified: 08 Dec 2018 11:41
Deposited On: 08 Dec 2018 11:41