RSNA Expert Consensus Statement on Reporting Chest CT Findings Related to COVID-19: Interobserver Agreement Between Chest Radiologists
Abstract
Purpose:
Methods:
Results:
Conclusion:
Introduction
Methods and Materials
Study Population
Imaging Protocol
Imaging Analysis
| Finding | Rad 1 (%) | Rad 2 (%) | Rad 3 (%) |
|---|---|---|---|
| Typical cases out of total (303) | 140 (46.2%) | 160 (52.8%) | 139 (45.9%) |
| *Percentage of typical cases:* | | | |
| Ground glass, total | 137 (97.9%) | 157 (98.1%) | 130 (93.5%) |
| - Round | 124 (88.6%) | 154 (96.3%) | 124 (89.2%) |
| - Peripheral | 137 (97.9%) | 156 (97.5%) | 133 (95.7%) |
| Crazy paving, total | 55 (39.3%) | 37 (23.1%) | 44 (31.7%) |
| - Round | 31 (22.1%) | 32 (20%) | 40 (28.8%) |
| - Peripheral | 55 (39.3%) | 40 (25%) | 43 (30.9%) |
| Consolidation | 91 (65%) | 95 (59.4%) | 72 (51.8%) |
| - Round | 49 (35%) | 90 (56.3%) | 61 (43.9%) |
| - Peripheral | 80 (57.1%) | 104 (65%) | 81 (58.3%) |
| Peribronchovascular | 45 (32.1%) | 43 (26.9%) | 48 (34.5%) |
| Perilobular | 70 (50%) | 49 (30.6%) | 29 (20.9%) |
| Consolidation with reverse halo | 55 (39.3%) | 20 (12.5%) | 33 (23.7%) |
| Posterior distribution | 140 (100%) | 160 (100%) | 139 (100%) |
| Bronchial dilatation | 5 (3.6%) | 11 (6.9%) | 6 (4.3%) |
Statistical Analysis
Results
| COVID-19 appearance | Agreement | Fleiss kappa | Standard error | P value | Interpretation |
|---|---|---|---|---|---|
| *Typical COVID-19 appearance* | | | | | |
| Radiologists 1, 2, and 3 | | 0.815 | | <.0001 | Almost perfect agreement |
| Radiologist 1 and radiologist 2 | 90.1% | 0.803 | 0.034 | <.0001 | Almost perfect agreement |
| Radiologist 1 and radiologist 3 | 91.1% | 0.821 | 0.033 | <.0001 | Almost perfect agreement |
| Radiologist 2 and radiologist 3 | 91.1% | 0.823 | 0.032 | <.0001 | Almost perfect agreement |
| *Indeterminate COVID-19 appearance* | | | | | |
| Radiologists 1, 2, and 3 | | 0.636 | | <.0001 | Substantial agreement |
| Radiologist 1 and radiologist 2 | 89.1% | 0.597 | 0.063 | <.0001 | Moderate agreement |
| Radiologist 1 and radiologist 3 | 89.4% | 0.668 | 0.054 | <.0001 | Substantial agreement |
| Radiologist 2 and radiologist 3 | 89.8% | 0.641 | 0.058 | <.0001 | Substantial agreement |
| *Atypical COVID-19 appearance* | | | | | |
| Radiologists 1, 2, and 3 | | 0.806 | | <.0001 | Almost perfect agreement |
| Radiologist 1 and radiologist 2 | 98.0% | 0.823 | 0.071 | <.0001 | Almost perfect agreement |
| Radiologist 1 and radiologist 3 | 98.3% | 0.830 | 0.074 | <.0001 | Almost perfect agreement |
| Radiologist 2 and radiologist 3 | 97.7% | 0.762 | 0.086 | <.0001 | Substantial agreement |
| *Negative for pneumonia* | | | | | |
| Radiologists 1, 2, and 3 | | 0.962 | | <.0001 | Almost perfect agreement |
| Radiologist 1 and radiologist 2 | 98.3% | 0.960 | 0.018 | <.0001 | Almost perfect agreement |
| Radiologist 1 and radiologist 3 | 98.7% | 0.968 | 0.016 | <.0001 | Almost perfect agreement |
| Radiologist 2 and radiologist 3 | 98.3% | 0.960 | 0.018 | <.0001 | Almost perfect agreement |
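The multi-reader kappa values in the table above are Fleiss' kappa, which generalizes Cohen's kappa to three or more raters. A minimal sketch of the computation (with illustrative rating counts, not the study's data) could look like this:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for fixed-rater categorical ratings.

    ratings: list of per-subject rows; each row holds the number of raters
    who assigned the subject to each category, summing to the rater count
    (3 in this study).
    """
    n = len(ratings)          # number of subjects (CT scans)
    k = sum(ratings[0])       # raters per subject
    m = len(ratings[0])       # number of categories (4 RSNA classes)

    # Observed agreement per subject, averaged over subjects
    P_i = [(sum(c * c for c in row) - k) / (k * (k - 1)) for row in ratings]
    P_bar = sum(P_i) / n

    # Chance agreement from the marginal category proportions
    p_j = [sum(row[j] for row in ratings) / (n * k) for j in range(m)]
    P_e = sum(p * p for p in p_j)

    # Note: undefined (division by zero) if all ratings fall in one category
    return (P_bar - P_e) / (1 - P_e)
```

For example, three subjects each rated identically by all three readers yield a kappa of 1.0, while any disagreement pulls the value below 1; a per-pair Cohen's kappa (as in the pairwise rows above) reduces to the same idea with k = 2.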
| COVID-19 appearance | Cramér's V | P value | Interpretation |
|---|---|---|---|
| *Typical COVID-19 appearance* | | | |
| Radiologist 1 and radiologist 2 | 0.810 | <.001 | Very strong correlation |
| Radiologist 1 and radiologist 3 | 0.821 | <.001 | Very strong correlation |
| Radiologist 2 and radiologist 3 | 0.831 | <.001 | Very strong correlation |
| *Indeterminate COVID-19 appearance* | | | |
| Radiologist 1 and radiologist 2 | 0.611 | <.001 | Very strong correlation |
| Radiologist 1 and radiologist 3 | 0.669 | <.001 | Very strong correlation |
| Radiologist 2 and radiologist 3 | 0.665 | <.001 | Very strong correlation |
| *Atypical COVID-19 appearance* | | | |
| Radiologist 1 and radiologist 2 | 0.823 | <.001 | Very strong correlation |
| Radiologist 1 and radiologist 3 | 0.842 | <.001 | Very strong correlation |
| Radiologist 2 and radiologist 3 | 0.774 | <.001 | Very strong correlation |
| *Negative for pneumonia* | | | |
| Radiologist 1 and radiologist 2 | 0.960 | <.001 | Very strong correlation |
| Radiologist 1 and radiologist 3 | 0.968 | <.001 | Very strong correlation |
| Radiologist 2 and radiologist 3 | 0.960 | <.001 | Very strong correlation |
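Cramér's V, used in the table above to quantify pairwise association between readers, is derived from the chi-square statistic of the reader-by-reader contingency table. A minimal sketch (with illustrative counts, not the study's data):

```python
import math

def cramers_v(table):
    """Cramér's V from a 2-D contingency table of observed counts
    (rows = rater A's classifications, columns = rater B's)."""
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]

    # Pearson chi-square against independence
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_tot[i] * col_tot[j] / n
            chi2 += (obs - expected) ** 2 / expected

    # Normalize by sample size and the smaller table dimension minus one
    k = min(len(table), len(table[0])) - 1
    return math.sqrt(chi2 / (n * k))
```

A diagonal table (both readers always agree) gives V = 1, while counts proportional to the marginals (no association) give V = 0; for a 2 x 2 table, V equals the absolute phi coefficient, which is why the pairwise V values closely track the kappa values above.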
Discussion
Conclusion
Declaration of Conflicting Interests
Funding
ORCID iDs
References
This article was published in Canadian Association of Radiologists Journal.