To assess a Smart Imagery Framing and Truthing (SIFT) system for automatically labeling and annotating chest X-ray (CXR) images containing multiple diseases as an aid to radiologists. The SIFT system was developed by integrating a convolutional neural network based on an augmented Mask R-CNN and a multi-layer perceptron neural network. It was trained with images containing 307,415 ROIs representing 69 different abnormalities and 67,071 normal CXRs. SIFT automatically labels each ROI with a specific type of abnormality, annotates its fine-grained boundary, assigns a confidence score, and recommends other possible types of abnormality. An independent set of 178 CXRs containing 272 ROIs depicting five different abnormalities, including pulmonary tuberculosis, pulmonary nodule, pneumonia, COVID-19, and fibrogenesis, was used to evaluate the performance of three radiologists in a double-blinded study. Each radiologist first annotated every ROI manually without SIFT. Two weeks later, the radiologist annotated the same ROIs with SIFT assistance to generate the final results. Consistency, efficiency, and accuracy with and without SIFT were then compared. With SIFT, radiologists accepted 93% of SIFT-annotated areas, and the variation across annotated areas was reduced by 28.23%. Inter-observer variation improved by 25.27% in averaged IoU. The consensus true-positive rate increased by 5.00% (p = 0.16), and the false-positive rate decreased by 27.70% (p < 0.001). Radiologists' time to annotate these cases decreased by 42.30%. Performance in labeling abnormalities remained statistically unchanged. This independent observer study showed that SIFT is a promising step toward improving the consistency and efficiency of annotation, which is important for improving clinical X-ray diagnostic and monitoring efficiency.
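As a rough illustration of the inter-observer metric reported above, the sketch below computes an averaged intersection-over-union (IoU) between two readers' binary ROI masks. This is not the authors' code; the mask shapes, the assumed 1:1 pairing of ROIs, and all function names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the SIFT authors' code): averaged IoU
# between two readers' binary ROI masks, the quantity used to describe
# inter-observer variation. Assumes masks are boolean arrays of equal shape
# and that ROIs from the two readers are paired one-to-one.
import numpy as np

def iou(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Intersection-over-union of two boolean masks of the same shape."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return float(inter) / float(union) if union > 0 else 1.0

def averaged_iou(masks_reader1, masks_reader2) -> float:
    """Mean IoU over paired ROIs annotated by two readers."""
    return float(np.mean([iou(a, b) for a, b in zip(masks_reader1, masks_reader2)]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy masks standing in for two radiologists' annotations of four ROIs.
    reader1 = [rng.random((64, 64)) > 0.5 for _ in range(4)]
    reader2 = [rng.random((64, 64)) > 0.5 for _ in range(4)]
    print(f"averaged IoU: {averaged_iou(reader1, reader2):.3f}")
```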
Chest X-ray radiography (CXR) is widely used in screening and detecting lung diseases. However, reading CXR images is often difficult, resulting in diagnostic errors and inter-reader variability. To address this clinical challenge, a Multi-task, Optimal-recommendation, and Max-predictive Classification and Segmentation (MOM-ClaSeg) system was developed to detect and delineate different abnormal regions of interest (ROIs) on CXR images, make multiple recommendations of abnormalities ranked by their generated probability scores, and automatically generate a diagnostic report. MOM-ClaSeg consists of convolutional neural networks that generate a detection, a finer-grained segmentation, and a prediction score for each ROI based on an augmented Mask R-CNN framework, and multi-layer perceptron neural networks that fuse these results to generate an optimal recommendation for each detected ROI based on a decision-fusion framework. A total of 310,333 adult CXR images, comprising 67,071 normal and 243,262 abnormal images depicting 307,415 confirmed ROIs of 65 different abnormalities, were assembled to train MOM-ClaSeg. An independent set of 22,642 CXR images was assembled to test MOM-ClaSeg. Radiologists detected 6,646 ROIs depicting 43 different types of abnormalities on 4,068 of these CXR images. Compared with the radiologists' detection results, the MOM-ClaSeg system detected 6,009 true-positive ROIs and 6,379 false-positive ROIs, representing 90.3% sensitivity and 0.28 false-positive ROIs per image. For the eight common diseases, the computed areas under the ROC curves ranged from 0.880 to 0.988. Additionally, 70.4% of the abnormalities detected by MOM-ClaSeg, along with the system-generated diagnostic reports, were directly accepted by radiologists. This study presents the first AI-based multi-task prediction system that detects different abnormalities and generates diagnostic reports to assist radiologists in detecting lung diseases more accurately and efficiently.
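To make the described two-stage design concrete, the sketch below pairs an off-the-shelf Mask R-CNN detector with a small multi-layer perceptron that fuses each ROI's score vector into a ranked recommendation. This is a minimal illustration under stated assumptions, not the published MOM-ClaSeg implementation; the layer sizes, the fusion input, and the use of torchvision's generic Mask R-CNN are illustrative choices, while the class count follows the 65 abnormality types mentioned in the abstract.

```python
# Minimal sketch (assumptions, not the published MOM-ClaSeg system):
# stage 1 uses an off-the-shelf Mask R-CNN to propose ROIs with per-class
# scores; stage 2 uses a small multi-layer perceptron to fuse each ROI's
# score vector into a ranked abnormality recommendation.
import torch
import torch.nn as nn
import torchvision

NUM_ABNORMALITIES = 65  # abnormality types in the training set (per the abstract)

# Stage 1: generic Mask R-CNN detector (randomly initialized here, offline).
detector = torchvision.models.detection.maskrcnn_resnet50_fpn(
    weights=None, weights_backbone=None,
    num_classes=NUM_ABNORMALITIES + 1,  # +1 for the background class
)
detector.eval()

class FusionMLP(nn.Module):
    """Illustrative MLP that maps a per-ROI score vector to class probabilities."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_classes, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, roi_scores: torch.Tensor) -> torch.Tensor:
        return torch.softmax(self.net(roi_scores), dim=-1)

fusion = FusionMLP(NUM_ABNORMALITIES)

with torch.no_grad():
    cxr = torch.rand(3, 512, 512)          # stand-in for a preprocessed CXR image
    detections = detector([cxr])[0]        # dict with boxes, labels, scores, masks
    for score, label in zip(detections["scores"], detections["labels"]):
        # Build a sparse per-ROI score vector (toy fusion input) and rank classes.
        roi_vec = torch.zeros(NUM_ABNORMALITIES)
        roi_vec[label - 1] = score         # labels 1..65 map to indices 0..64
        probs = fusion(roi_vec)
        top = torch.topk(probs, k=3)       # recommended abnormalities, sorted by probability
```

In this sketch the MLP only re-weights a single detector score per ROI; the abstract indicates the real system fuses richer per-ROI evidence within a decision-fusion framework to produce the optimal recommendation.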