The World Health Organization’s Global Tuberculosis (TB) Report 2022 identifies TB, with an estimated 1.6 million deaths, as a leading cause of death. The number of new cases has risen since 2020, particularly the number of new drug-resistant cases, estimated at 450,000 in 2021. This is concerning, as treatment of patients with drug-resistant TB is complex and not always successful. The NIAID TB Portals program is an international consortium with a primary focus on patient-centric data collection and analysis for drug-resistant TB. The data include images, their associated radiological findings, clinical records, and socioeconomic information. This work describes a TB Portals chest X-ray-based image retrieval system that enables precision medicine: an input image is used to retrieve similar images and the associated patient-specific information, facilitating inspection of outcomes and treatment regimens from comparable patients. Image similarity is defined using clinically relevant biomarkers: gender, age, body mass index (BMI), and the percentage of lung affected per sextant. The biomarkers are predicted using variations of the DenseNet169 convolutional neural network. A multi-task approach predicts gender, age, and BMI, incorporating transfer learning from an initial training on the NIH Clinical Center CXR dataset to the TB Portals dataset. The resulting gender AUC was 0.9854, and the age and BMI mean absolute errors were 4.03 years and 1.67 kg/m2, respectively. For the percentage of each sextant affected by lesions, the mean absolute errors ranged from 7% to 12%, with higher errors in the middle and upper sextants, which exhibit more variability than the lower sextants. The retrieval system is currently available at https://rap.tbportals.niaid.nih.gov/find similar cxr.
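The retrieval step described above can be sketched as nearest-neighbor search over a vector of predicted biomarkers. The following is a minimal illustration, not the paper's implementation: the field names, normalization constants, and the Euclidean distance metric are all assumptions made for the example.

```python
# Hypothetical sketch of biomarker-based CXR retrieval: each study is
# reduced to a vector of predicted biomarkers (gender probability, age,
# BMI, and per-sextant percentage of lung affected), and similar studies
# are found by nearest-neighbor search in that space. All names and
# scaling constants are illustrative, not taken from the paper.
import math

def biomarker_vector(gender_prob, age, bmi, sextants):
    # Crude normalization so each biomarker contributes on a comparable
    # scale; a real system would calibrate these on the training data.
    return [gender_prob, age / 100.0, bmi / 50.0] + [s / 100.0 for s in sextants]

def retrieve_similar(query, corpus, k=3):
    # corpus: list of (patient_id, vector); returns the k nearest
    # patient IDs by Euclidean (L2) distance to the query vector.
    scored = [(math.dist(query, vec), pid) for pid, vec in corpus]
    return [pid for _, pid in sorted(scored)[:k]]

corpus = [
    ("pt_001", biomarker_vector(0.9, 34, 21.5, [5, 10, 40, 0, 5, 30])),
    ("pt_002", biomarker_vector(0.1, 61, 18.2, [0, 0, 5, 0, 0, 10])),
    ("pt_003", biomarker_vector(0.8, 37, 22.0, [8, 12, 35, 2, 6, 25])),
]
query = biomarker_vector(0.85, 35, 21.0, [6, 11, 38, 1, 5, 28])
print(retrieve_similar(query, corpus, k=2))
```

Once the nearest patients are identified, their treatment regimens and outcomes can be surfaced alongside the retrieved images, as the abstract describes.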
To assess a Smart Imagery Framing and Truthing (SIFT) system in automatically labeling and annotating chest X-ray (CXR) images with multiple diseases, as an aid to radiologists reading multi-disease CXRs. The SIFT system was developed by integrating a convolutional neural network-based augmented Mask R-CNN and a multi-layer perceptron neural network. It was trained with images containing 307,415 ROIs representing 69 different abnormalities and 67,071 normal CXRs. SIFT automatically labels ROIs with a specific type of abnormality, annotates fine-grained boundaries, gives a confidence score, and recommends other possible types of abnormality. An independent set of 178 CXRs containing 272 ROIs depicting five different abnormalities (pulmonary tuberculosis, pulmonary nodule, pneumonia, COVID-19, and fibrogenesis) was used to evaluate the performance of three radiologists in a double-blinded study. Each radiologist first manually annotated each ROI without SIFT. Two weeks later, the radiologist annotated the same ROIs with SIFT's aid to generate final results. Consistency, efficiency, and accuracy for radiologists with and without SIFT were evaluated. With SIFT, radiologists accepted 93% of SIFT-annotated areas, and variation across annotated areas was reduced by 28.23%. Inter-observer variation improved by 25.27% in averaged IOU. The consensus true positive rate increased by 5.00% (p = 0.16), and the false positive rate decreased by 27.70% (p < 0.001). The radiologists' time to annotate these cases decreased by 42.30%. Performance in labeling abnormalities remained statistically the same. This independent observer study showed that SIFT is a promising step toward improving the consistency and efficiency of annotation, which is important for improving clinical X-ray diagnostic and monitoring efficiency.
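The inter-observer agreement figures above are reported as averaged intersection-over-union (IoU). A minimal sketch of the IoU measure for axis-aligned annotation boxes follows; the box coordinates and the `(x1, y1, x2, y2)` convention are illustrative assumptions, not details from the study.

```python
# Minimal IoU (intersection over union) for two axis-aligned boxes,
# the standard measure for comparing annotated ROIs between observers.
# Boxes are (x1, y1, x2, y2) in pixel coordinates; values are examples.
def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Width/height of the intersection rectangle (zero if no overlap).
    iw = max(0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1)
             - inter)
    return inter / union if union else 0.0

# Two radiologists annotating the same ROI with partial overlap:
print(iou((10, 10, 50, 50), (30, 30, 70, 70)))  # 400 / 2800 ≈ 0.1429
```

Averaging this score over all shared ROIs gives the per-pair agreement that the study reports improving by 25.27% with SIFT assistance.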