Daniel Gareau, James Browning, Joel Correa Da Rosa, Mayte Suarez-Farinas, Samantha Lish, Amanda Zong, Benjamin Firester, Charles Vrattos, Yael Renert-Yuval, Mauricio Gamboa, María Vallone, Zamira Barragán-Estudillo, Alejandra Tamez-Peña, Javier Montoya, Miriam Jesús-Silva, Cristina Carrera, Josep Malvehy, Susana Puig, Ashfaq Marghoob, John Carucci, James Krueger
Significance: Melanoma is a deadly cancer that physicians struggle to diagnose early because differentiating benign from malignant lesions requires expertise that many screeners lack. Deep machine learning approaches to image analysis offer promise but lack the transparency to be widely adopted as stand-alone diagnostics.
Aim: We aimed to create a transparent machine learning technology (i.e., not deep learning) to discriminate melanomas from nevi in dermoscopy images and an interface for sensory cue integration.
Approach: Imaging biomarker cues (IBCs) fed the training of an ensemble machine learning classifier (Eclass), while raw images fed the training of a deep learning classifier. We compared the areas under the diagnostic receiver operating characteristic curves.
Results: Our interpretable machine learning algorithm outperformed the leading deep learning approach 75% of the time. The user interface displayed only the diagnostically informative imaging biomarkers as cues.
Conclusions: From a translational perspective, Eclass improves on convolutional neural network diagnosis in that physicians can embrace transparent outputs faster than black-box outputs. Imaging biomarker cues may be used during sensory cue integration in clinical screening. Our method may be applied to other image-based diagnostic analyses, including pathology and radiology.
Early diagnosis is the most effective means of improving melanoma prognosis. Artificial intelligence can arm non-expert screeners, but most artificial intelligence methods are impractical in a clinical setting because they lack transparency. To provide a quantitative, algorithmic approach to lesion diagnosis while maintaining transparency, and to supplement the clinician rather than replace them, our digital analysis provides visual features, or "imaging biomarkers," that can both feed machine learning and be visualized by the screener.
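To make the Eclass-versus-deep-learning comparison concrete, the sketch below trains an ensemble classifier on a table of imaging biomarker cues, scores it by the area under the ROC curve, and lists the per-biomarker importances that a transparent interface could display. This is a minimal Python sketch under stated assumptions, not the published Eclass implementation; the file ibc_features.csv, its column names, and the melanoma label column are hypothetical.

```python
# Hedged sketch: train an ensemble classifier on tabular imaging-biomarker
# cues (IBCs) and score it by area under the ROC curve, mirroring the
# Eclass-vs-deep-learning comparison described above. The file name
# "ibc_features.csv" and its columns are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.read_csv("ibc_features.csv")      # one row per lesion
X = df.drop(columns=["melanoma"])         # IBC columns (e.g., asymmetry, color variance)
y = df["melanoma"]                        # 1 = melanoma, 0 = nevus

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_train, y_train)

auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"ROC AUC: {auc:.3f}")

# Because each input is a named biomarker rather than raw pixels, the
# drivers of a prediction can be shown to the clinician directly.
for name, imp in sorted(zip(X.columns, clf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```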
Chemical space for small-molecule therapeutics discovery is greatly under-explored due to the difficulty of animal testing, the first bottleneck compounds encounter on the path from formula to human use. We developed and validated an assay that combines 3D tissue biofabrication with high-throughput imaging biomarkers. This may impact more diseases than just skin cancer, where we have recently shown promising preliminary findings. Our skin constructs have a normal epidermis with populations of human keratinocytes, a dermis with human fibroblasts, and tumor spheroids containing populations of human squamous cell carcinoma cells. We present imaging biomarkers that quantify the cellular response to chemotherapeutic treatment. This constitutes a novel chemotherapeutic assay that may enable a paradigm-shifting drug discovery pipeline: tissue-relevant assays at high-throughput scale that are both more robust than monolayer cell culture and easier than animal models.
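As a sketch of what a high-throughput imaging-biomarker readout might look like, the following scores each well by the integrated stain signal within the spheroid footprint and reports a treated-to-control ratio. This is an illustrative sketch, not the validated assay; the file names, the bright viability stain, and the percentile threshold are all assumptions.

```python
# Hedged sketch: one plausible imaging-biomarker readout for the assay
# described above, not the authors' pipeline. It scores each well by the
# stain signal inside the spheroid footprint, then reports the
# treated/control ratio as a response biomarker. File names, the bright
# viability stain, and the 90th-percentile threshold are hypothetical.
import numpy as np
from skimage import io

def spheroid_score(path):
    img = io.imread(path, as_gray=True).astype(float)
    mask = img > np.percentile(img, 90)   # assumed bright viability stain
    return img[mask].sum()                # integrated stain within footprint

control = spheroid_score("well_control.tif")
treated = spheroid_score("well_treated.tif")
print(f"response biomarker (treated/control): {treated / control:.2f}")
```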
We investigate the process of induced pluripotent stem cell (iPSC) passaging, in which subcultures are created by transferring cells from iPSC cultures to fresh growth media. We found that standard protocols for iPSC passaging have researchers estimate culture confluency and cell count by eye. Because the consequences of inaccurate estimates extend as far as cell death from passaging at a suboptimal confluency, we sought to circumvent human error and developed a culture-analyzing algorithm (CAA) that calculates both confluency and cell count, primarily through Otsu's method. We incorporate multi-image machine learning into our CAA, improving its ability to recognize colonies as it is fed more images. Comparing our algorithm to standard protocols, we found a significant percent difference between the two methods when measuring the confluency and cell count of iPSC cultures. Through further refinement, we hope to streamline our CAA for large-scale use.
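The core confluency calculation can be illustrated in a few lines: Otsu's method picks an intensity threshold separating colony from background, the fraction of above-threshold pixels gives confluency, and connected components give a colony count. This is a minimal sketch, not the CAA itself; the file name, the bright-colony assumption, and the per-cell area used for the count are hypothetical.

```python
# Hedged sketch of the core idea behind the CAA described above: estimate
# confluency by Otsu-thresholding a culture image so that pixels above
# threshold count as colony. Not the authors' CAA; the file name, the
# bright-colony assumption, and the per-cell area are hypothetical.
import numpy as np
from skimage import io, filters, measure

img = io.imread("ipsc_culture.tif", as_gray=True)
mask = img > filters.threshold_otsu(img)       # colony vs. background

confluency = 100.0 * mask.mean()               # % of field covered
n_colonies = measure.label(mask).max()         # connected components
est_cells = int(mask.sum() / 400)              # assumes ~400 px per cell

print(f"confluency: {confluency:.1f}%  "
      f"colonies: {n_colonies}  est. cells: {est_cells}")
```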
With over 4.3 million new cases in the U.S. every year, basal cell carcinoma (BCC) is the most common form of skin cancer. Pathologists must examine pathology images to diagnose BCC, potentially resulting in delay, error, and inconsistency. To address the need for standardized, expedited diagnosis, we created an automated diagnostic tool that identifies BCC in pathology images. In MATLAB, we adapted a deep neural network image segmentation model, U-Net, to train on BCC images and their corresponding masks, learning to highlight BCC nodules in pathology images by outputting a computer-generated mask. We trained the U-Net on one image from the dataset and compared the computer-generated mask output on three types of test images: a different region of the same image taken with the same microscope, an image of a different tissue sample taken with a different microscope, and an image taken with a confocal microscope. We observed good, medium, and poor results, respectively, illustrating that performance depends on the similarity between test and training data. In subsequent tests using data augmentation, we achieved sensitivity of 0.82±0.07 and specificity of 0.87±0.16 on N = 6 sample sections from 3 different BCCs imaged with the same microscope system. These data show that the U-Net performed well with a relatively small number of training images. Examining the errors raised interesting questions about what they mean and how they arose. By creating a surgeon interface for rapid pathological assessment and machine learning diagnostics for pathological features, the BCC diagnosis process can be expedited and standardized.
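For reference, the reported sensitivity and specificity can be computed pixelwise from a predicted mask and its ground-truth mask as sketched below. This is a generic evaluation routine written in Python rather than the authors' MATLAB code; the file names are hypothetical.

```python
# Hedged sketch: pixelwise sensitivity and specificity of a predicted
# segmentation mask against a ground-truth mask, the metrics reported
# above. Generic evaluation code, not the authors' MATLAB pipeline;
# file names are hypothetical.
import numpy as np
from skimage import io

pred = io.imread("unet_output_mask.png") > 0    # predicted BCC pixels
truth = io.imread("ground_truth_mask.png") > 0  # pathologist-drawn mask

tp = np.sum(pred & truth)      # BCC pixels correctly flagged
tn = np.sum(~pred & ~truth)    # background correctly left blank
fp = np.sum(pred & ~truth)
fn = np.sum(~pred & truth)

sensitivity = tp / (tp + fn)   # fraction of true BCC recovered
specificity = tn / (tn + fp)   # fraction of background kept clean
print(f"sensitivity: {sensitivity:.2f}  specificity: {specificity:.2f}")
```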