Background: Breast cancer intrinsic molecular subtype (IMS), as classified by the expression-based PAM50 assay, is considered a strong prognostic feature, even when controlling for standard clinicopathological features such as age, grade, and nodal status, yet the molecular testing required to elucidate these subtypes is not routinely performed. Furthermore, when bulk assays such as RNA sequencing are performed, intratumoral heterogeneity that may affect prognosis and therapeutic decision-making can be missed.

Methods: As a more facile and readily available method for determining IMS in breast cancer, we developed a deep learning approach for approximating PAM50 intrinsic subtyping using only whole-slide images (WSIs) of H&E-stained breast biopsy tissue sections. The algorithm was trained on images from 443 tumors that had previously undergone PAM50 subtyping to classify small patches of the images into the four major molecular subtypes (Basal-like, HER2-enriched, Luminal A, and Luminal B), as well as Basal vs. non-Basal. The algorithm was subsequently used for subtype classification of a held-out set of 222 tumors.

Results: The deep learning image-based classifier correctly subtyped the majority of samples in the held-out set of tumors. In many cases, however, significant heterogeneity was observed in the subtypes assigned to patches within a single whole-slide image. We analyzed this heterogeneity further, focusing on the contrast between the Luminal A and Basal-like subtypes because, as with PAM50, classifications from our deep learning algorithm are associated with significant differences in survival between these two subtypes. Patients whose tumors were classified as heterogeneous had survival intermediate between that of Luminal A and Basal-like patients, as well as more variable hormone receptor expression patterns.

Conclusions: Here, we present a method that minimizes the manual work required to identify cancer-rich patches among all multiscale patches in H&E-stained WSIs and that can be generalized to any indication. These results suggest that advanced deep learning methods using only routinely collected whole-slide images can approximate RNA-seq-based molecular tests such as PAM50 and, importantly, may increase the detection of heterogeneous tumors that may require more detailed subtype analysis.
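The slide-level subtype call described above can be derived by aggregating patch-level predictions. Below is a minimal sketch, assuming a patch classifier that outputs softmax probabilities over the four subtypes; the majority-vote rule, the heterogeneity threshold, and all names are illustrative assumptions rather than the authors' exact implementation.

```python
# Minimal sketch: aggregate patch-level subtype probabilities into a slide-level
# call and flag heterogeneous slides. Threshold and names are assumptions.
from collections import Counter
import numpy as np

SUBTYPES = ["Basal-like", "HER2-enriched", "Luminal A", "Luminal B"]

def aggregate_slide(patch_probs: np.ndarray, het_threshold: float = 0.4):
    """patch_probs: (n_patches, 4) softmax outputs from a patch classifier.

    Returns the majority-vote slide subtype and a heterogeneity flag based on
    the fraction of patches that disagree with the majority call.
    """
    patch_calls = patch_probs.argmax(axis=1)            # per-patch subtype index
    majority_idx, majority_n = Counter(patch_calls).most_common(1)[0]
    disagreement = 1.0 - majority_n / len(patch_calls)  # fraction of dissenting patches
    return SUBTYPES[majority_idx], disagreement > het_threshold

# Example: 100 synthetic patches, mostly Luminal A with a dissenting minority.
rng = np.random.default_rng(0)
probs = rng.dirichlet(alpha=[1, 1, 8, 1], size=100)
subtype, heterogeneous = aggregate_slide(probs)
print(subtype, heterogeneous)
```

A disagreement-fraction rule like this is only one possible heterogeneity measure; an entropy-based score over the patch-call distribution would serve the same purpose.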
I would like to express my gratitude to the wonderful people who made this achievement possible. It is a pleasure to present my sincere thanks to Dr. Eli Saber for advising my thesis and for his support and guidance. Special thanks to Dr. Sohail Dianat for his comments and review throughout this work. I would also like to thank Dr. Vincent Amuso and Dr. Eric Peskin for their feedback.
Understanding complex events in unstructured video, such as scoring a goal in a football game, is an extremely challenging task due to the dynamics, complexity, and variation of video sequences. In this work, we attack this problem by exploiting the capabilities of the recently developed deep learning framework. We independently encode spatial and temporal information via convolutional neural networks and fuse the resulting features via regularized autoencoders. To demonstrate the capabilities of the proposed scheme, we compile a new dataset composed of goal and no-goal sequences. Experimental results demonstrate that very high classification accuracy can be achieved from a very limited number of examples by leveraging pretrained models with fine-tuned fusion of spatio-temporal features.
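As a rough illustration of the fusion step described above, the sketch below assumes pre-extracted spatial (RGB frame) and temporal (optical-flow) CNN features, fuses them through an autoencoder regularized via weight decay, and classifies the fused code as goal vs. no-goal. Feature dimensions, layer sizes, and hyperparameters are assumptions, not the paper's settings.

```python
# Sketch: fuse spatial and temporal CNN features with a regularized autoencoder,
# then classify goal vs. no-goal from the fused bottleneck code.
import torch
import torch.nn as nn

class FusionAutoencoder(nn.Module):
    def __init__(self, spatial_dim=4096, temporal_dim=4096, bottleneck=256):
        super().__init__()
        in_dim = spatial_dim + temporal_dim
        self.encoder = nn.Sequential(nn.Linear(in_dim, 1024), nn.ReLU(),
                                     nn.Linear(1024, bottleneck))
        self.decoder = nn.Sequential(nn.Linear(bottleneck, 1024), nn.ReLU(),
                                     nn.Linear(1024, in_dim))
        self.classifier = nn.Linear(bottleneck, 2)  # goal vs. no-goal

    def forward(self, spatial_feat, temporal_feat):
        x = torch.cat([spatial_feat, temporal_feat], dim=1)
        z = self.encoder(x)                          # fused spatio-temporal code
        return self.decoder(z), self.classifier(z), x

model = FusionAutoencoder()
# Weight decay provides the regularization on the autoencoder parameters.
opt = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)
recon_loss, cls_loss = nn.MSELoss(), nn.CrossEntropyLoss()

# Dummy batch standing in for features from pretrained spatial/temporal CNNs.
spatial = torch.randn(8, 4096)
temporal = torch.randn(8, 4096)
labels = torch.randint(0, 2, (8,))

recon, logits, target = model(spatial, temporal)
loss = recon_loss(recon, target) + cls_loss(logits, labels)
loss.backward()
opt.step()
```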
I understand that I must submit a print copy of my thesis or dissertation to the RIT Archives, per current RIT guidelines for the completion of my degree. I hereby grant to the Rochester Institute of Technology and its agents the non-exclusive license to archive and make accessible my thesis or dissertation in whole or in part in all forms of media in perpetuity. I retain all other ownership rights to the copyright of the thesis or dissertation.

I would like to acknowledge and extend my heartfelt gratitude to the following colleagues: Mustafa Jaber, for sharing his valuable experience and always being optimistic, and Abdul Haleem Syed, for his positive outlook on life and for our worthwhile discussions. I am also thankful to Mr. Peter Bauer and the Hewlett Packard Company for the support and sponsorship that made this research possible. I am very grateful to Dr. Drew Maywar, who helped me discover a new world in electrical engineering. His knowledge and vision changed my point of view and gave me a new focus in an area of study that fascinates me. I would also like to thank Dr. Ferat Sahin and Prof. Sohail Dianat for agreeing to serve on my thesis committee and for sharing their valuable ideas and reviews. Finally, special thanks to Prof. Eli Saber, who gave me the opportunity to work on his exciting projects. This thesis could not have been written without his encouragement throughout my academic life at RIT. His exacting character motivated me to aspire to greater academic achievements. He never accepted less than my best efforts, as he well knows. Beyond his academic support, he has always been there and willing to discuss my concerns, even when they were very personal. He taught me the significance of pushing myself, trying hard, and persevering. Thank you, Prof. Eli Saber, for everything; it has been a pleasure and an honor to be your student.