2022
DOI: 10.1016/j.ccell.2022.09.012
Artificial intelligence for multimodal data integration in oncology

Cited by 261 publications (126 citation statements)
References 101 publications
“…Notably, for some cancers there were significant performance differences between using sequence context or gene information, suggesting combining these two features would give the best performance. If instance features are suspected to interact with each other they should clearly be fused at the instance level, but if no interaction is expected there is a range of possibilities for combining the features [24,25]. While careful consideration should be made on how to best fuse input features in the context of MIL, that was not the focus of this study and the best approach may be data dependent.…”
Section: Results
Confidence: 99%
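The instance-level versus bag-level fusion distinction in this excerpt can be sketched with plain arrays. The feature views and dimensions below are illustrative assumptions, not the cited study's actual setup: a "bag" of mutation instances, each described by a sequence-context view and a gene-level view.

```python
import numpy as np

rng = np.random.default_rng(0)

# A bag of 5 instances, each with two feature views (dimensions are made up):
seq_ctx = rng.normal(size=(5, 8))   # sequence-context features per instance
gene = rng.normal(size=(5, 4))      # gene-level features per instance

# Instance-level (early) fusion: concatenate the views per instance,
# so a downstream model can learn interactions between them.
early = np.concatenate([seq_ctx, gene], axis=1)   # shape (5, 12)

# Bag-level (late) fusion: pool each view over the bag first, then combine
# the pooled summaries -- no per-instance cross-view interaction is modeled.
late = np.concatenate([seq_ctx.mean(axis=0), gene.mean(axis=0)])  # shape (12,)

print(early.shape, late.shape)
```

The choice matters exactly as the excerpt says: early fusion exposes feature interactions to the model, while late fusion keeps the views independent until after pooling.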
“…The use of multiple imaging modalities to evaluate bone tumours is known to improve the accuracy of diagnosis [3]. Machine learning models built using different modalities (multimodal) have also been shown to improve diagnostic performance [193]. In breast radiology, Antropova et al [194] developed a CNN method for fusion-based classification using dynamic contrast-enhanced MRI, full-field digital mammography, and ultrasound.…”
Section: Discussion
Confidence: 99%
“…Deep learning models are particularly effective and leverage artificial neural networks (ANNs) to make predictions based on input data [3]. As data availability has proliferated through large-scale consortiums such as The Cancer Genome Atlas (TCGA), ML techniques have been applied to integrate a wide range of input data modalities for prognostication [4], such as whole-slide imaging (WSI), gene expression quantification, clinical attributes like age and gender, and other biological and molecular measures [5][6][7]. These methods aim to quantify prognosis as a scalar hazard ratio and are supervised by Cox loss, a common method in survival analysis [8].…”
Section: Background: Cancer Prognostication and Machine Learning
Confidence: 99%
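The Cox loss mentioned in this excerpt is the negative log partial likelihood of the Cox proportional hazards model. A minimal numpy sketch follows; the function name and the Breslow-style tie handling are my assumptions, and the toy data is invented.

```python
import numpy as np

def cox_ph_loss(risk, time, event):
    """Negative log Cox partial likelihood (Breslow ties approximation).

    risk  -- model-predicted log-hazard scores, shape (n,)
    time  -- observed follow-up times, shape (n,)
    event -- 1 if the event was observed, 0 if censored, shape (n,)
    """
    order = np.argsort(-time)              # sort patients by descending time
    risk, event = risk[order], event[order]
    # log sum of exp(risk) over each patient's risk set {j : t_j >= t_i};
    # with descending time order this is a running log-sum-exp.
    log_risk_set = np.logaddexp.accumulate(risk)
    # sum the per-event terms risk_i - log(sum over risk set), negate, average
    return -np.sum((risk - log_risk_set) * event) / max(event.sum(), 1)

# Toy usage: three patients, two observed events, one censored.
loss = cox_ph_loss(np.array([0.5, -0.2, 1.3]),
                   np.array([5.0, 8.0, 2.0]),
                   np.array([1, 0, 1]))
print(loss)
```

Because each risk set contains the event's own score, every per-event term is nonpositive, so the loss is always nonnegative; minimizing it pushes high-risk scores onto patients who fail early.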