2022
DOI: 10.1101/2022.07.07.499217
Preprint
SNVformer: An Attention-based Deep Neural Network for GWAS Data

Abstract: Despite being the widely-used gold standard for linking common genetic variations to phenotypes and disease, genome-wide association studies (GWAS) suffer major limitations, partially attributable to the reliance on simple, typically linear, models of genetic effects. More elaborate methods, such as epistasis-aware models, typically struggle with the scale of GWAS data. In this paper, we build on recent advances in neural networks employing Transformer-based architectures to enable such models at a large scale…
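The abstract describes applying Transformer-style attention to SNV (single-nucleotide variant) genotype data. The following is a minimal illustrative sketch (not the authors' code) of the core idea: encoding genotypes as minor-allele counts (0/1/2), embedding them, and applying single-head scaled dot-product self-attention so each SNV's representation can depend on every other SNV. The embedding and projection matrices are random here; in a real model they would be learned.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed_genotypes(genotypes, d_model, n_alleles=3):
    """Map each genotype (0, 1, or 2 minor-allele copies) to a vector.
    The embedding table is random here, standing in for learned weights."""
    table = rng.normal(size=(n_alleles, d_model))
    return table[genotypes]  # shape: (seq_len, d_model)

def self_attention(x):
    """Single-head scaled dot-product self-attention over the SNV sequence."""
    d = x.shape[-1]
    wq, wk, wv = (rng.normal(size=(d, d)) for _ in range(3))
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(d)            # pairwise SNV-SNV interactions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                       # attention-weighted mixture

genotypes = np.array([0, 1, 2, 1, 0, 2])    # six SNVs for one individual
x = embed_genotypes(genotypes, d_model=8)
out = self_attention(x)
print(out.shape)  # (6, 8): one contextualized vector per SNV
```

Because every SNV attends to every other SNV, attention scores can capture pairwise (epistasis-like) interactions that a linear per-SNV model cannot; the quadratic cost of this all-pairs computation is the scaling challenge the abstract refers to.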

Cited by 2 publications (1 citation statement)
References 30 publications
“…The complexities of genetic information pose unique challenges, such as high dimensionality and the need for significant computational power, which have so far hindered the widespread adoption of foundation models in this area, with relatively few publications applying basic concepts of foundation models to genomic data [104, 106–108]. For example, Santiesteban et al. [109] showed that foundation models combining transcriptomics and histopathology data through self-supervised learning significantly improve survival prediction.…”
Section: Opportunities of Large Language Models and Foundation Models (mentioning)
Confidence: 99%