Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/680

Medical Concept Representation Learning from Multi-source Data

Abstract: Representing words as low-dimensional vectors is very useful in many natural language processing tasks. This idea has been extended to the medical domain, where medical codes listed in medical claims are represented as vectors to facilitate exploratory analysis and predictive modeling. However, depending on the type of medical provider, medical claims can use medical codes from different ontologies or from a combination of ontologies, which complicates learning of the representations. To be able to properly utiliz…

Cited by 15 publications (19 citation statements)
References 8 publications
“…The medical concept embedding was trained with the ICD code sequence from the EMR using a skip-gram of the Word2vec embedding method. In [18], modified pointwise mutual information (PMI) [24] and modified negative sampling [13] were used to embed medical codes from different ontologies of medical codes.…”
Section: Medical Concept Embedding
confidence: 99%
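To make the skip-gram remark in the statement above concrete, here is a minimal sketch of training Word2vec-style skip-gram embeddings on per-patient ICD code sequences. It assumes gensim is available; the code sequences, code values, and hyperparameters are illustrative assumptions, not the cited papers' actual data or setup.

```python
# Minimal sketch (assumptions: toy ICD code sequences, gensim installed;
# not the cited papers' exact pipeline or hyperparameters).
from gensim.models import Word2Vec

# Each "sentence" is one patient's chronologically ordered list of medical
# codes, e.g. ICD diagnosis codes from claims or EMR visits (illustrative).
patient_code_sequences = [
    ["E11.9", "I10", "E78.5"],           # diabetes, hypertension, hyperlipidemia
    ["I10", "I25.10", "E78.5", "E11.9"],
    ["J45.909", "J30.9", "J45.909"],
]

# sg=1 selects the skip-gram architecture; window bounds the fixed-size
# context used when sampling (code, context-code) training pairs.
model = Word2Vec(
    sentences=patient_code_sequences,
    vector_size=64,   # embedding dimensionality
    window=5,         # fixed-size context window
    min_count=1,
    sg=1,             # skip-gram (vs. CBOW)
    negative=5,       # standard negative sampling
    epochs=50,
)

# Nearest neighbours of a code in the learned embedding space.
print(model.wv.most_similar("E11.9", topn=3))
```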
“…Some studies try to embed medical concepts by the EMR concept sequence rather than medical concept ontology [15], [16]. Other studies attempted to reflect the ontology of the medical concept for drug-diseases relation extraction [17] and cross-referencing between different medical concepts [18].…”
Section: Introduction
confidence: 99%
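The statements above attribute to [18] a modified pointwise mutual information (PMI) approach for relating codes drawn from different ontologies. As a rough, generic illustration of the PMI-plus-factorization idea only (not the paper's modified PMI or its negative-sampling variant), a positive-PMI co-occurrence matrix over toy code sequences can be factorized with SVD:

```python
# Generic sketch of PMI-based code embedding (assumptions: numpy only, toy
# data; the cited work reportedly uses *modified* PMI and negative sampling,
# which this plain positive-PMI + SVD illustration does not reproduce).
import numpy as np
from itertools import combinations

sequences = [
    ["E11.9", "I10", "E78.5"],
    ["I10", "I25.10", "E78.5", "E11.9"],
    ["J45.909", "J30.9", "J45.909"],
]

vocab = sorted({c for seq in sequences for c in seq})
idx = {c: i for i, c in enumerate(vocab)}

# Co-occurrence counts: here two codes co-occur if they appear in the same
# patient sequence (a window-based definition would also work).
cooc = np.zeros((len(vocab), len(vocab)))
for seq in sequences:
    for a, b in combinations(set(seq), 2):
        cooc[idx[a], idx[b]] += 1
        cooc[idx[b], idx[a]] += 1

# Positive PMI: log of observed vs. expected co-occurrence probability.
total = cooc.sum()
p_xy = cooc / total
p_x = cooc.sum(axis=1, keepdims=True) / total
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log(p_xy / (p_x @ p_x.T))
pmi = np.nan_to_num(pmi, nan=0.0, posinf=0.0, neginf=0.0)
ppmi = np.maximum(pmi, 0.0)

# Low-dimensional embeddings via truncated SVD of the PPMI matrix.
u, s, _ = np.linalg.svd(ppmi)
dim = 2
embeddings = u[:, :dim] * np.sqrt(s[:dim])
print(dict(zip(vocab, embeddings.round(3))))
```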
“…In the medical field, the above-mentioned task is often treated as embeddings learning for the medical concepts [1,6], implying ICD diagnosis codes, terms, and abbreviations, medication and procedure names, etc. Commonly such concepts are viewed within a temporal process associated with a patient.…”
Section: Related Work
confidence: 99%
“…To efficiently deal with sequentially organized medical data the notion of context becomes crucial. For this, some papers [1,6] use modifications of the Skip-gram algorithm [13] though it is limited to account for only a fixed-size context of a sequence. Models with Recurrent Neural Network (RNN) architectures offer a better context handling mechanism.…”
Section: Related Work
confidence: 99%
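To illustrate the contrast drawn above between the fixed-size context of skip-gram and the sequence-level context of an RNN, here is a minimal PyTorch sketch of a GRU encoder over variable-length patient code sequences. The vocabulary size, dimensions, and batch are assumptions for illustration, not a model from any of the cited works.

```python
# Minimal sketch of an RNN encoder over medical code sequences (assumptions:
# PyTorch, toy vocabulary and batch; illustrates variable-length context
# handling, not a specific cited architecture).
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence

class CodeSequenceEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # The GRU consumes the whole sequence, so the usable context is not
        # limited to a fixed-size window as in skip-gram.
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, code_ids, lengths):
        x = self.embed(code_ids)                  # (batch, seq, embed)
        packed = pack_padded_sequence(
            x, lengths, batch_first=True, enforce_sorted=False
        )
        _, h = self.gru(packed)                   # h: (1, batch, hidden)
        return h.squeeze(0)                       # one vector per patient

# Toy batch: two padded patient sequences of code indices (0 = padding).
codes = torch.tensor([[3, 7, 2, 5], [4, 1, 0, 0]])
lengths = torch.tensor([4, 2])
encoder = CodeSequenceEncoder(vocab_size=10)
print(encoder(codes, lengths).shape)              # torch.Size([2, 128])
```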