2019
DOI: 10.4218/etrij.2018-0456
Linear‐Time Korean Morphological Analysis Using an Action‐based Local Monotonic Attention Mechanism

Abstract: For Korean language processing, morphological analysis is a critical component that requires extensive work. This morphological analysis can be conducted in an end‐to‐end manner, without complicated feature design, using a sequence‐to‐sequence model. However, the sequence‐to‐sequence model has a time complexity of O(n²) for an input length n when using the attention mechanism technique for high performance. In this study, we propose a linear‐time Korean morphological analysis model using a local monotonic attention mechanism.
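To make the complexity claim concrete, here is a minimal Python sketch (not the authors' implementation; the function names, the fixed-width window, and the argmax head update are illustrative assumptions). Standard soft attention scores every one of the n encoder states at each decoding step, so decoding a length-n output costs O(n²); a hard local monotonic variant scores only a constant-width window anchored at a read head that can only move forward, so a full decode costs O(n * window) = O(n).

import numpy as np

def soft_attention_step(query, keys):
    # Standard soft attention: scores ALL n encoder states per decoding
    # step, so n decoding steps cost O(n^2) overall.
    scores = keys @ query                   # (n,) dot-product scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ keys                   # (d,) context vector

def local_monotonic_step(query, keys, head, window=3):
    # Local monotonic attention: score only a fixed window starting at the
    # current read head, then advance the head (it never moves backward).
    # Each step costs O(window), so a full decode is O(n * window) = O(n).
    # The caller stops decoding before `head` runs past the end of `keys`.
    lo, hi = head, min(head + window, len(keys))
    scores = keys[lo:hi] @ query            # at most `window` scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    context = weights @ keys[lo:hi]
    new_head = lo + int(scores.argmax())    # monotonic head update
    return context, new_head

# Toy usage: thread the head through a decode loop.
rng = np.random.default_rng(0)
keys = rng.normal(size=(10, 4))             # 10 encoder states, dim 4
head = 0
for _ in range(5):
    context, head = local_monotonic_step(rng.normal(size=4), keys, head)

In the paper's action-based setting, each decoding step would additionally emit an action that determines the morpheme output; the sketch above isolates only the attention-window bookkeeping that yields the linear-time bound.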

Cited by 3 publications (4 citation statements)
References 4 publications (15 reference statements)
“…In recent years, Korean morphological analyses have witnessed a diverse range of methodologies [3–31]. The agglutinative nature of the Korean language poses challenges that have inspired researchers to devise innovative solutions, laying the foundation for future investigations.…”
Section: Related Work
Citation type: mentioning, confidence: 99%
“…Several approaches have been suggested for morphological analysis, a critical aspect of Korean language comprehension [3–31]. Typically, when individuals grasp spoken or written language, they try to comprehend it through familiar vocabulary and concepts.…”
Section: Introduction
Citation type: mentioning, confidence: 99%
“…Their approach outperforms several other models on grapheme-to-phoneme conversion, transliteration, and morphological inflection. Monotonic attention has also improved tasks such as summarization (Chung et al., 2020) and morphological analysis (Hwang and Lee, 2020).…”
Section: Related Work
Citation type: mentioning, confidence: 99%
“…However, it is well known that the transformer struggles to process long sequences in terms of both performance and computation speed, and speech signals are usually represented by much longer sequences than written sentences. Many studies have sought to overcome this drawback, including [9][10][11].…”
Section: Introduction
Citation type: mentioning, confidence: 99%