Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1150
Korean Morphological Analysis with Tied Sequence-to-Sequence Multi-Task Model

Abstract: Korean morphological analysis has been considered as a sequence of morpheme processing and POS tagging. Thus, a pipeline model of the tasks has been widely adopted in previous studies. However, such a model cannot utilize interactions among the tasks. This paper formulates Korean morphological analysis as a combination of the tasks and presents a tied sequence-to-sequence multi-task model for training the two tasks simultaneously without any explicit regularization. The experiments prove the …
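The tied multi-task setup described in the abstract can be pictured as one shared encoder feeding two decoders, one generating the morpheme sequence and one predicting POS tags, with the second conditioned on the first's hidden states. The sketch below is only an illustration of that idea under assumed hyperparameters, vocabularies, and a concatenation-based tying scheme; it is not the authors' implementation.

```python
# Illustrative sketch only: shared encoder + two decoders (morphemes, POS tags)
# whose hidden states are tied by a skip connection. All sizes, vocabularies,
# and the exact tying scheme are assumptions, not the paper's architecture.
import torch
import torch.nn as nn

class TiedSeq2SeqMTL(nn.Module):
    def __init__(self, src_vocab, morph_vocab, pos_vocab, d=256):
        super().__init__()
        self.embed = nn.Embedding(src_vocab, d)
        self.encoder = nn.GRU(d, d, batch_first=True)
        # Decoder 1: generates the morpheme sequence.
        self.morph_embed = nn.Embedding(morph_vocab, d)
        self.morph_dec = nn.GRU(d, d, batch_first=True)
        self.morph_out = nn.Linear(d, morph_vocab)
        # Decoder 2: predicts POS tags, conditioned on the morpheme decoder's
        # states via a skip connection (concatenation here, as an assumption).
        self.pos_dec = nn.GRU(2 * d, d, batch_first=True)
        self.pos_out = nn.Linear(d, pos_vocab)

    def forward(self, src_ids, morph_in_ids):
        # src_ids: (B, T_src) source tokens; morph_in_ids: (B, T_tgt) shifted
        # morpheme targets for teacher forcing.
        _, h = self.encoder(self.embed(src_ids))           # shared encoder state
        m_states, _ = self.morph_dec(self.morph_embed(morph_in_ids), h)
        morph_logits = self.morph_out(m_states)            # task 1: morphemes
        # Skip connection: POS decoder sees its own input plus morpheme states.
        pos_in = torch.cat([self.morph_embed(morph_in_ids), m_states], dim=-1)
        p_states, _ = self.pos_dec(pos_in, h)
        pos_logits = self.pos_out(p_states)                # task 2: POS tags
        return morph_logits, pos_logits

# Joint training: the two cross-entropy losses are simply summed, so both tasks
# are learned simultaneously without an explicit regularization term.
model = TiedSeq2SeqMTL(src_vocab=2000, morph_vocab=3000, pos_vocab=50)
src = torch.randint(0, 2000, (4, 12))
morph_in = torch.randint(0, 3000, (4, 15))
morph_gold = torch.randint(0, 3000, (4, 15))
pos_gold = torch.randint(0, 50, (4, 15))
m_logits, p_logits = model(src, morph_in)
loss = nn.functional.cross_entropy(m_logits.transpose(1, 2), morph_gold) \
     + nn.functional.cross_entropy(p_logits.transpose(1, 2), pos_gold)
loss.backward()
```

Because both losses share the encoder and the POS decoder consumes the morpheme decoder's states, gradients from POS tagging also shape morpheme generation, which is one way the two tasks can interact without an explicit regularization term.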

Cited by 9 publications (10 citation statements) | References 14 publications
“…For our experiments, we utilized the Sejong corpus (used in the literature [18–24, 29, 30]), UCorpus [37], and Everyone's Corpus [39]. In line with previous studies for comparison purposes, the Sejong corpus underwent training using a single model without separation.…”
Section: Results | Citation type: mentioning | Confidence: 99%
“…In recent years, Korean morphological analyses have witnessed a diverse range of methodologies [3–31]. The agglutinative nature of the Korean language poses challenges that have inspired researchers to devise innovative solutions, laying the foundation for future investigations.…”
Section: Related Work | Citation type: mentioning | Confidence: 99%
“…Targeting community question answering, [139] uses the result of question category prediction to enhance document representations. [110] feeds the result of morphological tagging to a POS tagging model and the two models are further tied by skip connections. To enable asynchronous training of multi-task models, [152] initializes input from other tasks by a uniform distribution across labels.…”
Section: Hierarchical | Citation type: mentioning | Confidence: 99%
“…For example, [93] converts the parsing of Alexa meaning representation language into three independent tagging tasks for intents, types, and properties, respectively. [110] transforms the pipeline relation between POS tagging and morphological tagging into a parallel relation and further builds a joint MTL model.…”
Section: Joint MTL | Citation type: mentioning | Confidence: 99%