2018
DOI: 10.1613/jair.1.11266

Transition-Based Neural Word Segmentation Using Word-Level Features

Abstract: Character-based and word-based methods are two different solutions for Chinese word segmentation, the former exploiting sequence labeling models over characters and the latter using word-level features. Neural models have been exploited for character-based Chinese word segmentation, giving high accuracies by making use of external character embeddings, yet requiring less feature engineering. In this paper, we study a neural model for word-based Chinese word segmentation, by replacing the m…
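To make the contrast in the abstract concrete, below is a minimal sketch of a word-based transition segmenter: characters are consumed left to right, and each step chooses between starting a new word (SEP) and appending to the current word (APP), so the partially built words are available as word-level features. The scorer interface score_actions(words, ch) and the dummy scorer are hypothetical stand-ins for illustration only; the paper's actual model scores actions with neural word-level representations and uses beam search rather than this greedy loop.

# Minimal sketch (assumptions noted above), not the paper's implementation.
SEP, APP = "SEP", "APP"  # start a new word / append to the current word

def segment(chars, score_actions):
    """Greedy decoding: one action per character, so n characters
    take exactly n actions (linear-time decoding)."""
    words = []
    for i, ch in enumerate(chars):
        if i == 0:
            action = SEP  # the first character always starts a word
        else:
            # word-level features: the scorer sees the partial words built so far
            scores = score_actions(words, ch)
            action = max(scores, key=scores.get)
        if action == SEP:
            words.append(ch)
        else:
            words[-1] += ch
    return words

# Toy usage with a length-based dummy scorer (illustration only):
# it simply prefers two-character words to make the example deterministic.
def dummy_scorer(words, ch):
    return {SEP: 1.0 if len(words[-1]) >= 2 else 0.0,
            APP: 1.0 if len(words[-1]) < 2 else 0.0}

print(segment(list("中文分词"), dummy_scorer))  # -> ['中文', '分词']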

Cited by 7 publications (2 citation statements)
References 39 publications
“…Alternatively, the transition-based framework offers another solution for end-to-end modeling, which is totally orthogonal to the graph-based models. Transition-based models have been widely exploited for end-to-end sequence labeling (Zhang and Clark 2010; Lyu, Zhang, and Ji 2016; Zhang, Zhang, and Fu 2018), structural parsing (Zhou et al 2015; Dyer et al 2015; Yuan, Jiang, and Tu 2019) and relation extraction (Wang et al 2018), which are closely related to SRL. These models can also achieve very competitive performances for a range of tasks, and meanwhile maintain high efficiencies with linear-time decoding complexity.…”
Section: Introduction
confidence: 99%
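As a hedged illustration of the "linear-time decoding complexity" point made in the citing statement (not code from any of the cited papers): a transition-based decoder, whatever the task, repeatedly picks the best-scoring action for the current state and applies it, taking one action per step, so the number of steps grows linearly with the input. All callables below (legal_actions, apply_action, score, is_final) are assumed interfaces for the sketch.

# Generic greedy transition decoding loop (assumed interfaces, see above).
def transition_decode(init_state, legal_actions, apply_action, score, is_final):
    state = init_state
    while not is_final(state):
        # one scored action per step; for sequence tasks this terminates
        # after O(n) steps on an input of length n
        best = max(legal_actions(state), key=lambda a: score(state, a))
        state = apply_action(state, best)
    return state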