Proceedings of the CoNLL 2020 Shared Task: Cross-Framework Meaning Representation Parsing 2020
DOI: 10.18653/v1/2020.conll-shared.4
Hitachi at MRP 2020: Text-to-Graph-Notation Transducer

Abstract: This paper presents our proposed parser for the shared task on Meaning Representation Parsing (MRP 2020) at CoNLL, where participant systems were required to parse five types of graphs in different languages. We propose to unify these tasks as a text-to-graph-notation transduction in which we convert an input text into a graph notation. To this end, we designed a novel Plain Graph Notation (PGN) that handles various graphs universally. Then, our parser predicts a PGN-based sequence by leveraging Transformers a…
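The abstract describes recasting parsing as predicting a linear notation for a graph. As a rough illustration of that idea (the notation below is a hypothetical bracketed format, not the actual PGN, whose full specification is not given here), a labeled graph can be serialized into a single token sequence of the kind a text-to-notation transducer would be trained to predict:

```python
# Illustrative sketch only: serialize a small labeled graph into a bracketed
# string. The node/edge labels and the bracketing scheme are made up for the
# example; the paper's PGN format is not reproduced here.

def linearize(nodes, edges, root):
    """Depth-first serialization of a labeled rooted graph.

    nodes: dict node_id -> node label
    edges: dict node_id -> list of (edge_label, child_id)
    """
    def visit(n):
        parts = [nodes[n]]
        for edge_label, child in edges.get(n, []):
            parts.append(f":{edge_label} ( {visit(child)} )")
        return " ".join(parts)
    return f"( {visit(root)} )"

nodes = {0: "move-01", 1: "person", 2: "city"}
edges = {0: [("ARG0", 1), ("ARG2", 2)]}
print(linearize(nodes, edges, 0))
# ( move-01 :ARG0 ( person ) :ARG2 ( city ) )
```

A sequence-to-sequence model would then map the input sentence to such a string, from which the graph can be deterministically reconstructed.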

Cited by 12 publications (13 citation statements)
References 29 publications
“…In this sense, we continue the recent cross-framework trend formally started by the shared task of Oepen et al (2019), exploring the possibility of using translation-based approaches for framework-independent parsing, as opposed to the transition-based parsers proposed in that seminal work. Our findings are in line with the recent results reported by Oepen et al (2020) and, in particular, by Ozaki et al (2020). [Figure 1, panel (d) "UCCA Linearization": AMR and UCCA graphs, along with their linearizations, for the sentence "After graduation, John moved to Paris". To ease readability, linearizations are shown with newlines and indentation; however, when fed to the neural model, they are in a single-line single-space format.]…”
Section: Related Work (supporting)
confidence: 92%
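The quoted statement notes that indented, human-readable linearizations are collapsed to a "single-line single-space format" before being fed to the model. A minimal sketch of that normalization step (the function name is illustrative):

```python
# Collapse an indented, multi-line graph linearization into the
# single-line single-space form described in the quoted citation.

def to_single_line(pretty: str) -> str:
    """Replace newlines and runs of whitespace with single spaces."""
    return " ".join(pretty.split())

pretty = """( move-01
    :ARG0 ( person )
    :ARG2 ( city ) )"""
print(to_single_line(pretty))
# ( move-01 :ARG0 ( person ) :ARG2 ( city ) )
```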
“…According to the official whole-percent-only all F1 score (Oepen et al, 2019), our competition submission reached tied first place in both the cross-lingual and the cross-framework track, with its performance virtually identical to the system by Hitachi (Ozaki et al, 2020). Our bug-fixed submission reached the first rank in both tracks, improving the cross-lingual score by nearly one percentage point.…”
Section: Results (mentioning)
confidence: 77%
“…When parsing cross-framework meaning representations for English, the system is trained with a BERT-large-cased pretrained encoder, and when parsing cross-lingual meaning representations, it is trained with multilingual BERT. (Chen et al, 2019), HIT-SCIR (Che et al, 2019), and Saarland (Donatelli et al, 2019), respectively; in MRP 2020 the Hitachi system (Ozaki et al, 2020) was at the top for all three frameworks, sharing the UCCA first rank with ÚFAL (Samuel and Straka, 2020).…”
Section: Overview Of Approaches (mentioning)
confidence: 99%