EVALITA Evaluation of NLP and Speech Tools for Italian - December 17th, 2020
DOI: 10.4000/books.aaccademia.7518
PRELEARN @ EVALITA 2020: Overview of the Prerequisite Relation Learning Task for Italian

Cited by 6 publications (7 citation statements)
References 16 publications
“…We did 10-fold cross-validation for the in-domain experiments, except with Italian-BERT, for which we used a stratified split of 30% for the validation set. Table 4 shows the experimental results over the training (validation) set for both the in-domain and cross-domain scenarios.…”
Section: Discussion: Ablation Analysis
confidence: 99%
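The stratified 30% validation split mentioned above can be sketched in pure Python: indices are grouped by label and each group is split with the same fraction, so the validation set preserves the label distribution. This is a minimal illustration with toy labels, not the actual PRELEARN data or the authors' code:

```python
import random
from collections import defaultdict

def stratified_split(labels, val_fraction=0.3, seed=0):
    """Split indices into train/validation sets while preserving the
    label distribution in each split (a stratified split)."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for idx, y in enumerate(labels):
        by_label[y].append(idx)
    train_idx, val_idx = [], []
    for y, idxs in by_label.items():
        rng.shuffle(idxs)
        n_val = round(len(idxs) * val_fraction)  # 30% of each class
        val_idx.extend(idxs[:n_val])
        train_idx.extend(idxs[n_val:])
    return sorted(train_idx), sorted(val_idx)

# Toy binary labels: 1 = prerequisite pair, 0 = non-prerequisite pair.
labels = [1] * 10 + [0] * 10
train_idx, val_idx = stratified_split(labels)
print(len(train_idx), len(val_idx))  # 14 6
```

In practice a library routine such as scikit-learn's `train_test_split(..., stratify=labels)` does the same job; the hand-rolled version above only makes the grouping-by-label step explicit.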
“…In this work, we present our systems to automatically detect prerequisite relations for the Italian language in the context of the PRELEARN shared task (Alzetta et al, 2020) at EVALITA 2020 (Basile et al, 2020). The evaluation of submissions considers: (1) in-domain and cross-domain scenarios, defined by either the inclusion (in-domain) or exclusion (cross-domain) of the target domain in the training set.…”
Section: Introduction
confidence: 99%
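The in-domain/cross-domain distinction described above amounts to whether the target domain's examples appear in the training set. A minimal sketch of the cross-domain (leave-target-domain-out) split, using hypothetical domain names rather than the actual PRELEARN corpus:

```python
def cross_domain_split(examples, target_domain):
    """Cross-domain scenario: all examples from the target domain are
    held out of training and used only for evaluation."""
    train = [ex for ex in examples if ex[0] != target_domain]
    test = [ex for ex in examples if ex[0] == target_domain]
    return train, test

# (domain, concept-pair) tuples; names are illustrative assumptions.
examples = [
    ("data mining", ("clustering", "machine learning")),
    ("data mining", ("classification", "supervised learning")),
    ("physics", ("velocity", "acceleration")),
    ("geometry", ("angle", "triangle")),
]

train, test = cross_domain_split(examples, "physics")
print(len(train), len(test))  # 3 1
```

In the in-domain scenario the target domain's training portion would simply stay in `train`, so the model sees in-domain vocabulary and concept pairs during training.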
“…Unlike word2vec, fastText performs additional tokenization, breaking word-level tokens into several n-grams to provide representations for out-of-vocabulary words. In 2020, a shared task on Concept Prerequisite Learning (PRELEARN) was organized during EVALITA 2020 [16]. It is the first shared task on automatic prerequisite learning between educational concepts in Italian.…”
Section: Related Work
confidence: 99%
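The subword tokenization mentioned above can be sketched as follows: fastText extracts character n-grams of a word wrapped in boundary markers `<` and `>`, so an out-of-vocabulary word still gets a representation as the sum of its n-gram vectors. This sketch only shows the n-gram extraction step, not the embedding lookup:

```python
def char_ngrams(word, n_min=3, n_max=6):
    """Character n-grams with fastText-style boundary markers.

    The markers '<' and '>' let the model distinguish prefixes and
    suffixes from word-internal n-grams.
    """
    token = f"<{word}>"
    grams = []
    for n in range(n_min, n_max + 1):
        grams.extend(token[i:i + n] for i in range(len(token) - n + 1))
    return grams

# Italian word "cane" (dog), restricted to 3- and 4-grams for brevity.
print(char_ngrams("cane", n_min=3, n_max=4))
# ['<ca', 'can', 'ane', 'ne>', '<can', 'cane', 'ane>']
```

fastText's defaults use n-grams of length 3 to 6; the narrower range here just keeps the output readable.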
“…With the popularization of deep learning technology, various deep learning models have come from behind to lead in concept prerequisite learning tasks [14][15][16]. It must be mentioned that transformers have become a prevalent deep learning technology in recent years.…”
Section: Concept Prerequisite Relation Discovering
confidence: 99%