2021
DOI: 10.48550/arxiv.2105.13995
Preprint
SemEval-2021 Task 9: Fact Verification and Evidence Finding for Tabular Data in Scientific Documents (SEM-TAB-FACTS)

Abstract: Understanding tables is an important and relevant task that involves understanding table structure as well as being able to compare and contrast information within cells. In this paper, we address this challenge by presenting a new dataset and tasks that address this goal in a shared task in SemEval 2021 Task 9: Fact Verification and Evidence Finding for Tabular Data in Scientific Documents (SEM-TAB-FACTS). Our dataset contains 981 manually generated tables and an auto-generated dataset of 1980 tables providi…

Cited by 2 publications (1 citation statement)
References 5 publications
“…The TabFact dataset and InfoTabs dataset [2,3] were proposed in 2020 to investigate the task of fact verification for textual claims based on tables from Wikipedia. The Sem-tab-facts dataset extracts tables from scientific articles and requires the model to select cells in the table as evidence for subsequent reasoning [4].…”
Section: Introduction
confidence: 99%