Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.62

French Biomedical Text Simplification: When Small and Precise Helps

Abstract: We present experiments on biomedical text simplification in French. We use two kinds of corpora - parallel sentences extracted from existing health comparable corpora in French and the WikiLarge corpus translated from English into French - and a lexicon that associates medical terms with paraphrases. We then train neural models on these parallel corpora using different ratios of general and specialized sentences. We evaluate the results with BLEU, SARI and Kandel scores. The results point out that little specialized …
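The pipeline described in the abstract (mixing general and specialized parallel corpora at different ratios, then scoring system output with BLEU and SARI) can be sketched roughly as below. This is an illustration under stated assumptions, not the authors' code: the file names and the copy baseline standing in for the neural model are invented, BLEU uses the sacrebleu package, the SARI call assumes the EASSE toolkit's corpus_sari interface, and the Kandel readability score is omitted.

```python
# Minimal sketch (not the authors' released code): mix general (WikiLarge
# translated into French) and specialized (health) parallel sentences at
# different ratios, run a placeholder "model", and score with BLEU and SARI.
# File names, the copy baseline, and the EASSE corpus_sari interface are
# assumptions; the paper's Kandel readability score is not computed here.
import random
import sacrebleu                      # BLEU via sacrebleu.corpus_bleu
from easse.sari import corpus_sari    # SARI, assuming the EASSE toolkit

def load_pairs(path):
    """Read tab-separated (complex, simple) sentence pairs."""
    with open(path, encoding="utf-8") as f:
        return [tuple(line.rstrip("\n").split("\t", 1)) for line in f if "\t" in line]

def copy_baseline(train_pairs, complex_sents):
    """Stand-in for the neural model: simply copies the input sentences.
    A real setup would train a seq2seq model on train_pairs instead."""
    return list(complex_sents)

general = load_pairs("wikilarge_fr.tsv")            # hypothetical file names
specialized = load_pairs("health_parallel_fr.tsv")
test_orig, test_refs = zip(*load_pairs("health_test_fr.tsv"))

random.seed(0)
for spec_ratio in (0.0, 0.1, 0.25, 0.5, 1.0):
    # Add a varying amount of specialized data on top of the general corpus.
    n_spec = int(spec_ratio * len(specialized))
    train = general + random.sample(specialized, n_spec)
    random.shuffle(train)

    hyps = copy_baseline(train, test_orig)
    bleu = sacrebleu.corpus_bleu(hyps, [list(test_refs)]).score
    sari = corpus_sari(orig_sents=list(test_orig), sys_sents=hyps,
                       refs_sents=[list(test_refs)])
    print(f"specialized ratio={spec_ratio:.2f}  BLEU={bleu:.1f}  SARI={sari:.1f}")
```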

Cited by 10 publications (8 citation statements). References: 15 publications.
“…We have seen above that our system currently tends to poorly handle technical terms, often missing from WOLF. We plan to investigate solutions to this limitation, for example by applying automatic text simplification for the medical domain (Cardon and Grabar, 2020) on the original sentences.…”
Section: Discussion (mentioning). Confidence: 99%.
“…All the existing large corpora used post-hoc aligned sentences [30,2,33,32,3]. The SimpleText corpus [11] contains directly simplified sentences, and is not much smaller than existing high-quality corpora like NEWSELA [30] (2,259 sentences).…”
Section: Task 3: Text Simplification - Rewriting Scientific Text (mentioning). Confidence: 99%.
“…Shardlow and Nawaz (2019) used the general-purpose neural text simplification model (Nisioi et al, 2017) augmented with the phrase table of complex-simple medical terminology to automatically simplify clinical letters in English. Cardon and Grabar (2020) used a similar approach for biomedical texts in French. Emphasizing the need for high-quality simplification in the medical domain, Van et al (2020) explored the possibility of applying pretrained neural language models to the autocomplete process for sentence-level medical text simplification.…”
Section: ATS in Medical Domain (mentioning). Confidence: 99%.
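The lexicon component mentioned in the abstract (medical terms paired with lay paraphrases) can be pictured with the minimal substitution sketch below. This is only an illustration: the entries are invented examples, and the paper integrates such resources into neural training rather than applying direct string replacement.

```python
# Minimal sketch of a term-to-paraphrase lexicon for French medical text.
# Entries are invented examples; the paper feeds such a lexicon into neural
# training rather than doing direct replacement as shown here.
import re

LEXICON = {
    "antalgique": "médicament contre la douleur",
    "pathologie chronique": "maladie de longue durée",
}

def substitute(sentence: str, lexicon: dict) -> str:
    """Replace each known term (longest first) with its lay paraphrase."""
    for term in sorted(lexicon, key=len, reverse=True):
        sentence = re.sub(re.escape(term), lexicon[term], sentence, flags=re.IGNORECASE)
    return sentence

print(substitute("Prendre un antalgique en cas de pathologie chronique.", LEXICON))
# Prendre un médicament contre la douleur en cas de maladie de longue durée.
```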