2022
DOI: 10.1016/j.eswa.2022.117231
PictoBERT: Transformers for next pictogram prediction

Cited by 7 publications (12 citation statements)
References 21 publications
“…The literature suggests that neural-network-based language models may perform better than statistical models [Goldberg and Hirst 2017]. Besides, [Pereira et al 2022] compared their model with knowledge-based approaches and demonstrated improvements. Their models outperformed the semantic grammar in predicting the correct pictogram to complete a sentence.…”
Section: Results
confidence: 99%
“…In the process, the codes representing the categories may be merged or renamed [Petersen et al 2015]. According to [Petersen et al 2015], the process may only be applied to the papers' abstracts. However, if an abstract is unclear, the method may consider the paper's introduction, conclusion, or other parts.…”
Section: Data Extraction
confidence: 99%