2021
DOI: 10.1016/j.eswa.2021.115570
A simple and fast method for Named Entity context extraction from patents

Cited by 13 publications (2 citation statements)
References 25 publications
“…To summarize, IE techniques have been successfully implemented in various tasks in the real world, including the extraction of cause-effect relationships [24], event detection and extraction [139], [144], corpus extraction from text [39], [40], [75], KG construction, question answering [36], [99], and mining social media data [148], [150], [157]. The mechanisms of IE from textual data are continually developing as more research is being conducted to address the existing problems within each technique.…”
Section: Challenges Based On RQ3: Issues Related To Future Applicat... (mentioning)
confidence: 99%
“…That same year, three studies were published that utilized the bidirectional encoder representations from transformers (BERT) architecture [28,43,46]. By 2021 [47][48][49][50][51][52][53][54], the BERT architecture and its variants had emerged as the primary NER model applied to EHRs, a trend that continues to this day [4,24,25,[55][56][57][58][59][60][61][62][63][64][65][66][67][68][69][70][71]. However, this self-attention mechanism was initially introduced in 2017 [72].…”
Section: Classification Models (mentioning)
confidence: 99%