2022
DOI: 10.1007/s11192-022-04602-4

SsciBERT: a pre-trained language model for social science texts

Cited by 13 publications (9 citation statements)
References 29 publications
“…Through the mining of web crawler algorithms, various possibilities are verified, including breadth-first (search the neighbors at the same level), depth-first (traverse to the bottom from the root node), URL ordering (queue), page-rank (importance based on the number of backlinks or citations), online page importance (importance of a page in a website), largest sites first (websites with the largest number of pages), page request—HTTP or the dynamic, customized site map (applicable to deal with updates on already visited pages), and filtering (query-based approach) [7, 80, 81]. In some of these algorithms, keywords are accepted as the search query, and all relevant URLs fulfilling that search query are returned.…”
Section: Discussion (mentioning)
confidence: 99%
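
To make the crawling strategies listed in this statement concrete, the sketch below shows a minimal breadth-first crawler with a plain FIFO URL queue. It is an illustration only, not the implementation of the cited work; the names crawl_bfs, seed_url, and max_pages, and the choice of the Python standard library, are assumptions made for the example.

```python
# Minimal breadth-first crawler sketch (illustrative only; not the cited
# paper's implementation). Function and parameter names are assumptions.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl_bfs(seed_url, max_pages=20):
    """Visit pages level by level from the seed (breadth-first):
    all neighbours of the current frontier before going deeper."""
    queue = deque([seed_url])          # URL ordering: plain FIFO queue
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            continue                   # skip unreachable or non-HTML pages
        visited.add(url)
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).scheme in ("http", "https"):
                queue.append(absolute)
    return visited


if __name__ == "__main__":
    print(crawl_bfs("https://example.com", max_pages=5))
```

Depth-first traversal would differ only in popping from the right end of the same deque (treating it as a stack), while a page-rank-style ordering would replace the FIFO queue with a priority queue keyed on backlink counts.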
“…SsciBERT is a BERT-based pre-trained language model adapted for social and humanities disciplines (Shen et al., 2023). It uses a corpus that includes a massive collection of abstracts from Social Sciences Citation Index (SSCI) journals.…”
Section: Methods (mentioning)
confidence: 99%
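
Since SsciBERT is a BERT-style checkpoint, it can in principle be loaded and used to encode abstracts with the Hugging Face transformers API, as sketched below. The model identifier shown is a placeholder assumption (the generic bert-base-uncased checkpoint); substitute the SsciBERT checkpoint published by its authors. Mean pooling of the last hidden state is just one common way to obtain a document-level vector.

```python
# Sketch: encoding a social-science abstract with a BERT-style checkpoint via
# Hugging Face transformers. The model identifier below is a placeholder
# assumption; swap in the SsciBERT checkpoint released by its authors.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # placeholder, not the actual SsciBERT ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)

abstract = "This study examines the diffusion of policy innovations across states."
inputs = tokenizer(abstract, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the token representations into a single abstract-level vector.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # torch.Size([1, 768]) for a BERT-base-sized model
```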
“…In the pursuit of developing a computational solution for interpreting pre-conceptual schemas, we are motivated to employ the BERT model due to its proven efficacy across a range of natural language processing tasks in social sciences [21], from sentiment analysis to question answering, which makes it a favorable choice for enhancing the linguistic intelligence of our solution. Moreover, some of BERT's open-source resources available at the Hugging Face repository aligns with our commitment to integrate the most used technology in research [22].…”
Section: BERT Model (mentioning)
confidence: 99%
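
The two tasks named in this statement, sentiment analysis and question answering, can be run with Hugging Face pipelines as sketched below. Letting pipeline() fall back to its default checkpoints is an assumption made for brevity; a BERT-family model fine-tuned for each task could instead be passed explicitly via the model argument.

```python
# Sketch of the two tasks named above using Hugging Face pipelines.
# Default checkpoints are used for brevity; pass model="..." to pin a
# specific BERT-family model fine-tuned for each task.
from transformers import pipeline

# Sentiment analysis: classify a statement as positive or negative.
sentiment = pipeline("sentiment-analysis")
print(sentiment("The proposed schema interpretation works remarkably well."))

# Question answering: extract an answer span from a context passage.
qa = pipeline("question-answering")
print(qa(
    question="Which model is used to interpret pre-conceptual schemas?",
    context="The authors employ the BERT model to interpret pre-conceptual schemas.",
))
```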