2020
DOI: 10.1007/978-3-030-62466-8_35
NEO: A Tool for Taxonomy Enrichment with New Emerging Occupations

Cited by 16 publications (9 citation statements)
References 27 publications
“…We compare JAMES against an exhaustive list of ten baseline models relevant to the JTM task: KNN-based [38], Word2Vec-based [3], DeepCarotene [32], Node2Vec [14], GloVe [27], NEO [13], WoLMIS [2], BERT-based [30], Job2Vec [35], and Universal Sentence Encoder (USE) [5]. The details of the baselines and our implementation settings are described in Appendix A.…”
Section: Set-up
confidence: 99%
“…NEO (Giabelli et al 2020b) This activity is crucial to allow economists and policy makers to observe up-to-date labour market dynamics using standard taxonomies as a lingua franca, overcoming linguistic boundaries (see, e.g., Frey and Osborne 2017; Giabelli et al 2020a; Colombo, Mercorio, and Mezzanzanica 2019).…”
Section: Overview of NEO
confidence: 99%
“…To select the best representing vectors, we rely on three distinct sub-tasks, which are the following: T1.1: train three different word embedding models (Word2Vec, GloVe, FastText); T1.2: construct a measure of pairwise semantic similarity between taxonomic elements, namely Hierarchical Semantic Relatedness (HSR) (Giabelli et al 2020b). Compared with HSR, state-of-the-art metrics for semantic similarity (see Aouicha, Taieb, and Hamadou (2016) for a survey) suffer from two main drawbacks.…”
Section: Overview of NEO
confidence: 99%
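
As an illustration of sub-task T1.1, the sketch below trains two of the three embedding families with gensim on a hypothetical pre-tokenised corpus of online job vacancies (GloVe is usually trained with the separate Stanford toolkit). The corpus path and hyperparameters are assumptions for illustration, not the settings reported by the NEO authors, and the cosine similarity at the end is only a generic stand-in for the HSR measure defined in Giabelli et al. (2020b).

# Sketch of sub-task T1.1 with gensim; corpus path and hyperparameters are
# illustrative assumptions. GloVe is typically trained with the separate
# Stanford toolkit, so only Word2Vec and FastText are shown here.
from gensim.models import Word2Vec, FastText

# Assumed input: one pre-tokenised online job vacancy (OJV) per line.
with open("ojv_corpus.txt", encoding="utf-8") as f:
    corpus = [line.split() for line in f]

w2v = Word2Vec(sentences=corpus, vector_size=100, window=5,
               min_count=5, sg=1, epochs=5)
ft = FastText(sentences=corpus, vector_size=100, window=5,
              min_count=5, min_n=3, max_n=6, epochs=5)

# Generic cosine similarity between two taxonomy terms; the actual HSR
# measure is defined in Giabelli et al. (2020b) and is not reproduced here.
print(ft.wv.similarity("developer", "programmer"))
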
“…The workflow of skills2job, presented in Fig. 1, can be divided into five main steps: S.1 To extract linguistic patterns from OJVs, we train multiple embedding models through FastText, a library for representation learning which builds word embeddings considering sub-word information by representing each word as the sum of its character n-gram vectors; S.2 we compute a measure of pairwise semantic similarity between taxonomic elements, namely HSS (developed in Giabelli et al 2020b and previously called Hierarchical Semantic Relatedness, HSR).…”
Section: An Overview of skills2job
confidence: 99%
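
The sum-of-character-n-grams behaviour described in step S.1 can be sketched as follows, assuming gensim's FastText implementation and a hypothetical tokenised OJV corpus; the n-gram range and other hyperparameters are illustrative and not the actual settings of the skills2job pipeline.

# Sketch of step S.1 with gensim's FastText; hyperparameters are assumptions.
from gensim.models import FastText

with open("ojv_corpus.txt", encoding="utf-8") as f:  # assumed corpus file
    corpus = [line.split() for line in f]

# min_n / max_n set the character n-gram range whose vectors are summed
# to form each word's representation.
model = FastText(sentences=corpus, vector_size=100, window=5,
                 min_count=5, min_n=3, max_n=6, epochs=5)

# Because word vectors are composed from character n-grams, a term that
# never appeared in the training OJVs still receives an embedding.
vec = model.wv["webmaster"]
print(vec.shape)
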