Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2019)
DOI: 10.18653/v1/n19-4010
FLAIR: An Easy-to-Use Framework for State-of-the-Art NLP

Abstract: We present FLAIR, an NLP framework designed to facilitate training and distribution of state-of-the-art sequence labeling, text classification and language models. The core idea of the framework is to present a simple, unified interface for conceptually very different types of word and document embeddings. This effectively hides all embedding-specific engineering complexity and allows researchers to "mix and match" various embeddings with little effort. The framework also implements standard model training and…
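The "unified interface" idea in the abstract can be illustrated with a small sketch. The class and method names below are hypothetical stand-ins, not FLAIR's actual API: the point is only that every embedding type, however different internally, exposes the same `embed(tokens)` method, so downstream code never touches embedding-specific details.

```python
from typing import Dict, List


class ToyWordEmbeddings:
    """Static lookup-table embeddings (Word2Vec/GloVe-style): one fixed
    vector per known word, a zero vector otherwise. Hypothetical stand-in."""

    def __init__(self, table: Dict[str, List[float]], dim: int):
        self.table = table
        self.dim = dim

    def embed(self, tokens: List[str]) -> List[List[float]]:
        zero = [0.0] * self.dim
        return [self.table.get(t.lower(), zero) for t in tokens]


class ToyCharEmbeddings:
    """Character-derived embeddings: vectors computed from the word's
    characters, so even unseen words get a non-zero representation."""

    def __init__(self, dim: int):
        self.dim = dim

    def embed(self, tokens: List[str]) -> List[List[float]]:
        out = []
        for t in tokens:
            v = [0.0] * self.dim
            for i, ch in enumerate(t):
                v[i % self.dim] += ord(ch) / 1000.0
            out.append(v)
        return out


def embed_sentence(embedder, tokens: List[str]) -> List[List[float]]:
    """Downstream code depends only on .embed(), not the embedding type."""
    return embedder.embed(tokens)


tokens = ["Berlin", "is", "nice"]
static = ToyWordEmbeddings({"berlin": [1.0, 0.0], "is": [0.0, 1.0]}, dim=2)
chars = ToyCharEmbeddings(dim=2)

# The same caller works unchanged with either embedder:
v_static = embed_sentence(static, tokens)
v_chars = embed_sentence(chars, tokens)
```

Because the caller is identical for both embedders, swapping or combining embedding types costs the user essentially nothing, which is the "mix and match" claim in the abstract.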

Cited by 210 publications (135 citation statements).
References 20 publications (21 reference statements).
“…Different types of embeddings have been selected and mixed, based on BPEmb subword embeddings [44] and Flair contextual string embeddings [54], for which a detailed analysis is provided below. Fig. …”
Section: Methods (mentioning; confidence: 99%)
“…Contextual language models, such as Embeddings from Language Models (ELMo) [51] and Flair [52] , proved to be superior to static models such as Word2Vec [14] and Global Vectors for Word Representation (GloVe) [53] thanks to the ability to analyze the context, and this further improved performance in cross-lingual scenarios. Flair embeddings [54] , [55] , which constitute a character-based contextual language model on which the Flair NLP framework [52] is based, prompted [56] to test a novel methodology for cross-lingual transfer learning for Japanese NER, based on a Bi-LSTM architecture and embeddings at both word and character level as input.…”
Section: Background and Related Work (mentioning; confidence: 99%)
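The contrast this statement draws between contextual models (ELMo, Flair) and static ones (Word2Vec, GloVe) can be made concrete with a toy sketch. Real contextual models use trained neural language models, not hashes; the hash here is only an illustrative stand-in for "the vector depends on the neighbors too":

```python
import hashlib
from typing import List


def _vec(key: str, dim: int = 4) -> List[float]:
    """Deterministic pseudo-vector from a string key (toy stand-in for
    a trained embedding model)."""
    h = hashlib.sha256(key.encode("utf-8")).digest()
    return [b / 255.0 for b in h[:dim]]


def static_embed(tokens: List[str]) -> List[List[float]]:
    """Word2Vec/GloVe-style: each vector depends on the word alone."""
    return [_vec(t) for t in tokens]


def contextual_embed(tokens: List[str]) -> List[List[float]]:
    """ELMo/Flair-style in spirit: each vector depends on the word AND
    its neighbors, so the same word can differ across sentences."""
    out = []
    for i, t in enumerate(tokens):
        left = tokens[i - 1] if i > 0 else "<s>"
        right = tokens[i + 1] if i + 1 < len(tokens) else "</s>"
        out.append(_vec(left + "|" + t + "|" + right))
    return out


s1 = ["river", "bank", "erosion"]
s2 = ["central", "bank", "policy"]

# Static: "bank" gets the identical vector in both sentences.
static_same = static_embed(s1)[1] == static_embed(s2)[1]
# Contextual: "bank" gets different vectors in the two contexts.
contextual_same = contextual_embed(s1)[1] == contextual_embed(s2)[1]
```

This context sensitivity is what lets contextual models disambiguate polysemous words, the property the cited work credits for improved cross-lingual performance.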
“…More recent work has focused on generating such embedding directly by an unsupervised manner, like with Skip-Thoughts [19], Quick-Thoughts [20] and sent2vec, [26] or by a supervised manner, like with InferSent [9]. More recent work in that field focuses on generating a contextualized embedding, like with ELMo [28], Flair [2] and, perhaps most notably, BERT [10].…”
Section: Related Work (mentioning; confidence: 99%)
“…For modeling we use the contextual string embeddings for sequence labeling approach via the Flair framework [2], which achieved state-of-the-art results in multiple sequence labeling tasks, such as Named Entity Recognition (NER) and part-of-speech (PoS) tagging. This framework allows to stack multiple word embeddings from various pre-trained models, which are then passed into a vanilla bidirectional Long short-term memory (BiLSTM) recurrent neural network and a subsequent conditional random field (CRF) decoding layer [2,12,21].…”
Section: L2q Model Training (mentioning; confidence: 99%)
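The "stacking" this statement describes amounts to concatenating, per token, the vectors produced by each embedder; the concatenated sequence is what the BiLSTM-CRF tagger then consumes. A minimal sketch of that step, with hypothetical helper names rather than FLAIR's actual classes:

```python
from typing import Callable, List

# An embedder maps a token sequence to one vector per token.
Embedder = Callable[[List[str]], List[List[float]]]


def stack_embeddings(embedders: List[Embedder],
                     tokens: List[str]) -> List[List[float]]:
    """Concatenate each embedder's per-token vectors. The result has one
    vector per token whose length is the sum of the individual embedding
    dimensions; this is the input a downstream BiLSTM-CRF would consume."""
    per_embedder = [e(tokens) for e in embedders]
    stacked: List[List[float]] = []
    for i in range(len(tokens)):
        row: List[float] = []
        for vectors in per_embedder:
            row.extend(vectors[i])
        stacked.append(row)
    return stacked


# Two toy embedders of different dimensionality (stand-ins for, e.g.,
# pre-trained word vectors and Flair contextual string embeddings):
emb_a: Embedder = lambda toks: [[1.0, 2.0] for _ in toks]           # dim 2
emb_b: Embedder = lambda toks: [[float(len(t))] * 3 for t in toks]  # dim 3

tokens = ["New", "York"]
x = stack_embeddings([emb_a, emb_b], tokens)
# x holds 2 token vectors, each of dimension 2 + 3 = 5.
```

Because stacking is just per-token concatenation, the BiLSTM-CRF downstream is indifferent to how many embedders were combined; only its input dimension changes.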