2014
DOI: 10.1093/llc/fqu032

Word-level language identification in The Chymistry of Isaac Newton

Cited by 9 publications (9 citation statements)
References 8 publications
“…Mukherjee et al. (2014) used language labels of surrounding words with a Naive Bayes classifier. King et al. (2015) used the language probabilities of the previous word to determine weights for languages. King et al. (2014b) used unigram, bigram, and trigram language label transition probabilities.…”
Section: Statistics of Words (Van der Lee and Bosch)
mentioning, confidence: 99%
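The approaches quoted above all combine per-word evidence with label context. As a rough illustration of that idea (not a reconstruction of any cited system), the sketch below decodes a toy token sequence with Viterbi over invented per-word language probabilities and language-label bigram transition probabilities; the language set, numbers, and tokens are hypothetical.

```python
import math

# Hypothetical toy model: per-word language probabilities (as might come
# from character n-gram models) plus language-label bigram transition
# probabilities. All values below are invented for illustration.
LANGS = ["en", "la"]

# P(language | word) for a short token sequence, e.g. "the aqua fortis"
emissions = [
    {"en": 0.9, "la": 0.1},
    {"en": 0.2, "la": 0.8},
    {"en": 0.3, "la": 0.7},
]

# P(current language | previous language)
transitions = {
    ("en", "en"): 0.8, ("en", "la"): 0.2,
    ("la", "la"): 0.8, ("la", "en"): 0.2,
}

def viterbi(emissions, transitions, langs):
    """Return the most probable language-label sequence for the tokens."""
    # Initialise with the first token's emission scores (log space).
    scores = [{l: math.log(emissions[0][l]) for l in langs}]
    back = [{}]
    for t in range(1, len(emissions)):
        scores.append({})
        back.append({})
        for cur in langs:
            # Best previous label for the current label at position t.
            best_prev, best_score = max(
                ((p, scores[t - 1][p]
                  + math.log(transitions[(p, cur)])
                  + math.log(emissions[t][cur])) for p in langs),
                key=lambda x: x[1],
            )
            scores[t][cur] = best_score
            back[t][cur] = best_prev
    # Trace back the best path from the highest-scoring final label.
    last = max(scores[-1], key=scores[-1].get)
    path = [last]
    for t in range(len(emissions) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

print(viterbi(emissions, transitions, LANGS))  # -> ['en', 'la', 'la']
```

The transition table is what nudges the middle and final tokens toward the same label, which is the effect the label-transition and previous-word-probability strategies described above are after.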
“…They occur in all digital techniques that access digitized sources through words. Thus, virtually every digital method for working with textual sources is confronted with these problems, not just retrieval methods such as source selection through keyword search, but also more quantitative analytical methods (e.g., King, Kübler, and Hooper 2015; Lijffijt et al. 2016; Verhoef 2015, 56-57). As soon as one uses words to access sources, one is confronted with the equivocality of natural language: The meaning of words varies between contexts.…”
Section: Assessing Digital Techniques
mentioning, confidence: 99%
“…Several models for MAN-EN codeswitched language identification were developed as part of the First Shared Task on Language Identification in Codeswitched Data (King et al., 2014). The most common technique was to employ supervised machine learning algorithms (e.g., extended Markov Models and Conditional Random Fields) to train a classifier.…”
Section: Related Work
mentioning, confidence: 99%
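As a rough sketch of the supervised sequence-labelling setup this citation describes (not any shared-task participant's actual system), the example below trains a token-level Conditional Random Field with the sklearn-crfsuite package on a tiny, invented Mandarin-English code-switched sample; the feature set and data are placeholders.

```python
import sklearn_crfsuite

def word_features(sent, i):
    """Simple per-token features; a real system would use richer ones."""
    word = sent[i]
    return {
        "lower": word.lower(),
        "suffix3": word[-3:],
        "is_title": word.istitle(),
        "prev": sent[i - 1].lower() if i > 0 else "<s>",
        "next": sent[i + 1].lower() if i < len(sent) - 1 else "</s>",
    }

# Invented, romanised MAN-EN code-switched toy data with token-level labels.
train_sents = [["I", "love", "chi", "fan"], ["wo", "like", "this", "movie"]]
train_labels = [["en", "en", "man", "man"], ["man", "en", "en", "en"]]

X_train = [[word_features(s, i) for i in range(len(s))] for s in train_sents]
y_train = train_labels

# Train the CRF and predict labels for an unseen toy sentence.
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X_train, y_train)

test = [["wo", "love", "movie"]]
X_test = [[word_features(s, i) for i in range(len(s))] for s in test]
print(crf.predict(X_test))  # e.g. [['man', 'en', 'en']]
```

Because the CRF scores whole label sequences, it can learn both word-level cues (suffixes, casing) and the tendency of adjacent tokens to share a language, which is the property that makes sequence models a natural fit for code-switched text.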