2023
DOI: 10.1609/icwsm.v17i1.22167

Beyond Discrete Genres: Mapping News Items onto a Multidimensional Framework of Genre Cues

Abstract: In the contemporary media landscape, with the vast and diverse supply of news, it is increasingly challenging to study such an enormous amount of items without a standardized framework. Although attempts have been made to organize and compare news items on the basis of news values, news genres receive little attention, especially the genres in a news consumer’s perception. Yet, perceived news genres serve as an essential component in exploring how news has developed, as well as a precondition for understanding…

Cited by 5 publications (1 citation statement)
References 51 publications
“…To overcome the drawbacks of dictionary and traditional SML approaches, researchers have recently and successfully applied sophisticated Large Language Models (LLMs), many of which are built on the Transformer architecture, such as BERT (Bidirectional Encoder Representations from Transformers) (Devlin et al., 2019) and GPT-3 (Brown et al., 2020). LLM methods have been shown to outperform other methods for classifying related concepts such as news genres in news articles (Lin et al., 2023), sentiment in news headlines (van Atteveldt et al., 2021), and PNR content in web content (Makhortykh et al., 2022) and website titles (Wojcieszak et al., n.d.).…”
Section: Transformer-based Supervised Machine Learning
Citation type: mentioning
confidence: 99%
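The statement above refers to transformer-based supervised classification of news items. As a rough illustration of what such a setup looks like, here is a minimal sketch of fine-tuning a BERT-style classifier with the Hugging Face transformers library; the genre labels and the two toy training examples are hypothetical placeholders, not data or code from the cited paper.

```python
# Minimal sketch of transformer-based supervised genre classification.
# Labels and training examples below are hypothetical placeholders.
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

GENRES = ["news report", "feature", "opinion", "interview"]  # hypothetical label set

texts = ["Parliament passed the budget late on Tuesday.",
         "Why the budget debate says more about us than about money."]
labels = [0, 2]  # indices into GENRES; toy examples only

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(GENRES))

class GenreDataset(torch.utils.data.Dataset):
    """Wraps tokenized texts and integer genre labels for the Trainer."""
    def __init__(self, texts, labels):
        self.enc = tokenizer(texts, truncation=True, padding=True)
        self.labels = labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        item = {k: torch.tensor(v[i]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.labels[i])
        return item

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="genre-bert",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=GenreDataset(texts, labels),
)
trainer.train()
```

In practice such a classifier would be trained on a manually coded corpus of news items and evaluated on held-out data against the manual labels, which is the kind of comparison the citing work reports when it states that LLM methods outperform dictionary and traditional SML approaches.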