2024
DOI: 10.1109/tg.2022.3228480
Generating Role-Playing Game Quests With GPT Language Models

Abstract: Quests represent an integral part of role-playing games (RPGs). While evocative, narrative-rich quests are still mostly hand-authored, player demands towards more and richer game content, as well as business requirements for continuous player engagement necessitate alternative, procedural quest generation methods. While existing methods produce mostly uninteresting, mechanical quest descriptions, recent advances in AI have brought forth generative language models with promising computational storytelling capab…

Cited by 30 publications (8 citation statements) | References 31 publications
“…Previous studies analyzed the relationship between LLMs and games. Värtinen et al. [10] explored the use of GPT language models (GPT-2 and GPT-3) to procedurally generate quests for role-playing games (RPGs), as an alternative to hand-authored quests. Overall, the study highlights the promise of AI in generating game content but also points to the need for further improvements in the GPT language models.…”
Section: Large Language Models and Games (mentioning, confidence: 99%)
“…It was trained on a collection of web data and, thus, outputs text for general purposes. Previous work has fine-tuned GPT for multiple domains and tasks, such as the task of quest generation in games (Värtinen et al., 2022) or the medical domain (Schneider et al., 2021). In addition to domain adaptation, GPT was tailored to specific text styles and characteristics.…”
Section: Related Work (mentioning, confidence: 99%)
“…The inputs are arrangements of 1024 sequential tokens. The larger model was trained on 256 cloud TPU v3 cores [23].…”
Section: GPT-2 (mentioning, confidence: 99%)
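The excerpt above notes that GPT-2 consumes fixed-length inputs of 1024 sequential tokens. A minimal sketch of how a longer token stream might be arranged into such inputs is shown below; the function name, the pad id, and the integer "tokens" are illustrative assumptions, not details from the cited papers.

```python
def arrange_inputs(token_ids, context_len=1024, pad_id=0):
    """Split a token stream into fixed-length model inputs.

    GPT-2's context window is 1024 tokens, so longer streams are
    chunked and the final, shorter chunk is padded. The token ids
    here are stand-in integers, not output of a real tokenizer.
    """
    chunks = []
    for start in range(0, len(token_ids), context_len):
        chunk = token_ids[start:start + context_len]
        if len(chunk) < context_len:
            # pad the trailing remainder up to the full context length
            chunk = chunk + [pad_id] * (context_len - len(chunk))
        chunks.append(chunk)
    return chunks

# stand-in for a tokenized quest-text corpus
stream = list(range(2500))
inputs = arrange_inputs(stream)
# 2500 tokens -> three 1024-token inputs, the last one padded
```

In practice a library tokenizer would produce the ids and handle padding masks; this sketch only illustrates the fixed 1024-token arrangement the excerpt describes.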