2020
DOI: 10.1101/2020.07.12.199265
Preprint

Multitask Learning over Shared Subspaces

Abstract: This paper uses constructs from the field of multitask machine learning to define pairs of learning tasks that either shared or did not share a common subspace. Human subjects then learnt these tasks using a feedback-based approach. We found, as hypothesised, that subject performance was significantly higher on the second task if it shared the same subspace as the first, an advantage that played out most strongly at the beginning of the second task. Additionally, accuracy was positively correlated over subject…

Cited by 2 publications (2 citation statements)
References 41 publications

“…G. Collins & Frank, 2013; A. G. E. Collins & Frank, 2016; Menghi, Kacar, & Penny, 2021), mostly using stimuli with a discrete number of possible values.…”
Section: Structure Learning
confidence: 99%
“…1 top). If such generalization is warranted (i.e., the old and new environments are structured or generated according to a common rule [10][11][12][13]), decision makers can solve the exploration-exploitation tradeoff in the new environments more efficiently. Although the question of knowledge generalizability has been discussed for many years 14,15, it has remained largely unanswered because of the computational difficulty of quantifying its cognitive underpinnings in detail.…”
Section: Introduction
confidence: 99%