2023
DOI: 10.1038/s41586-023-06668-3

Human-like systematic generalization through a meta-learning neural network

Brenden M. Lake,
Marco Baroni

Abstract: The power of human language and thought arises from systematic compositionality—the algebraic ability to understand and produce novel combinations from known components. Fodor and Pylyshyn¹ famously argued that artificial neural networks lack this capacity and are therefore not viable models of the mind. Neural networks have advanced considerably in the years since, yet the systematicity challenge persists. Here we successfully address Fodor and Pylyshyn’s challenge by providing evidence that neural networks can achieve human-like systematicity when optimized for their compositional skills…
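The approach behind the paper, meta-learning for compositionality (MLC), trains a standard sequence-to-sequence network over a stream of few-shot episodes, each governed by a different randomly drawn word-to-meaning grammar. As a rough, hypothetical illustration of that episode format only (not the authors' released code; the words, output symbols, and the "twice" modifier below are invented for this sketch), the Python snippet builds one toy episode and flattens its study examples plus a query into a single source sequence a seq2seq learner could condition on:

```python
import random

PRIMITIVES = ["dax", "wif", "lug", "zup"]      # hypothetical input words
OUTPUTS = ["RED", "GREEN", "BLUE", "YELLOW"]   # hypothetical output symbols

def make_episode(rng):
    """Build one few-shot episode with its own random word->symbol mapping."""
    mapping = dict(zip(PRIMITIVES, rng.sample(OUTPUTS, len(PRIMITIVES))))
    study = [(w, [mapping[w]]) for w in PRIMITIVES]
    # "twice" is an assumed compositional modifier: repeat the argument's output.
    w = rng.choice(PRIMITIVES)
    query = (f"{w} twice", [mapping[w], mapping[w]])
    return study, query

def encode(study, query_input):
    """Concatenate the study examples and the query into one source sequence,
    the format a seq2seq meta-learner would condition on."""
    parts = [f"{inp} -> {' '.join(out)}" for inp, out in study]
    parts.append(f"{query_input} ->")
    return " | ".join(parts)

rng = random.Random(0)
study, (q_in, q_out) = make_episode(rng)
print(encode(study, q_in))   # source sequence shown to the model
print(" ".join(q_out))       # target the model must produce
```

Because the word-to-symbol mapping is resampled on every episode, a network trained across many such episodes is pushed to infer each episode's mapping from its study examples rather than memorize any fixed one, which is the sense in which compositional skill is being optimized.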

Cited by 60 publications (23 citation statements). References 44 publications.

Citation statements (ordered by relevance):
“…the ability to conceptualize prior experience in terms of components that can be re-configured in a novel situation [167,168]. More broadly, compositional generalization has long been understood to be a crucial component for human-like learning and generalization [163], in large part due to the diversity and breadth of its applications: compositional generalization often involves the composition of many rules, relations, or attributes [163,169–171]. The comparative simplicity of TI enabled us to identify how minimally structured learning systems can implement the inductive biases needed for this task.…”
Section: Discussion (mentioning)
confidence: 99%
“…Although these models are large in scale and contain billions of parameters, it is undeniable that LLMs have demonstrated impressive capabilities in image generation tasks. A recent study (Lake and Baroni, 2023) shows that neural networks can achieve human-like systematicity when optimized for their compositional skills. In addition to LLMs, a new class of visual model, the Large Vision Model (LVM), has also attracted attention.…”
Section: Conclusion and Future Direction (mentioning)
confidence: 99%
“…Meta-learning was shown to enhance the flexibility of AI systems by enabling them to learn new tasks quickly with minimal data input [23], [58]–[60]. Algorithms that adjust their learning rules based on the task at hand demonstrated high versatility and adaptability [60]–[62].…”
Section: Meta-learning Algorithms (mentioning)
confidence: 99%
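To make the "learn new tasks quickly with minimal data" claim above concrete, here is a minimal, self-contained sketch of one simple meta-learning loop in the style of first-order Reptile (chosen here for brevity; it is not an algorithm taken from the cited works, and the task family, feature basis, and helper names are invented for this sketch). A linear model over fixed sine/cosine features is adapted to each sampled regression task with a few gradient steps, and the meta-parameters are nudged toward the adapted weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x):
    # Fixed basis: tasks y = a*sin(x) + b*cos(x) are linear in these features.
    return np.stack([np.sin(x), np.cos(x)], axis=1)

def sample_task():
    """Draw one regression task from the (toy) task distribution."""
    a, b = rng.uniform(-2, 2, size=2)
    return lambda x: a * np.sin(x) + b * np.cos(x)

def adapt(w, task, n_points=5, steps=10, lr=0.1):
    """Inner loop: a few SGD steps on a small sample from one task."""
    x = rng.uniform(-np.pi, np.pi, n_points)
    phi, y = features(x), task(x)
    for _ in range(steps):
        grad = phi.T @ (phi @ w - y) / n_points   # MSE gradient
        w = w - lr * grad
    return w

# Outer (meta) loop: Reptile update w <- w + eps * (w_adapted - w).
w = np.zeros(2)
for _ in range(1000):
    w_task = adapt(w.copy(), sample_task())
    w += 0.1 * (w_task - w)

# Rapid adaptation to an unseen task from only 5 points.
new_task = sample_task()
w_new = adapt(w.copy(), new_task)
x_test = np.linspace(-np.pi, np.pi, 50)
err = np.mean((features(x_test) @ w_new - new_task(x_test)) ** 2)
print(f"test MSE after 5-shot adaptation: {err:.4f}")
```

After meta-training, five examples from an unseen task suffice for the inner loop to reach low error, which is the rapid-adaptation behaviour the surveyed studies describe; real systems apply the same two-loop structure to neural networks rather than this toy linear model.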
“…Research into meta-learner architectures provided insights into optimizing model parameters for rapid adaptability [63], [67], [68]. The use of meta-learning in complex scenarios involving multiple tasks and domains was explored, showing potential for scalable, efficient learning systems [36], [58], [69]–[71]. Findings also indicated that meta-learning could reduce the computational costs associated with adapting models to new environments [60], [62], [72].…”
Section: Meta-learning Algorithms (mentioning)
confidence: 99%