2021
DOI: 10.1038/s41567-021-01170-x

Topological limits to the parallel processing capability of network architectures

Cited by 25 publications (28 citation statements)
References 27 publications
“…Multiple-resource theories [41,42,44,45] became increasingly successful in explaining multitasking phenomena in laboratory tasks, such as higher dual-task interference for structurally overlapping tasks [46,47], and in real-world scenarios, such as the effects of phone dialing on speed control while driving [48,49]. In addition, these theories are supported by recent numerical and analytical work suggesting that even modest amounts of resource sharing between tasks can be sufficient to drastically limit the multitasking capacity of a neural system [50-52] and that this effect scales with the number of processing steps (layers) in the network [53]. Thus, even small amounts of representation sharing are sufficient to induce constraints on multitasking that may invite misinterpretation as a central bottleneck [54].…”
Section: Open Access (mentioning)
confidence: 99%
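The capacity limit quoted above has a compact graph formulation in the cited paper: tasks are edges of a bipartite network between input and output nodes, and a set of tasks can run in parallel only if those edges form an induced matching (no two selected edges share, or are connected through, a node). The sketch below is a loose illustration of that idea, not the authors' code; the random graph size, edge density, and the greedy heuristic (only a lower bound, since maximum induced matching is NP-hard in general) are all assumptions.

```python
# Illustrative sketch: estimate multitasking capacity as the size of a
# greedy induced matching in a random bipartite task graph.
import random

def greedy_induced_matching(edges):
    """Greedily pick task edges so that no two picked edges share an
    endpoint and no graph edge connects endpoints of two picked edges
    (the induced-matching condition)."""
    picked = []
    blocked = set()  # nodes that may no longer touch a picked edge
    for (u, v) in edges:
        if u in blocked or v in blocked:
            continue
        picked.append((u, v))
        # Block both endpoints and every neighbour of either endpoint.
        for (a, b) in edges:
            if a == u:
                blocked.add(b)
            if b == v:
                blocked.add(a)
        blocked.update((u, v))
    return picked

# Random bipartite task graph: n inputs, n outputs, edge density p (assumed).
random.seed(0)
n, p = 50, 0.1
edges = [(i, n + j) for i in range(n) for j in range(n) if random.random() < p]
capacity = len(greedy_induced_matching(edges))
print(f"{len(edges)} tasks, greedy parallel capacity ~ {capacity}")
```

Rerunning this with larger p reproduces the qualitative effect the quote describes: as edge density (resource sharing) grows, the number of tasks rises quickly while the induced matching, and hence the set of concurrently executable tasks, stays small.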
“…can be recombined to form another (interfering) task [51,52]. However, despite growing empirical and quantitative support, multiple-resource theories lack a principled explanation for why a neural system, such as the human brain, would rely on shared resources between tasks at all, given the constraints on multitasking that this imposes.…”
Section: Open Access (mentioning)
confidence: 99%
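The recombination argument in this quote reduces to a simple membership test: two tasks interfere when they share an input or output node, or when swapping their endpoints yields a pathway the network has also learned. A minimal, purely illustrative sketch follows; the Stroop-like task names are invented for the example.

```python
# Illustrative test for task interference via recombination: tasks are
# (input, output) pairs, and two tasks interfere when a recombined mapping
# (x -> v) or (y -> u) is itself a learned task, since activating both
# pathways at once can then drive the recombined task.
def interferes(t1, t2, task_set):
    (x, u), (y, v) = t1, t2
    if x == y or u == v:  # shared input or output node
        return True
    return (x, v) in task_set or (y, u) in task_set

tasks = {("colour", "name"), ("word", "name"), ("word", "point")}
# True: the recombination ("word", "name") is itself a learned task.
print(interferes(("colour", "name"), ("word", "point"), tasks))
```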
“…Moreover, in support of the link between multitasking and performance generalisability, prior work has shown that neural networks that allow overlap between task representations show both decrements in multitasking performance and more rapid acquisition of new tasks. Importantly, representational sharing rapidly attenuates multitasking performance independent of network size (Petri et al., 2021), perhaps accounting for how a system as complex as the human brain can still show striking multitasking limitations.…”
Section: Introduction (mentioning)
confidence: 99%
“…Here we explore topology attacks and injection attacks on deep networks, motivated by the topological limits to the parallel processing capability of multitask networks [16]. Deep networks can reach their maximum working capability by removing nodes, a technique described as dropout [17].…”
Section: Introduction (mentioning)
confidence: 99%
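For reference, the node-removal technique named at the end of this quote, dropout, is commonly implemented as "inverted dropout": units are zeroed at random during training and the survivors rescaled so that no correction is needed at test time. A minimal NumPy sketch, with the rate and toy shapes as assumptions:

```python
# Minimal inverted-dropout sketch (NumPy); illustrative only.
import numpy as np

def dropout(activations, rate=0.5, training=True, seed=None):
    """Zero a fraction `rate` of units during training and rescale the
    survivors by 1/(1-rate), keeping the expected activation unchanged
    so the network needs no rescaling at inference time."""
    if not training or rate == 0.0:
        return activations
    rng = np.random.default_rng(seed)
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

h = np.ones((2, 8))                  # a toy hidden-layer activation
print(dropout(h, rate=0.5, seed=0))  # about half the units zeroed, rest scaled to 2.0
```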