2019
DOI: 10.1103/physreve.100.052608

Tuning and jamming reduced to their minima

Abstract: Inspired by protein folding, we smooth out the complex cost-function landscapes of two processes: the tuning of networks and the jamming of ideal spheres. In both processes, geometrical frustration plays a role: tuning pressure differences between pairs of target nodes far from the source in a flow network impedes tuning of nearby pairs more than the reverse process, while unjamming the system in one region can make it more difficult to unjam elsewhere. By modifying the cost functions to control the order in …
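As a rough illustration of the tuning setup described in the abstract (a minimal sketch under our own assumptions; the paper's actual cost function and flow-network model are more involved), the tuning cost can be written as a sum of squared deviations of pressure differences at target node pairs from their desired values:

```python
import numpy as np

def tuning_cost(pressures, targets):
    """Illustrative cost for tuning a flow network: sum of squared
    errors in the pressure differences across target node pairs.

    pressures : array of node pressures
    targets   : list of (node_i, node_j, desired_difference)
    """
    return sum((pressures[i] - pressures[j] - d) ** 2
               for i, j, d in targets)

# Toy example: two target pairs on a four-node network.
p = np.array([1.0, 0.4, 0.2, 0.0])
targets = [(0, 1, 0.6), (2, 3, 0.2)]
print(tuning_cost(p, targets))  # both targets satisfied, so the cost vanishes
```

Reweighting the individual terms of such a sum changes the landscape's topography without moving the global minima, which is the kind of modification the abstract alludes to.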

Cited by 17 publications (14 citation statements)
References 32 publications (44 reference statements)
“…Most notably, [16,17] show that curriculum can lead to faster learning in a simple setting, but the effects of curriculum on asymptotic generalisation and the dependence on task structure remain unclear. A hint that indeed curriculum learning might lead to statistically different minima comes from a connection between constraint-satisfaction problems and physics results on flow networks [18], but to our knowledge no direct result has been reported in the modern theoretical machine learning literature.…”
Section: Introduction
Mentioning confidence: 91%
“…Depending on the values of Γ_i, the topography of the loss function will change, but the loss function will still vanish at the same global minima, which are unaffected by the value of Γ_i. This transformation was motivated by recent work in which the topography of the loss function was changed to improve the tuning of physical flow networks [39]. Here, we use Γ_i to emphasize one class relative to the others for a period T, and cycle through all the classes in turn so the total duration of a cycle that passes through all classes is CT.…”
Section: Myrtle5 and CIFAR10 Phase Diagrams
Mentioning confidence: 99%
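The Γ_i transformation in the statement above can be sketched as follows; the function names and the cyclic emphasis schedule are illustrative assumptions, not the cited paper's implementation. The key property is that positive weights reshape the loss landscape without moving the global minima, since a weighted sum of non-negative per-class losses vanishes exactly when every term does:

```python
import numpy as np

def weighted_loss(per_class_losses, gammas):
    """Reweight non-negative per-class losses by factors Gamma_i.

    For any positive gammas, the weighted sum is zero iff each
    per-class loss is zero, so the global minima are unchanged.
    """
    return float(np.dot(gammas, per_class_losses))

def cyclic_gammas(step, num_classes, period, boost=10.0):
    """Emphasize one class at a time for `period` steps, cycling
    through all classes every num_classes * period steps."""
    g = np.ones(num_classes)
    g[(step // period) % num_classes] = boost
    return g

# Toy check: at a point where all per-class losses vanish,
# the reweighted loss stays zero regardless of the emphasis.
losses_at_minimum = np.zeros(3)
for step in range(6):
    assert weighted_loss(losses_at_minimum, cyclic_gammas(step, 3, 2)) == 0.0
```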
“…Here we follow a different approach to learning that exploits physical processes, involving simple and local rules, in lieu of complex ones inspired by neurons or non-local computer-science algorithms. In previous work, simulated and laboratory mechanical networks, and simulated flow networks, have been trained to perform desired tasks by adjusting their internal degrees of freedom [26][27][28][29][30][31][32][33][34][35][36][37][38][39]. This has been accomplished either by minimizing a global cost function [26][27][28][29][30] or by using local rules [31][32][33][34][35][36][37][38][39], in which each edge of the network adjusts some property, such as its mechanical stiffness, that we will refer to as a 'learning degree of freedom', in response to the stress on it.…”
Section: Introduction
Mentioning confidence: 99%