Fracton phases of matter feature local excitations with restricted mobility. Despite substantial theoretical progress, fracton phases still lack conclusive experimental evidence. We discuss a simple and experimentally accessible realization of fracton physics. We note that superfluid vortices form a Hamiltonian system that conserves the total dipole moment and the trace of the quadrupole moment of vorticity, thereby establishing a relation to a traceless scalar charge theory in two spatial dimensions. Next, we consider the limit of a large number of vortices and show that the emergent vortex hydrodynamics also conserves these moments. Finally, we show that the motion of vortices on curved surfaces agrees with that of fractons, opening a route to the experimental study of the interplay between fracton physics and curved space. Our conclusions also apply to charged particles in a strong magnetic field.
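For orientation, the conservation laws the abstract invokes are already visible in the standard Kirchhoff point-vortex model; the notation below is ours and is a sketch, not a quotation from the paper. For vortices of circulation \Gamma_i at positions \mathbf{r}_i = (x_i, y_i),

H = -\frac{1}{4\pi}\sum_{i \neq j} \Gamma_i \Gamma_j \ln|\mathbf{r}_i - \mathbf{r}_j|, \qquad \Gamma_i\,\dot{x}_i = \frac{\partial H}{\partial y_i}, \qquad \Gamma_i\,\dot{y}_i = -\frac{\partial H}{\partial x_i},

and the dynamics conserve

Q = \sum_i \Gamma_i, \qquad \mathbf{D} = \sum_i \Gamma_i\,\mathbf{r}_i, \qquad T = \sum_i \Gamma_i\,|\mathbf{r}_i|^2.

Conservation of \mathbf{D} follows from the translational invariance of H and conservation of T from its rotational invariance; identifying \Gamma_i with the fracton charge matches the conserved charge, dipole moment, and trace of the quadrupole moment of the traceless scalar charge theory.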
Deep neural networks are notorious for defying theoretical treatment. However, when the number of parameters in each layer tends to infinity, the network function is a Gaussian process (GP) and a quantitatively predictive description becomes possible. The Gaussian approximation makes it possible to formulate criteria for selecting hyperparameters, such as the variances of weights and biases, as well as the learning rate. These criteria rely on the notion of criticality defined for deep neural networks. In this work we describe a new way to diagnose this criticality, both theoretically and empirically. To that end, we introduce partial Jacobians of a network, defined as derivatives of preactivations in layer l with respect to preactivations in layer l₀ ≤ l. These quantities are particularly useful when the network architecture involves many different layers. We discuss various properties of the partial Jacobians, such as their scaling with depth and their relation to the neural tangent kernel (NTK). We derive recurrence relations for the partial Jacobians and use them to analyze the criticality of deep MLP networks with (and without) LayerNorm. We find that the normalization layer changes the optimal values of the hyperparameters and the critical exponents. We argue that LayerNorm is more stable when applied to preactivations rather than activations, due to the larger correlation depth.
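As a concrete illustration of the empirical diagnostic, the sketch below estimates a partial Jacobian norm for a randomly initialized MLP using PyTorch autograd. The tanh activation, the hyperparameter values, the helper names (preactivation, tail), and the normalization (1/N_l) Σ_ij (∂h^l_i/∂h^{l₀}_j)² are illustrative assumptions, not the paper's exact setup.

# Minimal sketch: estimating a partial Jacobian norm J(l0, l) for a
# random deep MLP at initialization. Activation, hyperparameters, and
# normalization are assumptions chosen for illustration.
import torch

torch.manual_seed(0)
width, depth = 512, 10
sigma_w, sigma_b = 1.0, 0.0  # weight/bias std; assumed near the tanh critical line

# Random weights at initialization, scaled so preactivations stay O(1).
weights = [torch.randn(width, width) * sigma_w / width**0.5 for _ in range(depth)]
biases  = [torch.randn(width) * sigma_b for _ in range(depth)]

def preactivation(x, upto):
    """Preactivation h^upto of the MLP (layers are 1-indexed)."""
    h = x @ weights[0].t() + biases[0]
    for W, b in zip(weights[1:upto], biases[1:upto]):
        h = torch.tanh(h) @ W.t() + b
    return h

l0, l = 2, 8
x = torch.randn(width)
h0 = preactivation(x, l0).detach().requires_grad_(True)

def tail(h):
    """Map preactivations h^{l0} -> h^{l}."""
    for W, b in zip(weights[l0:l], biases[l0:l]):
        h = torch.tanh(h) @ W.t() + b
    return h

# Full (width x width) partial Jacobian dh^l_i / dh^{l0}_j, then its
# squared Frobenius norm averaged over the output layer width.
J = torch.autograd.functional.jacobian(tail, h0)
jac_norm = (J**2).sum().item() / width
print(f"partial Jacobian norm J({l0},{l}) ~ {jac_norm:.4f}")

Averaging over several seeds approximates the expectation over initializations. The diagnostic the abstract describes is then that at criticality this norm stays O(1) as l − l₀ grows, whereas away from criticality it grows or decays exponentially with depth.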