In this paper we prove that in the high temperature region of the Sherrington-Kirkpatrick model, for a typical realization of the disorder, the weighted average of spins $\sum_{i\le N} t_i\sigma_i$ will be approximately Gaussian provided that $\max_{i\le N}|t_i|/\sum_{i\le N}t_i^2$ is small.
We prove new probabilistic upper bounds on the generalization error of complex classifiers that are combinations of simple classifiers. Such combinations could be implemented by neural networks or by voting methods of combining the classifiers, such as boosting and bagging. The bounds are in terms of the empirical distribution of the margin of the combined classifier. They are based on methods of the theory of Gaussian and empirical processes (comparison inequalities, the symmetrization method, concentration inequalities), and they improve previous results of Bartlett (1998) on bounding the generalization error of neural networks in terms of the ℓ1-norms of the weights of neurons, and of Schapire, Freund, Bartlett and Lee (1998) on bounding the generalization error of boosting. We also obtain rates of convergence in Lévy distance of the empirical margin distribution to the true margin distribution, uniformly over classes of classifiers, and prove the optimality of these rates.
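As a concrete illustration of the central object in this abstract, the following minimal sketch computes the empirical margin distribution of a voting classifier on a toy dataset. The base classifiers, data, and weights here are all hypothetical stand-ins; only the definition of the margin (for labels $y \in \{-1,+1\}$ and a convex combination $f = \sum_t a_t h_t$, the margin of an example is $y f(x)$) is taken from the standard setup the abstract refers to.

```python
import numpy as np

# Toy sketch: empirical margin distribution of a voting classifier.
# For base classifiers h_1..h_T combined with convex weights a_1..a_T,
# the margin of an example (x, y), y in {-1, +1}, is y * sum_t a_t h_t(x).
# The bounds in the abstract are stated in terms of the empirical
# distribution of these margins over the training sample.

rng = np.random.default_rng(0)

n, T = 200, 5
X = rng.normal(size=(n, 2))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=n))

# Illustrative base classifiers: signs of random linear functionals.
directions = rng.normal(size=(T, 2))
H = np.sign(X @ directions.T)        # shape (n, T), entries in {-1, +1}

weights = np.full(T, 1.0 / T)        # convex combination of base classifiers
f = H @ weights                      # combined classifier, values in [-1, 1]
margins = y * f                      # empirical margins, values in [-1, 1]

def empirical_margin_cdf(margins, delta):
    """Fraction of examples whose margin is at most delta."""
    return float(np.mean(margins <= delta))

# Fraction of examples classified with non-positive margin (errors or ties):
print(empirical_margin_cdf(margins, 0.0))
```

Margin-based bounds of the kind described above control the true error by the value of this empirical CDF at a threshold δ plus a complexity term that improves as δ grows.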
We construct data-dependent upper bounds on the risk in function learning problems. The bounds are based on local norms of the Rademacher process indexed by the underlying function class, and they require neither prior knowledge about the distribution of the training examples nor any specific properties of the function class. Using Talagrand-type concentration inequalities for empirical and Rademacher processes, we show that the bounds hold with high probability, with the probability of violation decreasing exponentially fast as the sample size grows. In typical situations frequently encountered in the theory of function learning, the bounds give nearly optimal rates of convergence of the risk to zero.
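The quantity at the heart of this abstract can be made concrete with a small Monte Carlo computation. The sketch below (an illustration, not the paper's method) estimates the empirical Rademacher complexity of a finite class of threshold functions on a fixed sample; the sample, the class, and all parameters are hypothetical choices for the example.

```python
import numpy as np

# Monte Carlo estimate of the empirical Rademacher complexity of a
# finite function class F on a fixed sample x_1..x_n:
#   R_hat(F) = E_eps sup_{f in F} (1/n) sum_i eps_i f(x_i),
# where eps_1..eps_n are i.i.d. Rademacher signs (+1 or -1 with prob 1/2).

rng = np.random.default_rng(1)

n = 100
x = rng.uniform(-1.0, 1.0, size=n)

# A small finite class: threshold functions x -> sign(x - t).
thresholds = np.linspace(-1.0, 1.0, 21)
F = np.sign(x[None, :] - thresholds[:, None])    # shape (|F|, n)

def rademacher_complexity(F, n_mc=2000, rng=rng):
    """Average over n_mc random sign vectors of the sup over the class."""
    n = F.shape[1]
    eps = rng.choice([-1.0, 1.0], size=(n_mc, n))
    # (eps @ F.T)[k, j] = sum_i eps_{k,i} f_j(x_i); divide by n, sup over j.
    sups = np.max(eps @ F.T / n, axis=1)
    return float(sups.mean())

print(rademacher_complexity(F))
```

Risk bounds of the type described above replace such global complexities with localized (restricted-radius) versions, which is what yields the faster rates.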
In this paper we prove that the support of a random measure on the unit ball of a separable Hilbert space that satisfies the Ghirlanda-Guerra identities must be ultrametric with probability one. This implies the Parisi ultrametricity conjecture in mean-field spin glass models, such as the Sherrington-Kirkpatrick and mixed p-spin models, for which the Gibbs measures are known to satisfy the Ghirlanda-Guerra identities in the thermodynamic limit.
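For reference, the two properties named in this abstract can be written out explicitly; the following are the standard formulations (added here as a gloss, not quoted from the abstract). With $\langle\cdot\rangle$ denoting the average with respect to replicas and $R_{l,l'}$ the overlap of replicas $l$ and $l'$, the Ghirlanda-Guerra identities state that for any $n$ and any bounded measurable function $f$ of the overlaps of $n$ replicas,

```latex
% Ghirlanda-Guerra identities:
\mathbb{E}\langle f\, R_{1,n+1}\rangle
  = \frac{1}{n}\,\mathbb{E}\langle f\rangle\,\mathbb{E}\langle R_{1,2}\rangle
  + \frac{1}{n}\sum_{l=2}^{n}\mathbb{E}\langle f\, R_{1,l}\rangle,
% and ultrametricity of the support asserts:
\mathbb{E}\bigl\langle
  \mathbb{1}\{ R_{1,2} \ge \min(R_{1,3}, R_{2,3}) \}
\bigr\rangle = 1 .
```

The second display says that among any three points sampled from the measure, the smallest two of the three pairwise overlaps are equal, which is the ultrametric tree structure of the Parisi conjecture.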
In an important recent paper [2], S. Franz and M. Leone prove rigorous lower bounds for the free energy of the diluted p-spin model and the K-sat model at any temperature. We show that the results for these two models are consequences of a single general principle. Our calculations are significantly simpler than those of [2], even in the replica-symmetric case.