Tackling semi-supervised learning problems with graph-based methods has become a trend in recent years, since graphs can represent all kinds of data and provide a suitable framework for studying continuum limits, for example, of differential operators. A popular strategy here is $p$-Laplacian learning, which imposes a smoothness condition on the sought inference function on the set of unlabeled data. For $p<\infty$, continuum limits of this approach were studied using tools from $\varGamma$-convergence. For the case $p=\infty$, which is referred to as Lipschitz learning, continuum limits of the related infinity Laplacian equation were studied using the concept of viscosity solutions. In this work, we prove continuum limits of Lipschitz learning using $\varGamma$-convergence. In particular, we define a sequence of functionals which approximate the largest local Lipschitz constant of a graph function and prove $\varGamma$-convergence in the $L^{\infty}$-topology to the supremum norm of the gradient as the graph becomes denser. Furthermore, we show compactness of the functionals, which implies convergence of minimizers. In our analysis we allow a varying set of labeled data which converges to a general closed set in the Hausdorff distance. We apply our results to nonlinear ground states, i.e., minimizers with constrained $L^p$-norm, and, as a by-product, prove convergence of graph distance functions to geodesic distance functions.
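To fix ideas, the following is a minimal sketch of the kind of discrete-to-continuum statement described above; the symbols $\Omega_n$ (graph vertices), $\eta$ (weight kernel), $\varepsilon_n$ (graph length scale), and $\sigma_\eta$ (kernel-dependent constant) are illustrative assumptions and need not match the precise definitions used in the paper. A functional approximating the largest local Lipschitz constant of a graph function $u\colon \Omega_n \to \mathbb{R}$ may take the form
\[
  \mathcal{E}_n(u) \;=\; \max_{x,y\in\Omega_n} \eta_{\varepsilon_n}\bigl(\lvert x-y\rvert\bigr)\,\lvert u(x)-u(y)\rvert,
  \qquad
  \eta_{\varepsilon}(t) \;=\; \frac{1}{\varepsilon}\,\eta\!\Bigl(\frac{t}{\varepsilon}\Bigr),
\]
and the $\varGamma$-limit in the $L^{\infty}$-topology, as the graph becomes denser, would then be a multiple of the supremum norm of the gradient,
\[
  \mathcal{E}_\infty(u) \;=\; \sigma_\eta\,\lVert \nabla u\rVert_{L^{\infty}},
\]
subject to the constraints imposed on the (Hausdorff-converging) set of labeled data.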