Abstract. There are currently several communities working on algorithms for classification of high dimensional data. This work develops a class of variational algorithms that combine recent ideas from spectral methods on graphs with nonlinear edge/region detection methods traditionally used in the PDE-based imaging community. The algorithms are based on the Ginzburg-Landau functional, which has classical PDE connections to total variation minimization. Convex-splitting algorithms allow us to quickly find minimizers of the proposed model and take advantage of fast spectral solvers for linear graph-theoretic problems. We present diverse computational examples involving both basic clustering and semi-supervised learning for different applications. Case studies include feature identification in images, segmentation in social networks, and segmentation of shapes in high dimensional datasets.

Key words. Nyström extension, diffuse interfaces, image processing, high dimensional data

AMS subject classifications. Insert AMS subject classifications.

This work brings together ideas from different communities, and for this reason we review various components of the algorithms in order to make the paper accessible to readers familiar with either the PDE-based or the graph-theoretic approach. In Section 1 we review diffuse interface methods in Euclidean space and convex splitting methods for minimization. These well-known constructions make heavy use of the classical Laplace operator, and our new algorithms extend this idea to a more general graph Laplacian. Section 2 reviews the notation and definitions of the graph Laplacian, and this discussion contains a level of detail appropriate for readers less familiar with this machinery.
Included in this section is a review of segmentation using spectral clustering and a discussion of various normalization conventions for these linear operators on graphs, in connection with real world problems such as machine learning in image analysis. The rest of the paper explains the main computational algorithm and presents different examples involving both sparse and non-sparse connectivity of the graph. The algorithms have a multi-scale flavor due to (a) the different scales inherent in diffuse interface methods and (b) the role of scale in the eigenfunctions and eigenvalues of the graph Laplacian.

1. Background on diffuse interfaces, image processing, and convex splitting methods. Diffuse interface models in Euclidean space are often built around the Ginzburg-Landau functional
$$GL(u) = \frac{\epsilon}{2}\int |\nabla u|^2\, dx + \frac{1}{\epsilon}\int W(u)\, dx,$$
where $W$ is a double well potential. For example, $W(u) = \frac{1}{4}(u^2 - 1)^2$ has minimizers at plus and minus one. The operator $\nabla$ denotes the spatial gradient operator, and the first term in $GL$ is $\epsilon/2$ times the $H^1$ semi-norm of $u$. The small parameter $\epsilon$
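To make the graph analogue concrete, the following sketch (our own minimal illustration, not the paper's implementation; the path-graph weights, the value of eps, and the two-cluster labeling are all assumptions chosen for clarity) evaluates a discrete Ginzburg-Landau energy in which the Dirichlet term $\frac{\epsilon}{2}\int|\nabla u|^2$ is replaced by the quadratic form of an unnormalized graph Laplacian $L = D - W$:

```python
import numpy as np

def graph_laplacian(W_adj):
    """Unnormalized graph Laplacian L = D - W, with D the degree matrix."""
    D = np.diag(W_adj.sum(axis=1))
    return D - W_adj

def double_well(u):
    """Double well potential W(u) = (1/4)(u^2 - 1)^2, minimized at u = +/-1."""
    return 0.25 * (u**2 - 1.0)**2

def gl_energy(u, L, eps):
    """Graph analogue of GL(u): (eps/2) <u, L u> + (1/eps) sum_i W(u_i)."""
    return 0.5 * eps * u @ (L @ u) + double_well(u).sum() / eps

# Tiny example: a 4-node path graph labeled into two clusters.
W_adj = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
L = graph_laplacian(W_adj)
u = np.array([1.0, 1.0, -1.0, -1.0])  # one "interface" on the middle edge

print(gl_energy(u, L, eps=1.0))  # potential term is 0 since u is in {-1, 1}
```

The potential term vanishes here because every entry of u sits at a well minimum, so the entire energy comes from the single edge cut by the interface; smoother transitions would trade Dirichlet energy against potential energy, which is the balance the diffuse interface parameter $\epsilon$ controls.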