As traditional machine learning tools are increasingly applied to science and engineering applications, physics-informed methods have emerged as effective tools for endowing inferences with properties essential for physical realizability. While promising, these methods generally enforce physics only weakly, via penalization. To enforce physics strongly, we turn to the exterior calculus framework underpinning combinatorial Hodge theory and physics-compatible discretization of partial differential equations (PDEs). Historically, these two fields have remained largely distinct, as graphs are strictly topological objects lacking the metric information fundamental to PDE discretization. We present an approach in which this missing metric information is learned from data, using graphs as coarse-grained mesh surrogates that inherit desirable conservation and exact sequence structure from combinatorial Hodge theory. The resulting data-driven exterior calculus (DDEC) may be used to extract structure-preserving surrogate models with mathematical guarantees of well-posedness. The approach admits a PDE-constrained optimization training strategy which guarantees that machine-learned models enforce physics to machine precision, even for poorly trained models or small-data regimes. We provide an analysis of the method for a class of models designed to reproduce nonlinear perturbations of elliptic problems, and give examples of learning H(div)/H(curl) systems representative of subsurface flows and electromagnetics.
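To make the "exact sequence structure" concrete: on a graph or cell complex, the combinatorial exterior derivatives are just signed incidence matrices, and the exactness property d∘d = 0 holds identically, independent of any metric. The sketch below (a minimal illustration, not the paper's implementation) builds the node-to-edge and edge-to-face incidence matrices for a single oriented triangle and checks that their composition vanishes; it is exactly this topological structure that DDEC preserves while the metric content (Hodge stars) is learned from data.

```python
import numpy as np

# Oriented triangle: nodes {0, 1, 2}, edges e0=(0->1), e1=(1->2), e2=(0->2),
# one face f = (0, 1, 2). The exterior derivatives are incidence matrices.

# d0: maps node values (0-forms) to edge differences (1-forms).
# Row per edge; -1 at the tail node, +1 at the head node.
d0 = np.array([
    [-1,  1,  0],   # e0 = (0 -> 1)
    [ 0, -1,  1],   # e1 = (1 -> 2)
    [-1,  0,  1],   # e2 = (0 -> 2)
])

# d1: maps edge values (1-forms) to face circulations (2-forms).
# The boundary of f traverses e0 and e1 forward and e2 backward.
d1 = np.array([
    [1, 1, -1],     # boundary of face (0, 1, 2)
])

# Exact sequence property: d1 @ d0 == 0, purely topological (no metric needed).
print(d1 @ d0)          # -> [[0 0 0]]

# The same matrices encode discrete conservation: for edge fluxes q,
# d0.T @ q is the net flux (divergence) accumulated at each node.
q = np.array([2.0, 2.0, -1.0])
print(d0.T @ q)
```

In DDEC, these incidence matrices are fixed by the graph topology and guarantee properties like mass conservation exactly, while learnable (metric-dependent) operators play the role of discrete Hodge stars.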