We combine constrained literals for model representation with key concepts from first-order superposition and propositional conflict-driven clause learning (CDCL) to create Non-Redundant Clause Learning (NRCL), a new calculus deciding the Bernays-Schönfinkel fragment. Our calculus uses first-order literals constrained by disequations between tuples of terms for compact model representation. From superposition, NRCL inherits the abstract redundancy criterion and the monotone model operator. From CDCL, it takes the dynamic, conflict-driven search for an atom ordering that induces a model. As a result, NRCL can effectively find a clause that is false modulo the current model candidate. This false clause guides the derivation of a first-order ordered resolvent that is never redundant. Similar to 1UIP learning in CDCL, the learned resolvent induces backtracking and, by blocking the previous conflict state via propagation, enforces progress towards finding a model or a refutation. The non-redundancy result also implies that NRCL can generate only finitely many clauses on the Bernays-Schönfinkel fragment, which serves as an argument for termination.
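To illustrate the model representation, consider the following constrained literal (a minimal sketch; the notation with a semicolon separating literal and constraint is an assumption for illustration, not necessarily the calculus's exact syntax):
\[
  P(x, y) \;;\; (x, y) \neq (z, z)
\]
Here the disequation between the term tuples $(x, y)$ and $(z, z)$ excludes every ground instance of $P(x, y)$ in which the two arguments coincide. Over a domain of constants $\{a, b\}$, this single constrained literal compactly denotes the instances $P(a, b)$ and $P(b, a)$, rather than enumerating them explicitly.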