Abstract-Sequential data labeling is a fundamental task in machine learning applications, with speech and natural language processing, activity recognition in video sequences, and biomedical data analysis being characteristic examples. The conditional random field (CRF), a log-linear model representing the conditional distribution of the observation labels, is one of the most successful approaches for sequential data labeling and classification, and has lately received significant attention in machine learning, as it achieves superb prediction performance in a variety of scenarios. Nevertheless, existing CRF formulations can capture only one- or few-timestep interactions, and neglect higher-order dependencies, which are potentially useful in many real-life sequential data modeling applications. To resolve these issues, in this paper we introduce a novel CRF formulation, based on the postulation of an energy function which entails infinitely-long time dependencies between the modeled data. Building blocks of our novel approach are: (i) the sequence memoizer, a recently proposed nonparametric Bayesian approach for modeling label sequences with infinitely-long time dependencies; and (ii) a mean-field-like approximation of the model marginal likelihood, which allows for the derivation of computationally efficient inference algorithms for our model. The efficacy of the so-obtained infinite-order CRF (CRF^∞) model is experimentally demonstrated.

Index Terms-Conditional random field, sequential data, sequence memoizer, mean-field principle.
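For context, the log-linear CRF model referred to in the abstract can be illustrated by the standard linear-chain formulation sketched below; the notation (observation sequence x, label sequence y, feature functions f_k, weights λ_k) is generic background and not this paper's own notation.

% Standard linear-chain CRF (illustrative sketch, generic notation):
% conditional distribution of a label sequence y given observations x.
p(\mathbf{y} \mid \mathbf{x}) \;=\; \frac{1}{Z(\mathbf{x})}
  \exp\!\left( \sum_{t=1}^{T} \sum_{k} \lambda_k \, f_k(y_{t-1}, y_t, \mathbf{x}, t) \right),
\qquad
% Z(x) is the partition function, summing over all possible label sequences y'.
Z(\mathbf{x}) \;=\; \sum_{\mathbf{y}'} \exp\!\left( \sum_{t=1}^{T} \sum_{k} \lambda_k \, f_k(y'_{t-1}, y'_t, \mathbf{x}, t) \right)

In this conventional form the potentials couple only adjacent labels y_{t-1} and y_t, which is precisely the one- or few-timestep limitation the abstract points out; the proposed CRF^∞ model instead postulates an energy function whose potentials depend on the entire label history.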