The use of expert knowledge is inevitably afflicted with uncertainty for many reasons: expert knowledge may be imprecise, imperfect, or erroneous, for instance. If we ask several experts to label data (e.g., to assign class labels to given data objects, i.e., samples), we often observe that these experts make different, sometimes conflicting statements. Labeling data for classification tasks is a serious problem in many technical applications where it is rather easy to gather unlabeled data, but labeling requires substantial effort in terms of time and, consequently, money. In this article, we address the problem of combining several, potentially erroneous class labels. We assume that we have an ordinal class structure (i.e., three or more classes are ordered, such as "light", "medium-weight", and "heavy") and that only a few expert statements are available. We propose a novel combination rule, the Extended Imprecise Dirichlet Model Rule (EIDMR), which is based on a k-nearest-neighbor approach and Dirichlet distributions, i.e., second-order distributions for multinomial distributions. In addition, experts may assess the difficulty of the labeling task, and this assessment may optionally be considered in the combination. EIDMR is compared to other combination rules such as a standard Imprecise Dirichlet Model Rule, the Dempster-Shafer Rule, and Murphy's Rule. In our evaluation of EIDMR, we first use artificial data for which the data characteristics and true class labels are known. Then, we present results of a case study in which we classify low-voltage grids with Support Vector Machines (SVMs). Here, the task is to assess the expandability of these grids with additional photovoltaic generators (or other distributed generators) by assigning each grid to one of five ordinal classes. We show that our new EIDMR leads to better classifiers in cases where ordinal class labels are used and only a few uncertain expert statements are available.