Grid partitioning of the input space causes an exponential rise in the number of rules in the adaptive network-based fuzzy inference system (ANFIS) and in patch learning (PL) as the number of features increases, which imposes a huge computational load and deteriorates interpretability. An improved PL (iPL) is put forward for training each sub-fuzzy system to overcome this rule-explosion problem. In iPL, the input space is partitioned by fuzzy c-means (FCM) clustering, which avoids the heavy computational complexity caused by the large number of rules generated in high-dimensional problems. This paper presents two novel classifiers, FCM-clustering-based deep patch learning with improved high-level interpretability for classification problems, named HI-FCMDPL-CP1 and HI-FCMDPL-CP2. The proposed classifiers have two characteristics: one is a stacked deep structure of component iPL fuzzy classifiers for high accuracy, and the other is the use of the maximal information coefficient (MIC) and the maximum misclassification threshold (MMT) to optimize the deep structure. High interpretability is achieved at each layer through FCM clustering and a concise structure, even for large input dimensionality. The MMT, random input (RI) and parameter sharing (PS) are integrated to improve classification accuracy without losing interpretability. Experiments on several real-world datasets demonstrate that MIC, RI and PS in HI-FCMDPL-CP1 and HI-FCMDPL-CP2 are effective individually, and that integrating all three further improves classification performance. A more concise deep fuzzy system is obtained, with the number of features and the number of fuzzy rules reduced simultaneously. Furthermore, MIC, RI and PS are used to assess the advantages and disadvantages of serial versus parallel structures, avoiding a subjective choice between these two categories.

INDEX TERMS Fuzzy c-means (FCM) clustering, maximal information coefficient (MIC), random input (RI), deep patch learning classifier, interpretability.
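
To make the rule-explosion contrast concrete, the sketch below compares grid partitioning, whose rule count grows as p^d for p fuzzy sets per feature and d features, with FCM-based partitioning, where the rule count equals the chosen number of clusters c regardless of dimensionality. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: the function name fcm, the toy data, and all parameter values are hypothetical choices made only for the example.

import numpy as np

def fcm(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    # Minimal fuzzy c-means: X has shape (n_samples, n_features), c clusters.
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)              # memberships sum to 1 per sample
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = d ** (-2.0 / (m - 1.0))            # standard FCM membership update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U

# Each FCM cluster induces one fuzzy rule, so the rule count stays at c
# no matter how many features there are; grid partitioning with p fuzzy
# sets per feature would instead need p**d rules for d features.
X = np.random.rand(200, 10)                        # hypothetical toy data: 10 features
centers, U = fcm(X, c=5)
print(centers.shape)                               # (5, 10): 5 rules, not p**10

In this reading, each cluster center serves as the prototype of one rule antecedent at a layer, which is what keeps the per-layer rule base of each component iPL classifier concise.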