In recent years, meta-learning has become a mainstream technique for few-shot learning (FSL) and has been widely applied, with strong results, in computer vision and image processing. Motivated by this empirical success, we are interested in applying meta-learning frameworks to few-shot learning tasks in NLP. However, because samples are sparse, sample-level comparisons are highly susceptible to interference from surface-level variation in expression, which leads to serious overfitting. To address the classification task, we propose a novel Adaptive Cross-Capsule Network (ACCN) for learning generalized representations. A dynamic routing technique, combined with the idea of a prototype network, aggregates the support set into a generalized representation of each category. The support set and the query set then interact fully and dynamically through a non-parametric cross-attention mechanism, capturing the essential semantic aspects of the query set. Experimental results show that the proposed ACCN adapts well to the intent classification task with additional categories, achieving state-of-the-art results on the FewRel dataset and performing significantly better than the original classification system on the HuffPost dataset. These results provide a crucial foundation for further study.
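To make the aggregation step concrete, the idea of routing the support set into a per-class prototype can be sketched as follows. This is a minimal illustration of capsule-style routing-by-agreement applied to prototype construction, not the paper's exact formulation; the function name `routed_prototype`, the embedding dimensions, and the fixed three routing iterations are illustrative assumptions.

```python
import numpy as np

def squash(s, eps=1e-8):
    # Capsule "squash" nonlinearity: keeps the direction of s,
    # maps its norm into (0, 1).
    norm2 = np.sum(s ** 2)
    return (norm2 / (1.0 + norm2)) * s / (np.sqrt(norm2) + eps)

def routed_prototype(support, n_iters=3):
    """Aggregate K support embeddings of shape (K, d) into one class
    prototype of shape (d,) by routing-by-agreement, so that support
    samples that agree with the emerging prototype are weighted up
    (instead of taking a plain mean, as in a vanilla prototype network)."""
    K, d = support.shape
    b = np.zeros(K)                       # routing logits, start uniform
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum()   # coupling coefficients (softmax)
        v = squash(c @ support)           # weighted sum -> squashed prototype
        b = b + support @ v               # agreement: reward aligned samples
    return v

# Illustrative usage with random "support embeddings" for one class.
rng = np.random.default_rng(0)
proto = routed_prototype(rng.normal(size=(5, 8)))  # -> shape (8,)
```

Compared with a simple class mean, the agreement update down-weights outlier support samples, which is one way to mitigate the sensitivity of sample-level comparisons under sparse data.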