The era of AI large models, represented by GPT-4, is accelerating and profoundly transforming many aspects of social life. Large deep-learning models with massive numbers of parameters offer an effective approach to overcoming the bottleneck of intelligent learning from complex big data. While these large models exhibit powerful learning capabilities, they also face the challenges of high energy consumption and heavy computational demands. Research indicates that training a single large AI model produces, on average, carbon emissions roughly equivalent to the lifetime emissions of five cars, and that the computational power needed to drive large AI models doubles every 3.5 months.

As a beneficial complement, law-embedded cross-scale systematic learning offers another effective approach to the challenges of intelligent learning from complex big data. Cross-scale systematic learning has achieved notable success in several professional domains; for example, the 2021 Nobel Prize in Physics was awarded for cross-scale modeling of complex physical systems and its application to global climate change. Chinese scientists have also pioneered research on cross-scale learning of complex systems: the dark-matter big-data analysis team at Beihang University has used cross-scale systematic learning methods to achieve real-time learning of critical data within petabyte-scale datasets, with precision at the level of one part in ten thousand.

This paper analyzes the fundamental principles of cross-scale systematic learning at the micro, meso, and macro scales, establishes a universal method for law-embedded cross-scale systematic learning, and demonstrates typical applications using social big data. Applications of cross-scale systematic learning in areas such as epidemic prevention and control and public opinion analysis have achieved remarkable results, providing new successful examples for the digital, networked, and intelligent development of China's social governance.