Introduction

Recently, the integration of big data and machine learning technologies has emerged as an exciting new paradigm for industry and everyday life. These technologies are fast-growing areas of research that span the fields of Information and Communication Technologies (ICT) for future computing (FC). They have become a preferred approach to solving complex problems in diverse fields such as artificial intelligence, large-scale analysis systems, natural language processing, pattern recognition, video analysis, and telematics. For instance, AlphaGo, developed by Google DeepMind, changed the prevailing paradigm of machine learning on big data after its decisive win in the Google DeepMind Challenge. Machine learning is the art and science of designing computer algorithms that learn from data, and AlphaGo learns and improves its performance by drawing on a massive dataset of games played by humans. Likewise, many issues must be addressed to realize and provide effective services and applications, and considerable attention and effort have been devoted to machine learning on big data for FC. These fields of research call for contributions to theoretical work presenting advanced technologies and concepts, analyses and reports of experiences related to the implementation and application of theories, and tutorials on emerging trends. As such, the purpose of this special issue is to provide an overview of the state-of-the-art technologies and solution guidelines for these areas of research.

This special issue covers pure research and applications within novel scopes related to machine learning on big data for future computing. In addition, it deals