With the advance of information and communication technologies, it is critical to improve the efficiency and accuracy of modern data processing techniques. The past decade has witnessed tremendous technical advances in sensor networks, the Internet/Web of Things, cloud computing, mobile/embedded computing, spatial/temporal data processing, and big data, and these technologies have provided new opportunities and solutions for data processing. Big data is an emerging paradigm applied to datasets whose size is beyond the ability of commonly used software tools to capture, manage, and process within a tolerable elapsed time. Such datasets come from various sources (Variety) and are often unstructured, including social media, sensors, scientific applications, surveillance, video and image archives, Internet texts and documents, Internet search indexing, medical records, business transactions, and web logs; they are of large size (Volume) with fast data in/out (Velocity). More importantly, big data has to be of high value (Value), and trust must be established in it for business decision-making (Veracity). Various technologies are being discussed to support the handling of big data, such as massively parallel processing databases, scalable storage systems, cloud computing platforms, and MapReduce. Big data is more than simply a matter of size; it is an opportunity to find insights in new and emerging types of data and content, to make business more agile, and to answer questions that were previously considered beyond our reach.

This special issue aims to present emerging issues in big data research and approaches to addressing them. Original research articles were solicited in all aspects, including theoretical studies, practical applications, and experimental prototypes. The submitted manuscripts were reviewed by experts from both academia and industry. After two rounds of reviewing, the highest-quality manuscripts were accepted for this special issue. In total, we received 20 manuscripts, of which 11 papers were accepted. Five papers are extended from the SKG2015 conference, with about 50% new content. This special issue is published in Concurrency and Computation: Practice and Experience.

Z. Xu et al. [1] propose a knowledge base model to detect and describe real-time urban emergency events. The crowdsourcing-based knowledge base model uses information drawn from social media. X. Lin et al. [2] introduce a comprehensive protection scheme: it takes into account the DG's own characteristics and its ability to support the local load, adopts different protection strategies accordingly, and realizes fault isolation and island division through coordination between the protection and the automatic devices. J. Zheng [3] extracts feature information and mines association rules to identify and mark unknown protocols using a machine learning mechanism. The unknown protocol in a specific environment was found and analyzed by marking t...