Summary
New Science paradigms have recently emerged that promote the open publication of scientific findings as well as multi-disciplinary, collaborative approaches to scientific experimentation. These approaches can address modern scientific challenges but must deal with the large quantities of data produced by industrial and scientific experiments. These data, so-called Big Data, require the introduction of new computer science systems to help scientists cooperate, extract information, and possibly produce new knowledge out of the data. E-Infrastructures are distributed computer systems that foster collaboration between users and can embed distributed and parallel processing systems to manage Big Data. However, in order to meet the requirements of modern Science, e-Infrastructures in turn impose several requirements on computational systems, eg, being economically sustainable, managing community-provided processes, using standard representations for processes and data, managing large and heterogeneously represented data, supporting reproducible Science, collaborative experimentation, and cooperative online environments, and managing security and privacy for data and services. In this paper, we present a cloud computing system (gCube DataMiner) that meets these requirements and operates in an e-Infrastructure, while sharing characteristics with state-of-the-art cloud computing systems. To this aim, DataMiner adopts the Web Processing Service standard of the Open Geospatial Consortium and introduces features such as collaborative experimental spaces and the automatic installation of processes and services, on top of a flexible and sustainable cloud computing architecture. We compare DataMiner with another mature cloud computing system and highlight the benefits our system brings, the requirements of the new Science paradigms it satisfies, and the applications that can be developed on top of it.