<p>The FAIR data principles are at the core of the OGC mission and are reflected in open geospatial standards and the open-data initiatives that use them. Although OGC is best known for technical interoperability, domain modelling and the semantic level play an essential role in how standards are defined and exploited. On the one hand, there is a growing number of specialised profiles and implementations that selectively use components of the OGC modular specification model. On the other hand, various domain ontologies already exist that enable a better understanding of the data. As multiple semantic representations can coexist, common data models support traversal across ontologies. Defining a service in the technical-semantic space requires fixing several flexibility points, including optional and mandatory elements, additional constraints and rules, and content such as the normalised vocabularies to be used.</p><p>The proposed solution, the OGC Definition Server, is a multi-purpose application built around a triple-store database engine, integrated with ingestion, validation, and entailment tools, and exposing customised endpoints. The models are available both in human-readable form and in machine-to-machine encodings. For manual processes, it supports understanding of the technical and semantic definitions of entities and the relationships between them; programmatic solutions benefit from a precise referencing system, validation, and entailment.</p><p>Currently, the OGC Definition Server hosts several types of definitions covering:</p><ul><li>a register of OGC bodies, assets, and their modules</li> <li>ontological common semantic models (e.g., for agriculture)</li> <li>dictionaries of subject domains (e.g., PipelineML Codelists)</li> </ul><p>In practice, this is a step forward in defining the bridge between conceptual and logical models. Concepts can be expressed as instances of various ontological classes and interpreted within multiple contexts, with the definition translated into entities, relationships, and properties. In the future, linking data to the reference model and to external ontologies may prove even more significant, as doing so can greatly improve the quality of the knowledge produced from the collected data. The ability to verify research outcomes and explainable AI are just two examples where a precise log of inferences and unambiguous semantic compatibility of the data will play a key role.</p>
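<p>For programmatic use, such definitions can be retrieved in an RDF encoding over HTTP. The following is a minimal sketch assuming a hypothetical concept URI and content negotiation; the actual URIs and supported media types depend on the deployment and are not taken from the abstract.</p><pre><code># Minimal sketch (Python): fetch a definition in a machine-to-machine encoding.
# The concept URI below is a placeholder, not an actual register entry.
import requests
from rdflib import Graph

concept_uri = "http://www.opengis.net/def/example-concept"  # hypothetical

# Request an RDF serialisation via HTTP content negotiation.
response = requests.get(concept_uri, headers={"Accept": "text/turtle"}, timeout=30)
response.raise_for_status()

# Parse the returned Turtle and list the entities, relationships, and properties.
graph = Graph().parse(data=response.text, format="turtle")
for subject, predicate, obj in graph:
    print(subject, predicate, obj)
</code></pre>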
<p>Cloud-based big Earth data workflow architectures for operational decision making across communities need to follow FAIR (Findable, Accessible, Interoperable, Reusable) principles in order to be effective. This presentation highlights mature implementations of OGC standards-based building blocks for climate data processing and service provision that are deployed in leading climate services information systems such as the Copernicus Climate Change Service (C3S). OGC Web Processing Services (WPS) form the basis of component operations in these implementations, from simple polygon subsetting to climate index calculation and complex hydrological modelling. Interoperable building blocks also handle security functions such as user registration, client-side utilities, and data quality compliance.</p><p>A particular focus will be the ROOCS (Remote Operations on Climate Simulations) project, a set of tools and services providing "data-aware" processing of ESGF (Earth System Grid Federation) and other standards-compliant climate datasets from modelling initiatives such as CMIP6 and CORDEX. One example is the WPS service 'Rook', which enables remote operations, such as spatio-temporal subsetting, on climate model data. It exposes all the operations available in the 'daops' library, which is built on Xarray. Finch is a WPS-based service for remote climate index calculations, also used for the analytics of ClimateData.ca, which dynamically wraps Xclim, a Python-based high-performance distributed climate index library. Finch automatically builds catalogues of available climate indicators, fetches data using "lazy" loading, and manages asynchronous requests with Gunicorn and Dask. Raven-WPS provides parallel web access to the dynamically configurable 'RAVEN' hydrological modelling framework with numerous pre-configured hydrological models (GR4J-CN, HBV-EC, HMETS, MOHYSE) and terrain-based analyses. Coupling GeoServer-housed terrain datasets with climate datasets, RAVEN can perform analyses such as hydrological forecasting without requiring local access to data, installation of binaries, or local computation.</p><p>The EO Exploitation Platform Common Architecture (EOEPCA) describes an app-to-the-data paradigm in which users select, deploy, and run application workflows on remote platforms where the data resides. Following OGC Best Practices for EO Application Packages, Weaver executes workflows that chain together various applications and WPS inputs/outputs. It can also deploy near-to-data applications using Common Workflow Language (CWL) application definitions. Weaver was developed especially with climate services use cases in mind.</p><p>The architectural patterns illustrated by these examples will be exercised and tested in the upcoming OGC Climate Services Pilot initiative, whose outputs will also be incorporated into disaster risk indicators developed in the upcoming OGC Disaster Pilot 2022.</p><p><img src="https://contentmanager.copernicus.org/fileStorageProxy.php?f=gnp.090231131ed165283491461/sdaolpUECMynit/22UGE&app=m&a=0&c=67d3cb8cdcd79c816211ccddfc20b1fb&ct=x&pn=gnp.elif&d=1" alt=""></p><p>Further reading:</p><p>https://docs.google.com/document/d/1IrwlEiR-yRLcoI9fGh2B1leH4KU0v0SUMWQqiaxc1BM/edit</p>
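<p>To illustrate how such WPS building blocks are consumed by clients, the sketch below uses the birdy WPS client to request a climate index from a Finch instance. The service URL, input file, and the choice of the frost_days process are assumptions made for illustration, not details taken from the deployments described above.</p><pre><code># Minimal sketch: call a Finch climate-index process through the birdy WPS client.
from birdy import WPSClient

# Assumed endpoint; replace with the URL of an actual Finch deployment.
finch = WPSClient("https://finch.example.org/wps")

# Compute an annual frost-days index from a remote daily tasmin file
# (process and parameter names follow xclim/Finch conventions; treat as an example).
resp = finch.frost_days(tasmin="https://example.org/tasmin_day_example.nc", freq="YS")

# Execution is asynchronous; get() returns references to the produced outputs.
outputs = resp.get()
print(outputs)
</code></pre>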
<p>The recent OGC Cloud Concept Development Study [1] has shown that the major (big) geospatial data providers are moving towards Cloud solutions not only to make more data more accessible, but also to locate data processing next to the data. Meanwhile, recent experience from the H2020 e-shape project shows that the EO developer community still needs support to fully adopt the Cloud, all the more so since, based on the feedback received during e-shape's first sprint, Earth Observation Cloud platforms still need to mature to become more attractive. To support a good connection between data providers, technology providers, and EO developers, it is critical that sponsors keep supporting the efforts of the Earth Observation community at a number of levels: enhancing the accessibility of Copernicus and other open data, developing the interoperability and operational maturity of Clouds and platforms, increasing cloud skills among developers and scientists, and sustaining funding mechanisms long enough to allow all the critical stakeholders to meet in the Cloud with timing good enough to reach the critical point of self-sustainability.</p><p>During this process it is important not only to develop technical skills and new platform capacities, but also to develop a good understanding of the pricing mechanisms and of how to optimise costs. This is needed to build trust that outsourcing infrastructure will lead to the expected budget savings, and to trigger the changes in budget organisation that moving to Cloud technologies requires.</p><p>[1] Echterhoff, J., Wagermann, J., Lieberman, J.: OGC 21-023, OGC Earth Observation Cloud Platform Concept Development Study Report. Open Geospatial Consortium (2021). https://docs.ogc.org/per/21-023.html</p>
Abstract. Recent advances in modelling capabilities and data processing, combined with vastly improved observation tools and networks, have resulted in an expansion of the available weather and climate information, from historical observations to seasonal climate forecasts, decadal climate predictions, and multi-decadal climate change projections. However, it remains a key challenge to ensure that this information reaches the intended climate-sensitive sectors (e.g. water, energy, agriculture, health) and is fit for purpose, so that the climate information is usable by these downstream users. Climate information can be produced on demand via climate resilience information systems, which exist in various forms. To optimise efficiency and establish better information exchange between these systems, standardisation is necessary. Here, standards and deployment options are described for how scientific methods can be deployed in climate resilience information systems while respecting the principles of being findable, accessible, interoperable, and reusable. Besides a general description of OGC API standards and OGC API Processes based on existing building blocks, ongoing developments in AI-enhanced services for climate services are described.
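<p>The deployment pattern referred to above can be sketched against the OGC API Processes interface: processes are discovered under /processes and executed by posting a JSON request to the process's execution endpoint. The server URL, process identifier, and inputs below are placeholders for illustration and are not taken from the abstract.</p><pre><code># Minimal sketch of an OGC API Processes interaction (Part 1: Core).
import requests

BASE = "https://climate-services.example.org"  # assumed endpoint

# Discover the processes offered by the server.
processes = requests.get(f"{BASE}/processes", timeout=30).json()
print([p["id"] for p in processes.get("processes", [])])

# Submit an asynchronous execution request; process id and inputs are hypothetical.
request_body = {
    "inputs": {
        "dataset": "https://example.org/tas_day_example.nc",
        "index": "heatwave_frequency",
    }
}
job = requests.post(
    f"{BASE}/processes/compute-climate-index/execution",
    json=request_body,
    headers={"Prefer": "respond-async"},
    timeout=30,
)

# The job status and, eventually, the results can be polled at the Location URL.
print(job.status_code, job.headers.get("Location"))
</code></pre>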
<p>Producing and providing useful information for climate services requires vast volumes of data to come together, which in turn requires technical standards. This is especially true for extreme climate events, where scientific methods for appropriate assessment, detection, or even attribution involve highly complex data processing workflows; the production of climate information services therefore requires optimal technical systems to underpin climate services with science. Climate resilience information systems such as the Climate Data Store (CDS) of the Copernicus Climate Change Service (C3S) can be enhanced when scientific workflows for extreme event detection are optimised as information production services and deployed so that extreme event experts can use them through a frontend to facilitate their work. Deployment into federated data processing systems like the CDS requires that scientific methods and their algorithms be wrapped as technical services following application programming interface (API) standards and, as good practice, FAIR principles. The FAIR principles mean being <strong>Findable</strong> within federated data distribution architectures, including public catalogues of well-documented scientific analytical processes. Remote storage and computation resources should be operationally <strong>Accessible</strong> to all, including low-bandwidth regions, closing digital gaps to 'Leave No One Behind'. Standards for data inputs, outputs, and processing APIs are the necessary conditions to ensure the system is <strong>Interoperable</strong>. And systems should be built from <strong>Reusable</strong> building blocks, realised through modular architectures with swappable components, data provenance systems, and rich metadata.<br>Here we present challenges and preliminary prototypes for services based on the Open Geospatial Consortium (OGC) API standard for processes (https://ogcapi.ogc.org/processes/). We present blueprints for how AI-based scientific workflows can be ingested into climate resilience information systems to enhance climate services related to extreme weather and impact events. The importance of API standards for ensuring reliable data processing in federated spatial data infrastructures will be pointed out. Examples will be taken from the EU H2020 Climate Intelligence (CLINT; https://climateintelligence.eu/) project, in which extreme event components will be developed for C3S. Within this project, appropriate technical services will be developed as building blocks ready to deploy into digital data infrastructures such as C3S, but also the European Open Science Cloud or the DIAS platforms. This deployment flexibility results from standards compliance and the FAIR principles. In particular, a service employing state-of-the-art deep-learning-based inpainting technology will be developed to reconstruct missing climate information in global temperature patterns. This OGC-standards-based web processing service (WPS) will be used as a prototype and extended to other climate variables in the future. Developments focus on heatwaves and warm nights, extreme droughts, tropical cyclones, and compound and concurrent events, including their impacts, whilst the concepts target generalised opportunities to transfer any kind of scientific workflow into a technical service underpinning science-based climate services. The blueprints take into account how to chain the data processing, from data search and retrieval, through event index definition and detection, to identifying the drivers responsible for the intensity of the extreme event, in order to construct storylines leading up to the event.</p>
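<p>The chaining idea can be illustrated with a minimal local sketch using xarray and Xclim: fetch a dataset, subset it in space and time, and compute an extreme-event index. The file name, region, threshold, and choice of index are assumptions for illustration; driver identification and storyline construction would follow as further steps in the chain.</p><pre><code># Minimal sketch of the processing chain: fetch -> subset -> index detection.
import xarray as xr
import xclim

# 1. Data search and fetch (represented here by a local example file).
ds = xr.open_dataset("tasmax_day_example.nc")

# 2. Spatio-temporal subsetting to the region and period of interest.
region = ds.sel(lat=slice(40, 60), lon=slice(0, 30), time=slice("1991", "2020"))

# 3. Event index definition and detection, e.g. days per year with tasmax above
#    30 degC (indicator name and threshold are illustrative).
hot_days = xclim.atmos.tx_days_above(tasmax=region["tasmax"], thresh="30 degC", freq="YS")
hot_days.to_netcdf("hot_days_per_year.nc")

# 4. Further steps (driver identification, storyline construction) would chain
#    additional processes onto these outputs.
</code></pre>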