A new release of the Max Planck Institute for Meteorology Earth System Model, version 1.2 (MPI-ESM1.2), is presented. Development focused on correcting errors in, and improving, the representation of physical processes, as well as on computational performance, versatility, and overall user friendliness. In addition to new radiation and aerosol parameterizations in the atmosphere, several relatively large, but partly compensating, coding errors in the model's cloud, convection, and turbulence parameterizations were corrected. The representation of land processes was refined by introducing a multilayer soil hydrology scheme, extending the land biogeochemistry to include the nitrogen cycle, replacing the soil and litter decomposition model, and improving the representation of wildfires. The ocean biogeochemistry now represents cyanobacteria prognostically, in order to capture the response of nitrogen fixation to changing climate conditions, and further includes improved detritus settling and numerous other refinements. As a novelty, the instrumental-record warming was explicitly taken into account during the tuning process, in addition to limiting drift and minimizing certain biases. To this end, the very high climate sensitivity of around 7 K found in an intermediate model version, caused by low-level clouds in the tropics, had to be addressed, as matching the observed warming was otherwise not deemed possible. As a result, the model's climate sensitivity to a doubling of CO2 over preindustrial conditions is 2.77 K, and it maintains the previously identified, highly nonlinear global-mean response to increasing CO2 forcing, which can nonetheless be represented by a simple two-layer model.
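The "simple two-layer model" invoked here is commonly written as a pair of coupled energy-balance equations for the surface and deep-ocean temperature anomalies; the following is a sketch of that standard formulation (the notation is conventional and is not quoted from the abstract):

    C \frac{dT}{dt} = F(t) - \lambda T - \gamma (T - T_D), \qquad
    C_D \frac{dT_D}{dt} = \gamma (T - T_D),

where T and T_D are the surface and deep-ocean temperature anomalies, C and C_D the corresponding heat capacities, F(t) the radiative forcing, \lambda the climate feedback parameter, and \gamma the heat-exchange coefficient between the layers. At equilibrium under a constant doubling forcing F_{2\times}, the warming is F_{2\times}/\lambda, which is how a quoted sensitivity such as 2.77 K maps onto the model's parameters.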
Abstract. Weather and climate models are complex pieces of software comprising many individual components, each of which is evolving under pressure to exploit advances in computing to deliver some combination of possible improvements (higher spatio-temporal resolution, increased fidelity in terms of resolved processes, more quantification of uncertainty, etc.). However, after many years of a relatively stable computing environment with little choice in processing architecture or programming paradigm (basically x86 processors using MPI for parallelism), the existing menu of processor choices includes significant diversity, and more is on the horizon. This computational diversity, coupled with ever increasing software complexity, leads to the very real possibility that weather and climate modelling will arrive at a chasm which will separate scientific aspiration from our ability to develop and/or rapidly adapt codes to the available hardware. In this paper we review the hardware and software trends which are leading us towards this chasm, before describing current progress on some of the tools which we may be able to use to bridge it. This brief introduction to current tools and plans is followed by a discussion outlining the scientific requirements for quality model codes which have satisfactory performance and portability, while simultaneously supporting productive scientific evolution. We assert that the existing method of incremental model improvements, employing small steps which adjust to the changing hardware environment, is likely to be inadequate for crossing the chasm between aspiration and hardware at a satisfactory pace, in part because institutions cannot have all the relevant expertise in-house. Instead, we outline a methodology based on large community efforts in engineering and standardisation, which will depend on identifying a taxonomy of key activities – perhaps based on existing efforts to develop domain-specific languages, identify common patterns in weather and climate codes, and develop community approaches to commonly needed tools and libraries – and then collaboratively building up those key components. Such a collaborative approach will depend on institutions, projects, and individuals adopting new interdependencies and ways of working.
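To make the domain-specific-language idea concrete, the following sketch (purely illustrative; none of these names come from the paper) separates the scientific definition of a stencil from two interchangeable execution strategies. This separation of concerns, where scientists write only the first function and a toolchain generates variants of the rest, is what DSL approaches automate when retargeting weather and climate codes to new hardware:

    import numpy as np

    def laplacian_point(u, i, j):
        # Science definition: a 5-point Laplacian at one grid point.
        return u[i-1, j] + u[i+1, j] + u[i, j-1] + u[i, j+1] - 4.0 * u[i, j]

    def run_serial(u):
        # Execution strategy 1: naive loops, readable but slow.
        out = np.zeros_like(u)
        for i in range(1, u.shape[0] - 1):
            for j in range(1, u.shape[1] - 1):
                out[i, j] = laplacian_point(u, i, j)
        return out

    def run_vectorized(u):
        # Execution strategy 2: the same stencil as whole-array shifts,
        # the kind of transformation a DSL backend would emit.
        out = np.zeros_like(u)
        out[1:-1, 1:-1] = (u[:-2, 1:-1] + u[2:, 1:-1]
                           + u[1:-1, :-2] + u[1:-1, 2:]
                           - 4.0 * u[1:-1, 1:-1])
        return out

    u = np.random.rand(64, 64)
    assert np.allclose(run_serial(u), run_vectorized(u))

Both strategies produce identical answers; only the mapping to hardware differs, which is exactly the property that lets the science code outlive any particular processor generation.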
Space antennas with a helical geometry are an advantageous choice for many applications, for instance when electromagnetic waves are to be transmitted with circular polarization, or when signals from terrestrial objects are to be received with high angular resolution. In all these cases the desired electromagnetic properties of a helical geometry can be combined with the mechanical advantage that the antenna acts as a compression spring, provided that its core structure has the necessary spring stiffness yet can nevertheless be compressed easily. Such an antenna has been developed by the DLR Institutes in Bremen and Braunschweig, together with industrial partners, for a small satellite named AISat, which is to track the positions of individual ships in critical sea areas in order to improve the security of maritime trade. The development was very challenging, since the antenna must expand from a stowed stack length of only 10 centimeters to a total length of 4 meters. Only a special carbon-fiber core beneath the conductive coating, combined with a system of stabilizing cords, led to a satisfactory solution. Both the self-deployment and the self-stabilization function of this innovative antenna concept have been successfully tested and verified under zero-g conditions in the course of a parabolic flight campaign. It was convincingly demonstrated that the helical antenna can indeed reach its intended contour in weightlessness within a few seconds and maintain the required stability. Beyond the current application on the AISat satellite, it is therefore a promising concept for future satellites.
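For context on the spring-stiffness trade-off mentioned above, the rate of an ideal helical compression spring follows the standard textbook relation (not taken from the paper, and a laminated carbon-fiber core will deviate from an ideal round wire):

    k = \frac{G d^4}{8 D^3 n},

with shear modulus G, wire diameter d, mean coil diameter D, and number of active coils n. An antenna that deploys to 4 m necessarily has many coils and a large coil diameter, so k is inherently small: the helix is easy to compress into its 10 cm stack but, by the same token, too soft to hold its own shape, which is why the stabilizing cord system is essential.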
Abstract. The need for open science has been recognized by the communities of meteorology and climate science. While these domains are mature in terms of applying digital technologies, the implementation of open science methodologies is less advanced. In a session on “Weather and Climate Science in the Digital Era” at the 14th IEEE International eScience Conference, domain specialists and data and computer scientists discussed the road towards open weather and climate science. Roughly 80 % of the studies presented in the conference session showed the added value of open data and software. These studies included open datasets from disparate sources in their analyses or developed tools and approaches that were made openly available to the research community. Furthermore, shared software is a prerequisite for the studies that presented systems such as a model-coupling framework or a digital collaboration platform. Although these studies showed that sharing code and data is important, the consensus among the participants was that this is not sufficient to achieve open weather and climate science and that there are important issues to address. At the level of technology, the application of the findable, accessible, interoperable, and reusable (FAIR) principles to many datasets used in weather and climate science remains a challenge. This may be due to scalability (in the case of high-resolution climate model data, for example), legal barriers such as those encountered in using weather forecast data, or issues with heterogeneity (for example, when trying to make use of citizen data). In addition, the complexity of current software platforms often limits collaboration between researchers and the optimal use of open science tools and methods. The main challenges we observed, however, were non-technical and affect the practice of science as a whole. There is a need for new roles and responsibilities in the scientific process. People working at the interface of science and digital technology – e.g., data stewards and research software engineers – should collaborate with domain researchers to ensure the optimal use of open science tools and methods. In order to remove legal barriers to sharing data, non-academic parties such as meteorological institutes should be allowed to act as trusted agents. Besides the creation of these new roles, novel policies regarding open weather and climate science should be developed in an inclusive way in order to engage all stakeholders. Although there is an ongoing debate on open science in the community, the individual aspects are usually discussed in isolation. Our approach in this paper takes the discourse further by focusing on “open science in weather and climate research” as a whole. We consider all aspects of open science and discuss the challenges and opportunities of recent open science developments in data, software, and hardware. We have compiled these into a list of concrete recommendations that could bring us closer to open weather and climate science. We acknowledge that the development of open weather and climate science requires an effort to change, but the benefits are large. We have observed these benefits directly in the studies presented at the conference and believe that open science leads to much faster progress in understanding our complex world.
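As one concrete illustration of applying the FAIR principles at the level of a single dataset (a hypothetical example; the variable, identifier, and file name are placeholders, not drawn from the paper), convention-following metadata can be attached directly to the data so that the file is self-describing:

    import numpy as np
    import pandas as pd
    import xarray as xr

    # Hypothetical near-surface temperature field with CF-style metadata,
    # making the file self-describing (interoperable and reusable).
    ds = xr.Dataset(
        {"t2m": (("time", "lat", "lon"), 280.0 + np.random.rand(2, 3, 4))},
        coords={
            "time": pd.date_range("2000-01-01", periods=2),
            "lat": [-10.0, 0.0, 10.0],
            "lon": [0.0, 90.0, 180.0, 270.0],
        },
    )
    ds["t2m"].attrs.update({"standard_name": "air_temperature", "units": "K"})
    ds.attrs.update({
        "Conventions": "CF-1.8",
        "title": "Example near-surface air temperature",
        "license": "CC-BY-4.0",
        "id": "doi:10.xxxx/placeholder",  # persistent identifier (findability)
    })
    ds.to_netcdf("t2m_example.nc")  # requires a netCDF backend, e.g. netCDF4

Metadata like this is cheap to add at creation time and is precisely what makes a dataset findable and reusable by others years later.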