The increasing complexity of numerical modelling systems in the environmental sciences has led to the development of different supporting architectures. Integrated environmental modelling can be undertaken by building a 'super model' simulating many processes, or by using a generic coupling framework to dynamically link distinct, separate models at run-time. The application of systemic knowledge management to integrated environmental modelling indicates that we are at the onset of the norming stage, where gains will be made from consolidation in the range of standards and approaches that have proliferated in recent years. Consolidation is proposed in six topics: metadata for data and models; supporting information; software-as-a-service; linking (or interface) technologies; diagnostic or reasoning tools; and the portrayal and understanding of integrated modelling. Consolidation in these topics will develop model fusion: the ability to link models, with easy access to information about the models, interface standards such as OpenMI, and software tools to make integration easier. For this to happen, an open software architecture will be crucial, the use of open-source software is likely to increase, and a community must develop that values openness and the sharing of models and data as much as its publications and citation records.

The past few decades have seen the inexorable rise of numerical modelling as a useful tool in the hydro-environmental and geomorphological sciences. Models have become increasingly detailed, representing more and more processes and, with increasing computer power, being solved over larger and larger geo-spatial structures. These models can be aimed at solving a single set of equations, but have often branched out to include a wider range of processes through the formation of modelling suites.

Albeit a little belatedly, numerical modelling has followed mainstream information technology (IT) in the way that code is structured and deployed. Programmers began by writing short, self-contained, bespoke applications consisting of sequential lines of procedural code in languages such as FORTRAN. As applications grew in complexity, the benefits of callable subroutines or functions were quickly realized, leading to a desire for reuse and clean interfaces. Many legacy applications, and those developed by scientific programmers, remain structured this way today. Object-oriented languages took this trend to its logical conclusion, where every code segment has its own attributes and interfaces, and component-driven architectures have increased the scale of such implementations.

Alternatively, a set of environmental phenomena can be simulated using a complex composition of linked models, each of which is considered a component at this level. In this way the progression of numerical model code structure can be summarized as in Table 1, from sequential programs of procedural code on the left to compositions of linked models on the right.
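To make the run-time linking idea concrete, the sketch below shows two toy models composed through a shared component interface, in the pull-based style used by coupling standards such as OpenMI. This is a minimal illustration only: the class and method names (`LinkableComponent`, `get_values`, `RainfallModel`, `RunoffModel`) are simplified assumptions for this example, not the actual OpenMI API.

```python
# Minimal sketch of run-time model linking in the style of a generic
# coupling framework. All names are illustrative, not the OpenMI interface.
from abc import ABC, abstractmethod


class LinkableComponent(ABC):
    """A model exposed as a component with a clean exchange interface."""

    @abstractmethod
    def get_values(self, quantity: str, time: float) -> float:
        """Return the value of an exchange item at the requested time."""


class RainfallModel(LinkableComponent):
    def get_values(self, quantity: str, time: float) -> float:
        # Toy rainfall signal; a real model would advance its own time step here.
        return 2.0 if quantity == "rainfall" else 0.0


class RunoffModel(LinkableComponent):
    def __init__(self, upstream: LinkableComponent):
        # The link is established at run-time, not hard-wired at compile-time.
        self.upstream = upstream

    def get_values(self, quantity: str, time: float) -> float:
        # Pull the driving input from the linked component on demand.
        rain = self.upstream.get_values("rainfall", time)
        return 0.6 * rain  # toy runoff coefficient


# Compose the two models at run-time and drive the chain from downstream.
runoff = RunoffModel(upstream=RainfallModel())
for t in (0.0, 1.0, 2.0):
    print(t, runoff.get_values("runoff", t))
```

The pull-based design is the key point: the downstream model requests values from its upstream neighbour as it needs them, so separate models can be fused into a composition without merging their codebases into a single 'super model'.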