1. Relevance of the research objectives
The need for better regional and local short-range weather predictions is continuously increasing in our society, not only in the agricultural, industrial and economic fields but also in relation to outdoor leisure activities. Of prime importance is the prediction of risks to human life and of severe damage. Not only must storms be better forecast, but also severe weather that may cause disruptions (to water and power distribution, surface and air transportation, ...), such as heavy rain, snowfall, or fog. The forecasting of such weather elements has to be improved with regard to intensity and location, with increased precision in both time and space.
Such an increasing demand has a direct impact on numerical weather prediction (NWP), since NWP models, especially high-resolution limited-area ones, are now a major tool for weather forecasting. All models and all teams now have to make headway towards very high resolution, typically horizontal scales of 1-3 km. Moreover, assuming that computer performance keeps increasing as it has in the past, it is realistic to envisage, within 3 to 4 years, horizontal resolutions of 2-3 km for operational NWP models, though at first on small targeted domains. For most partners, the main operational applications should lag a little behind, with horizontal resolutions between 5 and 8 km.
However, at such scales NWP has to address new problems, linked either to the increased complexity of the equations required for a better description of the phenomena, or to the very large amounts of data the models will need (mainly meteorological observations) and produce. At each step and level, compromises will have to be found between scientific ambitions and the capacity of operational forecasting suites.
The STORMNET research plan can be described along four main directions, which may be pursued in parallel in spite of their many common points and interactions.
2. Deterministic forecasting at very high resolution
This domain combines the three main historical issues of limited-area NWP, together with the question of their interaction within the time step.
- a) "Dynamics"
There is now general agreement that non-hydrostatic effects must be taken into account at horizontal scales below 5 km, and even that a fully compressible system of equations should be used. Such features are implemented in all four European limited-area NWP models, though at different levels of completion.
However, some debates are still open, between modelling groups or simply with a view to improving any single model: type of horizontal projection, finite-difference versus spectral methods, Eulerian versus semi-Lagrangian advection, the whole range from explicit to fully implicit time-stepping, choice of the vertical coordinate and of the prognostic model variables, formulation of the lower and upper boundary conditions... Answering these questions, where possible, requires significant further work.
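To fix ideas on one of these debates, the sketch below (a toy illustration of ours in Python, not code from any of the four models; it assumes a constant wind, a periodic 1-D domain and linear interpolation) shows the essence of semi-Lagrangian advection: each grid point is traced back along the flow to its departure point, where the advected field is interpolated.

    import numpy as np

    def semi_lagrangian_step(q, u, dt, dx):
        """One semi-Lagrangian step for dq/dt + u*dq/dx = 0 on a periodic 1-D grid."""
        n = q.size
        x_arr = np.arange(n) * dx               # arrival (grid-point) positions
        x_dep = (x_arr - u * dt) % (n * dx)     # departure points, traced back along the wind
        i = np.floor(x_dep / dx).astype(int)    # grid interval containing each departure point
        w = x_dep / dx - i                      # linear interpolation weight within that interval
        return (1.0 - w) * q[i] + w * q[(i + 1) % n]

Unlike an explicit Eulerian scheme, such a step remains stable for Courant numbers above one, which is one reason the semi-Lagrangian option is attractive at high resolution; its price lies in the interpolations and in questions of accuracy and conservation.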
Moreover, as horizontal or vertical resolution increases, new problems are likely to emerge concerning accuracy, stability or efficiency. Thus, some present choices for the very basic set of model equations may have to be reconsidered in the near future.
- b) "Coupling"
The driving of a limited-area model through its lateral boundary conditions, by a global or a coarser limited-area model, must be carefully designed to allow a good representation of the large scales not described by the limited-area model itself. All models presently use the same formulation, the pragmatic Davies relaxation scheme. It has some known design weaknesses, but is quite simple and robust.
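In schematic form (the notation is ours), the scheme blends, for each coupled variable psi and after each time step, the limited-area solution with the host-model solution across a boundary zone:

    \psi \leftarrow (1 - \alpha)\,\psi^{\mathrm{LAM}} + \alpha\,\psi^{\mathrm{host}},
    \qquad \alpha = \alpha(d) \in [0, 1],

where the weight alpha decreases monotonically from 1 on the lateral boundary to 0 at the inner edge of the relaxation zone, d being the distance to the boundary in grid rows. The known weaknesses alluded to above stem largely from the ad hoc character of this blending, which is not derived from well-posed boundary conditions.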
Though research in this domain started several years ago, significant work is still required to implement a better alternative in operations, especially for spectral models. Once this difficult problem is solved, more sophisticated algorithms could be addressed, if necessary.
The time dimension of coupling also has to be reconsidered in the context of very high resolution and short-range forecasts.
- c) "Physical parameterizations"
The model equations also have to describe the radiative and orographic forcings and the exchanges with the surface, as well as moist and subgrid-scale processes. This part of the model will undergo an important transformation, since two key assumptions, currently in wide use, will have to be abandoned simultaneously.
First, there will be less and less need to take into account details of the flow organisation internal to the mesh box. Typically, organised convection and details of the underlying subgrid-scale topography will cease to generate fluxes of a magnitude comparable to those produced by explicit computations between neighbouring grid points (an aspect deserving special attention at intermediate resolutions). Second, the forcing of parameterizations will cease to be treated as horizontally homogeneous, which means that all three space dimensions will play a role in the computation of turbulent and radiative fluxes. Besides, some aspects neglected until now because of too coarse a horizontal resolution, such as the impact of lakes or towns on the lower atmosphere, will have to be parameterized.
This revolution in the way NWP code designers perceive "physics" will also strongly influence the way parameterization codes are treated algorithmically. Old sources of inaccuracy and latent instability will disappear, while one can safely anticipate that new ones will replace them, and the interfacing with the dynamics may have to be reconsidered. Exchanges with external experts should also be encouraged, to improve the description of air-sea exchanges, the impact of aerosols on radiation, the parameterization of run-off, etc., or even to couple the NWP model directly with another model (a procedure which is also likely to bring new problems to light).
- d) "Time-stepping"
The interaction of physical and large-scale forcings with the dynamics will have to be revisited in some models. Two main challenges can be considered. The first, dealing with the organisation of the time step, is to allow various packages of physical parameterizations, and even combinations of them, to be used with variants of the dynamics, while ensuring accuracy, stability and efficiency for the whole system. The second is the progressive introduction of three-dimensional aspects into physical parameterizations (typically for turbulence), which will lead to significantly more complex interactions between physics and dynamics.
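As a minimal illustration of the first challenge (a toy sketch of ours, not any model's actual time step), a sequential, first-order splitting is what allows the dynamics and a physics package to be developed and exchanged independently, at the price of a splitting error that must be kept under control:

    # Toy sequential (first-order) physics-dynamics splitting for dx/dt = D(x) + P(x);
    # 'dynamics' and 'physics' stand for arbitrary, interchangeable packages.
    def split_step(x, dt, dynamics, physics):
        x_star = x + dt * dynamics(x)          # provisional state after the dynamics sub-step
        return x_star + dt * physics(x_star)   # physics applied as an adjustment to that state

Real time-step organisations are far more involved (fractional steps, implicit solvers, flux exchanges between packages), but deciding where and how often each package sees the model state is precisely what has to be settled to guarantee accuracy, stability and efficiency.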
3. Improved use of local observations for model initialization
A very good model is useless when run from bad initial conditions. Moreover, the refinement of the initial state using meteorological observations, usually called data assimilation, becomes all the more important as the forecast model gets more sophisticated, and improvements to the two systems must always be kept consistent. One should note that developments in data assimilation will provide not only more reliable initial conditions for local high-resolution weather forecasts, but also analysis tools to exploit the information from local high-density observation networks. Enhanced links with nowcasting (i.e. the very short-range, a few hours, forecasting of extreme events) are also expected.
A wider range of horizontal scales, say 1-10 km, is considered here: significant improvements can still be brought to the present operational data assimilation systems (i.e. at scales around 10 km, on average), whereas one starts from scratch at very high resolution. The following list of research topics nevertheless concentrates on issues specific to limited-area models.
- a) Use of new observations, improved use of conventional observations
Work will continue on the use of global-coverage data, with emphasis put on the specific pre-processing algorithms required for very dense observations and limited domains. Satellite data of various kinds will be assimilated at the highest possible resolution. Special attention will be paid to cloud- or rain-affected data. However, final products designed by specialised centres (the Satellite Application Facilities, SAFs) will be used whenever possible.
Local remote-sensing observations, such as radar data (reflectivities, 3-D or radial winds) and GPS data, will also be assimilated. They are likely to provide useful information on humidity, and the initialization of moist variables is a crucial but very difficult issue at high resolution. The corresponding work has already started in some national weather services, using more or less sophisticated assimilation tools, which gives confidence in a future operational use.
Synoptic surface data, the most conventional observation type, will gain renewed interest. Indeed, the surface network remains a fairly cheap and dense source of information on surface and near-surface atmospheric conditions (e.g. snow cover, precipitation, temperature and humidity at 2 m height, ...).
Finally, attention will be paid to the qualification of observation errors and to the representativeness of the observations.
- b) Three-dimensional analysis of atmospheric fields
Three major issues are considered here: the description of background errors, analysis algorithms, and the problem of initialization (filtering of spurious numerical modes).
The proper specification of the "background" error covariances is a major issue in modern data assimilation, whatever the analysis algorithm. These functions control the spatial filtering and propagation of the analysis increments and provide balance conditions. The assumptions used up to now to prescribe them will have to be revisited at very high resolution, where balance conditions are different and moist processes play an increased part. Developments will concern the following five main aspects: sampling (error estimation techniques), covariance representations (in which space, with which level of simplification, ...), balance conditions, flow dependence, and tuning.
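As one concrete example on the sampling side (the widely used so-called NMC method; other estimation techniques are equally covered by the plan), background error covariances can be approximated from differences between pairs of forecasts valid at the same time:

    B \;\approx\; \beta\,\overline{(x^{t+48} - x^{t+24})\,(x^{t+48} - x^{t+24})^{\mathrm{T}}},

where the overbar denotes an average over many cases, the superscripts are forecast ranges in hours, and beta is an empirical scaling factor. At very high resolution, both the sampling strategy and the implied balances and length scales become questionable, which motivates the developments listed above.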
Though the simple nudging technique is still used by a few teams, there is now a general trend towards variational methods (here, 3D-Var). Apart from optimization issues, several topics should receive increasing interest: the time dimension (higher frequency of analyses versus semi-continuous systems), the choice of adequate lateral boundary conditions, ensemble filters, and the coupling of high-resolution limited-area and global synoptic-scale data assimilation systems.
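For reference, all the variational options above minimise a cost function of the generic 3D-Var form (standard notation):

    J(x) = \tfrac{1}{2}\,(x - x_b)^{\mathrm{T}} B^{-1}\,(x - x_b)
         + \tfrac{1}{2}\,\big(y - H(x)\big)^{\mathrm{T}} R^{-1}\,\big(y - H(x)\big),

where x_b is the background, y the vector of observations, H the observation operator, and B and R the background and observation error covariance matrices; the quality of the analysis thus rests directly on the specification of B discussed above.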
Little is known about the actual need for a specific initialization, in the sense of an extra filtering of the analysed fields, before launching a high-resolution forecast. New diagnostic tools to evaluate the level of "noise" in analysed fields have to be developed, while addressing the question of the physical balance conditions actually present in the atmosphere at very high resolution.
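One natural candidate for such an extra filtering is digital filter initialization, in which the model state is replaced by a weighted average of states from a short integration around the analysis time, damping high-frequency gravity-wave noise. The weights of a simple non-recursive low-pass filter can be sketched as follows (an illustrative example of ours; the span, cutoff period and window are free choices, not prescriptions):

    import numpy as np

    def dfi_weights(n_steps, dt, tau_cutoff):
        """Lanczos-windowed low-pass weights over a span of 2*n_steps + 1 model states.
        dt: model time step (s); tau_cutoff: shortest period (s) to be retained."""
        k = np.arange(-n_steps, n_steps + 1)
        theta_c = 2.0 * np.pi * dt / tau_cutoff       # cutoff frequency, in radians per step
        safe_k = np.where(k == 0, 1, k)               # avoid division by zero at k = 0
        h = np.where(k == 0, theta_c / np.pi,
                     np.sin(theta_c * k) / (np.pi * safe_k))   # ideal low-pass response
        w = h * np.sinc(k / (n_steps + 1))            # Lanczos window damps Gibbs ripples
        return w / w.sum()                            # normalised so constant fields pass unchanged

Whether a filter of this kind is still needed, or even physically justified, at very high resolution is exactly the open question raised above.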
- c) Four-dimensional analysis and related issues
4D-Var techniques have been very successful and are widely used operationally for global data assimilation and also, recently, for operational data assimilation in a model at 10 km resolution at the Japan Meteorological Agency. Theoretical arguments favour 4D-Var on the mesoscale as well, given for example the importance of the time dimension and of moist processes, i.e. of physics. Many mesoscale processes are strongly non-linear, however, and this may be critical for 4D-Var, which makes heavy use of tangent-linear approximations. Multi-incremental methods (with re-linearizations) may help to circumvent this inherent problem.
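In the incremental formulation alluded to here (standard notation), the minimisation is performed on an increment delta-x_0 to the background at the start of the assimilation window, the tangent-linear model carrying it forward to the observation times:

    J(\delta x_0) = \tfrac{1}{2}\,\delta x_0^{\mathrm{T}} B^{-1}\,\delta x_0
        + \tfrac{1}{2} \sum_i \big(H_i M_{0\to i}\,\delta x_0 - d_i\big)^{\mathrm{T}} R_i^{-1}
          \big(H_i M_{0\to i}\,\delta x_0 - d_i\big),
    \qquad d_i = y_i - H_i\big(M_{0\to i}(x_b)\big),

where M_{0->i} denotes the tangent-linear model from the window start to observation time i and H_i the corresponding linearised observation operator. Re-linearising M and H around successive outer-loop trajectories is what "multi-incremental" refers to, and it is the validity of these linearisations that strongly non-linear mesoscale processes put at risk.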
A major issue is to establish a numerically efficient dynamical core: tangent-linear and adjoint formulations of the semi-Lagrangian time-stepping need to be developed (already done by the HIRLAM community). In fact, 4D-Var is and will remain expensive, both in computing cost and in development work. A second important issue is to assess the minimum level of simplified, regularised physics required; it is likely that at least a boundary-layer description and microphysics are needed.
Any 4D-Var system operating at 10 km resolution or below is worth testing in research mode. These tests should also include derived applications such as gradient computations, sensitivity studies using the adjoint model, and a posteriori retuning. Comparisons with three-dimensional assimilation schemes are certainly relevant in the early stages.
- d) "Surface analysis"
Two rather independent issues are to be considered here. First, the initialization of the prognostic variables describing the soil and the surface (temperature and humidity). The present trend towards an off-line variational assimilation (since the associated time scales are longer than for atmospheric processes) will be maintained, trying simultaneously to move to more sophisticated algorithms, to use new observation types (remote-sensing data), and to keep the overall computing cost within safe limits. Second, there will be an increasing interest in diagnostic analyses, using observations of any type close to the surface, in order to provide a fine description of the boundary layer. These will help forecasters in nowcasting, and may provide forcing to off-line models. As a first step, very basic evaluations will be required, such as the comparison of several pragmatic approaches based on existing techniques.
4. Evaluation of the reliability of forecasts
This domain is of major importance, since the overall objective is the supply of improved local weather forecasts to any end-user. However, the present, coarser, operational resolutions also have to be considered here, since the first of the issues below are still at an early stage of development.
- a) Short-range predictability: the use of ensemble forecasting and other techniques
Predictability is the net result of several distinct sources of uncertainty: (i) the use of imperfect models; (ii) the use of observations that imperfectly sample the past and present state of the system (two factors that can, to some extent, be controlled); and (iii) a fundamental intrinsic property of the atmosphere itself, which, very much like other state parameters, can at best be measured but cannot be changed.
One popular way of estimating predictability is ensemble forecasting. Other ways consist of selecting additional observations according to the needs of the data assimilation system (adaptive observation) or of assimilating observations according to predictability-related properties (adaptive assimilation).
The following issues should be addressed:
(i) the initialization of an ensemble forecast relevant to mesoscale, short-range prediction; the techniques should become able to take into account uncertainties in the humidity fields and in the surface conditions (e.g. soil wetness);
(ii) the direct extension to convective processes of the forecast-error representation used in synoptic-scale ensembles, which is non-trivial;
(iii) the assessment of the respective importance of initial errors and of errors resulting from the diversity of coupling scenarios;
(iv) the validation of an ensemble forecast tool, especially on the mesoscale, which depends on the statistical consistency of the ensemble with respect to some climatological reference; the usual approaches suffer from two major drawbacks, related to the small number of validation cases and to the kind of climatological reference, and ways to improve on these methods using event-based approaches should be studied;
(v) the interpretation and use of an ensemble forecast, which remain an open question: while probabilistic forecasts are directly usable in cost-loss models with explicit weather parameters and their uncertainty, their application to natural disaster management remains a challenge;
(vi) since there are theoretical arguments for the inability of the current ensemble strategy (a kind of "clever under-sampling") to deal with extreme events, the extension to the mesoscale of research on alternative approaches such as adaptive assimilation or adaptive observation.
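As a toy illustration of the mechanism underlying points (i) to (iii) (a sketch of ours on the classical Lorenz 1963 system, not on any NWP model), an ensemble of slightly perturbed initial states integrated forward yields a flow-dependent spread that stands in for forecast uncertainty:

    import numpy as np

    def lorenz63(s):
        """Right-hand side of the Lorenz (1963) system, a classical low-order chaotic model."""
        x, y, z = s
        return np.array([10.0 * (y - x), x * (28.0 - z) - y, x * y - 8.0 / 3.0 * z])

    def rk4_step(f, s, dt):
        """Classical fourth-order Runge-Kutta step."""
        k1 = f(s); k2 = f(s + 0.5 * dt * k1); k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
        return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    rng = np.random.default_rng(0)
    members = np.array([1.0, 1.0, 1.0]) + 1e-3 * rng.standard_normal((20, 3))  # 20 perturbed states
    for _ in range(1500):                                   # integrate every member forward in time
        members = np.array([rk4_step(lorenz63, m, 0.01) for m in members])
    print(members.std(axis=0))                              # ensemble spread as a proxy for uncertainty

The open questions above are precisely about how to generate such perturbations meaningfully for mesoscale moist processes and surface conditions, and how to validate and exploit the resulting distribution.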
- b) Verification of model forecasts
An obvious way to assess the skill of NWP models is to compare the observations of a meteorological network with the corresponding model values, using more or less advanced interpolation methods. However, this method has already reached its limits at the present operational scales, especially when forecasts of precipitation and cloudiness are to be evaluated. The sample of observations presently used is quite coarse, and smooth models are favoured: typically, a good forecast of the intensity of a storm with a time shift of a few hours is penalised more than the prediction of a far weaker event at the right time. The use of ensemble prediction systems does not remove the need to develop new verification methods for deterministic forecasts (and for probabilistic ones as well). In particular, there is a need for a method that handles phase errors properly; such a method will probably rely on modern pattern-recognition techniques. This involves research work in common with data assimilation, for the comparison with new observations.
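One family of candidate methods relaxes the exact point-wise match by comparing event frequencies over neighbourhoods, so that a correctly intense feature displaced by a few grid lengths is no longer doubly penalised. The sketch below implements one such neighbourhood score, the Fractions Skill Score, purely as an illustration (the threshold and window size are free parameters, and pattern-recognition approaches go further than this):

    import numpy as np
    from scipy.ndimage import uniform_filter

    def fractions_skill_score(fcst, obs, threshold, window):
        """Compare the fractions of points exceeding `threshold` within `window`-sized
        neighbourhoods of two 2-D fields (e.g. forecast and observed precipitation)."""
        f = uniform_filter((fcst >= threshold).astype(float), size=window)  # forecast event fractions
        o = uniform_filter((obs >= threshold).astype(float), size=window)   # observed event fractions
        mse = np.mean((f - o) ** 2)
        mse_ref = np.mean(f ** 2 + o ** 2)        # reference: no overlap at all between the fields
        return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan  # 1 = perfect match, 0 = no skill

A score of this kind rewards a storm forecast at roughly the right place and intensity even when it is shifted by less than the neighbourhood size, which is exactly the behaviour missing from point-wise scores.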
- c) Coupling with other environmental applications
A meaningful evaluation of the quality of meteorological forecasts has to consider their impact on downstream applications. The output of NWP models is a major source of information for air-pollution, hydrological, oceanographic and sea-state models. Closer cooperation with the corresponding research teams will help in the qualification of the forecast skill of NWP models, as well as in the evaluation of the relevance of new developments.
- d) Validation
This is the upstream counterpart of verification, and many developments will be common to both. Modellers will have to define new diagnostic tools better suited to very high resolution, and to face the ensuing new problems. In addition, intensive numerical experimentation, combining forecasting and assimilation, will be required.
5. Complementary aspects
Today, NWP is much more than a model code: it is an ensemble of programmes with many deep interactions between them, possibly running on several platforms. Moreover, with the substantial increases in horizontal and vertical resolution that we are experiencing, the amount of data to handle becomes a problem in itself.
Therefore, NWP centres are facing some challenging issues which need a joint effort:
- code organisation, normalization, optimization and portability, which would ensure efficiency and a significant lifetime of NWP codes;
- large-scale data handling, including fast database access, format standardisation and data sharing within the NWP community;
- building and extension of frameworks to enable user-friendly applications of NWP systems for research and operations: the research method would be to investigate, further develop and implement new techniques, e.g. derived from GRID technology and/or the PRISM specifications.
The problems of code organisation and data handling are found in many disciplines and types of activity. Research on this subject therefore has a strong interdisciplinary content and an intersectorial dimension. It requires the development and implementation of methods from ICT, based on a thorough understanding of specific meteorological requirements.