The programme of work is split into 12 main topics. Each partner is responsible for 1 to 4 topics, even though the basic work is shared among all teams. Details about the respective contributions of each partner can be found in part B.3, and the referenced publications in part A.2.
The development and validation of the vertical-plane version of ALADIN, as a powerful and cheap tool for further studies in dynamics, were completed within the first 6 months of the ALATNET project. This specific piece of software has been maintained since. A set of standard mountain-flow tests was prepared and serves to validate each new release of the ALADIN software. The associated tools (creation of idealized data files, postprocessing of the academic simulations, visualization) are maintained and continue to evolve. What was not developed is the creation of suitable initial and lateral boundary conditions from real 3d data: there was no real demand for this, as it was not felt to be really useful. Instead, an environment and tools for idealized 3d experiments were developed (see Item 2). Once the 2d vertical-plane tests give successful results, more complex experiments can be performed, still simplified in comparison with simulations using real data.
The refinement and testing of a radiative upper boundary condition (RUBC) for hydrostatic and non-hydrostatic dynamics has started but is not yet completed. An analysis of the RUBC equations was performed in order to assess the impact of the semi-implicit treatment on its efficiency. This step was necessary since the semi-implicit treatment modifies the phase velocity of the fast waves, which need to be radiated out of the model domain. The analysis gave an optimistic answer: there is no theoretical obstacle to applying the RUBC together with a semi-implicit approach. The actual coding has not yet started, however, since it is better to wait for the stabilization of the code of the predictor-corrector (centred iterative) time-stepping, which happens to be useful for the RUBC application.
The formulation of the bottom boundary condition was clearly improved for the semi-Lagrangian advection scheme. Two methods were in fact proposed and successfully implemented, with tests in 2d and 3d models. The problem of a spurious standing wave ("chimney") above the mountain top was carefully investigated, understood and cured. Originally, one more step was considered, namely to explore the possibility of an alternative determination of the surface horizontal wind. This option was dropped from the plan, as being less important than the correct computation of the vertical acceleration at the surface, and likely to introduce further inconsistencies. Taking this into account, we may consider that the major part of the work on the bottom boundary condition is completed.
Two important topics of research were identified during the ongoing work. Both are connected to the stability assessment of the semi-implicit temporal scheme applied to the elastic equations (fully compressible Euler equations). While at the start of the project it was believed that the only way to obtain stabilisation was the application of the centred iterative scheme (also known as the predictor-corrector scheme when only one iteration is applied), which was proposed as the PhD topic for a young-researcher position, it was discovered that there are two other important and complementary ways of reaching better stability.
The first one is a proper choice of model prognostic variables, which can substantially reduce the explicitly treated residuals that may potentially lead to instabilities. It was the first time that this problem was studied in depth for the Euler equations, and two joint publications on this issue were submitted and accepted (Bénard et al., 2004a and 2004b).
The second way to increase stability is an alternative formulation of the linear constant-coefficient solver used to compute the semi-implicit correction. Until now this solver was always derived by applying a linear approximation around a very simplified atmospheric reference state, used for the whole system of equations. It was found, however, that the stability criteria leading to the choice of the linear-system parameters (the reference temperature, in this case) are in opposition for gravity and acoustic waves, so the traditional approach could not be used successfully. Instead, a different reference temperature was chosen in the vertical momentum equation, to fit the stability criteria of acoustic waves. With two distinct reference temperatures the linear system can no longer be seen as a "physically simplified" system, but it provides quite an efficient numerical control of both kinds of waves. To obtain some stability, the system with the traditional linear solver had to be iterated; with proper reference temperatures, iteration is not needed for typical atmospheric conditions and time-step lengths. Nevertheless it has to be stressed that the iterative procedure still enhances stability and allows even longer time steps. A paper on the general approach to the stability analysis of the Euler equations was published (Bénard, 2003), as well as another one on the new approach of deriving a linear system to build a semi-implicit scheme (Bénard, 2004).
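The role of the reference state can be illustrated on the simplest semi-implicit prototype, the scalar oscillation equation dψ/dt = iωψ, where a reference frequency ω* plays the role of the reference temperature: the part iω* is treated implicitly (trapezoidally) and the residual i(ω − ω*) explicitly. The toy equation and all names below are illustrative, not part of the ALADIN code:

```python
def amplification(omega, omega_ref, dt):
    """One semi-implicit step of d(psi)/dt = i*omega*psi:
    the reference part i*omega_ref is treated implicitly (trapezoidal),
    the residual i*(omega - omega_ref) explicitly.
    Returns |A|, the modulus of the amplification factor."""
    num = 1.0 + 1j * dt * (omega - 0.5 * omega_ref)
    den = 1.0 - 0.5j * dt * omega_ref
    return abs(num / den)
```

Here |A| ≤ 1 holds exactly when ω* ≥ ω: the reference value must be chosen "above" the true one. When a single reference temperature cannot satisfy this for both gravity and acoustic waves at once, two distinct values become the natural way out, as described above.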
Finally, research work on implicit centred iterative schemes (the young researcher's PhD topic) was successfully completed. It was clearly shown, through stability analyses confirmed by experiments, that the iterative approach substantially enhances stability when the iterative procedure converges; when convergence exists, the scheme converges to a unique solution. The convergence is conditioned by the linear system and the nonlinear residuals: in the case of very large residuals the scheme may diverge. It is therefore very important to correctly "precondition" the iterative procedure, for example by a proper choice of prognostic variables, suppressing the most unstable residual modes. In addition to the stability properties, it was shown that there is no loss of accuracy compared to the classical semi-implicit scheme; this is valid also for the long (synoptic-scale) waves.
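The convergence condition can be demonstrated on a scalar toy problem dx/dt = (L + N)x, where L stands for the implicitly treated linear part and N for the explicitly treated residual. The sketch below (purely illustrative) iterates a trapezoidal step; the iterates contract towards the fully implicit solution only when the factor |0.5·Δt·N / (1 − 0.5·Δt·L)| is below one:

```python
def iterate_step(x_n, L, N, dt, n_iter):
    """Centred iterative (predictor-corrector) step for dx/dt = (L + N)*x.
    L is solved implicitly at each iteration; the residual N is taken
    from the previous iterate. Returns the list of iterates."""
    iterates = []
    x_k = x_n
    for _ in range(n_iter):
        x_next = (x_n * (1.0 + 0.5 * dt * L)
                  + 0.5 * dt * N * (x_n + x_k)) / (1.0 - 0.5 * dt * L)
        iterates.append(x_next)
        x_k = x_next
    return iterates

def trapezoidal_exact(x_n, L, N, dt):
    """Fixed point of the iteration: the fully implicit trapezoidal step."""
    a = 0.5 * dt * (L + N)
    return x_n * (1.0 + a) / (1.0 - a)
```

For L = -1, N = -0.4, Δt = 1 the contraction factor is about 0.13 and the iterates converge to the trapezoidal solution; for N = -4 it is about 1.33 and successive differences grow, mimicking the divergence seen with large residuals and the benefit of "preconditioning" them away.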
The work on the evaluation of the non-hydrostatic version of the ALADIN model at very high resolution has been only partially performed, since unfortunately little was done on the physics aspects. Down to 5 km resolution the model can be run without problems for both (hydrostatic and non-hydrostatic) dynamics, while at 2.5 km a smoothed orography is necessary on domains with high and complex terrain, in order to keep the model stable. The model's behaviour was studied through the performance of the physical fluxes, which tend to become more and more unrealistic as the resolution increases. A comparison between the performances of "linear" and "quadratic" spectral truncations at high resolution was carried out. As expected, the "linear" option seems a proper choice for running the model at resolutions below 10 km. However it requires a modification of the model's representation of orography, which has to be smoothed, keeping it at a lower spectral resolution than the model variables. In any case this smoothing is not detrimental to the forecast performance.
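The difference between the two truncations boils down to how many wavenumbers an npts-point transform grid can carry without aliasing: roughly (npts − 1)/3 when quadratic products must remain alias-free, and (npts − 1)/2 when only linear terms are protected. A minimal 1d sketch (simplified from the actual 2d spherical/biperiodic setting):

```python
def max_wavenumber(npts, truncation):
    """Largest alias-free wavenumber on an npts-point transform grid.
    'quadratic' keeps products of two fields alias-free (npts >= 3N+1);
    'linear' only protects linear terms (npts >= 2N+1)."""
    if truncation == "linear":
        return (npts - 1) // 2
    if truncation == "quadratic":
        return (npts - 1) // 3
    raise ValueError("unknown truncation: " + truncation)

def shortest_wavelength(npts, dx, truncation):
    """Shortest resolved wavelength: domain length (npts*dx) over max k."""
    return npts * dx / max_wavenumber(npts, truncation)
```

On a given grid the linear option thus resolves markedly shorter waves, which is precisely why it calls for a smoother orography kept at a lower spectral truncation than the model variables.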
Non-hydrostatic runs show a tendency to reduce the peak amounts of precipitation and to generally smooth the precipitation field. At 2.5 km resolution the expected differences in the fields are more pronounced (more extended spatial features and lower peak values).
In parallel to these actions concerning real atmospheric flows, the so-called ALPIA framework was developed and used for testing the robustness of the non-hydrostatic developments. It consists of imposing a regular flow (a single wind speed, piecewise-constant static stability along the vertical, geostrophic and equivalent-barotropic flow) on a realistic orography (that of the French Alps in our case). No analytical solution is known for this semi-academic type of experiment, but it creates the diversity of situations typical of full 3d test cases within an environment where only a few parameters need to be tuned, depending on the type of study one is aiming at. With this new tool it could be shown that the efficiency advantage of the semi-Lagrangian time scheme, albeit diminishing with increasing resolution, does not disappear before kilometric scales, at least. It was also shown that this scheme is more precise than its Eulerian counterpart at equal time-step length, thanks to a better accuracy of the vertical advection calculation.
We can conclude that recent developments in dynamics make it possible to run limited-area models at resolutions down to 2.5 km in spectral, semi-implicit, semi-Lagrangian compressible mode. However, adapted physical parametrisations still have to be developed to match the performance of the dynamical part.
The topic of additional damping of orographic resonance in semi-Lagrangian advection was postponed, since it was becoming of lesser priority thanks to the progress mentioned under Item 1. Similarly, the search for improved damping properties of a decentred semi-implicit semi-Lagrangian advection scheme was reoriented. Until recently, decentring was indeed quite a popular method to cure various numerical problems, such as orographic resonance, weak numerical instabilities and noise (whatever the origin of the noise). On the other hand it is well known that any decentring mechanism leads to an unwanted damping of the numerical solution. Some effort was therefore made to develop gentler decentring methods, for example with second-order accuracy. At the same time an even larger effort was made to improve the numerics so as not to use any decentring at all. This is the reason why the topic was finally put aside, while the work on noise control was reoriented towards a new approach to horizontal diffusion, known as SLHD (Semi-Lagrangian Horizontal Diffusion).
A core version of this SLHD scheme had been developed back in 1999; at that stage, however, it was applied only to academic conditions and only to the hydrostatic primitive equations. The scheme simulates the process of kinetic-energy dissipation at the model's unresolved scales. At the same time it makes it possible to control the noise generated by numerical artefacts in the treatment of the shortest waves of the model spectra. Contrary to the traditionally used linear spectral horizontal diffusion (in spectral models, of course), it is a nonlinear scale-selective method, depending locally on the flow deformation, and it exploits the damping properties of semi-Lagrangian interpolators. This study is described in a (non-ALATNET) Czech PhD thesis defended on 30/9/2003 (F. Vana).
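The principle can be sketched in one dimension: blend an accurate (cubic) semi-Lagrangian interpolator with a diffusive (linear) one, with a weight that grows with the local flow deformation. The weight law and the constant `kappa` below are hypothetical placeholders, not the tuned ALADIN formulation:

```python
def cubic_lagrange(y, x):
    """Cubic Lagrange interpolation through points y[0..3] at
    integer abscissae 0..3, evaluated at x (normally 1 <= x <= 2)."""
    xs = (0.0, 1.0, 2.0, 3.0)
    total = 0.0
    for i in range(4):
        term = y[i]
        for j in range(4):
            if j != i:
                term *= (x - xs[j]) / (xs[i] - xs[j])
        total += term
    return total

def linear_interp(y, x):
    """Linear interpolation between y[1] and y[2] for 1 <= x <= 2."""
    return y[1] + (x - 1.0) * (y[2] - y[1])

def slhd_interp(y, x, deformation, kappa=1.0):
    """SLHD-style blend: the larger the local flow deformation,
    the more weight goes to the damping (linear) interpolator.
    The weight law is an illustrative choice."""
    w = kappa * deformation / (1.0 + kappa * deformation)
    return (1.0 - w) * cubic_lagrange(y, x) + w * linear_interp(y, x)
```

In calm flow (deformation near zero) the scheme reduces to the accurate interpolator; in strongly deformed flow it approaches the linear one, whose interpolation error then acts as a local, scale-selective diffusion.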
Within the ALATNET effort, SLHD was extended to the non-hydrostatic Euler equations and also to the geometry of the global variable mesh. The SLHD scheme was then applied to real atmospheric conditions in a couple of case studies and proved to have the desired properties: it either improved forecasts or had a neutral impact. A lot of work was still devoted to optimal tuning and to verifying whether the tuning can be sufficiently general, and thus independent of the grid size and time step of the model. The dependency of the tuning on spectral resolution was addressed too, and forced a revision of some of the choices made previously. Finally, the scheme was optimised with respect to the diagnosis of the flow deformation, which is the main driving component of SLHD.
This issue was, temporarily at least, brought to an end during the ALATNET period. The required modifications, following first a proposal of White and Bromley, were coded and validated both in ARPEGE and in ALADIN. Sensitivity studies with ARPEGE, i.e. at the global scale and with a rather low resolution, showed, as expected, hardly any impact on global budgets. A more noticeable impact was expected in tropical regions and at higher resolutions, but ALADIN tests in such regions could not really confirm an improvement, because of the relatively poor quality of the initial and lateral boundary conditions.
In the meantime, a new approach was proposed by Staniforth and Wood (2003), fully consistent with the ALADIN NH dynamics, and easier to implement. The work is likely to restart from the very beginning, but only once all present developments in NH dynamics enter the main library.
The problem of the coupling updates for the lateral boundary conditions was motivated by the failure of several operational coupling choices for the famous French Christmas storm (1999). This was due to a temporal interpolation of the coupling data over too-long time intervals (when compared to the model resolution and the time scale of the storm), during which the storm had time to cross the area where the relaxation towards the coupling fields is applied.
During the first part of this work, the aim was to find improved time-interpolation schemes. It was finally shown that this does not make much sense. Instead, a strategy for monitoring the transfer of the coupling data was designed, with two methods proposed successively. The first one was based on a correction of the linear interpolation method. The second one relies on a high-pass recursive digital filter applied along the time integration of the coupling model, providing local information on the time variations of the fields. The second method (Termonia, 2004) turned out to be the most attractive, the most secure and the most practical one, and will be used for the future dissemination of the coupling data to the ALADIN partners (applying it only to surface pressure, a single field necessary but also sufficient to provide the required information and let the local administrators take the appropriate decisions).
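The monitoring idea can be sketched with the simplest first-order recursive high-pass filter: applied to the surface-pressure series of the coupling model, its output stays small for slow evolutions but spikes when the field varies faster than the coupling-update interval can represent. The filter form, coefficient and threshold below are illustrative, not the operational choices:

```python
def highpass(series, alpha=0.9):
    """First-order recursive high-pass filter:
    y[n] = alpha * (y[n-1] + x[n] - x[n-1])."""
    y = [0.0]
    for n in range(1, len(series)):
        y.append(alpha * (y[-1] + series[n] - series[n - 1]))
    return y

def rapid_variation_alarm(series, threshold, alpha=0.9):
    """True when the filtered signal exceeds the threshold somewhere."""
    return max(abs(v) for v in highpass(series, alpha)) > threshold
```

A slow pressure trend passes almost unnoticed, while a storm-like sudden drop triggers the alarm, so that the recipients of the coupling data can take the appropriate decisions.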
The study of the intrinsic formulation of lateral boundary conditions (LBCs) started from the work of McDonald (with the HIRLAM model) on transparent and well-posed LBCs. His tests were done in a grid-point model. Similar experiments in a spectral model showed that extra attention must be paid to the so-called extension zone. Tests in a one-dimensional tool version of ALADIN showed that the fields must be made periodic at all time steps, contrary to what used to be done with the classical Davies scheme, where this is necessary only at coupling updates. At this point the issue of the biperiodicization has been reopened: an algorithm cheaper than the one currently existing in ALADIN should be found in order to move towards a more sophisticated lateral boundary forcing, even if many other problems must also be solved.
A spectral coupling method was developed and extensively tested on cases where the classical Davies coupling had failed (e.g. the Christmas storms of December 1999, see above). Spectral coupling was also added to the 1d model designed for such studies, which was used as a testbed for studying the optimal setup of spectral coupling (coupling frequency, scale separation and appropriate weights). The main strengths and weaknesses of the spectral coupling approach were identified in real 3d simulations, where its behaviour was evaluated in various weather regimes (advective, frontal and stationary weather patterns). It was established that the procedure is not detrimental to the forecasts in any of those cases, and it solves the problems encountered during the 1999 Christmas storm. An operational application to extreme situations may now be considered.
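A minimal 1d sketch of the idea: replace the lowest wavenumbers of the LAM spectrum by those of the driving model, leaving the LAM's small scales untouched. The sharp cutoff below stands in for the smooth scale-separation weights that were actually the subject of the tuning study:

```python
import numpy as np

def spectral_couple(lam, host, k_cut):
    """Combine host-model large scales (wavenumbers <= k_cut) with the
    LAM's small scales (wavenumbers > k_cut) via a 1d FFT."""
    f_lam = np.fft.rfft(lam)
    f_host = np.fft.rfft(host)
    f_mix = f_lam.copy()
    f_mix[:k_cut + 1] = f_host[:k_cut + 1]
    return np.fft.irfft(f_mix, n=len(lam))
```

The coupling frequency, the cutoff scale and the shape of the transition weights are exactly the free parameters whose optimal setup was studied with the 1d model.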
The problems of jumps in resolution and of domain size were examined mainly through regional reanalyses at current operational ALADIN resolutions (around 10 km) from corresponding low-resolution global datasets. An optimal coupling chain for regional and local reanalyses was determined objectively on some MAP SOP periods. The main goal of this study was to estimate the amount of information that is "hidden" in low-resolution global reanalysis datasets and could be "revealed" by the use of a mesoscale LAM.
Apart from this welcome extension of the topic to non-NWP downscaling problems, a similar approach was applied to study the optimal coupling strategy for high-resolution non-hydrostatic numerical forecasts. The influence of intermediate nesting steps in the modelling chain from operational to high-resolution NH models was compared to the impact of changes in model dynamics and physics relevant at high resolution.
The task of blending fields in data assimilation so as to preserve high-resolution forecast details was successfully completed. A blending method for both upper-air and surface model variables was proposed and developed. An implicit approach to blending upper-air fields was chosen at first, relying on the spatio-temporal filtering properties of the digital-filter technique. This concrete method is thus called dfi (digital filter initialization) blending and, after its validation, was successfully implemented in operational forecasting. The same tool was used for further research, in which blended fields enter the mesoscale 3d variational analysis as first guess. This approach currently provides the best results among the algorithms tried (Brozkova et al., 2001). However, the idea of blending is now being translated into the traditional variational formalism, where it would be represented as a special component of the total cost function: instead of blending the fields, a penalty on the distance between the large-scale and mesoscale analyses would be introduced, but only for the long-wave part of the spectra.
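The incremental principle behind such blending can be written as blended = mesoscale + F(large-scale) − F(mesoscale), where F is a low-pass operator: the slow/large scales come from the driving analysis while the mesoscale details of the guess are preserved. In the real scheme F involves a digital-filter initialization applied at low spectral truncation; the sketch below stands in with non-recursive, Lanczos-windowed low-pass weights applied to a short time sequence of states, purely for illustration:

```python
import numpy as np

def dfi_weights(half_span, cutoff_period, dt):
    """Lanczos-windowed sinc weights for a non-recursive low-pass
    (digital) filter over 2*half_span+1 states, normalized to sum 1."""
    k = np.arange(-half_span, half_span + 1, dtype=float)
    theta_c = 2.0 * np.pi * dt / cutoff_period
    h = np.empty_like(k)
    nz = k != 0.0
    h[nz] = np.sin(k[nz] * theta_c) / (k[nz] * np.pi)
    h[~nz] = theta_c / np.pi
    w = h * np.sinc(k / (half_span + 1))   # Lanczos window
    return w / w.sum()

def blend(hr_states, ls_states, w):
    """Incremental blending of two time sequences of states (arrays of
    shape (nsteps, npoints)): keep the high-resolution central state,
    but replace its filtered (slow) part by the driving model's."""
    centre = len(w) // 2
    return hr_states[centre] + w @ ls_states - w @ hr_states
```

For stationary states the result is simply the driving field, while fast oscillations of the high-resolution sequence survive the blending because the filter removes them from the subtracted term only.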
Besides dfi-blending, an alternative was developed relying on the simplest direct combination of spectra, called explicit blending. When comparing its performance to dfi-blending, it became clear that some additional filtering of gravity waves has to be applied, so that no real simplification of the procedure could reasonably be achieved.
The topic of tendency coupling for surface pressure was suspended near the end of the project because of the disappointingly neutral results obtained so far. First, problems were identified regarding the stability of the temporal scheme in such a case. Second, no real benefit was noticed in comparison with the reference scheme.
The problem of the coupling strategy to be used in variational data assimilation was partly addressed when developing the blending technique (cf. supra). The general issue is to build a scale-incremental algorithm, where the large-scale features are not substantially reanalyzed in the mesoscale model but rather provide the basis for adding smaller-scale information. This also concerns the estimation of the background-error statistics, from which the influence of the large-scale forcing should be optimally removed. This led to the construction of so-called "lagged" background-error statistics for limited-area models, generated at unchanged lateral-boundary forcing. These background-error sets were evaluated and applied in 3d variational analysis in combination with blending. However, the optimal strategy for estimating forecast errors, and the associated choices for coupling, remain open.
A first topic concerns the evaluation of different strategies for the numerical treatment of diabatic terms: where should the physical diabatic forcing be applied along the semi-Lagrangian trajectory, and when inside the time step? Initially oriented towards the classical type of parametrizations used in ALADIN, the topic did not progress well. It was recently restarted with emphasis on the increased variety of time and space scales that physics at high resolution is supposed to treat (slow and rapid components of moist microphysics, 2d and 3d turbulence).
A second topic is the specific treatment of the diabatic terms in a non-hydrostatic model: how should the heating/cooling impact of diabatic changes be partitioned between temperature and pressure evolutions? This topic started late with respect to the planning, but some first results are encouraging regarding the possibility of applying the compressible diabatic equations without any "hydrostatic-like" approximations. However, this will have to be implemented in a framework influenced by the outcome of the study of the previous topic, which may create additional practical constraints.
A third topic was introduced into the programme during the ALATNET lifetime: a new type of diagnostic for discovering "sleeping" problems of nonlinear instability and/or stiffness. While an over-implicit treatment of "heat-equation-like" terms is known to be a good cure for some stiffness problems, it was decided to use an under-implicit treatment in order to reveal potential, not yet detected, improper formulations. Similarly, the twin problem (in space rather than in time) of nonlinear instability along the vertical direction was attacked by a similar trick. After being tested on various parametrization schemes likely to be prone to such problems (a long-standing bug was discovered and corrected on this occasion), this double diagnostic tool was applied to the newest state of the operational physics package, whereupon the culprit for another long-standing problem was found and corrected. The tool should become a permanent part of the validation procedures for physical developments.
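The trick is easy to see on the prototype "heat-equation-like" term dx/dt = -λx discretized with an implicitness factor β (β = 1/2 is the centred scheme; the diagnostic deliberately lowers β). A minimal illustrative sketch:

```python
def amp_factor(lam_dt, beta):
    """Amplification factor of dx/dt = -lam*x discretized as
    x(n+1) = x(n) - lam*dt*(beta*x(n+1) + (1-beta)*x(n)),
    as a function of the stiffness lam*dt and implicitness beta."""
    return (1.0 - (1.0 - beta) * lam_dt) / (1.0 + beta * lam_dt)
```

With a stiff term (λΔt = 10) the over-implicit choice β = 0.7 damps safely (|A| = 0.25), whereas the under-implicit β = 0.3 amplifies (|A| = 1.5): formulations that are only marginally stable under the usual scheme blow up visibly, which is exactly what makes the diagnostic useful for exposing hidden stiffness.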
Parameterization of the small-scale features of convection
The diagnostic deep-convection scheme of ALADIN was thoroughly examined in order to assess its present weaknesses and limitations on the way to higher resolution. A prognostic scheme for updraught and downdraught activity was developed and tested, producing some enhancement of the dynamical behaviour, in particular auto-extinction as the horizontal resolution increases, i.e. when more precipitating systems are resolved. This work is described in a (non-ALATNET) Belgian PhD thesis defended on 31/8/2001 (Luc Gérard).
The project was continued by including a prognostic microphysical scheme and concentrating the effort on the integration of the different sources of precipitation and cloudiness (subgrid versus resolved), in order to address such problems as significant convective mesh fractions, the independent life of up- and downdraughts, the generation and detrainment of cloud condensate by the subgrid scheme, revised energetic calculations (avoiding double counting of precipitation), and the difficulties of the "grey-zone" resolutions, where convective systems are partially resolved by the model grid but still need to be partly parametrized. The 3d tests are now ongoing, and two articles are in preparation.
Improved representation of the planetary boundary layer
A first study (transferred from Item 9, in fact) aimed at testing how the turbulent kinetic energy (TKE) can be used, either as a diagnostic or as a prognostic variable, in order to solve the closure problem in atmospheric turbulence. An extended investigation was made of the way the traditional exchange coefficients can be parametrized in terms of TKE and of stability. A numerical integration scheme for solving the prognostic second-order nonlinear differential equation for TKE was also proposed. However, the time-step length presently needed to guarantee stability in the integration of the TKE equation is still too short to envisage any operational application.
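The classical form of such a closure expresses the exchange coefficient as K = c · l · √e · f(Ri), with e the TKE, l a mixing length and f a stability function of the Richardson number. The constants and the particular f below are common textbook choices, not the formulations investigated in the study:

```python
import math

def exchange_coefficient(tke, mixing_length, richardson, c_k=0.5):
    """Turbulent exchange coefficient K = c_k * l * sqrt(e) * f(Ri),
    with an illustrative stability function f."""
    if richardson >= 0.0:                      # stable: mixing reduced
        f = 1.0 / (1.0 + 5.0 * richardson)
    else:                                      # unstable: mixing enhanced
        f = math.sqrt(1.0 - 16.0 * richardson)
    return c_k * mixing_length * math.sqrt(tke) * f
```

Mixing thus strengthens with the available turbulent energy and weakens with static stability, which is the behaviour the TKE-based exchange coefficients are meant to capture.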
A second study, on the cyclogenetic impact of parametrization schemes in the planetary boundary layer, initially aimed at understanding the role of stability-controlled vertical turbulent exchanges in violent cyclone deepening. It was reoriented in two directions: that of so-called symmetric instability, and that of a more physically based definition of the mixing lengths and stability functions used to compute the (purely diagnostic, in that case) exchange coefficients. The results were extensively compared with the current ALADIN computations of momentum and heat fluxes (which indirectly benefited from this study through several improvements), but the complexity of the issue and its interaction with cloud and radiative aspects make the operational outcome very hard to foresee, especially given the opposite perspective of the first approach to the same problem mentioned above.
It is difficult to find a transversal way of reporting on this Item, given the many topics that were addressed and the interactions between the different developments. We shall therefore select here only those topics which show an unquestionably positive impact and amount to more than a retuning, however sophisticated, of a previously prevailing situation.
"Ensembling" entrainment
This was the most important of a long series of changes to the diagnostic deep-convection scheme. The idea behind this development is that, when a cumulonimbus cloud population with various entrainment rates reaches higher and higher levels, only those clouds that have experienced less entrainment of cold air below keep enough buoyancy to "survive". But they are then warmer than the "average cloud" characteristics used for the parametrized computation, which means that the latter can see its moist potential temperature increase even though every individual cloud experiences a decrease. This paradoxical behaviour is indeed observed in nature and diagnosed in very high-resolution explicit cloud simulations, and it was thus included in the scheme used in ALADIN and in its global ARPEGE counterpart.
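A toy plume population illustrates the paradox. Let each plume's buoyancy excess decay with height as exp(-εz) for its own entrainment rate ε, and let a plume "survive" while its excess stays above a threshold; all numbers are illustrative, not the ALADIN formulation:

```python
import math

def plume_excess(entrainment, z):
    """Buoyancy excess of one plume at height z (toy dilution law)."""
    return math.exp(-entrainment * z)

def surviving_mean(entrainment_rates, z, threshold=0.2):
    """Mean excess over the plumes still buoyant enough at height z."""
    values = [plume_excess(e, z) for e in entrainment_rates]
    alive = [v for v in values if v > threshold]
    return sum(alive) / len(alive) if alive else 0.0
```

With rates (0.1, 1, 3), every plume weakens monotonically with height, yet the mean over survivors rises between z = 1 and z = 2 because the strongly entraining plumes have dropped out: precisely the "average cloud warms while every cloud cools" behaviour built into the scheme.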
New options developed for the radiative computations
Following comparison work with expensive radiation computations, it was decided to retest the option of adding a more sophisticated treatment of the "exchange with surface" term of the thermal radiation budget, as already done for the "cooling to space" term. This development had been prepared back in 1996 but was then unfortunately deemed too expensive for its "low impact". The point is that the impact is indeed low in stand-alone mode, but its cumulative character slowly yet dramatically changes not only the thermal structure near the surface but also, indirectly, the whole of the model's upper-air circulation. And this change is for the better, whatever parameter one looks at, as shown by the more sophisticated tests redone in 2002. This success prompted a further search for a new solution to the problem of balancing the complexity of the radiation computations (for the sake of accuracy) against the frequency of their calls (for a better interaction with cloud processes). The idea is to also treat exactly the exchange terms between adjacent layers, and to interpolate the remaining smaller terms between "easy to compute" extremes, without recalculating the interpolation weights at each time step. The design and coding work is nearly completed, and some preliminary tests with a simplified option for the application (statistical computation of the quantities that are later supposed to be stored in between more complex computations) have shown very good potential. Work on the target application has, however, just started.
Rewriting the diagnostic cloudiness scheme (for radiation computations)
The spirit of the old operational scheme was kept (diagnosis of the condensate amount first, empirically taking three types of clouds into account, and cloudiness computation afterwards) but (i) the various water contents are now combined before a single diagnosis of cloudiness and (ii) the formula used is more realistic. The resulting cloudiness is more sensible in its main features (especially more low-level clouds), even if sometimes much more "0/1" locally. This helps maintain a very reasonable capacity to simulate the intense storms of Christmas 1998 and Christmas 1999.
Shallow convection
A far more stable numerical algorithm, for an unchanged analytical formulation, was introduced into the shallow-convection parametrization. The impact of this algorithmic change is quite important: disappearance of spurious oscillations for shallow convection (of course) but also for deep convection; markedly less noisy vertical profiles of temperature and humidity in the planetary boundary layer; reduction of the moist bias of the free atmosphere and of the dry bias underneath; and a contribution to the reduction of exaggerated cyclogenetic tendencies.
Addition of a prognostic albedo for snow
Snow albedo was introduced as a prognostic variable in the operational scheme, with a "fresh" value of 0.85 and a decreasing trend with age down to 0.65. Furthermore, the vegetation cover (canopy) was taken into account in order to obtain a more realistic total albedo. All this led to a reduction of the cold and warm biases for the surface and the lower boundary layer in snow-covered areas. Besides, a diagnostic of snow density, previously assumed constant, was introduced. This work is linked to the ever-ongoing work on the databases of soil and vegetation characteristics, which has been one of the background ALATNET tasks throughout the four years of the network, for all types of concerned parametrization schemes.
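A minimal sketch of such a scheme, using the 0.85 and 0.65 bounds quoted above; the ageing rate and the linear canopy masking are hypothetical placeholders, not the operational formulation:

```python
def update_snow_albedo(albedo, days, fresh_snow,
                       a_fresh=0.85, a_min=0.65, ageing_per_day=0.01):
    """Prognostic snow albedo: reset to the fresh value on snowfall,
    otherwise decrease with age down to the floor a_min."""
    if fresh_snow:
        return a_fresh
    return max(a_min, albedo - ageing_per_day * days)

def total_albedo(snow_albedo, vegetation_albedo, vegetation_cover):
    """Grid-box albedo with the canopy partially masking the snow
    (illustrative linear mix)."""
    return (vegetation_cover * vegetation_albedo
            + (1.0 - vegetation_cover) * snow_albedo)
```

Ageing snow thus darkens towards the floor value, fresh snowfall resets it, and a dense canopy keeps the grid-box albedo closer to the vegetation value, which is what reduces the biases in snow-covered areas.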
Mountain subgrid effects
The drag part was revised following the most recent findings reported in the literature, and the lift effect was made orthogonal to an estimate of the geostrophic wind rather than to the wind itself. Furthermore, a new set of diagnostics based on the ALPIA framework (see Item 2) was prepared and extensively used to steer the tuning of the five or so relevant parameters of the new version. The work has now reached the state where the so-called "envelope orography" artefact (a proxy for some missing low-level parametrized effects) can be removed with equal quality scores, while the flow in the mountains' vicinity is better described, with positive consequences for the rainfall forecast and probably for other parameters.
This topic is better understood as "use of classical observations at the mesoscale and use of new types of observations", the words "classical" and "new" referring to whether or not the observations are already used in global assimilation systems. The work accomplished within ALATNET thus splits into two streams:
Adaptation of classical observation types to a LAM assimilation.
Most centres which have a local 3d-var application have started to assess the potential benefits of the classical sources of observations, namely:
- SYNOP: mean-sea-level pressure, but also 2 m relative humidity and 10 m wind;
- TEMP: using the same specifications as in a global model, but with pioneering work on the drift of radiosondes during their ascent, significant for a mesoscale model;
- Aircraft data: retuning of thinning parameters, in order to retain more 3d information than in a global lower-resolution system;
- Radiances: adaptation of preprocessing tools and bias corrections to the local systems, with first experiments in Budapest (see part B.1 for more details about the previous points).
Some more original work has been undertaken to prepare the assimilation of IASI/AIRS data: creation of reliable, good-quality emissivity databases, and estimation of background-error standard deviations for AIRS channels. This work should lead to the completion of an ALATNET PhD and will be continued in coordination with ECMWF.
The work required to adapt the observation-management tools to limited domains should also be mentioned.
Introduction of new observation systems
One effort concerns the use of Meteosat Second Generation radiance data (SEVIRI instrument). A chain of tools to receive BUFR-type data and convert them into a format compliant with our assimilation interfaces was prepared first. Furthermore, a selection of the relevant SEVIRI channels (infrared and water vapour) was performed, in order to retain the best set of uncorrelated, low-error channels. Estimates of background-error standard deviations, to be used in the data quality-control and screening steps, were also produced. First case studies indicate that the MSG/SEVIRI data should be a valuable source of information for cloud-free areas, allowing a better description of temperature and humidity profiles. It must be mentioned that this study was mostly supported by a joint Météo-France/ALCATEL contract.
The introduction of vertical profiles of relative humidity, derived from METEOSAT imagery and cloud classification techniques, into the ALADIN 3d-var analysis is also in progress. During the period of the ALATNET contract, specific datasets have been made available, and the impact of these data on specific case studies has been assessed. The pseudo-profiles could be an interesting complementary source of observations (to radiance data) in cloudy areas, but the method, which demands a complex production chain, is technically difficult to implement on a routine basis. Also, an exact determination of the associated error statistics is probably impossible. However, work will continue.
Very preliminary work on the definition of radar reflectivity assimilation started during the ALATNET contract period. This work will really take off during the follow-on period, where it will be driven by the need to add fine-scale sources of information for obtaining a meaningful initial state for the models that are now in sight. So far, the goal is to be able to run a "model-to-reflectivity" facility, along with a first version of screening of the data, by the end of 2004. Later, 1d-var retrieval tools will be developed in order to convert reflectivity information into pseudo-profiles of temperature and specific humidity.
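A "model-to-reflectivity" facility maps model fields to radar-equivalent quantities so that observed and simulated reflectivities can be compared and screened. A minimal sketch, assuming the classical Marshall-Palmer Z-R relation (Z = a R^b with a = 200, b = 1.6) as the forward operator and a simple first-guess departure check; the operational operator and screening thresholds are not specified in the report.

```python
import numpy as np

def rain_to_dbz(rain_mmh, a=200.0, b=1.6):
    """Map a model rain rate (mm/h) to radar reflectivity (dBZ)
    via the Marshall-Palmer Z-R relation Z = a * R**b."""
    z = a * np.maximum(rain_mmh, 1e-6) ** b
    return 10.0 * np.log10(z)

def screen(obs_dbz, model_dbz, max_dep=15.0):
    """First-guess check sketch: reject observations whose departure
    from the model equivalent exceeds max_dep dBZ."""
    return np.abs(obs_dbz - model_dbz) <= max_dep

model_rain = np.array([1.0, 10.0])        # mm/h
fg = rain_to_dbz(model_rain)
print(np.round(fg, 1))                    # ~[23. 39.]
obs = np.array([25.0, 70.0])
print(screen(obs, fg))                    # [ True False]
```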
A rather large amount of activity was dedicated to this topic, with a significant push from ALATNET efforts. The difficulty here is to give a complete and harmonious picture of a topic that leads naturally to many local choices and adaptations. We will therefore divide the results into three categories: background error statistics, assimilation cycling and variational tools.
Background error statistics:
As an alternative to the classical NMC method, the so-called "lagged NMC" technique has been applied using constant lateral boundary conditions. This choice implicitly produces more mesoscale-selective error statistics, since the so-called B matrix then acts as a band-pass filter rather than as a low-pass filter. As a consequence, the 3d-var analysis possesses shorter horizontal (and for some predictors also vertical) correlation length-scales, and the associated analysis increments better suit the objectives of mesoscale data assimilation (Siroka et al., 2003).
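The NMC family of methods estimates B from differences between pairs of forecasts valid at the same time (in the lagged variant, with the lateral boundary conditions held constant). A minimal sketch of the sampling step, on a toy state vector; the halving factor and the forecast ranges are the usual textbook assumptions, not values taken from the report.

```python
import numpy as np

def nmc_b_estimate(fc_long, fc_short):
    """NMC-type background-error covariance estimate: sample
    covariance of differences between pairs of forecasts valid
    at the same time (e.g. 36 h minus 12 h ranges).
    fc_long, fc_short: (n_cases, n_state) arrays."""
    d = fc_long - fc_short
    d = d - d.mean(axis=0)
    # Conventional factor 0.5: the difference of two forecasts
    # carries roughly twice a single forecast-error variance.
    return 0.5 * (d.T @ d) / (d.shape[0] - 1)

rng = np.random.default_rng(0)
n_cases, n_state = 200, 4
fc_short = rng.normal(size=(n_cases, n_state))
fc_long = fc_short + rng.normal(scale=0.5, size=(n_cases, n_state))
B = nmc_b_estimate(fc_long, fc_short)
print(B.shape)  # (4, 4)
```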
In the frame of an ALATNET PhD study, the so-called "ensemble" method, an alternative to the NMC sampling, has been successfully implemented, in the framework of global models first. This method requires an already existing local assimilation suite, in order to run successions of analyses and forecasts which differ by their perturbed observation sets. Under proper assumptions, the full set of analyses and forecasts should converge to the desired estimates of analysis and forecast error statistics (respectively, the A and B matrices in the usual assimilation jargon). So far, the ensemble B matrix has become the operational B matrix in the Météo-France ARPEGE global system, and the evaluation of its ALADIN counterpart has started. The ALADIN ensemble statistics have also been used to study the ARPEGE analysis error and the correlation between ARPEGE analysis and ALADIN forecast errors.
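The ensemble method described above can be illustrated on a toy scalar system: several analysis/forecast chains are cycled in parallel, differing only by their perturbed observation sets, and the background-error variance is estimated from the ensemble spread. The gain, damping factor and member counts below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n_members, n_cycles = 50, 30
sigma_o, k_gain = 1.0, 0.4

# Each member cycles its own analysis/forecast chain, differing
# only by its perturbed observation set (toy scalar model).
xb = np.zeros(n_members)
for _ in range(n_cycles):
    y = rng.normal(scale=sigma_o, size=n_members)  # perturbed obs
    xa = xb + k_gain * (y - xb)                    # analysis step
    xb = 0.9 * xa                                  # "forecast" step

# Background-error variance estimated from the ensemble spread.
b_var = xb.var(ddof=1)
print(round(float(b_var), 2))
```

Under these assumptions the spread converges to a stationary value (about 0.18 analytically), which is the quantity the real system reads off as the B-matrix variance.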
Despite the many efforts and algorithmic tricks, one can never state with certainty that the direct estimates obtained from any sampling method properly reflect the true background-error variances. Yet the total background-error variances, and the variances of the main quantities to be analyzed (wind, temperature, humidity), are the most important parameters that tell the assimilation how to balance the background first-guess information against the observations. Thus, two methods for tuning the "B" variances were developed:
- a purely technical development of the classical Lönnberg-Hollingsworth method (binning and fitting of innovation statistics in observation space). The code was developed in coordination between the Budapest and Prague centres. It should now be used to assess the relevance of B-matrix variances in the local versions of 3d-var;
- a posteriori evaluation and retuning of the total B variance, based on original proposals by Talagrand, Desroziers and Chapnik. This work should lead to the defence of a non-ALATNET PhD in Toulouse in 2004.
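The Lönnberg-Hollingsworth idea behind the first method can be sketched in a few lines: innovation covariances binned by station separation are fitted with a spatial correlation model and extrapolated to zero separation, which splits the total innovation variance into a (spatially correlated) background part and a (spatially uncorrelated) observation part. The exponential correlation model and the fitting-by-log-regression shortcut below are simplifying assumptions, not the actual code.

```python
import numpy as np

def lh_fit(dist, cov, sill_at_zero):
    """Lönnberg-Hollingsworth-type separation sketch: fit
    cov(r) ~ vb * exp(-r / L) to binned innovation covariances at
    nonzero separation, extrapolate to r = 0 to get the
    background-error variance vb; the remainder of the total
    innovation variance (sill_at_zero) is the observation-error
    variance (assumed spatially uncorrelated)."""
    # Linear fit of log(cov) = log(vb) - r / L.
    slope, intercept = np.polyfit(dist, np.log(cov), 1)
    vb = float(np.exp(intercept))
    length = -1.0 / slope
    vo = sill_at_zero - vb
    return vb, vo, length

# Synthetic binned statistics: vb = 2, L = 300 km, vo = 1.
r = np.array([100.0, 200.0, 300.0, 400.0])
c = 2.0 * np.exp(-r / 300.0)
vb, vo, L = lh_fit(r, c, sill_at_zero=3.0)
print(round(vb, 2), round(vo, 2), round(L, 0))  # 2.0 1.0 300.0
```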
The structure functions tell the analysis how an observation impulse, which is primarily a Dirac input, should be spread throughout the regular gridpoint lattice of the analysis. Little work has been devoted to developments that could rapidly change the shape of the present B matrices. Thus, the ALADIN 3d-var structure functions still rely heavily on work done in collaboration with HIRLAM on the development of bi-Fourier spectral correlation functions, under assumptions of homogeneity and isotropy (Berre et al., 2002).
However, the problem of wrap-around increments in the analysis (the analysis problem is solved on a torus, under bi-periodic conditions) might lead to some evolution in the future. First attempts to localize the correlation functions (compact support) have already been tried, but without success. Other trials, to be pursued, aimed at partially relaxing the assumption of isotropy by also taking into account correlations between neighbouring large-scale wavenumbers.
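The wrap-around problem can be made concrete on a toy 1d periodic domain: a homogeneous correlation function built from a spectral variance profile is, by construction, a function of the lag on the torus, so points near opposite boundaries remain correlated. The Gaussian-shaped spectrum below is an arbitrary illustrative choice, not the ALADIN spectrum.

```python
import numpy as np

n = 64                               # periodic 1d grid
k = np.fft.fftfreq(n, d=1.0) * n     # integer wavenumbers
# Homogeneous correlation model from a spectral variance profile
# (Gaussian-shaped spectrum => smooth gridpoint correlation).
spec = np.exp(-0.5 * (k / 4.0) ** 2)
corr = np.real(np.fft.ifft(spec))
corr /= corr[0]                      # normalize to 1 at zero lag

# Wrap-around: on the torus, lag n-1 equals lag 1, so points near
# opposite boundaries remain correlated.
print(np.isclose(corr[1], corr[n - 1]))  # True
```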
More original work concerns the use of wavelet functions to model the horizontal autocorrelations. These would then be associated with the classical bi-Fourier functions (for cross-correlations) and the gridpoint error variances. This part of the work is a joint effort of the Toulouse and Brussels centres (Deckmyn and Berre, 2004).
Assimilation cycling:
A number of experiments were performed, mostly in Prague, in order to evaluate "simple" limited-area assimilation cycles (where 3d-var and forecasts follow each other) and more complex systems where DFI blending (see Item 6) is added (the so-called blend-var or var-blend cycles). The results show a clear benefit for mixed systems such as blend-var, in terms of analysis increments or scores. The potential benefit of a blend-var system was also stressed by a specific case study made in Toulouse on a well-documented MAP situation. Presumably, the advantage of blend-var is that the latest global-model analysis is also introduced into the limited-area assimilation system, so that the overall limited-area assimilation benefits from the latest updates, and the final analysis is more in accordance with the fresh lateral boundary data.
However, the blend-var technique has at least two major drawbacks:
- it is technically somewhat difficult to install, and rather difficult to maintain;
- the error statistics for the 3d-var B matrix should be adapted to the blended first-guess. This is not straightforward, and it implicitly produces correlations between the first guess and the observations when the latter have already entered the global-model assimilation (which might be avoided in research work, but probably not in operations).
An alternative method is to introduce the global-model analysis information directly into the variational problem, as a new source of information. This approach amounts to adding a new term to the 3d-var cost function, weighted by the global-model analysis-error covariance matrix. The development of this term is underway in Toulouse (in the framework of a non-ALATNET PhD).
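The extra cost-function term just described can be written, in the usual notation, as J(x) = (x-xb)'B⁻¹(x-xb) + (y-Hx)'R⁻¹(y-Hx) + (x-xg)'Ag⁻¹(x-xg), where xg is the global-model analysis and Ag its error covariance. A minimal sketch on a tiny state, solving the quadratic problem directly (the real system minimizes iteratively and never inverts these matrices); all matrices below are toy assumptions.

```python
import numpy as np

def jk_3dvar(xb, B, y, H, R, xg, Ag):
    """Toy 3d-var with an extra term pulling the analysis toward
    the global-model analysis xg, weighted by the (assumed)
    global analysis-error covariance Ag:
        J(x) = (x-xb)' B^-1 (x-xb) + (y-Hx)' R^-1 (y-Hx)
             + (x-xg)' Ag^-1 (x-xg)
    J is quadratic, so its minimum solves a linear system."""
    Bi, Ri, Ai = map(np.linalg.inv, (B, R, Ag))
    lhs = Bi + H.T @ Ri @ H + Ai
    rhs = Bi @ xb + H.T @ Ri @ y + Ai @ xg
    return np.linalg.solve(lhs, rhs)

# One state variable, one direct observation, unit covariances.
xb, xg = np.array([0.0]), np.array([2.0])
B = R = Ag = np.eye(1)
H = np.eye(1)
y = np.array([1.0])
xa = jk_3dvar(xb, B, y, H, R, xg, Ag)
print(xa)  # [1.] : equal weights average xb, y and xg
```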
In parallel, an extensive evaluation of a "simple" 3d-var assimilation system was performed in Budapest (an ALATNET PhD study). The results indicate that the forecasts started from the assimilation system can be substantially different from those started from dynamical adaptation (direct coupling of ALADIN with ARPEGE, where the previous ALADIN forecast is fully forgotten). In general, the scores are so far either neutral or slightly unfavourable to the assimilation, but specific case studies concentrating on mesoscale precipitation show that either the two systems are close or the 3d-var assimilation is more realistic. Several coupling strategies in the assimilation cycle were also tested, and no significant impact was found when changing the coupling data (ALADIN/LACE or ARPEGE forecasts, coupling with the analyses, etc.).
Thus, for the time being, each centre works out its own cycling solution, also in accordance with its specific local needs.
Variational tools:
We refer to ALATNET topic 12 for Cornel Soci's work on sensitivity studies. Otherwise, the variational-tools section mostly encompasses work on the maintenance and technical evolution of the tangent-linear (TL) and adjoint (AD) models of ALADIN. This maintenance could be performed successfully for the hydrostatic 10 km models. However, we have basically abandoned the maintenance of the nonhydrostatic models, as updating them with respect to the many innovations introduced in the full nonlinear NH model (new variables, predictor/corrector approach; see ALATNET Item 1 above) is totally out of range in terms of manpower (if not also in terms of coordination and algorithmic complexity).
Scientifically valuable tools such as gradient computations and singular vectors have been kept alive, but without any noticeable evolution. The good maintenance of the hydrostatic TL/AD models has allowed several case-study applications. Concerning the 3d-var analysis, the system is now safely installed in four ALADIN centres (Toulouse, Prague, Budapest, Casablanca), and is ready for installation in any other NMS.
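Maintaining TL/AD model pairs of the kind mentioned above is conventionally verified with the standard adjoint consistency check: for any x and y, the inner products <Lx, y> and <x, L*y> must agree to machine precision. A minimal sketch with a toy linear "model" (a fixed matrix and its transpose); this illustrates the generic test, not the ALADIN validation suite.

```python
import numpy as np

def adjoint_test(tl, ad, n, m, rng):
    """Standard TL/AD consistency check: for random x, y the
    inner products <L x, y> and <x, L* y> must agree to machine
    precision if the adjoint is coded correctly."""
    x = rng.normal(size=n)
    y = rng.normal(size=m)
    return float(np.dot(tl(x), y)), float(np.dot(x, ad(y)))

# Toy linear "model": a fixed matrix, with its transpose as adjoint.
rng = np.random.default_rng(2)
L = rng.normal(size=(3, 5))
a, b = adjoint_test(lambda x: L @ x, lambda y: L.T @ y, 5, 3, rng)
print(np.isclose(a, b))  # True
```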
Three actions have to be stressed, even if the initial objectives, which were far too ambitious, were not achieved:
Evaluation of simplified physics and sensitivity studies at 10 km resolution
C. Soci (ALATNET Romanian PhD defended on 19/4/2004) has shown that most difficulties presently encountered with the simplified physics of the ARPEGE global model become even more stringent when applied in the 10 km ALADIN model. He proposed some retuning of control parameters where possible, but this probably did not always cure the intrinsic problems present in the parametrization schemes.
Within the same effort, sensitivity to the initial conditions, with an emphasis on mesoscale precipitation systems, has shown mixed results (Soci et al., 2003):
- for some cases, no particular sensitivity was obtained, presumably because lateral boundary forcing and model errors play a leading role in the forecast;
- for one interesting case, the initial conditions could be "corrected" using the information from an adjoint wet-model solution. However, it is not clear whether this result means that a hypothetical 4d-var assimilation at the mesoscale would have been beneficial, or whether it rather points towards the inherent difficulty of predicting a nonlinear system (since the case was a mixture of stratiform, large-scale, and convective activity).
First steps towards 3d-FGAT
3d-FGAT (First Guess at Appropriate Time) is an intermediate step between 3d-var and 4d-var, where the distance between forecast and observations is computed at the observation time rather than at the analysis time, i.e. at the middle of the time-window. The a priori main innovation, 4d-screening (computing the distance to observations), was coded and validated. However, significant cost problems were discovered afterwards, requiring a major update of the data flow to make 3d-FGAT affordable. Recoding is scheduled for mid-2004, with the help of C. Soci.
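The difference between 3d-var and 3d-FGAT departures can be shown on a toy 1d trajectory: FGAT interpolates the background trajectory to each observation time before differencing, instead of comparing every observation to the mid-window state. The linear toy trajectory and observation values below are illustrative.

```python
import numpy as np

def fgat_innovations(traj_times, traj, obs_times, obs_values):
    """3d-FGAT departure sketch: interpolate the background
    trajectory to each observation time before differencing,
    rather than comparing everything to the mid-window state
    as in plain 3d-var."""
    bg_at_obs = np.interp(obs_times, traj_times, traj)
    return obs_values - bg_at_obs

# Linearly growing toy background over a 6 h window.
t = np.array([0.0, 3.0, 6.0])
traj = np.array([10.0, 12.0, 14.0])       # background trajectory
obs_t = np.array([1.0, 5.0])
obs_v = np.array([11.0, 13.0])
d_fgat = fgat_innovations(t, traj, obs_t, obs_v)
d_3dvar = obs_v - traj[1]                 # all vs mid-window state
print(d_fgat)    # ~[ 0.33 -0.33]
print(d_3dvar)   # [-1.  1.]
```

The FGAT departures are small because the observations are close to the trajectory at their own times; the plain 3d-var departures are inflated by the time mismatch.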
Simplified 2d-var assimilation of soil moisture
The time dimension of data assimilation was fully considered in the ALATNET PhD work of G.P. Balsamo, even if the method is closer to a dynamical optimal interpolation than to a variational one. His work was really successful and is being continued in Toulouse.
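Optimal-interpolation-style soil-moisture analyses of this general family correct the soil water content using screen-level analysis increments. A minimal sketch under that generic assumption; the function name, coefficients and their values are purely hypothetical and not taken from Balsamo's scheme.

```python
def soil_moisture_update(wg_b, t2m_inc, rh2m_inc, alpha_t, alpha_rh):
    """Generic OI-type soil-moisture correction sketch: the analysed
    soil water content is the background plus screen-level
    temperature and relative-humidity analysis increments weighted
    by coefficients that, in practice, depend on time of day,
    cloudiness and vegetation (hypothetical values here)."""
    return wg_b + alpha_t * t2m_inc + alpha_rh * rh2m_inc

# A cold, moist screen-level increment suggests the soil was too
# dry: add water (hypothetical coefficients).
wg_a = soil_moisture_update(wg_b=0.20, t2m_inc=-1.0, rh2m_inc=0.05,
                            alpha_t=-0.01, alpha_rh=0.1)
print(round(wg_a, 3))  # 0.215
```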