ALATNET developments during the second half of 2001 in the ALATNET centres

1. In Toulouse

The work of the four ALATNET PhD students in Toulouse, Gianpaolo Balsamo, Margarida Belo Pereira, Cornel Soci and Malgorzata Szczech, is described in separate reports, as is that of the other PhD students. The summary hereafter corresponds to the joint efforts of the other visitors and the permanent staff.

1. Theoretical aspects of non-hydrostatism (NH)

a) Semi-Lagrangian advection (Pierre Bénard, Jan Masek, Alix Prudhomme)

Previous analyses of the stability of the semi-implicit scheme for 3-time-level semi-Lagrangian advection proved that the initial choice for the pair of additional NH variables (representing the departure from hydrostatic pressure and the vertical divergence) did not allow stability for realistic flows. A new pair, (P, or q = ln(1+P), and d3), was proposed instead. However, further studies showed that stability is no longer ensured in the presence of orography, so that once again the semi-implicit scheme must be iterated for part of the nonlinear terms. This partial iteration must be avoided if these variables are to be combined with a more general predictor-corrector approach and a 2-time-level scheme.

A linear stability analysis taking orography into account led to the proposal of two new choices for the pseudo-vertical-divergence variable, d4 and d5. They are identical analytically, differing only through the time-discretisation of the equations. One may expect d4 to be more precise than d5, but conversely more problematic for the further introduction of diabatism. These two new variables were coded in the 2d (vertical plane) version of ALADIN, and the corresponding stability of NH dynamics was systematically studied. Of course the validation, starting from simple academic experiments and examining a wide range of conditions, is only at its beginning. Some noise was observed with d5 near the upper boundary (an intrinsic problem or a bug?), and some side problems were noticed.

Besides, Alix Prudhomme tried to reproduce well-known academic experiments (Héreil and Laprise, 1996; Robert, 1993) using the vertical plane model, in hydrostatic and non-hydrostatic dynamics. The sensitivity to several parameters was tested for the Héreil-Laprise experiments: background, decentering, filtering of orography, coupling zone, sponge, etc. Attempts to simulate Robert's bubble were less successful, since the upper sponge layer introduces distortion.

b) Stability of Eulerian advection

Alena Trojakova investigated in detail the stability of Eulerian NH dynamics. She designed a quasi-academic framework, called ALPIA, with 5 embedded models of increasing resolution (from 10 km to 625 m horizontally, from 34 to 128 levels vertically) covering the Western Alps, and a very simplified initial flow. For each resolution, she tried to determine the maximum time-step ensuring stability, and the corresponding range of CFL numbers reached along the forecast. These CFL values decrease regularly as the resolution increases. At each step, the flows corresponding to the reference and maximum time-steps were also compared. Problems appeared once a mesh-size of 2.5 km was reached, requiring a parameterization of the tropopause (to avoid too strong vertical gradients) and a change in NH variables (the initial pair was used for the first experiments).
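The relation between the maximum stable time-step and the advective CFL number can be sketched as follows. This is a minimal illustration of the diagnostic, not the ALPIA computation itself: the peak wind and the time-step values are invented for the example.

```python
# Sketch: advective CFL number u*dt/dx for an ALPIA-like resolution ladder.
# The wind speed and the "maximum stable" time steps are illustrative only;
# the pattern of the time step shrinking faster than the mesh reproduces the
# observed decrease of the CFL number with increasing resolution.
def cfl(u, dt, dx):
    """Advective Courant number for wind u (m/s), time step dt (s), mesh dx (m)."""
    return u * dt / dx

u_max = 40.0  # assumed peak wind (m/s)
for dx, dt in [(10_000, 400.0), (5_000, 180.0), (2_500, 80.0), (1_250, 35.0), (625, 15.0)]:
    print(f"dx={dx:6d} m  dt={dt:5.0f} s  CFL={cfl(u_max, dt, dx):.2f}")
```

With these (assumed) numbers the CFL values fall from 1.6 at 10 km to about 1.0 at 625 m, mirroring the regular decrease reported above.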

c) Upper boundary condition

Martin Janousek resumed the work on the radiative upper boundary condition, starting from a recent proposal from Purser and Kar.

4. Removal of the thin layer hypothesis

The case of semi-Lagrangian advection was addressed.

6. Specific coupling problems

Jean-Marc Audoin resumed the work of Tamas Szabo on the coupling of tendencies for surface pressure (so as to limit problems related to differences in orography between the coupling and coupled models). He first had to introduce the modifications into the latest library. He then focused on validation, in the framework of real 3d case studies (while the initial tests were mainly 2d ones). Some problems remain unsolved.

8. Adaptation of physics to higher resolution (Improved representation of the boundary layer)

Jean-Marcel Piriou designed a new interactive formulation of mixing lengths depending on vertical stability. Tests were performed with the 1d and 3d models. He also analysed the problem of too frequent triggering of shallow convection, in the framework of the EUROCS Stratocumulus Case experiment.

Other refinements were intensively tested and introduced in the operational suite within the "CYCORA-ter" package. They are described in the previous Newsletter (A summary of the latest changes in the parameterization of turbulent fluxes and PBL processes, by Jean-François Geleyn).

9. Design of new physical parameterisations (9d: Improved representation of land surface, including the impact of vegetation and snow)

The new snow description was refined and intensively validated by Eric Bazile, while Adam Dziedzic investigated the skill of a more sophisticated one. Their work is described in a dedicated paper.

Olivier Latinne started a comparative study, in 3d-variational assimilation mode for ARPEGE, to assess the impact on the forecast of using a new very-high-resolution database in the definition of the physiographic fields (vegetation and soil properties) of ALADIN and ARPEGE.

11. 3D-Var analysis and variational applications

a) Definition of new background error statistics

Loïk Berre computed the spectra of differences between ARPEGE and ALADIN forecasts valid at the same dates and ranges, in order to evaluate the contribution of small scales and to validate the so-called "lagged-NMC" method used to compute background-error statistics for ALADIN.
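The idea behind this diagnostic can be sketched in one dimension: take two fields valid at the same time and look at the variance spectrum of their difference to see which scales contribute. The fields, grid and normalisation below are synthetic assumptions, not the ARPEGE/ALADIN data.

```python
import numpy as np

# Sketch of a difference spectrum: two forecasts share the large scales, and
# one of them carries extra small-scale structure; the spectrum of their
# difference isolates the scales where they disagree.
def difference_spectrum(f1, f2):
    """1-D variance spectrum of f1 - f2 (a periodic domain is assumed)."""
    d = f1 - f2
    coeffs = np.fft.rfft(d) / d.size
    return np.abs(coeffs) ** 2

n = 256
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
large = np.sin(3 * x)            # large-scale signal common to both fields
small = 0.1 * np.sin(40 * x)     # small scales present in one field only
spec = difference_spectrum(large + small, large)
print("dominant wavenumber of the difference:", int(np.argmax(spec)))
```

The shared wavenumber-3 signal cancels in the difference, so the spectrum peaks at the wavenumber where the two fields genuinely differ.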

He resumed the work of Mohammed Raouindi on the geographical variability of such statistics (impact of latitude first) and showed that it is possible to introduce it in the formulation of Jb using a simple block-diagonal matrix. This work is now being extended to longitudinal variability and a new formulation of Jb, in cooperation with Simona Stefanescu (Ro) and Alex Deckmyn (Be).

b) Cycling: investigation of potential combinations with blending and initialization

Adam Dziedzic, with some help from Claude Fischer, prepared and tested a "Blendvar" assimilation suite for ALADIN-France. Here the DFI-blended initial state is used as the first guess for the 3d-var analysis, and an additional, external, digital filter initialisation (DFI) is applied afterwards. He ran it over 3 days (28-30/12/2000) with a cycling period of 6 h. 48 h forecasts were also run starting from the 00 UTC analyses. This allowed a validation of the latest version of 3d-var and a study of the impact of initialization: with different stop-band edge periods (3 h as in operational dynamical adaptation mode, or 1.5 h to reduce damping), and when applied to the analysed fields, to the increments of the 3d-var analysis, or to the combined increments of blending and 3d-var. Investigations (looking at initial and forecast fields, and at the time-evolution of some variables at 3 gridpoints) showed very small differences between incremental initialization and no initialization. Only full DFI filters out noise during the first hours of the forecast (with little impact of the stop-band edge). Of course the differences decrease with the forecast range, the largest ones being obtained at 3 h.
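A non-recursive DFI of the kind applied here amounts to a low-pass digital filter over a short time series of model states centred on the initial time. The sketch below uses Lanczos-windowed sinc weights; the window span, time step and stop-band edge period are illustrative stand-ins, not the operational settings.

```python
import math

# Minimal sketch of digital filter initialisation (DFI): windowed-sinc weights
# for a low-pass filter with stop-band edge period tau_s. A constant state
# passes through unchanged, while oscillations faster than tau_s are damped.
def dfi_weights(n_span, dt, tau_s):
    """Lanczos-windowed sinc weights over 2*n_span+1 steps of length dt."""
    theta_c = 2.0 * math.pi * dt / tau_s  # cutoff frequency times dt
    w = []
    for k in range(-n_span, n_span + 1):
        h = theta_c / math.pi if k == 0 else math.sin(theta_c * k) / (math.pi * k)
        sigma = 1.0 if k == 0 else (math.sin(math.pi * k / (n_span + 1))
                                    / (math.pi * k / (n_span + 1)))
        w.append(h * sigma)
    s = sum(w)
    return [wk / s for wk in w]  # normalise so a constant field is preserved

w = dfi_weights(n_span=9, dt=360.0, tau_s=3 * 3600.0)  # 6 min steps, 3 h edge
states_const = [5.0] * len(w)                          # a steady balanced state
filtered = sum(wi * si for wi, si in zip(w, states_const))
print(round(filtered, 6))
```

Applying the same weights to the analysis increments only, instead of the full fields, corresponds to the incremental initialization variant compared above.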

c) Development of variational type applications

Claude Fischer wrote documentation on the use of variational tools for the a-posteriori validation and retuning of data assimilation systems. A first diagnostic tool for the a-posteriori validation of 3d- and 4d-var was designed.

d) Management of observations in 3d-var / Screening

Sandor Kertesz, with Claude Fischer, updated screening (i.e. the quality control and geographical selection of observations for 3d-var) for ALADIN, and built its interface with ODB ("Observation Data Base", the new tool for observation management just upstream of the model). He designed a procedure to build specific "LAM_ODB" databases for ALADIN, containing only the observations within the ALADIN domain, and wrote the corresponding documentation.

He then focused on the problem of thinning (geographical selection) of AIREP (aircraft) and SATOB (satellite) data over the ALADIN-France domain. AIREPs provide information on wind and temperature, with a large dispersion in space (mainly around and between airports) and time (with a peak between 12 and 18 UTC). With the operational thinning distance in ARPEGE, 170 km, most data are rejected. Experiments with decreasing horizontal thinning distances, down to 10 km, were performed. The main improvement is obtained between 25 and 10 km. An impact on the 3d-var analysis increments is noticed in the upper troposphere and the stratosphere for wind, and in the boundary layer and the stratosphere for temperature. This study highlighted a problem induced by the handling of AIREPs in screening, plane by plane independently: observations valid at the same point but not at the same time, hence quite different, may all be kept as input for the 3d-var analysis. This fosters the march towards more continuous data assimilation systems (4d-var, 3d-var at a higher frequency, or any intermediate solution). Problems are different for SATOBs. Thinning must be performed in 2 steps, but the observations are far fewer and used only over sea, so the proposed reduction of the thinning distance is less drastic, from 250 to 150 km.
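A distance-based thinning pass of this kind can be sketched as a greedy selection: keep an observation only if it lies farther than the thinning distance from every observation already kept. The coordinates and the great-circle formula below are generic illustrations; the real screening code handles far more (quality flags, time windows, per-aircraft tracks).

```python
import math

# Sketch of greedy horizontal thinning with a configurable distance, in the
# spirit of the AIREP selection described above. All positions are invented.
def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a spherical Earth (km)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    c = math.sin(p1) * math.sin(p2) + math.cos(p1) * math.cos(p2) * math.cos(dl)
    return r * math.acos(min(1.0, c))

def thin(obs, thin_dist_km):
    kept = []
    for lat, lon in obs:
        if all(great_circle_km(lat, lon, klat, klon) > thin_dist_km
               for klat, klon in kept):
            kept.append((lat, lon))
    return kept

# A dense cluster near an airport, one report a bit outside it, one remote:
obs = [(48.7, 2.3), (48.71, 2.31), (48.8, 2.5), (45.0, 5.0)]
print(len(thin(obs, 170.0)), len(thin(obs, 10.0)))
```

With a 170 km distance only the cluster representative and the remote report survive; at 10 km the nearby report about 18 km away is kept as well, illustrating how shrinking the distance admits more of the dense aircraft data.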

2. In Bruxelles

The work of the two ALATNET Post-Doc students in Bruxelles, Martin Gera and Ilian Gospodinov, is described in separate reports. The summary hereafter corresponds to the efforts of the permanent staff.

5. Coupling and high resolution modes

Piet Termonia pursued his search for alternatives to improve the temporal interpolation of the coupling data for ALADIN forecasts. The idea of introducing an accelerative correction in the linear interpolation scheme was generalized to a perturbative approach to the interpolation. The upshot of this work is that it turns out to be possible to monitor the quality of the linear interpolation, by comparing the higher-order corrections of a perturbative expansion to the lower-order contributions. This could serve as a decision tool for sending extra coupling files to the ALADIN partners solely for those forecast intervals where the linear interpolation turns out to be deficient at the operational coupling frequency.
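The monitoring idea can be sketched with three successive coupling-time values: the linear interpolation between the outer times is checked against the known middle value, and the size of that second-order (curvature) correction relative to the linear term flags intervals that would need extra coupling files. The field values and the threshold are illustrative assumptions, not the actual scheme.

```python
# Sketch: compare the second-order correction of a perturbative expansion of
# the temporal interpolation to the linear (first-order) contribution.
def linear_midpoint(f0, f2):
    return 0.5 * (f0 + f2)

def needs_extra_file(f0, f1, f2, rel_tol=0.1):
    """Flag the interval if the curvature correction at the mid-time exceeds
    rel_tol times the linear variation over the interval."""
    correction = f1 - linear_midpoint(f0, f2)   # second-order term
    scale = max(abs(f2 - f0), 1e-12)            # linear-term magnitude
    return abs(correction) > rel_tol * scale

# Surface pressure (hPa) at three coupling times, invented for illustration:
print(needs_extra_file(1000.0, 1001.0, 1002.0))  # smooth evolution
print(needs_extra_file(1000.0, 990.0, 1002.0))   # rapid dip missed by linear interp
```

A smooth evolution yields a negligible correction, while a rapid excursion between coupling times produces a large one, exactly the situation where denser coupling data would pay off.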

6. Specific coupling problems (6a: "Blending")

Alex Deckmyn performed further tests on DFI blending in the ALADIN-Belgium domain. In this nested model (coupled to ALADIN-France) he compared the results obtained using either the ALADIN-France files or the ARPEGE long cut-off analysis for initialisation.

As in other blending results, the change in objective scores is small. The improvements due to blending itself appear slightly larger than those due to the use of the long cut-off analysis.

Because it is difficult to assess the differences with objective scores, we are now looking at wavelet-based methods to study the local-scale properties of the blended and non-blended fields. By comparing the information at small scales, we find that the fine structure of a blended initialisation is more comparable to that of the 24 h forecast valid at the same time.
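The small-scale comparison can be sketched with a single Haar decomposition level, a minimal stand-in for the wavelet methods mentioned above; the fields and the energy measure are synthetic assumptions.

```python
# One Haar level splits a 1-D field into a smooth (approximation) part and a
# detail part; the detail energy summarises the small-scale structure, which
# can then be compared between blended and non-blended initial states.
def haar_step(signal):
    """One Haar decomposition level: returns (approximation, detail)."""
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2.0 for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2.0 for i in range(len(signal) // 2)]
    return approx, detail

def small_scale_energy(signal):
    _, detail = haar_step(signal)
    return sum(d * d for d in detail)

smooth = [float(i) for i in range(16)]                        # a smooth ramp
noisy = [float(i) + (0.5 if i % 2 else -0.5) for i in range(16)]  # ramp + 2-gridpoint noise
print(small_scale_energy(smooth), small_scale_energy(noisy))
```

The added two-gridpoint oscillation shows up only in the detail energy, which is the kind of local-scale signal the comparison above looks for.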

8. Adaptation of physics to higher resolution (8a: Parameterisation of the small-scale features of convection) & 9. Design of new physical parameterisations (Use of liquid water and ice as prognostic variables, implementation of a new microphysics parameterisation)

Luc Gérard worked on the integration of the micro-physics and the convection parameterizations.

3. In Prague

1. Theoretical aspects of non-hydrostatism

1a) Top and bottom boundary conditions (C. Smith)

The research work continued in order to understand why the semi-Lagrangian advection treatment creates a spurious standing wave above the top of the mountain. The term responsible for this pattern was found: it comes from applying the semi-Lagrangian vertical advection to the vertical derivative of a quantity (the vertical divergence) instead of to the quantity itself (the vertical velocity). It is, of course, linked to the formulation of the bottom boundary condition, which is defined for the vertical velocity; the information on the w field is lost when switching to the vertical divergence, and there is no easy way to reconstruct (in a unique way) the vertical divergence field (and the w field) at the origin point of the trajectory. Several ideas on how to cope with this difficulty are currently being explored.

1b) Predictor-Corrector scheme (J. Vivoda)

The Predictor-Corrector (PC) scheme was successfully implemented in library cycle AL12/CY22T1 and scientifically validated: the results correspond well to those of the linear stability analysis. In the last Newsletter it was wrongly reported that the linear stability analysis showed one iteration of the PC scheme to be sufficient to stabilize the 2TL NH scheme. A bug was found in the analysis, and the result now says that 3 iterations are necessary to obtain safe stability (even in the absence of orography). The analysis was done for nearly all the options for the vertical divergence variable (denoted d0, d1, d2 and d3), and it showed a specific outcome for the 2TL scheme compared to the 3TL scheme (it was done for the standard vertical temperature profile and for an isothermal profile with a flat surface). For example, after the first iteration the 2TL scheme remains highly unstable, no matter which vertical divergence variable is chosen. In the 3TL case the benefit of a proper choice of vertical divergence variable is strong already after the first iteration; in the 2TL case the benefit of the change of variables becomes noticeable only after the second iteration and gets stronger after the third one. In the model, only the results for the original d0 and P0 variables were validated against the linear analysis. The validation for the other combinations will be done after phasing the development into cycle AL15. The PC scheme also offers the possibility to apply decentering to the predictor or corrector step. The test of the impact of decentering on stability was inspired by the work of Semazzi. It was indeed confirmed that by heavily increasing the decentering coefficient XIDT (up to 0.5) the 2TL scheme becomes more stable. On the other hand, such a choice has a strong damping effect.
The idea of combining the PC scheme with decentering of the predictor step, regaining precision through the corrector step, was also tested (this could lead to a stable scheme with only one iteration); however, the non-decentered corrector step brings the instability back.
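The stabilizing-but-damping effect of decentering can be illustrated on the simplest oscillation equation dψ/dt = iωψ, with a two-time-level step in which the implicit side gets weight 0.5 + xidt and the explicit side 0.5 - xidt. This toy amplification factor is an assumption-laden stand-in for the full NH stability analysis, with xidt playing the role of the XIDT coefficient mentioned above.

```python
# Amplification factor of a decentred two-time-level trapezoidal step for
# dpsi/dt = i*omega*psi. xidt = 0 is the centred (neutral) scheme.
def amplification(omega_dt, xidt):
    num = 1 + 1j * omega_dt * (0.5 - xidt)
    den = 1 - 1j * omega_dt * (0.5 + xidt)
    return abs(num / den)

print(round(amplification(2.0, 0.0), 3))  # centred: |A| = 1, neutral
print(round(amplification(2.0, 0.5), 3))  # heavy decentering: |A| < 1, damped
```

With xidt = 0.5 the oscillation loses more than half its amplitude per step, which is the "strong damping effect" that makes heavy decentering unattractive despite its stability.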

2. High resolution runs (D. Cemas)

A domain at 1 km horizontal resolution, including very sharp orography (the Julian Alps), was prepared. A first trial to run the NH version of ALADIN was made. As expected, problems with stability were encountered (the test was made with the AL12 version, not including the new options for prognostic variables). Finally, stability was reached by lowering the reference surface pressure of the semi-implicit background (SIPR) to about 800 hPa. With this value it was possible to use an NWP-type time-step. A plausible explanation is that in the presence of relatively high mountains, which lower the average surface pressure, it becomes necessary to adjust the semi-implicit background too. The tested situation was one of the MAP wet-case IOPs, so attention focused on the simulated precipitation. Some noise was observed in this field, which was more or less expected. A trial was made with the linear grid; however, the results were not very sensitive to it. Unfortunately, there was not enough time for further tuning of the orographic forcing. In addition, this type of simulation can be reproached for many inconsistent settings: inappropriate physics, an inadequate ratio between the vertical and horizontal resolutions, etc. However, as a test bed it remains an interesting experiment.

3. Data assimilation related coupling issues

3.a) The incremental DFI & blending technique operational in ALADIN/LACE (F. Vana, M. Siroka)

See the ALADIN Newsletter for details.

3.b) The incremental DFI & blending technique on linear grid (S. Ivatek-Sahdan, R. Brozkova, F. Vana)

The linear grid option was tested in ALADIN/LACE together with the blending technique. The aim was to find the optimal value of the "low resolution" truncation used for scale separation between the ARPEGE analysis and the ALADIN forecast. When going to the linear grid, the nominal "high resolution" of ALADIN is increased by a factor of 1.5. Since the "low resolution" is determined by an empirically found ratio (a weighted geometric average) of the ARPEGE analysis and ALADIN resolutions, applying it directly would lead to some increase of the "low resolution"; on the other hand, the linear grid option may well require a different proportion of the weights. Therefore a new tuning of the "low resolution" was made in the interval bounded by two extreme values: the original truncation and that truncation multiplied by 1.5. A new optimal value for the "low resolution" truncation was found; the tuning was performed on the case of a strong cold front passing over Germany and Poland. A parallel suite was even run. However, there was no signal in the scores, which remained perfectly neutral with respect to the operational suite. This means that the details brought in by the increase in spectral resolution are so fine in scale (sharpening of structures, etc.) that verification against the conventional SYNOP and TEMP networks (relatively coarse) cannot detect such differences.
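The scale-separation truncation as a weighted geometric average can be written down explicitly. The truncation values and the weight below are invented for illustration; the operational numbers are the empirically tuned ones described above.

```python
# Sketch: "low resolution" truncation as a weighted geometric average of the
# coupling-model and ALADIN truncations. All numbers here are assumptions.
def low_res_truncation(n_coupling, n_aladin, weight):
    """Weighted geometric average: n_coupling**weight * n_aladin**(1 - weight)."""
    return round(n_coupling ** weight * n_aladin ** (1 - weight))

n_low = low_res_truncation(63, 108, 0.5)      # quadratic-grid ALADIN truncation
n_low_lin = low_res_truncation(63, 162, 0.5)  # linear grid: nominal truncation x 1.5
print(n_low, n_low_lin)
```

Applying the same weight after the 1.5-fold increase of the ALADIN truncation raises the "low resolution" too, which is exactly why the weight had to be retuned for the linear grid.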

3.c) 3d-var strategy in ALADIN (M. Siroka, G. Boloni)

The tests of the 3d-var algorithms known as "STANDARD", "BLENDVAR" and "VARBLEND" continued in two ways: on a case study of a strong cold front and on the longer period of May 2001. One of the new ingredients was the use of the IDFI technique. The results of the case study confirmed in a general way all the previous results of the tests made in 2000, but there was a very interesting novelty: the appearance of a gridpoint storm under some conditions.

- gridpoint storm story

The studied situation was characterized by strong convection, with many active cells along the frontal boundary (the case was from the end of August 2001). With the STANDARD algorithm, the subsequent forecast created a gridpoint storm. This was not the case for the BLENDVAR or VARBLEND algorithms. The question was whether this was due to the use of the lagged background term (in BLENDVAR and VARBLEND) or to the filtering properties of the blending algorithm. Indeed, both ingredients played a role. When BLENDVAR was combined with the standard background term, the gridpoint storm appeared. On the other hand, the VARBLEND & standard background term experiment did not create the gridpoint storm. The conclusion is that the standard background term brings in strong and poorly balanced increments of humidity and temperature, responsible for the later development of the gridpoint storm. In the case of VARBLEND these increments get controlled by the blending step following the analysis. Since blending corrects the large scales and keeps the small-scale 3d-var result untouched, we can conclude that the standard background term brings the imbalance mostly into the larger part of the spectrum. This means that the STANDARD algorithm is in general not suitable for the coupled mesoscale analysis problem we try to solve in ALADIN. Further, it turned out that not only the total variance but also the shape of the background error structure functions plays a role in the balance of the increments (creation or not of the gridpoint storm). The level of noise was measured by the oscillations of surface pressure in the respective forecasts, but this gives a picture mostly of the gravity-wave noise; whether the model creates the gridpoint storm or not depends more on the temperature-humidity balance. At the moment little is known about this type of balance, which is in fact part of the ALADIN Jb term (derived by linear regression from the forecast errors).

- comparison with pure blending

It would be very interesting to compare the STANDARD, BLENDVAR and VARBLEND results with those of a simple blending, in order to know whether the currently used observations and assimilation method bring some innovation at the small scales or whether we are at the limits of predictability. Such a test would be a very useful benchmark, allowing comparison of future ingredients of the assimilation and forecast parts of the model (such as more observations, improved Jb and Jo terms, an improved model, etc.). Unfortunately, such a comparison has not been done yet.

- lagged statistics for the nested domain

The lagged statistics were computed for the ALADIN/HU model (Boloni and Horanyi, Proceedings of the 10th ALADIN Workshop "on scientific developments", Toulouse, 7-8 June 2001, pp 113-119). It was found that the total variances were much smaller than in the coupling model (ALADIN/LACE), and consequently the analysis increments are also very small. Therefore a parallel verification of both models was done in Prague. It indeed showed that the scores of the two models are nearly identical: the nested model is completely driven by the coupling model in their current configuration.

3.d) Evaluation of the impact of balloon drift in sounding measurements (M. Benko, M. Siroka, R. Brozkova)

A set of sounding measurements with the recalculated horizontal position of the balloon was monitored in order to evaluate the impact of the balloon drift. The monitoring results are compared to the classically treated sounding data (where the measurement is considered to keep the same horizontal position throughout the ascent of the balloon). Only measured parameters were monitored against the ALADIN forecast (at all ranges, at 6-hour intervals): pressure, temperature, wind and humidity. The correction of the balloon position was recalculated from the measured wind, rather than taken directly from the very detailed measurement report (which is not normally available). The statistics made over one month for the sounding stations in the LACE verification domain showed a slight worsening of the "scores" in all parameters, increasing with height (as the drift does too). Since this result is the opposite of what one could expect, it would be interesting to understand it. Either the recalculation of the position is not good, or this is caused by the fact that only the space correction is taken into account while the time correction is neglected. In any case it seems that there is no straightforward benefit from taking the balloon drift into account, contrary to intuition, and that the prospect of using this additional information in data assimilation is almost nil.
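Recalculating the balloon position from the measured wind amounts to integrating the horizontal wind over the ascent. The sketch below assumes a constant ascent rate and a layer-mean wind; the profile values and the ascent rate are invented for illustration.

```python
# Sketch: cumulative horizontal drift of a sounding balloon, integrated from
# the measured wind profile under an assumed constant ascent rate.
def drift_positions(levels, ascent_rate=5.0):
    """levels: list of (height_m, u_ms, v_ms), sorted by height.
    Returns cumulative (east, north) displacement in km at each level."""
    out = [(0.0, 0.0)]
    for (z0, u0, v0), (z1, u1, v1) in zip(levels, levels[1:]):
        dt = (z1 - z0) / ascent_rate              # time spent in the layer (s)
        u, v = 0.5 * (u0 + u1), 0.5 * (v0 + v1)   # layer-mean wind (m/s)
        e, n = out[-1]
        out.append((e + u * dt / 1000.0, n + v * dt / 1000.0))
    return out

# Illustrative profile: (height m, u m/s, v m/s)
profile = [(0.0, 5.0, 0.0), (5000.0, 15.0, 5.0), (10000.0, 30.0, 10.0)]
print(drift_positions(profile)[-1])
```

Even this crude integration gives drifts of tens of kilometres at upper levels, consistent with the drift (and its presumed impact) growing with height as noted above.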

3.e) Assimilation of screen level humidity data (M. Jurasek, M. Siroka)

The humidity measurements at 2 m above the ground are not used in the upper-air analysis. This is because these measurements used to influence the surrounding atmosphere to an unrealistic depth, especially in the vertical. The situation has been examined in ALADIN, where the use of screen-level measurements was activated in 3d-var. First, single-observation experiments were done with both the standard and lagged background terms. Indeed, the standard Jb again caused a large impact of the measurements; with the lagged background structure functions the situation was better. The experiments have shown a strong coupling between humidity and temperature data: there was an impact of several kelvins via the Jb coupling (again stronger in the standard Jb case). This point should be further investigated.

4. CFL versus Lipschitz criteria (A. Trojakova, R. Brozkova)

The study based on the ALPIA type of experiment continues. The tools for the coupling of academic experiments were developed and the work on nested domains started. The model was run with almost no physics, except the vertical diffusion and exchange of momentum and dry static energy. The classical NWP-type setup was used for the horizontal diffusion. The other modifications with respect to the reference library were the following: improved stabilization of the scheme by introducing a special type of corrector step, and increased dimensions within the DFI algorithm in order to perform the filtering over a longer window. The integration went very smoothly at horizontal resolutions of 10 km and 5 km, where relatively long time steps could be used. The first problems with stability were encountered at the resolution of 2.5 km; here it was necessary to lower the value of the semi-implicit background pressure. The same problem occurred at the horizontal resolution of 1.25 km. It should be mentioned that the vertical resolution was increased each time as well. Since the stability problems for the 1.25 km domain became quite severe, all the modifications were phased into cycle AL15, where the new options for the NH prognostic variables may be used. However, no alternative choice of the vertical divergence variable was successful, even when a tropopause was introduced to soften the temperature profile in the upper part of the domain. Finally, only the variable for the non-hydrostatic pressure departure was changed; together with a low value of the surface semi-implicit pressure it was possible to integrate the 1.25 km domain with a reasonable time-step length. Going further was not successful: the trials to integrate the 0.625 km domain always ended in instability. Hence the conclusion, regarding stability, is that it would be necessary to use the predictor-corrector scheme to obtain a robust scheme.
In any case, the integration becomes very expensive. Another development was made in order to diagnose the Lipschitz criterion, i.e. to detect a possible crossing of trajectories. For the examined cases, the semi-Lagrangian time-step length allowed by the Lipschitz criterion remains relatively long. One can also say that other problems accompanying the semi-Lagrangian solution appear already at time steps shorter than those allowed by the Lipschitz criterion.
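The two limits mentioned here can be sketched side by side for a 1-D wind field u(x): the advective CFL number max|u|·Δt/Δx, and the Lipschitz number max|∂u/∂x|·Δt, the latter guarding against trajectory crossing in the semi-Lagrangian scheme. The wind values, mesh and time step below are illustrative assumptions.

```python
# Sketch: CFL versus Lipschitz diagnostics for a 1-D wind field on a uniform mesh.
def cfl_number(u, dt, dx):
    """max|u| * dt / dx"""
    return max(abs(ui) for ui in u) * dt / dx

def lipschitz_number(u, dt, dx):
    """max|du/dx| * dt, the trajectory-crossing (Lipschitz) diagnostic."""
    dudx = [(u[i + 1] - u[i]) / dx for i in range(len(u) - 1)]
    return max(abs(g) for g in dudx) * dt

u = [10.0, 14.0, 20.0, 18.0, 12.0]  # m/s, an invented jet-like profile
dt, dx = 100.0, 2500.0              # s, m (a 2.5 km mesh)
print(cfl_number(u, dt, dx), lipschitz_number(u, dt, dx))
```

For this (assumed) flow the CFL number is already near one while the Lipschitz number is far below it, consistent with the finding that the Lipschitz limit on the time step remains comparatively generous.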

4. In Budapest

Most of the research and development topics were concentrated around the ALATNET scientific plan and are described briefly hereafter:

-- In the framework of the ALATNET project, Raluca Radu started her work on the coupling problem. Raluca received a general training on the coupling problem and started to code the spectral coupling into the ALADIN model (more details can be found in the reports of the ALATNET students).

-- Cornel Soci, also on an ALATNET grant, completed a 4-month stay in Budapest dealing with the mesoscale sensitivities of the ALADIN model. Some interesting synoptic cases were studied, and the possible improvement of precipitation forecasts was investigated through sensitivity studies taking into account physical parameterisation aspects as well. The main achievements of this work are summarised in a scientific paper about to be submitted (more details can be found in the reports of the ALATNET students).

-- The pre-doc student Steluta Alexandru started her work in Budapest at the beginning of November. Her topic is devoted to the study of the 3d-var scheme of ALADIN in the context of the double-nested technology of the ALADIN/HU model (her report can also be found in this Newsletter).

-- The investigations of the effects of soil texture on the forecasts of the ALADIN model were continued and completed. The results are summarised in a scientific paper under submission.

-- The computation of background error statistics and the study of their sensitivity to the integration lengths and forecast differences of the NMC method were continued, and the main conclusions of this work are summarised in a short report in this Newsletter.

-- The investigation of the possible inclusion of ATOVS radiances in the 3d-var data assimilation system was started in the autumn (a short summary can be found in this Newsletter).

5. In Ljubljana

The former Hydrometeorological Institute of Slovenia, the ALATNET centre in Slovenia, was merged into the Environmental Agency of the Republic of Slovenia (EARS) in the middle of 2001. EARS, as the successor of HMIS, became the governing institution for the ALATNET project in Slovenia. The transition between the two institutions was smooth as far as the ALATNET project is concerned; the only problem is that, due to the reorganization, the number of people working in numerical weather prediction, and correspondingly in the ALATNET project, is now significantly reduced. The work related to ALATNET now relies even more on the research stays of PhD students in Ljubljana.

The two major topics on which research was performed were "Case studies aspects of NH" and "Spectral coupling".

Case studies aspects of NH

Klaus Stadlbacher finished his first stay in Ljubljana at the end of August 2001. More information about this topic and his stay can be found in his report published in the last Newsletter.

Spectral coupling

The work on this subject was started by the new ALATNET student Raluca Radu. She began her study with a stay at the ALATNET centre at the Hungarian Meteorological Service in Budapest, where she gained some insight into the spectral coupling problem. The work continued in Ljubljana, where she started coding the spectral coupling in the ALADIN model during her stay in November and December. Thanks to the good preparation of the work in Budapest and a good working plan, the work started successfully. She is expected to have the basic version of the spectral coupling ready by the end of her first stay in Ljubljana (end of May), when some results will also be available. This will also be the time for the first evaluation of the spectral coupling method.



