ALADIN developments in Toulouse during the second half of 2001

The life of permanent staff and visitors in GMAP was strongly perturbed during these months by three successive bursts:

However we managed:

- to publish the Newsletters;

- to organize a mini-workshop (10-14/12/2001) to prepare the new medium-term research plan for ALADIN, with contributions from Alica Bajic (reporter for verification), Doina Banciu (reporter for physics), Radmila Brozkova (reporter for dynamics), Ryad El Khatib (reporter for maintenance), Claude Fischer (reporter for data assimilation), Dominique Giard (reporter for training), Jean-François Geleyn (reporter for operations), Thomas Haiden, Jure Jerman (reporter for applications), Jean Nicolau (reporter for predictability), Gabor Radnoti (reporter for coupling), Jozef Roskar, and other ARPEGE / ALADIN scientists in Toulouse at that time;

- to complete the phasing of the assimilation and 923 stuff, building an up-to-date cycle AL15, thanks to Claude Fischer (as usual), Françoise Taillefer, Ryad El Khatib, Jean-Marc Audoin, Dominique Giard, Patrick Saez, and Andrey Bogatchev, Alex Deckmyn, Sandor Kertesz, Valery Spiridonov (+ Neva Pristov from Ljubljana);

- to progress on scientific issues, as described below and in the ALATNET report.

Lora Gaytandjieva, Françoise Taillefer and François Bouyssel carried on the work on snow analysis, with some further refinements / simplifications and tests in assimilation mode in ARPEGE. Problems remain, mainly related to vertical interpolations, the lack of quality control and the low density of observations. Sorry, once again it didn't reach operations before winter!

Natalia Camara investigated with François Bouyssel two problems in surface analysis (i.e. the optimal-interpolation analyses of 2 m temperature, 2 m relative humidity and 10 m wind), in the framework of the ALADIN-France version of Diagpack. Reducing the characteristic lengths for observations by about one third makes it possible to use far more data without introducing additional noise, and brings the analysis closer to the observed fields. An attempt was also made to reduce noise in the analysed 2 m fields: the soil moisture of the first guess was arbitrarily fixed everywhere to a mean, and sensible, value (the average of field capacity and wilting point, depending only on clay fraction and soil depth). This led to smoother 2 m fields, especially for temperature, and to smaller analysis increments, even when only superficial moisture was changed. Moreover the analysed relative humidity is closer to observations.
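For illustration, here is a minimal sketch (Python) of such a "neutral" soil-moisture first guess, taken as the average of field capacity and wilting point. The pedotransfer relations and coefficients below are ISBA-like assumptions used for illustration only, not the operational formulas:

RHO_W = 1000.0  # density of liquid water [kg m-3]

def default_soil_moisture(clay_percent, soil_depth_m):
    """Neutral first-guess soil water content [kg m-2]: halfway between
    wilting point and field capacity, both functions of clay fraction only."""
    w_wilt = 37.1342e-3 * clay_percent ** 0.5     # wilting point [m3 m-3] (assumed relation)
    w_fc = 89.0467e-3 * clay_percent ** 0.3496    # field capacity [m3 m-3] (assumed relation)
    return 0.5 * (w_wilt + w_fc) * soil_depth_m * RHO_W

print(default_soil_moisture(30.0, 1.5))   # e.g. 30 % clay, 1.5 m deep soil column

Using the same value everywhere removes the spatial variability of the guess that was feeding noise into the analysed 2 m fields.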

Stjepan Ivatek-Sahdan also aimed at improving the analyses of 2 m fields, but this time in the framework of data assimilation, and addressing the statistical model. He performed a detailed study of forecast-error statistics, based on the computation of "observations minus first guess" correlations over a large range of situations (Lönnberg-Hollingsworth method). Results were examined for 4 domains (Europe, USA, Africa, Australia) over 2 months (December 2000 and June 2001), and for two 3-month periods (winter, summer) over Europe, to study the geographical and seasonal variability. This demonstrated that the old statistical model, deduced from that of the upper-air analysis in CANARI, required significant adjustments. The formulation of the correlation coefficient as a function of distance was changed:

ALADIN_Toulouse_Eq1.gif

and those of the standard deviations (σ) and characteristic lengths (a), which depend on the mapping factor m, were retuned:

ALADIN_Toulouse_Eq2.gif

with α changed from 0.02 to 0.05, (σ0T, σ0H) from (2.3 K, 17 %) to (1.7 K, 13 %) and (a0T, a0H) from (350 km, 300 km) to (105 km, 100 km). The experimental and parameterized formulations of the correlation coefficient are illustrated hereafter. The oscillations observed for Africa (and even more so for Australia) are due to the lack of observations over these domains.

Stjepan.gif

Coefficients of correlation for 2 m temperature: dependence on the distance between points, for different domains (June 2001) and for different correlation functions


There is a clear seasonal variation of the correlations, slightly weaker over Europe. Tests were performed in ARPEGE, first with single-observation experiments, then on real situations, to compare the old and new formulations. There is a significant impact, especially in the Southern Hemisphere. More experiments are now required to validate or refine the new statistical model.
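For readers less familiar with this kind of diagnostic, here is a minimal sketch (Python / NumPy) of the Lönnberg-Hollingsworth-type computation mentioned above: "observations minus first guess" (innovation) statistics are binned by station separation to show how the correlation decays with distance. Station pairing, quality control and the fit of the parameterized correlation function are left out, and all names and bin sizes are illustrative assumptions:

import numpy as np

def binned_innovation_correlation(innov, dist, bin_width_km=50.0, max_km=1000.0):
    """
    innov : (ncases, nstations) array of observation-minus-guess values
    dist  : (nstations, nstations) array of station separations [km]
    Returns bin centres and the mean innovation correlation per distance bin.
    """
    corr = np.corrcoef(innov.T)                   # station-to-station correlations
    edges = np.arange(0.0, max_km + bin_width_km, bin_width_km)
    iu = np.triu_indices_from(dist, k=1)          # each station pair counted once
    centres, values = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (dist[iu] >= lo) & (dist[iu] < hi)
        if sel.any():
            centres.append(0.5 * (lo + hi))
            values.append(corr[iu][sel].mean())
    return np.array(centres), np.array(values)

Extrapolating the binned correlations back to zero separation is what then allows the forecast-error part to be separated from the observation-error part, and a parameterized correlation function to be fitted.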

Françoise Taillefer improved the French pre-operational version of Diagpack (use of more surface observations, new visualization tools).

Rashyd Zaaboul studied ODB, and ported it to the new IBM computer in Casablanca. His report and that of Sandor Kertesz provide detailed and useful documentation on ODB for ALADIN (in English and in French): structure, tools, use, compilation, ... Considering the parallel training of Philippe Caille, the ALADIN project can now rely on several ODB experts.

Martin Janousek and Jean-Daniel Gril analysed together how to interface the new EGGX with the ALADIN & co libraries. We may hope it will be available in the next cycle.

Yong Wang came twice to Toulouse, in the framework of the bilateral AMADEUS project ("Improving the description of precipitations over mountainous areas"). The first stay was devoted to the design of a working plan for the analysis of precipitation, the second one to possible refinements in the definition of the model orography.

Khoudir Tounsi and Mohamed Hajjej were trained in the use of ALADIN by Jean-Daniel Gril, and performed some case studies of convective events with ALADIN-Tunisia. The vegetation and orography of the model were carefully checked, and a new post-processing grid was designed.

Besides, Jean-Daniel Gril started to make the main ALADIN tools fully portable, and to fight with the auxiliary library.

Siham Sbii, with some help from Jean-Marcel Piriou, developed a promising "model to satellite" application. Two additional modules were introduced in the ARPEGE physics to compute simulated cloudy and clear-sky brightness temperatures: first the computation of spectral radiances using the radiation scheme of J.J. Morcrette, second the channel selection and the computation of brightness temperatures (to be compared to observed satellite images). As a first step, 4 fields are available (and stored as instantaneous fluxes): cloudy / clear-sky for infrared / water vapour. The chosen radiation scheme is expensive but covers a wide range of wavelengths, which allows selection of several satellite channels. An alternative would be to run as many cheap radiation schemes as there are required channels. The second necessary step was the management of observations: decoding raw GRIB data to obtain the same output format (latitude, longitude, temperatures), then filtering out aberrant values, in order to be able to plot simulated and observed fields with the same colouring.
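To give an idea of the last step of such a chain, here is a minimal sketch (Python) of the conversion of a simulated channel radiance into an equivalent brightness temperature with the inverse Planck function. The channel central wavenumbers are purely illustrative assumptions (roughly Meteosat-like infrared and water-vapour channels), not the ones used in the application:

import math

C1 = 1.191042e-5   # first radiation constant [mW m-2 sr-1 (cm-1)-4]
C2 = 1.4387752     # second radiation constant [K cm]

def brightness_temperature(radiance, wavenumber):
    """Radiance in mW m-2 sr-1 (cm-1)-1 and wavenumber in cm-1 -> Tb in K."""
    return C2 * wavenumber / math.log(1.0 + C1 * wavenumber ** 3 / radiance)

CHANNELS = {"IR": 890.0, "WV": 1587.0}   # assumed central wavenumbers [cm-1]

# e.g. a simulated infrared radiance of 80 mW m-2 sr-1 (cm-1)-1
print(brightness_temperature(80.0, CHANNELS["IR"]))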

Martin Bellus, Martina Tudor and François Vinit investigated some odd behaviours of the operational ARPEGE and ALADIN suites. The study of Martin Bellus is described in a separate paper. François Vinit pursued the analysis of the "Trafalgar" case, without finding further clues on the origin of the problems.

Martina Tudor started on the diagnostic of PBL height (a sketch of the computation is given after the figures below). It was computed by searching, upward from the surface, for the first model level where the Richardson number reaches the critical value 0.5. But this led to unrealistically low or high values of PBL height in case of a very cold or warm surface. To solve this problem it was decided to start from the lowest model level instead. This was efficient in removing extreme values, but revealed unpleasant wave patterns associated with very strong horizontal and vertical gradients of temperature. This is clearly due to a fibrillation problem, inducing a computational mode with a period of twice the time-step. Several remedies were tried. Some partly attenuated the problem, some solved it but cannot be used operationally, some made the situation even worse ... What proved efficient was a severe reduction of the time-step (divided by 2) or a uniform anti-fibrillation scheme (as applied in IFS). But retuning the ARPEGE anti-fibrillation scheme, designed to be active only locally, where problems are expected, didn't lead to any improvement (on the contrary!). A first hypothesis to explain this behaviour was that it might be related to the underlying assumption that anti-fibrillation is useless in unstable cases: problems appeared on domains where the atmosphere was unstable in this situation. Later it was found that the anti-fibrillation scheme was in fact responsible, converting temporal oscillations into vertical ones when submitted to strong constraints. A solution is to reduce its spatial variability, limiting its variations along the vertical. We may hope to have a nice PBL-height diagnostic available soon in ALADIN and ARPEGE.

Image12.jpg

Figure 1. PBL height (old computation)

Image13.jpg

Figure 2. PBL height (new computation)

Image14.jpg

Figure 3. Difference between the temperature fields at the two lowest model levels (T30-T31)

Image19.jpg

Figure 4. Impact of time discretisation: T(t+2Δt)+T(t)-T(t+Δt), Δt = 900 s, at level 31
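As announced above, here is a minimal sketch (Python / NumPy) of the PBL-height diagnostic: the bulk Richardson number is computed relative to a reference level, and the PBL top is taken where it first reaches the critical value 0.5, scanning upward. Starting from the lowest model level rather than from the surface corresponds to the change described above; variable names and the handling of weak wind shear are illustrative assumptions:

import numpy as np

G = 9.80665       # gravity [m s-2]
RI_CRIT = 0.5     # critical bulk Richardson number

def pbl_height(z, theta_v, u, v, start=0):
    """
    z, theta_v, u, v : 1-D profiles ordered from the lowest model level upward
    (height [m], virtual potential temperature [K], wind components [m s-1]).
    start = 0 takes the lowest model level as reference (new computation);
    prepending surface values and keeping start = 0 would mimic the old one.
    """
    z_ref, th_ref = z[start], theta_v[start]
    for k in range(start + 1, len(z)):
        shear2 = (u[k] - u[start]) ** 2 + (v[k] - v[start]) ** 2
        ri = G / th_ref * (theta_v[k] - th_ref) * (z[k] - z_ref) / max(shear2, 1e-6)
        if ri >= RI_CRIT:
            return z[k]          # first level where the critical value is reached
    return z[-1]                 # PBL top not found below the model top

The wave patterns discussed above thus come not from the search itself but from the 2Δt oscillations in the underlying temperature fields (Figure 4).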

To end with, a lot of work was devoted to the operational ARPEGE and ALADIN suites: investigating problems within 4D-Var (spurious humidity increments, retuning of horizontal diffusion, sensitivity to resolution and time-step, instabilities, ...), debugging the convection scheme, preparing 3 major changes ("New cycles", described in the previous Newsletter, "CYCORA-ter" and the increase of resolution), correcting mistakes, etc.



