ALADIN developments in Prague during the first quarter of 2000

1. Data assimilation related developments.

a. Tuning of spectral blending.

The tuning of the spectral part of the blending technique continued for the parameters of the internal DFI. The aim was to diminish the spin-up effect (measured via the "set-up" of the physical fluxes of water vapor) and to obtain reasonable-looking structures of vertical velocity in the initial and short-range forecast fields. The optimal DFI parameters were then fixed to TAUS=18000 and NSTDFI=5. These internal DFI values complement the previously determined optimal truncation for the lower resolution: NSMAX=28, NMSMAX=31. As a reminder, the operational values of ALADIN/LACE are NSMAX=71, NMSMAX=79, TAUS=10800, NSTDFI=7. The production external DFI still remains to be slightly retuned; this study is scheduled for April 2000. It should be noted that all these parameters are very probably domain- and resolution-dependent, hence the ratios between the operational and blending-cycle values presented here are probably not directly transferable to other ALADIN applications without some re-tuning.
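For illustration only, the small Python sketch below (not part of the ALADIN code, which is written in Fortran) places the blending-cycle values quoted above next to the operational ALADIN/LACE ones and computes the corresponding ratios, i.e. the quantities that would need re-tuning for another domain or resolution:

    # Blending-cycle versus operational ALADIN/LACE truncation/DFI parameters
    # (values as quoted in the text); the ratios are only indicative and are
    # probably not directly transferable to other ALADIN applications.
    blending    = {"NSMAX": 28, "NMSMAX": 31, "TAUS": 18000.0, "NSTDFI": 5}
    operational = {"NSMAX": 71, "NMSMAX": 79, "TAUS": 10800.0, "NSTDFI": 7}

    ratios = {key: blending[key] / operational[key] for key in blending}
    print(ratios)  # NSMAX ~0.39, NMSMAX ~0.39, TAUS ~1.67, NSTDFI ~0.71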

Furthermore, a long blending-cycle experiment of about two months has begun; it is planned to cover the summer period of 1999. In this experiment both the newly tuned spectral-blending parameters and the blending of surface variables (see Newsletter 17) are used.

More details can be obtained from: Dijana Klaric, Stjepan Ivatek-Sahdan, Klaus Stadlbacher, Martin Janousek, Gabor Radnoti, Radmila Bubnova and Jean-Francois Geleyn.

b. ALADIN/LACE background error statistics to be used in the Jb term of the penalty function.

As announced in the last Newsletter, the study of the background error model for ALADIN continued last winter. Besides the "standard statistics" obtained via differences of forecasts P36h - P12h valid at the same verification time (the NMC method), which have the known weakness of a predominating large-scale component, other combinations of forecast differences were tried.

After analysing the results we again noticed the predominating large-scale features in all the sets, though the last "diff-diff" set gave a little hope. Later on another fresh idea of Vincent Casse was tried: to compute the statistics from forecasts run with constant LBC (statistics called "VC"). A set of 12 h forecasts was therefore recomputed using the one-day-old LBC while keeping the "fresh" initial file, and the statistics were computed from the classical difference (P36 - P12)_lbc=const. At the beginning of the exercise there was a debate whether we should use fresh initial conditions for the recomputed 12 h forecast or whether the 24 h old LBC file should also serve as the initial file. The disappointing results of the former scheme and further brainstorming showed that the 24 h old LBC file should be used as the initial file of P12, in order not to create large discrepancies of the fields at the lateral borders and not to reintroduce the "large-scale" component of the error. Hence the statistics were finally computed from the difference (P36 - P12)_lbc=const, with P12 starting from the 24 h old LBC file (statistics called "JFG"). This combination shows the internal growth of the forecast error when the lateral forcing is identical and when the initial state of P12 does not contain the analysis update due to any "fresh" observation network. This time the scale analysis of the obtained statistics looked quite optimistic, as hoped. However, when single-observation experiments are performed with these statistics, the increment amplitude remains quite small. Moreover, the impact of the observation network, which is already weak when using the NMC method, is almost entirely eliminated. Hence an additional empirical tuning of Jb will be needed to scale it well, while the variance matrices and the multivariate coupling of errors shall be taken from the latest "JFG" statistics. The new method to compute the statistics has later been tried for the FRANCE domain; see Claude Fischer for further details.
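To make the construction of such statistics more concrete, here is a minimal Python sketch of how error standard deviations can be estimated from pairs of forecasts valid at the same time, in the spirit of the NMC-type methods above; the array names and shapes are hypothetical, and the operational computation is of course done in the model's own spectral, Fortran framework:

    import numpy as np

    def background_error_stddev(p36, p12):
        """Sample standard deviation of forecast differences.

        p36, p12: arrays of shape (n_cases, nlev, nlat, nlon) holding one model
        variable from the +36 h and +12 h forecasts valid at the same times.
        In the "JFG" variant the pair is rerun with constant LBC and P12 is
        started from the 24 h old LBC file, isolating the internal error growth.
        """
        diff = p36 - p12
        diff = diff - diff.mean(axis=0)          # remove the sample mean (bias)
        return np.sqrt((diff ** 2).mean(axis=0))  # per-point standard deviation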

More details can be obtained from: Maria Siroka, Claude Fischer, Radmila Bubnova, Jean-Francois Geleyn.

c. Diag.pack

The development of the land-sea contrast to be applied in the structure functions of the SYNOP-level analysis, done by Meriem Zitouni in the fall of 1999, has been taken over by Jure Jerman, particularly as concerns the "distributed memory" part of the code. The validation of the "DM" version shall be done later in Ljubljana before it is phased into the main library.

More details can be obtained from: Meriem Zitouni, Jure Jerman.

2. Developments in the physics.

A set of experiments was carried out to examine the impact of the various CYCORA modifications on the forecast of the 1999 Christmas Storm. The CYCORA package was split into 12 sub-groups according to their nature. Furthermore, four reference forecasts were run with ARPEGE CYCORA and NON-CYCORA physics, starting both from the operational (CYCORA) analysis and from another (NON-CYCORA) analysis (done at Meteo-France by Samuel Westrelin), the latter resulting from a few days of an assimilation cycle with CYCORA switched off. Then, for each of the twelve sub-groups, an ARPEGE forecast was run from both the CYCORA and NON-CYCORA initial states, with the sub-group concerned either activated (with respect to the NON-CYCORA reference forecast) or deactivated (with respect to the CYCORA reference forecast). We thus obtained 48 + 4 forecasts of the case. The results were evaluated in a relative way, according to the criteria of the depth (or intensity) of the cyclone and its position. Out of the twelve groups, three were identified as playing a definitely positive role: i) the parameterization of the PBL (namely the Richardson number limit criterion), ii) the parameterization of the pressure departure within a cloud with respect to the environmental pressure, iii) the scale dependency of the closure of the deep convection parameterization scheme. Within these three sub-groups further refining tests shall be done. As a reminder: ARPEGE-CYCORA produced by far the best forecast worldwide of the storm of 26/12/1999, and did so consistently over three consecutive network times (1999/12/25/r0, 1999/12/25/r12, 1999/12/26/r0).
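The bookkeeping of the 48 + 4 forecasts may be easier to follow with a toy enumeration of the experiment matrix; the sub-group names below are hypothetical placeholders, and the pairing of physics toggles with the two analyses is our reading of the setup described above:

    from itertools import product

    subgroups = [f"group_{i:02d}" for i in range(1, 13)]         # 12 CYCORA sub-groups
    analyses  = ["CYCORA_analysis", "NONCYCORA_analysis"]        # the two initial states
    physics   = ["CYCORA_minus_group", "NONCYCORA_plus_group"]   # sub-group toggled off / on

    experiments = list(product(subgroups, analyses, physics))       # 12 x 2 x 2 = 48 runs
    references  = list(product(["CYCORA", "NONCYCORA"], analyses))  # 4 reference runs
    print(len(experiments), "+", len(references))                   # 48 + 4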

More details can be obtained from: Stephan Greilberger, Martin Janousek, Jean-Francois Geleyn.

3. Developments in the dynamics.

While the NH modelers continued their Prague follow-up efforts in Lautrec, our attention was brought back to the lateral boundary coupling problems thanks to the … Christmas 1999 Storm (!). We mentioned in the previous paragraph the excellent performance of ARPEGE; the same could not be said about the performance of ALADIN. For example, ALADIN/FRANCE did no better than the global model, and ALADIN/LACE missed the storm in the early hours of its existence and displacement. As we figured out later, this was due to weaknesses of the lateral coupling. The update frequency of the LBC data in particular was revealed to be crucial, since the displacement speed of the storm exceeded 100 km/h in the early hours of its existence. In any case, a wide study of the lateral coupling problem has been started; see the special report on it by David Dvorak in this Newsletter. In short: a 3-hour LBC update frequency together with a piecewise parabolic function to estimate the time trajectory of the LBC seems to be the best combination to use at the moment. Only the funds to finance more powerful telecommunication lines are missing.
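As a purely illustrative sketch of the retained time-interpolation strategy, the following Python function performs a quadratic (parabolic) Lagrange interpolation between three consecutive boundary states, assumed valid at equally spaced coupling times (e.g. every 3 hours); it is not the ALADIN coupling code itself:

    def parabolic_lbc(lbc0, lbc1, lbc2, alpha):
        """Quadratic estimate of the LBC trajectory between coupling times.

        lbc0, lbc1, lbc2: boundary fields at three consecutive coupling times;
        alpha: time elapsed since the first state, in units of the coupling
        interval (0 <= alpha <= 2); alpha = 1 returns lbc1 exactly.
        """
        w0 = 0.5 * (alpha - 1.0) * (alpha - 2.0)   # Lagrange weight for lbc0
        w1 = -alpha * (alpha - 2.0)                # Lagrange weight for lbc1
        w2 = 0.5 * alpha * (alpha - 1.0)           # Lagrange weight for lbc2
        return w0 * lbc0 + w1 * lbc1 + w2 * lbc2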

More details can be obtained from: David Dvorak, Radmila Bubnova.

Besides the coupling problems, a 41-level version of ALADIN/LACE has been checked in the parallel suite, anticipating the future vertical resolution of ARPEGE. However, a higher level of noise was detected and the impact on the scores was also negative. A detailed study revealed a problem coming from the e927 extrapolation to the additional levels in the stratosphere. Since a short-range LAM does not really need increased resolution in the stratosphere, we have redefined the future vertical grid of ALADIN/LACE to 37 levels (currently 31), all added levels being in the troposphere, in the same proportion as for the anticipated ARPEGE increase from 31 to 41 levels. A few tests were run to examine the level of noise; since the problem no longer appeared, the increase of the vertical resolution from 31 to 37 levels shall be tested in a parallel suite.

More details can be obtained from: Maria Siroka, Radmila Bubnova.

4. Developments in the diagnostics.

A small tool has been developed to diagnose the balance of physical fluxes, concretely the radiation and water vapor fluxes, including the time evolution of the balance within the model forecast. This tool, working on the history files, was very useful for evaluating the model spin-up in the blending experiments.
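A rough idea of such a diagnostic is given by the sketch below, which computes a domain-mean water vapor budget residual from hypothetical accumulated surface fluxes; the real tool works directly on the ALADIN history files and handles the radiation fluxes as well:

    import numpy as np

    def moisture_balance(evaporation, precipitation, area_weights):
        """Domain-mean residual of accumulated moisture fluxes [kg m-2].

        evaporation, precipitation: accumulated surface fluxes on the grid;
        area_weights: grid-cell area weights. A residual drifting strongly
        during the first forecast hours is a typical spin-up signature.
        """
        w = area_weights / area_weights.sum()
        return float(((evaporation - precipitation) * w).sum())

    # Evaluating the residual at successive output times gives the time
    # evolution of the balance within the model forecast.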

More details can be obtained from: Klaus Stadlbacher, Martin Janousek.

5. Developments in the verification.

There was no development of the verification tools in the first quarter of 2000.

6. Technical developments.

The practical implementation of the SMS (Scheduler-Monitor-Supervisor) software in the ALADIN/LACE operational suite is under way. It will also include the more complex structures of the blending/assimilation tasks.

More details can be obtained from: Metod Kozelj, Roman Zehnal, Martin Janousek.

7. Work on documentation.

There was no particular effort on the documentation in the first quarter of 2000.



