The operational ALADIN models

1. Changes in the operational version of ARPEGE along the second half of 2002

(more details samuel.westrelin++at++meteo.fr)

22 October 2002 : "New observations"

From the run of the 22nd of October at 06 UTC, the raw AMSU-A radiances have been assimilated in the 4d-var suite, instead of the processed HIRS and AMSU-A radiances. The geopotential from radiosondes is not assimilated any more, but temperature at standard levels now is (in addition to that at characteristic levels). Wind data from European and American profilers are now assimilated. Concerning the physics, a few minor bug corrections have been introduced.

The coverage of the polar zones with radiances is now far better. Evaluation statistics are good from low levels up to the stratosphere. The relative humidity in the tropics is degraded, due to the loss of the HIRS channels.

17 December 2002 : "New cycle"

Starting from the run of the 17th of December at 06 UTC, the following modifications were carried out:

- implementation of the new cycle, 25T1;

- assimilation of NOAA17 AMSU-A radiances.

These changes did not have a significant impact on scores.

Latest parallel suite : "DICORA" (DIffusion COnvection RAdiation)

Successfully tested from the 18th of December 2002 until the 10th of February 2003, it included the "first package" of changes in physics, described in the dedicated paper by Jean-François Geleyn, and a few technical changes or bug corrections (ODB, Full-Pos, ISP, ...).

Next parallel suite: "COCONUT", with the "second package" of changes in physical parameterisations.

CAUTION

These changes will significantly enhance the discrepancy between the physics of the coupling and the coupled models, for those applications still on old versions, with a possible negative impact on forecasts. Back-phasing of the modifications in physics may be performed for those not yet on cycle 25T1, but not further back than AL15.

2. Operational versions in Austria

(more details thomas.haiden++at++zamg.ac.at)

At the end of September 2002 the operational suite was switched from AL12 to AL15_Cycora-ter, with bug corrections in accoefk and aplpar. Roughly at the same time, daily pre-operational integrations on the LACE domain were started. Since 1 January 2003, both the ALADIN-LACE domain and the ALADIN-VIENNA domain have been integrated operationally. ARPEGE coupling files (3-hourly) are downloaded via the Internet.

3. Operational version in Belgium

(more details olivier.latinne++at++oma.be)

The installation of AL15/CY24T1 on SGI implied a long debugging task, because of some portability problems and some weaknesses of the compilers. Some shifts in the concepts of the output file contents required adapting our local post-processing tools and operational chain. At the same time, we decided to review the contents of the archives, to limit the occupied volume and to give quicker access (avoiding additional post-processing) to the most often needed products. A parallel suite was launched in March, and later adapted in June following the discovery of some bugs at Météo-France. However, comparisons of some precipitation events showed a severe degradation compared to AL12; this was traced to the return to GCOMOD=1, which seemed inappropriate at high resolution.

4. Operational version in Bulgaria

(more details andrey.bogatchev++at++meteo.bg)

Nothing new during the last months.

5. Operational versions in Croatia


Main features of the ALADIN-CROATIA operational suite in the Croatian Meteorological service:

SGI 3400, 16 processors, Irix 6.5

A) ALADIN-LACE: 12.176 km mesh size, 229x205 (240x216) gridpoints, SW corner (34.00, 2.18), NE corner (56.96, 55.07), 37 vertical levels, coupled to ARPEGE with a 3-hour interval, 48h forecasts using the AL12T1 cycle.

1) LBCs are retrieved from the sirius servers at Météo-France using the CARNet (Croatian Academic and Research Network) Internet connection.

2) e927 starts as soon as an LBC file is downloaded; e001 starts after e927 has finished for the second file.

3) When all LBCs are available on time, the 48h forecast integration, with 14 DFI steps, takes 65 minutes using 11 processors.

B) ALADIN-CROATIA: 8 km mesh size, 127x109 (144x120) gridpoints, SW corner (41.79, 8.93), NE corner (49.53, 21.98), 37 vertical levels, coupled to ALADIN-LACE with a 3-hour interval, 48h forecasts using the AL12T1 cycle.

1) The suite starts after the ALADIN-LACE 48h integration is finished.

2) The 48h forecast takes 30 minutes on 14 processors.

3) Both parts ran in a test suite from the 1st of November and became operational on the 20th of December 2002.

C) Dynamical adaptation (DADA): the wind field is adapted to higher-resolution climatological surface data using the dynamical adaptation method, on 4 domains along the Adriatic coast and 1 inland. It starts after the ALADIN-CROATIA 48h integration is finished. Only the surface wind field is used for operational purposes.

D) Output fields are visualised using the GrADS visualisation package.

E) PSEUDOTEMPs are produced from both 48h forecasts, and HRID meteograms are visualised.

F) Visualisation of ALADIN-LACE, ALADIN-CROATIA and dynamical adaptation forecast data together with data measured at SYNOP stations in Croatia (example in Figure 3).

Hr1_fig1a.gif Hr1_fig1b.gif

Figure 1: ALADIN/LACE (12 km resolution) and ALADIN/HR (8 km resolution) domains (left). ALADIN/HR (8 km resolution) and dynamical adaptation (2 km resolution) domains for the surface wind fields (right); the domains senj, masl, spli, dubr and karl are used operationally.

Hr1_fig2.gif Figure 2: The dates when the parts of the ALADIN operational suite in the Croatian Meteorological Service finished, for the end of December 2002 and the beginning of January 2003.

Hr1_fig3a.gif Hr1_fig3b.gif Hr1_fig3c.gif

Figure 3: Visualisation of ALADIN-LACE, ALADIN-CROATIA, dynamical adaptation forecast data and data measured at SYNOP stations in Croatia for temperature (left), mean sea level pressure (centre) and wind speed (right).

6. Operational versions in France

(more details samuel.westrelin++at++meteo.fr)

Similar changes were made in ALADIN-France as in ARPEGE during these months:

"New cycle"

Together with the implementation of AL25T1, some changes were brought to the post-processing: addition of new dynamical fields (iso-PV fields, isobaric PV), filtering of derived fields (vertical velocity, dynamical fields).

"DICORA"

The benefits on temperature, from the ground to the top of the mixed layer, are noticeable (see Figures 1 and 2), and the same holds for 2-metre humidity (Figure 3).

(three panels, from left to right: rmse, std. dev., abs. bias)

dble_temp_T.gif

Figure 1. Scores against TEMP observations for ALADIN-France: difference oper/radiosonde minus test/radiosonde.

(isolines every 0.05 K; 43 cases, from 21/12/2002 00 UTC to 05/02/2003 00 UTC)

(green: improvement, red: deterioration)

dble_synop_T.gif

Figure 2. Scores against SYNOP observations of 2m temperature for ALADIN-France

(43 cases, from 21/12/2002 00 UTC to 05/02/2003 00 UTC; with height correction)

(curves: bias oper/SYNOP, rms oper/SYNOP, bias test/SYNOP, rms test/SYNOP)

dble_synop_H.gif

Figure 3. Scores against SYNOP observations of 2m relative humidity for ALADIN-France

(43 cases, from 21/12/2002 00 UTC to 05/02/2003 00 UTC)

(curves: bias oper/SYNOP, rms oper/SYNOP, bias test/SYNOP, rms test/SYNOP)

ALADIN-Réunion stopped on the 26th of November 2002, after an increase of the horizontal resolution of its former coupling model, ARPEGE-Tropiques (from TL299 to TL359).

7. Operational version in Hungary

(more details horanyi++at++met.hu)

During the first half of 2002 the migration from the SGI Origin 2000 platform to the IBM Regatta (p690) platform was successfully completed. The main achievements during the second part of 2002 concentrated on the definition of a new model domain and resolution, together with the preparations needed for the foreseen changes in the operational part of the RC-LACE project. Besides that, the major scripts were developed for the exploitation of the ALADIN 3d-var assimilation scheme in Budapest.

The most important activities around the local version of the ALADIN model in Budapest are detailed hereafter:

-- Installation of the LoadLeveler job-scheduling system on the IBM machine. LoadLeveler ensures that the operational applications get full priority during their execution, while research jobs are pre-empted during this time window. Different job classes were created for jobs with different memory and CPU consumption (oper, mono, small, medium, big and inter_class). The operational scripts were modified accordingly, with the header needed by LoadLeveler (a sketch of such a header is given after this list).

-- Definition of the new model domain and creation of the new climate files needed for the operational application (note that the 923 procedure was still executed on the old SGI machine, because that configuration does not work in the newer model versions). The new sets of climate files are: input climate files at ARPEGE resolution (Lambert projection), ALADIN/HU domain climate files (Lambert projection), post-processing climate files (lat × lon), and dynamical adaptation climate files (Lambert projection).

-- Access to the BDPE database of Météo-France was requested and obtained, for operational access to the LACE lateral boundary data available in Toulouse. Fail-safe scripts were written for retrieving the LBC data (at 3-hourly frequency) from Toulouse through the Internet. The Internet transfer rates have been continuously monitored and evaluated since the transfer became operational (mid-July). The reliability of the Internet transfer was found satisfactory for a possible operational application; however, for the longer term a more reliable solution should be considered (probably the RETIM 2000 satellite dissemination system, which is supposed to become operational in the first part of 2003).

-- A parallel suite was established for the new model domain, resolution and LBC frequency, and a systematic basic statistical comparison between the operational and test model versions was carried out. The parallel suite ran from mid-July until the beginning of November. The comparison showed an overall neutral impact, therefore the operational introduction of the new model version was scheduled for the beginning of November.

-- Accumulation of forecast information was ensured during the parallel-suite phase (from mid-July until the beginning of November) and background error statistics were computed over the new ALADIN/HU domain.

-- New scripts were written for the exploitation of the 3d-var application. The 3d-var model version was put into a parallel suite during the second part of December (see some more details in the ALATNET Newsletter in the same volume).

-- Special post-processing (Full-Pos) scripts were written and tested for converting the new domain's results to the nowcasting domain for the application of the Diagpack scheme (the domain for the CANARI optimal interpolation scheme has been kept unchanged). Note also that, instead of the previously used on-line Full-Pos post-processing, an off-line solution was adopted (due to the unreliable functioning of the on-line version of the Full-Pos package).

-- Some auxiliary software (pseudo-TEMP creation, the VERAL verification package, etc.) was adapted and introduced before the end of the year, in order to ensure a smooth transition for the operational changes coming from the disappearance of the central LACE operational exploitation in Prague.

-- After the careful evaluation described above (over a period of about 3 months), the new model version was introduced operationally on the 5th of November (see details of the operational version below).
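To illustrate the LoadLeveler headers mentioned in the first item above, here is a minimal sketch of the kind of job-script header involved; the directive keywords are standard LoadLeveler ones, but the job name, class choice and node counts are made-up placeholders, not taken from the actual Budapest scripts:

   #!/bin/ksh
   # LoadLeveler reads its directives from comment lines beginning with "# @".
   # All values below are hypothetical placeholders.
   # @ job_name       = aladin_forecast
   # @ class          = oper
   # @ job_type       = parallel
   # @ node           = 3
   # @ tasks_per_node = 8
   # @ output         = $(job_name).$(jobid).out
   # @ error          = $(job_name).$(jobid).err
   # @ queue
   # ... body of the script (file transfers, model execution, ...) ...

Jobs submitted with such a header (via llsubmit) are then scheduled according to the priorities attached to their class.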

The main characteristics of the new ALADIN/HU application are as follows:

-- The horizontal domain covers practically the same area as the former LACE domain (see the enclosed figure for the domain and its orography).

-- The horizontal resolution is 6.5 km, with 37 model levels in the vertical.

-- The number of gridpoints including the extension zone is 432×384.

-- The time-step used is 270 s.

-- The coupling frequency is 3 hours and the coupling information comes directly from the ARPEGE global model (so the double-nesting strategy was replaced by a single-nesting solution).

-- The integration time of the model is around 1 hour using 24 processors of the IBM machine.

ALADIN_HU.gif

The new model version has been running with high reliability since its introduction. For the near future the following plans are considered:

-- Development of a graphical diagnostic tool for the monitoring of the operational suite by the operators.

-- Revision of the old operational scripts and development of new ones. The new script system is supposed to be more modular and systematically logical, and will be prepared by a professional software company.

-- 24-hour monitoring of the operational suite, with possible intervention of the scientists in case of necessity (mobile phone and laptop for remote login).

-- In the first half of 2003 (after completion of the previously mentioned tasks), the operational application of the 3d-var scheme is planned.

8. Operational LACE application

(more details vana++at++chmi.cz)

1. Evolution of the ALADIN/LACE application.

The ALADIN/LACE suite switched to CYCORA-ter+++ on 26/11/2002, at 12 UTC network time for the production run and at 06 UTC network time for the assimilation cycle.

This physics package, known under the name CYCORA-ter, became operational in ARPEGE already in November 2001. ALADIN-LACE did not follow the change, due to a negative impact on the screen-level temperature scores in the "continental winter" type of weather. Many tests were made to understand the cause; finally a retuning of the mixing-length profiles provided more satisfactory results. Besides, in spring 2002 there were a few cases of too intense rain produced by the stratiform precipitation scheme. As a cure, the shear-linked convection scheme was introduced and tested; the results mostly improved the precipitation patterns. Finally, a time-smoothing of the shallow-convection scheme was introduced. All these changes were tested together in parallel suites, for winter and summer periods (parallel suites ABU and ABW described below). For both periods the score response was satisfactory.

Impact on the forecast: slight improvements in the scores and a significant improvement of the temperature forecast.

Technical impact: none.

2. Parallel Suites & Code Maintenance

The Prague team launched the following parallel tests to assess the impact of different modifications:

- Suites ABR, ABS and ABT: all these suites refer to the ABR suite (reported in the previous Newsletter), where we tested the latest version of CYCORA-ter with (ABR) or without (ABS) activating the recently developed shear-linked convection scheme (in the previous Newsletter this development was denoted slantwise convection, but it was later renamed to fit better the nature of the scheme). In the ABT suite we used another tuning of the USURID and USURIDE parameters. Among these three tests the ABR suite gave the best results; however, we noticed that the computations did not give stable results, depending on which version of the cross-compiler was used. This behaviour was analysed in depth and it became obvious that the recently used compiler release contained a bug. The problem was reported to NEC and, with the help of an analyst, it was possible to verify that a later release of the compiler was stable; this newest version was thus installed both on the NEC computer and on the other servers with a cross-compiler licence. To be sure about the results, the suites denoted ABU, ABV and ABW were recomputed using the stable compiler release; their performance is described below.

- Suite ABU: the ABR suite launched for the winter period. There is a slight improvement of the geopotential, temperature and moisture scores near the surface.

- Suite ABV: like the ABR suite but with retuned USURID and USURIDE parameters (in fact the ABT setting). This suite was also launched for the winter period. It did not give as good results as suite ABU.

- Suite ABW: this suite has the ABU settings and was run for the summer period. As in winter, the scores are slightly better; the best result is reached for the temperature score. This suite was further tested on the disastrous Central-European floods; even there the precipitation pattern looked even more realistic than the already excellent precipitation forecast provided by the operational configuration.

The results of parallel tests may be consulted on / pages.

9. Operational versions in Morocco

(more details radi.ajjaji++at++cnrm.meteo.fr)

In addition to the operational ALADIN-Morocco suite, Casablanca has been running, since the 18th of February, a second ALADIN version on a large North-African area: ALADIN-North Africa (NORAF) is now a reality. Here are the technical details of the new and the old applications.

The operational ALADIN-Morocco (ALBACHIR) runs up to 60 hours twice a day, at a new resolution of 9 km and without data assimilation. The associated grid is 360x320x41 and the time step is 469.566 s. This new configuration of ALADIN-Morocco is very similar to that of ALADIN-France in terms of resolution, post-processed fields, coupling frequency (3 hours) and code level (25T1 at the time of writing). But this configuration is now coupled with ALADIN-NORAF, whose characteristics are as follows:

Time step: 900 s

Horizontal resolution: 31 km

Number of vertical levels: 41

Number of gridpoints on the horizontal: 180x300

North-East corner latitude: 43.0°

North-East corner longitude: 56.0°

South-West corner latitude: 0.0°

South-West corner longitude: -36.0°

This model is coupled asynchronously with ARPEGE, with a coupling frequency of 6 hours. It runs twice a day up to a 72-hour range. ALADIN-NORAF performs its own assimilation using the CANARI Optimum Interpolation code under ODB (which has run correctly on the IBM since the migration to 25T1), for 4 networks (r00, r06, r12 and r18). All the observations are collected through the SMT channel and organised in a local "BDM", which is now well interfaced with ODB (thanks to the Batodb package provided by Dominique Puech).

The wall-clock times taken by the several components of this new system (on the IBM SP) are:

Concerning ALADIN-NORAF:

Assimilation: 5 minutes per network (10 processors in MPI mode)

Coupling: 30 seconds per ee927 transformation (4 processors in MPI mode)

72-hour forecast: 25 minutes (16 processors in MPI mode)

Full-Pos: 20 seconds per output range (4 processors in MPI mode)

Graphics & GRIB: 30 seconds per output range.

Concerning ALADIN-Morocco:

Coupling: 50 seconds per ee927 transformation (4 processors in MPI mode)

60-hour forecast: 40 minutes (32 processors in MPI mode)

Full-Pos: 40 seconds per output range (4 processors in MPI mode)

Graphics & GRIB: 30 seconds per output range.

We are now working on moving our operational suite to a new configuration performing assimilation with the 3d-var technique (Blendvar mode). This is not straightforward, given the complexity of observation handling with ODB (Batodb, Lamflag, Shuffle, Screening, Minimization, etc.). We hope that by the time 3d-var becomes ready for operations, all these different components will be more flexible and less complex!

We are working especially on the observation part: we intend to prepare our own radiance observations using a local HRPT station, and we hope to be helped by our colleagues in Lannion (France).

With the migration to cycle 25T1, the model timing performance was very bad: the calculations corresponding to a diabatic time step were very expensive compared to the old operational model with AL13, with more than 30 % extra cost. This encouraged us to investigate the use of OpenMP in the source code.

To turn on OpenMP in the model on an IBM SP machine equipped with the XLF Fortran compiler, you just need to compile with the ``-qsmp=omp'' option. The ARPEGE branch is safe in terms of OpenMP syntax; this is not the case for some ALADIN routines (espc.F90, eslextpol.F90, eslextpol1.F90, etc.), but the correction of these compilation bugs is not difficult.

When statically linking the executable, you also need to specify the ``-qsmp=omp'' option, so that the OpenMP libraries are taken into account.
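As a minimal, self-contained illustration of what this option enables (a toy program of ours, not an extract from the ARPEGE/ALADIN source):

   ! omp_demo.F90 -- a toy OpenMP-parallelised loop.
   ! Compile and link on the IBM SP with, e.g.:  xlf90_r -qsmp=omp omp_demo.F90
   ! Without -qsmp=omp, the !$OMP lines are simply treated as comments.
   program omp_demo
     implicit none
     integer :: i
     real    :: a(100000)
   !$OMP PARALLEL DO
     do i = 1, 100000
        a(i) = sqrt(real(i))
     end do
   !$OMP END PARALLEL DO
     print *, 'a(100000) =', a(100000)
   end program omp_demo

The number of threads is then chosen at run time through the OMP_NUM_THREADS environment variable, as in the table below.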

This table shows the results found for several mixed-mode (MPI+OpenMP) configurations, for a small ALADIN-Morocco version (180x180x41 points):

MPI tasks (NPROC)   OpenMP threads (OMP_NUM_THREADS)   Wall-clock time for one diabatic time step
16                  1                                  4.6 s
8                   4                                  3.4 s
16                  2                                  2.9 s
4                   8                                  4.0 s
2                   16                                 6.0 s

To optimize performance on ``mixed-mode'' hardware like the IBM SP:

- MPI is used for ``inter-node'' communication, and threads (OpenMP / Pthreads) are used for ``intra-node'' communication.

- Threads have lower latency and, as is clear from the table, can speed up a code when MPI becomes latency-bound.

- Threads can alleviate the network contention of a pure MPI implementation.

- Runtime checking overhead must be controlled by runtime environment variables.

But OpenMP performance is good only if the piece of work is large (as is the case for ARPEGE/ALADIN). This performance is degraded by other processes running on the computation node; it is therefore important to run on a dedicated SMP node.
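Below is a hedged sketch of the hybrid decomposition described above (again a toy program, not ALADIN code; it assumes an MPI installation providing the Fortran ``mpif.h'' include file and a compiler wrapper such as mpxlf90_r):

   ! hybrid.F90 -- one MPI task per node, OpenMP threads within each node.
   ! Compile with, e.g.:  mpxlf90_r -qsmp=omp hybrid.F90
   program hybrid
     implicit none
     include 'mpif.h'
     integer :: ierr, rank, ntasks, tid
     integer, external :: omp_get_thread_num
     call MPI_INIT(ierr)                              ! "inter-node" level: MPI tasks
     call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
     call MPI_COMM_SIZE(MPI_COMM_WORLD, ntasks, ierr)
   !$OMP PARALLEL PRIVATE(tid)
     tid = omp_get_thread_num()                       ! "intra-node" level: threads
     print *, 'MPI task', rank, 'of', ntasks, ', OpenMP thread', tid
   !$OMP END PARALLEL
     call MPI_FINALIZE(ierr)
   end program hybrid

Launching it with different NPROC / OMP_NUM_THREADS combinations reproduces the kind of experiment summarised in the table above.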

We suggest that the whole ALADIN community make an effort on OpenMP. One way to detect where OpenMP directives must be included is to remember where the CRAY multi-tasking directives were (especially for Full-Pos and CANARI, which are not treated by the ECMWF specialists). For the time being, the model part seems OK (at least in the few experiments done here in Casablanca): examination of the norms in the e001 listings, for one and the same run in the different cases of the table above, showed no differences! This may still be wrong, and we suggest more investigations on different platforms.

10. Operational version in Poland

(more details zijerczy++at++cyf-kr.pl)

In the second half of 2002 the main efforts focused on three aims. The first was the preparation of a new ALADIN suite based on AL15, with enhanced capabilities of the operational software environment for the model. The second was the preparation of a dynamical adaptation application for the territory of Southern Poland; two configurations, with 4.0 km and 2.6 km resolution respectively, are ready to use. The third was the building of preliminary versions of two user interfaces for nowcasting of deep convection, fog and visibility. The interfaces are built with web technology and give the opportunity of simultaneous presentation of NWP fields along with Meteosat images. All three tasks are to be continued in 2003.

11. Operational version in Portugal

(more details maria.monteiro++at++meteo.pt)

During the second half of 2002 no changes took place in the Portuguese operational suite (AL12_bf_CYCORA_bis). The disseminated ALADIN/Portugal surface parameter fields are now used to feed an operational scheme of sea-condition forecasts running under Portuguese Navy supervision.

The CANARI/DIAGPACK package is still a target for validation/verification purposes. Besides, the identification of extreme precipitation events is under way, in order to test and validate the diagnostic tools with the ALADIN/Portugal fields.

12. Operational version in Romania

(more details banciu++at++meteo.inmh.ro)

Implementation of a new operational suite with tasks shared between the SUN E4500 and ALPHA DEC platforms (detailed in the report on "ALADIN developments").

13. Operational version in Slovakia

(more details Olda.Spaniel++at++mail.shmu.sk)

There was nothing new in operations for ALADIN-SLOVAKIA. The DEC Alpha XP1000 workstation was upgraded with 1 GB of memory and the AL25 version was implemented (DIGITAL Fortran 90, version V5.2 at least, is required for that ALADIN version).

The automatic text forecast from ALADIN for selected towns was updated (a full English version is now available).

The new meteorological workstation ``Visual Weather'' was implemented in the forecasters' department, for visualisation and processing of NWP data. More information can be found at http://www.meteoam.it/egows/

The main effort during this period was concentrated on the provision of ALADIN/LACE products for the beginning of 2003. SHMI will cooperate with ZAMG Vienna in this field over the next period.

14. Operational version in Slovenia

(more details neva.pristov++at++rzs-hm.si)

The operational suite was still running on our old cluster of workstations during the second half of 2002. The characteristics of the operational model configuration can be found in ALADIN Newsletter 21. Two changes were implemented recently:

- switch to 37 vertical levels in the model;

- replacement, at the end of the year, of the coupling files prepared from ALADIN/LACE in Prague by coupling files prepared from ARPEGE in Toulouse. The coupling time-frequency changed from 6 hours to 3 hours.

The operational performance of the ALADIN/SI model was checked for the year 2002. One can notice that the number of unsuccessfully finished model integrations increased compared to the previous year.

                             2000                2001                2002
                          00 run    12 run    00 run    12 run    00 run    12 run
operational not finished  27 (7.4%) 25 (6.8%) 14 (3.8%) 13 (3.6%) 28 (7.7%) 19 (5.2%)
hardware problems         19        15         4         3        26        19
missing LBC                7         6         6         4         1         0

ALADIN_SI.gif

Anyway, products were available in reasonable time in the 14 cases when the model was rerun manually. The main reason for missing runs was hardware problems. A few times the integration stopped because of a floating-point exception, which did not always reappear when the model was rerun. The files with lateral boundary conditions, from Prague and (during the second half of December) also from Toulouse, were transferred very regularly.

15. Operational version for Tunisia

(more details nmiri++at++meteo.nat.tn)

The arrival of a dedicated computer was unfortunately delayed to 2003. See the ALADIN report.