{ "cells": [ { "cell_type": "markdown", "id": "35258028-2bbc-4a44-935d-406ded46bcce", "metadata": { "tags": [] }, "source": [ "
" ] }, { "cell_type": "markdown", "id": "67439f30-4c56-4e93-925c-47389dccfbc0", "metadata": { "jupyter": { "source_hidden": true }, "tags": [] }, "source": [ "

ICON Training - Hands-on Session

\n", "\n", "# Exercise 4: Running the ICON-LAM Setup\n", "\n", "---\n", "
\n", "\"alt_text\"/\n", "
Limited area domain over Central Europe with 6.5 km mesh size (R3B8).
\n", " The light colored area shows the extent of the boundary zone (4 cell rows) and nudging zone (10 cell rows).
\n", "
\n", "\n", "This step-by-step exercise will familiarize you with the basic steps to **set up and run ICON-LAM on the DKRZ cluster \"Levante\"**.\n", "\n", "It covers the following topics:\n", "\n", "- necessary input data for running ICON-LAM\n", "- preparing the ICON-LAM batch job and namelist\n", "- ICON dictionaries\n", "- running the ICON-LAM batch job\n", "\n", "**Note:** This script is not suited for operational use, it is part of the step-by-step tutorial.\n", "\n", "

Example Setup

\n", "\n", "In this exercise we will perform a **48h forecast run with ICON-LAM over Germany**, starting at **2021-07-14 00UTC**. Further details of the configuration are provided in the table below.\n", "The model will be driven by initial and boundary data which come from global ICON forecasts retrieved from DWD's database. These data were preprocessed in the previous exercise [`Pre-Processing for ICON``](../exercise_prepare_lam/icon_exercise_prepare_lam.ipynb).\n", "\n", "| Configuration | |\n", "| :--- | :--- |\n", "| mesh size | 6.5 km (R3B8) |\n", "| model top height | 22 km |\n", "| no. of levels | 65 |\n", "| no. of cells (per level) | 37488 |\n", "| time step | 60 s |\n", "| duration | 48h |\n", "\n", " \n", "---" ] }, { "cell_type": "markdown", "id": "f476b83b-5d0c-4178-8782-cd37b2e27576", "metadata": { "tags": [] }, "source": [ "## Necessary input data for ICON-LAM\n", "\n", "---\n", "
\n", "\"alt_text\"/\n", "
Basic input data for ICON-LAM
\n", "
\n", "\n", "\n", "Running ICON-LAM requires the following set of input files to be available:\n", "\n", "* **horizontal grid file**: coordinates and topological index relations between cells, edges and vertices\n", "* **external parameter file**: time invariant fields such as topography, lake depths or soil type\n", "* **initial conditions**: snapshot of an atmospheric state and soil state from which the model is integrated forward in time\n", "* **lateral boundary conditions**: snapshots of the atmospheric state at regular time intervals, which 'drives' the model at its lateral boundaries\n", "\n", "Please note that a **vertical grid file** is not required. The vertical grid is constructed during the initialization phase of ICON, based on a set of ICON namelist parameters.\n", "\n", "The preparation of these input files has been discussed in more depth in the exercise [`Pre-Processing for ICON`](../exercise_prepare_lam/icon_exercise_prepare_lam.ipynb). At this point we assume that all input files are available and located in the following directories:\n", "\n", "- **initial and boundary data:** `$SCRATCHDIR/data_lam`\n", "- **grids and external parameter:** `/pool/data/ICON/ICON_training/exercise_lam/grids`\n", "\n", "---" ] }, { "cell_type": "code", "execution_count": null, "id": "09f29456-d342-4ceb-bbc3-b02e018d28ec", "metadata": {}, "outputs": [], "source": [ "export SCRATCHDIR=/scratch/${USER::1}/$USER" ] }, { "cell_type": "markdown", "id": "743afb34-4216-44fe-9915-6bc2ea74f068", "metadata": {}, "source": [ "### Inspecting the boundary data files" ] }, { "cell_type": "markdown", "id": "279a4f40-ce28-4acf-b41b-2c4ab268f30e", "metadata": { "jp-MarkdownHeadingCollapsed": true, "tags": [] }, "source": [ "
\n", " Exercise: \n", " Run the CDO command cdo sinfov to investigate the content of a lateral boundary file located in $SCRATCHDIR/data_lam.\n", " The cdo command is available as soon as you have loaded the corresponding module with module load cdo.\n", " \n", "
" ] }, { "cell_type": "code", "execution_count": null, "id": "f0626c38-3631-41d7-87b6-54ff04c50460", "metadata": {}, "outputs": [], "source": [ "" ] }, { "cell_type": "markdown", "id": "41346248-3b3c-4404-ac6c-56fd021a9227", "metadata": {}, "source": [ "
\n", "Solution\n", "\n", "```\n", "module load cdo\n", "cdo sinfov $SCRATCHDIR/data_lam/forcing_ML_20210714T020000Z_lbc.nc\n", "```\n", "\n", "Answer:\n", "- horizontally interpolated from R3B7 (13km) to R3B8 (6.5km)\n", "- NetCDF format instead of GRIB2 format\n", "\n", "
\n", "\n", "
" ] }, { "cell_type": "markdown", "id": "d6202365-b602-43b7-a5d7-a6c694390666", "metadata": { "jp-MarkdownHeadingCollapsed": true, "tags": [] }, "source": [ "
\n", " Exercise (Time steps in the boundary data files):\n", "
\n", " How many timesteps are contained in a single file? \n", "
      

\n", " What is the time interval between two consecutive boundary data files?\n", "

" ] }, { "cell_type": "raw", "id": "38288f3b-bd14-417f-8c54-2082ec9899f4", "metadata": {}, "source": [ "" ] }, { "cell_type": "markdown", "id": "97f631cf-ac43-49ba-9493-c84130f2bd29", "metadata": {}, "source": [ "
\n", "Solution\n", "\n", "- [ ] none\n", "- [X] 1\n", "- [ ] 4\n", "\n", "The time interval between two consecutive boundary data files is **2 hours**\n", "\n", "
\n", "\n", "
" ] }, { "cell_type": "markdown", "id": "23780817-53c7-4d71-96fc-f7a26784e7c4", "metadata": { "jp-MarkdownHeadingCollapsed": true, "tags": [] }, "source": [ "
\n", " Exercise: \n", " Inspect the number of vertical levels in a boundary data file. Is it different from the number of vertical levels that is used by the model itself?\n", "
" ] }, { "cell_type": "raw", "id": "6eb993e2-1c0d-4d2d-b54f-4904a8d3bd5e", "metadata": {}, "source": [ "" ] }, { "cell_type": "markdown", "id": "5aebfff1-a639-48c6-8b52-f42b1c3872c3", "metadata": {}, "source": [ "
\n", "Solution\n", "\n", "**Answer: The number of vertical levels is different**\n", "\n", "Get vertical levels in a boundary file:\n", "```\n", "module load cdo\n", "cdo sinfov $SCRATCHDIR/data_lam/forcing_ML_20210715T060000Z_lbc.nc\n", "```\n", "**boundary file: 90 levels**\n", "\n", "**Model: 65 levels** (see table above in Section 'Example Setup')\n", "\n", "
\n", "\n", "
" ] }, { "cell_type": "markdown", "id": "33a85a82-2827-4ed8-9c85-25f6d570d804", "metadata": {}, "source": [ "## Preparing the ICON-LAM run (namelists etc.)" ] }, { "cell_type": "markdown", "id": "8c1c0ed3-8ad3-4a2e-9adc-246a60aae368", "metadata": {}, "source": [ "Execute the following cell in order to set some environment variables that will be used during this exercise." ] }, { "cell_type": "code", "execution_count": null, "id": "eccee5ab-ad5e-4d5e-ba1d-ef516e52d65a", "metadata": {}, "outputs": [], "source": [ "# base directory for ICON sources and binary:\n", "export ICONDIR=/pool/data/ICON/ICON_training/icon/\n", "\n", "# directory with input grids and external data:\n", "export GRIDDIR=/pool/data/ICON/ICON_training/exercise_lam/grids\n", "\n", "# directory with initial and boundary data:\n", "export SCRATCHDIR=/scratch/${USER::1}/$USER\n", "export DATADIR=$SCRATCHDIR/data_lam\n", "# Fallback reference initial and boundary data:\n", "#export DATADIR=/pool/data/ICON/ICON_training/exercise_lam/data_lam\n", "\n", "# absolute path to directory with plenty of space:\n", "export EXPDIR=$SCRATCHDIR/exercise_lam\n", "export OUTDIR=$EXPDIR\n", "\n", "# absolute path to files needed for radiation\n", "export RADDIR=${ICONDIR}/externals/ecrad/data" ] }, { "cell_type": "markdown", "id": "31cd776e-c91d-4063-86d9-e409cee9c54e", "metadata": {}, "source": [ "Copy input data: grids, external parameters, model. The directory for the experiment will be created, if not already there." ] }, { "cell_type": "code", "execution_count": null, "id": "299d3e82-1310-4109-a692-0cdcf8351fc8", "metadata": {}, "outputs": [], "source": [ "if [ ! 
-d $EXPDIR ]; then\n", " mkdir -p $EXPDIR\n", "fi\n", "cd ${EXPDIR}\n", "\n", "# grid files and external parameter: link to output directory\n", "ln -sf ${GRIDDIR}/iconR*.nc .\n", "ln -sf ${GRIDDIR}/extpar*.nc .\n", "\n", "# data files\n", "ln -sf ${DATADIR}/* .\n", "\n", "# dictionaries for the mapping: DWD GRIB2 names <-> ICON internal names\n", "ln -sf ${ICONDIR}/run/ana_varnames_map_file.txt .\n", "ln -sf ${GRIDDIR}/../exercise_lam/map_file.latbc .\n", "\n", "# For output: Dictionary for the mapping: names specified in the output nml <-> ICON internal names\n", "cp ${ICONDIR}/run/dict.output.dwd dict.output.dwd" ] }, { "cell_type": "markdown", "id": "d88b5019-6992-4972-b321-18c60f2fa95f", "metadata": { "tags": [] }, "source": [ "## Model namelists" ] }, { "cell_type": "markdown", "id": "447dd0c6-19c7-418f-aa0d-c85aa31fc564", "metadata": {}, "source": [ "### Create ICON master namelist" ] }, { "cell_type": "markdown", "id": "7ce4227a-5799-4ecc-9096-661375d9b72c", "metadata": {}, "source": [ "
\n", " Exercise: \n", " Set the correct start date ini_datetime_string and end date end_datetime_string.\n", "
" ] }, { "cell_type": "code", "execution_count": null, "id": "a1ddd28b-342a-488b-981f-36d54323eb24", "metadata": {}, "outputs": [], "source": [ "ndays_restart=60\n", "dt_restart=$((${ndays_restart}*86400))\n", "\n", "cat > icon_master.namelist << EOF\n", "&master_nml\n", " lrestart = .false.\n", "/\n", "&time_nml\n", " ini_datetime_string = \"??????????\"\n", " end_datetime_string = \"??????????\"\n", " dt_restart = $dt_restart ! dummy value to avoid crashes\n", "/\n", "&master_model_nml\n", " model_type=1\n", " model_name=\"ATMO\"\n", " model_namelist_filename=\"NAMELIST_NWP\"\n", " model_min_rank=1\n", " model_max_rank=65536\n", " model_inc_rank=1\n", "/\n", "EOF" ] }, { "cell_type": "markdown", "id": "ed3e91a7-8b2f-40f8-9d51-1037ebd96dd3", "metadata": {}, "source": [ "
\n", "Solution\n", "\n", "```\n", "ini_datetime_string = \"2021-07-14T00:00:00\"\n", "end_datetime_string = \"2021-07-16T00:00:00\"\n", "```\n", "\n", "
\n", "\n", "
" ] }, { "cell_type": "markdown", "id": "2b5cd531-35fe-4f69-bc02-943defad59aa", "metadata": { "tags": [] }, "source": [ "### Create the ICON model namelist NAMELIST_NWP\n", "\n", "In the following we build up the ICON namelist `NAMELIST_NWP` step by step. To do so, we use the UNIX `cat` command in order to collect individual namelists into a single file (named `NAMELIST_NWP`). \n", "\n", "- for a complete list of namelist parameters see [Namelist_overview.pdf](../Namelist_overview.pdf):\n", "\n", "**Please make sure that you execute each of the following cells only once!**" ] }, { "cell_type": "markdown", "id": "7b2364fc-d88d-4789-9a8d-d06e10bd2372", "metadata": {}, "source": [ "
\n", " Exercise (Enabling the ICON Limited Area Mode):
\n", " Enabling the Limited Area Mode (LAM) is rather straightforward. It is easily enabled by a top-level Namelist switch l_limited_area (grid_nml) and by specifying lateral boundary data. Of course, other namelist settings might need to be adapted as well, in order to make a proper ICON-LAM setup.\n", "
    \n", "
  • Switch on the limited area mode by setting l_limited_area and init_mode, see Section 6.3 of the ICON Tutorial.
  • \n", "
  • Initial data: Specify the initial data file via dwdfg_filename. Since we do not make use of additional analysis information from a second data file, remember to set lread_ana accordingly (see Section 6.3).
  • \n", "
  • Boundary data: Specify the lateral boundary data via latbc_filename. You will have to make use of the keywords <y>, <m> <d> and <h> (see Section 6.4.1 of the ICON Tutorial).
  • \n", "
\n", "
" ] }, { "cell_type": "code", "execution_count": null, "id": "b1417558-a45a-4304-bd3b-e513fba68a79", "metadata": {}, "outputs": [], "source": [ "cat > NAMELIST_NWP << EOF\n", "! grid_nml: horizontal grid --------------------------------------------------\n", "&grid_nml\n", " dynamics_grid_filename = 'iconR3B08_DOM01.nc' ! array of the grid filenames for the dycore\n", " radiation_grid_filename = 'iconR3B07_DOM00.nc' ! array of the grid filenames for the radiation model\n", " lredgrid_phys = .TRUE.,.TRUE. ! .true.=radiation is calculated on a reduced grid\n", " lfeedback = .TRUE. ! specifies if feedback to parent grid is performed\n", " ifeedback_type = 2 ! feedback type (incremental/relaxation-based)\n", " l_limited_area = ?????????? ! limited area run TRUE/FALSE\n", "/\n", "\n", "! initicon_nml: specify read-in of initial state ------------------------------\n", "&initicon_nml\n", " init_mode = ?????????? ! start from DWD initialized analysis\n", " dwdfg_filename = ?????????? ! initialized analysis data input filename\n", " lread_ana = ?????????? ! Read separate analysis file in addition to dwdfg_filename\n", " ltile_coldstart = .TRUE. ! coldstart for surface tiles\n", " ltile_init = .FALSE. !\n", " lp2cintp_sfcana = .TRUE. ! interpolate surface analysis from global domain onto nest\n", " ana_varnames_map_file = 'ana_varnames_map_file.txt' ! dictionary mapping internal names onto GRIB2 or NetCDF shortNames\n", "/\n", "\n", "&limarea_nml\n", " dtime_latbc = 7200. ! Time difference between two consecutive boundary data\n", " init_latbc_from_fg = .TRUE. ! If .TRUE., take lateral boundary conditions for initial time from FG\n", " itype_latbc = 1 ! Type of lateral boundary nudging\n", " latbc_boundary_grid = '' ! Grid file defining the lateral boundary\n", " latbc_varnames_map_file = 'map_file.latbc' ! dictionary mapping internal names onto GRIB2 or NetCDF shortNames\n", " latbc_path = './' ! path to boundary data\n", " latbc_filename = ?????????? ! 
Filename of boundary data input file in the latbc_path directory.\n", "/\n", "EOF" ] }, { "cell_type": "markdown", "id": "9bfa9128-0b25-428e-98e8-56ba7d8b2599", "metadata": {}, "source": [ "
\n", "Solution\n", "\n", "```\n", "&grid_nml\n", " l_limited_area = .TRUE.\n", "/\n", "&initicon_nml\n", " init_mode = 7\n", " dwdfg_filename = 'init_ML_20210714T000000Z.nc'\n", " lread_ana = .false. \n", "/\n", "\n", "&limarea_nml\n", " latbc_filename = 'forcing_ML_T0000Z_lbc.nc'\n", "/\n", "```\n", "\n", "
" ] }, { "cell_type": "markdown", "id": "3e013936-046a-4214-a4b5-3c98e13f0260", "metadata": {}, "source": [ "### Excursion: ICON dictionaries" ] }, { "cell_type": "markdown", "id": "48cfb8f3-9608-4160-8e49-a42c017697a2", "metadata": {}, "source": [ "ICON makes use of dictionaries during input and output. A dictionary has the purpose to translate the variable names in the GRIB2 or NetCDF input file into the ICON internal variable names (and vice versa). \n", "
\n", "There exist four different dictionaries, i.e.\n", "\n", "- `ana_varnames_map_file` in the namelist `initicon_nml`:
maps the variable names in the initial conditions file to the ICON internal names\n", "- `latbc_varnames_map_file` in the namelist `limarea_nml`:
maps the variable names in the lateral boundary files (LAM) to the ICON internal names\n", "- `output_nml_dict` in the namelist `io_nml`:
maps output variable names specified in the `output_nml` to the ICON internal names\n", "- `netcdf_dict` in the namelist `io_nml`:
mapping from internal names to names written to the NetCDF file" ] }, { "cell_type": "markdown", "id": "7b8fb99b-e07d-416c-8f65-6d0dccbe3388", "metadata": {}, "source": [ "
\n", " Exercise (The input dictionary):\n", "
\n", " Take a closer look at the dictionary ana_varnames_map_file.txt (initicon_nml), which is used in this exercise when reading the initial conditions.
\n", " According to the dictionary, what are the internal variable names and the GRIB2/NetCDF variable names for

\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
VariableGRIB2/NetCDFICON internal
Pressure??????
Temperature??????
Half level height??????
\n", "

    \n", "
  • under which conditions does the dictionary ana_varnames_map_file.txt become obsolete (i.e. which conditions must be fulfilled by the initial conditions file)?
  • \n", "
\n", "
" ] }, { "cell_type": "raw", "id": "83563158-e538-4afe-8236-2181cdb29fd8", "metadata": {}, "source": [ "your answer" ] }, { "cell_type": "markdown", "id": "d852a311-2e68-46da-90c6-60a25bc326f6", "metadata": {}, "source": [ "
\n", "Solution\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
VariableGRIB2/NetCDFICON internal
PressurePpres
TemperatureTtemp
Half level heightHHLz_ifc
\n", "\n", "The dictionary `ana_varnames_map_file` becomes obsolete, if \n", "- the initial conditions are provided in NetCDF format\n", "- the NetCDF variable names are equivalent to the ICON internal variable names\n", "\n", "
\n", "\n", "
" ] }, { "cell_type": "markdown", "id": "19acb4cc-8d7b-4e13-be7f-7a1ae8ae421b", "metadata": {}, "source": [ "Now we add the remaining namelist settings.\n", "\n", "*For a complete list see [Namelist_overview.pdf](../Namelist_overview.pdf)*" ] }, { "cell_type": "code", "execution_count": null, "id": "f1ef6db6-b1a3-43d6-b5eb-0e811300cc37", "metadata": {}, "outputs": [], "source": [ "cat >> NAMELIST_NWP << EOF\n", "\n", "! parallel_nml: MPI parallelization -------------------------------------------\n", "¶llel_nml\n", " nproma = 32 ! loop chunk length\n", " p_test_run = .FALSE. ! .TRUE. means verification run for MPI parallelization\n", " num_io_procs = 1 ! number of I/O processors\n", " num_restart_procs = 0 ! number of restart processors\n", " num_prefetch_proc = 1 ! number of processors for boundary data prefetching\n", " iorder_sendrecv = 3 ! sequence of MPI send/receive calls\n", " num_dist_array_replicas = 4 ! distributed arrays: no. of replications\n", " use_omp_input = .TRUE. ! allows task parallelism for reading atmospheric input data\n", "/\n", "\n", "! io_nml: general switches for model I/O -------------------------------------\n", "&io_nml\n", " dt_diag = 21600.0 ! diagnostic integral output interval\n", " dt_checkpoint = 864000.0 ! time interval for writing restart files.\n", " itype_pres_msl = 5 ! method for computation of mean sea level pressure\n", " itype_rh = 1 ! (1) 2: mixed phase (water and ice)\n", " output_nml_dict = 'dict.output.dwd' ! maps output_nml variable names onto internal ICON names\n", " lflux_avg = .TRUE. ! fluxes are accumulated rather than averaged for output\n", " lmask_boundary = .TRUE. ! mask interpolation zone\n", "/\n", "\n", "! run_nml: general switches ---------------------------------------------------\n", "&run_nml\n", " num_lev = 65 ! number of full levels (atm.) for each domain\n", " lvert_nest = .FALSE. ! vertical nesting\n", " dtime = 60. ! timestep in seconds\n", " ltestcase = .FALSE. ! 
idealized testcase runs\n", " ldynamics = .TRUE. ! compute adiabatic dynamic tendencies\n", " ltransport = .TRUE. ! compute large-scale tracer transport\n", " lart = .FALSE. ! ICON-ART main switch\n", " ntracer = 5 ! number of advected tracers\n", " iforcing = 3 ! forcing of dynamics and transport by parameterized processes\n", " msg_level = 11 ! controls how much printout is written during runtime\n", " ltimer = .TRUE. ! timer for monitoring the runtime of specific routines\n", " timers_level = 10 ! performance timer granularity\n", " output = \"nml\" ! main switch for enabling/disabling components of the model output\n", "/\n", "\n", "! nwp_phy_nml: switches for the physics schemes ------------------------------\n", "&nwp_phy_nml\n", " inwp_gscp = 2 ! cloud microphysics (incl. graupel) and precipitation\n", " inwp_convection = 1 ! convection\n", " lshallowconv_only = .FALSE. !\n", " lgrayzone_deepconv = .TRUE. !\n", " inwp_radiation = 4 ! radiation\n", " inwp_cldcover = 1 ! cloud cover scheme for radiation\n", " inwp_turb = 1 ! vertical diffusion and transfer\n", " inwp_satad = 1 ! saturation adjustment\n", " inwp_sso = 1 ! subgrid scale orographic drag\n", " inwp_gwd = 0 ! non-orographic gravity wave drag\n", " inwp_surface = 1 ! surface scheme\n", " latm_above_top = .TRUE. ! take into account atmosphere above model top for radiation computation\n", " ldetrain_conv_prec = .TRUE. !\n", " efdt_min_raylfric = 7200.0 ! minimum e-folding time of Rayleigh friction\n", " itype_z0 = 2 ! type of roughness length data\n", " icapdcycl = 3 ! apply CAPE modification to improve diurnalcycle over tropical land\n", " icpl_aero_conv = 1 ! coupling between autoconversion and Tegen aerosol climatology\n", " icpl_aero_gscp = 1 ! coupling between autoconversion and Tegen aerosol climatology\n", " icpl_o3_tp = 1 ! coupling between ozone mixing ratio and thermal tropopause\n", " dt_rad = 720 ! time step for radiation in s\n", " dt_conv = 120 ! 
time step for convection in s (domain specific)\n", " dt_sso = 120 ! time step for SSO parameterization\n", " dt_gwd = 120 ! time step for gravity wave drag parameterization\n", " mu_rain = 0.5 ! shape parameter in gamma distribution for rain\n", " rain_n0_factor = 0.1 ! tuning factor for intercept parameter of raindrop size distr.\n", "/\n", "\n", "&nwp_tuning_nml\n", " itune_albedo = 1\n", " tune_box_liq_asy = 4.0\n", " tune_gfrcrit = 0.333\n", " tune_gkdrag = 0.125\n", " tune_gkwake = 0.25\n", " tune_gust_factor = 7.0\n", " itune_gust_diag = 3\n", " tune_gustsso_lim = 20.\n", " tune_minsnowfrac = 0.3\n", " tune_sgsclifac = 1.0\n", " tune_rcucov = 0.075\n", " tune_rhebc_land = 0.825\n", " tune_zvz0i = 0.85\n", " icpl_turb_clc = 2\n", " max_calibfac_clcl = 2.0\n", " tune_box_liq = 0.04\n", " tune_eiscrit = 7.\n", "/\n", "\n", "! turbdiff_nml: turbulent diffusion -------------------------------------------\n", "&turbdiff_nml\n", " tkhmin = 0.5 ! scaling factor for minimum vertical diffusion coefficient\n", " tkhmin_strat = 0.75 ! Scaling factor for stratospheric minimum vertical diffusion coefficient\n", " tkmmin = 0.75 ! scaling factor for minimum vertical diffusion coefficient\n", " tkmmin_strat = 4.0 ! Scaling factor for stratospheric minimum vertical diffusion coefficient\n", " tur_len = 300. ! Asymptotic maximal turbulent distance\n", " pat_len = 750.0 ! effective length scale of thermal surface patterns\n", " q_crit = 2.0 ! critical value for normalized super-saturation \n", " rat_sea = 0.8 ! controls laminar resistance for sea surface\n", " ltkesso = .TRUE. ! consider TKE-production by sub-grid SSO wakes\n", " frcsmot = 0.2 ! these 2 switches together apply vertical smoothing of the TKE source terms\n", " imode_frcsmot = 2 ! in the tropics (only), which reduces the moist bias in the tropical lower troposphere\n", " imode_tkesso = 2 ! mod of calculating the SSO source term for TKE production\n", " itype_sher = 2 ! 
type of shear forcing used in turbulence\n", " ltkeshs = .TRUE. ! include correction term for coarse grids in hor. shear production term\n", " a_hshr = 1.25 ! length scale factor for separated horizontal shear mode\n", " alpha1 = 0.125 ! Scaling parameter for molecular roughness of ocean waves\n", " icldm_turb = 2 ! mode of cloud water representation in turbulence\n", " rlam_heat = 10.0 ! Scaling factor of the laminar boundary layer for heat\n", "/\n", "\n", "! lnd_nml: land scheme switches -----------------------------------------------\n", "&lnd_nml\n", " ntiles = 3 ! number of tiles\n", " nlev_snow = 3 ! number of snow layers\n", " lmulti_snow = .FALSE. ! .TRUE. for use of multi-layer snow model\n", " idiag_snowfrac = 20 ! type of snow-fraction diagnosis\n", " lsnowtile = .TRUE. ! .TRUE.=consider snow-covered and snow-free separately\n", " itype_canopy = 2 ! Type of canopy parameterization\n", " itype_root = 2 ! root density distribution\n", " itype_trvg = 3 ! BATS scheme with add. prog. var. for integrated plant transpiration since sunrise\n", " itype_evsl = 4 ! type of bare soil evaporation\n", " itype_heatcond = 3 ! type of soil heat conductivity\n", " itype_lndtbl = 4 ! table for associating surface parameters\n", " itype_snowevap = 3 ! Snow evap. in vegetated areas with add. variables for snow age and max. snow height\n", " cwimax_ml = 5.e-4 ! scaling parameter for maximum interception parameterization\n", " c_soil = 1.25 ! surface area density of the (evaporative) soil surface\n", " c_soil_urb = 0.5 ! surface area density of the (evaporative) soil surface, urban areas\n", " lseaice = .TRUE. ! .TRUE. for use of sea-ice model\n", " llake = .TRUE. ! .TRUE. for use of lake model\n", " lprog_albsi = .TRUE. ! sea-ice albedo is computed prognostically\n", " sstice_mode = 2 ! SST is updated by climatological increments on a daily basis\n", "/\n", "\n", "! 
radiation_nml: radiation scheme ---------------------------------------------\n", "&radiation_nml\n", " irad_o3 = 79 ! ozone climatology\n", " irad_aero = 6 ! aerosols\n", " islope_rad = 0 ! slope correction for surface radiation\n", " albedo_type = 2 ! type of surface albedo\n", " albedo_whitecap = 1 ! Ocean albedo increase by foam\n", " direct_albedo_water = 3 ! Direct beam surface albedo over water\n", " vmr_co2 = 425.e-06 ! values representative for 2023\n", " vmr_ch4 = 1900.e-09\n", " vmr_n2o = 334.0e-09\n", " vmr_o2 = 0.20946\n", " vmr_cfc11 = 220.e-12\n", " vmr_cfc12 = 490.e-12\n", " ecrad_data_path = \"${RADDIR}\" ! Path to folder containing ecRad optical property files\n", "/\n", "\n", "! nonhydrostatic_nml: nonhydrostatic model -----------------------------------\n", "&nonhydrostatic_nml\n", " iadv_rhotheta = 2 ! advection method for rho and rhotheta\n", " ivctype = 2 ! type of vertical coordinate\n", " itime_scheme = 4 ! time integration scheme\n", " ndyn_substeps = 5 ! number of dynamics steps per fast-physics step\n", " exner_expol = 0.333 ! temporal extrapolation of Exner function\n", " vwind_offctr = 0.2 ! off-centering in vertical wind solver\n", " damp_height = 12250. ! height at which Rayleigh damping of vertical wind starts\n", " rayleigh_coeff = 5.0 ! Rayleigh damping coefficient\n", " divdamp_order = 24 ! order of divergence damping \n", " divdamp_type = 32 ! type of divergence damping\n", " divdamp_fac = 0.004 ! scaling factor for divergence damping\n", " igradp_method = 3 ! discretization of horizontal pressure gradient\n", " l_zdiffu_t = .TRUE. ! specifies computation of Smagorinsky temperature diffusion\n", " thslp_zdiffu = 0.02 ! slope threshold (temperature diffusion)\n", " thhgtd_zdiffu = 125.0 ! threshold of height difference (temperature diffusion)\n", " htop_moist_proc = 22500.0 ! max. height for moist physics\n", " hbot_qvsubstep = 22500.0 ! height above which QV is advected with substepping scheme\n", "/\n", "\n", "! 
sleve_nml: vertical level specification -------------------------------------\n", "&sleve_nml\n", " min_lay_thckn = 20.0 ! layer thickness of lowermost layer\n", " stretch_fac = 0.65 ! stretching factor to vary distribution of model levels\n", " decay_scale_1 = 4000.0 ! decay scale of large-scale topography component\n", " decay_scale_2 = 2500.0 ! decay scale of small-scale topography component\n", " decay_exp = 1.2 ! exponent of decay function\n", " flat_height = 16000.0 ! height above which the coordinate surfaces are flat\n", " itype_laydistr = 1 ! Type of analytical function to specify the distr. of vert. coords\n", " top_height = 22000.0 ! height of model top\n", "/\n", "\n", "! dynamics_nml: dynamical core -----------------------------------------------\n", "&dynamics_nml\n", " iequations = 3 ! type of equations and prognostic variables\n", " divavg_cntrwgt = 0.50 ! weight of central cell for divergence averaging\n", " lcoriolis = .TRUE. ! Coriolis force\n", "/\n", "\n", "! transport_nml: tracer transport ---------------------------------------------\n", "&transport_nml\n", " ivadv_tracer = 3, 3, 3, 3, 3, 3 ! tracer specific method to compute vertical advection\n", " itype_hlimit = 3, 4, 4, 4, 4, 4 ! type of limiter for horizontal transport\n", " ihadv_tracer = 52, 2, 2, 2, 2, 3 ! tracer specific method to compute horizontal advection\n", " llsq_svd = .TRUE. ! use QR decomposition for least squares design matrix\n", " beta_fct = 1.005 ! limiter: boost factor for range of permissible values\n", "/\n", "\n", "! diffusion_nml: horizontal (numerical) diffusion ----------------------------\n", "&diffusion_nml\n", " lhdiff_vn = .TRUE. ! diffusion on the horizontal wind field\n", " lhdiff_temp = .TRUE. ! diffusion on the temperature field\n", " hdiff_order = 5 ! order of nabla operator for diffusion\n", " itype_vn_diffu = 1 ! reconstruction method used for Smagorinsky diffusion\n", " itype_t_diffu = 2 ! 
discretization of temperature diffusion\n", " hdiff_efdt_ratio = 24.0 ! ratio of e-folding time to time step \n", " hdiff_smag_fac = 0.025 ! scaling factor for Smagorinsky diffusion\n", "/\n", "\n", "! interpol_nml: settings for internal interpolation methods ------------------\n", "&interpol_nml\n", " nudge_zone_width = 10 ! Total width (in units of cell rows) for lateral boundary nudging zone\n", " nudge_max_coeff = 0.075 ! Maximum relaxation coefficient for lateral boundary nudging\n", " lsq_high_ord = 3 ! least squares polynomial order\n", " support_baryctr_intp = .FALSE. !.TRUE. ! .TRUE.: barycentric interpolation support for output\n", "/\n", "\n", "! gridref_nml: grid refinement settings --------------------------------------\n", "&gridref_nml\n", " denom_diffu_v = 150. ! denominator for lateral boundary diffusion of velocity\n", "/\n", "\n", "! extpar_nml: external data --------------------------------------------------\n", "&extpar_nml\n", " itopo = 1 ! topography (0:analytical)\n", " itype_lwemiss = 2 ! requires updated extpar data\n", " itype_vegetation_cycle = 1 ! specifics for annual cycle of LAI\n", " extpar_filename = 'extpar_DOM.nc' ! filename of external parameter input file\n", " n_iter_smooth_topo = 1 ! iterations of topography smoother\n", " heightdiff_threshold = 2250. ! threshold above which additional nabla2 diffuion is applied\n", " hgtdiff_max_smooth_topo = 750. ! see Namelist doc\n", " read_nc_via_cdi = .TRUE. ! read NetCDF input data via CDI library\n", "/\n", "\n", "&output_nml\n", " ! ----------------------------------------------- !\n", " ! --- ICON-LAM: output fields - lat/lon grid --- !\n", " ! ----------------------------------------------- !\n", " filetype = 4 ! output format: 2=GRIB2, 4=NETCDFv2\n", " dom = 1 ! write lam domain\n", " output_time_unit = 1 ! 1: seconds\n", " output_bounds = 0., 10000000., 3600. ! start, end, increment [s]\n", " steps_per_file = 1\n", " mode = 1 ! 
1: forecast mode (relative t-axis), 2: climate mode (absolute t-axis)\n", " include_last = .true.\n", " output_filename = '$OUTDIR/ilfff' ! file name base\n", " filename_format = ''\n", " ml_varlist = 'pres_sfc', 'rh_2m', 't_2m', 'tqv_dia', 'tqc_dia', 'gust10', 'tot_prec', 'asodird_s', 'cape', 'alhfl_s', 'ashfl_s'\n", " remap = 1 ! 0: icon grid, 1: lat-lon\n", " reg_lon_def = 1.0, 0.05, 16.6\n", " reg_lat_def = 44.1, 0.05, 57.5\n", "/\n", "EOF" ] }, { "cell_type": "markdown", "id": "7223314d-dd0d-43ef-a796-1ec1936c2395", "metadata": { "tags": [] }, "source": [ "## The ICON-LAM batch job\n", "\n", "---" ] }, { "cell_type": "markdown", "id": "313c5018-1209-44d1-87f4-68f8142ae9a0", "metadata": { "tags": [] }, "source": [ "The ICON-LAM batch job shown below summarizes all steps which are necessary for **running ICON-LAM on the DKRZ Levante** platform. \n", "Here we assume that all mandatory input fields are available. \n" ] }, { "cell_type": "code", "execution_count": null, "id": "59b2c86f-8746-4900-98b6-3083aadb62d4", "metadata": { "tags": [] }, "outputs": [], "source": [ "cat > $EXPDIR/icon-lam.sbatch << 'EOF'\n", "#!/bin/bash\n", "#SBATCH --job-name=LAM_test\n", "#SBATCH --partition=compute\n", "#SBATCH --nodes=4\n", "#SBATCH --ntasks-per-node=128\n", "#SBATCH --output=slurm.%j.out\n", "#SBATCH --exclusive\n", "#SBATCH --mem-per-cpu=960\n", "#SBATCH --time=00:15:00\n", "\n", "### ENV ###\n", "env\n", "set -xe\n", "\n", "unset SLURM_EXPORT_ENV \n", "unset SLURM_MEM_PER_NODE\n", "unset SBATCH_EXPORT\n", "\n", "\n", "# limit stacksize ... 
adjust to your programs need\n", "# and core file size\n", "ulimit -s 204800\n", "ulimit -c 0\n", "ulimit -l unlimited\n", "\n", "export SLURM_DIST_PLANESIZE=\"32\"\n", "export OMPI_MCA_btl=\"self\"\n", "export OMPI_MCA_coll=\"^ml,hcoll\"\n", "export OMPI_MCA_io=\"romio321\"\n", "export OMPI_MCA_osc=\"ucx\"\n", "export OMPI_MCA_pml=\"ucx\"\n", "export UCX_HANDLE_ERRORS=\"bt\"\n", "export UCX_TLS=\"shm,dc_mlx5,dc_x,self\"\n", "export UCX_UNIFIED_MODE=\"y\"\n", "export MALLOC_TRIM_THRESHOLD_=\"-1\"\n", "export OMPI_MCA_pml_ucx_opal_mem_hooks=1\n", "\n", "export OMP_NUM_THREADS=1\n", "\n", "module load eccodes\n", "export ECCODES_DEFINITION_PATH=/pool/data/ICON/ICON_training/eccodes/definitions.edzw-2.27.0-1:$ECCODES_DEFINITION_PATH\n", "\n", "export MODEL=$ICONDIR/build/bin/icon\n", "\n", "srun -l --cpu_bind=verbose --hint=nomultithread --distribution=block:cyclic $MODEL\n", "\n", "EOF\n" ] }, { "cell_type": "markdown", "id": "3b646c83-11eb-4502-ae61-911b22c3f079", "metadata": {}, "source": [ "Submit the job to the HPC cluster, using the Slurm command `sbatch`." ] }, { "cell_type": "code", "execution_count": null, "id": "4703ede8-e217-4854-bb89-531be7b987c0", "metadata": { "tags": [] }, "outputs": [], "source": [ "export ICONDIR=$ICONDIR\n", "cd $EXPDIR && sbatch --account=$SLURM_JOB_ACCOUNT icon-lam.sbatch" ] }, { "cell_type": "markdown", "id": "e47a4068-8f0c-45bb-9eaf-e77ce9f5dcfd", "metadata": {}, "source": [ "
\n", " Exercise: \n", " Investigate the Slurm settings of the batch script. How many compute nodes and MPI tasks (in total) are used for this ICON-LAM run?\n", "
      
\n", " \n", "
" ] }, { "cell_type": "raw", "id": "75efeb1a-f03f-453e-bc96-52653e1ba61e", "metadata": {}, "source": [ "" ] }, { "cell_type": "markdown", "id": "172d2552-3d1d-41ff-9761-99dfce410238", "metadata": { "tags": [] }, "source": [ "
\n", "Solution\n", "\n", "**Exercise:** 4 nodes with 512 mpi tasks\n", "\n", "
\n", "\n", "
" ] }, { "cell_type": "markdown", "id": "9f86e76d-eb85-4fb9-b119-e211f95477e0", "metadata": {}, "source": [ "
\n", " Optional Exercise (Getting used to ICON dictionaries):\n", "
\n", " Experience has shown that many users find it difficult to set the aforementioned dictionaries, such as the output dictionary \n", "
output_nml_dict = 'dict.output.dwd' (namelist io_nml).
\n", " This particular dictionary is currently not in use, as we have been using ICON internal variable names for specifying the output variables:
\n", "
\n", " ml_varlist = 'pres_sfc', 'rh_2m', 't_2m', 'tqv_dia', 'tqc_dia', 'gust10', 'tot_prec', 'asodird_s', 'cape', 'alhfl_s', 'ashfl_s'\n", "

\n", " In some cases, e.g. for operational forecast runs, it turned out to be beneficial to use GRIB2 shortnames rather than ICON internal variable names for specifying the output variables.\n", "
\n", " Try to make use of the dictionary as follows:\n", "
    \n", "
  • in the ml_varlist, replace the ICON internal variable names by the corresponding DWD GRIB2 shortnames (see Table A in the ICON tutorial book, Appendix A, for the GRIB2 shortnames)
  • \n", "
  • add missing dictionary entries to dict.output.dwd in $EXPDIR
  • \n", "
  • Re-run the model in order to check whether the dictionary works.
    If your dictionary is wrong or incomplete, you will get a runtime error.
  • \n", "
\n", "
" ] }, { "cell_type": "raw", "id": "fca35c5b-3747-4306-bf62-4c49b8ab4694", "metadata": {}, "source": [ "" ] }, { "cell_type": "markdown", "id": "723244a4-68ad-4d65-b530-92a0775affb9", "metadata": { "tags": [] }, "source": [ "
\n", "Solution\n", "\n", "```\n", "ml_varlist = 'PS', 'RELHUM_2M', 'T_2M', 'TQV_DIA', 'TQC_DIA', 'VMAX_10M', 'TOT_PREC', 'ASWDIR_S', 'CAPE_CON', 'ALHFL_S', 'ASHFL_S'\n", "\n", "Additional dictionary entries:\n", "\n", "## GRIB2 shortName internal name\n", "ASWDIR_S asodird_s\n", "CAPE_CON cape\n", "```\n", "\n", "
\n", "\n", "
" ] }, { "cell_type": "markdown", "id": "9aac54de-f1c9-480c-8dc0-b32aea85a75a", "metadata": { "tags": [] }, "source": [ "## Visualizing the model output (the Ahr valley flood)\n", "\n", "---\n", "
\n", "\"alt_text\"/\n", "
Left – precipitation accumulated over two days (48-hour accumulation, 13 July 2021, 00:00 UTC – 15 July 2021, 00:00 UTC).
\n", " Right – Precipitation accumulated over 24 hours for each of the individual days of the extreme precipitation event.
\n", "
\n", "\n", "As some of you might have recognized already, the chosen forecast date coincides with the date of the **Ahr valley flood event**. \n", "This heavy rainfall event occurred on July 14-15 near the boarder of Germany and Belgium and tragically caused many fatalities and significant damage to the infrastructure.\n", "The observed accumulated precipitation during that period is depicted in the figure to the right. \n", "\n", "An in-depth analysis of this event is, of course, beyond the scope of this exercise. However, we will have a look into our forecast result and check, if our very simplified setup \n", "was already capable of catching the flood event. To this end, we will visualize the simulated precipitation in our LAM domain and compare with the observations.\n", "\n", "Please check if the batch job queue is empty (`squeue`). Alternatively, you may have a final look at the ICON-LAM log output, to check if the run has finished successfully. \n", "To do so, please navigate to your experiment directory `$EXPDIR` and open the file `slurm..out`." ] }, { "cell_type": "markdown", "id": "75d3de78-9725-4cd5-a0af-850d606a0945", "metadata": {}, "source": [ "\n", " \n", " \n", " \n", " \n", "
\n", "
\n", " Exercise (Visualization):
\n", " Visualize the total column integrated water vapour and cloud water by running the script \n", " scripts/icon_exercise_lam_plot_tqv.ipynb. You may adapt the script in order to visualize different dates.\n", "
\n", "
\n", "
\n", " \"alt_text\"/\n", "
\n", " Simulated total column integrated water vapour at July 15 00UTC\n", "
\n", "
\n", "
\n" ] }, { "cell_type": "markdown", "id": "4b16505b-41fc-4c3a-adc7-948793fc724c", "metadata": {}, "source": [ "
\n", " Exercise (Visualization):\n", "
\n", " Visualize the 48h accumulated precipitation (2021-07-14 00UTC - 2021-07-16 00UTC) by running the script scripts/icon_exercise_lam_plot_totprec.ipynb. \n", "
\n" ] }, { "cell_type": "markdown", "id": "f080d0d5-07b4-4910-a1ef-ffd1ff16afbc", "metadata": { "tags": [] }, "source": [ "## Temporal Resolution of the Boundary Data\n", "\n", "---\n", "\n", "By changing the temporal resolution of your boundary data, you will get some idea how this might affect the quality of your simulation results.\n", "\n", "We suggest to create a copy of your experiment directory `$EXPDIR`, named `${EXPDIR}_orig`, in order to avoid overwriting your previous results. To do so, please execute the following cell:\n" ] }, { "cell_type": "code", "execution_count": null, "id": "63373d58-cdf7-4576-a613-a7fea1b7a048", "metadata": {}, "outputs": [], "source": [ "cp -r ${EXPDIR} ${EXPDIR}_orig" ] }, { "cell_type": "markdown", "id": "636c82ec-8a9d-4cc8-844b-9deeaadb2bf4", "metadata": {}, "source": [ "
\n", " Exercise (Boundary data update frequency):\n", "
    \n", "
  • Halve the temporal resolution of your forcing (boundary) data, i.e. double the boundary update interval; see Section 6.4 of the ICON Tutorial for the corresponding namelist parameter. Write down the namelist parameter and your chosen value.
  • \n", "
\n", "We suggest to edit the Namelist ${EXPDIR}/NAMELIST_NWP manually in a terminal.\n", "
" ] }, { "cell_type": "raw", "id": "d1bf11e6-08f6-4c1b-affc-a94dc868e20d", "metadata": {}, "source": [ "your answer" ] }, { "cell_type": "markdown", "id": "6705f458-6c74-459b-adaa-fe7541953f16", "metadata": {}, "source": [ "
\n", "Solution\n", "\n", "Answer:\n", "\n", "- Namelist parameter: `dtime_latbc` (`limarea_nml`)\n", "- old value: 7200
**new value: 14400s**\n", "\n", "
\n", "\n", "
" ] }, { "cell_type": "markdown", "id": "d23d73ff-306c-45d8-88eb-6d1678b4f0f0", "metadata": {}, "source": [ "Submit the job to the HPC cluster, using the Slurm command `sbatch`." ] }, { "cell_type": "code", "execution_count": null, "id": "e73a6180-c9a9-46d8-a285-15ef937df997", "metadata": {}, "outputs": [], "source": [ "export ICONDIR=$ICONDIR\n", "cd $EXPDIR && sbatch --account=$SLURM_JOB_ACCOUNT icon-lam.sbatch" ] }, { "cell_type": "markdown", "id": "a0f913c7-4445-4d55-9c97-e74c77bda9f9", "metadata": {}, "source": [ "
\n", " Exercise:\n", "
    \n", "
  • Compare the results with your previous run. Does the boundary update frequency have a significant impact on the results?
  • \n", "
\n", "You can make use of the same plotting scripts as before \n", " \n", "
" ] }, { "cell_type": "raw", "id": "b847afb3-86a5-49b6-bf89-ce03e8292e29", "metadata": {}, "source": [ "your answer" ] }, { "cell_type": "markdown", "id": "4aff9a10-d4b5-4f3b-98d7-6833288f09a0", "metadata": {}, "source": [ "---" ] }, { "cell_type": "markdown", "id": "7c3bfe76-b650-498c-b7ca-c230cb25f2b2", "metadata": { "tags": [] }, "source": [ "

Congratulations! You have successfully completed Exercise 4.

" ] }, { "cell_type": "markdown", "id": "bc759a8d-52f7-49ec-b6cd-026786798869", "metadata": {}, "source": [ "---" ] }, { "cell_type": "markdown", "id": "7e39c53c-3467-4907-84f8-b01118f2c68e", "metadata": {}, "source": [ "## Further Reading and Resources\n", "\n", "- ICON Tutorial: https://www.dwd.de/DE/leistungen/nwv_icon_tutorial/nwv_icon_tutorial.html\n", "
A new draft version of the ICON Tutorial is available here: https://icon-training-2025-scripts-rendering-cc74a6.gitlab-pages.dkrz.de/index.html. It is currently being finalized and will be published soon.\n" ] }, { "cell_type": "markdown", "id": "cd838943-ef91-4a02-bf65-9fae0547457c", "metadata": {}, "source": [ "---\n", "\n", "*Author info: Deutscher Wetterdienst (DWD) 2025 :: icon@dwd.de. For a full list of contributors, see CONTRIBUTING in the root directory. License info: see LICENSE file.*" ] } ], "metadata": { "kernelspec": { "display_name": "Bash", "language": "bash", "name": "bash" }, "language_info": { "codemirror_mode": "shell", "file_extension": ".sh", "mimetype": "text/x-sh", "name": "bash" } }, "nbformat": 4, "nbformat_minor": 5 }