
ICON Training - Hands-on Session

Exercise 6.2: ICON ComIn, Part II#


The ICON Community Interface (ComIn) enables the integration of plugins, a flexible way to extend the code with your own routines and variables at run time. This part continues the practical exercise from the previous section: more advanced features are now added to the Python plugin from “Step 1”:

  • Step P2: We learn how to perform MPI data gathering.

Step P2: MPI Gathering - Explanation of the Algorithm#

Using the MPI library for parallel communication is rather complex for newcomers. It requires dealing with the partitioning of data arrays across processes, which means translating between global and process-local indices. For this reason, rather than explaining this mechanism directly in the ComIn plugin, we begin with a “dry run” in this Python-based Jupyter notebook.

Exercise: We simulate how data distributed across MPI processes can be gathered back into a single array. We will do so without using MPI or ComIn. This helps build an intuition for the gather operation in parallel computing.

To complete this task, please follow the steps outlined in the next cells below.

Figure: schematic of gathering process-local arrays back into one global array (drawing_comIn_mpi_exercise.excalidraw01.png)

Consider the following global array:

import numpy as np
my_array_global = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100])

Assume it’s distributed across 2 processes. Therefore we have a local array on each process.

nranks = 2
local_arrays = np.array_split(my_array_global, nranks)
ncells_glb = len(my_array_global)

Exercise A: Use an appropriate Python command to determine the global indices that each process owns.

Solution

The global indices “owned” by the two processes are

  • [0, 1, 2, 3, 4] for process #0

  • [5, 6, 7, 8, 9] for process #1

global_idx = np.array_split(np.arange(ncells_glb), nranks)

Exercise B: Take the perspective of process 0 and verify that the data it received corresponds to the correct portion of the original global array.

Solution
assert np.all(my_array_global[global_idx[0]] == local_arrays[0]), "Mismatch for rank 0"

Exercise C: “Simulate” the gather operation with a simple for i in range(nranks) ... and confirm that the result matches the original global array.

Solution
# each "rank" contributes its global indices together with its local data
gathered_data = [(global_idx[i], local_arrays[i]) for i in range(nranks)]
idx, val = zip(*gathered_data)

# place every chunk at its global indices in the result array
result = np.empty(ncells_glb, dtype=my_array_global.dtype)
result[np.concatenate(idx)] = np.concatenate(val)

assert np.all(result == my_array_global), "Gathered array does not match the global array"
print(f"{result=}")

Performing an MPI gathering operation in the ComIn plugin#

Let us now apply this gather operation to the actual ComIn Python script, which is attached to the ICON model.

Exercise: When working with ComIn plugins in ICON, you access variables locally, meaning each MPI process only sees its own subset of data. To reconstruct a global array, you need to perform a gather operation - similar to what you did in the previous exercise.

Instead of implementing the gather manually, it is better (and simpler) to use the gather functionality provided by the MPI library.
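As a minimal sketch of this idea (outside ICON and ComIn, with MPI.COMM_WORLD standing in for the host model communicator), gathering per-rank chunks on rank 0 with mpi4py can look like this:

from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD          # stand-in for the host model communicator
rank = comm.Get_rank()

# illustrative local data: rank 0 holds [10..50], rank 1 holds [60..100]
local_array = np.arange(1, 6) * 10 + rank * 50

# gather() collects the per-rank chunks as a Python list on the root process
chunks = comm.gather(local_array, root=0)
if rank == 0:
    global_array = np.concatenate(chunks)
    print(global_array)

Run with two processes, e.g. mpirun -np 2 python gather_sketch.py (the file name is just an example).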

In this task, you will run a pre-written ComIn plugin, scripts/comin_plugin_P2.py, which demonstrates the same concept you explored above, now in an actual MPI-based plugin. All parts of this script are already given (a rough sketch of the overall pattern follows the list):

  • It accesses the variable pres_sfc from ICON.

  • It retrieves the host MPI communicator using the mpi4py Python package.

  • It gathers the distributed data array on rank 0.
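For orientation, here is a hedged sketch of how these three steps fit together. The ComIn-specific parts (access to pres_sfc, its global cell indices, and the host communicator) are only indicated by placeholder variables; the authoritative implementation is the pre-written scripts/comin_plugin_P2.py.

from mpi4py import MPI
import numpy as np

# Placeholder: in the real plugin the host communicator is obtained through
# ComIn (a Fortran communicator handle can be wrapped with MPI.Comm.f2py());
# here we simply use COMM_WORLD as a stand-in.
host_comm = MPI.COMM_WORLD
rank = host_comm.Get_rank()

# Placeholders: the local slice of pres_sfc and the global cell indices owned
# by this rank; in the real plugin both come from ICON via ComIn.
pres_sfc_local = np.full(5, 1000.0 + rank)
global_idx_local = np.arange(rank * 5, (rank + 1) * 5)

# Gather the values and their global indices on rank 0 ...
values = host_comm.gather(pres_sfc_local, root=0)
indices = host_comm.gather(global_idx_local, root=0)

# ... and place every chunk at its global positions, as in the dry run above.
if rank == 0:
    ncells_glb = sum(len(i) for i in indices)
    pres_sfc_glb = np.empty(ncells_glb)
    pres_sfc_glb[np.concatenate(indices)] = np.concatenate(values)
    print(pres_sfc_glb)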

Running the ICON model#

import os
import subprocess

user = os.environ['USER']
home = os.environ['HOME']
scratchdir = f"/scratch/{user[0]}/{user}"
icondir = f"/pool/data/ICON/ICON_training/icon"
expdir = f"{scratchdir}/exercise_comin/P2"

%env ICONDIR={icondir}
%env EXPDIR={expdir}
%env SCRATCHDIR={scratchdir}

Run the setup script, add the ComIn plugin to the namelist, and submit the job

!bash $HOME/icon-training-scripts/exercise_comin/prepared/prepare_icon_run.sh
# Append plugin block
with open(f"{expdir}/NAMELIST_ICON", 'a') as f:
    f.write(f"""&comin_nml
   plugin_list(1)%name           = "comin_plugin"
   plugin_list(1)%plugin_library = "{icondir}/build/externals/comin/build/plugins/python_adapter/libpython_adapter.so"
   plugin_list(1)%options        = "{home}/icon-training-scripts/exercise_comin/scripts/comin_plugin_P2.py"
/
""")
# Submit job
!cd $EXPDIR && sbatch --account=$SLURM_JOB_ACCOUNT --export=ICONDIR $EXPDIR/icon-lam.sbatch

!squeue -u $USER


Further Reading and Resources#


Author info: Deutscher Wetterdienst (DWD) 2025 :: icon@dwd.de. For a full list of contributors, see CONTRIBUTING in the root directory. License info: see LICENSE file.