How to Develop an MPI Application Based on the SimpleGrid Package

If you want to develop a gateway application that runs MPI code on TeraGrid, you can do so by making minor changes to the SimpleGrid sample application code. We illustrate the steps using an example MPI program, with the Abe cluster at NCSA as our example TeraGrid cluster.

Example MPI code

The following is a simple MPI program that initializes the MPI environment, writes its own rank to stdout, stderr, and a per-rank file, and then exits.

/* mympi.c */
#include <stdio.h>
#include <stdlib.h>
#include "mpi.h"
int main(int argc, char ** argv) {
        int myrank;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &myrank);
        fprintf(stdout, "my rank is %d\n", myrank);
        fprintf(stderr, "my rank is %d\n", myrank);
        char fn[128];
        sprintf(fn, "proc%d.out", myrank);
        FILE *f = fopen(fn, "w");
        if (f != NULL) {
                fprintf(f, "my rank is %d\n", myrank);
                fclose(f);
        }
        MPI_Finalize();
        return 0;
}
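
Each rank names its output file with sprintf("proc%d.out", myrank), so an 8-core run leaves proc0.out through proc7.out in the working directory. The naming scheme can be previewed in the shell (8 ranks assumed here):

```shell
# Per-rank output file names produced by mympi.c's
# sprintf(fn, "proc%d.out", myrank), for ranks 0..7:
for myrank in 0 1 2 3 4 5 6 7; do
    printf 'proc%d.out\n' "$myrank"
done
# prints proc0.out, proc1.out, ..., proc7.out (one per line)
```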

Deploy MPI on TeraGrid

Log in to the Abe cluster and check which MPI implementations and compilers are available. Software on Abe is managed with SoftEnv; its package listing includes entries such as:

    +openmpi-1.2.2-1-gcc           Open Message Passing Interface 1.2.2-1 with
    +openmpi-1.2.2-1-intel         Open Message Passing Interface 1.2.2-1 with
    +openmpi-1.2.4-gcc             Open Message Passing Interface 1.2.4 with Gnu compiler
    +openmpi-1.2.4-intel           Open Message Passing Interface 1.2.4 with Intel compiler
    +openmpi-1.3.2-gcc             OpenMPI 1.3.2 w/ GNU compilers, BLCR support,
    +openmpi-1.3.2-intel           OpenMPI 1.3.2 w/ Intel compilers, BLCR
    +pgi-10.0                      PGI 10.0 compilers
    +pgi-10.8                      PGI 10.8 compilers
    +pgi-10.9                      PGI 10.9 compilers
    +pgi-11.1                      PGI 11.1 compilers
# MPI compiler
$ which mpicc
# mpirun command
$ which mpirun
# Intel compiler
$ which icc
$ cd $HOME/test
$ mpicc -o mympi mympi.c

Create a PBS batch script, mympi.qsub:

#PBS -N mympi_test
#PBS -o mympi.out
#PBS -e mympi.err
#PBS -S /bin/bash
#PBS -l nodes=1:ppn=8,walltime=00:01:00

wdir=$HOME/scratch-global/mympi_test
if [ ! -d "$wdir" ]; then
        mkdir -p $wdir
fi
cd $wdir
# PBS lists the allocated cores, one hostname per core, in $PBS_NODEFILE
MPI_MACHINEFILE=$PBS_NODEFILE
MPI_NP=`cat $MPI_MACHINEFILE | wc | awk '{print $1}'`
mpirun -np $MPI_NP -machinefile $MPI_MACHINEFILE $HOME/test/mympi

Note: the above script requests 8 cores on the Abe cluster for 1 minute to run mympi, creates the working directory $HOME/scratch-global/mympi_test, and calls mpirun to execute the mympi binary.
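
The core-count logic in the script can be exercised on its own. In the sketch below a temporary file stands in for the PBS machine file ($PBS_NODEFILE), which lists one hostname per allocated core; the node name abe0001 is made up for illustration:

```shell
# Stand-in for the PBS machine file: nodes=1:ppn=8 yields the same
# hostname repeated 8 times (abe0001 is a made-up node name).
MPI_MACHINEFILE=$(mktemp)
for i in 1 2 3 4 5 6 7 8; do
    echo abe0001 >> "$MPI_MACHINEFILE"
done
# Same line-count logic as in mympi.qsub:
MPI_NP=`cat $MPI_MACHINEFILE | wc | awk '{print $1}'`
echo "$MPI_NP"    # prints 8, the -np value passed to mpirun
rm -f "$MPI_MACHINEFILE"
```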

$ qsub mympi.qsub
This job will be charged to account: cwa (TG-SES090019)
$ qstat
Job id                    Name             User            Time Use S Queue
------------------------- ---------------- --------------- -------- - -----
3887514.abem5             mympi_test       gisolve                0 R normal  

After the job finishes, qstat returns nothing for it. Then check mympi.out and mympi.err for the code's output, and the per-rank proc*.out files in the working directory.

Run your MPI code from science gateway: command line

The above steps illustrate how to run your MPI code on a TeraGrid cluster. Now that you have a science gateway (e.g., the SimpleGrid science gateway), we illustrate how to run mympi from the science gateway machine where the SimpleGrid package is installed. We use the SimpleGrid tutorial server as the science gateway server.

$ /opt/software/
$ . /opt/software/
$ . ./simplegrid2/etc/
$ myproxy-logon -l gisolve -s -t 12
Enter MyProxy pass phrase:
A credential has been received for user gisolve in /tmp/x509up_u546.

The above command fetches a 12-hour Grid proxy from the TeraGrid MyProxy server.

Create an RSL file, mympi.rsl. The following is a sketch: the executable and stdout paths must be adjusted to your own home directory on Abe, and the (arguments=...) clause is needed only if your code takes command-line arguments:

& (executable=$(HOME)/test/mympi)
  (jobType=mpi)
  (count=8)
  (hostCount=1)
  (stdout=$(HOME)/test/gateway-mpi.out)
  (arguments="arg1" "arg2")

Note: the above RSL asks for 8 cores on 1 machine to run the mympi executable that was built on the Abe cluster in the previous steps.

$ globusrun -r -f mympi.rsl 
globus_gram_client_callback_allow successful
GRAM Job submission successful

Note: the argument to -r is the Globus GRAM contact of the Abe cluster, used for job submission. The output of this MPI job, as specified in the RSL file, is written to $HOME/test/gateway-mpi.out on the Abe cluster; the gateway can then transfer output files from Abe to the gateway data server using GridFTP (command: globus-url-copy).

Run your MPI code from science gateway: gateway application

SimpleGrid provides an example gateway application called DMS. Once you understand how DMS was developed from a desktop code into a TeraGrid-enabled gateway application using the SimpleGrid API, only the following source code files need to be changed to run mympi:

# change the RSL template to specify job type 'mpi'
# change the application Java class to hold your business logic
# change the main application class to streamline data transfer, job submission/monitoring, and result fetching
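
For instance, the first change amounts to editing the job type attribute in the RSL template. The template line below is illustrative only, not SimpleGrid's actual template; the substitution itself is the point:

```shell
# Hypothetical RSL template line; SimpleGrid's real template differs.
template='& (executable=$(HOME)/test/mympi)(jobType=single)(count=8)'
# Switch the job type so GRAM launches the executable through mpirun:
mpi_rsl=$(printf '%s' "$template" | sed 's/jobType=single/jobType=mpi/')
echo "$mpi_rsl"    # ...(jobType=mpi)...
```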