Instructions on how to compile and run the UCLA GCM


1) Setup

This document contains the information needed to compile, load and run the UCLA atmospheric general circulation model (AGCM). Instructions on how to set up a coupled GCM (atmosphere and ocean) run are also included. The code (which is 99% FORTRAN) resides in the directory camel_7.2p and the directories below it (for example, the agcm and esm directories).

To compile the code, there are several files which MAY need to be modified to each user's requirements. In the camel_7.2p directory there is a file called "cshrc.setup" in which the full directory path to the code must be specified (through an environment variable called CAMILLEHOME). In the directory "camel_7.2p/include" there are two files which set options for compiling and linking the code. These files are "UserOptions.h" and "config.h".

In "UserOptions.h", one sets the options for compiling the AGCM code. Two variables control the inclusion of code for coupling other models to the UCLA AGCM. The variable USE_ESMF controls whether or not the AGCM code is compiled in a form suitable for coupling to other gridded components using the ESMF package. If both USE_ESMF and OD_Package are set to 0, the AGCM will run stand-alone with prescribed conditions over ocean points. If the variable OD_Package is not set to zero, the AGCM code is modified for coupling to a different ocean general circulation model (OGCM) using the UCLA AGCM coupler or DDB. If either USE_ESMF or OD_Package is nonzero, the resulting executable must be built by linking to a suitable OGCM, and the two models will run coupled together. In "UserOptions.h", one also sets the type of machine to run on, the dynamic memory allocation syntax, and the message passing software to use, via the variables ARCH_OPTION, MEM_OPTION and MSG_OPTION. A portion of this file is reproduced below:
/* Choose architecture for current machine. */
#define ARCH_OPTION ARCH_DEC    /* Current ARCH_OPTION choices:
                                   ARCH_SGI_O2, ...             */
/* Choose memory management macro for current machine. */
#define MEM_OPTION MEM_SUN4     /* Current MEM_OPTION choices: ... */
/* Choose message passing macro for current machine. */
#define MSG_OPTION MSG_MPI      /* Current MSG_OPTION choices: ... */

and shows these variables as they are set to run the code on the DEC/HP Tru64 system. Only certain combinations of these variables are appropriate and configured (for example, ARCH_SUN does not work with MSG_IBM).

In the "config.h" file, one must choose the compiler and loader to be used, along with the compiler and loader flags. One such configuration, appropriate for the DEC/HP Alpha cluster, is as follows:

DEC Alpha Workstation

#define RUN_RANLIB 1
AR = ar r

#if (Debug_Option == 1)
CC = cc
F77 = f90
LDR = f90
#endif

#if (Optimization_Option == 1)
CC = cc
F77 = f90
LDR = f90
MAKE = make
#endif

#if ((MSG_OPTION == MSG_MPI))
CC = cc
F77 = /library/mpich/bin/mpif90
LDR = /library/mpich/bin/mpif90
MAKE = make
#endif

In addition, one must set the paths for the communications and other libraries that are needed. In the example below, ESMF, MPI (mpich), an ocean model (POP) and NetCDF are used.

OTHLibs  = -L/s/spahr/pop.popesmf/work -locean \
           -L/s/spahr/ESMF/esmf_2_0_1/lib/libO/OSF1.default.64.default -lesmf \
           -L/library/netcdf/lib -lnetcdf -lc
INCLUDES = -I/library/mpich/include
MODLibs  = -I/s/spahr/ESMF/esmf_2_0_1/mod/modO/OSF1.default.64.default \
           -I/s/spahr/pop.popesmf/work/compile/ObjDepends \
           -I$(CAMILLEHOME)/esm/control

These files will need to be changed when moving to another machine.


2) Compiling and Linking

Prior to compiling and linking the model, an appropriate setup for the platform to be used, the library locations, and the specification of compilers and other utilities must be completed. Don't forget to source "cshrc.setup" in the camel_7.2p directory every time you log on or open a new window. Once the required changes have been made to these files, the code can be compiled (and linked) by typing "mkmf" followed by "make" in the camel_7.2p directory, or linked only by typing "make link". Typing "make" in any directory under the camel_7.2p directory will compile the source code in that directory only. Other models are included by specifying the path and name of their archive file (.a file) in the config.h file (e.g. -L/s/spahr/pop.popesmf/work -locean). All other models must be compiled, and their archive files created, before the AGCM's link step. If another model is modified, you can link it together with the AGCM by typing "make link" in the camel_7.2p directory.


3) Running the AGCM or an ESM coupled system

Before the code can be run, all of the model's input control/namelist files need to be set up for the run to be made and put in the run directory, which by default is camel_7.2p/bin. AGCM input (including the length of the simulation, names of I/O files, etc.) is given through the file "esminput". All data files need to be specified in the appropriate input control file, and the files must be accessible at run time. After linking, the resulting executable (called "camel2" by default) is in the "camel_7.2p/bin" directory. It can be run there by executing the script file called "camelrun".

Any questions about this code distribution, how to run the model, or access to required data files can be addressed to Joseph Spahr.


Description for the UCLA AGCM output files

     This document includes a description of the model's history tapes, and should be all that is needed by those who will not run the model, but simply use the results of existing runs.


    The MAIN history contains global data, including the instantaneous values of all the prognostic variables used for re-starting the model. This history is typically written every twelve or thirty-six hours. All data on these three tapes is kept in the model's sigma coordinate and C-grid, but a post-processing package is available to interpolate data on the MAIN history to pressure coordinates. The use of this package (FIELDS) and the PRESSURE history it produces will also be described below. We will refer to all data written at one time as a "day-time-group" (DTG). For each of the four types of histories we will describe all possible DTGs, the sequence in which they appear on each physical tape, and the way tapes are concatenated to form a history. The following naming convention is used for the output files:
    For the MAIN history, tapes are named EeeeFfff, where eee stands for a user-specified experiment number and fff for the sequence number of the tape (e.g. E045F003 is the third MAIN tape produced by experiment 45). At the National Center for Atmospheric Research (NCAR), the tape naming is done automatically by the model each time it disposes data.
    (On the system at NCAR the "tapes" are really datasets on MASTOR, but these are treated exactly as though they were tapes, and we will continue to refer to them as such throughout this document.)
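In code, the naming convention amounts to zero-padding both numbers to three digits. A minimal Python sketch (the function name is ours, not part of the model):

```python
def tape_name(experiment: int, sequence: int) -> str:
    """Build a MAIN-history tape name of the form EeeeFfff."""
    return f"E{experiment:03d}F{sequence:03d}"

# The third MAIN tape produced by experiment 45:
print(tape_name(45, 3))  # E045F003
```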

The Main History Day-Time Groups (DTGs)

    Each DTG on the MAIN history consists of a header record followed by JM-2 records of data in latitude strips (JM is the number of meridional grid points, including fictitious points at the poles). Each latitude record contains instantaneous values of the boundary conditions, the cloudiness, and all prognostic variables needed to restart the model. It may also contain time-mean diagnostic fields, but these are optional. In addition, the option exists to output only part of the diagnostics normally kept by the model, and to vary the number of diagnostics output at different intervals. However it is done, all records after the header for a given DTG have identical structure, and each contains data from a single latitude. This is shown in Figure 3.

The Header

The header record of each MAIN DTG is written with the FORTRAN statement:

                            WRITE (unit) TAU, SUN, CTP

TAU is a REAL scalar containing the simulated time in hours,
SUN an array of 9 reals whose contents are given in Table 2,
CTP an array of 100 reals given in Table 3.
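For readers who want to examine a header outside of FORTRAN, here is a minimal Python sketch. It assumes a sequential unformatted file with 4-byte record markers and 4-byte big-endian REALs; the actual word size and byte order depend on the machine that wrote the tape, so the struct formats would need to be adjusted accordingly.

```python
import struct
from io import BytesIO

def read_main_header(f):
    """Read one MAIN header record: TAU, SUN(9), CTP(100).

    Assumes 4-byte record markers and 4-byte big-endian REALs.
    """
    nbytes, = struct.unpack(">i", f.read(4))            # leading record marker
    words = struct.unpack(f">{nbytes // 4}f", f.read(nbytes))
    f.read(4)                                           # trailing record marker
    tau, sun, ctp = words[0], list(words[1:10]), list(words[10:110])
    return tau, sun, ctp
```

As a usage check, one can pack a synthetic 110-word record with the same markers and read it back with this function.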

The time TAU is in hours from 0Z on January 1st of year 0 of the run or sequence of runs. Leap years are ignored by the model. The day of the year, counting from 1st January = DAY 1, is thus
                            DAY = mod ( int ( TAU/24 ) , 365 )
and Greenwich time is
                            HOUR = mod ( int ( TAU ) , 24 )
Calendar information can also be read from the SUN array (Table 2).
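These two formulas translate directly into code. A small Python sketch (the function names are ours):

```python
def day_of_year(tau: float) -> int:
    """DAY = mod(int(TAU/24), 365); leap years are ignored by the model."""
    return int(tau / 24) % 365

def greenwich_hour(tau: float) -> int:
    """HOUR = mod(int(TAU), 24)"""
    return int(tau) % 24

# TAU = 136 falls at 16Z on day 5 of the year:
print(day_of_year(136), greenwich_hour(136))  # 5 16
```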
Array CTP is a record of many of the parameters used in the run, and includes all information necessary to read and interpret the three history tapes. Only the first 89 CTPs have been used thus far. Table 3 lists those that are relevant to our discussion. As we will see below, the PBP and TIME histories also contain headers that allow reading those tapes, but they are much abbreviated. The information in the CTP array of the MAIN header is thus the most complete record of the run.

The Latitude Records

    The header is followed by JM-2 (JM is stored in CTP(23)) latitude records. These come in three varieties, which we will call "standard" (STD), "short QP" (SQP) and "long QP" (LQP). They differ only in the number of diagnostic fields saved.
     All three record types are written with the unformatted FORTRAN statement:
                        WRITE (unit) TOPOG, Q, QP

TOPOG  is an array of IM * NO3FDS REALs containing both predicted and
       prescribed surface data, as well as encoded cloudiness information
       used in restarting the model. IM is the number of grid points in
       the zonal direction (CTP(22)), and NO3FDS (CTP(30)) is the number
       of surface fields in the current version of the model. The 9 TOPOG
       variables are given in Table 4.

Q      is an array of (IM + 2) * (NO1FDS + LM * NQFDS) REALs. LM (CTP(24))
       is the number of levels, NO1FDS (CTP(29)) is always 2, and NQFDS
       (CTP(31)) is either 4 or 5, depending on whether ozone was included
       in the run. Q contains the primary prognostic variables in the
       order they are given in Table 5.

QP     is a REAL array of length LENQPC (CTP(75)). It contains time
       averages of diagnostics produced by the physics.

    For an STD record, LENQPC is 1. A DTG made up of STD records is simply a re-start with no diagnostics attached. SQP and LQP records contain all the STD data, plus the diagnostics. LQPs contain all diagnostics currently produced by the model. In this case
                      LENQPC = IM * (LM*NQPFDS + NQP2M)
where NQPFDS (CTP(32)) is the number of three-dimensional diagnostics, and NQP2M (CTP(33)) the number of two-dimensional diagnostics. These are arranged in the QP array in the order given in Table 6, with the 2-D fields in front of the 3-D fields. SQP records are similar, but only the first LENQPF (CTP(76)) words of the QP array are output (so LENQPC = LENQPF in this case). Typically SQPs contain only the two-dimensional diagnostics, so LENQPC = IM * NQP2M. But it is always safest to read LENQPC off the tape and then read that length for QP.
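As a concrete illustration of these lengths, here is a short Python sketch. All parameter values below are hypothetical, chosen only to show the arithmetic; they do not come from any actual run.

```python
# Hypothetical parameter values for illustration only.
IM     = 72   # zonal grid points, CTP(22)
LM     = 9    # number of levels, CTP(24)
NQPFDS = 20   # number of 3-D diagnostics, CTP(32)
NQP2M  = 30   # number of 2-D diagnostics, CTP(33)

lenqpc_std = 1                            # STD record: no diagnostics
lenqpc_lqp = IM * (LM * NQPFDS + NQP2M)   # LQP record: all diagnostics
lenqpc_sqp = IM * NQP2M                   # typical SQP: 2-D diagnostics only

print(lenqpc_std, lenqpc_sqp, lenqpc_lqp)  # 1 2160 15120
```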

Order of the Latitude Records

    Latitude records are put on the tape in the order of the computations in the GCM: from south to north, and within each strip, from west to east. Because the data is on the model's C-grid, not all variables with the same array indices are co-located. On the C-grid, the zonal velocity component (u) is kept half a grid interval east and west of the "mass" field (potential temperatures and pressures), and the meridional velocity component (v) half a grid interval north and south. Water vapor and ozone mixing ratios are kept at the "mass" point. The indexing convention used in the model is that the u with the same zonal index as the "mass" variables is half a grid interval to the east, and the v with the same meridional index as the "mass" variables is half a grid interval to the south of the mass point. This is shown on Figure 1.
    In longitude, the first mass- and v- points are at the dateline (180W), and the first u-point half a grid to the east (177.5W for a 5 degree zonal resolution model). In latitude, the first record after the header has the southernmost mass point, which is a full grid interval from the south pole (with 4 degree latitude resolution, this is at 86S). The v in this first record is set to zero. In the model's stretched polar grid this v is nominally at the pole and does not enter the calculations. The first non-zero v (in the second record) is located between the first and second mass latitudes (at 84S in the 4 degree model).
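The zonal staggering described above can be written out explicitly. The following Python sketch (the function names are ours) assumes a uniform zonal grid; latitudes are omitted because of the model's stretched polar grid.

```python
def mass_lon(i: int, dlon: float) -> float:
    """Longitude of mass point i; point 0 is at the dateline (180W)."""
    return -180.0 + i * dlon

def u_lon(i: int, dlon: float) -> float:
    """The u with the same zonal index sits half a grid interval east."""
    return mass_lon(i, dlon) + dlon / 2.0

# For a 5-degree zonal resolution model, the first u-point is at 177.5W:
print(mass_lon(0, 5.0), u_lon(0, 5.0))  # -180.0 -177.5
```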
    All the QPs (in both SQP and LQP records) and the TOPOGs are at the mass points. The Qs are on the C-grid as described, and there are IM + 2 of them, the last two being duplicates of the first two, used in vectorizing along latitude circles.

Ordering of the Three DTG Types on the MAIN tapes

    To provide some flexibility in the amount of output produced, the model allows the three MAIN DTG types (STD, SQP, and LQP) to be output with different frequency. Since the three types differ only in the length of the trailing QP array, and this length appears in each DTG header, the tapes can be read without knowing in advance the ordering of the DTGs.
    Writing of the MAIN history is controlled by seven parameters. Five are stored in the array TAUHST (CTP(60) to CTP(64)), and the other two in QPWIF (CTP(85)) and QPWIL (CTP(86)). TAUHST(1) contains the writing interval for STD DTGs, QPWIF the writing interval for SQP DTGs, and QPWIL the writing interval for LQP DTGs. All three are in hours. QPWIF is an integer multiple of TAUHST(1), and QPWIL is an integer multiple of QPWIF. These intervals are all measured from the starting TAU of the history, which is stored in TAUHST(4). The ending TAU of the history is stored in TAUHST(5). The remaining TAUHSTs control the staging and switching of tapes. These are discussed below.
    The following example should clarify how the DTGs are ordered. Say a run was started from a re-start (any MAIN DTG will do but we will assume it to be an STD) in which TAU was set to 24 (i.e. started at 0Z January 1st of year 0). Also the following are specified:
      TAUHST(1) = 12,  TAUHST(4) = 136,  TAUHST(5) = 240,  QPWIF = 36,  QPWIL = 72
    The first DTG on the MAIN history would be the one it re-started from with TAU=24 (which was an STD). The next one would be the first one written by the model at TAU=136, corresponding to 16Z on January 5th. There would then be a DTG on the history every 12 hours, i.e., at 16Z and 04Z, as long as TAU <= 240. Thus the last DTG on the history would be at 16Z on the 9th. The DTG at TAU=136 would be an LQP. Every third one after that (every 36 hours) would be either an SQP or an LQP, and every other one of these (every 72 hours) would be an LQP. The strange times in this example, 16Z and 04Z, were chosen to emphasize that the intervals are relative to the beginning TAU of the history (TAUHST(4)), not the Greenwich clock. Figure 3 gives the full sequence of DTGs as they would appear on the tape produced by this example.
    As mentioned above, the TOPOG and Q arrays (the STD part) of all these DTGs consist of instantaneous values and are the model's re-starts. The QP quantities, on the other hand, are all time means covering the period since they were last output. Thus in the example above, the QPs included in the SQPs would all be 36 hour means (even when they appear in an LQP), and the QPs that appear only in the LQPs would all be 72 hour means.
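The schedule in this example can be reproduced mechanically by classifying each output time by its offset from TAUHST(4). A Python sketch using the example's values (the DTG copied from the re-start at TAU=24 is not generated by this loop):

```python
def dtg_type(tau, start=136, sqp_int=36, lqp_int=72):
    """Classify a DTG by its offset from the starting TAU of the history."""
    off = tau - start
    if off % lqp_int == 0:
        return "LQP"
    if off % sqp_int == 0:
        return "SQP"
    return "STD"

# DTGs written by the model every TAUHST(1)=12 hours, for 136 <= TAU <= 240:
schedule = [(tau, dtg_type(tau)) for tau in range(136, 241, 12)]
print(schedule)
```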
    Two final points. First, all output is done on the "physics" cycle, which in the present version is called once for each simulated hour (on the Greenwich hour). This is also the interval at which the QPs (which are all physics diagnostics) are incremented. Second, if the model is restarted in the middle of one of the QP intervals, the accumulated QP means will cover the period from when the model is restarted to the end of the QP interval. Since the LQP interval in particular is usually taken fairly long (a few days), one may wish to restart at the last available STD. In this case the appearance of the tape would be identical to that which would have been produced without the restart; however, the QPs from the affected period would contain means over only the portion of the period after the restart. The number of times that went into the means is stored in QPWFF (CTP(83)) for the QPs that appear in SQPs and in QPWFL (CTP(84)) for those that appear only in LQPs. These two factors should be ignored in STDs, and only the first used in SQPs.

How Tapes are Concatenated

    The number of day-time groups the model puts on each tape is controlled by TAUHST(3) (CTP(62)). This is the tape switching interval in hours, again measured relative to TAUHST(4). When the writing time falls on this interval, the model writes the appropriate DTG type on the current tape, closes that tape, opens the next one, and copies the STD part of the DTG onto the new tape. Thus the last re-start on one tape is repeated as the first re-start on the next tape.
    Using the example above, if TAUHST(3) was 72, the first tape (say E001F001) would end with the LQP at TAU=208. The next tape (E001F002) would start with a copy of the STD for TAU=208, followed by the expected STDs at TAU=220 and 232.