UCLA ESM/ESMF Software Test Plan


1. Scope
The scope of this project is restricted to coupling the existing gridded components that form the UCLA Earth System Model (ESM) through the ESMF-compliant CCSM coupler, using the Earth System Modeling Framework (ESMF) software.

1.1 System Overview
The software system to be tested includes three gridded components: the UCLA AGCM 7.2, the HYPOP 1.0 global ocean component, and the CICE 3.1 sea ice component; the ESMF-compliant version of the Community Climate System Model (CCSM) coupler; and the ESMF libraries.

2. Referenced Documents
UCLA AGCM Users Guide, http://www.atmos.ucla.edu/~mechoso/esm
POP 2.0 Requirements Document
CICE 3.0.1 Documentation, http://www.acl.lanl.gov/climate/eclare/cicecode/cice.html
Community Sea Ice Model v4.0.1 Users Guide, http://www.ccsm.ucar.edu/models/ccsm2.0.1/csim/
Community Sea Ice Model v4.0.1 Code Reference Guide, http://www.ccsm.ucar.edu/models/ccsm2.0.1/csim/
Community Sea Ice Model v4.0.1 Scientific Description, http://www.ccsm.ucar.edu/models/ccsm2.0.1/csim/

3. Software Test Environment
Our test hardware environment will be restricted to SGI Origin 3000 systems at NASA Ames and LANL. We will use the current SGI compilers and system libraries provided in this environment. Personnel conducting the tests will be John Baumgardner and Phil Jones from LANL and Joe Spahr from UCLA. We intend to focus on a single grid resolution. The atmosphere model will have a 5 degree longitude by 4 degree latitude resolution and 15 levels. The ocean model will use the so-called 'x3' displaced-pole grid with approximately 3 degree resolution. The sea ice model will also have a resolution of approximately 3 degrees. We will write drivers and diagnostic routines primarily to test the correctness of the integration of the component models (verification).

4. Test Identification
This section describes the tests to be performed under this plan.

4.1 General Information
Tests will be performed to assess the reliability and performance of the UCLA ESM integrated with the ESMF.

4.1.1 Test Level
Tests 1, 2, and 3 (the stand-alone and side-by-side tests) are at the component/integration level, Test 4 (coupled, functional) is at the integration level, and Test 5 (coupled, performance) is at the system level.

4.1.2 Test Classes
The planned tests fall into two classes: functional (verification) tests and a performance test, as described in section 4.2.

4.2 Planned Tests
This section describes the specific tests to be performed under this plan.

4.2.1 Stand-Alone AGCM
4.2.2 Stand-Alone HYPOP
4.2.3 Side-by-side AGCM/HYPOP
4.2.4 Coupled AGCM/HYPOP/CICE (Functional)
4.2.5 Coupled AGCM/HYPOP/CICE (Performance)

5. Test Schedules
The tests will be performed sequentially, in the order listed in section 4.2.

6. Requirements Traceability
Tests: (1) Stand-alone AGCM; (2) Stand-alone HYPOP & CICE; (3) Side-by-side AGCM/HYPOP; (4) Coupled AGCM/HYPOP/CICE (Functional); (5) Coupled AGCM/HYPOP/CICE (Performance).

Requirement                                                 Requirement numbers
Organization using the ESMF component design model          1.1-1.5, 1.6, 1.7-1.11
Inclusion of an ESMF compliant application component        2.1-2.4.1
Inclusion of an ESMF compliant coupler                      3.1-3.1.1
System contains atmosphere, ocean and sea-ice components    4.1-4.2.4
Verification of ESMF timing overhead                        5.1



Interface Control Document


1.1 Specification of Gridded Components

Our application includes three gridded components: an atmosphere component, UCLA AGCM 7.2; an ocean component, HYPOP 1.0; and a sea ice component, CICE 3.1.

For ESMF compatibility, each of the gridded components will have a single externally visible entry point that corresponds to an ESMF 'set services' subroutine. This subroutine will be called at start time to register with the framework the names of driver subroutines that manage the component initialization, run, and finalize tasks.

At the top level each gridded component will be organized in terms of three subroutines that separately direct initialization, run, and finalization functions. The initialization subroutine, called once, will read in required input parameters, allocate the required memory, read in or generate the grid, initialize the required fields and boundary conditions, and read in or generate everything else the model needs in order to run. The run subroutine will be called multiple times. On each call it will accept any required new state data from other components, do its computation, perform any required external I/O, and produce output state data needed by other components. The finalization subroutine, called once, will write out required final results, release allocated memory, close opened files, and otherwise provide a graceful shutdown of the computation. Each component will communicate to other components only through the framework. All non-framework data items will be private to each component. UCLA AGCM, HYPOP and CICE already have initialize and finalize routines, but their interfaces still require some work to be ESMF compliant. For example, all gridded components have to be prepared to receive timing and control variables from the ESMF.
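To make this organization concrete, the following is a minimal Fortran sketch of a gridded component laid out in this way. It assumes the Fortran interfaces of recent ESMF releases (module name ESMF, ESMF_GridCompSetEntryPoint, the ESMF_METHOD_* phase constants, and the standard five-argument user-routine interface), which may differ in detail from the ESMF version available to this project; the module and routine names are placeholders and all model-specific work is omitted.

module agcm_component   ! illustrative sketch only; module name is a placeholder
  use ESMF
  implicit none
  private
  public :: agcm_setservices
contains

  ! Single externally visible entry point: registers the three driver
  ! subroutines with the framework.
  subroutine agcm_setservices(gcomp, rc)
    type(ESMF_GridComp)  :: gcomp
    integer, intent(out) :: rc
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_INITIALIZE, &
                                    userRoutine=agcm_init, rc=rc)
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN, &
                                    userRoutine=agcm_run, rc=rc)
    call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_FINALIZE, &
                                    userRoutine=agcm_final, rc=rc)
  end subroutine agcm_setservices

  ! Called once: read input parameters, allocate memory, read or generate
  ! the grid, and initialize fields and boundary conditions.
  subroutine agcm_init(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS    ! model-specific initialization omitted
  end subroutine agcm_init

  ! Called repeatedly: accept new state data from other components, step
  ! the model forward, and fill the export state.
  subroutine agcm_run(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS    ! model-specific time stepping omitted
  end subroutine agcm_run

  ! Called once: write final results, release memory, close files.
  subroutine agcm_final(gcomp, importState, exportState, clock, rc)
    type(ESMF_GridComp)  :: gcomp
    type(ESMF_State)     :: importState, exportState
    type(ESMF_Clock)     :: clock
    integer, intent(out) :: rc
    rc = ESMF_SUCCESS    ! model-specific shutdown omitted
  end subroutine agcm_final

end module agcm_component

The only symbol a driver needs to know about is the set services routine; everything else stays private to the component, consistent with the data-privacy rule stated above.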

Each gridded component will use at least portions of the ESMF lower level infrastructure. In particular, all components will adopt the ESMF clock/calendar subroutines for their time management requirements. The UCLA AGCM does not yet have an interface to the ESMF time management routines while HYPOP and CICE already do. Components will also use ESMF error logging and I/O utilities.
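As a small illustration of the time-management and logging utilities involved, the fragment below queries the current model time from the clock passed in by the framework and writes it to the ESMF log. It assumes the ESMF_ClockGet, ESMF_TimeGet, and ESMF_LogWrite interfaces of recent ESMF releases; the routine name is a placeholder.

subroutine agcm_report_time(clock)   ! illustrative sketch only
  use ESMF
  implicit none
  type(ESMF_Clock), intent(in) :: clock   ! clock supplied by the framework
  type(ESMF_Time)   :: currTime
  integer           :: yy, mm, dd, h, rc
  character(len=64) :: msg

  ! Query the current model time from the framework clock ...
  call ESMF_ClockGet(clock, currTime=currTime, rc=rc)
  call ESMF_TimeGet(currTime, yy=yy, mm=mm, dd=dd, h=h, rc=rc)

  ! ... and record it through the ESMF logging utility.
  write(msg, '(a,i4.4,2(a,i2.2),a,i2.2)') 'AGCM time ', yy, '-', mm, '-', dd, ' hour ', h
  call ESMF_LogWrite(trim(msg), ESMF_LOGMSG_INFO, rc=rc)
end subroutine agcm_report_time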

The AGCM uses a "C" grid (Mesinger and Arakawa, 1976). The resolution we intend to use for the atmosphere is 5 degrees longitude by 4 degrees latitude, which is 72 cells in the longitude direction and 44 cells in the latitude direction, with 15 levels in the vertical from the Earth's surface to 1 hPa. The layout we will typically use consists of 16 blocks of approximately 16x12 cells each. POP uses a "B" grid (Mesinger and Arakawa, 1976). The resolution we plan to use for the ocean and sea ice models is the one known as the 'x3' grid, which has its north pole displaced into Greenland. This grid has 100 cells in the longitude direction and 116 cells in the latitude direction, with 25 vertical levels for the ocean. The layout we intend to use is 16 blocks of 25x29 cells for the ocean and 4 blocks of 25x116 cells for sea ice. The running sequence is atmosphere-sea ice-ocean. We anticipate coupling to occur once per hour between the atmosphere and sea-ice components and once per day between the ocean and both the atmosphere and sea-ice components.
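To indicate how these coupling intervals might be expressed with the ESMF clock and alarm facilities, the hedged sketch below builds a clock with a one-hour step (the atmosphere/sea-ice coupling interval) and attaches a daily alarm marking the ocean coupling events. The calls assume the ESMF_CalendarCreate, ESMF_TimeSet, ESMF_TimeIntervalSet, ESMF_ClockCreate, and ESMF_AlarmCreate interfaces of recent ESMF releases; the start and stop dates are arbitrary placeholders.

subroutine make_coupling_clock(clock, oceanAlarm)   ! illustrative sketch only
  use ESMF
  implicit none
  type(ESMF_Clock), intent(out) :: clock        ! advances in one-hour coupling steps
  type(ESMF_Alarm), intent(out) :: oceanAlarm   ! rings once per day for ocean coupling

  type(ESMF_Calendar)     :: gregorian
  type(ESMF_Time)         :: startTime, stopTime
  type(ESMF_TimeInterval) :: hourStep, dayStep
  integer :: rc

  ! One-hour step for atmosphere/sea-ice coupling; the dates are arbitrary.
  gregorian = ESMF_CalendarCreate(ESMF_CALKIND_GREGORIAN, name="gregorian", rc=rc)
  call ESMF_TimeIntervalSet(hourStep, h=1, rc=rc)
  call ESMF_TimeIntervalSet(dayStep, d=1, rc=rc)
  call ESMF_TimeSet(startTime, yy=2003, mm=1, dd=1, calendar=gregorian, rc=rc)
  call ESMF_TimeSet(stopTime,  yy=2004, mm=1, dd=1, calendar=gregorian, rc=rc)
  clock = ESMF_ClockCreate(timeStep=hourStep, startTime=startTime, &
                           stopTime=stopTime, rc=rc)

  ! Daily alarm marking the steps on which the ocean exchanges data with
  ! the atmosphere and sea-ice components.
  oceanAlarm = ESMF_AlarmCreate(clock, ringInterval=dayStep, &
                                name="ocean_coupling", rc=rc)
end subroutine make_coupling_clock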

1.2 Gridded Component Export Fields

Atmosphere Component (UCLA AGCM 7.2)

Field     Description                                      Units
PREC      total precipitation                              mm/s
RAIN      water flux due to rain                           g/cm2/s
SNOW      water flux due to snow                           g/cm2/s
FSS       sensible heat flux at the surface                W/m2
FWS       water flux at the surface from evaporation       g/cm2/s
DRAGU     "U" surface drag                                 dynes/cm2
DRAGV     "V" surface drag                                 dynes/cm2
SS        net short wave flux at the surface               W/m2
DSWBOT    downward short wave flux at the sfc              W/m2
ULWBOT    upward long wave flux at the sfc                 W/m2
DLWBOT    downward long wave flux at the sfc               W/m2
GC        surface condition
T         potential temperature of the PBL                 K
SH        specific humidity of the PBL                     kg/kg
SLP       sea level pressure                               mb
RAOA      atmospheric air density                          kg/m3
SWDIDF    downward shortwave near IR diffuse heat flux     W/m2
SWDIDR    downward shortwave near IR direct heat flux      W/m2
SWDVDF    downward shortwave visible diffuse heat flux     W/m2
SWDVDR    downward shortwave visible direct heat flux      W/m2

Ocean Component (HYPOP 1.0)
Field     Description                                      Units
DHDY      north-south surface slope                        m/m
DHDX      east-west surface slope                          m/m
QICE      heat flux due to freezing or melting potential
          (heat flux correction)                           W/m2
SSS       sea surface salinity                             ppt
SST       sea surface temperature                          K
VVEL      north-south surface velocity                     m/s
UVEL      east-west surface velocity                       m/s

Sea Ice Component (CICE 3.1)
Field     Description                                      Units
ALBIDF    near IR diffuse albedo                           none
ALBIDR    near IR direct albedo                            none
ALBVDF    visible diffuse albedo                           none
ALBVDR    visible direct albedo                            none
EVAP      evaporated water flux                            kg/m2/s
IFRC      ice fraction                                     none
LAT       latent heat flux                                 W/m2
LWUP      outgoing longwave radiation                      W/m2
MELTH     heat used to melt ice                            W/m2
MELTW     water flux from ice melt                         kg/m2/s
NETSW     shortwave penetrating to bottom of ice           W/m2
SALT      salt flux from ice melt                          kg/m2/s
SENS      sensible heat flux                               W/m2
TAUXA     east-west wind stress                            N/m2
TAUYA     north-south wind stress                          N/m2
TREF      2m atm reference temperature                     K
TSFC      surface temperature of sea ice                   K


1.3 Gridded Component Import Fields

Atmosphere Component (UCLA AGCM 7.2)
Field     Description                                      Units
SST       sea surface temperature                          K
TSFC      surface temperature of sea ice                   K
IFRAC     ice fraction                                     %

Ocean Component (HYPOP 1.0)
Field     Description                                      Units
EVAP      water flux from evaporation                      kg/s/m2
IFRAC     ice fraction                                     none
LWDWN     downward longwave heat flux                      W/m2
LWUP      upward longwave heat flux                        W/m2
MELT      water flux from ice melt                         kg/s/m2
MELTHF    heat flux from ice melt                          W/m2
MSLP      mean sea level pressure                          Pa
NETSW     shortwave heat flux                              W/m2
PREC      water flux from precipitation                    kg/s/m2
ROFF      water flux due to runoff                         kg/s/m2
SALT      salt flux                                        kg/s/m2
SENHF     sensible heat flux                               W/m2
TAUX      east-west wind stress                            N/m2
TAUY      north-south wind stress                          N/m2

Sea Ice Component (CICE 3.1)
Field     Description                                      Units
FLW       downward longwave heat flux                      W/m2
FW        heat flux from freezing or melt potential        W/m2
POTT      atmospheric potential temperature                K
QA        atmospheric specific humidity                    kg/kg
RAIN      water flux due to rain                           kg/m2/s
RHOA      atmospheric air density                          kg/m3
SNOW      water flux due to snow                           kg/m2/s
SSS       sea surface salinity                             ppt
SST       sea surface temperature                          K
SWDIDF    downward shortwave near IR diffuse heat flux     W/m2
SWDIDR    downward shortwave near IR direct heat flux      W/m2
SWDVDF    downward shortwave visible diffuse heat flux     W/m2
SWDVDR    downward shortwave visible direct heat flux      W/m2
TAIR      atmospheric air temperature                      K
TILTX     east-west sea surface slope                      m/m
TILTY     north-south sea surface slope                    m/m
UATM      east-west atmospheric wind                       m/s
UOCN      east-west ocean current                          m/s
VATM      north-south atmospheric wind                     m/s
VOCN      north-south ocean current                        m/s
ZLVL      atmospheric level height                         m

1.4 ESMF State Objects

The import and export fields enumerated in items 1.2 and 1.3 above will include, in addition to the data arrays, the metadata required for ESMF compatibility, such as gridding, layout, coordinate, and units information. In most cases fields will be bundled. Fluxes, which require conservative remapping between components, will be bundled separately from fields that do not require conservative remapping. Metadata providing the remapping specifications required by the coupler component will be included with the bundling description. The import state object and export state object for each component will consist of the fields listed in items 1.2 and 1.3, with the requisite bundling and coupler specifications.
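As an illustration of what this bundling looks like at the API level, the sketch below creates two of the ocean export fields from item 1.2, places them in a bundle of fields that do not require conservative remapping, and adds the bundle to the export state. It assumes the ESMF_FieldCreate, ESMF_FieldBundleCreate, and ESMF_StateAdd interfaces of recent ESMF releases; the grid argument, bundle name, and subroutine name are placeholders, and the remapping metadata discussed above is omitted.

subroutine fill_ocean_export(grid, exportState)   ! illustrative sketch only
  use ESMF
  implicit none
  type(ESMF_Grid),  intent(in)    :: grid         ! ocean grid, created elsewhere
  type(ESMF_State), intent(inout) :: exportState  ! ocean export state

  type(ESMF_Field)       :: sst, sss
  type(ESMF_FieldBundle) :: scalarBundle
  integer                :: rc

  ! Create two of the ocean export fields listed in item 1.2.
  sst = ESMF_FieldCreate(grid, typekind=ESMF_TYPEKIND_R8, name="SST", rc=rc)
  sss = ESMF_FieldCreate(grid, typekind=ESMF_TYPEKIND_R8, name="SSS", rc=rc)

  ! Bundle fields that do not require conservative remapping ...
  scalarBundle = ESMF_FieldBundleCreate(fieldList=(/sst, sss/), &
                                        name="ocn_scalar_fields", rc=rc)

  ! ... and place the bundle in the component's export state.
  call ESMF_StateAdd(exportState, fieldbundleList=(/scalarBundle/), rc=rc)
end subroutine fill_ocean_export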

2.1 Full Application

Our full UCLA ESMF coupled global climate application will be built from the three gridded components described in section 1.1 and shown in Figure 1 (an atmosphere component, UCLA AGCM 7.2; an ocean component, HYPOP 1.0; and a sea ice component, CICE 3.1), all coupled through a single coupler component, the ESMF-compliant NCAR Community Climate System Model (CCSM) coupler (see Figure 2). This coupler will use the 'hub and spoke' strategy to handle all data conversion and transfer among gridded components. The CCSM coupler will understand ESMF objects, including the ESMF data objects of states, fields, bundles, grids, and arrays; ESMF services such as regrid and route; and ESMF execution/environment objects such as layouts. We have chosen the CCSM coupler because it will have all the functionality required for our application: it will be required to accommodate the grid and all the fields of our ocean and sea-ice components, and it will be able to handle the grid and all the fields associated with the UCLA atmospheric model. In addition, the CCSM coupler will likely be available for use several months before the essential routines required to build such a coupler become available from the ESMF library.
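For orientation, the fragment below indicates what one such regrid service looks like at the ESMF API level for a single field; it is not the CCSM coupler's actual implementation. It assumes the ESMF_StateGet, ESMF_FieldRegridStore, and ESMF_FieldRegrid interfaces of recent ESMF releases, and the subroutine, field, and handle names are placeholders.

subroutine couple_sst(ocnExport, atmImport, routehandle, firstCall)   ! illustrative sketch only
  use ESMF
  implicit none
  type(ESMF_State), intent(inout)       :: ocnExport    ! ocean export state
  type(ESMF_State), intent(inout)       :: atmImport    ! atmosphere import state
  type(ESMF_RouteHandle), intent(inout) :: routehandle  ! precomputed interpolation weights
  logical, intent(in)                   :: firstCall

  type(ESMF_Field) :: sstOcn, sstAtm
  integer          :: rc

  ! Retrieve the SST field from the ocean export state and the matching
  ! destination field from the atmosphere import state.
  call ESMF_StateGet(ocnExport, itemName="SST", field=sstOcn, rc=rc)
  call ESMF_StateGet(atmImport, itemName="SST", field=sstAtm, rc=rc)

  ! Compute the interpolation weights once, then reuse the route handle
  ! on every subsequent coupling step.
  if (firstCall) then
    call ESMF_FieldRegridStore(srcField=sstOcn, dstField=sstAtm, &
         routehandle=routehandle, regridmethod=ESMF_REGRIDMETHOD_BILINEAR, rc=rc)
  end if
  call ESMF_FieldRegrid(sstOcn, sstAtm, routehandle=routehandle, rc=rc)
end subroutine couple_sst

Precomputing the route handle once and reusing it on every coupling step is the usual pattern, since the interpolation weights depend only on the source and destination grids.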

The main program for our climate application will call the ESMF initialization code and provide the environment for components to be created and run. It will first create and then call an application component. This application component in turn will create each of the gridded and coupler components. It will also call the ESMF set services entry point for each component it creates. In addition the application component will manage the main time stepping loop for the application. When computations are complete, the main program will destroy the application component and thereby close down the framework and release all its associated resources.
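A heavily simplified sketch of such a main program follows, assuming the ESMF_Initialize, ESMF_GridCompCreate, ESMF_GridCompSetServices, and clock interfaces of recent ESMF releases; the esm_application module and its esm_app_setservices routine are hypothetical stand-ins for the application component, and the run dates are arbitrary.

program ucla_esm_driver   ! illustrative sketch only
  use ESMF
  use esm_application, only: esm_app_setservices   ! hypothetical application-component module
  implicit none

  type(ESMF_GridComp)     :: appComp
  type(ESMF_State)        :: importState, exportState
  type(ESMF_Clock)        :: clock
  type(ESMF_Time)         :: startTime, stopTime
  type(ESMF_TimeInterval) :: timeStep
  integer :: rc

  ! Start the framework with a Gregorian calendar as the default.
  call ESMF_Initialize(defaultCalKind=ESMF_CALKIND_GREGORIAN, rc=rc)

  ! Create the application component and register its driver routines.
  appComp = ESMF_GridCompCreate(name="UCLA ESM application", rc=rc)
  call ESMF_GridCompSetServices(appComp, userRoutine=esm_app_setservices, rc=rc)

  ! Overall run clock; the dates are arbitrary and the step is one hour.
  call ESMF_TimeIntervalSet(timeStep, h=1, rc=rc)
  call ESMF_TimeSet(startTime, yy=2003, mm=1, dd=1, rc=rc)
  call ESMF_TimeSet(stopTime,  yy=2004, mm=1, dd=1, rc=rc)
  clock = ESMF_ClockCreate(timeStep=timeStep, startTime=startTime, &
                           stopTime=stopTime, rc=rc)

  ! Top-level states are empty here; the application component creates the
  ! gridded and coupler components and manages the time stepping loop.
  importState = ESMF_StateCreate(name="app import", rc=rc)
  exportState = ESMF_StateCreate(name="app export", rc=rc)

  call ESMF_GridCompInitialize(appComp, importState=importState, &
       exportState=exportState, clock=clock, rc=rc)
  call ESMF_GridCompRun(appComp, importState=importState, &
       exportState=exportState, clock=clock, rc=rc)
  call ESMF_GridCompFinalize(appComp, importState=importState, &
       exportState=exportState, clock=clock, rc=rc)

  ! Destroy the application component and shut down the framework.
  call ESMF_GridCompDestroy(appComp, rc=rc)
  call ESMF_Finalize(rc=rc)
end program ucla_esm_driver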

References
Mesinger, F., and A. Arakawa, 1976: Numerical methods used in atmospheric models. GARP Publication Series 17, 1, WMO.




Figure 1. UCLA ESM Component Structure




Figure 2. UCLA ESM Data Flow Diagram in one coupling cycle.
Data Field boxes indicate local variable storage for the particular model.