NWP SAF

AAPP Installation Guide

  Document ID:  NWPSAF-MO-UD-005
  Version:      7.3
  Date:         01 June 2015


3 Installation

Before you can run AAPP, either for the test cases provided or for your own data, you have to install AAPP v7 on your workstation. The full installation consists of the following steps:
 
1. Download the required AAPP files to your workstation
2. Install any external libraries that you may need
3. Configure the installation to suit your workstation
4. Build AAPP by running "make"

The "core" AAPP package can be installed without external libraries; this will allow you to process NOAA and Metop HRPT data through to level 1d (see Chapter 4). The simple steps to achieve this core AAPP installation and to get started are described in steps 3.1, 3.2 and 3.3. However, to obtain the full functionality of AAPP it is necessary to install external libraries and specify additional settings at the configuration step. These installation details are described in section 3.4 onwards.

 

3.1 Copy AAPP files

To install the AAPP code it is sufficient to copy just the following file to a suitable location on your workstation:

AAPP_7.x.tgz        where x is a version identifier (e.g. AAPP_7.1.tgz)

If you wish to run the test cases, or view any of the documentation, then you should also copy the relevant .tgz files (see the list in Chapter 2). Each .tgz file creates its own subdirectory when unpacked.

If you are also installing an update release, you should copy the update file(s) to your workstation (e.g. AAPP_update_7.2.tgz).

3.2 Unzip AAPP files

Unzip the AAPP package by typing

tar -xzf AAPP_7.x.tgz

If your system does not support the "-z" option in tar, you could try gtar, or use

gunzip -c AAPP_7.x.tgz | tar -xf -

A sub-directory AAPP_7 (or similar name - see Release Note) will automatically be created. We refer to this as the AAPP top directory.

To unpack an update release, you should follow the instructions contained in the corresponding Release Note. In most cases, you should cd to the AAPP top directory, and then use tar (as above) to unpack the file, e.g.

cd AAPP_7
tar -xzf ../AAPP_update_7.2.tgz


Update releases can be unpacked before compiling AAPP.

3.3 Install (configure and build) the "core" AAPP

All that is required to install "core" AAPP is to go to the AAPP top directory and run the configure script with your site-id specified as:

--site-id=  a 3-character abbreviation of your centre

and the Fortran compiler you are using as

--fortran-compiler=  any of : AIX AIX-Hitachi HPUX-f90 HPUX-f77 IRIX Solaris g77 g95 gfortran ifc ifort pgf90

For example, to configure AAPP for a Linux system using gfortran you could type (from the top directory AAPP_7)

./configure --site-id=UKM --fortran-compiler=gfortran

There are additional options (type ./configure from the top directory to view them) that you may wish to specify, such as your local station name, and some must be specified to obtain full AAPP functionality, as described in section 3.4. To complete the "core" AAPP installation, type

make

Check that the installation was successful and that you did not see any errors. If it was OK you may start testing AAPP as in Chapter 4. You will be able to run the NOAA test case examples described in 4.1, and parts of the Metop test case.
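For reference, the complete "core" installation from scratch might look like this (a sketch; substitute your own site-id, compiler and actual file name):

tar -xzf AAPP_7.x.tgz
cd AAPP_7
./configure --site-id=UKM --fortran-compiler=gfortran
make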

3.4 Install external libraries

The entries below list the external libraries, their use in AAPP, and how to build them (after unpacking). In these entries, xxx, yyy and zzz indicate directories that you have to specify on your system. Some of these libraries (e.g. HDF5) can be downloaded as binaries, but the entries assume you are building from source. Most users will not need to install all the libraries, just those needed for your application.

BUFRDC
  Use in AAPP:    Encoding and decoding of AMSU, MHS, HIRS, IASI, ATMS and CrIS datasets.
  Available from: ECMWF
  Install:        ./build_library
                  ./install
  Notes:          Fortran77 and C code. The interactive build_library script offers you a
                  choice of compilers and installation location; use the same compiler as
                  for AAPP. The environment variable BUFR_TABLES is needed at run time.
                  EMOSLIB may be used as an alternative to BUFRDC.

GRIB_API
  Use in AAPP:    Reading NWP model fields. For the AVHRR cloud mask (MAIA2.1 and MAIA3)
                  the library is optional: if model fields are not available then
                  climatology will be used. For the VIIRS cloud mask (MAIA4) the library
                  is required.
  Available from: ECMWF
  Install:        ./configure [options]
                  make
                  make install
  Notes:          Fortran77, Fortran90 and C code. Requires JasPer or OpenJPEG (see
                  below). Specify the Fortran compiler via environment variables, to be
                  compatible with AAPP, e.g.

                  cd grib_api-1.9.9
                  export F77=ifort
                  export FC=ifort
                  ./configure --prefix=xxx --with-jasper=PATH_TO_JASPER
                  [ or --with-openjpeg=PATH_TO_OPENJPEG ]

JasPer
  Use in AAPP:    jpeg2000 packing/unpacking. Used by GRIB_API.
  Available from: University of Victoria
  Install:        ./configure --prefix=yyy
                  make
                  make install

OpenJPEG
  Use in AAPP:    Alternative to JasPer. Used by GRIB_API.
  Available from: Google Code
  Install:        ./configure --prefix=yyy
                  make
                  make install
  Notes:          If OpenJPEG is installed on your system then GRIB_API will use it in
                  preference to JasPer.

HDF5
  Use in AAPP:    Reading Sensor Data Record files for ATMS, CrIS, VIIRS, MWHS, MWTS and
                  IRAS. Reading IASI eigenvector files.
  Available from: The HDF Group
  Install:        # For AIX:
                  FCFLAGS="-qextname"
                  export FCFLAGS

                  ./configure [options]
                  make
                  make install
  Notes:          A typical configure is

                  ./configure --prefix=yyy --enable-fortran

                  The Fortran interface is required if you wish to use MAIA4; it is not
                  required for the other applications. You may also need (see below)

                  --with-szlib=PATH_TO_SZIP

                  For MAIA4, it is best to use HDF5 version 1.8.8 or later (a patch is
                  available if you want to use earlier versions - contact the NWP SAF
                  Helpdesk for details).

                  Note for AIX xlf: by default, AAPP is built with the "-qextname"
                  compiler flag, in which case you should use it for the HDF5 library
                  also.

SZIP and ZLIB
  Use in AAPP:    May be used by HDF5.
  Available from: The HDF Group
  Install:        ./configure --prefix=zzz
                  make
                  make install
  Notes:          These libraries are not explicitly used by AAPP, but they may be needed
                  for your installation.
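As an illustration, a typical sequence for building JasPer and then GRIB_API from source might look like the following sketch (version numbers and installation paths are examples only; use the same Fortran compiler as for AAPP):

jasper=/opt/jasper
grib_api=/opt/grib_api

# Build and install JasPer
tar -xzf jasper-1.900.1.tar.gz
cd jasper-1.900.1
./configure --prefix=$jasper
make
make install
cd ..

# Build and install GRIB_API, pointing it at JasPer
export F77=gfortran
export FC=gfortran
tar -xzf grib_api-1.9.9.tar.gz
cd grib_api-1.9.9
./configure --prefix=$grib_api --with-jasper=$jasper
make
make install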

The IASI level 1 processor OPS-LRS is an optional addition to AAPP. It is installed after AAPP - i.e. you do not need to refer to OPS-LRS when building AAPP. The installation of OPS-LRS is described in a separate document [RD09].

3.5 Configure your 'fuller' AAPP installation

Go to the AAPP top directory and type ./configure to see the options.

You have to specify the following:
--site-id=  a 3-character abbreviation of your centre 
--fortran-compiler=  any of : AIX AIX-Hitachi HPUX-f90 HPUX-f77 IRIX Solaris g77 g95 gfortran ifc ifort pgf90

Note 1: In the case of fortran-compiler, some of the choices are compiler-specific (i.e. g77, g95, gfortran, ifc, ifort and pgf90 are available on Linux and some other platforms), whereas the other choices are platform-specific (AIX, HPUX, etc.). If in doubt, check the contents of the files in directory config. (ifc refers to v7 of the Intel Fortran compiler; ifort refers to v8 or later).

Note 2: Some components of AAPP (notably MAIA3, originally released as part of update 6.12, and MAIA4) cannot be compiled using Fortran 77. There is a script fortran_compatibility that disables compilation of MAIA3 and MAIA4 if you are using an f77 compiler.

Note 3: Some Fortran 90 compilers require different flags to indicate the directory to search for Module files. The default is "-I", but you may need "-M" (see config/Solaris). If necessary, you can adjust the applicable file in the config directory (and please inform the NWP SAF Helpdesk if you find something that needs changing).

In addition, you can optionally specify:
--prefix=  installation prefix, if you wish to install the AAPP executables and data files to somewhere other than the current directory,
e.g. $HOME/AAPP_7_runtime
Note: do not choose this directory to be the same as the AAPP top directory, otherwise all your source files will be deleted!
--station=  your local station name (see list in AAPP/src/navigation/libnavtool/stations.txt). The default is "dummy_station"
--station_id=  local station ID, for the "dataset name" field in the level 1a/1b header. The default is WE (Western Europe)
--external-libs=  location and options for external libraries (in quotes), e.g. 
"-L/opt/emos/lib -lemos"  or
"-L/opt/bufr/lib -lbufr -L/opt/grib_api/lib -lgrib_api_f77 -lgrib_api_f90 -lgrib_api -L/opt/jasper/lib -ljasper"
--external-includes=  location options for external include files, e.g.
"-I${hdf}/include"
--external-modules=  location for external Fortran90 modules, e.g.
"-M/opt/grib_api/include"
        Note: some compilers use "-I" instead of "-M".
--c-compiler=  name of your C compiler (for use with METOP tools, IASI tools and HDF5).
This overrides the settings specified in the config file for fortran-compiler.
--tle-user=    www.space-track.org username; required for TLE retrieval from the internet
--tle-passwd=    www.space-track.org password
--scriptdir=    optional directory in which to place the AAPP shell scripts. Normally the scripts and .exe files are placed together in "bin" directories, but on some systems, notably Cygwin, they must be separated, e.g. specify "scripts".


To run MAIA3 or MAIA4, you must specify a Fortran90 compiler. If you specify a Fortran77 compiler (e.g. g77) then MAIA3 and MAIA4 are omitted from the build. In addition, for MAIA4 you should specify, in external-libs, "-lhdf5 -lhdf5hl_fortran". Some systems require, in addition, "-lhdf5_hl -lhdf5_fortran -lz". (Note: "-lhdf5" tells AAPP to build those components that use the HDF5 C library, whereas "-lhdf5hl_fortran" tells AAPP to build the components that use the HDF5 Fortran interface). You will probably need to use the same Fortran compiler for AAPP as was used to build the HDF5 Fortran interface. You may also need some of the same compiler flags (e.g. "-qextname" for the AIX xlf compiler - see section 3.4).
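For example, a configure command with the HDF5 Fortran interface enabled for MAIA4 might look like this (a sketch; the path $hdf and the exact set of -l flags depend on how your HDF5 was built):

hdf=/path_to_hdf5
./configure --site-id=UKM --fortran-compiler=gfortran \
  --external-libs="-L$hdf/lib -lhdf5 -lhdf5hl_fortran -lhdf5_hl -lhdf5_fortran -lz" \
  --external-includes="-I$hdf/include"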

If the Fortran compiler you wish to use is not listed, you may create your own configuration file in the config directory, taking one of the existing ones as a template.

If your station is not contained in the station list file, AAPP/src/navigation/libnavtool/stations.txt, you can edit and update that file before you go on to build AAPP.

The "station", "station_id", "tle_user" and "tle_password" options are optional. The values that you enter are stored in the ATOVS_ENV7 file (see section 3.7); you may edit this file later if you want to change the values.

When you build AAPP, the source, data files and executables are initially placed in a fixed directory tree structure as follows:
AAPP_7/AAPP/bin
AAPP_7/AAPP/data
AAPP_7/AAPP/src
AAPP_7/metop-tools/bin
AAPP_7/metop-tools/src
etc.
This will be fine for many users. However, if the "prefix" option is specified then you can install elsewhere the bin and data directories, and other files needed at run time. This is done via the "make install" command (section 3.6). The path that you enter is also stored in the ATOVS_ENV7 file, as environment variable AAPP_PREFIX. If you do not use the "prefix" option then AAPP_PREFIX is set to the AAPP_7 top directory. Thus the run-time directories are always $AAPP_PREFIX/AAPP/bin, $AAPP_PREFIX/AAPP/data, etc.
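As a sketch, an installation that separates the build tree from the run-time tree (directory names are examples) might be configured like this:

./configure --site-id=UKM --fortran-compiler=gfortran \
  --prefix=$HOME/AAPP_7_runtime
make install

# The run-time files then appear under
#   $HOME/AAPP_7_runtime/AAPP/bin, $HOME/AAPP_7_runtime/AAPP/data, etc.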

AAPP provides tools to decode and encode ATOVS level 1c BUFR files using the ECMWF BUFR library. AAPP can also accept GRIB format forecast files for MAIA. If you wish to make use of these tools, the appropriate libraries should be installed before building AAPP. See sections 3.4 and 3.10.

For example, an installation with hdf5, BUFR and GRIB capability could be set up like this:
hdf=/path_to_hdf5
bufr=/path_to_bufr
grib_api=/path_to_grib_api
jasper=/path_to_jasper

[[ $LD_LIBRARY_PATH = *${hdf}* ]] || LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${hdf}/lib

./configure --station=exeter --fortran-compiler=ifort --site-id=UKM \
  --external-libs="-L$bufr/lib -lbufr -L$grib_api/lib -lgrib_api_f77 \
-lgrib_api_f90 -lgrib_api -L$jasper/lib -ljasper -L$hdf/lib -lhdf5" \
  --external-includes="-I$hdf/include -I$grib_api/include"

Note that the order of the grib_api libraries is important: grib_api_f77 and grib_api_f90 must come before grib_api. If a library is centrally installed on your system you will not need to specify "-L", or to modify LD_LIBRARY_PATH, but you will still need to select it using "-l".

You will need to know whether your grib_api library has been built using jasper or openjpeg. To find out, locate your libgrib_api.a, and type "strings libgrib_api.a | grep -e opj_ -e jas_". You should see a list of all the relevant opj_ or jas_ functions. Configure AAPP with the appropriate library, i.e. -ljasper or -lopenjpeg.  
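For example (a sketch; adjust the path to your own grib_api installation):

strings /opt/grib_api/lib/libgrib_api.a | grep -e opj_ -e jas_ | sort -u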

With HDF5, you may need to add a flag -lz, to link the zlib library. (This was necessary on the AIX system at ECMWF). You may also need -lsz, to link the szip library. The error messages from the build step (section 3.6) should tell you if any libraries are missing.

The configure script creates a file Makefile.ARCH in the top directory, which contains the system settings and is referenced by all the lower-level makefiles. You may wish to modify the contents of this file before proceeding.

Some components of AAPP may be customised by the user. For example, the IASI level 1d format is defined by the file AAPP_7/AAPP/include/iasi1d.h. This file has parameters that define the maximum number of Principal Components scores that can be accommodated in the 1d file, and the maximum number of channels. If you want something other than the default (366 channels, plus 300 scores) then you should modify this file before building the software. The number of channels or scores actually used may be set at run time, provided it is less than the maximum.

3.6 Build AAPP

To build the complete package you just need to do one of the following:

If you did not specify an installation prefix in the configure step, type

make

If you did specify an installation prefix, type
 
make install

There are a number of other possibilities that users requiring flexibility may adopt:

make clean
make lib
make bin
make dat
perl Makefile.PL

Note that not all systems are capable of building "iasi_tools" (HPUX is not). Therefore if the make fails when compiling iasi_tools, and you are not planning to run IASI OPS-LRS, then you do not need to worry about it. However, if make fails in either "AAPP" or "metop-tools" then you should investigate further.

If make fails because an external library is missing, locate the library and modify your Makefile.ARCH or re-run configure. Remember that -L specifies the location of the library (if it is not in one of the standard directories) and -lx links the library named libx.a. For example, on the ECMWF AIX system, when building AAPP with HDF5 support it was necessary to add -lz to link the centrally-installed zlib. 

If the make fails for reasons that are not obvious, before re-trying it is recommended that you remove any partially installed components using make clean. Similarly if having built AAPP you decide that you want to install the BUFR or GRIB libraries, then you can make clean, re-run configure, then re-run make.
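For example, to add BUFR support after an initial build, the sequence might be (a sketch; paths and options are illustrative):

make clean
./configure --site-id=UKM --fortran-compiler=gfortran \
  --external-libs="-L/opt/bufr/lib -lbufr"
make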

3.7 The ATOVS_ENV7 file

In AAPP v1 to v5 the ATOVS_ENV file that sets up the AAPP environment was always located in the user's $HOME directory. This caused problems for some users and made it difficult to run different versions of AAPP simultaneously. For AAPP v6 the system was changed, and it is the same in AAPP v7: the ATOVS_ENV7 file is created in the AAPP top directory.

The user may, if required, move the ATOVS_ENV7 file to their $HOME directory (or create a link). Then, to set up the environment for AAPP, the user's script (or interactive shell) needs:

. $HOME/ATOVS_ENV7

Alternatively the user's script may source ATOVS_CONF:
AAPP_PREFIX=.......
. $AAPP_PREFIX/ATOVS_CONF

In either case all the AAPP scripts and binaries (including those in metop-tools and iasi-tools) will then be accessible by name, since ATOVS_ENV7 sets up $PATH to include the necessary directories.
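For example, a user processing script might begin as follows (a sketch; set AAPP_PREFIX to your own installation directory):

#!/bin/ksh
# Set up the AAPP environment, after which AAPP tools can be called by name
AAPP_PREFIX=$HOME/AAPP_7
. $AAPP_PREFIX/ATOVS_CONF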

You may need to edit the ATOVS_ENV7 file after it has been created by configure - e.g. to set the correct path for the GMT executables (called by the MAIA3 and MAIA4 test cases) and to set $LD_LIBRARY_PATH for use with HDF5 (see section 3.8).

3.8 Notes on MAIA

3.8.1 MAIA versions and capabilities
MAIA (Mask AVHRR for Inversion ATOVS) provides an AVHRR-derived cloud mask. There are three different versions of MAIA currently implemented in AAPP: version 2.1 provides a cloud mask on the HIRS grid; MAIA version 3 (introduced in AAPP v6.12) provides a cloud mask on the AVHRR grid; MAIA version 4 (introduced in AAPP v7.5) provides a cloud mask on the VIIRS grid. All three require prior information on the state of the atmosphere and surface: MAIA2.1 and MAIA3 can be configured to use either climatological fields or NWP model fields; MAIA4 has to use NWP model fields.

3.8.2 Format of forecast files
If you intend to use the climatological files distributed with AAPP for the cloud tests, then no changes are required. However, to obtain the best accuracy in the computed cloud mask and cloud classification, it is recommended to use forecast files. If you wish to use forecast files, then you will need to make certain changes.

Two formats are available for the forecast file: ASCII and the standard meteorological format GRIB. To run AAPP with a GRIB forecast file, the ECMWF GRIB API library is needed - see section 3.4. You should also note that there is an environment variable FORECAST_FORMAT in ATOVS_ENV7: its default is "grib" in AAPP v7. You do not need to change any of the code or makefiles (unlike in versions of AAPP prior to v6). For MAIA3, the variable is called MAIA3_FORECAST_FORMAT. The format for the ASCII forecast files can be found in the AAPP data formats document.
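For example, the relevant lines in ATOVS_ENV7 might look like this (a sketch; "grib" is the default in AAPP v7):

export FORECAST_FORMAT=grib           # forecast format for MAIA2.1
export MAIA3_FORECAST_FORMAT=grib     # equivalent variable for MAIA3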

Note that the GRIB API library handles both GRIB1 and GRIB2 files. The older GRIBEX library is no longer supported in AAPP. 

The location of the forecast files is determined by variables in ATOVS_ENV7; you may change them if needed.

Other environment variables include NFORPERDAY (the number of forecasts per day, for MAIA2.1) and MAIA3_NFORPERDAY (the equivalent for MAIA3). For MAIA2.1 you should also check that the variables in AAPP/include/maia.h are appropriate for your forecast files (e.g. the number of lines/columns/levels in the forecast grid). Similarly, for MAIA3 you should check the values in AAPP/src/mod_maia.F90.

A test case, with forecast files, is provided for AAPP v7 that runs both MAIA2.1 and MAIA3. If forecast files are not available, or if AAPP has not been built with the GRIB_API library, then climatology is used.

For information about availability of ECMWF forecast files, please consult the ECMWF web site or submit an enquiry to the ECMWF calldesk. The NWP SAF is not able to supply this data routinely. See also the AAPP Data Formats Document (NWPSAF-MF-UD-003), which provides details of which model fields are used by MAIA.

For MAIA4, AAPP can download GFS forecast files from the Internet. To enable this, set the environment variable MAIA4_USE_GFS="yes". You can also set MAIA4_REMOTE_GFS_DIR to a URL: by default it is set to http://jpssdb.ssec.wisc.edu/cspp_v_2_0/ancillary/ but it can be set to http://jpssdb.ssec.wisc.edu/ancillary/ to process old data.
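For example, to enable GFS downloads and point at the archive used for old data:

export MAIA4_USE_GFS="yes"
export MAIA4_REMOTE_GFS_DIR=http://jpssdb.ssec.wisc.edu/ancillary/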

3.8.3 Atlas and threshold files for MAIA

For MAIA2.1, the atlas database is stored in tar form as a file atlas.dat. This file is found in directory AAPP/src/preproc/libavh2hirs_maia_2.1, and copied to the AAPP/data/preproc directory during installation. The contents are automatically unpacked the first time you run AVH2HIRS (see AAPP/src/preproc/bin/maia2_env.ksh).

In the case of MAIA3, the atlas database is substantially larger, and it is distributed as a separate file. Save the file on your computer. To unpack it, go to the AAPP top directory (i.e. the directory containing AAPP, metop-tools and iasi-tools) and type
tar -xvzf [dir]/AAPP_MAIA3_data.tgz

where [dir] is the directory where you have stored the .tgz file. MAIA4 also has a separate atlas database file.

3.8.4 Use of Fortran90
MAIA3 and MAIA4 are written in Fortran90. If you are using a Fortran77 compiler, the AAPP/Makefile will automatically be adjusted when you run "make" such that the MAIA3 and MAIA4 directories are not compiled. If you subsequently decide to use a Fortran90 compiler you can restore the makefiles to their original state by typing
perl Makefile.PL

3.8.5 Debug mode for MAIA3
In the ATOVS_ENV7 file there are two debug variables, AVHRRINidbg and MAIA3idbg.

For AVHRRINidbg:
= 0 : no debug file
= 1 : a debug file exists but nothing is written
= 2 : a debug file exists; headers and records corresponding to AVHRR line1 up to AVHRR line2 are written, and only channel data from pixel pix1 to pix2 are written

For MAIA3idbg:
= 0 : no debug file
= 1 : a debug file exists and some information is written
= 2 : a debug file exists; headers and records corresponding to LINE1idbg up to LINE2idbg are written to two files on units PRINT1unit and PRINT2unit, and only channel data from pixel PIX1idbg to PIX2idbg are written
= 3 : a debug file exists and a large amount of information is written

For deeper debugging of MAIA3, there is a Fortran variable maia_idbg in maia3_main.F90, just before the call to the maia subroutine. This variable is hard-coded to 0, but you can set it to 1 or 2 and then recompile (type make in the top directory). Be aware that the resulting log file can be very large, which is why the variable is set to 0 by default.

3.9 HDF5 libraries

The HDF5 format is increasingly being used as a standard for self-describing datasets. HDF5 is now one of the standards for NPP and FY-3 data, and is also used by EUMETSAT in the dissemination of eigenvector datasets for Principal Component compression of IASI data.

To work with these files the user will need to install the HDF5 library (see section 3.4) and link to it while building AAPP (section 3.5). Either the source code or pre-built binaries may be used. You will also need the szip compression library. If your system is not already set up to use HDF5 then you may need to set up your $LD_LIBRARY_PATH environment variable at run time, and when building AAPP, as in the example of section 3.4.

Note that older versions of the HDF5 library (1.8.1 and earlier) also required the SZIP library to be included in $LD_LIBRARY_PATH.

A code fragment to set up $LD_LIBRARY_PATH is included in the default ATOVS_ENV7 file, and may be uncommented and customized as required. HDF5 utilities such as h5dump can be found in ${HDF5}/bin. You may wish to add this directory to $PATH.
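A minimal sketch of those run-time settings (the HDF5 path is an example; customise for your system):

HDF5=/path_to_hdf5
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${HDF5}/lib
export PATH=$PATH:${HDF5}/bin     # for utilities such as h5dump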

Before building AAPP you should run the configure command using the --external-libs and --external-includes options, e.g.:

./configure \
--external-libs="-L${BUFR}/lib -lbufr -L${HDF5}/lib -lhdf5" \
--external-includes="-I${HDF5}/include" \
--[other options]

It is assumed in the above that you also require the BUFR library (see below). As noted earlier, some systems may also need the external-libs flags -lz -lsz.

3.10 ECMWF libraries

To install the ECMWF libraries, you should visit ECMWF Software Services and download either BUFRDC or EMOSLIB; you may also need GRIB_API. As a general rule, it is recommended to use the latest release. Install the required package(s) following the ECMWF documentation, and check that the compiler used when building the ECMWF library is compatible with the compiler you wish to use for AAPP. Note that the EMOS library requires a Fortran 90 compiler, whereas the BUFR library can be compiled under Fortran 77. If the default compiler is not suitable for your system you may be able to use a different one by editing the appropriate file in the config directory; e.g. for Linux the file to edit is config/config.linux.in. For example, typical commands to build and install the BUFR library would be:

tar -xzf bufr_000400.tar.gz  #having downloaded to a suitable directory
cd bufr_000400
./build_library    #answering the various questions, e.g. installation directory
./install

You should define an environment variable BUFR_TABLES pointing to the directory where the BUFR tables are installed. For other optional run-time environment variables applicable to the encoding program (e.g. originating centre and version numbers), please see the comments in the aapp_encodebufr_1c script.
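For example (a sketch; the path is illustrative, and note that some versions of the BUFRDC software expect a trailing slash):

export BUFR_TABLES=/opt/bufr/bufrtables/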

The way that the ECMWF libraries handle BUFR tables has changed over the years. Prior to BUFR version 000320 the BUFR tables were supplied as ASCII files but needed to be converted to binary before they could be used. In later versions of the software this conversion is unnecessary: the software uses the ASCII files directly. EUMETSAT currently distributes Metop BUFR data using Master Table 13; you will require tables B0000000000000013000.txt and D0000000000000013000.txt. Master Table 16 is used for NPP data. If the tables that you need are not already in the $BUFR_TABLES directory then you can create them as links to existing files. BUFR tables are supplied with the AAPP test cases.
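For example, if a needed table file is absent, you could link it to an existing one (the "19000" file names below are purely illustrative; link to whichever tables are present on your system):

cd $BUFR_TABLES
ln -s B0000000000000019000.txt B0000000000000013000.txt
ln -s D0000000000000019000.txt D0000000000000013000.txt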

Some 64-bit machines may require additional compiler flags. If you are using a 64-bit HP machine, and version 000387 or earlier of the BUFR library, you should check whether the preprocessor flag JBPW_64 is being used:

  1. Check the BUFR or EMOS configuration file for your system, in the config directory (e.g. config.hppaR64)
  2. By doing a search in the bufrdc directory, check whether this flag is actually used in the source code (e.g. it is not used in BUFR library version 000310 even though it is defined in config.hppaR64 and config.hpia64R64)
  3. Check that the declaration of the PARAMETERs (bufrdc/parameter.F in the newest versions; otherwise in-line code) is consistent with the declaration of the same parameters in the AAPP routines aapp_encodebufr_1c.F and aapp_decodebufr_1c.F. If you need to use the JBPW_64 flag then before building AAPP you should edit Makefile.ARCH and add -DJBPW_64 to the list of FFLAGS. Unfortunately there is no way of doing this automatically, as different versions of the BUFR library have different conventions.

AAPP v6.1 was originally tested with ECMWF BUFR library versions 000270, 000300 and 000310. Subsequently, versions 000320 to 000400 have been tested. If you encounter run-time errors, please check that the sizes of the PARAMETERs declared in aapp_encodebufr_1c.F and aapp_decodebufr_1c.F are large enough (see the comments in the source code).

3.11 Special considerations for Windows PC installation

It is possible to install the core AAPP and metop-tools on a PC running a Unix emulator. Note that the speed will in general not be as good as on a native Linux operating system. Some notes on commonly available emulators are given below; some of these points may apply to other emulators also.

    1.   Microsoft Windows Services for Unix (SFU)

SFU can be installed as an add-on package for PCs running Windows 2000 or XP Professional. The current version is 3.5, available from http://www.microsoft.com/download/en/details.aspx?id=274. The advice in this section is based on the testing of AAPP v6 with SFU during 2006. 

SFU includes a korn shell. It is recommended that you install all the SFU components. By default, SFU creates a HOME directory at "/dev/fs/C/Documents and Settings/username" - type "echo $HOME" to see this. The spaces in the directory name cause problems for some Unix scripts, so you should change it, e.g. by typing,

cd /dev/fs/C
mkdir aapp
export HOME=/dev/fs/C/aapp
Note that Unix directory /dev/fs/C/aapp is equivalent to Windows directory C:\aapp. To make this change permanent, you can create a file .profile in your original HOME directory containing the line "export HOME=/dev/fs/C/aapp". (You could also add a line "export PS1='$PWD> ' " to create a more meaningful prompt).

The g77 compiler is included as part of SFU, so you may wish to configure AAPP for g77. (However, note that on one Met Office PC there was a problem with g77 - any compilation attempt failed with a segmentation fault in cc1; the reason for this is not clear).

When testing AAPP v6 under SFU at the Met Office the following changes were also found to be necessary:

ln -s /usr/local/bin/perl /usr/bin/perl
CFLAGS="<existing flags> -D_ALL_SOURCE"    (i.e. append -D_ALL_SOURCE to the existing CFLAGS)

    2.   Microsoft Subsystem for UNIX-based Applications (SUA)

SUA is part of the Windows operating system for Windows 7, Windows Server 2008 and Vista - see, for example, http://suacommunity.com/sua.aspx. AAPP v7 has not yet been tested on SUA.

    3.   Cygwin

Cygwin is available from http://www.cygwin.com/. As well as standard utilities, you need to install perl, a korn shell (pdksh.exe) and a suitable Fortran compiler. The following changes are needed:

LIB=
ln -s /bin/pdksh.exe /bin/ksh
perl Makefile.PL

3.12 AAPP Directory Structure

After the successful installation of AAPP on your machine you should find on your system the following AAPP directory structure: 

   AAPP_7/
           ATOVS_ENV7
           ATOVS_CONF
           Makefile.ARCH
           config/

           AAPP/
                  bin/
                  man/
                         man1/
                         man3/
                         man5/
                  lib/
                  include/
                  src/
                         decommutation/
                                bin/
                                libdecom/
                         calibration/
                                libavhrcl/
                                libcal/
                                libhirscl_algoV4/
                                bin/
                                libmsucl/
                                libhirscl/
                                libamsuacl/
                                libamsubcl/
                                libmhscl/
                         navigation/
                                libsgp/
                                libbrolyd/
                                libtbus/
                                bin/
                                libephtrack/
                                libnavtool/
                                libnavnoaa/
                                libtle/
                                libspm/
                                libMSLIB77_V3.1/
                         preproc/
                                libatov/
                                libatovpp/
                                libavh2hirs_maia_2.1/
                                libatovin/
                                bin/
                                libmappiasi/
                                libmaia_2.1/
                         tools/
                                libf7nl1b/
                                libf7cp/
                                libf7tp/
                                bin/
                                libf7gp/
                                libsatid/
                                libf7ml/
                                libaappbufr/
                                libaapphdf5/
                         libauxdeliverables/aapp/
                         libgribexdummy/
                         libbufrdummy/
                         maia3/          [from AAPP v6.12]
                                libmaia3/
                                libmaia3_tools/
                                bin/
                         maia4/          [from AAPP v7.5]
                                libmaia4/
                                libaapp_viirs/
                                libmaia4_IO/
                                bin/

                  data/
                         calibration/
                                coef/
                                       amsua/
                                       amsub/
                                       hirs/
                                       msu/
                                       avhcl/
                                       mhs/
                         navigation/
                         preproc/
                  gmt/

           metop-tools/
                  bin/
                  man/
                  lib/
                  include/
                  src/
                         bin/
                         libaapp_avhrrl1b/
                         libccsds/
                         libeps_common/
                         libeps_metopl0/
                         libmetop_amsua/
                         libmetop_avhrr/
                         libmetop_common/
                         libmetop_hirs/
                         libmetop_intex/
                         libmetop_mhs/
                         libobtutc/
                         libeps_avhrrl1b_6.5/
                         libmetop_admin/
           iasi-tools/
                  bin/
                  man/
                  lib/
                  include/
                  src/
                         bin/
                         libcnes_iasi_brd_1.6/
                         libcnes_iasi_brd_1.7/         [from AAPP v6.12]
                         libcnes_iasi_odb_1.4/
                         libcnes_iasi_grd_1.6/
                         libcnes_iasi_grd_1.7/         [from AAPP v6.12]
                         libeps_iasil1c_6.6/
                         libeps_iasil1c_9.0/         [from AAPP v6.12]
                         libcnes_iasi_ctx_1.2/

In the run-time directory (if applicable) you will find:

  ${AAPP_PREFIX}/
           ATOVS_ENV7
           ATOVS_CONF
           AAPP/
                  bin/
                  lib/
                  include/
                  data/
           metop-tools/
                  bin/
                  lib/
                  include/
           iasi-tools/
                  bin/
                  lib/
                  include/


TBUS and TLE files for the test cases are provided as part of the test case (see chapter 4). For running with your own HRPT data you should create the following directories:

    ${DIR_NAVIGATION}/tbus_db   and/or   ${DIR_NAVIGATION}/tle_db

(where DIR_NAVIGATION is defined in ATOVS_ENV7) and place your TBUS or TLE files in monthly subdirectories. In the case of TLE, the "get_tle" script does all this automatically, creating directories as required.
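For example:

mkdir -p ${DIR_NAVIGATION}/tbus_db
mkdir -p ${DIR_NAVIGATION}/tle_db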

3.13 After Installation

After you have executed the installation procedure, a few steps have to be performed before you can run AAPP:

i)   Check whether your installation was successful. Note that the error messages from the Fortran compilations might differ depending on the compiler you are using. Some compilation warnings are to be expected, but you should not see any actual errors. The "make" process will terminate early if there is an error.

ii)  If you have customised the configuration file (e.g. are using a different compiler from those that are supported), it is advisable to run the tools det_ftnfort and det_reclen (both installed into AAPP/bin) to check that you have used the correct convention for Fortran file names and record length specifiers. These scripts will tell you what to do if the values need changing.

iii)  For AAPP versions 1 to 4.3 it was necessary to byte swap certain input files when running on a little-endian machine. For version 4.4 onwards this is performed automatically, but you should be aware that output files will always be in the native byte order of the machine you are running on (except BUFR output, which is character-oriented). If you're not sure whether your machine is big or little endian, type a command such as echo 01 | od -x . The result will be 3130 on a little-endian system and 3031 on a big-endian system.

iv)  If your UNIX system does not have a Korn shell then you will need to adapt the various scripts (e.g. in directory AAPP/bin) to your shell type. You should also modify the corresponding *.ksh scripts contained in subdirectories of AAPP/src, so that your changes are not lost when you rebuild AAPP (e.g. to implement code modifications).

v)  Add directory (destination directory)/AAPP/man/ to your MANPATH list to get access to the AAPP manual pages, e.g. by adding to your .profile:

MANPATH=(destination directory)/AAPP/man:$MANPATH
The man pages are contained in the subdirectories man1/, man3/ and man5/ of directory (dest.-dir.)/AAPP/man. You can display a man page in the usual way by typing, at your shell prompt,

man filename     (without the .suffix)
However, the definitive descriptions of the AAPP commands are given in the software description document, rather than the man pages. And not all commands have a man page.

vi)  To allow you to call AAPP scripts and binaries directly, it is recommended that you source the ATOVS_CONF file as part of your own setup, as described in section 3.7.

vii)  Orbital prediction:

Three orbital prediction methods are implemented in AAPP: TBUS, Two-Line Element (TLE) and a method referred to as the "SPOT model". The SPOT model is applicable to Metop-A only; it is being phased out and will not be used for Metop-B or C.

TLE is recommended as an alternative to TBUS since the accuracy is usually better. The data may be downloaded from the Space-Track web site using AAPP script get_tle, or alternatively from Celestrak. You will need to obtain a user name and password for the Space-Track site, which should be entered in ATOVS_ENV7 under parameters PAR_NAVIGATION_TLE_USER and PAR_NAVIGATION_TLE_PASSWD. This assumes that the computer on which you run AAPP has internet access; if it does not then you will need to download the files using a different computer and transfer them.

To choose which method to use for a given satellite, just modify the ATOVS_ENV7 variable PAR_NAVIGATION_DEFAULT_LISTEBUL. In AAPP v6.12 and v7.1 the default is set to:

PAR_NAVIGATION_DEFAULT_LISTESAT='noaa19 noaa18 noaa17 noaa16 noaa15 M04 M02 M01'
PAR_NAVIGATION_DEFAULT_LISTEBUL='tle tle tle tle tle tle tle tle'
AAPP will use TBUS bulletins for satellites for which "tbus" is specified, TLE for satellites with "tle" and SPOT for satellites with "spm". These parameters are used by the commands AAPP_RUN_NOAA, AAPP_RUN_METOP, alleph, amsuacl, amsubcl, hirscl, mhscl and avhrcl. These programs are not applicable to the NPP satellite, and therefore NPP is not included in the PAR_NAVIGATION_DEFAULT_LISTESAT list. But you can generate satpos files for NPP in the usual way, by running tleing -s NPP and satpost -s NPP.

There are several other optional environment parameters related to TLE - get_tle will provide default values if they are not defined by the user or in ATOVS_ENV7. The defaults are given below:
export DIR_DATA_TLE=${DIR_NAVIGATION}/tle_db    #directory to store data
export PAR_NAVIGATION_TLE_TIMEOUT=60            #connection timeout in seconds
export PAR_NAVIGATION_TLE_URL_DOWNLOAD='https://www.space-track.org'
export PAR_NAVIGATION_TLE_CATALOGUE='25338,26536,27453,28654,33591,37849,29499,38771,27431,32958,37214,25994,27424'   #Catalogue numbers
export PAR_NAVIGATION_TLE_URL_LOGIN='https://www.space-track.org/perl/login.pl';     #URL for login (not used with new space-track site)

viii)  Choice of HIRS calibration algorithm:

The HIRS calibration algorithm is selected by means of the environment variable HIRSCL_VERSION in ATOVS_ENV7. To select the old algorithm, as used in AAPP v4 and earlier, set the value to 0; to select the newer algorithm, set it to 1. The new algorithm accumulates data on calibration runs from previous passes, and is designed to give better consistency with global data, especially for partial super-swaths. There are various other parameters available for the new algorithm - see the comment lines in ATOVS_ENV7. If in doubt, use the defaults.

ix) Locale

On some systems the unix "sort" command (which is used by several AAPP scripts) may give unexpected answers. You should check your locale settings (type locale): you should see LC_ALL=C or LANG=C (with LC_ALL undefined). If this is not the case, add LC_ALL=C to your shell startup script (or add it to ATOVS_ENV7).

3.14 A note on the Community Satellite Processing Package (CSPP) and HDF utilities

The CSPP is supplied by the University of Wisconsin, for level 1 and level 2 processing of Suomi-NPP and JPSS direct readout data. For detailed information, see the CSPP web page. A detailed description of CSPP is beyond the scope of this document, but it is worth pointing out some matters relevant to AAPP.

Firstly, CSPP includes pre-built copies of various libraries, including the HDF5 library (C interface, not Fortran90), SZIP, JasPer, etc. You may wish to link to these when building AAPP.

Secondly, CSPP includes the HDF5 tools, which can be used to complement AAPP. These are in directory CSPP1.3/SDR_1_3/common/local/bin/. You may wish to add this directory to your $PATH. Of particular relevance is the utility nagg, which can be used to concatenate NPP products and granules. For example, to concatenate granules of VIIRS channels M9, M10 and the terrain-corrected geolocation:

nagg -g 0 -n 100 -O cspp -D dev -t GMTCO,SVM09,SVM10 GMTCO*.h5 SVM*.h5

For more detail on this utility, see the corresponding HDF group web page.

Chapter 4:  Testing