WRF atmospheric model

This page outlines how to use WRF (http://wrf-model.org) to generate forcing for FVCOM.

Download

WRF requires registration before you are able to download the code.

For a minimal working model run, you need to download:

  • WRF-ARW - The main WRF model
  • WPS - The WRF preprocessing tool
  • geog - The geography data. Either use the full dataset (~48GB uncompressed) or the minimal dataset (much smaller).

There are also libraries required for WRF:

  • netCDF
  • JasPer
  • zlib
  • libpng
  • mpich

These are available from the WRF website.

Compilation

Libraries

On a desktop Fedora Linux machine, the system mpich is used for WRF, but the other dependencies are compiled from source. An example script shows the steps to build the libraries on Fedora Linux; it should be placed in a new directory within the directory where WRF has been downloaded: $HOME/Models/WRF/LIBRARIES.
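
The details depend on the distribution and library versions, but a minimal sketch of the kind of steps such a build script performs is shown below. The version numbers, tarball names and the gcc/gfortran compilers are placeholders, not recommendations; use whatever matches your system.

    # Build the WRF dependencies into a common prefix (versions/paths are examples only).
    export DIR=$HOME/Models/WRF/LIBRARIES
    export CC=gcc CXX=g++ FC=gfortran F77=gfortran

    tar xzf zlib-1.2.11.tar.gz && cd zlib-1.2.11 && \
        ./configure --prefix=$DIR && make && make install && cd ..

    tar xzf libpng-1.6.37.tar.gz && cd libpng-1.6.37 && \
        CPPFLAGS=-I$DIR/include LDFLAGS=-L$DIR/lib ./configure --prefix=$DIR && \
        make && make install && cd ..

    tar xzf jasper-1.900.1.tar.gz && cd jasper-1.900.1 && \
        ./configure --prefix=$DIR && make && make install && cd ..

    tar xzf netcdf-4.1.3.tar.gz && cd netcdf-4.1.3 && \
        ./configure --prefix=$DIR --disable-dap --disable-netcdf-4 && \
        make && make install && cd ..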

WRF

WRF must be compiled after the libraries have been built and installed (if the scripts in this wiki have been used, that is $HOME/Models/WRF/LIBRARIES). The main WRF source code (the WRFV3 directory) lives in $HOME/Models/WRF. We used a build script to compile WRF for use in parallel across a cluster (i.e. distributed memory rather than shared memory).
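
Roughly, the build script does something like the following. The library paths and the choice of a distributed-memory (dmpar) configure option are assumptions; pick whatever matches your compiler and MPI setup.

    # Point WRF at the locally built libraries (paths are assumptions).
    export DIR=$HOME/Models/WRF/LIBRARIES
    export NETCDF=$DIR
    export JASPERLIB=$DIR/lib
    export JASPERINC=$DIR/include

    cd $HOME/Models/WRF/WRFV3
    ./configure                        # choose a dmpar option for distributed-memory runs
    ./compile em_real >& compile.log   # builds real.exe, wrf.exe, ndown.exe and tc.exe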

WPS

To prepare the data required for a WRF run, WPS needs to be installed. WPS must be compiled after WRF itself, as WPS uses routines from WRF. To compile WPS, we placed its source code in the $HOME/Models/WRF directory and used another build script to build the necessary binaries.
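
Schematically, the WPS build amounts to the following (again, the choice of configure option is an assumption; it should be consistent with the WRF build):

    cd $HOME/Models/WRF/WPS
    ./configure               # pick an option consistent with the WRF build
    ./compile >& compile.log  # builds geogrid.exe, ungrib.exe and metgrid.exe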

WPS Grid Configuration

The first step in running WRF is to generate the necessary input files with WPS. An example namelist.wps shows a three-domain nested configuration for the UK shelf. Use geogrid.exe to generate the netCDF grids used as input for the main WRF model.
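
With namelist.wps in place, the grid generation step is roughly as follows (this assumes the geography data path has been set in the &geogrid section of namelist.wps):

    cd $HOME/Models/WRF/WPS
    ./geogrid.exe            # reads namelist.wps and the geography data
    ls geo_em.d0?.nc         # one grid file per domain, e.g. geo_em.d01.nc ... geo_em.d03.nc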

Forcing data

NCEP FNL global output

The easiest way to force WRF is with GRIB-formatted data. The NCEP FNL Operational Model Global Tropospheric Analyses data (July 1999 to present) are available for free from the CISL Research Archive. The data require a (free) CISL account; once an account has been granted, the data can be batch downloaded. There is a shell script which can be used to download the data for a given period here.

Once the data have been downloaded, they need to be linked into the WPS directory (if you have been using the scripts in this wiki, that is $HOME/Models/WRF/WPS) with the WRF-provided link_grib.csh script. A Vtable file must be symlinked into the $HOME/Models/WRF/WPS directory first (for the NCEP FNL data, that file is available from the CISL site), and the example namelist.wps must also be in the directory with the output from link_grib.csh. Following that, the ungrib.exe binary needs to be run to extract the necessary variables from the GRIB files and write them in the intermediate format WRF expects. Finally, metgrid.exe needs to be run to interpolate the forcing data onto the model grids.
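
Put together, the sequence looks roughly like this (the GRIB file paths and the Vtable location are placeholders):

    cd $HOME/Models/WRF/WPS
    ln -sf /path/to/Vtable Vtable        # the Vtable for the NCEP FNL data
    ./link_grib.csh /path/to/fnl/fnl_*   # creates GRIBFILE.AAA, GRIBFILE.AAB, ...
    ./ungrib.exe                         # extracts the variables listed in the Vtable
    ./metgrid.exe                        # interpolates onto the geo_em.d0*.nc grids -> met_em.d0*.nc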

An example script which runs the link_grib.csh, ungrib.exe and metgrid.exe steps all together can be found here. It requires a directory layout as follows:

    .
    ├── bin
    ├── grids
    └── templates

where bin contains the link_grib.csh script and the ungrib.exe and metgrid.exe binaries, grids contains the output of model grid generation (geogrid.exe) and templates contains a template namelist.wps and Vtable.

Outputs from the preparation script will be placed in a directory for each year being processed.

ECMWF ERA-Interim

Alternatively, ECMWF's ERA-Interim data can be used to force WRF. The process is similar to that for the NCEP FNL data. This table shows the variables required for a WRF run forced with ERA-Interim.

As with the NCEP FNL data, the process is to download the required variables and convert them from GRIB to the WRF intermediate format with ungrib.exe. In addition, the ECMWF coefficients must be applied to the files with calc_ecmwf_p.exe before interpolating with metgrid.exe.

As with the NCEP data, these scripts assume a directory structure of:

    .
    ├── bin
    ├── grids
    └── templates

Place the template files from the pages linked below into the templates directory, the WPS pre-processing binaries (ungrib.exe, metgrid.exe and calc_ecmwf_p.exe) into bin and your WRF grid netCDF files into grids.

This bash script automatically downloads the ECMWF ERA-Interim data necessary for WRF, using the templates in the templates directory. A separate script converts those raw ECMWF ERA-Interim files from GRIB to the WRF intermediate format. Finally, this script interpolates the WRF intermediate files into WRF forcing files.
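
The extra ERA-Interim step sits between ungrib.exe and metgrid.exe. Schematically, and assuming the intermediate files and a file of ECMWF coefficients (ecmwf_coeffs) are already in the WPS directory:

    cd $HOME/Models/WRF/WPS
    ./ungrib.exe             # ERA-Interim GRIB -> WRF intermediate format
    ./calc_ecmwf_p.exe       # reads ecmwf_coeffs, writes PRES:* intermediate files
    ./metgrid.exe            # interpolates everything onto the WRF grids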

Please see the compilation section for instructions on compiling WRF, or follow the official instructions.

Running WRF

WRF is actually two programs: real.exe and wrf.exe.

real.exe preprocesses the input files and prepares them for the main wrf.exe program; it only needs to be run for a 'cold start' (i.e. a run starting from the global initial conditions).

wrf.exe is the actual WRF model. It can be run from a cold start (with a wrfbdy file) or from a restart (with a set of wrfrst files, e.g. wrfrst_d01_2003-01-01_00:00:00, one for each domain you have).
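
A typical cold-start run therefore looks something like the following. The processor count and run directory are assumptions; the met_em.d0*.nc files from metgrid.exe and a matching namelist.input must be present in the run directory.

    cd /path/to/wrf/run
    mpirun -np 16 ./real.exe    # cold start only: writes wrfinput_d0* and wrfbdy_d01
    mpirun -np 16 ./wrf.exe     # the model itself: writes wrfout_d0* (and wrfrst_* restarts)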

To run WRF on ARCHER, we have written a PBS submission script which will run WRF for a given period (configured in the PBS script). It requires the model namelists (namelist.input) to exist in a directory called models, with the namelist files named $year-$month.input (one per month of each year being run). Outputs are stored in output/$year and logs (rsl.out.* and rsl.error.*) are moved to logs. The input files (the output of metgrid.exe) should live in a directory called input/$year. In summary, the directory structure for use with the examples in this wiki is:

    run
       ├── input
       │   ├── 2000
       │   ├── ...
       │   └── 2010
       ├── launch
       │   └── wrf_2000-2010.pbs
       ├── logs
       ├── models
       │   ├── 2000-01.input
       │   ├── 2000-02.input
       │   ├── ...
       │   ├── 2010-11.input
       │   └── 2010-12.input
       └── output
           ├── 2000
           │   ├── 01
           │   ├── ...
           │   ├── 12
           │   └── restart
           ├── ...
           └── 2010
               ├── 01
               ├── ...
               ├── 12
               └── restart
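
The PBS script itself is not reproduced here, but a heavily simplified sketch of what it does for a single month is shown below. The ARCHER aprun call, core counts and the hard-coded year/month are assumptions; the real script loops over the configured period and handles restarts.

    #!/bin/bash
    #PBS -N wrf_2000-01
    #PBS -l select=4
    #PBS -l walltime=24:00:00

    year=2000; month=01
    cd /path/to/run                               # the run directory sketched above
    cp models/${year}-${month}.input namelist.input
    ln -sf input/${year}/met_em.d0*.nc .          # forcing for this year from metgrid.exe
    aprun -n 96 ./wrf.exe                         # run the month
    mkdir -p output/${year}/${month}
    mv wrfout_d0* output/${year}/${month}/
    mv rsl.out.* rsl.error.* logs/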

Post-processing output

FVCOM can't use WRF output directly. Instead, FVCOM provides a number of routines to convert the WRF output to the formats and variable types anticipated by FVCOM. The principal one of use is wrf_to_fvcom.f90, which can be found in the FVCOM source code. This is a serial program, so compile it without MPI.

To simplify this process somewhat, we've included a makefile for the wrf_to_fvcom.f90 program, which can be found in the FVCOM source code directory. We also modified wrf_to_fvcom.f90 to output pressure in millibars and to take a new option (-latitude) to specify the central latitude at which to compute the heat flux parameters (previously, it defaulted to Georges Bank). The number of iterations (nits) in the COARE 2.6 routines has also been reduced from 6 to 3 to improve the stability of the solution (6 iterations sometimes generated NaN values).
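
As a rough guide, compiling and running the converter looks something like the following. The compiler flags, library paths, file names and all command-line arguments other than -latitude are assumptions; check the makefile and the program's usage message for the exact options.

    # Serial build: no MPI wrapper required (paths and flags are placeholders).
    gfortran -O2 -o wrf_to_fvcom wrf_to_fvcom.f90 \
        -I$HOME/Models/WRF/LIBRARIES/include \
        -L$HOME/Models/WRF/LIBRARIES/lib -lnetcdff -lnetcdf

    # Convert a WRF output file, giving the central latitude of the domain
    # (input/output file names here are illustrative only).
    ./wrf_to_fvcom wrfout_d03_2003-01-01_00:00:00 wrf_forcing_2003-01.nc -latitude 55.0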