Develop new configurations for UFS‐DATM+SCHISM+WW3


WARNING: THIS PAGE IS CURRENTLY IN ACTIVE DEVELOPMENT. USE THESE INSTRUCTIONS WITH CAUTION AT THIS TIME. CHECK BACK IN FOR A FINALIZED VERSION.


UFS-Coastal is a modeling framework for coupled coastal applications and regional prediction systems. It contains coupled model components that link the atmospheric, ocean, and terrestrial realms under one common framework, along with data components and an ESMF/NUOPC-based mediator component. The framework includes two types of components: 1) one-way and two-way coupled model components (coupled via the mediator); 2) data components that pass forcing data, as required, to the model components.

UFS‐DATM+SCHISM+WW3

UFS-Coastal

To download the application, use the “git” command:

git clone --recurse-submodules https://github.com/oceanmodeling/ufs-weather-model.git

(To clone the coastal development branch directly: git clone --recurse-submodules -b feature/coastal_app https://github.com/oceanmodeling/ufs-weather-model.git)


  • This command will download UFS-Coastal and all of its components into the directory “ufs-weather-model”
  • Pay attention to the “--recurse-submodules” option in the git command; it is required to successfully download the model components that are defined as “submodules” of the application
  • Omitting this option will result in empty directories in “ufs-weather-model” for all the submodules

To update UFS-Coastal in an existing “ufs-weather-model” directory, use “git pull” (followed by “git submodule update” to bring the submodules up to date).
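For example, from the top of an existing clone:

cd ufs-weather-model
git pull
git submodule update --init --recursive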

If the user wants UFS-Coastal to be installed locally under another name, the above “git” command can be modified as: git clone --recurse-submodules https://github.com/oceanmodeling/ufs-weather-model.git other_name, where “other_name” is the name of the directory to download (clone) the application into (this does not affect how UFS-Coastal is used).

Prerequisites

  • The application is designed exclusively to use the spack-stack framework. spack-stack needs to be installed on the HPC cluster, desktop, or any other platform before UFS-Coastal can be used. spack-stack installs all library dependencies and the “Lmod” module environment system that is used to load the required libraries for UFS-Coastal (spack-stack is not limited to UFS-Coastal use only).
  • One or both of the supported compiler options (GNU, Intel) should be available in the host operating system (OS). It is recommended that recent versions of the GNU or Intel compilers are used. The UFS-Coastal team has extensively tested UFS-Coastal and its model components using different versions of the supported compilers; issues were resolved and updates were pushed to the model component GitHub repos. Testing is always an ongoing process for the application.
  • A working MPI (Message Passing Interface) installation. Some common MPI libraries are OpenMPI, MPICH, MVAPICH, and Intel MPI (impi). The MPI library must be compatible with the compilers used (usually different versions of the libraries are installed in the host OS) – this is important.
  • Recent versions of git and git-tools.

Compilers

  • GNU family of compilers: version 8.x and above is recommended
  • Intel family of compilers: version 19.x and above is recommended

Platforms

Since UFS-Coastal is developed as a fork of the UFS Weather Model, the same platforms supported by the UFS Weather Model (see the list of supported platforms) are also supported by UFS-Coastal.

All Tier 1 HPC clusters are supported (cloud clusters included).

Modulefiles for preconfigured platforms are located in modulefiles/ufs_<platform>.<compiler>. For example, to load the modules from the ufs-coastal directory on MSU’s Hercules:

cd ufs-coastal
module use modulefiles
module load ufs_hercules.intel

Build System Interface

  • CMake is used to drive the build (compilation) process.
  • Built-in CMake modules have been developed to drive all “application”-based compilations. The CMake modules are continually being updated to include all options available in the model components (the goal is to be able to use the component(s) under a variety of different configurations).
  • Bash scripting is used at the top level to drive the CMake build process and the configuration of a specific application supported in UFS-Coastal.
  • The ufs-weather-model/tests/rt.sh script is used to compile and run a regression test case (pre-configured test cases).
  • The ufs-weather-model/tests/compile.sh script can be used to only compile the UFS-weather-model executable.

The user is responsible for setting up and configuring all model components for the target application.

Use-case workflows

Use case: I want to start with an existing DATM+SCHISM+WW3 regression test (RT) but swap in my own atm data and mesh:

After cloning UFS-Coastal, go to the tests folder:

cd ufs-weather-model/tests/

Modify rt.sh in this folder: the input data location is given by DISKNM=/work/noaa/nems/tufuk/RT, and the output location should be set to your own folder, e.g. dprefix=/work2/noaa/nos-surge/yunfangs/demo


To run the regression test:

./rt.sh -a nos-surge -l rt_coastal.conf -c -k -n "coastal_ike_shinnecock_atm2sch2ww3 intel"


Now the RT for atm2sch2ww3 has finished successfully.

Then we go to the run folder reported in the rt.sh output, e.g. /work2/noaa/nos-surge/yunfangs/demo/stmp/yunfangs/FV3_RT/rt_3398111, and then to the job folder: /work2/noaa/nos-surge/yunfangs/demo/stmp/yunfangs/FV3_RT/rt_3398111/coastal_ike_shinnecock_atm2sch2ww3_intel


In this folder:
  • fv3.exe: the UFS model executable for atm2sch2ww3
  • PET??.ESMF_LogFile: the ESMF log file for each processor
  • datm.log: the CDEPS log file
  • mediator.log: the CMEPS log file
  • log.ww3: the WW3 log file
  • 20080*.out_grd.ww3.nc: the WW3 output files
  • schout_00000?_?.nc: the SCHISM output files (OLDIO format)
  • fd_ufs.yaml: the field dictionary YAML file

To develop our own case using the RT as a template:

We can copy the job folder:

cp -r coastal_ike_shinnecock_atm2sch2ww3_intel coastal_ike_shinnecock_atm2sch2ww3_intel_copy

Then we need to remove all the output and log files:

rm -rf *log* *Log* ufs*.nc 2008* outputs/*


Now we need to replace the following with our own input files:

WW3 input files: inlet.msh, mod_def.ww3, ww3_shel.nml

SCHISM input files: albedo.gr3, bctides.in, hgrid.gr3, hgrid.ll, manning.gr3, param.nml, vgrid.in, watertype.gr3, windrot_geo2proj.gr3

For the coupling parts: datm_in, datm.streams, fd_ufs.yaml, model_configure, ufs.configure

datm_in

datm_in contains two namelist arguments, model_maskfile and model_meshfile; both must point to your new mesh file. It also includes arguments for the domain size, nx_global and ny_global, which you can check for your particular mesh using, for example, ncdump (there is a netCDF variable called origGridDims):

ncdump -v origGridDims mesh.nc

This gives the dimensions nx and ny. Don’t touch any other namelist options in datm_in.
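A minimal sketch of the entries to change is below (assuming the CDEPS &datm_nml namelist group; the mesh file name and dimensions are placeholders for your own values, and all other entries from the RT-generated file stay untouched):

&datm_nml
  model_maskfile = "your_new_mesh.ESMFmesh.nc"   ! must point to your new mesh
  model_meshfile = "your_new_mesh.ESMFmesh.nc"   ! must point to your new mesh
  nx_global = 168                                ! from ncdump -v origGridDims
  ny_global = 112
/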


datm.streams

All the exchanged fields are defined in SCHISM-ESMF/src/schism/schism_nuopc_cap.F90; look for state=importState in this file. Note that: 1) the fields are model dependent (e.g. ROMS requires more fields than SCHISM); 2) the variables currently exchanged with SCHISM do not include what is needed for many of the SCHISM modules (e.g. temperature, precipitation, etc.). CDEPS does the time and spatial interpolation based on the interpolation parameters set in this file; e.g., the input data could be every 3 hours while the coupling interval is set differently. In this file, check and set the following:
  • stream_mesh_file01: “name_of_your_new_mesh.nc”
  • Check the time dimension in your input data file: it needs a time:calendar attribute, and the units are important (note that CDEPS follows the CF standard for netCDF files)
  • stream_data_files01: the input data file(s)
  • Change the dates in datm.streams to match your run year: yearFirst01, yearLast01, yearAlign01
  • stream_offset01 could be important, but should probably be set to 0
  • stream_data_variables01: “variable_from_data variable_used_by_CDEPS”
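For illustration, a hedged sketch of one stream block with these entries filled in (the file names, year, and variable pairs are placeholders; all other stream parameters from the RT-generated file stay as they are):

stream_mesh_file01:      "your_new_mesh.ESMFmesh.nc"
stream_data_files01:     "your_atm_data_2008.nc"
stream_data_variables01: "u10 Sa_u10m" "v10 Sa_v10m" "msl Sa_pslv"
stream_offset01:         0
yearFirst01:             2008
yearLast01:              2008
yearAlign01:             2008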



model_configure

Change the forecast length via the nhours_fcst entry.
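A minimal sketch of the relevant model_configure entries is below (the start date and forecast length are placeholder values and must match your own simulation period):

start_year:   2008
start_month:  09
start_day:    05
start_hour:   00
start_minute: 0
start_second: 0
nhours_fcst:  24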

ufs.configure


The following need to be modified: the total number of cores implied by the *_petlist_bounds entries needs to be consistent with the job submission script. The MED_petlist_bounds should be consistent with the job_card, i.e. span the total number of processors used. The ATM_petlist_bounds gives the processors used by CDEPS, the OCN_petlist_bounds gives the processors used by the ocean component (here SCHISM), and the WAV_petlist_bounds gives the processors used by WW3.
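As an illustration, a hedged sketch of the PET layout for a hypothetical 18-core job (the core counts are placeholders and must add up to the total in your job_card; keep the rest of the RT-generated ufs.configure as-is):

# 18 cores total: 6 for CDEPS, 6 for SCHISM, 6 for WW3; the mediator spans all of them
ATM_model:          datm
ATM_petlist_bounds: 0 5
OCN_model:          schism
OCN_petlist_bounds: 6 11
WAV_model:          ww3
WAV_petlist_bounds: 12 17
MED_model:          cmeps
MED_petlist_bounds: 0 17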

mesh_wav is the WW3 mesh file produced by ESMF:

For a curvilinear grid you can use NCL, which has functions that create a SCRIP grid definition file (an additional netCDF file that defines your grid), i.e. scrip.nc. Then use the ESMF offline tools to convert that file to an ESMF mesh file. After git clone-ing UFS-Coastal into the ufs-coastal directory, run:

module use modulefiles
module load ufs_<platform>.intel
ESMF_Scrip2Unstruct scrip.nc ESMFmesh.nc 0 ESMF
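As a quick, optional sanity check (assuming the standard ESMF unstructured mesh format, which stores nodeCount and elementCount dimensions), you can inspect the generated file:

ncdump -h ESMFmesh.nc | grep -iE 'nodeCount|elementCount'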

Run the new build:

./compile.sh <platform> "-DAPP=CSTLS -DNO_PARMETIS=OFF -DOLDIO=ON" "NewRunName" intel YES NO

  • Add to this all SCHISM inputs and atm data.
  • Generate the ESMFmesh netCDF file for that atm dataset and store it in the work directory.
  • Copy these to the work directory: model_configure, ufs.configure, datm_in, datm.streams, fd_ufs.yaml, *.job files. One way to get these files is to run a pre-configured test case; they will be saved in the stmp/.../FV3_RT/rt_*/coastal_ike_shinnecock_atm2sch_intel directory. Then modify the files as needed.
  • Compile SCHISM using the compile.sh script, e.g. ./compile.sh hera "-DAPP=CSTLS -DNO_PARMETIS=OFF -DOLDIO=ON" "coastalS" intel YES NO. This will create a build folder build_fv3_coastalS and the executable fv3_coastalS.exe, and will also build the SCHISM tools in build_fv3_coastalS/bin.
  • Copy all the resulting binaries into a new work/bin directory.
  • Submit the simulation run (sbatch) from within work, as shown in the sketch below.
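A hedged shell sketch of these copy-and-submit steps (RUNDIR, WORK, and the tests path are hypothetical placeholders for your own directories):

# Placeholders: point RUNDIR at the RT run directory and WORK at your work directory
RUNDIR=/path/to/stmp/$USER/FV3_RT/rt_XXXXXX/coastal_ike_shinnecock_atm2sch_intel
WORK=/path/to/work
mkdir -p $WORK/bin
cd $RUNDIR
cp model_configure ufs.configure datm_in datm.streams fd_ufs.yaml *.job $WORK/
# executable and SCHISM tools produced by compile.sh under tests/
cp /path/to/ufs-weather-model/tests/fv3_coastalS.exe $WORK/
cp /path/to/ufs-weather-model/tests/build_fv3_coastalS/bin/* $WORK/bin/
cd $WORK && sbatch ./*.job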

Additional notes:
  • Combining SCHISM output (for OLDIO, which UFS-Coastal currently requires) requires a complete “stack” (no partial combine): combine_output11_MPI -n 44 -b 1 -e 31 -w 1
  • In ufs.configure, set debug_level to 10 in the OCN section to get .vtk files for each variable

In ufs-weather-model, rt.sh looks for the baseline in a common location. The baseline directory is created once by running the regression test with the “-c” option. Note that the baseline is compiler-specific (the intel baseline differs from gnu) and also differs between platforms.

Use case: I want to create a new configuration using compile.sh (datm+schism+ww3)

To compile the source codes:

Run the new build in the folder /work2/noaa/nos-surge/yunfangs/demo/ufs-weather-model/tests:

./compile.sh <platform> "-DAPP=CSTLSW -DUSE_ATMOS=ON -DUSE_WW3=ON -DNO_PARMETIS=OFF -DOLDIO=ON -DPDLIB=ON -DBUILD_UTILS=ON" "NewRunName" intel YES NO

This produces fv3_NewRunName.exe as the executable file in /work2/noaa/nos-surge/yunfangs/demo/ufs-weather-model/tests

  • Add to this all SCHISM inputs, WW3 inputs, and atm data. Generate the ESMFmesh netCDF file for that atm dataset and store it in the work directory, following the same procedure as above.
  • Copy the following files from the RT test to the work directory: model_configure, ufs.configure, datm_in, datm.streams, fd_ufs.yaml, *.job files. To get these files, run a pre-configured test case; they will be saved in the stmp/.../FV3_RT/rt_*/coastal_ike_shinnecock_atm2sch_intel directory. Then modify the files as needed, following the instructions above.
  • The compile step above creates a build folder build_fv3_NewRunName and the executable fv3_NewRunName.exe, and (with -DBUILD_UTILS=ON) also builds the SCHISM tools in build_fv3_NewRunName/bin.
  • Copy all the resulting binaries into a new work/bin directory.
  • Submit the simulation run (sbatch) from within work.

The additional notes above (combining SCHISM output for OLDIO and the regression-test baseline location) apply to this use case as well.
