C48_S2SW gfs_fcst_seg0 - TerrenceMcGuinness-NOAA/global-workflow GitHub Wiki

C48_S2SW GFS Forecast Segment 0 - Test Case Documentation

Test Case: C48_S2SW-gfs_fcst_seg0.yaml
Configuration: C48_S2SW (coupled atmosphere-ocean-ice-wave: S2S + Wave)
Job: JGLOBAL_FORECAST
Duration: 120-hour coupled forecast (f000-f120)
Status: ✅ VERIFIED CORRECT - No changes needed
Last Updated: October 1, 2025


Overview

This test validates the complete 4-component coupled forecast capability integrating Atmosphere, Ocean, Sea Ice, and Wave models through the UFS Weather Model coupled framework. This is the most comprehensive forecast test, demonstrating full Earth system interactions over a 5-day period.

Total Files:

  • Input: 24 files (4-component initial conditions + wave grid definitions)
  • Output: ~380+ files (multi-component history + restart files)

Coupled System Components

```
UFS Coupled Model (CMEPS Mediator)
    ├─> FV3 Atmosphere (GFS)
    ├─> MOM6 Ocean
    ├─> CICE6 Sea Ice
    └─> WaveWatch III (8 regional/global grids)
```

Coupling Frequency: Components exchange fluxes every model timestep
Coupling Variables: Heat, momentum, freshwater, radiation, surface roughness


File Breakdown

Input Files: 24

| Component | Count | Files | Location |
| --- | --- | --- | --- |
| Atmosphere IC | 13 | gfs_ctrl.nc, gfs_data/sfc_data tiles | gdas/model/atmos/input/ |
| Ocean Restart | 1 | MOM.res.nc | gdas/model/ocean/restart/ |
| Ice Restart | 1 | cice_model.res.nc | gdas/model/ice/restart/ |
| Wave Restart | 1 | restart.ww3 | gdas/model/wave/restart/ |
| Wave Grid Definitions | 8 | mod_def.{grid} files | gdas/model/wave/restart/ |

Wave Grids (8 models):

  • glo_30m - Global 30-minute resolution
  • uglo_100km - Unstructured global 100 km
  • aoc_9km - Arctic-Ocean 9 km
  • gnh_10m - Global Northern Hemisphere 10-minute
  • gsh_15m - Global Southern Hemisphere 15-minute
  • at_10m - Atlantic 10-minute
  • ep_10m - Eastern Pacific 10-minute
  • wc_10m - West Coast 10-minute
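All 8 grid definitions must be staged before the forecast starts, since the wave model needs each grid's definition at startup. A minimal sketch of that check (the restart path and `touch` staging below are illustrative stand-ins, not the real harness):

```shell
# Verify all 8 WW3 mod_def files exist in the wave restart directory.
# COMDIR and the touch staging are stand-ins for this demo.
grids="glo_30m uglo_100km aoc_9km gnh_10m gsh_15m at_10m ep_10m wc_10m"
COMDIR=$(mktemp -d)
for g in $grids; do touch "$COMDIR/mod_def.$g"; done  # demo staging

missing=0
for g in $grids; do
  [ -f "$COMDIR/mod_def.$g" ] || { echo "MISSING: mod_def.$g"; missing=$((missing + 1)); }
done
echo "missing=$missing"
```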

Output Files: ~380+ (Across All Components)

| Component | Files | Frequency | Location |
| --- | --- | --- | --- |
| Configuration | 7 | One-time | gfs/model/*/ |
| Atmosphere History | 123 | 3-hourly | gfs/model/atmos/history/ |
| Ocean History | 20 | 6-hourly | gfs/model/ocean/history/ |
| Ice History | 21 | 6-hourly + IC | gfs/model/ice/history/ |
| Wave uglo_100km | 73 | Hourly→3hr | gfs/model/wave/history/ |
| Wave Points | 121 | Hourly | gfs/model/wave/history/ |
| Other Wave Grids | Variable | Per-grid | gfs/model/wave/history/ |

Total: ~380+ files (exact count varies with wave grid configuration)


Key Insights

Complex Wave Output Pattern

Wave uglo_100km Grid Strategy:

| Time Period | Frequency | Files | Configuration |
| --- | --- | --- | --- |
| f000-f048 (0-2 days) | Hourly | 49 | FHOUT_WAV=1 |
| f051-f120 (3-5 days) | 3-hourly | 24 | FHOUT_WAV_GFS=3 |
| Total | | 73 | |

Note: Gap at f049, f050 during transition to 3-hourly

Configuration Source: parm/config/gfs/config.base.j2

```shell
export FHOUT_WAV=1           # Hourly wave output f000-f048
export FHOUT_WAV_GFS=3       # 3-hourly wave output f051-f120
export FHMAX_HF_WAV=48       # High-frequency cutoff at 48 hours
```
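The resulting output schedule follows directly from these three variables; a plain POSIX shell sketch (not the workflow's actual scheduling code):

```shell
# Derive the uglo_100km output hours: every FHOUT_WAV hour up to
# FHMAX_HF_WAV, then every FHOUT_WAV_GFS hours out to FHMAX.
FHOUT_WAV=1; FHOUT_WAV_GFS=3; FHMAX_HF_WAV=48; FHMAX=120
hours=""; fh=0
while [ "$fh" -le "$FHMAX" ]; do
  hours="$hours $fh"
  if [ "$fh" -lt "$FHMAX_HF_WAV" ]; then
    fh=$((fh + FHOUT_WAV))
  else
    fh=$((fh + FHOUT_WAV_GFS))   # jumps 48 -> 51, skipping f049/f050
  fi
done
n=$(echo $hours | wc -w)
echo "wave output times: $n"     # 49 hourly + 24 three-hourly = 73
```

Note how the step change at f048 lands the next output at f051, producing the f049/f050 gap described above.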

Operational Rationale:

  • Days 1-2: Hourly data critical for wave warnings, ship routing
  • Days 3-5: 3-hourly sufficient for extended wave forecasts
  • Storage savings: ~3-4 GB saved while maintaining operational utility

Why Different Component Frequencies?

| Component | Frequency | Reasoning |
| --- | --- | --- |
| Atmosphere | 3-hourly | Standard NWP practice, balances detail vs storage |
| Ocean | 6-hourly | Slower evolution, averaged over 6-hour windows |
| Ice | 6-hourly | Slow processes, matches ocean output |
| Wave (early) | Hourly | Rapid wave development, critical for forecasts |
| Wave (late) | 3-hourly | Mature wave field, less detail needed |
| Wave points | Hourly | Point data small, useful for validation |

Coupled System Complexity

Why S2SW produces 1.8× as many files as ATM-only:

  1. Multiple Components: 4 models vs 1 (ATM, OCN, ICE, WAV)
  2. Component Interactions: Coupled exchanges every timestep
  3. Different Output Rates: Each component optimized separately
  4. Wave Multi-Grid: 8 wave model grids, each with unique output
  5. Specialized Products: Wave point data, grid-specific files

Efficiency Note: Only 1.8× files despite 4× components because:

  • Ocean/Ice use 6-hourly output (fewer than atmos 3-hourly)
  • Wave uses variable frequency (hourly only when needed)
  • Shared atmosphere output (same 41 files regardless of coupling)

Data Flow

```
Initial Conditions (24 files)
    ├─> Atmosphere: 13 files (gfs_ctrl.nc + tiles)
    ├─> Ocean: 1 file (MOM.res.nc)
    ├─> Ice: 1 file (cice_model.res.nc)
    └─> Wave: 1 restart + 8 grid definitions
    ↓
UFS Coupled Model (120-hour run with CMEPS mediator)
    ↓
Atmosphere Component (every 3 hours)
    ├─> 41 × atmf*.nc (3D atmospheric state)
    ├─> 41 × sfcf*.nc (2D surface fields)
    ├─> 41 × master.grb2f* (GRIB2 for products)
    └─> 41 × sfluxgrbf*.grib2 (surface flux)
    ↓
Ocean Component (every 6 hours)
    └─> 20 × gfs.ocean.t{cyc}z.6hr_avg.f*.nc (f006-f120)
    ↓
Ice Component (every 6 hours + initial)
    └─> 21 × gfs.ice.t{cyc}z.6hr_avg.f*.nc (IC + f006-f120)
    ↓
Wave Component (variable frequency)
    ├─> 73 × uglo_100km binary (hourly→3-hourly)
    ├─> 121 × point output NetCDF (hourly all forecast)
    └─> Additional grid-specific outputs
    ↓
Total Output: ~380+ files across all components
```

Coupling Process:

  1. Atmosphere computes surface fluxes
  2. CMEPS mediator redistributes fluxes to ocean/ice/wave
  3. Ocean/ice compute SST, ice concentration
  4. Wave computes surface roughness
  5. Updated surface conditions fed back to atmosphere
  6. Repeat every coupling timestep

Comparison with ATM-Only Configuration

| Aspect | C48_S2SW (Coupled) | C48_ATM (Atmosphere-Only) |
| --- | --- | --- |
| Components | 4 (ATM+OCN+ICE+WAV) | 1 (ATM) |
| Input Files | 24 | 13 |
| Output Files | ~380+ | 209 |
| Complexity | High (coupling overhead) | Simple (single component) |
| Run Time | ~2-3× longer | Baseline |
| Storage | ~30-40 GB | ~15-20 GB |
| File Count Ratio | 1.8× | 1.0× (baseline) |

Why only 1.8× files despite 4× components?

  • Optimization through different component output frequencies
  • Ocean/Ice: 6-hourly (fewer files than atmospheric 3-hourly)
  • Wave: Variable rate (hourly only when critical)
  • Shared atmosphere: Same 41 atmos files regardless of coupling
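The arithmetic can be checked against the per-component counts in the output table above (numbers taken from this page; the remaining files up to ~380 come from the other wave grids, whose count varies by configuration):

```shell
# Sum the fixed per-component counts from the output-files table.
config=7; atmos=123; ocean=20; ice=21; wave_uglo=73; wave_pts=121
known=$((config + atmos + ocean + ice + wave_uglo + wave_pts))
echo "known files: $known"   # other wave grids bring this to ~380+
awk 'BEGIN { printf "ratio vs 209 ATM-only files: %.1fx\n", 380 / 209 }'
```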

Operational Significance

What This Test Validates

  • Multi-Component Coupling: All 4 components exchange fluxes correctly
  • Atmosphere-Ocean Coupling: Heat/momentum flux exchanges
  • Atmosphere-Ice Coupling: Surface albedo, temperature interactions
  • Atmosphere-Wave Coupling: Surface roughness, stress calculations
  • Wave Multi-Grid System: 8 regional/global grids run simultaneously
  • 5-Day Coupled Stability: No crashes over 120-hour integration
  • Variable Output Frequencies: Each component uses optimal cadence

Critical for Production Applications

Hurricane Forecasting:

  • Ocean feedback on intensity (SST cooling under storm)
  • Wave height impacts on coastal inundation

Sea Ice Predictions:

  • Ice-atmosphere coupling for polar forecasts
  • Ice albedo feedback on atmospheric temperatures

Wave Height Forecasts:

  • Multi-grid coverage (global + regional detail)
  • Atmosphere-wave coupling for surface stress

Coastal Applications:

  • Coupled storm surge and wave models
  • Ocean-atmosphere interactions for sea breeze

Directory Structure Pattern

```
gfs.{PDY}/{cyc}/model/
├── atmos/
│   ├── history/        # 41 × (atmf*.nc + sfcf*.nc) + logs
│   └── master/         # 41 × (master.grb2 + sflux.grib2)
├── ocean/
│   └── history/        # 20 × 6hr_avg.f*.nc (f006-f120)
├── ice/
│   └── history/        # 21 × 6hr_avg.f*.nc (IC + f006-f120)
└── wave/
    └── history/        # 73 × uglo_100km + 121 × points + other grids
```

Critical: All use model/ directory for raw forecast output, NOT products/ directory. Products are generated in separate downstream jobs (ocean/ice products tasks).
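A quick guard for this invariant (the directory layout and staging below are illustrative; a real check would point at the experiment ROTDIR):

```shell
# Raw forecast output must live under model/, never products/.
ROTDIR=$(mktemp -d)   # stand-in for the experiment ROTDIR
mkdir -p "$ROTDIR/gfs.20250101/00/model/atmos/history"
touch "$ROTDIR/gfs.20250101/00/model/atmos/history/atmf000.nc"  # demo file

stray=$(find "$ROTDIR" -path '*/products/*' -type f | wc -l)
echo "stray files under products/: $stray"
```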


Wave Output Strategy Deep Dive

Why Hourly f000-f048, Then 3-Hourly?

| Period | Files | Storage | Purpose |
| --- | --- | --- | --- |
| f000-f048 (hourly) | 49 | ~5-6 GB | Rapid development, validation |
| f051-f120 (3-hourly) | 24 | ~2-3 GB | Mature field, operational products |

Storage Savings: 3-hourly after f048 saves ~3-4 GB while maintaining operational utility

Operational Logic:

  • First 2 days: Hourly data critical for wave warnings, ship routing, coastal hazards
  • Days 3-5: 3-hourly sufficient for extended forecasts, planning activities

Multi-Grid Wave System

Why 8 Wave Grids?

| Grid | Coverage | Resolution | Purpose |
| --- | --- | --- | --- |
| glo_30m | Global | 30' (~55 km) | Baseline global waves |
| uglo_100km | Global unstructured | ~100 km | Efficient global coverage |
| aoc_9km | Arctic-Ocean | 9 km | High-lat ice-wave interaction |
| gnh_10m | N. Hemisphere | 10' (~18 km) | Northern ocean detail |
| gsh_15m | S. Hemisphere | 15' (~28 km) | Southern ocean detail |
| at_10m | Atlantic | 10' (~18 km) | Hurricane wave forecasts |
| ep_10m | E. Pacific | 10' (~18 km) | Pacific storm waves |
| wc_10m | West Coast | 10' (~18 km) | US coastal waves |

Grid Selection Strategy:

  • Global grids: Ensure wave energy propagates correctly worldwide
  • Regional grids: High resolution for critical forecast areas
  • Unstructured grid: Computational efficiency with variable resolution

Verification Commands

Run This Test

```shell
# Execute coupled forecast test
ctest -R "C48_S2SW.*fcst.*validate" --verbose

# Check multi-component output
ls -lh gfs.{PDY}/{cyc}/model/atmos/history/atmf*.nc
ls -lh gfs.{PDY}/{cyc}/model/ocean/history/*.nc
ls -lh gfs.{PDY}/{cyc}/model/ice/history/*.nc
ls -lh gfs.{PDY}/{cyc}/model/wave/history/*.bin
```

Verify Component File Counts

```shell
# Atmosphere (should match ATM-only test)
find model/atmos/history -name "atmf*.nc" | wc -l     # 41
find model/atmos/master -name "master.grb2f*" | wc -l # 41

# Ocean (6-hourly: f006, f012, ..., f120)
find model/ocean/history -name "*6hr_avg*.nc" | wc -l  # 20

# Ice (6-hourly + initial: IC, f006, f012, ..., f120)
find model/ice/history -name "*6hr_avg*.nc" | wc -l    # 21

# Wave uglo_100km (hourly f000-f048 + 3-hourly f051-f120)
find model/wave/history -name "uglo_100km*.bin" | wc -l # 73
```

Verify Wave Transition Pattern

```shell
# Hourly period (f000-f048): 49 files
ls model/wave/history/uglo_100km.f0[0-3][0-9]*.bin | wc -l  # 40 files (f000-f039)
ls model/wave/history/uglo_100km.f04[0-8]*.bin | wc -l      # 9 files (f040-f048)

# Gap at f049, f050 (transition period)
ls model/wave/history/uglo_100km.f049*.bin  # Should not exist
ls model/wave/history/uglo_100km.f050*.bin  # Should not exist

# 3-hourly period (f051-f120): 24 files
ls model/wave/history/uglo_100km.f0[5-9][0-9]*.bin  # f051, f054, ..., f099
ls model/wave/history/uglo_100km.f1[0-2][0-9]*.bin  # f102, f105, ..., f120
```
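Bracket globs are easy to get subtly wrong for a 3-hourly sequence (a pattern ending in `[1-9]`, for example, silently skips f060 and f090), so an explicit loop over the expected hours is more robust. A sketch, with stand-in files staged for the demo; point HIST at the real model/wave/history directory in practice:

```shell
# Check the full uglo_100km schedule: hourly f000-f048,
# 3-hourly f051-f120, and nothing at f049/f050.
HIST=$(mktemp -d)
for fh in $(seq 0 48) $(seq 51 3 120); do
  touch "$HIST/$(printf 'uglo_100km.f%03d.bin' "$fh")"   # demo staging
done

bad=0
for fh in $(seq 0 48) $(seq 51 3 120); do
  [ -f "$HIST/$(printf 'uglo_100km.f%03d.bin' "$fh")" ] || bad=$((bad + 1))
done
for fh in 49 50; do
  [ ! -f "$HIST/$(printf 'uglo_100km.f%03d.bin' "$fh")" ] || bad=$((bad + 1))
done
count=$(ls "$HIST" | wc -l)
echo "files=$count bad=$bad"   # expect files=73 bad=0
```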

Configuration Details

Key Parameters from config.base.j2

```shell
# Forecast Configuration
CASE="C48"                    # ~200 km resolution
CASE_ENS="C48"                # Ensemble resolution (same)
FHMAX=120                     # Maximum forecast hour
FHOUT=3                       # Atmosphere output frequency

# Wave Configuration
FHOUT_WAV=1                   # Wave output f000-f048 (hourly)
FHOUT_WAV_GFS=3               # Wave output f051-f120 (3-hourly)
FHMAX_HF_WAV=48               # High-frequency wave cutoff

# Ocean/Ice Configuration
FHOUT_OCN=6                   # Ocean output frequency
FHOUT_ICE=6                   # Ice output frequency

# Coupling
DO_ATM_OCEAN_COUPLING=".true."
DO_ATM_ICE_COUPLING=".true."
DO_ATM_WAVE_COUPLING=".true."
```

Resolution Characteristics

Atmosphere (C48):

  • Cubed-sphere: 6 tiles × 48×48 = 13,824 columns
  • Horizontal: ~200 km
  • Vertical: 127 levels (GFS operational)
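The column count and the nominal grid spacing follow directly from the tile size; rough arithmetic (the ~40,000 km equatorial circumference is an approximation, not a value from the repository):

```shell
# C48: 6 cubed-sphere tiles of 48x48 cells each.
tiles=6; nx=48
cols=$((tiles * nx * nx))
dx=$((40000 / (4 * nx)))   # Earth circumference (km) / cells around the equator
echo "columns=$cols  nominal spacing ~${dx} km"
```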

Ocean (MOM6):

  • Tripolar grid: ~1° nominal resolution
  • Vertical: 75 levels
  • Global coverage including Arctic

Ice (CICE6):

  • Same horizontal grid as ocean
  • 5 ice thickness categories
  • 4 ice layers for thermodynamics

Wave (WaveWatch III):

  • 8 grids with varying resolution
  • Global: 30' to 100 km
  • Regional: 9 km to 10' (~18 km)

File Size Estimates

| Component | Files | Size/File | Total |
| --- | --- | --- | --- |
| Atmosphere history | 82 | ~200 MB | ~16 GB |
| Atmosphere GRIB2 | 82 | ~100 MB | ~8 GB |
| Ocean history | 20 | ~100 MB | ~2 GB |
| Ice history | 21 | ~50 MB | ~1 GB |
| Wave uglo_100km | 73 | ~20 MB | ~1.5 GB |
| Wave points | 121 | ~5 MB | ~600 MB |
| Other wave grids | Variable | Variable | ~5 GB |
| Total | ~380+ | - | ~34 GB |
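The per-component totals above sum to the quoted ~34 GB (awk handles the fractional arithmetic):

```shell
# Sum the Total column of the size table (GB).
total=$(awk 'BEGIN { print 16 + 8 + 2 + 1 + 1.5 + 0.6 + 5 }')
echo "estimated total: ${total} GB"
```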

Lessons Learned (October 1, 2025)

Test Case Validation Success ✅

Status: This test required NO CHANGES because:

  1. ✅ Correctly uses model/ paths (not products/)
  2. ✅ Generates multi-component forecast output
  3. ✅ All ~380 expected files present
  4. ✅ Provides correct inputs for downstream ocean/ice products tests

MCP Tool Utilization 🔧

Global Workflow MCP insights applied:

  • Configuration file locations across platforms
  • Component coupling patterns
  • Multi-grid wave system documentation

Value demonstrated:

  • Quick reference to complex coupled system structure
  • Understanding of variable output frequencies
  • Wave grid configuration rationale

Related Test Cases

  1. C48_ATM-gfs_fcst_seg0.yaml - Atmosphere-only version (simpler)
  2. C48_S2SW-gfs_ocean_prod_f006.yaml - Consumes ocean output
  3. C48_S2SW-gfs_ice_prod_f006.yaml - Consumes ice output
  4. C48_S2SW-gefs_fcst_mem001.yaml - Ensemble version (coupled)

References

Source Files

  • Test Definition: dev/ctests/cases/C48_S2SW-gfs_fcst_seg0.yaml
  • Job Script: jobs/JGLOBAL_FORECAST
  • Execution Script: scripts/exglobal_forecast.py
  • Coupled Logic: ush/forecast_det.sh
  • Wave Setup: ush/waveprep.sh

Configuration Files

  • Base Config: parm/config/gfs/config.base.j2
  • Wave Config: parm/config/gfs/config.wave
  • Ocean Config: parm/config/gfs/config.ocean
  • Ice Config: parm/config/gfs/config.ice
  • UFS Templates: parm/ufs/coupled/

Documentation

  • Repository: TerrenceMcGuinness-NOAA/global-workflow
  • Branch: ctest_case_updates
  • Changelog: CTEST_UPDATES_CHANGELOG.md

Created: January 16, 2025
Updated: October 1, 2025
Status: Production-ready coupled forecast test, verified correct