Step-by-Step Instructions

This section provides detailed guidance through the calibration workshop using the provided datasets and Docker images.

💡 New to the workshop? Check out our Quickstart Guide for a faster path to get up and running!

Prerequisites

This workshop requires Docker and uv.

📋 For installation and setup details: Pre-Workshop Checklist

Setting Up Your Environment

Getting the Workshop Data

Option A: Jetstream Virtual Machines (Workshop Participants)

# Connect to Jetstream (workshop participants)
ssh -L 5906:localhost:5906 exouser@<ip address>

# Then, in your VNC client, go to localhost:5906

# Navigate to the pre-installed data directory
cd /home/exouser/workshop/calibration/

💡 Quick caveat: if Firefox is not installed, run the following command:

sudo apt-get update && sudo apt-get install firefox --yes

Option B: Download Data (Local/Other Systems)

# Download and extract the workshop data
wget https://communityhydrofabric.s3.us-east-1.amazonaws.com/example_data/provo-10154200.tar.gz
tar -xzf provo-10154200.tar.gz

Verify Data Structure

ls -la provo-10154200
# You should see: config/, forcings/, metadata/, outputs/

Understanding the Dataset

The workshop uses data for USGS gauge 10154200 (Provo River). This dataset has been pre-processed and includes:

  • Hydrofabric data: Catchment and network information
  • Forcing data: Meteorological inputs (AORC data)
  • Model configurations: Pre-configured NextGen model setup
  • Observation data: Streamflow observations for calibration
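To see where each of these components lives on disk, you can list the top-level directories (directory names taken from the data layout above; the exact files inside may differ):

ls provo-10154200/config provo-10154200/forcings provo-10154200/metadata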

Creating a Calibration Configuration

  1. Generate the calibration setup:

    uvx ngiab-cal provo-10154200 -g 10154200 -i 4

    This command will:

    • Create a calibration directory
    • Copy and prepare necessary configuration files
    • Set up observation data files
    • Generate calibration-specific configurations
  2. Examine the created structure:

    ls -la provo-10154200/calibration/

    You should see:

    calibration/
    ├── crosswalk.json
    ├── ngen_cal_conf.yaml        # Main calibration configuration
    ├── obs_hourly_discharge.csv  # Observation data
    ├── realization.json          # Model configuration
    └── troute.yaml               # Routing configuration
    
  3. Review the calibration configuration:

    less provo-10154200/calibration/ngen_cal_conf.yaml

    Key settings in this file include:

    • Parameter ranges: Min/max values for calibration parameters
    • Objective function: Metrics used to evaluate model performance
    • Time periods: Calibration and validation time windows
    • Algorithm settings: DDS (Dynamically Dimensioned Search) parameters
  4. Check observation data:

    head provo-10154200/calibration/obs_hourly_discharge.csv
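A couple of extra sanity checks on the observation file can save debugging later. This sketch assumes a one-line header followed by timestamped hourly discharge rows:

# Count rows to confirm the record covers the expected period
wc -l provo-10154200/calibration/obs_hourly_discharge.csv

# Inspect the most recent observations
tail -3 provo-10154200/calibration/obs_hourly_discharge.csv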

Running the Calibration Process

Quick Test Run (Recommended for Learning)

For a quick test to understand the workflow:

uvx ngiab-cal provo-10154200 -g 10154200 --run -i 4

This runs a 4-iteration calibration (takes ~12 minutes with 16 cores).

Full Calibration Run

For a more thorough calibration:

uvx ngiab-cal provo-10154200 -g 10154200 --run -i 100

This runs 100 iterations. Total runtime varies with system resources, but per-iteration time is consistent enough that a short test run lets you estimate the total time required.
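A minimal way to get that estimate is to time the quick test and extrapolate (the -f flag may be needed to overwrite an existing calibration configuration):

# Time a 4-iteration run, then scale: total ≈ (elapsed / 4) × 100
time uvx ngiab-cal provo-10154200 -g 10154200 -f --run -i 4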

Advanced Configuration Options

You can customize the calibration with additional parameters:

# Specify warmup period (days) and calibration/validation split
uvx ngiab-cal provo-10154200 -g 10154200 --run -i 50 -w 365 --calibration_ratio 0.7

# Force recreation of calibration config
uvx ngiab-cal provo-10154200 -g 10154200 -f --run -i 50

Parameter explanations:

  • -i, --iterations: Number of calibration iterations (default: 100)
  • -w, --warmup: Warmup period in days (default: 365)
  • --calibration_ratio: Split between calibration and validation (0.7 = 70% calibration, 30% validation)
  • -f, --force: Overwrite existing calibration configuration
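These options can be combined in a single invocation. For example, a hypothetical run with a shorter warmup and an 80/20 split:

# 50 iterations, 180-day warmup, 80% calibration / 20% validation,
# overwriting any existing calibration configuration
uvx ngiab-cal provo-10154200 -g 10154200 -f --run -i 50 -w 180 --calibration_ratio 0.8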

Monitoring Calibration Progress

While the calibration runs, a live output panel shows its progress:

╭──────────────────────────────────── ngen simulation live output ─────────────────────────────────────╮
│ NGen top-level timings:                                                                              │
│         NGen::init: 2.54646                                                                          │
│         NGen::simulation: 46.7027                                                                    │
│         NGen::routing: 28.1302                                                                       │
╰───── tail -n 4 /ngen/ngen/data/calibration/Output/Validation_Run/ngen_wgnt7rg5_worker/ngen.log ──────╯

Background Monitoring

If you run the task in the background, monitor progress:

# Watch the main calibration log
tail -f provo-10154200/calibration/Output/Calibration_Run/ngen_*/ngen.log

# Check calibration status
ls provo-10154200/calibration/Output/Calibration_Run/
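If you have not started the run yet, a standard shell pattern (not specific to ngiab-cal) launches it detached so the commands above have something to follow:

# Launch in the background; stdout and stderr both go to calibration.log
nohup uvx ngiab-cal provo-10154200 -g 10154200 --run -i 100 > calibration.log 2>&1 &

# Follow the combined output
tail -f calibration.log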

Understanding Log Output

The calibration log shows:

  • Parameter values for each iteration
  • Objective function improvements
  • Model performance metrics (NSE, RMSE, etc.)
  • Convergence information
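To pull just the performance lines out of a long log, you can grep for the metric names (this assumes the metrics appear by name in your log; adjust the pattern if they do not):

grep -iE 'nse|rmse' provo-10154200/calibration/Output/Calibration_Run/ngen_*/ngen.log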

Examining Calibration Results

Generated Plots and Outputs

After calibration completes, examine the results:

# List all generated plots
ls -la provo-10154200/calibration/Output/Calibration_Run/ngen_*/Plot_Iteration/

Key output files:

  • *_hydrograph_iteration.png: Comparison of simulated vs observed streamflow
  • *_param_iteration.png: Parameter evolution through iterations
  • *_objfun_iteration.png: Objective function improvement
  • *_fdc_iteration.png: Flow duration curve comparison
  • *_metrics_iteration.csv: Performance metrics for each iteration
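If you are working over SSH (for example on a Jetstream VM), one option is to copy the plots to your local machine for viewing. Run this from your local terminal, and adjust the remote path to wherever your provo-10154200 directory lives:

scp -r exouser@<ip address>:/home/exouser/workshop/calibration/provo-10154200/calibration/Output/Calibration_Run/ngen_*/Plot_Iteration .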

Parameter Results

View the final calibrated parameters:

# Show the best parameters from calibration
cat provo-10154200/calibration/Output/Calibration_Run/ngen_*/*_params_iteration.csv | tail -n 1

# View the calibrated parameters file
cat provo-10154200/config/calibrated_params.json
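The raw JSON can be dense; pretty-printing it with Python's built-in JSON tool makes the parameter names and values easier to scan:

python3 -m json.tool provo-10154200/config/calibrated_params.json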

Performance Metrics

Check calibration performance:

# View metrics evolution
head -5 provo-10154200/calibration/Output/Calibration_Run/ngen_*/*_metrics_iteration.csv
tail -5 provo-10154200/calibration/Output/Calibration_Run/ngen_*/*_metrics_iteration.csv
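For easier reading, align the comma-separated columns before paging through them (column ships with util-linux on most Linux systems):

cat provo-10154200/calibration/Output/Calibration_Run/ngen_*/*_metrics_iteration.csv | column -s, -t | less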

Applying Calibrated Parameters

The calibration process automatically creates updated configuration files with the best parameters:

  1. Calibrated parameters: provo-10154200/config/calibrated_params.json
  2. Updated model configs: Individual catchment configuration files in provo-10154200/config/cat_config/

These can be used for future model runs with the calibrated parameters.
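To confirm the per-catchment files were regenerated, list them (the file names depend on your catchment IDs):

ls -laR provo-10154200/config/cat_config/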

Running Validation

After calibration, you can run validation using the calibrated parameters:

  1. Check validation setup:

    ls -la provo-10154200/calibration/Output/Validation_Run/
  2. Run validation (if validation scripts are available):

    cd provo-10154200/calibration/Output/Validation_Run/
    # Follow validation-specific instructions

Interpreting Results

Good Calibration Indicators

  • Objective function: Should improve steadily across iterations (increasing or decreasing, depending on whether the chosen objective is maximized or minimized)
  • Parameter convergence: Parameters should stabilize toward the end
  • Visual fit: Hydrograph should match observed data reasonably well
  • Performance metrics: NSE > 0.5, lower RMSE values
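For reference, the value behind the NSE > 0.5 guideline is the Nash–Sutcliffe efficiency:

$$\mathrm{NSE} = 1 - \frac{\sum_t \left(Q_{\mathrm{obs},t} - Q_{\mathrm{sim},t}\right)^2}{\sum_t \left(Q_{\mathrm{obs},t} - \bar{Q}_{\mathrm{obs}}\right)^2}$$

NSE = 1 is a perfect fit, and NSE ≤ 0 means the model predicts no better than simply using the mean of the observations.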

Common Issues and Solutions

🔧 Having problems? See our Troubleshooting Guide for detailed solutions.

Quick tips:

  • Poor convergence: Increase iterations or check parameter bounds
  • High computation time: Start with fewer iterations for testing
  • Calibration fails: Check that Docker is running, the data directory structure is intact, and the observation data has no missing values

Next Steps

  1. Experiment with parameters: Modify provo-10154200/calibration/ngen_cal_conf.yaml and re-run
  2. Try different time periods: Adjust calibration and validation periods
  3. Explore regionalization: Apply calibrated parameters to nearby ungauged basins
  4. Development: See Development Setup for code modifications

For a quicker start, see our Quickstart Guide.

For understanding the underlying workflow, see Workflow Process.

Back to Home
