GitHub MCP Tools installed for global-workflow software development and how they work
🚀 MCP Server Components Created:
- Python MCP Server (mcp-server.py)
  - Lightweight, pure Python implementation
  - Provides workflow structure information
  - Lists job scripts and configurations
- Node.js MCP Server (mcp-server.js)
  - Enhanced features using MCP SDK
  - More detailed component explanations
  - Better error handling and tool descriptions
- VS Code Configuration (settings.json)
  - Automatic server registration
  - Ready for immediate use
- Package Configuration (package.json)
  - Node.js dependencies for MCP SDK
  - Scripts for easy server management
🔧 How to Start the MCP Server:
Option 1: Python Server (Immediate)
cd /home/tmcguinness/GITHUB/COPILOT/global-workflow_forked
python3 mcp-server.py
Option 2: Node.js Server (Enhanced)
cd /home/tmcguinness/GITHUB/COPILOT/global-workflow_forked
npm install
npm start
Option 3: VS Code Auto-Start
The server should automatically start when you open VS Code in this workspace, thanks to the settings.json configuration.
🎯 Capabilities:
The MCP servers provide GitHub Copilot with contextual knowledge about:
- Global workflow job scripts and their purposes
- System configurations for HPC platforms (Hera, Orion, WCOSS2, etc.)
- Component relationships (GSI, UFS, MOM6, CICE, etc.)
- Best practices for workflow development
- Directory structure and file organization
This will significantly enhance GitHub Copilot's ability to provide relevant suggestions and assistance when working with the global-workflow codebase!
TerrenceMcGuinness-NOAA: Explain the differences between the Python server and the Node.js server. Do these servers run locally, and how do they interact with GitHub?
GitHub Copilot: Great question! Let me explain the key differences between the Python and Node.js MCP servers, and clarify how they work with GitHub Copilot.
Key Differences Between Python and Node.js MCP Servers
Python MCP Server (mcp-server.py)
Advantages:
- Minimal Dependencies: Pure Python 3.8+ - no external packages required
- Lightweight: Smaller memory footprint and faster startup
- Simple Implementation: Basic JSON-RPC protocol handling
- Self-Contained: No package manager dependencies
Limitations:
- Manual Protocol Implementation: Hand-coded JSON-RPC message handling
- Basic Error Handling: Limited error recovery and validation
- Fewer Features: Simpler tool implementations
Code Example:
# Manual JSON-RPC handling
async def handle_request(self, request: Dict[str, Any]) -> Dict[str, Any]:
    method = request.get("method")
    params = request.get("params", {})
    if method == "tools/call":
        # Manual parameter extraction and validation
        tool_name = params.get("name")
        arguments = params.get("arguments", {})
Node.js MCP Server (mcp-server.js)
Advantages:
- Official MCP SDK: Uses @modelcontextprotocol/sdk - the official implementation
- Robust Error Handling: Built-in validation, error recovery, and protocol compliance
- Rich Features: More sophisticated tool definitions and schema validation
- Professional Implementation: Production-ready with proper typing and structure
- Better Debugging: Enhanced logging and error reporting
Requirements:
- Node.js 18+: Requires modern Node.js runtime
- NPM Dependencies: Needs npm install to get the MCP SDK
Code Example:
// Official SDK with built-in validation
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { name, arguments: args } = request.params;
  // Automatic schema validation and error handling
});
How These Servers Work Locally
Local Execution Model
Both servers run locally on your machine and communicate with VS Code/GitHub Copilot through the following (see the test sketch after this list):
- Standard I/O (stdin/stdout): The MCP protocol uses JSON-RPC over stdio
- Local Process: VS Code spawns the server as a child process
- No Network: No internet connection required for basic operation
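To make the stdio model concrete, here is a minimal test sketch (not part of the repository) that spawns the Python server and exchanges one JSON-RPC message, mimicking what VS Code does. It assumes mcp-server.py reads one JSON request per line on stdin and writes one response per line on stdout; adjust the framing if the server differs.

```python
import json
import subprocess

# Spawn the server as a child process, exactly as VS Code would
proc = subprocess.Popen(
    ["python3", "mcp-server.py"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# Send a JSON-RPC request over stdin (line-delimited framing assumed)
request = {"jsonrpc": "2.0", "method": "tools/list", "id": 1}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()

# Read the JSON-RPC response from stdout
response = json.loads(proc.stdout.readline())
print(response)

proc.terminate()
```

No network sockets are involved: the transport is simply the child process's stdin and stdout.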
VS Code Integration Flow
# VS Code starts the server locally
VS Code → Spawns Process → python3 mcp-server.py
↓
JSON-RPC over stdio
↓
GitHub Copilot gets context
Configuration in VS Code
The settings.json tells VS Code how to start the server:
{
  "mcpServers": {
    "global-workflow": {
      "command": "python3",        // Command to run
      "args": ["mcp-server.py"],   // Arguments
      "cwd": "/path/to/project"    // Working directory
    }
  }
}
Interaction with GitHub
Important Clarification:
These servers do NOT directly communicate with GitHub. Here's the actual flow:
Your Local Machine:
┌─────────────────────────────────────────────────────────────────┐
│ VS Code │
│ ↓ │
│ GitHub Copilot Extension │
│ ↓ │
│ MCP Server (Python/Node.js) │
│ ├─ Document Indexer ← Reads local files │
│ ├─ Vector Embeddings ← Creates searchable knowledge base │
│ ├─ RAG Engine ← Retrieves relevant context │
│ └─ Semantic Search ← Finds similar code patterns │
│ ↓ │
│ Enhanced Context + Retrieval Results │
│ ↓ │
│ Provides enriched context to Copilot │
└─────────────────────────────────────────────────────────────────┘
↓
GitHub Copilot API
(with RAG-enhanced context)
How Context Flows to GitHub Copilot
- Local File Analysis: MCP server reads your local repository files
- Context Injection: Server provides structured information to GitHub Copilot
- Enhanced Suggestions: Copilot uses this context for better code suggestions
- AI Processing: Copilot sends enhanced context to GitHub's AI models
What Gets Shared
- Local Repository Structure: Directory listings, file types
- Configuration Information: System configs, job script names
- Workflow Context: Component relationships, best practices
- NO Source Code: The MCP server provides metadata, not actual code content (see the illustrative payload below)
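For illustration, a hypothetical list_job_scripts result might look like the payload below: names and purposes only, no file contents. The field names are invented for this example and are not the server's actual schema.

```python
# Hypothetical tool result (illustrative field names, not the real schema):
# structure and metadata only - no source code leaves the metadata layer.
example_result = {
    "tool": "list_job_scripts",
    "count": 3,
    "scripts": [
        {"name": "JGLOBAL_FORECAST", "purpose": "Global forecast"},
        {"name": "JGDAS_ENKF_UPDATE", "purpose": "EnKF update step"},
        {"name": "JGLOBAL_ATMOS_UPP", "purpose": "Unified Post Processor"},
    ],
}
```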
Recommendation
For your global-workflow project, I'd recommend:
Start with Python Server if:
- You want immediate setup with no dependencies
- You're testing MCP functionality
- You prefer minimal complexity
Upgrade to Node.js Server if:
- You want production-ready implementation
- You need robust error handling
- You plan to extend functionality
- You're comfortable with Node.js ecosystem
Testing the Local Setup
You can verify the servers work locally:
# Test Python server
echo '{"method": "tools/list", "id": 1}' | python3 mcp-server.py
# Test Node.js server (after npm install)
echo '{"method": "tools/list", "id": 1}' | node mcp-server.js
Both should return a JSON response listing available tools, confirming they're working locally and ready to provide context to GitHub Copilot!
Summary: Python MCP Server is Ready! 🚀
Your Python MCP server is now fully tested and ready to use with GitHub Copilot. Here's what we've accomplished:
✅ Server Status
- Working: All tools tested and functional
- Logging: Enhanced with debugging information
- Startup Script: Easy-to-use start-mcp-server.sh
- VS Code Integration: Configured in settings.json
🔧 Available Tools
- get_workflow_info - Provides workflow overview and component information
- list_job_scripts - Lists all 70+ job scripts in the repository
- get_config_info - System configuration details for HPC platforms
🎯 How to Use
Manual Testing
cd /home/tmcguinness/GITHUB/COPILOT/global-workflow_forked
# Test the server
./start-mcp-server.sh test
# Start the server manually
./start-mcp-server.sh start
VS Code Integration
- Restart VS Code in this workspace
- The MCP server should automatically start when VS Code opens
- GitHub Copilot will now have enhanced context about your workflow
Testing Copilot Integration
- Open test-copilot-integration.py (a sample stub is sketched after this list)
- Start typing in the functions
- Copilot should suggest workflow-specific content like:
- Job script names (JGDAS_, JGFS_, etc.)
- System names (hera, orion, wcoss2, etc.)
- Component explanations (GSI, UFS, MOM6, etc.)
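For reference, here is the kind of stub such a test file might contain; this is illustrative, since the actual test-copilot-integration.py is not shown here.

```python
# Illustrative stubs - the real test-copilot-integration.py may differ.
# With the MCP server active, Copilot should complete lists like these
# with workflow-specific names rather than generic guesses.

def list_atmos_analysis_jobs() -> list:
    """Job scripts that handle atmospheric analysis."""
    return ["JGLOBAL_ATMOS_ANALYSIS", "JGDAS_ATMOS_ANALYSIS_DIAG"]

def supported_hpc_systems() -> list:
    """HPC platforms the workflow supports."""
    return ["hera", "orion", "hercules", "wcoss2"]
```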
📋 Server Logs
The server logs to stderr, so you can see what's happening:
2025-07-28 11:43:13,244 - GlobalWorkflowMCP - INFO - Initializing MCP server
2025-07-28 11:43:13,245 - GlobalWorkflowMCP - INFO - Handling request: initialize
2025-07-28 11:43:13,245 - GlobalWorkflowMCP - INFO - Server initialized successfully
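Based on that output format, the server's logging setup is presumably something like the sketch below (the exact configuration in mcp-server.py may differ):

```python
import logging
import sys

# Log to stderr so stdout stays reserved for JSON-RPC traffic
logging.basicConfig(
    stream=sys.stderr,
    level=logging.INFO,
    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
)
logger = logging.getLogger("GlobalWorkflowMCP")
logger.info("Initializing MCP server")
```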
🎉 Next Steps
- Restart VS Code to activate the MCP integration
- Test with Copilot by working in the test file
- Try asking Copilot questions about workflow jobs, systems, or components
- Monitor server logs in VS Code's output panel for debugging
Your Python MCP server is now providing GitHub Copilot with rich context about the global workflow system!
📋 Job Script Categories
🏠 Production Jobs Directory (jobs)
Atmosphere Analysis & Forecasting:
- JGDAS_ATMOS_ANALYSIS_DIAG - Atmospheric analysis diagnostics
- JGDAS_ATMOS_CHGRES_FORENKF - Change resolution for EnKF
- JGDAS_ATMOS_GEMPAK - GEMPAK atmospheric processing
- JGDAS_ATMOS_VERFOZN - Ozone verification
- JGDAS_ATMOS_VERFRAD - Radiance verification
- JGLOBAL_ATMOS_ANALYSIS - Global atmospheric analysis
- JGLOBAL_ATMOS_ANALYSIS_CALC - Analysis calculations
- JGLOBAL_ATMOS_ENSSTAT - Ensemble statistics
- JGLOBAL_ATMOS_POST_MANAGER - Post-processing manager
- JGLOBAL_ATMOS_PRODUCTS - Atmospheric products
- JGLOBAL_ATMOS_SFCANL - Surface analysis
- JGLOBAL_ATMOS_UPP - Unified Post Processor
- JGLOBAL_FORECAST - Global forecast
Ensemble Data Assimilation:
- JGDAS_ENKF_DIAG - EnKF diagnostics
- JGDAS_ENKF_ECEN - EnKF ensemble center
- JGDAS_ENKF_POST - EnKF post-processing
- JGDAS_ENKF_SELECT_OBS - EnKF observation selection
- JGDAS_ENKF_SFC - EnKF surface analysis
- JGDAS_ENKF_UPDATE - EnKF update step
- JGLOBAL_ATMENS_ANALYSIS_* - Atmospheric ensemble analysis suite
Aerosol Processing:
- JGDAS_AERO_ANALYSIS_GENERATE_BMATRIX - Aerosol B-matrix generation
- JGLOBAL_AERO_ANALYSIS_* - Aerosol analysis suite
- JGLOBAL_PREP_OBS_AERO - Aerosol observation preparation
- JGLOBAL_PREP_EMISSIONS - Emissions preparation
Marine/Ocean Analysis:
- JGDAS_GLOBAL_OCEAN_ANALYSIS_ECEN - Ocean ensemble center
- JGLOBAL_MARINE_ANALYSIS_* - Marine analysis suite
- JGLOBAL_MARINE_BMAT* - Marine B-matrix operations
- JGLOBAL_OCEANICE_PRODUCTS - Ocean/ice products
- JGLOBAL_PREP_OCEAN_OBS - Ocean observation preparation
Wave Modeling:
- JGLOBAL_WAVE_* - Complete wave modeling suite
- JGLOBAL_WAVE_GEMPAK - Wave GEMPAK processing
- JGLOBAL_WAVE_INIT - Wave initialization
- JGLOBAL_WAVE_POST_* - Wave post-processing
- JGLOBAL_WAVE_PRDGEN_* - Wave product generation
Products & Verification:
- JGFS_ATMOS_AWIPS_20KM_1P0DEG - AWIPS 20km products
- JGFS_ATMOS_CYCLONE_* - Cyclone tracking/genesis
- JGFS_ATMOS_FBWIND - First guess at observation time winds
- JGFS_ATMOS_GEMPAK* - GEMPAK processing suite
- JGFS_ATMOS_VERIFICATION - Atmospheric verification
Utilities & Archiving:
- JGLOBAL_ARCHIVE_* - Archiving operations
- JGLOBAL_CLEANUP - Cleanup operations
- JGLOBAL_EXTRACTVARS - Variable extraction
- JGLOBAL_FETCH - Data fetching
- JGLOBAL_GLOBUS_ARCH - Globus archiving
- JGLOBAL_STAGE_IC - Initial condition staging
🔧 Development Jobs Directory (jobs/rocoto)
Analysis & Data Assimilation:
- anal.sh - Analysis
- analcalc.sh / analcalc_fv3jedi.sh - Analysis calculations
- analdiag.sh - Analysis diagnostics
- atmanlvar.sh - Atmospheric variational analysis
- atmanlfinal.sh - Atmospheric analysis finalization
- prep.sh - Data preparation
- prep_sfc.sh - Surface preparation
- prepatmiodaobs.sh - Atmospheric IODA observations
Ensemble Operations:
- ecen.sh / ecen_fv3jedi.sh - Ensemble center
- echgres.sh - Ensemble change resolution
- ediag.sh - Ensemble diagnostics
- eobs.sh - Ensemble observations
- epos.sh - Ensemble post-processing
- esfc.sh - Ensemble surface
- eupd.sh - Ensemble update
- atmos_ensstat.sh - Atmospheric ensemble statistics
Forecast & Post-processing:
- fcst.sh - Forecast
- upp.sh - Unified Post Processor
- atmos_products.sh - Atmospheric products
- metp.sh - Model evaluation tools
- postsnd.sh - Post-processing soundings
Marine & Ocean:
- marineanl*.sh - Marine analysis suite
- marinebmat*.sh - Marine B-matrix operations
- oceanice_products.sh - Ocean/ice products
- prepoceanobs.sh - Ocean observation preparation
- ocnanalecen.sh - Ocean analysis ensemble center
Wave Processing:
- waveinit.sh - Wave initialization
- waveprep.sh - Wave preparation
- wavepost*.sh - Wave post-processing suite
- waveawips*.sh - Wave AWIPS products
- wavegempak.sh - Wave GEMPAK processing
Aerosol & Chemistry:
- aerosol_init.sh - Aerosol initialization
- aeroanlvar.sh - Aerosol variational analysis
- prepobsaero.sh - Aerosol observation preparation
- prep_emissions.sh - Emissions preparation
Products & Graphics:
- awips*.sh - AWIPS product generation
- gempak*.sh - GEMPAK processing suite
- npoess.sh - NPOESS products
- fbwind.sh - First guess winds
Verification & Quality Control:
- verfozn.sh - Ozone verification
- verfrad.sh - Radiance verification
- vminmon.sh - Variational minimization monitoring
- fit2obs.sh - Fit to observations
- tracker.sh - Hurricane tracking
- genesis*.sh - Cyclone genesis
Snow Analysis:
- snowanl.sh - Snow analysis
- esnowanl.sh - Ensemble snow analysis
Utilities:
- cleanup.sh - Cleanup operations
- extractvars.sh - Variable extraction
- fetch.sh - Data fetching
- stage_ic.sh - Initial condition staging
- arch_*.sh - Archiving operations
- globus_*.sh - Globus operations
📝 Execution Scripts Directory (scripts)
Python Scripts:
- Analysis: exglobal_*_analysis_*.py
- Forecasting: exglobal_forecast.py
- Marine: exglobal_marine_*.py
- Aerosol: exgdas_aero_*.py, exgfs_aero_*.py
- Archiving: exglobal_archive_*.py, exglobal_globus_*.py
- EnKF: exgdas_enkf_*.py
Shell Scripts:
- Atmospheric: exglobal_atmos_*.sh
- Wave: exgfs_wave_*.sh
- GEMPAK: ex*_gempak_*.sh
- Post-processing: exgfs_pmgr.sh, exglobal_atmos_pmgr.sh
📊 Summary Statistics:
- Total Production Jobs: ~80 jobs
- Total Development Jobs: ~80 shell scripts
- Total Execution Scripts: ~75 scripts
- Primary Languages: Shell scripts (.sh) and Python (.py)
- Main Categories: Analysis, Forecast, Ensemble, Marine, Wave, Aerosol, Products, Verification
This comprehensive list shows the global-workflow's extensive job script ecosystem covering all aspects of numerical weather prediction from data ingestion through final product generation.
🔧 Available MCP Tools
1. Workflow Information Tools
get_workflow_info (Python) / get_workflow_structure (Node.js)
- Purpose: Get overview and structure of the global workflow system
- Parameters: component (optional) - "jobs", "scripts", "configs", "overview"
- Returns: Detailed information about workflow components, systems, and organization
2. Job Script Discovery
list_job_scripts (Both Python & Node.js)
- Purpose: List all available job scripts in the workflow
- Parameters: None required
- Returns: Complete list of JGDAS, JGFS, JGLOBAL job scripts with descriptions
3. System Configuration
get_config_info (Python) / get_system_configs (Node.js)
- Purpose: Get configuration information for different HPC systems
- Parameters: system - "hera", "orion", "hercules", "wcoss2", "gaeac5", "gaeac6"
- Returns: System-specific configuration details and file locations
4. Component Explanation (Node.js only)
explain_workflow_component
- Purpose: Explain specific workflow components in detail
- Parameters: component (required) - "rocoto", "gsi", "ufs", "mom6", "cice", etc.
- Returns: Detailed explanations of workflow components and their roles
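As a concrete example, here is a tools/call request for the system-configuration tool together with a plausible response, shown as Python dicts. The response follows the MCP convention of a content list; the text itself is illustrative, not the server's actual output.

```python
# Request VS Code would send on the server's stdin (one JSON object per message)
request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "get_config_info",        # Python server tool name
        "arguments": {"system": "hera"},  # one of the supported systems
    },
}

# A plausible response; the actual wording comes from the server
response = {
    "jsonrpc": "2.0",
    "id": 3,
    "result": {
        "content": [
            {"type": "text", "text": "Hera: system configuration details ..."}
        ]
    },
}
```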
💡 How These Tools Work
MCP Protocol Architecture
GitHub Copilot ←→ VS Code ←→ MCP Server ←→ Global Workflow Repository
- Integration Point: Tools are integrated through VS Code's MCP extension
- Communication: Uses JSON-RPC over stdin/stdout for real-time communication
- Context Enhancement: Provides GitHub Copilot with repository-specific knowledge
- Real-time Access: Tools query live repository structure and files
Tool Execution Flow
- Trigger: GitHub Copilot requests context about global workflow
- Route: VS Code forwards request to MCP server
- Process: MCP server executes appropriate tool function
- Return: Contextual information flows back to Copilot
- Enhancement: Copilot provides more accurate, context-aware suggestions (a sketch of the dispatch loop follows)
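Internally, the Python server's request loop presumably looks something like this sketch: read one JSON-RPC message per line from stdin, dispatch on method, and write the response to stdout. Names and the abbreviated tool bodies are illustrative, not the exact code in mcp-server.py.

```python
import json
import sys

def handle(request: dict) -> dict:
    """Route a JSON-RPC request to the matching handler (abbreviated)."""
    method = request.get("method")
    if method == "tools/list":
        result = {"tools": [{"name": "list_job_scripts"}]}  # abbreviated
    elif method == "tools/call":
        params = request.get("params", {})
        tool_name = params.get("name")
        # ... look up and invoke the matching tool function here ...
        result = {"content": [{"type": "text", "text": f"ran {tool_name}"}]}
    else:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}

# Main loop: one JSON-RPC message per line in, one response per line out
for line in sys.stdin:
    if line.strip():
        print(json.dumps(handle(json.loads(line))), flush=True)
```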
🎯 Can You Use Them Directly?
Yes, you have several direct usage options:
1. Manual Command Line Testing
# Test Python MCP Server
echo '{"method": "tools/list", "id": 1}' | python3 mcp-server.py
# Test specific tool
echo '{"method": "tools/call", "params": {"name": "list_job_scripts", "arguments": {}}, "id": 2}' | python3 mcp-server.py
# Test Node.js MCP Server
echo '{"method": "tools/list", "id": 1}' | node mcp-server.js
2. Through VS Code Integration
- Automatic: Tools work automatically when using GitHub Copilot in VS Code
- Configuration: Already configured in settings.json
- Transparent: Runs in background, enhancing Copilot responses
3. Custom Script Integration
# Start MCP server and interact programmatically
./start-mcp-server.sh start
4. Direct Function Calls (If Extending)
- You can import the Python MCP server classes and call tool functions directly (see the sketch below)
- Extend the Node.js server to add custom tools
- Integrate with other development tools that support MCP
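For instance, a direct call might look like the sketch below. The class name GlobalWorkflowMCP is assumed from the logger name in the output above, and handle_request matches the earlier code snippet; adjust both to whatever mcp-server.py actually defines.

```python
import asyncio
import importlib.util

# Load mcp-server.py despite the hyphen in its filename; this assumes the
# server guards its main loop behind `if __name__ == "__main__":`.
spec = importlib.util.spec_from_file_location("mcp_server", "mcp-server.py")
mcp_server = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mcp_server)

# Class name assumed from the log output; handle_request shown earlier
server = mcp_server.GlobalWorkflowMCP()
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "list_job_scripts", "arguments": {}},
}
print(asyncio.run(server.handle_request(request)))
```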
🛠️ Practical Usage Examples
When GitHub Copilot Uses These Tools:
- Asking about jobs: "What job scripts handle atmospheric analysis?" → Uses list_job_scripts
- System questions: "How do I configure for Hera?" → Uses get_system_configs
- Component queries: "Explain the GSI component" → Uses explain_workflow_component
- Workflow questions: "What's the structure of this workflow?" → Uses get_workflow_structure
Direct CLI Usage:
# List all available tools
echo '{"method": "tools/list", "id": 1}' | python3 mcp-server.py
# Get job scripts
echo '{"method": "tools/call", "params": {"name": "list_job_scripts", "arguments": {}}, "id": 2}' | python3 mcp-server.py
# Get system info
echo '{"method": "tools/call", "params": {"name": "get_config_info", "arguments": {"system": "hera"}}, "id": 3}' | python3 mcp-server.py
🎬 Current Status
- ✅ Implemented: Both Python and Node.js MCP servers
- ✅ Configured: VS Code integration ready
- ✅ Tested: Tools are functional and tested
- ✅ Documented: Complete documentation available
- 🔄 Active: Tools enhance your current GitHub Copilot experience
The MCP tools are designed to work seamlessly in the background, providing GitHub Copilot with deep knowledge about the global workflow repository structure, job scripts, system configurations, and component relationships. They're already active and enhancing your development experience!