General: New Projects - FlipsideCrypto/fsc-evm GitHub Wiki
Requires fsc-evm v4.0.0+
fsc-evm for New Projects
Overview
The fsc-evm package is at the heart of all EVM dbt model repos. New EVM blockchain projects deploy through a structured, phased approach using automated make commands that handle RPC node compatibility testing, infrastructure setup, and model deployment. Each project begins with the main_package, which provides foundational models (fact_blocks, fact_transactions, fact_event_logs, fact_traces) and essential automation including GitHub Actions workflow management, Snowflake task scheduling, and streamline data ingestion pipelines.
The framework's custom variable management system extends dbt's native functionality, allowing project-specific configuration through dedicated macro files while maintaining backwards compatibility. Additional specialized packages (decoder_package, curated_package, balances_package, scores_package) can be enabled as needed to provide smart contract decoding, protocol analytics, balance tracking, and activity scoring capabilities.
Key Macros
RPC Node Compatibility
Macro: call_sample_rpc_node(blockchain, node_provider, network, ...)
- Calls the `sample_rpc_node` stored procedure to test RPC node capabilities and compatibility
- Accepts parameters for blockchain, node_provider, network, and various overrides
- Uses project variables as defaults when specific parameters are not provided
- Critical during initial deployment for determining which blockchain data fields are available, which establishes the core table columns
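The parameter-defaulting behavior can be pictured with a small Python sketch. The variable names and default values here are illustrative only, not fsc-evm's actual configuration:

```python
# Hypothetical project-level defaults; names and values are illustrative,
# standing in for the project variables the macro reads.
PROJECT_VARS = {
    "blockchain": "ethereum",
    "node_provider": "quicknode",
    "network": "mainnet",
}

def resolve_rpc_params(blockchain=None, node_provider=None, network=None):
    """Use explicit arguments when given, otherwise fall back to project defaults."""
    return {
        "blockchain": blockchain or PROJECT_VARS["blockchain"],
        "node_provider": node_provider or PROJECT_VARS["node_provider"],
        "network": network or PROJECT_VARS["network"],
    }

# Only the overridden parameter changes; the rest come from project variables.
print(resolve_rpc_params(network="sepolia"))
```

Any parameter passed explicitly wins; everything else is resolved from project configuration, which is what lets the make commands call the macro with minimal arguments.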
Macro: set_dynamic_fields(gold_model)
- Dynamically determines which fields are available for specific gold models (`fact_blocks`, `fact_transactions`)
- Queries the `admin__fact_rpc_details` table to check field availability from RPC responses
- Returns a dictionary mapping field names to boolean availability status
- Enables conditional model building based on what data the blockchain actually provides
- Note: `field` in `all_fields` must be set manually if a new object key appears in the node output for a blockchain
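A minimal Python sketch of the idea, not the actual macro: map each candidate field to a boolean based on what the RPC responses contained, then select columns conditionally. The field names and the `rpc_details` input are hypothetical stand-ins for rows from `admin__fact_rpc_details`:

```python
# Illustrative only: candidate fields per gold model are assumptions,
# not fsc-evm's actual all_fields configuration.
def set_dynamic_fields(rpc_details, gold_model):
    """Return {field_name: bool} availability for a gold model's optional fields.

    rpc_details stands in for field names observed in admin__fact_rpc_details.
    """
    model_fields = {
        "fact_blocks": ["baseFeePerGas", "withdrawals", "blobGasUsed"],
        "fact_transactions": ["maxFeePerGas", "accessList"],
    }
    available = set(rpc_details)
    return {f: f in available for f in model_fields[gold_model]}

fields = set_dynamic_fields(["baseFeePerGas", "withdrawals"], "fact_blocks")
# Only columns whose flag is True would be included in the model's SELECT
selected = [name for name, ok in fields.items() if ok]
print(selected)
```

The returned dictionary is what makes conditional model building possible: templating logic can branch on each flag instead of assuming every chain exposes every field.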
Macro: create_sample_rpc_node_sp()
- Creates the `admin.sample_rpc_node` stored procedure in Snowflake
- Only runs when the `UPDATE_UDFS_AND_SPS` variable is set to true
- The stored procedure tests RPC node endpoints for compatibility with various blockchain data types
- Creates the logging table `admin.rpc_node_logs` to track RPC testing results, including outputs such as `blocks_per_hour`, `range_tested`, and the applicable fields for blocks, transactions, and receipts (and traces, if enabled)
- Grants appropriate permissions to internal roles for procedure usage
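As a hypothetical illustration of the kind of metric `rpc_node_logs` records, `blocks_per_hour` can be derived from the block heights and timestamps at the ends of a tested range (the exact computation inside the stored procedure is an assumption here):

```python
# Illustrative arithmetic only; the stored procedure's real logic may differ.
def blocks_per_hour(start_block, end_block, start_ts, end_ts):
    """Estimate block production rate over a tested range.

    Timestamps are Unix seconds; the result is blocks per hour.
    """
    elapsed_hours = (end_ts - start_ts) / 3600
    return (end_block - start_block) / elapsed_hours

# 300 blocks produced across one hour of timestamps
rate = blocks_per_hour(1_000_000, 1_000_300, 1_700_000_000, 1_700_003_600)
print(rate)
```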
Livequery & Streamline Deployment
Macro: livequery_grants()
- Grants usage permissions on the `_live` and `_utils` schemas to project-specific roles
- Only executes when the `UPDATE_UDFS_AND_SPS` variable is set to true
- Applies grants to the `AWS_LAMBDA_{PROJECT}_API`, `DBT_CLOUD_{PROJECT}`, and `INTERNAL_DEV` roles
- Essential for enabling livequery functionality in deployed projects
Macro: create_evm_streamline_udfs()
- Creates the `streamline` schema and essential UDFs when `UPDATE_UDFS_AND_SPS` is true
- Deploys core streamline functions such as `bulk_rest_api_v2`, `bulk_decode_logs_v2`, and `bulk_decode_traces_v2`
- Required for streamline data ingestion processes to function properly
GitHub Actions Workflow Management
Macro: drop_github_actions_schema()
- Safely drops all existing tasks in the `github_actions` schema
- Used for clean deployments when recreating GitHub Actions automation and updating workflows or schedules
Macro: generate_workflow_schedules(chainhead_schedule)
- Generates cron schedules for all GitHub Actions workflows based on a root chainhead schedule
- Creates unique schedules per repository using database name as a seed to avoid conflicts
- Supports various cadences: hourly, every 4 hours, daily, weekly, monthly, and custom
- Returns workflow names with their corresponding cron schedules and cadence types
- Respects variable overrides for custom scheduling when defined
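The seeding idea above can be sketched as follows. Hashing the database name into a minute offset is one way to give each repository a stable, de-conflicted schedule; the cadence names and hashing scheme here are illustrative, not fsc-evm's exact logic:

```python
# Illustrative sketch: derive a stable per-repository minute offset from the
# database name so repos on the same cadence don't all fire at once.
import hashlib

def schedule_for(database_name, cadence):
    """Return a cron expression seeded by the database name."""
    minute = int(hashlib.sha256(database_name.encode()).hexdigest(), 16) % 60
    crons = {
        "hourly": f"{minute} * * * *",
        "every_4_hours": f"{minute} */4 * * *",
        "daily": f"{minute} 0 * * *",
    }
    return crons[cadence]

# The same database always gets the same minute; different databases
# generally land on different minutes.
print(schedule_for("ETHEREUM", "hourly"))
print(schedule_for("POLYGON", "hourly"))
```

Because the hash is deterministic, redeploying a project regenerates identical schedules, while a new project picks up its own offset with no manual coordination.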
Macro: create_workflow_table(workflow_values)
- Creates the `github_actions.workflows` table with workflow names from the `.github/workflows/...yml` files
- Applied during make command execution to catalog available workflows
- Grants appropriate permissions to internal and project-specific roles
Macro: create_gha_tasks()
- Creates Snowflake tasks that trigger GitHub Actions workflows on defined schedules
- Reads from the `github_actions__workflow_schedule` model to get task definitions
- Tasks execute GitHub workflow dispatches via the `workflow_dispatches` function
- Supports optional automatic task resumption via the `RESUME_GHA_TASKS` variable
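A sketch of the task-generation step, assuming schedule rows shaped like those the `github_actions__workflow_schedule` model might expose. The DDL shape, warehouse name, and the `workflow_dispatches` call signature are assumptions for illustration:

```python
# Illustrative only: the real macro's DDL and the workflow_dispatches
# signature may differ.
def create_task_sql(workflow_name, cron, warehouse="DBT_WH"):
    """Build a Snowflake task that triggers a GitHub Actions workflow on a cron."""
    return (
        f"CREATE OR REPLACE TASK github_actions.trigger_{workflow_name} "
        f"WAREHOUSE = {warehouse} SCHEDULE = 'USING CRON {cron} UTC' AS "
        f"CALL github_actions.workflow_dispatches('{workflow_name}.yml')"
    )

# Hypothetical (workflow_name, cron) rows from the schedule model
rows = [("dbt_run_scheduled", "0 * * * *"), ("dbt_test_daily", "30 2 * * *")]
statements = [create_task_sql(name, cron) for name, cron in rows]
print(statements[0])
```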
Macro: alter_gha_tasks(task_names, task_action)
- Modifies specific GitHub Actions tasks by name with actions like RESUME or SUSPEND
- Accepts comma-separated list of task names for batch operations
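The batch pattern just described can be sketched in a few lines: split the comma-separated list and emit one `ALTER TASK` per name. The statement shape is illustrative, not the macro's exact SQL:

```python
# Illustrative sketch of batch task control; statement text is an assumption.
def alter_gha_tasks(task_names, task_action):
    """Emit one ALTER TASK statement per comma-separated task name."""
    assert task_action in ("RESUME", "SUSPEND"), "unsupported action"
    return [
        f"ALTER TASK github_actions.{name.strip()} {task_action}"
        for name in task_names.split(",")
    ]

print(alter_gha_tasks("task_a, task_b", "SUSPEND"))
```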
Macro: alter_all_gha_tasks(task_action)
- Applies the same action (RESUME/SUSPEND) to all GitHub Actions tasks in the project
- Reads the task list from the `github_actions__workflow_schedule` model for comprehensive management
These macros all work together through the deployment phases (`deploy_chain_phase_1` through `deploy_chain_phase_4`) to establish a fully functional EVM blockchain data pipeline, with proper scheduling, monitoring, and data processing capabilities.
Key Packages
Package: main_package
- Core foundational package that provides essential blockchain data models and infrastructure
- Contains fundamental data layers: bronze (raw ingestion), silver (cleaned/transformed), and gold (analytics-ready)
- Includes critical sub-packages such as (subject to change):
  - core - Primary blockchain data models including `fact_blocks`, `fact_transactions`, `fact_event_logs`, `fact_traces`, `ez_token_transfers`, `ez_native_transfers`, and `dim_contracts`
  - streamline - Data ingestion pipeline models for real-time and historical blockchain data collection
  - admin - Administrative models for variable management, RPC node details, and system metadata
  - prices - Price data models including `ez_prices_hourly`, `ez_asset_metadata`, and `dim_asset_metadata`
  - labels - Address labeling and categorization through `dim_labels`
  - observability - Data quality monitoring and pipeline health tracking
  - utils - Utility models and helper functions
  - github_actions - Workflow automation and task management models
Package: decoder_package
- Specialized package for smart contract interaction decoding and ABI management
- Enables interpretation of raw blockchain data into human-readable formats
- Contains primary sub-packages such as (subject to change):
  - abis - Contract ABI collection, management, and storage across bronze, silver, gold, and streamline layers
  - decoded_logs - Event log decoding pipeline with models like `ez_decoded_event_logs` that translate raw log data using collected ABIs
Package: curated_package
- Higher-level analytics package providing protocol-specific and statistical models
- Built on top of core blockchain data to deliver business intelligence
- Contains specialized sub-packages such as (subject to change):
  - protocols - Protocol-specific models (e.g., Vertex exchange analytics with models like `ez_market_stats`, `ez_account_stats`, `ez_liquidations`, and `ez_market_depth_stats`)
  - stats - Blockchain-wide statistical analysis including `ez_core_metrics_hourly` for network health and usage metrics
Package: balances_package
- Specialized package for tracking account balances and state changes
- Provides comprehensive native-asset and token balance tables
Each package follows the standard dbt layered medallion architecture (bronze → silver → gold) and integrates seamlessly with the fsc-evm variable management system to provide configurable, scalable blockchain data infrastructure.
Deployment Process
Please see New Project Set Up for an in-depth look at the new project deployment process, including step-by-step instructions and developer notes.