General: DeFi: Bridges - FlipsideCrypto/fsc-evm GitHub Wiki
Building and Deploying Bridge Models
Bridges, also known as cross-chain transfer protocols, are platforms that enable the transfer of assets and data between different blockchain networks. These protocols facilitate interoperability in the fragmented blockchain ecosystem, allowing users to move tokens and other digital assets across disparate chains without relying entirely on centralized intermediaries. For blockchain networks, including EVMs, bridge activity typically occurs through specialized smart contracts deployed on both the source and destination chains. These contracts work in tandem to lock assets on one chain and mint or release equivalent representations on another, ensuring that the total supply of bridged assets remains consistent across networks. This guide will detail how to 1) Research activity on Bridge protocols and 2) Build the Silver/Gold DBT model layers for Bridges.
Research Phase
Please reference the Research Phase
section in How to Research and Curate EVM Protocols for official documentation on the research process. The following information will be relevant to Bridge Activity.
Summary:
- In order to properly build a Bridge model, you'll need to identify the contract or bridge address for the protocol. This address is crucial as it serves as the primary point of interaction for users initiating cross-chain transfers. You'll also need to understand the specific events emitted by the bridge contracts, and while these models focus on events stemming from the Source Chain to the Destination Chain, it's important to track both sides of a bridge transaction to get a complete picture of the transfer and fully understand the implications of the events/functions. Some key metrics to consider during the research phase are:
- The source and destination chain identifiers
- Token addresses and amounts on both chains
- User addresses initiating and receiving the transfers
- Transaction hashes on both chains
- Block Number and Timestamp of the initiation and completion of the transfer
Best Practices, Tips & Tricks:
- Bridge events generally occur directly on the router or Bridge contract address. We are typically tracking Bridge events OUT from the Source to the Destination Chain, rather than Bridge events IN to the Source Chain.
- While Bridge events are similarly structured across the majority of protocols, certain protocols may refer to the token and amount sent differently.
- Other protocols may be structured more similarly to DEX Liquidity Pools and Swaps models. Please see silver_bridge.stargate_swap for examples on how to build these models.
- It is common to build Bridge models with `silver.decoded_logs`, rather than manually parsing via `silver.logs`. This is because the structure of the data in each emitted event may vary widely and therefore, it is more straightforward / performant to use the decoded tables as the foundation.
- Required logic to parse Bridge Transfers/Transactions Out from the Source to the Destination Chain may vary widely per protocol or protocol version. In some cases, certain protocols (such as Wormhole) may not have the required values emitted in events. You may need to reference transfers tables or traces to successfully parse all relevant columns that make up a Bridge transaction.
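During research, a quick survey of the events a candidate bridge contract emits is often the fastest way to identify which event names and payloads to build against. The sketch below assumes a `silver.decoded_logs`-style table with `contract_address`, `event_name` and `block_timestamp` columns; actual table and column names may differ by project, and the placeholder address must be replaced with the bridge/router contract identified in research.

```sql
-- Sketch: survey which events a candidate bridge contract emits and how often.
-- Table/column names are illustrative; replace the placeholder address with
-- the bridge or router contract found during research.
SELECT
    event_name,
    COUNT(*) AS event_count,
    MIN(block_timestamp) AS first_seen,
    MAX(block_timestamp) AS last_seen
FROM silver.decoded_logs
WHERE contract_address = LOWER('0xYourBridgeContract') -- hypothetical placeholder
GROUP BY 1
ORDER BY event_count DESC;
```

The most frequent outbound-transfer event (and its decoded fields) typically becomes the foundation of the protocol-specific Silver model.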
Silver Layer Modeling
This guide outlines the process of creating the Silver Layer DBT models for tracking Bridge events, transfers or traces on EVM-compatible blockchains, along with the applicable DBT tests for maintaining data integrity. This includes both Protocol Specific and Curated Complete models.
Please reference the Silver Layer Modeling
section in How to Research and Curate EVM Protocols for more details on the modeling process. The following information is related directly to Bridge Activity only.
1. Protocol Specific Models
Summary:
Built with `silver.logs`, `silver.decoded_logs` and/or `silver.traces`, `silver.decoded_traces`, the Silver models are designed to capture essential information about Bridge Activity OUT from the Source to the Destination Chain, rather than Bridge Activity IN to the Source Chain. This includes token and amount sent, sender address, recipient address, destination chain and other relevant values.
Best Practices, Tips & Tricks:
- In rare cases, models may require contract reads to collect the tokens and other relevant metadata, if they cannot be parsed onchain through event logs or traces. Please reference the guide on Contract Reads for more details.
- Some protocols may have different deployed versions or topic_0 / events for bridge transfers on the same or different contract addresses. If the structure of the event is different, we typically deploy multiple models for each version (e.g. `silver_bridge.across_fundsdeposited`, `silver_bridge.across_v3fundsdeposited`).
Implementation Steps, Variables & Notes:
- Set up the model configuration
  - Use `incremental` materialization with a `delete+insert` strategy.
  - Define a unique key to identify each record that the incremental strategy should apply to. The unique key defined in the config will be different than the unique column / unique row in the table. This is because the model must delete+insert on a broader key to properly handle block reorg scenarios. Please see the Block Reorg guide for more information. We typically use `block_number` in Bridge models because it allows us to reload blocks in their entirety, rather than reloading by transaction or event and potentially missing, loading or deleting incorrect records.
  - Cluster by columns with low cardinality, such as `block_timestamp::DATE`, to improve query performance.
  - Add relevant tags for model categorization (typically tagged as `curated` and `reorg`) so that the model may be picked up by the relevant scheduled workflows.
- Use:

```sql
{{ config(
    materialized = 'incremental',
    incremental_strategy = 'delete+insert',
    unique_key = "block_number",
    cluster_by = ['block_timestamp::DATE'],
    tags = ['curated','reorg']
) }}
```
- Create a CTE for Bridge events
  - Manually parse event data to extract relevant information (token address, amount sent, sender address, recipient address, destination chain, etc.).
  - For incremental runs, filter data based on the latest `_inserted_timestamp`, with a `12 hour` lookback.
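The CTE described above can be sketched as follows. The event name, decoded field names and the placeholder contract address are all hypothetical — the actual keys depend entirely on the protocol's emitted event, and the decoded payload column (`decoded_flat` here) may be named differently in your project.

```sql
-- Sketch of a protocol-specific Bridge events CTE. The event name ('Send'),
-- decoded field keys and contract address are hypothetical examples; replace
-- them with the values identified during research.
WITH bridge_events AS (
    SELECT
        block_number,
        block_timestamp,
        tx_hash,
        event_index,
        contract_address AS bridge_address,
        event_name,
        decoded_flat :"token" :: STRING AS token_address,
        decoded_flat :"amount" :: INTEGER AS amount,
        decoded_flat :"sender" :: STRING AS sender,
        decoded_flat :"receiver" :: STRING AS receiver,
        decoded_flat :"dstChainId" :: STRING AS destination_chain_id,
        _log_id,
        _inserted_timestamp
    FROM {{ ref('silver__decoded_logs') }}
    WHERE contract_address = LOWER('0xYourBridgeContract') -- hypothetical placeholder
        AND event_name = 'Send' -- hypothetical event name

{% if is_incremental() %}
    AND _inserted_timestamp >= (
        SELECT MAX(_inserted_timestamp) - INTERVAL '12 hours'
        FROM {{ this }}
    )
{% endif %}
)
SELECT * FROM bridge_events
```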
- Add the final SELECT statement
  - Select all relevant fields from the Bridge events CTE.
  - Manually add the protocol name as the `platform`, where necessary.
  - If not already emitted in the event, determine the `destination_chain_receiver` address. For bridging between EVMs or EVM L2s, this column is typically the same as the `receiver` address. However, for non-EVMs, the address may vary and requires decoding/encoding via UDFs. Please see silver_bridge.wormhole_transfers for examples on how this can be transformed.
  - The `source_chain` and/or `destination_chain` values may not come standardized within the protocol's emitted events. Please see silver_bridge.allbridge_sent for examples on how we attempt to standardize these values.
- Create the `.yml` file and add table/column level tests
  - At minimum, tests should include, but are not limited to, the following:
    - Uniqueness on 1+ columns (typically `_log_id`, but may vary by model)
    - `not_null` tests on `bridge_address`, `token_address`, `amount`, `sender`, `receiver` and `destination_chain_receiver`
```yaml
version: 2
models:
  - name: silver_bridge__celer_cbridge_send
    tests:
      - dbt_utils.unique_combination_of_columns:
          combination_of_columns:
            - _LOG_ID
    columns:
      - name: BLOCK_NUMBER
        ...
```
Examples, References & Sources:
2. Complete Models
Summary:
The `complete_bridge_activity` model aggregates all protocol-specific Bridge models into one, allowing centralized joins on `silver.contracts` and `ez_prices_hourly` for token metadata, hourly price data, streamlined logic and standardized columns. This includes platform names, versions, USD amounts, symbols, decimals and other relevant columns for all Bridge Activity.
Best Practices, Tips & Tricks:
- Protocol Versions: If there are multiple versions of a protocol, coming from separate models, each should have its own CTE.
  - The name for the `platform` column should include both the `protocol` and `version`, separated by a `-` (dash). You do not need to include `v1`, but instead begin incrementing at `v2` and beyond.
  - The `version` column should represent a version specific to the CTE/model being referenced. For example, if there are multiple CTEs/models required for Across, you would add `across` as the `platform` and `v1` as the `version` in the first CTE, then `across-v3` as the `platform` and `v3` as the `version` in the next CTE. This is required because the `delete+insert` incremental strategy uses both `platform` and `version` to identify new incremental records.
- When adding new Protocols/CTEs/models to the `complete` model, please pay attention to the number of columns being pulled through, the order of those columns, and the data type/format and names of those columns, as this will materially impact the output of the model.
  - More specifically, pay attention to how `destination_chain` and `destination_chain_id` are handled, or if these columns are NULL/missing entirely. Unique logic may be required for certain Bridges.
    - In some cases, only the `chain_id` is available and we need to join the chain list found in `external.defillama.dim_chains` to get the chain names, or vice versa.
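The chain-name resolution described above might look like the sketch below. The source reference and the column names on `dim_chains` (`chain_id`, `chain`) are assumptions for illustration; verify them against the actual table before use.

```sql
-- Sketch: resolve a numeric destination chain id to a chain name via the
-- DefiLlama chain list. Source name and dim_chains columns are illustrative.
SELECT
    b.tx_hash,
    b.destination_chain_id,
    LOWER(c.chain) AS destination_chain
FROM bridge_events b
LEFT JOIN {{ source('external_defillama', 'dim_chains') }} c
    ON b.destination_chain_id :: STRING = c.chain_id :: STRING
```

The reverse direction (chain name to `chain_id`) works the same way with the join keys swapped.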
Implementation Steps, Variables & Notes:
- Set up the model configuration
  - Use `incremental` materialization with a `delete+insert` strategy.
  - Define a unique key to identify each record that the incremental strategy should apply to. The unique key defined in the config will be different than the unique column / unique row in the table. This is because the model must delete+insert on a broader key to properly handle block reorg scenarios. Please see the Block Reorg guide for more information. We typically use `['block_number','platform','version']` in the complete bridge activity model because it allows us to reload blocks, platforms and versions in their entirety, rather than reloading by transaction or event and potentially missing, loading or deleting incorrect records.
  - Cluster by columns with low cardinality, such as `block_timestamp::DATE` and `platform`, to improve query performance.
  - Add Search Optimization (SO) in the post_hook with `EQUALITY` and `SUBSTRING`. Please only add SO on non-number columns, where applicable.
  - Add relevant tags for model categorization (typically tagged as `['curated','reorg','heal']`) so that the model may be picked up by the relevant scheduled workflows.
- Use:

```sql
{{ config(
    materialized = 'incremental',
    incremental_strategy = 'delete+insert',
    unique_key = ['block_number','platform','version'],
    cluster_by = ['block_timestamp::DATE','platform'],
    post_hook = "ALTER TABLE {{ this }} ADD SEARCH OPTIMIZATION ON EQUALITY(tx_hash, origin_from_address, origin_to_address, origin_function_signature, bridge_address, sender, receiver, destination_chain_receiver, destination_chain_id, destination_chain, token_address, token_symbol), SUBSTRING(origin_function_signature, bridge_address, sender, receiver, destination_chain_receiver, destination_chain, token_address, token_symbol)",
    tags = ['curated','reorg','heal']
) }}
```
- Create CTEs for each Protocol Specific model/version, where applicable.
  - Select all columns from the source table.
  - Use standardized column names, as each of these CTEs will be unioned together towards the end of the model.
  - Add columns with `NULL` values in cases where the column exists in other Protocol Specific CTEs.
    - Columns for destination chain may require additional minor transformations.
  - Hardcode the `platform` and `version`, if not done already.
  - For incremental runs, filter data based on the latest `_inserted_timestamp`, with the `LOOKBACK` variable, defaulted to `4 hours`.
    - The name used to reference `var('HEAL_MODELS')` should match the name of the CTE.
```sql
...
across_v3 AS (
    SELECT
        block_number,
        block_timestamp,
        origin_from_address,
        origin_to_address,
        origin_function_signature,
        tx_hash,
        event_index,
        bridge_address,
        event_name,
        platform,
        'v3' AS version,
        sender,
        receiver,
        destination_chain_receiver,
        destination_chain_id :: STRING AS destination_chain_id,
        NULL AS destination_chain,
        token_address,
        NULL AS token_symbol,
        amount AS amount_unadj,
        _log_id AS _id,
        _inserted_timestamp
    FROM
        {{ ref('silver_bridge__across_v3fundsdeposited') }}

{% if is_incremental() and 'across_v3' not in var('HEAL_MODELS') %}
    WHERE
        _inserted_timestamp >= (
            SELECT
                MAX(_inserted_timestamp) - INTERVAL '{{ var("LOOKBACK", "4 hours") }}'
            FROM
                {{ this }}
        )
{% endif %}
),
...
```
- Union all Protocol Specific CTEs together with `SELECT *` and `UNION ALL`.
```sql
...
all_bridges AS (
    SELECT
        *
    FROM
        across
    UNION ALL
    SELECT
        *
    FROM
        across_v3
    UNION ALL
    SELECT
    ...
```
- Create the `complete` CTE based on the unioned data, and apply transformation logic.
  - Apply any and all transformation logic required to properly produce the values needed for the different protocols, using as much standardized logic as possible.
  - This is where you will join `silver.contracts` and `price.ez_prices_hourly` to include applicable USD bridge pricing and token metadata values.
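The enrichment joins described above might look like the sketch below. The `ref()` names, join keys and metadata column names (`token_decimals`, `price`, `hour`) are illustrative assumptions; align them with the actual `silver.contracts` and `price.ez_prices_hourly` schemas.

```sql
-- Sketch: enrich the unioned bridge rows with token metadata and hourly USD
-- prices. Ref names, join keys and column names are illustrative.
complete_bridge_activity AS (
    SELECT
        b.*,
        c.token_symbol,
        c.token_decimals,
        b.amount_unadj / POW(10, c.token_decimals) AS amount,
        ROUND(
            b.amount_unadj / POW(10, c.token_decimals) * p.price,
            2
        ) AS amount_usd
    FROM all_bridges b
    LEFT JOIN {{ ref('silver__contracts') }} c
        ON b.token_address = c.contract_address
    LEFT JOIN {{ ref('price__ez_prices_hourly') }} p
        ON b.token_address = p.token_address
        AND DATE_TRUNC('hour', b.block_timestamp) = p.hour
)
```

Using LEFT JOINs keeps bridge rows whose token metadata or price is late-arriving; those NULLs are what the `heal_model` logic below backfills.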
- Add the `heal_model` CTE with the `HEAL_MODEL` var. For more information, please see the Incremental Heal Logic guide.
  - For Bridge Activity, we typically need to ensure that late-arriving decimals and USD amounts are properly handled.
- Create the final SELECT statement.
  - Add the `id` column using `dbt_utils.generate_surrogate_key` and include the column that represents one unique row.
  - Add `inserted_timestamp` and `modified_timestamp` using `SYSDATE()` to represent the timestamp where new rows are inserted or modified in the model.
  - Add the `_invocation_id` column, which outputs a UUID generated for each newly inserted row. For more information, please see DBT Docs: invocation_id.
```sql
{{ dbt_utils.generate_surrogate_key(
    ['_id']
) }} AS complete_bridge_activity_id,
SYSDATE() AS inserted_timestamp,
SYSDATE() AS modified_timestamp,
'{{ invocation_id }}' AS _invocation_id
```
- Create the `.yml` file and add table/column level tests
  - At minimum, tests should include, but are not limited to, the following:
    - Uniqueness on 1+ columns (typically `_log_id`, but may vary by model)
    - `not_null` tests on `bridge_address`, `token_address`, `amount`, `sender`, `receiver` and `destination_chain_receiver`
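A minimal `.yml` for the complete model might look like the sketch below, mirroring the protocol-specific example earlier. The unique column (`_ID`) and the tested column are illustrative and should match the columns actually generated in the model.

```yaml
# Sketch of a minimal .yml for the complete model; the unique column (_ID)
# and tested columns are illustrative.
version: 2
models:
  - name: silver_bridge__complete_bridge_activity
    tests:
      - dbt_utils.unique_combination_of_columns:
          combination_of_columns:
            - _ID
    columns:
      - name: BRIDGE_ADDRESS
        tests:
          - not_null
```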
Examples, References & Sources:
- ethereum.silver_bridge.complete_bridge_activity.sql
- ethereum.silver_bridge.complete_bridge_activity.yml
Gold Layer Modeling
Please reference the Gold Layer Modeling
section in How to Research and Curate EVM Protocols for more details on the modeling process. The following information is related directly to Bridge Activity only.
Summary:
This guide outlines the process of creating the Gold Layer DBT models for tracking Bridge events, transfers or traces on EVM-compatible blockchains, along with the applicable table and column descriptions for transparency and documentation. The Gold model will select directly from the `complete` model, and will be surfaced to all users and clients.
Implementation Steps, Variables & Notes:
- Set up the model configuration
  - Use `view` materialization.
  - Add persist_docs, with `relation` and `columns` set to `true`.
  - Add `database_tags` to the `meta` section in the config. These should include the names of every `PROTOCOL` (platform) in the model, along with one-word values that define the `PURPOSE` (category) of the model.
- Use:

```sql
{{ config(
    materialized = 'view',
    persist_docs ={ "relation": true,
    "columns": true },
    meta ={ 'database_tags':{ 'table':{ 'PROTOCOL': 'ACROSS, ALLBRIDGE, AXELAR, CELER, CBRIDGE, DLN, DEBRIDGE, EYWA, HOP, MESON, MULTICHAIN, NATIVE, STARGATE, SYMBIOSIS, SYNAPSE, WORMHOLE',
    'PURPOSE': 'BRIDGE' } } }
) }}
```
- SELECT all relevant columns to pull through from `complete_bridge_activity` and surface to users in the Gold layer.
  - Adjust the column names so that they are intuitive for a user analyzing what the table represents, which in this case is an EZ view on Bridge events and related USD values.
  - Apply custom logic to NULL USD amounts in cases where the USD value is greater than `1e15`. We do this to avoid overstating volumes in cases where bridging between low-liquidity tokens applies.
  - Apply the `ROUND` function, and round to the nearest 2 decimals to emulate USD denominations.
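The USD guardrail and rounding described above can be sketched as follows; the selected columns and the `ref()` name are illustrative and should match the actual complete model.

```sql
-- Sketch of the Gold-layer USD guardrail: NULL out implausibly large USD
-- amounts (> 1e15, typically low-liquidity token artifacts) and round the
-- rest to 2 decimals. Column list and ref name are illustrative.
SELECT
    block_number,
    block_timestamp,
    tx_hash,
    platform,
    token_address,
    token_symbol,
    amount,
    CASE
        WHEN amount_usd > 1e15 THEN NULL
        ELSE ROUND(amount_usd, 2)
    END AS amount_usd
FROM {{ ref('silver_bridge__complete_bridge_activity') }}
```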
- Create the `.yml` file and add table/column level descriptions.