Setting Up Cross-Chain Data Feeds for Tokenized Assets

A technical tutorial for deploying and maintaining oracle data feeds that serve price and metadata for tokenized assets across multiple blockchain ecosystems.
FOUNDATIONS

Introduction

Tokenized assets require reliable, real-time data across multiple blockchains. This guide explains how to build and manage cross-chain data feeds.

Tokenized assets—representing everything from real estate to corporate bonds—are increasingly deployed across multiple blockchains to access broader liquidity and user bases. However, their value and functionality depend on accurate, timely data that must be synchronized across these fragmented networks. A cross-chain data feed aggregates and verifies information like price, collateralization ratios, or ownership status from disparate sources, delivering a single source of truth to smart contracts on any chain. Without this infrastructure, tokenized assets risk becoming isolated and illiquid.

Building these feeds presents unique technical challenges. Data must be securely transmitted across chains with varying security models and consensus mechanisms. Solutions like Chainlink's Cross-Chain Interoperability Protocol (CCIP), LayerZero's omnichain messaging (the foundation of its OFT token standard), and Wormhole's generic message passing provide frameworks for this, but integrating them requires careful design. Developers must decide on an architecture (oracle networks, light clients, or optimistic verification) based on their specific needs for latency, cost, and trust assumptions.

This guide provides a practical, code-focused walkthrough for setting up a production-ready cross-chain data feed. We'll cover the core components: a source contract on an origin chain (e.g., Ethereum) that emits data, a relay service or oracle network that transports it, and a destination contract on a target chain (e.g., Arbitrum or Polygon) that consumes and validates the incoming information. Each step will include Solidity and JavaScript examples using real protocols to illustrate the implementation details and security considerations.

FOUNDATION

Prerequisites

Essential knowledge and tools required to build with cross-chain data feeds for tokenized assets.

Before implementing cross-chain data feeds, you need a solid grasp of core blockchain concepts. This includes understanding smart contracts, the EVM (Ethereum Virtual Machine), and the role of oracles like Chainlink or Pyth in fetching off-chain data. Familiarity with token standards such as ERC-20 and ERC-721 is crucial, as these represent the assets you'll be tracking. You should also be comfortable with the basics of interoperability protocols and the security models of different blockchain networks.

Your development environment must be properly configured. You will need Node.js (v18 or later) and a package manager like npm or yarn. Essential tools include Hardhat or Foundry for smart contract development and testing, and a wallet such as MetaMask for interacting with testnets. You'll also require access to RPC endpoints for the chains you intend to use (e.g., Sepolia, Arbitrum Sepolia, Base Sepolia) from providers like Alchemy or Infura, and a small amount of testnet ETH or native gas tokens to deploy contracts.
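
As a point of reference, a minimal Hardhat network configuration for these testnets might look like the sketch below; the RPC URL and PRIVATE_KEY environment variables are placeholders for your own provider credentials.

javascript
// hardhat.config.js (minimal sketch; env vars are placeholders)
require("@nomicfoundation/hardhat-toolbox");

module.exports = {
  solidity: "0.8.19",
  networks: {
    sepolia: {
      url: process.env.SEPOLIA_RPC_URL,    // e.g., an Alchemy or Infura endpoint
      accounts: [process.env.PRIVATE_KEY], // an account funded with Sepolia ETH
    },
    arbitrumSepolia: {
      url: process.env.ARBITRUM_SEPOLIA_RPC_URL,
      accounts: [process.env.PRIVATE_KEY],
    },
  },
};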

For this guide, we will use Chainlink's Cross-Chain Interoperability Protocol (CCIP) and Data Feeds as our primary examples. You will need testnet LINK to pay CCIP message fees (CCIP accepts fees in LINK or the chain's native gas token). We will write contracts in Solidity 0.8.19+ and use the @chainlink/contracts NPM package. All code examples are executable on the Ethereum Sepolia testnet, and the principles apply equally to mainnet and other EVM-compatible chains like Polygon or Avalanche.

SYSTEM ARCHITECTURE OVERVIEW

Architecture Overview

A cross-chain data feed aggregates and verifies asset information from multiple blockchains, enabling unified applications like multi-chain DEXs and lending protocols.

A cross-chain data feed is a critical infrastructure component that collects, validates, and serves real-time information about tokenized assets—such as prices, liquidity, and total supply—from multiple blockchain networks to a single application. Unlike a standard oracle that fetches off-chain data, a cross-chain feed must handle the inherent complexities of different consensus mechanisms, block times, and data formats. The primary goal is to create a single source of truth for asset states across chains, which is essential for decentralized applications (dApps) like multi-chain decentralized exchanges (DEXs), cross-chain lending platforms, and portfolio trackers that need a consolidated view of user assets.

The core architectural challenge is data integrity and synchronization. You cannot assume data from one chain is immediately valid or final on another. A robust architecture typically employs a three-layer model:

  • Source Layer: Light clients or RPC nodes on each origin chain (Ethereum, Solana, Avalanche) that fetch raw data.
  • Aggregation & Validation Layer: A decentralized network of nodes that attests to the validity of the sourced data, often using cryptographic proofs like Merkle Patricia proofs for EVM chains or light client state proofs for others.
  • Delivery Layer: A standardized API or on-chain smart contract (e.g., on a destination chain like Arbitrum) where applications can query the verified, aggregated data feed.

For developers, implementing the source layer requires chain-specific tools. For an EVM chain like Ethereum, you might use an RPC provider with the eth_getProof method to generate storage proofs for a token's total supply. On Solana, you would query an RPC for an account's lamports and parse the token account data structure. This raw data is then passed to the validation layer. A common pattern is to use a zk-SNARK circuit or an optimistic verification game to prove the correctness of the cross-chain state transition before the data is finalized in the delivery layer, preventing a malicious node from reporting false liquidity from a forked chain.
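
To make the source layer concrete, the sketch below uses ethers.js to call the standard eth_getProof RPC method for an ERC-20's total supply. The storage slot is an assumption (slot 2 matches the classic OpenZeppelin ERC-20 layout) and must be verified against the actual contract.

javascript
const { ethers } = require('ethers');

const provider = new ethers.providers.JsonRpcProvider(ETHEREUM_RPC_URL);

async function fetchTotalSupplyProof(tokenAddress, blockTag) {
  // Slot 2 holds _totalSupply in the classic OpenZeppelin ERC-20 layout;
  // confirm against your token's storage layout before relying on it.
  const slot = '0x2';
  const proof = await provider.send('eth_getProof', [
    tokenAddress,
    [slot],
    blockTag, // e.g., '0x' + blockNumber.toString(16), or 'latest'
  ]);
  // storageProof[0] contains the Merkle Patricia proof nodes and the raw value
  return proof.storageProof[0];
}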

The delivery mechanism dictates the feed's latency and trust assumptions. An on-chain delivery via a smart contract (like a Chainlink Data Feed or a custom oracle contract) provides the highest security for DeFi applications but incurs gas costs and slower update speeds. An off-chain API delivery offers lower latency and is suitable for analytics dashboards, but requires users to trust the API operator. Many systems use a hybrid approach: critical financial data (like price for liquidation) is written on-chain, while supplementary data (like 24h volume) is served via a permissioned API. The Chainlink CCIP and Wormhole Queries are examples of production systems tackling this problem.

When designing your feed, key trade-offs must be evaluated. Decentralization vs. Performance: A fully decentralized validator set increases security but can slow down data finality. Generality vs. Optimization: A feed supporting arbitrary data types is more flexible but less efficient than one built specifically for ERC-20 token balances. Cost: On-chain storage and verification, especially with ZK proofs, can be expensive. Start by identifying the minimum viable data (e.g., just price and liquidity for top 10 assets) and the maximum acceptable latency (e.g., 2 minutes) for your application to scope the architecture effectively.

Ultimately, a well-architected cross-chain data feed acts as the nervous system of the interoperable economy. It enables scenarios like a lender on Avalanche accepting wrapped Bitcoin as collateral at a real-time price, or a DEX routing a trade through the chain with the deepest liquidity for a given pair. By understanding the source, validation, and delivery layers, developers can build or integrate feeds that are not only functional but also secure and resilient against cross-chain manipulation attacks.

DATA INFRASTRUCTURE

Core Components

These are the foundational protocols and services required to build reliable cross-chain applications for tokenized assets.

DATA FEED INFRASTRUCTURE

Cross-Chain Messaging Protocol Comparison

Key protocols for building cross-chain data feeds, evaluated on security, cost, and developer experience.

| Feature / Metric | LayerZero | Wormhole | Axelar | CCIP |
| --- | --- | --- | --- | --- |
| Security Model | Decentralized Verifier Network | Guardian Network (19/33) | Proof-of-Stake Validator Set | Risk Management Network |
| Finality Time (avg.) | < 2 minutes | < 15 seconds | ~6 minutes | < 10 minutes |
| Gas Cost per Message (est.) | $0.25 - $1.50 | $0.10 - $0.75 | $0.50 - $2.00 | $0.75 - $3.00 |
| Arbitrary Data Payload | ✓ | ✓ | ✓ | ✓ |
| Native Token Transfers | ✓ | ✓ | ✓ | ✓ |
| Programmable Actions (General Msg) | ✓ | ✓ | ✓ | ✓ |
| Supported Chains (Mainnet) | 50+ | 30+ | 55+ | 10+ |
| Developer Tooling | SDK, Scan, Testnet | SDK, Explorer, Devnet | SDK, Satellite, Testnet | SDK, Functions, Faucet |

INTEROPERABILITY

Standardizing the Data Payload

A consistent data schema is the foundation for reliable cross-chain communication, enabling smart contracts to understand and act on information from any network.

When a smart contract on Ethereum needs to verify the price of a tokenized asset whose primary market is on Solana, it cannot directly read the Solana blockchain. Instead, it relies on a data feed—a stream of verified information relayed across the chain boundary. The critical challenge is ensuring that the data structure, or payload, is identical when encoded on the source chain and decoded on the destination chain. Without a strict, shared schema, a contract might misinterpret a uint256 price as an address or fail to parse the data entirely, leading to failed transactions or incorrect logic execution.

Standardization typically involves defining a schema using formats like Protocol Buffers (protobuf) or Solidity structs that are agreed upon by all participating systems. For a tokenized asset feed, this payload must include essential fields such as assetId (a unique identifier like a CAIP-19 asset namespace), price (as a fixed-point integer to avoid floating-point errors), timestamp (in Unix time), and a dataSource identifier (e.g., the oracle or DEX pool address). This structured approach replaces ambiguous, comma-separated strings with strongly-typed, versioned data packets that are both efficient to transmit and unambiguous to interpret.

Implementing this in practice requires coordination between the off-chain oracle or relayer and the on-chain contracts. For example, a Chainlink oracle on Ethereum emitting a price feed would use an External Adapter to fetch data from a Solana Pyth network feed, format it into the predefined protobuf schema, and submit it on-chain. The consuming contract, pre-programmed with the same schema via an interface, can then decode the payload's bytes array directly into usable variables: (string assetId, uint256 price, uint64 timestamp) = abi.decode(data, (string, uint256, uint64));.
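
On the relayer side, the payload can be produced with ethers.js so that it decodes cleanly on-chain. The sketch below mirrors the Solidity decoder above; the asset identifier and price are illustrative values.

javascript
const { ethers } = require('ethers');

// Types must match the on-chain decoder exactly: (string, uint256, uint64)
const payload = ethers.utils.defaultAbiCoder.encode(
  ['string', 'uint256', 'uint64'],
  [
    'eip155:1/erc20:0xA0b86991c6218b36c1d19D4a2e9eb0cE3606eB48', // CAIP-19 assetId (illustrative)
    ethers.utils.parseUnits('1.0002', 18), // fixed-point price with 18 decimals
    Math.floor(Date.now() / 1000),         // Unix timestamp
  ]
);
// `payload` is the bytes value submitted on-chain and decoded via abi.decode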

Beyond basic data fields, a robust payload standard must also account for error states, data freshness, and multi-source aggregation. Including a status enum (e.g., ACTIVE, STALE, ERROR) allows destination contracts to handle unreliable data gracefully. Timestamp tolerances can be enforced to reject outdated information. For critical financial data, the payload may contain multiple price observations from different sources (like Orca, Raydium, and Serum on Solana) alongside a computed aggregate, enabling the destination contract to implement its own validation or averaging logic.

Adopting a community-driven standard, such as extending the Cross-Chain Interoperability Protocol (CCIP) message format or the Wormhole VAA (Verified Action Approval) payload structure, provides immediate compatibility with existing infrastructure. This eliminates the need for custom parsing logic for each new feed. The ultimate goal is to create a lingua franca for asset data, where any application on any chain can subscribe to and trust a feed simply by implementing the standard decoder, dramatically accelerating the development of interoperable DeFi primitives for tokenized real-world assets, equities, and commodities.

PRACTICAL GUIDES

Implementation Steps by Platform

Deploying a Chainlink Data Feed Consumer

Chainlink Data Feeds provide decentralized price oracles for tokenized assets. Implementation requires a smart contract to consume the feed data.

Key Steps:

  1. Select a Data Feed: Identify the correct proxy address for your asset pair (e.g., BTC/USD) on the target network from the Chainlink Data Feeds directory.
  2. Write Consumer Contract: Import the AggregatorV3Interface and call the latestRoundData() function.
  3. Deploy and Read: Deploy your contract. Standard Data Feeds are push-based (node operators write updates on-chain), so reading latestRoundData() costs only gas; LINK funding is required only for request-response products such as Chainlink Functions.
solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.7;

import "@chainlink/contracts/src/v0.8/interfaces/AggregatorV3Interface.sol";

contract PriceConsumerV3 {
    AggregatorV3Interface internal priceFeed;

    /// @param _priceFeedAddress Proxy address of the feed (e.g., BTC/USD on Sepolia)
    constructor(address _priceFeedAddress) {
        priceFeed = AggregatorV3Interface(_priceFeedAddress);
    }

    /// @notice Returns the latest price. Most USD pairs report 8 decimals;
    /// check priceFeed.decimals() rather than hardcoding.
    function getLatestPrice() public view returns (int) {
        // latestRoundData returns (roundId, answer, startedAt, updatedAt,
        // answeredInRound); only the answer is consumed here.
        (
            ,
            int price,
            ,
            ,
        ) = priceFeed.latestRoundData();
        return price;
    }
}
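
Once deployed, the consumer can also be read off-chain. The sketch below assumes a hypothetical CONSUMER_ADDRESS for your deployment; most USD pairs report 8 decimals, which you should confirm via the feed's decimals() function.

javascript
const { ethers } = require('ethers');

const provider = new ethers.providers.JsonRpcProvider(SEPOLIA_RPC_URL);
const consumer = new ethers.Contract(
  CONSUMER_ADDRESS, // your deployed PriceConsumerV3
  ['function getLatestPrice() view returns (int256)'],
  provider
);

async function readPrice() {
  const raw = await consumer.getLatestPrice();
  // Most Chainlink USD pairs use 8 decimals; confirm with the feed's decimals()
  console.log(`Latest price: ${ethers.utils.formatUnits(raw, 8)}`);
}

readPrice();
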
BUILDING THE RELAYER SERVICE

The Relayer Service

A relayer service is the core infrastructure for fetching, verifying, and transmitting asset data across blockchains. This guide details its architecture and implementation.

A cross-chain data feed relayer is a trust-minimized oracle that synchronizes state information for tokenized assets (like wrapped BTC or bridged USDC) between a source chain and a destination chain. Its primary function is to listen for events—such as a mint or burn on the source chain—and submit corresponding proofs to a verification contract on the destination chain. This enables applications like cross-chain DEXs or lending protocols to have accurate, real-time collateral balances. Unlike general-purpose oracles, these relayers are optimized for specific asset bridges, requiring deep integration with their smart contract interfaces.

The service architecture typically consists of three core components: an event listener, a proof generator, and a transaction submitter. The listener monitors the source chain's bridge contract using an RPC provider. When a Transfer or Lock event is emitted, the relayer captures the transaction hash, block number, and event logs. For high reliability, use a service like Chainlink Functions or POKT Network for decentralized RPC access, or run your own node cluster. The proof generator then uses this data to create a Merkle proof or query a light client for state verification, depending on the bridge's security model.

Implementing the listener requires careful handling of chain reorganizations and RPC failures. Here's a basic Node.js example using ethers.js to listen for events on an arbitrary bridge contract:

javascript
const { ethers } = require('ethers');

const provider = new ethers.providers.JsonRpcProvider(SOURCE_CHAIN_RPC);
const bridgeContract = new ethers.Contract(BRIDGE_ADDRESS, BRIDGE_ABI, provider);

// Handler is async because it awaits the block lookup; relayQueue is your
// persistent job queue feeding the proof-generation stage.
bridgeContract.on('TokensLocked', async (sender, token, amount, event) => {
  const block = await provider.getBlock(event.blockNumber);
  // Store event data and block timestamp for proof generation
  relayQueue.add({ txHash: event.transactionHash, block, event });
});

Always implement a block confirmation delay (e.g., 10-15 blocks) before processing events to avoid orphaned blocks during reorgs.
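
A minimal sketch of that confirmation gate is shown below; the relayQueue methods and the generateAndSubmitProof handler are hypothetical stand-ins for your own queue and proof pipeline.

javascript
const CONFIRMATIONS = 12; // conservative buffer against short reorgs

async function processQueue() {
  const head = await provider.getBlockNumber();
  for (const job of relayQueue.pending()) { // hypothetical queue API
    if (head - job.event.blockNumber < CONFIRMATIONS) continue; // wait for depth
    // Re-fetch the receipt: a reorg may have dropped or moved the transaction
    const receipt = await provider.getTransactionReceipt(job.txHash);
    if (!receipt || receipt.blockHash !== job.block.hash) {
      relayQueue.discard(job); // orphaned block; the listener will re-emit
      continue;
    }
    await generateAndSubmitProof(job); // hand off to the proof generator
  }
}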

The most critical step is proof generation and verification. For optimistic bridges like Across or Synapse, the relayer simply needs to forward the message. For light client bridges (e.g., IBC, Near Rainbow Bridge), it must generate a Merkle proof of the transaction's inclusion. For zero-knowledge bridges like zkBridge, it submits a validity proof. The relayer calls the destination chain's verification contract, such as a State Receiver on Polygon zkEVM or an AMB (Arbitrary Message Bridge) on Gnosis Chain. Gas costs on the destination chain are a key consideration; often, the relayer's transactions are sponsored or use meta-transactions via services like Biconomy or Gelato.

Operational best practices include monitoring for failed transactions and implementing automatic retry logic with increasing gas premiums. Use health checks for RPC endpoints and switch providers on failure. For production systems, run multiple relayers in a high-availability configuration to prevent single points of failure. Security is paramount: the relayer's private keys must be stored securely using a hardware security module (HSM) or a managed service like AWS KMS or GCP Secret Manager. Finally, open-source implementations like the SocketDL relayer or Chainlink's CCIP provide valuable reference architectures for building your own robust service.
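
For the retry logic mentioned above, one common pattern is to re-broadcast with a bumped gas price after each failure. The sketch below is illustrative; the 25% bump factor and attempt count are arbitrary choices, not protocol requirements.

javascript
async function submitWithRetry(wallet, txRequest, maxAttempts = 3) {
  let gasPrice = await wallet.provider.getGasPrice();
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      const tx = await wallet.sendTransaction({ ...txRequest, gasPrice });
      return await tx.wait(1); // throws if the transaction reverts
    } catch (err) {
      if (attempt === maxAttempts) throw err;
      gasPrice = gasPrice.mul(125).div(100); // bump 25% and try again
      console.warn(`Attempt ${attempt} failed; retrying at ${gasPrice.toString()} wei`);
    }
  }
}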

CROSS-CHAIN DATA FEEDS

Common Issues and Troubleshooting

Addressing frequent challenges developers face when integrating cross-chain data for tokenized assets like RWAs, including latency, data integrity, and oracle selection.

Stale data in cross-chain feeds typically stems from high latency in the underlying oracle network's update cycle or blockchain finality delays.

Primary causes:

  • Oracle Update Frequency: Feeds from oracles like Chainlink have a heartbeat (e.g., 1 hour) and a deviation threshold; the price only updates when either condition is met.
  • Source Chain Finality: Data from networks with probabilistic finality (e.g., PoW chains) requires multiple confirmations, adding latency.
  • Relayer Delays: The cross-chain messaging protocol (e.g., Axelar, Wormhole, LayerZero) may have its own block time and proof generation delays.

How to fix:

  • Check the specific oracle's configuration on the destination chain. For Chainlink, verify the updatedAt timestamp in the aggregator contract (see the monitoring sketch after this list).
  • Implement a staleness check in your smart contract: require(block.timestamp - updatedAt < MAX_DELAY, "Stale price");
  • Consider using oracles with faster update mechanisms (e.g., Pyth's pull-based model) for high-frequency assets.
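
An off-chain monitor for these checks might look like the following ethers.js sketch; FEED_ADDRESS and the one-hour threshold are illustrative.

javascript
const { ethers } = require('ethers');

const aggregatorAbi = [
  'function latestRoundData() view returns (uint80, int256, uint256, uint256, uint80)',
];
const feed = new ethers.Contract(FEED_ADDRESS, aggregatorAbi, provider);

async function checkFreshness(maxDelaySeconds = 3600) {
  const [, answer, , updatedAt] = await feed.latestRoundData();
  const age = Math.floor(Date.now() / 1000) - updatedAt.toNumber();
  if (age > maxDelaySeconds) {
    console.error(`Feed stale: last update ${age}s ago (answer=${answer})`);
  }
  return age;
}
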
CROSS-CHAIN DATA FEEDS

Frequently Asked Questions

Common technical questions and solutions for developers implementing cross-chain data feeds for tokenized assets like RWAs, stablecoins, and NFTs.

What is a cross-chain data feed, and why does it matter for tokenized assets?

A cross-chain data feed is a decentralized oracle service that securely transmits off-chain or on-chain data (like price, interest rates, or collateral status) to smart contracts on multiple blockchains. For tokenized assets such as Real World Assets (RWAs), stablecoins, or cross-chain NFTs, these feeds are the trust layer that ensures the asset's representation on each chain reflects its real-world or canonical state.

Without reliable feeds, a tokenized stock's price on Arbitrum could diverge from its price on Base, or a cross-chain stablecoin could become undercollateralized on one chain without the others knowing. This creates arbitrage opportunities, breaks peg mechanisms, and introduces systemic risk. Feeds must be low-latency, tamper-resistant, and provide cryptographic proof of data validity to maintain asset integrity across fragmented liquidity.

IMPLEMENTATION GUIDE

Conclusion and Next Steps

You have now configured a cross-chain data feed for tokenized assets. This guide covered the core components: selecting oracles, deploying smart contracts, and establishing secure data transmission.

Your cross-chain data feed is a critical piece of infrastructure. It enables applications like cross-chain lending where collateral value is verified on another chain, or multi-chain DEX aggregators that need real-time price data. The reliability of your application now depends on the robustness of this feed. Regularly monitor the data freshness and deviation thresholds you configured in your AggregatorV3Interface consumer contract to ensure it meets your application's needs.

For production deployment, consider these next steps to enhance security and reliability:

  • Implement a multi-oracle strategy: Don't rely on a single data source. Use an on-chain aggregator like Chainlink Data Feeds that already synthesizes data from numerous premium providers, or build your own aggregation contract that queries multiple independent oracles (e.g., Pyth, API3, Witnet) and calculates a median value, as sketched after this list.
  • Add circuit breakers and emergency shutdowns: Code pause functions and manual overrides into your consumer contracts. This allows you to halt operations if the data feed reports an implausible value, such as a 50% price swing in one block, which could be a sign of an oracle attack or market manipulation.
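
As a reference for the aggregation logic, the median computation itself is simple; in the sketch below the three reader functions are hypothetical placeholders for your Pyth, API3, and Witnet integrations.

javascript
// Placeholder readers for your Pyth, API3, and Witnet integrations
async function fetchAllPrices() {
  return Promise.all([readPythPrice(), readApi3Price(), readWitnetPrice()]);
}

function median(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  // Even count: average the two middle values; odd count: take the middle one
  return sorted.length % 2 === 0 ? (sorted[mid - 1] + sorted[mid]) / 2 : sorted[mid];
}

fetchAllPrices().then((prices) => {
  console.log(`Median price: ${median(prices)}`);
});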

To test your implementation thoroughly, use a cross-chain testing framework. For EVM chains, tools like Foundry's forge allow you to simulate forked environments and message passing. Deploy your contracts on testnets (e.g., Sepolia, Arbitrum Sepolia, Polygon Amoy) and use testnet oracle services to validate the entire data flow without spending mainnet gas. Stress-test the system by simulating oracle downtime or reporting malicious data to ensure your fallback logic works.

Finally, stay informed on the evolving landscape. Interoperability protocols like Chainlink CCIP, LayerZero, and Axelar are developing standardized cross-chain messaging frameworks that include built-in oracle services. Adopting these can reduce your system's complexity. Monitor security best practices from audits of similar systems and consider engaging a professional auditing firm before locking significant value in your mainnet deployment. Your cross-chain data feed is now live—maintain it with the same rigor as your core application logic.