Setting Up a Cross-Chain Analytics Dashboard for Liquidity Flows

A step-by-step guide to building a dashboard that indexes and visualizes key metrics for tokenized asset bridges, including TVL, volume, and validator data.
INTRODUCTION

This guide explains how to build a dashboard to track and analyze liquidity movements across multiple blockchains.

A cross-chain analytics dashboard aggregates data from disparate blockchain networks to provide a unified view of asset and liquidity flows. This is critical for DeFi protocols, liquidity providers, and researchers who need to understand capital migration, arbitrage opportunities, and systemic risk across ecosystems like Ethereum, Arbitrum, Polygon, and Solana. Unlike single-chain explorers, these dashboards must handle varying data structures, indexing methods, and consensus finality times.

The core technical challenge is sourcing reliable data. No single RPC endpoint covers every chain, so you must integrate with specialized data providers. For on-chain event logs and state, use services like The Graph for subgraphs or Covalent's Unified API. For aggregated liquidity metrics and pool-level data, DEX aggregators like 0x or 1inch offer APIs, while DefiLlama provides TVL and protocol-specific statistics. Each source requires API key management and rate limiting.
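As a quick sanity check on provider integration, here is a minimal sketch that pulls per-chain TVL from DefiLlama. The /v2/chains path reflects the public API at the time of writing and needs no key, but verify it against DefiLlama's current docs:

```python
import requests

def fetch_chain_tvls(chains=("Ethereum", "Arbitrum", "Polygon")) -> dict:
    """Return {chain_name: tvl_usd} for the requested chains."""
    resp = requests.get("https://api.llama.fi/v2/chains", timeout=10)
    resp.raise_for_status()
    # The endpoint returns a list of objects with "name" and "tvl" fields
    return {c["name"]: c["tvl"] for c in resp.json() if c["name"] in chains}

print(fetch_chain_tvls())
```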

Once data is ingested, it must be normalized. A transfer on Ethereum uses the ERC-20 Transfer event, while the same action on Solana is a TransferChecked instruction. Your dashboard's data layer must map these to a common schema with fields like chain_id, token_address, from, to, amount, and timestamp. This often involves an ETL (Extract, Transform, Load) pipeline, which can be built with tools like Airflow or directly in your application code.
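A sketch of that normalization layer, assuming already-decoded inputs; the field names on the incoming dicts are illustrative, so adapt them to whatever your indexer actually returns:

```python
from dataclasses import dataclass

@dataclass
class NormalizedTransfer:
    chain_id: str
    token_address: str
    from_addr: str
    to_addr: str
    amount: int      # raw token units; scale by decimals downstream
    timestamp: int   # unix seconds

def normalize_erc20_transfer(log: dict) -> NormalizedTransfer:
    # `log` is assumed to be a decoded ERC-20 Transfer event
    return NormalizedTransfer(
        chain_id="ethereum",
        token_address=log["address"],
        from_addr=log["args"]["from"],
        to_addr=log["args"]["to"],
        amount=log["args"]["value"],
        timestamp=log["blockTimestamp"],
    )

def normalize_spl_transfer_checked(ix: dict) -> NormalizedTransfer:
    # `ix` is assumed to be a parsed SPL-token TransferChecked instruction
    return NormalizedTransfer(
        chain_id="solana",
        token_address=ix["info"]["mint"],
        from_addr=ix["info"]["source"],
        to_addr=ix["info"]["destination"],
        amount=int(ix["info"]["tokenAmount"]["amount"]),
        timestamp=ix["blockTime"],
    )
```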

For the frontend, frameworks like Next.js or Vue.js paired with charting libraries such as D3.js or Chart.js are common. The key is to visualize flows intuitively: directed graphs for asset movement between chains, time-series charts for volume trends, and heatmaps for liquidity concentration. Interactive filters for time ranges, specific tokens (e.g., USDC, WETH), and chains are essential for user-driven analysis.

A practical first step is to track a major stablecoin. For example, you could build a view showing USDC bridges from Ethereum to Layer 2s. Query the Deposit event from the official Arbitrum Bridge contract (0x8315177aB297bA92A06054cE80a67Ed4DBd7ed3a) and the Mint event on the Arbitrum USDC contract. Correlate this with DEX liquidity pool deposits on Arbitrum's Uniswap v3 USDC/ETH pool to see how bridged capital is immediately deployed.
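A hedged starting point with web3.py: the contract address comes from the paragraph above, but the event signature below is a placeholder. Pull the real event name and arguments from the verified ABI on Etherscan before trusting the results.

```python
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://eth-mainnet.g.alchemy.com/v2/<YOUR_KEY>"))

ARBITRUM_BRIDGE = Web3.to_checksum_address(
    "0x8315177aB297bA92A06054cE80a67Ed4DBd7ed3a"
)

# Placeholder signature -- replace with the actual deposit event from the
# bridge contract's verified ABI on Etherscan.
deposit_topic = w3.keccak(text="Deposit(address,uint256)").hex()

logs = w3.eth.get_logs({
    "address": ARBITRUM_BRIDGE,
    "topics": [deposit_topic],
    "fromBlock": w3.eth.block_number - 7200,  # roughly 24h of Ethereum blocks
    "toBlock": "latest",
})
print(f"{len(logs)} matching bridge events in roughly the last day")
```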

This dashboard becomes a powerful tool for identifying trends, such as liquidity fleeing a chain during periods of high fees or congestion, or capital rotating into new yield opportunities. By following this guide, you'll create a foundational system that can be extended to monitor bridges, cross-chain messaging protocols like LayerZero, and overall network health, providing actionable intelligence for DeFi strategy.

SETUP GUIDE

Prerequisites and Tech Stack

Building a cross-chain analytics dashboard requires a specific set of tools and foundational knowledge. This guide outlines the essential software, APIs, and concepts you need before writing your first query.

A modern cross-chain analytics stack is built on three core layers: data ingestion, processing, and visualization. For ingestion, you'll need access to blockchain data providers like The Graph for indexed subgraphs, Dune Analytics for on-chain SQL, or direct RPC nodes via services like Alchemy or Infura. The processing layer often involves a backend service, typically written in Node.js or Python, to aggregate and transform this raw data. Finally, the visualization layer is your frontend dashboard, commonly built with React or Next.js and charting libraries like Recharts or D3.js.

Your development environment must be properly configured. Ensure you have Node.js v18+ or Python 3.10+ installed. You will need a package manager like npm or yarn, and familiarity with Git for version control is essential. For managing dependencies and secrets, use a .env file with a library like dotenv. Crucially, you must obtain API keys from your chosen data providers; for example, a free-tier API key from The Graph's Subgraph Studio or an Etherscan API key for contract verification and direct calls.

Understanding the data you're tracking is paramount. You should be comfortable with core DeFi concepts: liquidity pools (Uniswap v3, Curve), bridge mechanics (Wormhole, LayerZero), and common token standards (ERC-20, ERC-721). Your analytics will query events like Swap, AddLiquidity, and Transfer. Knowing how to decode these events from raw transaction logs using ABIs is a key skill. Start by exploring existing dashboards on Dune to see how complex queries are structured.
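For example, decoding ERC-20 Transfer events with web3.py looks roughly like this (the from_block/to_block keyword names follow web3.py v7; older releases spell them fromBlock/toBlock):

```python
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://mainnet.infura.io/v3/<YOUR_KEY>"))

# Minimal ABI: just the ERC-20 Transfer event
TRANSFER_ABI = [{
    "anonymous": False,
    "inputs": [
        {"indexed": True, "name": "from", "type": "address"},
        {"indexed": True, "name": "to", "type": "address"},
        {"indexed": False, "name": "value", "type": "uint256"},
    ],
    "name": "Transfer",
    "type": "event",
}]

usdc = w3.eth.contract(
    address=Web3.to_checksum_address("0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48"),
    abi=TRANSFER_ABI,
)

latest = w3.eth.block_number
for event in usdc.events.Transfer.get_logs(from_block=latest - 100, to_block=latest)[:5]:
    args = event["args"]
    print(args["from"], "->", args["to"], args["value"] / 1e6)  # USDC has 6 decimals
```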

For the backend, choose a framework that supports asynchronous operations for handling multiple API calls. In Node.js, Express.js or Fastify are excellent choices. In Python, FastAPI or Flask work well. You will write scripts to fetch data on a schedule, so familiarity with cron jobs or task queues like Bull (Node.js) or Celery (Python) is beneficial. Always implement robust error handling and rate-limiting to respect provider API constraints.
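A bare-bones fetcher illustrating the pattern; the one-second spacing is an arbitrary client-side limit, so match it to each provider's published quota:

```python
import time
import requests

PROVIDERS = {
    "defillama_chains": "https://api.llama.fi/v2/chains",
    # add your other provider endpoints here
}
MIN_INTERVAL = 1.0  # seconds between outbound calls

def fetch_all(session: requests.Session) -> dict:
    results, last_call = {}, 0.0
    for name, url in PROVIDERS.items():
        wait = MIN_INTERVAL - (time.monotonic() - last_call)
        if wait > 0:
            time.sleep(wait)  # crude client-side rate limiting
        last_call = time.monotonic()
        try:
            resp = session.get(url, timeout=10)
            resp.raise_for_status()
            results[name] = resp.json()
        except requests.RequestException as exc:
            print(f"{name} failed: {exc}")  # one bad provider shouldn't abort the run
    return results

# Invoke from cron (e.g. */15 * * * *) or a task-queue worker
if __name__ == "__main__":
    with requests.Session() as session:
        snapshot = fetch_all(session)
```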

The frontend dashboard should be designed for clarity. Use a component library like Material-UI or Chakra UI for rapid development. Your key task is to fetch processed data from your backend API and present it effectively. Implement interactive charts to show liquidity flows over time, tables for top token movements, and filter controls for chains (Ethereum, Arbitrum, Polygon) and timeframes. Consider using WebSocket connections for real-time data updates on high-frequency dashboards.

Finally, plan your deployment. For a full-stack application, you can deploy the backend to Railway, Render, or an AWS EC2 instance. The frontend can be hosted on Vercel or Netlify. Ensure all sensitive API keys are stored as environment variables in your deployment platform, never hardcoded. Start with a simple MVP tracking a single bridge or DEX across two chains before scaling to a multi-protocol view.

SYSTEM ARCHITECTURE OVERVIEW

A technical blueprint for building a dashboard that aggregates and visualizes liquidity movement across multiple blockchains.

A cross-chain analytics dashboard for liquidity flows is a data pipeline that ingests, processes, and visualizes transaction data from multiple blockchain networks. The core architectural challenge is data heterogeneity—each chain (Ethereum, Solana, Arbitrum, etc.) has its own data structures, RPC endpoints, and indexing methods. The primary goal is to normalize this disparate data into a unified model that tracks asset movements between chains, typically via bridges like Wormhole, LayerZero, and Axelar. This requires a modular system with distinct components for data ingestion, transformation, storage, and presentation.

The architecture typically follows an ETL (Extract, Transform, Load) pattern. The Extract layer uses specialized indexers or RPC nodes to pull raw blockchain data. For EVM chains, you might use The Graph subgraphs or direct calls to an archive node. For Solana, you would use the Geyser plugin interface or Helius RPCs. This raw data—containing bridge contract interactions, token transfer events, and cross-chain messages—is then streamed into a message queue like Apache Kafka or Amazon Kinesis to handle variable load and ensure data durability.
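A sketch of the extract-to-queue handoff using the kafka-python client; the topic name and event shape are arbitrary choices for illustration:

```python
import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish_raw_event(event: dict) -> None:
    # Keying by chain keeps per-chain ordering within a partition
    producer.send("raw-bridge-events", key=event["chain_id"].encode(), value=event)

publish_raw_event({
    "chain_id": "ethereum",
    "tx_hash": "0xabc...",
    "log": {},  # the raw decoded log payload goes here
})
producer.flush()  # block until buffered messages are delivered
```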

In the Transform phase, the system applies business logic to raw events. This involves decoding calldata, mapping token addresses to canonical symbols using registries like CoinGecko or a local asset list, and correlating source and destination transactions to form a complete "cross-chain transfer" record. A critical step is calculating the finality time—the delay between a transaction being submitted on the source chain and being confirmed on the destination chain—which varies by bridge security model (e.g., optimistic vs. light client verification).
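A simplified correlation pass, matching on token, amount, and recipient inside a time window; production systems should match on the bridge's own message or transfer ID whenever the protocol exposes one:

```python
def correlate_transfers(source_events, dest_events, window_seconds=3600):
    """Pair source-chain deposits with destination-chain mints."""
    records = []
    for src in source_events:
        for dst in dest_events:
            if (src["token"] == dst["token"]
                    and src["amount"] == dst["amount"]
                    and src["recipient"] == dst["recipient"]
                    and 0 <= dst["timestamp"] - src["timestamp"] <= window_seconds):
                records.append({
                    **src,
                    "dest_tx": dst["tx_hash"],
                    # Finality time: source submission to destination confirmation
                    "finality_seconds": dst["timestamp"] - src["timestamp"],
                })
                break
    return records
```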

The transformed data is then Loaded into a time-series database (e.g., TimescaleDB) for efficient querying of historical trends and a real-time database (e.g., Redis) for serving current state. The analytics engine, often built with a framework like Apache Superset, Metabase, or a custom React frontend with D3.js, queries this data layer. Key visualizations include: liquidity inflow/outflow heatmaps per chain, bridge volume and fee comparisons, top asset migration patterns, and real-time monitoring of bridge health and latency.
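A minimal schema sketch for the historical store, assuming TimescaleDB's create_hypertable extension function is installed; the table and column names are illustrative:

```python
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS cross_chain_transfers (
    time         TIMESTAMPTZ NOT NULL,
    asset        TEXT NOT NULL,
    source_chain TEXT NOT NULL,
    dest_chain   TEXT NOT NULL,
    bridge       TEXT NOT NULL,
    amount_usd   DOUBLE PRECISION,
    finality_s   INTEGER
);
-- Partition by time so range queries over recent windows stay fast
SELECT create_hypertable('cross_chain_transfers', 'time', if_not_exists => TRUE);
"""

with psycopg2.connect("postgresql://user:pass@localhost/analytics") as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
```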

To make this actionable, start by defining your data scope. Will you track all major bridges or focus on a specific set like Stargate (LayerZero) and Portal (Wormhole)? Next, choose your indexing strategy. Using a managed service like Covalent, Flipside Crypto, or Goldsky can accelerate development, while running your own indexer offers more control. Finally, your dashboard should expose APIs for programmatic access, allowing users to fetch metrics like total_value_locked per chain or avg_bridge_time for a specific asset pair, enabling integration into other DeFi tools.

DATA INGESTION

Step 1: Indexing Bridge Events with Subgraphs

The foundation of any cross-chain analytics dashboard is reliable, queryable data. This step covers how to use The Graph's subgraphs to index and structure raw blockchain events from major bridges.

Cross-chain bridges operate by emitting specific on-chain events when assets are locked, minted, or burned. For example, the Wormhole bridge emits a LogMessagePublished event, while Across emits a FundsDeposited event. A subgraph is a dedicated indexing service that listens for these events, processes the data, and stores it in a structured GraphQL API. Instead of making complex, rate-limited RPC calls to multiple chains, you query a single endpoint. Popular bridges like Hop, Synapse, and Stargate already have public subgraphs hosted on The Graph's decentralized network, which you can explore in their Subgraph Studio.

To use a subgraph, you need its deployment ID or subgraph name. For instance, Hop's Ethereum bridge subgraph was published as hop-protocol/hop-ethereum on The Graph's now-sunset hosted service; current deployments on the decentralized network are addressed by subgraph ID instead. You query it using GraphQL to fetch structured data about transfers. A basic query to get recent deposits might look like this:

```graphql
query RecentDeposits {
  depositEvents(first: 5, orderBy: timestamp, orderDirection: desc) {
    id
    amount
    token
    from
    destinationChainId
    timestamp
  }
}
```

This returns a clean JSON object with the essential fields for analysis, abstracting away the complexity of raw log data and event decoding.
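To pull this into your backend, POST the query to the subgraph endpoint. The gateway URL shape below follows The Graph's documented format, with placeholder API key and subgraph ID:

```python
import requests

SUBGRAPH_URL = "https://gateway.thegraph.com/api/<API_KEY>/subgraphs/id/<SUBGRAPH_ID>"

QUERY = """
query RecentDeposits {
  depositEvents(first: 5, orderBy: timestamp, orderDirection: desc) {
    id
    amount
    token
    from
    destinationChainId
    timestamp
  }
}
"""

resp = requests.post(SUBGRAPH_URL, json={"query": QUERY}, timeout=10)
resp.raise_for_status()
deposits = resp.json()["data"]["depositEvents"]
```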

For bridges without a public subgraph, you must define and deploy your own. This involves creating a subgraph manifest (subgraph.yaml) that specifies the smart contract address, the blockchain network (e.g., mainnet), the events to index, and the mapping logic written in AssemblyScript. The mapping file (mapping.ts) transforms event data into the entities defined in your schema. While more involved, this gives you complete control over the data structure. For analytics, key entities to define are Deposit, Withdrawal, Token, and Bridge, with relationships between them.

When indexing bridge data, pay close attention to data normalization. Different bridges use various identifiers for chains and tokens. A crucial step is creating a consistent internal mapping. For example, map a destinationChainId of 1 to Ethereum and a token address to its canonical symbol USDC. You should also index the transaction hash, block number, and timestamp for every event to enable time-series analysis and data verification. This normalized dataset becomes the single source of truth for your dashboard's visualizations and alerts.
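A sketch of such an internal mapping; the Arbitrum entry shown is the native-USDC deployment, and both registries should grow as you add chains and tokens:

```python
# Minimal internal registries for normalization
CHAIN_IDS = {1: "ethereum", 10: "optimism", 137: "polygon", 42161: "arbitrum"}

TOKEN_SYMBOLS = {
    # (chain, lowercase address) -> canonical symbol
    ("ethereum", "0xa0b86991c6218b36c1d19d4a2e9eb0ce3606eb48"): "USDC",
    ("arbitrum", "0xaf88d065e77c8cc2239327c5edb3a432268e5831"): "USDC",
}

def canonicalize(chain_id: int, token_address: str) -> tuple[str, str]:
    chain = CHAIN_IDS.get(chain_id, f"chain-{chain_id}")
    symbol = TOKEN_SYMBOLS.get((chain, token_address.lower()), "UNKNOWN")
    return chain, symbol
```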

Finally, consider query performance and cost. For production dashboards, poll the endpoint on a fixed schedule; subgraph APIs are generally polled rather than subscribed to, so build refresh intervals into your backend. If using The Graph's decentralized network, queries consume GRT tokens from a billing balance. For high-volume applications, you may need to run your own graph-node indexer or use a managed indexing provider for guaranteed uptime. The output of this step is a robust, queryable data layer containing timestamped, normalized records of all cross-chain transfers, ready for aggregation and analysis in Step 2.

ANALYTICS ENGINE

Step 2: Calculating Derived Metrics in the Backend

Transform raw blockchain data into actionable insights by calculating key liquidity metrics. This step is the core of your dashboard's analytical power.

Raw transaction data from the previous step is voluminous but not immediately insightful. Your backend must process this data to calculate derived metrics that reveal the true state of cross-chain liquidity. These metrics answer critical questions: Which assets are moving? Which bridges are most active? What are the net flows between chains? Start by aggregating the raw Transfer events. Group them by asset, sourceChain, destinationChain, and bridgeProtocol. This creates the foundational dataset for all subsequent calculations.

With data aggregated, calculate the primary flow metrics. Total Volume is the sum of all transfer amounts for a given asset and bridge pair over your chosen time window (e.g., 24h). Transaction Count is simply the number of transfers. More importantly, calculate Net Flow by summing incoming amounts and subtracting outgoing amounts for each asset-chain pair. A positive net flow indicates capital inflow to that chain, while negative shows outflow. These three metrics form the basis for most dashboard visualizations like volume charts and flow heatmaps.

For deeper analysis, implement velocity and concentration metrics. Calculate the Mean Transfer Size (total volume / transaction count) to understand user behavior—large mean sizes may indicate institutional activity. Determine Bridge Dominance by calculating a bridge's share of the total volume for a specific asset route. Furthermore, track Unique Sender/Receiver Addresses to gauge user adoption versus whale activity. These calculations often require window functions in your database (e.g., PostgreSQL) or are performed in-memory with a tool like Pandas before being stored.

Here is a simplified Python example using Pandas to calculate 24-hour metrics from a list of transfer events:

```python
import pandas as pd

# transfer_events: the normalized records produced in Step 1
transfers_df = pd.DataFrame(transfer_events)
transfers_df['timestamp'] = pd.to_datetime(transfers_df['timestamp'])
transfers_df['amount'] = transfers_df['amount'].astype(float)

# Filter for the last 24 hours
cutoff = pd.Timestamp.now() - pd.Timedelta(hours=24)
recent_transfers = transfers_df[transfers_df['timestamp'] > cutoff]

# Calculate key metrics per asset, route, and bridge
metrics = recent_transfers.groupby(['asset', 'sourceChain', 'destinationChain', 'bridge']).agg(
    total_volume=('amount', 'sum'),
    tx_count=('amount', 'count'),
    mean_tx_size=('amount', 'mean'),
    unique_senders=('from', 'nunique')
).reset_index()

# Net flow per asset-chain pair. Rename the chain index levels to a common
# name so the two Series align before subtracting; fill_value=0 handles
# chains that only appear on one side.
inflow = recent_transfers.groupby(['asset', 'destinationChain'])['amount'].sum()
inflow.index.names = ['asset', 'chain']
outflow = recent_transfers.groupby(['asset', 'sourceChain'])['amount'].sum()
outflow.index.names = ['asset', 'chain']
net_flow = inflow.sub(outflow, fill_value=0).reset_index(name='net_flow')
```
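Building on the metrics frame above, Bridge Dominance (a bridge's share of its asset route's total volume) is one transform away:

```python
# Each bridge's share of total volume on its asset route, in percent
route_volume = metrics.groupby(
    ['asset', 'sourceChain', 'destinationChain']
)['total_volume'].transform('sum')
metrics['bridge_dominance_pct'] = 100 * metrics['total_volume'] / route_volume
```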

Finally, schedule these calculations to run at regular intervals (e.g., every 15 minutes) using a cron job or a task queue like Celery. Store the results in a dedicated aggregated_metrics table in your database, timestamped with the calculation period. This denormalized storage is crucial for performance, allowing your API to serve complex dashboard queries without real-time aggregation. The next step builds the visualization dashboard that consumes these metrics through your API.
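If you opt for Celery over cron, a beat schedule keeps the cadence in code; the broker URL and the tasks. module path are assumptions to adapt to your project layout:

```python
from celery import Celery

app = Celery("analytics", broker="redis://localhost:6379/0")

@app.task
def compute_24h_metrics():
    # Wraps the Pandas aggregation above and writes the result
    # to the aggregated_metrics table
    ...

app.conf.beat_schedule = {
    "metrics-every-15-min": {
        "task": "tasks.compute_24h_metrics",  # assumes the task lives in tasks.py
        "schedule": 15 * 60,  # seconds
    },
}
```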

FRONTEND INTEGRATION

Step 3: Building the Visualization Dashboard

This step focuses on creating a React-based frontend to visualize the processed cross-chain liquidity data from your backend API.

With your API serving aggregated liquidity data, you can now build a dashboard to visualize cross-chain flows. A common approach uses React with Recharts or D3.js for charting and Tailwind CSS for styling. The core task is to fetch data from your /api/liquidity-flows endpoint and render it in an intuitive format. Key visualizations include a time-series line chart showing total value locked (TVL) across chains, a bar chart comparing daily bridge volumes, and a network graph illustrating the dominant asset flow paths between chains like Ethereum, Arbitrum, and Polygon.

Start by setting up a new React project (e.g., using create-react-app or Vite). Install the necessary charting library, such as recharts. Create a service module to handle API calls to your backend. Implement components for each chart type, ensuring they are responsive and update dynamically. For the network graph, libraries like vis-network or react-force-graph can map source and destination chains as nodes, with edge thickness representing transfer volume. This provides an immediate visual understanding of liquidity corridors.

Here is a simplified example of a component fetching and displaying TVL data:

```javascript
import { LineChart, Line, XAxis, YAxis, CartesianGrid } from 'recharts';
import { useEffect, useState } from 'react';

function TVLChart() {
  const [data, setData] = useState([]);

  // Fetch the aggregated TVL series from the backend once on mount
  useEffect(() => {
    fetch('http://localhost:3001/api/liquidity-flows/tvl')
      .then(res => res.json())
      .then(setData)
      .catch(console.error);
  }, []);

  return (
    <LineChart width={600} height={300} data={data}>
      <CartesianGrid strokeDasharray="3 3" />
      <XAxis dataKey="date" />
      <YAxis label={{ value: 'TVL (USD)', angle: -90 }} />
      {/* One line per chain, colored to match each chain's brand palette */}
      <Line type="monotone" dataKey="ethereum" stroke="#627eea" />
      <Line type="monotone" dataKey="arbitrum" stroke="#28a0f0" />
    </LineChart>
  );
}
```

This component plots the TVL for Ethereum and Arbitrum over time, using data from your aggregated API endpoint.

To enhance the dashboard, add interactive filters allowing users to select specific time ranges (last 7 days, 30 days), chains, or bridge protocols (like Hop, Across, or Stargate). Implement a data table below the charts showing raw transaction details, with sortable columns for amount, source chain, destination chain, and timestamp. For production, consider using a state management library like Zustand or Redux Toolkit to manage the global filter state efficiently across all chart components, preventing redundant API calls.

Finally, deploy your frontend for public access. You can use Vercel, Netlify, or AWS Amplify for static hosting. Ensure your backend API is also deployed (e.g., on a cloud service like Railway or AWS EC2) and its CORS settings are configured to allow requests from your frontend domain. The completed dashboard transforms raw blockchain data into actionable insights, enabling users to track capital movement, identify dominant bridges, and monitor the health of cross-chain liquidity in real-time.
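If your backend is the Python/FastAPI variant from the prerequisites, the CORS configuration is a few lines; the frontend domain below is a hypothetical placeholder:

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Allow only the deployed frontend origin -- avoid "*" in production
app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://your-dashboard.vercel.app"],  # hypothetical domain
    allow_methods=["GET"],
    allow_headers=["*"],
)

@app.get("/api/liquidity-flows/tvl")
def tvl():
    # Serve precomputed rows from the aggregated_metrics table here
    return [{"date": "2026-01-01", "ethereum": 1.2e9, "arbitrum": 3.4e8}]
```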

CORE DASHBOARD METRICS

Key Metrics Definitions and Formulas

Essential calculations for tracking cross-chain liquidity health and efficiency.

| Metric | Definition | Formula | Data Source |
| --- | --- | --- | --- |
| Total Value Locked (TVL) | Sum of assets deposited across all monitored liquidity pools. | Σ (Pool Asset Price * Pool Balance) | DEX Subgraphs, DeFiLlama API |
| Daily Bridge Volume | Total value of assets transferred via bridges in the last 24 hours. | Σ (Bridge Tx Value in USD) | Bridge APIs (e.g., Socket, Li.Fi), Chainscan |
| Net Flow (24h) | Net change in asset balance on a chain (inflows - outflows). | Total Inflow - Total Outflow | On-chain event logs, Dune Analytics |
| Dominant Bridge Share | Market share of the leading bridge by volume for a specific asset pair. | (Leading Bridge Volume / Total Bridge Volume) * 100 | Bridge Aggregator APIs |
| Average Transfer Time | Mean time for a cross-chain transfer to finalize. | Σ (Finalization Timestamp - Initiation Timestamp) / Number of Txs | Bridge Transaction Receipts |
| Bridge Fee Efficiency | Average fee as a percentage of the transferred value. | (Σ Bridge Fees in USD / Σ Transferred Value in USD) * 100 | Bridge Transaction Data |
| Liquidity Depth (Top 5 Pools) | Available liquidity for immediate swaps in a chain's largest pools. | Σ (Pool Reserve 0 * Price 0 + Pool Reserve 1 * Price 1) | DEX Smart Contracts (e.g., Uniswap, Curve) |
| Cross-Chain Arbitrage Opportunity | Price discrepancy for the same asset between two DEXs on different chains. | ((Price Chain B - Price Chain A) / Price Chain A) * 100 | DEX Price Oracles, On-chain Price Feeds |
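The formulas above translate directly into code; here are a few of them as plain Python helpers:

```python
def net_flow(inflows_usd: float, outflows_usd: float) -> float:
    # Positive -> capital entering the chain; negative -> leaving
    return inflows_usd - outflows_usd

def bridge_fee_efficiency(fees_usd: list[float], values_usd: list[float]) -> float:
    # (Σ fees / Σ transferred value) * 100
    return 100 * sum(fees_usd) / sum(values_usd)

def arbitrage_opportunity_pct(price_a: float, price_b: float) -> float:
    # ((Price Chain B - Price Chain A) / Price Chain A) * 100
    return 100 * (price_b - price_a) / price_a
```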

CROSS-CHAIN ANALYTICS

Frequently Asked Questions

Common technical questions and solutions for developers building dashboards to track liquidity flows across blockchains.

Which data sources are most reliable for tracking cross-chain liquidity flows?

Reliability depends on the blockchain and asset. For a robust dashboard, aggregate data from multiple sources:

  • On-chain Indexers: Use The Graph subgraphs for EVM chains (e.g., Uniswap, Curve) and Flipside Crypto for SQL-based queries. For Solana, Helius provides robust RPC and API endpoints.
  • Bridge & Messaging Protocols: Directly query protocol contracts and APIs. For Wormhole, use the Guardian RPC and the Wormhole SDK. For LayerZero, monitor the Endpoint and UltraLightNodeV2 contracts.
  • Centralized Exchange Feeds: For off-ramp tracking, use WebSocket feeds from major CEX APIs like Binance, Coinbase, and Kraken for deposit/withdrawal addresses.

Always implement data validation by cross-referencing at least two sources for critical metrics like total value locked (TVL) or bridge volume to mitigate the risk of indexer lag or incorrect event parsing.
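A minimal shape for that cross-source check, with a hypothetical 5% divergence threshold:

```python
def values_agree(primary: float, secondary: float, tolerance_pct: float = 5.0) -> bool:
    """True when two independently sourced metrics differ within tolerance."""
    if primary == 0 and secondary == 0:
        return True
    baseline = max(abs(primary), abs(secondary))
    return abs(primary - secondary) / baseline * 100 <= tolerance_pct

# e.g. DefiLlama TVL vs your own subgraph-derived TVL
if not values_agree(1.00e9, 0.93e9):
    print("TVL sources diverge by more than 5% -- check for indexer lag")
```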
