How to Architect a Real-Time Total Value Locked (TVL) Decomposition Dashboard

A technical guide to building a system that decomposes protocol TVL into constituent parts like asset type, pool, and chain in real-time. Includes architecture, data pipelines, and code examples.

INTRODUCTION

A guide to building a system that breaks down and visualizes the composition of DeFi protocol liquidity in real-time.

Total Value Locked (TVL) is the primary metric for assessing the scale and health of a decentralized finance protocol. It represents the aggregate value of all assets deposited by users into a protocol's smart contracts. However, a single TVL number is often insufficient for deep analysis. A TVL decomposition dashboard provides the critical next layer of insight by breaking down this total into its constituent parts: asset types, pools, chains, and user segments. This real-time visibility is essential for developers monitoring protocol performance, researchers analyzing market trends, and liquidity providers optimizing their capital allocation.

Architecting this system requires a robust data pipeline. The process begins with on-chain data ingestion, typically using services like The Graph for indexed historical data or direct RPC calls to nodes for real-time events. For a protocol like Aave on Ethereum, you would track events such as Deposit, Withdraw, and Borrow to calculate user positions. Each asset's USD value must be calculated in real-time using price oracles like Chainlink or decentralized exchange pools. The core challenge is accurately attributing value across nested yield strategies, such as a user's staked ETH in Lido that is subsequently supplied as stETH to a Curve pool.

The data processing layer must handle composition logic. This involves mapping token addresses to canonical names, filtering out internal accounting tokens (like LP tokens), and applying the correct pricing logic for each asset type. For a lending protocol, you decompose TVL into supplied versus borrowed assets. For a decentralized exchange like Uniswap V3, you break it down by individual pools (e.g., USDC/ETH, DAI/USDC) and fee tiers. The architecture must be chain-agnostic to support multi-chain protocols; TVL on Arbitrum and Polygon needs to be aggregated and decomposed with the same logic.

Finally, the presentation layer delivers insights through the dashboard. Effective visualizations include: time-series charts showing TVL trends per asset, pie or treemap charts for instant composition snapshots, and data tables with sortable columns for deep dives. The system should allow filtering by date range, asset, and specific smart contract. By building this dashboard, you move from knowing that a protocol has $5B TVL to understanding that 40% is in stablecoins, 30% is in staked ETH derivatives, and 15% is concentrated in a single liquidity pool—information vital for risk assessment and strategic decision-making.

PREREQUISITES AND ARCHITECTURE OVERVIEW

This guide outlines the technical prerequisites and system architecture for building a dashboard that decomposes and tracks TVL across DeFi protocols in real-time.

Total Value Locked (TVL) is the aggregate value of all crypto assets deposited in a decentralized finance protocol's smart contracts. A decomposition dashboard breaks this monolithic figure into its constituent parts: assets per chain, protocol, and specific pool. To build one, you need a foundational understanding of blockchain data indexing, real-time data pipelines, and the specific APIs of protocols like Aave, Uniswap, and Lido. The core challenge is sourcing accurate, consistent data from heterogeneous and often unaudited on-chain sources.

The system architecture typically follows a modular data pipeline. First, a data ingestion layer pulls raw on-chain data using RPC nodes (e.g., Alchemy, Infura) and subgraphs (The Graph). This data includes token balances, LP positions, and staking deposits. Second, a processing and enrichment layer normalizes this data, applying token prices from oracles (Chainlink, Pyth) and calculating USD values. This layer must handle complex logic, such as identifying wrapped assets (e.g., wETH) and reconciling them with their native counterparts for a clean aggregation.

Finally, the serving and presentation layer exposes processed data via a REST or GraphQL API to a frontend dashboard. For real-time updates, this layer often uses WebSocket connections or server-sent events. A critical design decision is the calculation methodology: do you use a spot price at block time or a time-weighted average price (TWAP)? Each choice impacts the reported TVL's volatility and accuracy. The architecture must also be chain-agnostic to accommodate Ethereum, Layer 2s like Arbitrum and Optimism, and alternative Layer 1s like Solana.

Key prerequisites include proficiency in a backend language like Node.js or Python, experience with databases (PostgreSQL for relational, TimescaleDB for time-series), and familiarity with message queues (Kafka, RabbitMQ) for managing data streams. You will also need to understand the token list standards (Token Lists, CoinGecko API) for accurate price mapping and the ERC-20 token standard for balance calculations. Setting up a robust error-handling and data validation system is non-negotiable, as on-chain data can be messy and oracle prices can occasionally fail.

A practical first step is to prototype for a single protocol on one chain. For example, track all liquidity pools in a Uniswap V3 deployment on Ethereum. Your pipeline would listen for Mint, Burn, and Swap events, calculate the USD value of each pool's reserves using a price feed, and sum them. This isolated build reveals the complexities of data consistency and latency before scaling to a multi-chain, multi-protocol system. The end goal is a dashboard that answers specific questions: 'Which asset contributes most to Aave's TVL on Arbitrum?' or 'How has Curve's stablecoin pool composition changed this week?'
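
As a concrete starting point, here is a minimal sketch for valuing a single pool: it reads the two tokens' balances held at the pool address and prices them in USD. It assumes ethers v6 and a caller-supplied getUsdPrice lookup, which is a placeholder for your oracle or price-API client.

typescript
import { ethers } from 'ethers';

const ERC20_ABI = [
  'function balanceOf(address owner) view returns (uint256)',
  'function decimals() view returns (uint8)',
];

// Values one pool's reserves in USD. `getUsdPrice` is a placeholder for your
// oracle or price-feed lookup (Chainlink, CoinGecko, etc.).
async function getPoolTvlUsd(
  poolAddress: string,
  token0: string,
  token1: string,
  getUsdPrice: (token: string) => Promise<number>,
  provider: ethers.Provider
): Promise<number> {
  let tvl = 0;
  for (const token of [token0, token1]) {
    const erc20 = new ethers.Contract(token, ERC20_ABI, provider);
    const [rawBalance, decimals] = await Promise.all([
      erc20.balanceOf(poolAddress),
      erc20.decimals(),
    ]);
    const balance = Number(ethers.formatUnits(rawBalance, Number(decimals)));
    tvl += balance * (await getUsdPrice(token));
  }
  return tvl;
}

Summing this function over every pool in the deployment yields the prototype's protocol-level TVL and exposes the latency and consistency issues described above.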

ARCHITECTURE GUIDE

Core Concepts for TVL Decomposition

Building a real-time TVL dashboard requires understanding the data sources, aggregation logic, and infrastructure needed to track billions in assets across multiple chains and protocols.

04. Real-Time Architecture

A production system needs a scalable backend. A common stack uses:

  • Message Queues (Apache Kafka, RabbitMQ) to decouple data ingestion from processing.
  • Stream Processing (Apache Flink, Spark Streaming) for aggregating events and calculating rolling TVL.
  • Time-Series Databases (InfluxDB, TimescaleDB) optimized for storing and querying metric history.
  • Caching Layer (Redis) for serving the latest dashboard figures with sub-second latency.

This pipeline must be resilient to chain reorgs and RPC node failures.
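
As a rough illustration of the last two pieces of that stack, the sketch below consumes processed snapshots from Kafka and caches the latest figure in Redis with a short TTL. It assumes the kafkajs and ioredis clients; the topic name, key scheme, and snapshot shape are illustrative.

typescript
import { Kafka } from 'kafkajs';
import Redis from 'ioredis';

// Topic, key, and message shape are illustrative placeholders.
const kafka = new Kafka({ clientId: 'tvl-dashboard', brokers: ['localhost:9092'] });
const redis = new Redis();

async function run() {
  const consumer = kafka.consumer({ groupId: 'tvl-cache-writer' });
  await consumer.connect();
  await consumer.subscribe({ topic: 'tvl.snapshots', fromBeginning: false });

  await consumer.run({
    eachMessage: async ({ message }) => {
      // Each message is a processed snapshot: { protocol, chain, tvlUsd, timestamp }
      const snapshot = JSON.parse(message.value!.toString());
      // Cache the latest figure with a short TTL so dashboard reads stay sub-second
      await redis.set(
        `tvl:latest:${snapshot.protocol}:${snapshot.chain}`,
        JSON.stringify(snapshot),
        'EX',
        30
      );
    },
  });
}

run().catch(console.error);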

05. Cross-Chain Aggregation

Modern DeFi spans 50+ blockchains. Aggregating TVL across them requires:

  • Chain-specific clients configured for each network's RPC endpoints and block times.
  • Canonical vs. Native Assets: Bridged tokens (multichain-wrapped assets) must be traced back to their canonical source to avoid double-counting; a bridged USDC on Arbitrum should map to the Ethereum-native USDC supply (see the mapping sketch after this list).
  • Unified timestamping: Synchronize data across chains using block timestamps or a centralized clock to provide consistent hourly/daily snapshots.

Tools like Chainscore's APIs handle this complexity by normalizing data across EVM and non-EVM chains.
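
A minimal canonical-asset mapping sketch, with placeholder addresses, might look like this:

typescript
// Maps `${chainId}:${lowercased token address}` to a canonical asset identifier.
// The addresses below are placeholders; maintain this table from a vetted token list.
const CANONICAL_ASSETS: Record<string, string> = {
  '42161:0x_bridged_usdc_on_arbitrum': 'ethereum:USDC',
  '137:0x_bridged_usdc_on_polygon': 'ethereum:USDC',
};

function toCanonicalAsset(chainId: number, tokenAddress: string): string {
  const key = `${chainId}:${tokenAddress.toLowerCase()}`;
  // Unknown tokens fall back to being treated as native to their own chain
  return CANONICAL_ASSETS[key] ?? `${chainId}:${tokenAddress.toLowerCase()}`;
}

Aggregation then keys TVL by the canonical identifier rather than the raw address, so bridged supply is never counted twice.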

06. Dashboard & Alerting

The frontend and monitoring layer turns data into insight.

  • Visualization: Use libraries like D3.js or frameworks like Streamlit to create charts for TVL trends, chain dominance, and protocol breakdown.
  • Anomaly Detection: Set up alerts for sudden TVL drops (>10% in an hour), which may indicate an exploit, mass withdrawal, or oracle failure (see the sketch after this list).
  • Data Export: Provide CSV/JSON exports for user analysis and integrate with BI tools like Grafana for internal monitoring.

The goal is actionable intelligence, not just a static number.
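
A minimal drop-detection sketch follows; the 10% threshold and one-hour window mirror the rule above, while the record shape is an assumption.

typescript
interface TvlPoint {
  timestamp: number; // unix seconds
  tvlUsd: number;
}

// Flags a drop larger than `threshold` (default 10%) between the most recent
// point and the point closest to one hour earlier. `points` must be sorted ascending.
function detectTvlDrop(points: TvlPoint[], threshold = 0.10): boolean {
  if (points.length < 2) return false;
  const latest = points[points.length - 1];
  const hourAgo = latest.timestamp - 3600;
  const baseline = points.reduce((best, p) =>
    Math.abs(p.timestamp - hourAgo) < Math.abs(best.timestamp - hourAgo) ? p : best
  );
  if (baseline.tvlUsd === 0) return false;
  const drop = (baseline.tvlUsd - latest.tvlUsd) / baseline.tvlUsd;
  return drop > threshold;
}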

ARCHITECTURE

Step 1: Setting Up the Data Pipeline

This step covers the core infrastructure for ingesting, processing, and storing blockchain data to power a real-time TVL decomposition dashboard.

A robust data pipeline is the foundation of any on-chain analytics dashboard. For a Total Value Locked (TVL) decomposition dashboard, the pipeline must handle high-volume, real-time data from multiple blockchains and protocols. The primary architectural components are: a data ingestion layer (indexers/RPC nodes), a transformation engine (stream processing), and a queryable storage layer (time-series database). This setup allows you to track assets across thousands of smart contracts, categorize them by protocol and chain, and calculate TVL metrics with millisecond latency.

Start by sourcing raw data. You need direct access to blockchain nodes via RPC endpoints (e.g., from providers like Alchemy, Infura, or QuickNode) or use specialized indexing services like The Graph or Goldsky. For TVL, you must listen for specific events: Deposit, Withdraw, Transfer in liquidity pools, and monitor token balances in vault contracts. A common approach is to use an event streaming platform like Apache Kafka or Amazon Kinesis to decouple data ingestion from processing, ensuring durability and fault tolerance as block data flows in.
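
The sketch below illustrates this pattern with ethers and kafkajs: a WebSocket subscription decodes deposit events and forwards them to a raw-events topic. The Deposit event fragment, topic name, and broker address are placeholders rather than a specific protocol's real interface.

typescript
import { ethers } from 'ethers';
import { Kafka } from 'kafkajs';

// Event fragment and topic name are illustrative; use the target protocol's real ABI.
const POOL_EVENTS_ABI = [
  'event Deposit(address indexed user, address indexed token, uint256 amount)',
];

async function ingestDeposits(poolAddress: string, wsRpcUrl: string) {
  const provider = new ethers.WebSocketProvider(wsRpcUrl);
  const pool = new ethers.Contract(poolAddress, POOL_EVENTS_ABI, provider);

  const kafka = new Kafka({ clientId: 'tvl-ingestor', brokers: ['localhost:9092'] });
  const producer = kafka.producer();
  await producer.connect();

  // Forward every decoded Deposit event to the raw-events topic for downstream processing
  pool.on('Deposit', async (user: string, token: string, amount: bigint, event: ethers.ContractEventPayload) => {
    await producer.send({
      topic: 'raw.pool.events',
      messages: [{
        key: poolAddress,
        value: JSON.stringify({
          type: 'Deposit',
          user,
          token,
          amount: amount.toString(),
          blockNumber: event.log.blockNumber,
        }),
      }],
    });
  });
}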

The transformation layer is where raw logs become actionable metrics. Using a stream-processing framework like Apache Flink or ksqlDB, you can write jobs that decode event data, normalize token amounts to USD using real-time price oracles (e.g., Chainlink feeds), and apply business logic for categorization. For example, a transaction adding ETH and USDC to a Uniswap V3 pool on Arbitrum must be mapped to the DeFi category, the Uniswap protocol, the Arbitrum chain, and its value summed in USD. This requires maintaining state for exchange rates and protocol definitions.
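
For the price-normalization piece of that logic, here is a hedged sketch of reading a Chainlink AggregatorV3 feed with a basic staleness check, assuming ethers v6 and a caller-supplied feed address:

typescript
import { ethers } from 'ethers';

const AGGREGATOR_V3_ABI = [
  'function latestRoundData() view returns (uint80 roundId, int256 answer, uint256 startedAt, uint256 updatedAt, uint80 answeredInRound)',
  'function decimals() view returns (uint8)',
];

// Reads the latest USD price from a Chainlink feed (e.g., the ETH/USD aggregator)
// and rejects stale answers older than `maxAgeSeconds`.
async function getOraclePrice(
  feedAddress: string,
  provider: ethers.Provider,
  maxAgeSeconds = 3600
): Promise<number> {
  const feed = new ethers.Contract(feedAddress, AGGREGATOR_V3_ABI, provider);
  const [roundData, decimals] = await Promise.all([feed.latestRoundData(), feed.decimals()]);

  const ageSeconds = Math.floor(Date.now() / 1000) - Number(roundData.updatedAt);
  if (ageSeconds > maxAgeSeconds) {
    throw new Error(`Stale price feed ${feedAddress}: last update ${ageSeconds}s ago`);
  }
  return Number(ethers.formatUnits(roundData.answer, Number(decimals)));
}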

Processed data must be stored for low-latency querying. Time-series databases like TimescaleDB or InfluxDB are optimal for this use case, as they efficiently handle high-write volumes and time-range queries. Structure your data schema around the core entities: protocol, chain, asset, and user (if applicable). Each TVL snapshot should include timestamp, protocol ID, chain ID, asset address, raw token amount, and USD value. This granular storage enables the dashboard to slice data by any dimension—such as showing Ethereum DeFi TVL versus Solana DeFi TVL over the last 24 hours.

Finally, implement data quality and monitoring. Given the financial nature of TVL, inaccuracies can mislead significantly. Build checks for: data freshness (is the pipeline lagging?), price feed staleness, and balance reconciliation (does the sum of all tracked contracts match protocol-reported TVL?). Tools like Grafana for dashboarding pipeline health and Great Expectations for validation can automate this. With this pipeline architecture operational, you have a reliable, real-time feed of decomposed TVL data ready for visualization in the next steps.

DATA AGGREGATION

Step 2: Calculating Decomposed Metrics

After establishing your data sources, the next step is to process the raw blockchain data into the decomposed TVL metrics that power your dashboard's insights.

The core of a decomposition dashboard is transforming raw balance and price data into categorized value. This requires calculating metrics for each protocol and asset. The fundamental formula is: TVL = Σ (Asset Balance * Asset Price). For decomposition, you apply this per category. For a lending protocol like Aave on Ethereum, you would sum the supply balance of all deposited assets (e.g., USDC, WETH) and the borrow balance of all borrowed assets, using real-time prices from an oracle like Chainlink. These are your first two key decomposed metrics: Total Supply Value and Total Borrow Value.

To calculate Net Liquidity Provided, you must account for utilization. For a DEX liquidity pool (e.g., a 50/50 USDC/WETH Uniswap V3 pool), you sum the value of both assets in the pool. However, for a concentrated liquidity position, you must query the position's specific liquidity and tick range from the pool contract and calculate the asset amounts based on the current price. This is more complex than a simple balance check and often requires using the pool's positions mapping and helper libraries like the Uniswap V3 SDK.
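
The floating-point sketch below illustrates the standard math for turning a position's liquidity and tick range into token amounts. It is for illustration only: production code should use the Uniswap V3 SDK's exact integer arithmetic, the current sqrt price would normally come from the pool's slot0 rather than the current tick, and the returned amounts are still in raw token units that need decimal scaling before pricing.

typescript
// Standard concentrated-liquidity formulas: sqrt(price) at a tick is 1.0001^(tick / 2).
function tickToSqrtPrice(tick: number): number {
  return Math.pow(1.0001, tick / 2);
}

// Converts a position (liquidity + tick range) into raw token0/token1 amounts.
function positionAmounts(
  liquidity: number,
  tickLower: number,
  tickUpper: number,
  tickCurrent: number
): { amount0: number; amount1: number } {
  const sa = tickToSqrtPrice(tickLower);
  const sb = tickToSqrtPrice(tickUpper);
  const s = tickToSqrtPrice(tickCurrent);

  if (s <= sa) {
    // Price below the range: the position is entirely in token0
    return { amount0: liquidity * (sb - sa) / (sa * sb), amount1: 0 };
  }
  if (s >= sb) {
    // Price above the range: the position is entirely in token1
    return { amount0: 0, amount1: liquidity * (sb - sa) };
  }
  // Price in range: the position holds both tokens
  return {
    amount0: liquidity * (sb - s) / (s * sb),
    amount1: liquidity * (s - sa),
  };
}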

For yield-bearing assets, you need to differentiate between the underlying principal and accrued yield. When a user deposits USDC into Compound's cUSDC contract, your dashboard should track the underlying USDC value, not the cUSDC token value. This is done by calling the cToken's exchangeRateStored() function and calculating: Underlying Value = cToken Balance * exchangeRate. The difference between this value and the initial deposit, accrued over time, represents the Yield Generated metric. Failing to use exchange rates will misrepresent the true TVL.
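
A minimal sketch of that conversion, assuming ethers v6 and that the underlying token's decimals are known, looks like this:

typescript
import { ethers } from 'ethers';

const CTOKEN_ABI = [
  'function balanceOf(address owner) view returns (uint256)',
  'function exchangeRateStored() view returns (uint256)',
];

// Converts a user's cToken balance (e.g., cUSDC) into the underlying token amount.
// exchangeRateStored is scaled by 1e(18 - 8 + underlyingDecimals), so
// cTokenBalance * exchangeRate / 1e18 yields the underlying amount in its smallest units.
async function getUnderlyingBalance(
  cTokenAddress: string,
  userAddress: string,
  underlyingDecimals: number,
  provider: ethers.Provider
): Promise<number> {
  const cToken = new ethers.Contract(cTokenAddress, CTOKEN_ABI, provider);
  const [cTokenBalance, exchangeRate] = await Promise.all([
    cToken.balanceOf(userAddress),
    cToken.exchangeRateStored(),
  ]);

  const underlyingRaw = (cTokenBalance * exchangeRate) / 10n ** 18n;
  return Number(ethers.formatUnits(underlyingRaw, underlyingDecimals));
}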

Here is a simplified TypeScript example using ethers.js (v6) to calculate supply and borrow values for a user on Aave V3:

typescript
import { ethers } from 'ethers';

// Minimal ABI for Aave V3's Pool.getUserAccountData view function
const POOL_ABI = [
  'function getUserAccountData(address user) view returns (uint256 totalCollateralBase, uint256 totalDebtBase, uint256 availableBorrowsBase, uint256 currentLiquidationThreshold, uint256 ltv, uint256 healthFactor)',
];

async function getUserAavePosition(
  userAddress: string,
  poolAddress: string,
  provider: ethers.Provider
) {
  const pool = new ethers.Contract(poolAddress, POOL_ABI, provider);

  // Aggregated across all of the user's reserves (collateral and debt)
  const accountData = await pool.getUserAccountData(userAddress);

  // totalCollateralBase and totalDebtBase are denominated in the market's
  // base currency (USD) and scaled by 8 decimals
  const supplyValueUSD = Number(ethers.formatUnits(accountData.totalCollateralBase, 8));
  const borrowValueUSD = Number(ethers.formatUnits(accountData.totalDebtBase, 8));

  return { supplyValueUSD, borrowValueUSD };
}

This calls the optimized getUserAccountData view function, which aggregates across all assets, saving significant RPC calls versus summing each asset individually.

Finally, you must aggregate these protocol-level metrics into chain-level and cross-chain totals. This involves summing Total Supply Value across all tracked lending protocols on a chain, and doing the same for Total Borrow Value, Net DEX Liquidity, and Yield Generated. For a multi-chain dashboard, you convert all values to a common currency (like USD) using a consistent price oracle. The result is a complete, decomposed TVL dataset showing exactly where value is locked: how much is supplied, borrowed, providing liquidity, and earning yield across the entire ecosystem.
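
A simple roll-up over an assumed record shape shows the idea:

typescript
// Decomposed record shape produced by the processing layer (illustrative).
interface DecomposedTvlRecord {
  chain: string;       // e.g., 'ethereum', 'arbitrum'
  protocol: string;    // e.g., 'aave-v3', 'uniswap-v3'
  category: 'supplied' | 'borrowed' | 'dex_liquidity' | 'yield';
  valueUsd: number;
}

// Rolls per-protocol records up into chain-level totals per category.
function aggregateByChain(records: DecomposedTvlRecord[]): Record<string, Record<string, number>> {
  const totals: Record<string, Record<string, number>> = {};
  for (const r of records) {
    totals[r.chain] ??= {};
    totals[r.chain][r.category] = (totals[r.chain][r.category] ?? 0) + r.valueUsd;
  }
  return totals;
}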

DATA LAYER

TVL Decomposition Metrics and Data Sources

Comparison of core metrics for TVL decomposition and their primary data sources.

Metric | Definition | Primary Data Source | Update Frequency | Complexity
Total TVL | Sum of all assets locked in protocol contracts. | DefiLlama, Token Terminal | Real-time to 24h | Low
TVL by Chain | TVL segmented by underlying blockchain (e.g., Ethereum, Arbitrum). | DefiLlama API, Subgraphs | Real-time to 1h | Low
TVL by Protocol | TVL segmented by individual DeFi application (e.g., Aave, Uniswap). | Protocol Subgraphs, The Graph | Real-time to 15min | Medium
TVL by Asset | Value locked per token (e.g., ETH, USDC, stETH). | On-chain RPC calls, Dune Analytics | Block-by-block | High
TVL by Pool/Vault | Value in specific liquidity pools or yield vaults. | Smart Contract Event Logs | Block-by-block | High
Staked vs. Supplied TVL | Distinguishes staked (e.g., securing a chain) from supplied (e.g., lending) assets. | Custom Indexer Logic | Real-time | High
Net Liquidity (Inflows/Outflows) | Change in TVL over time, measuring capital movement. | Historical On-chain Data | Daily | Medium

ARCHITECTURE & IMPLEMENTATION

Step 3: Building the Real-Time Dashboard Frontend

This guide details the frontend architecture for a real-time TVL decomposition dashboard, focusing on data flow, state management, and visualization libraries.

The frontend architecture is built around a reactive data flow. A central state manager, like Zustand or Jotai, holds the processed TVL data streamed from your backend API. This state is updated in real-time via a WebSocket connection or by polling a Server-Sent Events (SSE) endpoint. The UI components are then simple, stateless functions that subscribe to this central store. This separation ensures the complex data logic is isolated from the view layer, making the application easier to test and maintain as you add new data sources or visualizations.
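
A minimal Zustand sketch of this pattern follows; the slice shape and stream wiring are illustrative assumptions.

typescript
import { create } from 'zustand';

interface TvlSlice {
  protocol: string;
  chain: string;
  valueUsd: number;
}

interface TvlState {
  updatedAt: number | null;
  slices: TvlSlice[];
  applyUpdate: (slices: TvlSlice[]) => void;
}

// Central store the chart components subscribe to; only the stream handler writes to it.
export const useTvlStore = create<TvlState>((set) => ({
  updatedAt: null,
  slices: [],
  applyUpdate: (slices) => set({ slices, updatedAt: Date.now() }),
}));

// Example wiring: every WebSocket message is pushed into the store.
export function connectTvlStream(url: string) {
  const ws = new WebSocket(url);
  ws.onmessage = (event) => {
    const payload: TvlSlice[] = JSON.parse(event.data);
    useTvlStore.getState().applyUpdate(payload);
  };
  return ws;
}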

For data visualization, Recharts or Victory are excellent choices for building interactive charts in React. You'll create components for: a primary area chart showing total TVL over time, stacked bar charts to decompose TVL by protocol (e.g., Aave, Lido, Uniswap) or by asset type (e.g., ETH, stablecoins, LSTs), and donut charts for snapshot breakdowns. Use tooltips to display precise values and percentages on hover. For displaying large tables of underlying positions or transactions, integrate a virtualized table library like TanStack Table to maintain performance with high-frequency updates.

Implementing real-time updates requires handling connection states and data freshness. Your WebSocket client should automatically reconnect with exponential backoff. Display a clear visual indicator (like a pulsing dot or timestamp) showing the last update. To manage rapid data streams without overwhelming the UI or the user, consider implementing client-side throttling or debouncing for chart re-renders. For example, you might update the central store on every message but only redraw the complex area chart every 500ms.
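
Here is a hedged sketch of both ideas, reconnection with exponential backoff and a throttled redraw helper, using only browser WebSocket and timer APIs; the commented usage line is illustrative.

typescript
// Reconnecting WebSocket client with exponential backoff, plus a simple
// throttle so chart re-renders happen at most once per interval.
function connectWithBackoff(
  url: string,
  onMessage: (data: string) => void,
  maxDelayMs = 30_000
) {
  let attempt = 0;

  const open = () => {
    const ws = new WebSocket(url);
    ws.onopen = () => { attempt = 0; };
    ws.onmessage = (event) => onMessage(event.data);
    ws.onclose = () => {
      // Exponential backoff: 1s, 2s, 4s, ... capped at maxDelayMs
      const delay = Math.min(1000 * 2 ** attempt, maxDelayMs);
      attempt += 1;
      setTimeout(open, delay);
    };
  };
  open();
}

function throttle<T extends unknown[]>(fn: (...args: T) => void, intervalMs: number) {
  let last = 0;
  return (...args: T) => {
    const now = Date.now();
    if (now - last >= intervalMs) {
      last = now;
      fn(...args);
    }
  };
}

// Usage: update the store on every message, but redraw the heavy chart at most every 500ms.
// const redrawChart = throttle(() => renderAreaChart(), 500);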

A critical feature is user-driven decomposition. Beyond the default views, provide interactive filters. Users should be able to click on a segment in the donut chart (e.g., "Liquid Staking") to filter all other charts to show only TVL related to that category. Similarly, a time-range selector (1D, 7D, 30D, All) should dynamically fetch and display aggregated data for that period. These controls dispatch actions that query your backend's filtered endpoints, which should be optimized with database indexes for fast response times.

Finally, focus on performance optimization from the start. Use React.memo for expensive chart components to prevent unnecessary re-renders. Lazy-load non-critical dashboard panels using React.lazy() and Suspense. Serve static assets like logos and library code from a CDN. The goal is to achieve a Lighthouse performance score above 90, ensuring the dashboard remains usable even on slower connections, which is vital for monitoring tools that may need to be accessed in time-sensitive situations.

TVL DASHBOARD ARCHITECTURE

Frequently Asked Questions

Common technical questions and solutions for developers building real-time Total Value Locked (TVL) decomposition dashboards.

Which data sources should a real-time TVL dashboard rely on?

Building a reliable TVL dashboard requires aggregating data from multiple sources to ensure accuracy and resilience. Primary sources include:

  • On-chain RPC nodes: Direct queries to Ethereum, Arbitrum, or other L2 nodes using eth_call for contract state. Use providers like Alchemy or Infura for reliability.
  • Subgraph Indexers (The Graph): For complex historical queries and aggregated protocol data (e.g., Uniswap pools, Aave markets).
  • Decentralized Oracle Networks: Use Chainlink Data Feeds for real-world asset prices (e.g., ETH/USD).
  • Protocol APIs: Many DeFi projects like Lido or Compound offer official APIs for staking and lending data.

A robust architecture polls these sources in parallel, implements fallback logic, and uses a consensus mechanism (e.g., taking the median price from three oracles) to mitigate single-point failures. Cache aggressively with a short TTL (5-30 seconds) to manage rate limits and performance.
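
A minimal sketch of the consensus-and-cache idea follows; the individual price sources are placeholders for your oracle, DEX, and API clients.

typescript
type PriceSource = (symbol: string) => Promise<number>;

// In-memory cache with a short TTL to absorb dashboard read traffic.
const cache = new Map<string, { value: number; expiresAt: number }>();

// Fetches the same price from several sources in parallel and returns the
// median so a single bad feed cannot skew the result.
async function getConsensusPrice(
  symbol: string,
  sources: PriceSource[],
  ttlMs = 15_000
): Promise<number> {
  const cached = cache.get(symbol);
  if (cached && cached.expiresAt > Date.now()) return cached.value;

  const results = await Promise.allSettled(sources.map((s) => s(symbol)));
  const prices = results
    .filter((r): r is PromiseFulfilledResult<number> => r.status === 'fulfilled')
    .map((r) => r.value)
    .sort((a, b) => a - b);
  if (prices.length === 0) throw new Error(`No price source available for ${symbol}`);

  const mid = Math.floor(prices.length / 2);
  const median = prices.length % 2 ? prices[mid] : (prices[mid - 1] + prices[mid]) / 2;

  cache.set(symbol, { value: median, expiresAt: Date.now() + ttlMs });
  return median;
}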

ARCHITECTURE REVIEW

Conclusion and Next Steps

This guide has outlined the core components for building a real-time TVL decomposition dashboard. The next steps involve production deployment, data validation, and exploring advanced analytical features.

You now have a functional architecture for a TVL decomposition dashboard. The system ingests raw blockchain data via RPC nodes or indexers, processes it through a pipeline that normalizes and enriches the data, and stores it in a time-series database like TimescaleDB for efficient querying. The frontend, built with a framework like Next.js, visualizes this data through interactive charts and tables. The key to real-time performance is the incremental update strategy, where you listen for new blocks and process only the delta changes to positions and prices, rather than performing full-chain rescans.

For production deployment, focus on robustness and scalability. Implement comprehensive error handling and retry logic in your data ingestion service to handle RPC node failures. Use a message queue like RabbitMQ or Apache Kafka to decouple data ingestion from processing, ensuring no events are lost during peak loads. Set up monitoring with Prometheus and Grafana to track pipeline health, data freshness, and API latency. Crucially, implement a data validation layer that cross-references your calculated TVL with public sources like DeFiLlama's API to ensure accuracy and build trust.
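
A small validation sketch is shown below, assuming DeFiLlama's public https://api.llama.fi/tvl/{slug} endpoint, which returns the protocol's current TVL as a bare number; the tolerance is an arbitrary example.

typescript
// Cross-checks our computed TVL against DeFiLlama's public figure and flags
// large deviations for manual review.
async function validateAgainstDefiLlama(
  protocolSlug: string,
  ourTvlUsd: number,
  tolerance = 0.05
): Promise<{ ok: boolean; referenceTvlUsd: number; deviation: number }> {
  const res = await fetch(`https://api.llama.fi/tvl/${protocolSlug}`);
  if (!res.ok) throw new Error(`DeFiLlama request failed: ${res.status}`);

  const referenceTvlUsd = Number(await res.json());
  const deviation = Math.abs(ourTvlUsd - referenceTvlUsd) / referenceTvlUsd;

  // Flag for manual review if we diverge by more than the tolerance (5% by default)
  return { ok: deviation <= tolerance, referenceTvlUsd, deviation };
}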

To extend your dashboard's utility, consider adding advanced features. Implement protocol-specific risk dashboards that track metrics like collateralization ratios for lending protocols (e.g., Aave, Compound) or impermanent loss for concentrated liquidity pools. Add cross-chain aggregation to provide a unified view of a protocol's TVL across Ethereum, Arbitrum, and Polygon. For deeper insights, incorporate wallet-level analytics to identify the behavior of large holders (whales) and their impact on protocol liquidity. These features transform your dashboard from a simple data display into a strategic decision-making tool.

The underlying data schema is critical for future flexibility. Design your fact tables to be extensible, with separate tables for deposits, withdrawals, swaps, and price_updates. Use a protocol_dimension table to store metadata about each integrated DeFi application. This normalized structure allows you to easily add new protocols or new types of events (like liquidations) without schema overhauls. Always version your database migrations and keep a detailed changelog.

Finally, engage with the community. Share your methodology and findings. Open-source components of your pipeline on GitHub to solicit feedback and contributions. Participate in forums like the DeFi Llama Telegram group or the Flipside Crypto community to discuss data challenges and solutions. Building a reliable TVL dashboard is an iterative process that benefits from collaborative scrutiny and the rapidly evolving tooling in the blockchain data space.