
Setting Up a DEX Volume and Market Share Tracker

A technical guide for developers to build a system that monitors DEX performance metrics, calculates market share, and visualizes trends against competitors.
Chainscore © 2026
INTRODUCTION

Setting Up a DEX Volume and Market Share Tracker

This guide explains how to build a system for tracking decentralized exchange (DEX) trading volume and calculating market share across major protocols like Uniswap, Curve, and PancakeSwap.

Decentralized exchanges (DEXs) facilitate over $1.5 billion in daily trading volume. For developers, analysts, and investors, tracking this activity is essential for understanding market trends, protocol dominance, and liquidity flows. A DEX tracker aggregates raw on-chain data to compute key metrics: total volume, protocol-specific volume, and the resulting market share percentages. This data is foundational for building dashboards, conducting research, or powering investment strategies.

The core technical challenge is sourcing reliable, real-time data. You have several options: querying blockchain nodes directly via RPC, using a dedicated indexer like The Graph, or leveraging data APIs from providers such as Covalent, Dune Analytics, or Chainscore. Each approach involves trade-offs between data freshness, historical depth, and development complexity. For a production-grade tracker, you'll need to handle data normalization, as different DEXs (e.g., Uniswap V3, Curve V2) emit events in distinct formats.

A basic architecture involves three components: a data ingestion layer, a processing engine, and a storage/API layer. The ingestion layer listens for Swap events from DEX contracts. The processing engine calculates the USD value of each swap using real-time price oracles, aggregates volumes by protocol and time window (hourly, daily), and computes market share. Finally, the processed data is stored in a database and served via an API. You can use Python with Web3.py for prototyping or a more robust stack with Node.js and a time-series database for scalability.

Here's a simplified Python snippet using Web3.py to fetch recent swap events from a Uniswap V2 Pair contract, a common starting point:

python
from web3 import Web3
import json

w3 = Web3(Web3.HTTPProvider('YOUR_RPC_URL'))
uniswap_pair_address = '0x...'
pair_abi = json.loads('[ABI_JSON]')
contract = w3.eth.contract(address=uniswap_pair_address, abi=pair_abi)

# Newer web3.py releases use snake_case (create_filter / from_block);
# older releases use createFilter(fromBlock=...)
event_filter = contract.events.Swap.create_filter(from_block='latest')
events = event_filter.get_new_entries()  # poll this periodically
for event in events:
    amount0In = event['args']['amount0In']
    amount1Out = event['args']['amount1Out']
    # Convert raw amounts using token decimals, then price them in USD

After collecting raw swap data, you must calculate the USD value. This requires knowing the token pair involved, the amount swapped, and the token's current price. Use decentralized oracles like Chainlink or Pyth to fetch accurate prices. The formula for a swap's USD volume is typically: (token_amount * token_price). Aggregate these values per protocol over your desired timeframe. Market share for a protocol like Uniswap is then: (Uniswap's Volume / Total DEX Volume) * 100. Be mindful of double-counting volume in multi-hop trades across different pools.
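The aggregation and market-share arithmetic described above can be sketched as two small pure functions. This is a minimal sketch; the protocol names and USD values are illustrative, and the records are assumed to already carry a `usd_value` computed from a price feed:

```python
from collections import defaultdict

def aggregate_volumes(swaps):
    """Sum processed swap USD values per protocol."""
    totals = defaultdict(float)
    for swap in swaps:
        totals[swap["protocol"]] += swap["usd_value"]
    return dict(totals)

def market_share(volumes):
    """Convert per-protocol USD volumes into percentage shares."""
    total = sum(volumes.values())
    if total == 0:
        return {name: 0.0 for name in volumes}
    return {name: vol / total * 100 for name, vol in volumes.items()}

# Illustrative swaps whose usd_value was already computed via a price feed
swaps = [
    {"protocol": "uniswap", "usd_value": 600_000.0},
    {"protocol": "curve", "usd_value": 250_000.0},
    {"protocol": "uniswap", "usd_value": 150_000.0},
]
shares = market_share(aggregate_volumes(swaps))
print(shares)  # {'uniswap': 75.0, 'curve': 25.0}
```

In a real pipeline, deduplicate multi-hop trades before this step so a single routed order is not counted once per pool it touches.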

For a production system, consider using specialized tools. The Graph subgraphs for protocols like Uniswap provide pre-indexed swap data, saving significant development time. Alternatively, Chainscore's unified API offers normalized DEX volume data across multiple chains, which can simplify backend logic. The final step is visualizing the data. You can use libraries like Chart.js or platforms like Grafana to create dashboards that display trends in volume and shifting market shares between leaders like Uniswap, Curve, and emerging DEXs.

BUILDING A DEX TRACKER

Prerequisites and Tech Stack

Before querying on-chain data, you need to configure your development environment and select the right tools. This guide outlines the core components required to build a DEX volume and market share tracker.

A functional DEX tracker requires a reliable data source, a processing layer, and a storage/visualization component. The core technical stack typically includes a blockchain node or indexer (like an RPC provider or The Graph), a backend runtime (Node.js, Python), a database (PostgreSQL, TimescaleDB), and a frontend framework (React, Next.js). For real-time analytics, you'll also need a task scheduler or message queue to handle periodic data ingestion. This architecture separates data collection from analysis, ensuring scalability.

Your primary data source will be a node provider offering access to historical and real-time blockchain data. Services like Alchemy, Infura, or QuickNode provide managed Ethereum RPC endpoints. For aggregated DEX-specific data, consider using subgraphs on The Graph protocol, which index events from protocols like Uniswap, Curve, and SushiSwap. You'll need to understand the GraphQL query language to fetch metrics such as dailyVolumeUSD, totalLiquidity, and txCount directly from these decentralized APIs.
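As a sketch of the subgraph approach, the snippet below builds and (optionally) posts a GraphQL query for those daily metrics using only the standard library. The field names follow the Uniswap V2-style subgraph schema mentioned above; the endpoint constant is a placeholder you must replace with your own gateway URL:

```python
import json
import urllib.request

SUBGRAPH_URL = "YOUR_SUBGRAPH_ENDPOINT"  # placeholder: your Graph gateway URL

def build_day_data_query(first: int = 7) -> dict:
    """Build a GraphQL payload for daily metrics (Uniswap V2-style schema)."""
    query = """
    {
      uniswapDayDatas(first: %d, orderBy: date, orderDirection: desc) {
        date
        dailyVolumeUSD
        totalLiquidityUSD
        txCount
      }
    }""" % first
    return {"query": query}

def fetch_day_data(first: int = 7) -> dict:
    """POST the query to the subgraph endpoint (requires network access)."""
    payload = json.dumps(build_day_data_query(first)).encode()
    req = urllib.request.Request(
        SUBGRAPH_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_day_data_query(first=3)
```

Entity and field names vary between subgraphs, so check the schema in the subgraph's playground before hardcoding a query.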

The backend service, written in Node.js with Ethers.js or Python with Web3.py, will call these data sources. You must handle chain reorgs, rate limiting, and error logging. Use a library like node-cron or Celery to schedule daily or hourly jobs that fetch and process the latest swap events. The processed data—clean, normalized, and aggregated—should be stored in a time-series optimized database. TimescaleDB (a PostgreSQL extension) is ideal for storing metrics like hourly volume per pool, allowing for efficient time-window queries.
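One common way to handle chain reorgs in a scheduled fetch job is to ingest only blocks buried under a confirmation depth. A minimal sketch, where the 12-block depth is an illustrative choice rather than a universal constant:

```python
CONFIRMATIONS = 12  # illustrative depth; tune per chain and risk tolerance

def is_final(block_number: int, latest_block: int,
             confirmations: int = CONFIRMATIONS) -> bool:
    """Treat a block as final once enough blocks are built on top of it."""
    return latest_block - block_number >= confirmations

def finalized_range(last_processed: int, latest_block: int,
                    confirmations: int = CONFIRMATIONS):
    """Inclusive block range a scheduled job should ingest next, or None."""
    safe_head = latest_block - confirmations
    if safe_head <= last_processed:
        return None  # nothing new is final yet
    return (last_processed + 1, safe_head)

print(finalized_range(100, 120))  # (101, 108)
```

A node-cron or Celery task can call `finalized_range` on each tick, fetch events for that range, and persist `last_processed` only after a successful database write.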

For the initial setup, install Node.js (v18+) and a package manager like npm or yarn. Create a new project directory and initialize it: npm init -y. Then, install your core dependencies: npm install ethers dotenv node-cron. You will also need a database driver, such as pg for PostgreSQL. Store your provider API keys and database credentials in a .env file, never in your codebase. Commit only a .env.example template to version control, not the .env file itself.

Finally, consider the compute and infrastructure requirements. For a production system, you may deploy the fetcher service on a cloud VM or a serverless function (AWS Lambda, Google Cloud Functions). Database hosting can be managed via services like AWS RDS or Supabase. The choice between a monolithic application or microservices depends on the tracking scope—monitoring 10 pools is different from tracking 10,000. Start simple, then iterate based on data latency and reliability needs.

GUIDE

System Architecture Overview

A technical blueprint for building a system to track DEX volume and market share across multiple blockchains.

A DEX volume and market share tracker is a data pipeline that aggregates, processes, and visualizes trading activity from decentralized exchanges. The core architecture consists of three primary layers: the data ingestion layer that pulls raw blockchain data, the processing and storage layer that transforms and stores this data, and the application layer that serves insights via an API or frontend. This separation of concerns ensures scalability, allowing you to add support for new chains like Arbitrum or Solana without overhauling the entire system. The goal is to move from raw, on-chain events to clean, queryable metrics like 24-hour volume, total value locked (TVL), and protocol dominance.

The data ingestion layer is responsible for listening to blockchain events. You can use specialized node providers for reliability, such as Alchemy or QuickNode, or run your own archive nodes for specific chains. For Ethereum and EVM-compatible chains (Polygon, Avalanche), you will primarily listen for Swap events emitted by DEX pool contracts, such as Uniswap V3's pool contracts or the Pair contracts from older AMMs. For Solana, you would monitor transactions interacting with programs like Raydium or Orca. This layer streams raw transaction logs and event data to a message queue like Apache Kafka or a cloud service (AWS Kinesis, Google Pub/Sub) to handle data bursts during high network activity.

In the processing and storage layer, the raw event data is transformed. A stream processing framework (Apache Flink, Spark Streaming) or a batch job (scheduled Airflow DAG) decodes the event logs, calculates derived values like USD amounts using real-time price feeds from Chainlink or a decentralized oracle, and normalizes the data into a structured schema. The processed data is then stored in a time-series database like TimescaleDB or InfluxDB for efficient querying of metrics over time, and a relational database (PostgreSQL) for storing aggregated, dimensioned data (e.g., volume by protocol, by chain, by token pair). This is where you compute key performance indicators (KPIs) such as daily volume, weekly growth rates, and market share percentages.

Finally, the application layer exposes this data for consumption. A backend API, built with a framework like FastAPI or Express.js, serves pre-aggregated data to a frontend dashboard. For real-time updates, you can implement WebSocket connections that push new data as it's processed. The API should offer endpoints for specific queries: /api/v1/volume/chain/ethereum?period=7d or /api/v1/marketshare?date=2024-01-15. For public data transparency, you might also publish the aggregated datasets to a decentralized storage network like IPFS or Arweave. The entire architecture should be containerized using Docker and orchestrated with Kubernetes for resilience and easy scaling of individual components.
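Whatever framework serves those endpoints, the period query parameter needs parsing into a concrete time window. A framework-agnostic, standard-library-only sketch (the endpoint shape matches the examples above; the helper names are my own):

```python
from datetime import timedelta
from urllib.parse import urlparse, parse_qs

def parse_period(period: str) -> timedelta:
    """Parse a period query parameter like '24h', '7d', or '4w'."""
    units = {"h": "hours", "d": "days", "w": "weeks"}
    value, unit = int(period[:-1]), period[-1]
    if unit not in units:
        raise ValueError(f"unsupported period unit: {unit!r}")
    return timedelta(**{units[unit]: value})

def parse_volume_request(url: str):
    """Extract (chain, window) from a request path such as
    /api/v1/volume/chain/ethereum?period=7d."""
    parsed = urlparse(url)
    chain = parsed.path.rstrip("/").split("/")[-1]
    period = parse_qs(parsed.query).get("period", ["24h"])[0]
    return chain, parse_period(period)

print(parse_volume_request("/api/v1/volume/chain/ethereum?period=7d"))
# ('ethereum', datetime.timedelta(days=7))
```

The same helpers work unchanged inside a FastAPI or Express-backed handler, since they only touch the URL.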

DEX TRACKER FOUNDATION

Primary Data Sources and Collection Methods

Building a reliable DEX tracker starts with selecting the right data sources. This section covers the core on-chain and off-chain methods for collecting volume and market share data.

05. Calculating Real Volume & Avoiding Wash Trading

Raw transaction data is polluted with wash trading. Apply filters to calculate legitimate volume.

  • Minimum Liquidity Thresholds: Ignore trades in pools with less than $100,000 TVL to filter out manipulative low-liquidity pools.
  • Suspicious Pattern Detection: Flag circular arbitrage trades and repeated trades between the same wallets.
  • Price Impact Checks: Discard trades that move the pool price by more than 5%, as they are often artificial. Platforms like DefiLlama use these methods to report adjusted volumes.

Estimated wash trade volume on some DEXs: 35-60%.
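The filters above can be expressed as a small predicate applied before aggregation. This is a minimal sketch using the thresholds from the bullets; real wash-trade detection also needs wallet-graph analysis, and the trade records here are invented:

```python
MIN_POOL_TVL_USD = 100_000   # ignore pools below this TVL
MAX_PRICE_IMPACT = 0.05      # discard trades moving price by more than 5%

def is_legitimate(trade: dict) -> bool:
    """Heuristic wash-trade filter over a processed trade record."""
    if trade["pool_tvl_usd"] < MIN_POOL_TVL_USD:
        return False  # manipulable low-liquidity pool
    if abs(trade["price_impact"]) > MAX_PRICE_IMPACT:
        return False  # suspiciously large price move
    if trade["sender"] == trade["recipient"] and trade.get("round_trip"):
        return False  # circular self-trade pattern
    return True

def adjusted_volume(trades) -> float:
    """USD volume summed only over trades passing the filters."""
    return sum(t["usd_value"] for t in trades if is_legitimate(t))

trades = [
    {"pool_tvl_usd": 500_000, "price_impact": 0.01, "sender": "0xa",
     "recipient": "0xb", "usd_value": 1_000.0},
    {"pool_tvl_usd": 50_000, "price_impact": 0.01, "sender": "0xa",
     "recipient": "0xb", "usd_value": 2_000.0},   # filtered: low TVL
    {"pool_tvl_usd": 500_000, "price_impact": 0.20, "sender": "0xa",
     "recipient": "0xb", "usd_value": 3_000.0},   # filtered: price impact
]
print(adjusted_volume(trades))  # 1000.0
```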
06. Building a Data Pipeline Architecture

A production tracker requires a robust pipeline to ingest, process, and store data.

  • Ingestion Layer: Use message queues (Kafka, RabbitMQ) to stream data from RPCs, subgraphs, and APIs.
  • Processing Engine: Implement a worker (in Python, Go, or Rust) to decode logs, aggregate trades by time window, and apply wash trade filters.
  • Storage: Use a time-series database (TimescaleDB) for aggregated metrics and a data warehouse (BigQuery, Snowflake) for raw event storage and complex historical queries.
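
The processing-engine step, aggregating trades by time window, reduces to bucketing decoded swaps by hour before they reach the time-series store. A standard-library sketch of that aggregation, assuming each swap already carries a unix timestamp and a USD value:

```python
from collections import defaultdict
from datetime import datetime, timezone

def hour_bucket(ts: int) -> datetime:
    """Truncate a unix timestamp to the start of its UTC hour."""
    dt = datetime.fromtimestamp(ts, tz=timezone.utc)
    return dt.replace(minute=0, second=0, microsecond=0)

def aggregate_hourly(swaps):
    """Roll decoded swaps up into (hour, protocol) -> USD volume rows,
    the row shape a time-series table like TimescaleDB would store."""
    rows = defaultdict(float)
    for s in swaps:
        rows[(hour_bucket(s["timestamp"]), s["protocol"])] += s["usd_value"]
    return dict(rows)

swaps = [
    {"timestamp": 7_205, "protocol": "uniswap", "usd_value": 100.0},
    {"timestamp": 7_300, "protocol": "uniswap", "usd_value": 50.0},
    {"timestamp": 11_000, "protocol": "curve", "usd_value": 25.0},
]
rows = aggregate_hourly(swaps)
```

Each `(hour, protocol)` row maps directly onto a composite primary key in the metrics table, which keeps upserts idempotent when a job is re-run.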
DEX ANALYTICS

Calculating Core Metrics: Volume, Liquidity, and Share

A technical guide to building a tracker that calculates the core metrics defining a decentralized exchange's performance and market position.

To analyze a DEX's health and competitive standing, you need to track three foundational metrics: trading volume, total value locked (TVL), and market share. Volume represents the total USD value of assets swapped over a period, indicating user activity and fee generation. TVL measures the total capital deposited in the exchange's liquidity pools, representing its depth and slippage resistance. Market share is the DEX's proportion of total volume or TVL within a broader market (e.g., all DEXs on Ethereum), revealing its relative dominance. These metrics are interdependent; high TVL often supports higher volume with lower slippage, which can feed back into attracting more liquidity.

Setting up a tracker begins with data sourcing. For Ethereum and EVM chains, you primarily interact with DEX smart contracts and liquidity pool contracts. Key contracts include the factory (e.g., Uniswap V2's UniswapV2Factory, Uniswap V3's UniswapV3Factory) to enumerate all pools, and individual pair or pool contracts for transaction and reserve data. You must also connect to an RPC node or use an indexed data service like The Graph for historical queries. For volume, you listen to Swap events emitted by pool contracts, parsing the amount0In, amount1Out, and related fields to calculate the USD value of each trade using real-time price oracles.

Calculating daily volume requires aggregating all Swap events per pool, converting token amounts to USD, and summing them. A robust implementation fetches pool reserves to derive prices or uses a decentralized oracle like Chainlink. For TVL, you query the getReserves function (V2) or slot0 and liquidity positions (V3) for each pool, then value each token reserve. Market share is computed by dividing your DEX's metric by the aggregate of all tracked competitors. For accuracy, your script should run periodically (e.g., hourly via cron job), store results in a database, and handle chain reorgs by confirming block finality.

Here is a simplified Python example using Web3.py to fetch a Uniswap V2 pool's reserves and calculate a basic TVL snapshot, assuming you have token price data:

python
from web3 import Web3
web3 = Web3(Web3.HTTPProvider('YOUR_RPC_URL'))

pair_address = '0x...' # USDC/WETH pair address
pair_abi = [...] # ABI for UniswapV2Pair
contract = web3.eth.contract(address=pair_address, abi=pair_abi)

# Get reserves (reserve0, reserve1, blockTimestampLast)
reserves = contract.functions.getReserves().call()
token0 = contract.functions.token0().call()
token1 = contract.functions.token1().call()

# Assume you have price fetching logic: get_price(token_address)
price0 = get_price(token0) # e.g., 1.0 for USDC
price1 = get_price(token1) # e.g., 3500 for WETH

# Scale raw reserves by each token's decimals (USDC: 6, WETH: 18)
tvl_usd = (reserves[0] / 10**6 * price0) + (reserves[1] / 10**18 * price1)
print(f"Pool TVL: ${tvl_usd:.2f}")

Beyond basics, consider fee-adjusted volume (volume minus protocol fees), concentrated liquidity TVL for V3-style DEXs, and volume by pool type (stable vs. volatile pairs). Track metrics across multiple chains if the DEX is deployed on several networks. Your dashboard should visualize trends: daily/weekly volume charts, TVL composition, and market share over time. For production, use robust error handling, monitor RPC rate limits, and consider using subgraphs from The Graph for efficient historical querying. This data pipeline forms the core for deeper analysis like liquidity provider returns, impermanent loss, and identifying the most profitable trading pairs.

TRACKER DATA POINTS

Key DEX Metrics: Definitions and Calculation Methods

Core metrics for analyzing decentralized exchange performance, liquidity, and user activity.

Metric | Definition | Typical Calculation Method | Primary Data Source

24h Trading Volume
  Definition: Total value of all trades executed on the DEX in the last 24 hours.
  Calculation: Sum(price * amount) for all trades in the period.
  Source: DEX subgraph or on-chain event logs.

Total Value Locked (TVL)
  Definition: Total capital deposited in the DEX's liquidity pools.
  Calculation: Sum(token price * pool balance) for all pools.
  Source: DeFiLlama API or on-chain pool contracts.

Market Share
  Definition: The DEX's volume as a percentage of total DEX volume across all tracked protocols.
  Calculation: (DEX 24h Volume / Aggregate DEX Volume) * 100.
  Source: Dune Analytics or a custom aggregator.

Liquidity Depth
  Definition: Available liquidity for a trading pair within a specific price range (e.g., within 1% of spot).
  Calculation: Sum of liquidity in active ticks/price bins near the current price.
  Source: Pool contract state or subgraph.

Fee Revenue
  Definition: Total fees generated for liquidity providers and the protocol.
  Calculation: Sum(trade volume * fee tier) + any protocol fees.
  Source: Fee collection event logs.

Unique Active Wallets (UAW)
  Definition: Count of distinct wallet addresses that interacted with the DEX's core functions.
  Calculation: Count(DISTINCT from_address) for swap and add/remove liquidity transactions.
  Source: On-chain transaction data.

Average Trade Size
  Definition: The mean value of individual trades over a period.
  Calculation: 24h Trading Volume / Number of Trades.
  Source: DEX subgraph or transaction logs.

Slippage for $10k Swap
  Definition: The price impact incurred for a standardized swap size, indicating liquidity efficiency.
  Calculation: (Effective Price - Mid Price) / Mid Price for a simulated $10,000 swap.
  Source: Quoter contract quoteExactInput function (or an equivalent simulation).

TUTORIAL

Building the Data Pipeline: Code Walkthrough

This guide walks through the practical steps of building a data pipeline to track DEX volume and market share, using real-world APIs and code.

A robust DEX analytics pipeline requires structured data ingestion from multiple sources. The core components are an Ethereum RPC node (like Alchemy or Infura) for on-chain data, and The Graph for accessing indexed historical data from protocols like Uniswap and SushiSwap. You'll also need a data storage layer, such as a PostgreSQL database or a cloud data warehouse like Google BigQuery, to store processed metrics. Setting up this foundation allows you to query raw swap events, liquidity pool states, and token prices programmatically.

The first step is to fetch raw swap events. Using a library like ethers.js, you can listen for the Swap event emitted by Uniswap V2/V3 pools. You need the pool's contract address and ABI. For historical analysis, querying a subgraph is more efficient. For example, the Uniswap V3 subgraph on The Graph provides a GraphQL endpoint to fetch swaps within a specific time range, including amounts, tokens, and pool fees. This avoids the complexity and cost of scanning the entire blockchain history.

Once you have raw swap data, you must process it to calculate volume. Volume is typically the USD value of the output token in a swap. This requires a price feed. You can use an on-chain oracle like Chainlink, query a DEX's subgraph for pool-derived prices, or use an external API like CoinGecko. The calculation involves multiplying the output token amount by its USD price at the time of the swap. Be mindful of token decimals and WETH/ETH wrapping conventions, as these are common sources of error in volume aggregation.
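Decimal handling is worth getting right before any aggregation. A small sketch using Python's Decimal to avoid float drift; the 18-decimal WETH amount and the $3,500 price are illustrative:

```python
from decimal import Decimal

def to_units(raw_amount: int, decimals: int) -> Decimal:
    """Convert a raw integer token amount into human-readable units."""
    return Decimal(raw_amount) / (Decimal(10) ** decimals)

def swap_usd_value(raw_out: int, out_decimals: int, out_price_usd: str) -> Decimal:
    """USD value of a swap: output amount in units times its USD price."""
    return to_units(raw_out, out_decimals) * Decimal(out_price_usd)

# 0.5 WETH out (18 decimals) priced at an illustrative $3,500
print(swap_usd_value(500_000_000_000_000_000, 18, "3500"))  # 1750.0
```

Per the wrapping convention noted above, WETH and ETH are normally priced as the same asset in this step.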

To compute market share, you need total volume across all major DEXs on a chain. Your pipeline should aggregate daily volume per DEX (e.g., Uniswap, Curve, Balancer). Market share for a specific DEX is then its daily volume divided by the total daily volume across all tracked DEXs. Maintaining a clean reference of pool addresses to their parent protocol is crucial. This often involves maintaining a static mapping or querying protocol registry contracts.
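A sketch of that rollup, assuming a static pool-to-protocol mapping; the addresses are placeholders, and a real registry would be built from factory contracts or a maintained list:

```python
# Static pool -> protocol registry (placeholder addresses)
POOL_REGISTRY = {
    "0xPoolUniA": "uniswap",
    "0xPoolUniB": "uniswap",
    "0xPoolCrvA": "curve",
}

def daily_market_share(daily_pool_volumes: dict) -> dict:
    """Roll pool-level USD volumes up to protocols and compute share %."""
    totals = {}
    for pool, usd in daily_pool_volumes.items():
        protocol = POOL_REGISTRY.get(pool)
        if protocol is None:
            continue  # unknown pool: log it rather than misattribute volume
        totals[protocol] = totals.get(protocol, 0.0) + usd
    grand_total = sum(totals.values())
    return {p: v / grand_total * 100 for p, v in totals.items()}

shares = daily_market_share(
    {"0xPoolUniA": 70.0, "0xPoolUniB": 10.0, "0xPoolCrvA": 20.0}
)
print(shares)  # {'uniswap': 80.0, 'curve': 20.0}
```

Skipping unknown pools (rather than guessing) keeps the share denominator consistent with the set of protocols you actually track.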

For production, automate the pipeline with scheduled jobs. A framework like Apache Airflow or a simple cron job can run your Node.js or Python scripts daily. The script should: 1) Fetch the last 24 hours of swap events from subgraphs, 2) Calculate USD volume for each swap, 3) Aggregate by protocol, 4) Compute market share percentages, and 5) Write results to your database. Logging and error handling for API rate limits and failed price lookups are essential for reliability.

Finally, consider data validation. Cross-reference your calculated totals with public dashboards like Dune Analytics to spot discrepancies. Optimize costs by caching price data and batching GraphQL queries. The end result is a reliable, automated pipeline that outputs key metrics like daily_protocol_volume and market_share_pct, forming the backbone for any DEX analytics dashboard or research report.

FRONTEND INTEGRATION

Visualization and Dashboard Implementation

Core Dashboard Components

An effective DEX tracker dashboard focuses on key metrics for traders and analysts. The primary view should display Total Volume (24h) and Market Share % for each tracked DEX (e.g., Uniswap, Curve, PancakeSwap).

Essential visualizations include:

  • Time-series charts for volume trends (hourly/daily) using libraries like Chart.js or Recharts.
  • Pie/bar charts for real-time market share distribution.
  • Data tables with sortable columns for DEX name, chain, volume, and share.
  • Chain filters to isolate data for Ethereum, Arbitrum, Base, etc.

Prioritize a clean, responsive layout. Use color coding consistently (e.g., by protocol or chain) and ensure all numbers update in near real-time via WebSocket or frequent API polling.

GUIDE

Correlating On-Chain Events with Volume and Market Share

Learn to build a system that correlates on-chain events with trading metrics to analyze DEX performance in real-time.

To track DEX volume and market share, you must first identify and ingest the relevant on-chain events. For an Automated Market Maker (AMM) like Uniswap V3, the core event is Swap, emitted by the pool contract on every trade. A robust tracker subscribes to these events using a node provider like Alchemy or QuickNode, parsing the emitted data: sender, recipient, amount0, amount1, sqrtPriceX96, and liquidity. This raw event stream is the foundation for all subsequent volume calculations. For multi-chain analysis, you must replicate this setup for each protocol and network, such as PancakeSwap on BNB Chain or Curve on Arbitrum.

The next step is calculating accurate USD volume from the raw event data. Simply summing token amounts is insufficient due to price volatility. The correct method involves using the sqrtPriceX96 value from the Swap event to calculate the execution price at the time of the trade. You then apply this price to the delta of the output token amount. For a WETH/USDC pool, if a swap results in 1,000 USDC leaving the pool, the USD volume for that trade is approximately $1,000. This price must be sourced from the pool itself at the block of the transaction to avoid reliance on external, potentially manipulated oracles. Aggregate this calculated volume per pool, per DEX, and per day.

Correlating this volume data with broader market events requires a temporal database. Store each processed swap with its block number, timestamp, calculated USD volume, and pool identifier. You can then join this dataset with an indexed log of major events, such as new token listings, governance proposals, or liquidity incentive programs. For example, query for days where Uniswap's share of total DEX volume spiked by over 15% and check for a correlated event like the launch of a major liquidity mining campaign on Uniswap Governance. This analysis reveals the direct impact of protocol decisions on user behavior and capital flows.
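That spike query can be prototyped in plain Python before moving it into SQL against the temporal database. A sketch over a daily market-share series, with invented dates and share figures:

```python
def share_spikes(daily_shares, threshold_pct_points: float = 15.0):
    """Dates where a protocol's market share rose by more than the threshold
    (in percentage points) versus the previous day. Join these dates against
    an event log of listings or incentive launches for correlation analysis."""
    spikes = []
    for prev, curr in zip(daily_shares, daily_shares[1:]):
        if curr["share"] - prev["share"] > threshold_pct_points:
            spikes.append(curr["date"])
    return spikes

history = [
    {"date": "2024-01-10", "share": 52.0},
    {"date": "2024-01-11", "share": 54.0},
    {"date": "2024-01-12", "share": 71.0},  # e.g. an incentive launch day
]
print(share_spikes(history))  # ['2024-01-12']
```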

To operationalize this, a common architecture uses a listener service, a processor, and a time-series database. The listener captures Swap events in real-time. The processor, which can be written in Python or Node.js, calculates the USD volume and writes records to a database like TimescaleDB or ClickHouse. A separate cron job can then run SQL queries to compute rolling 24-hour volumes and market share percentages (e.g., (Uniswap Volume / Total DEX Volume) * 100). This pipeline enables dashboards that update with each new block, providing a live view of the competitive DEX landscape.

For developers, here is a simplified code snippet demonstrating the core calculation using the sqrtPriceX96 from a Uniswap V3 Swap event in JavaScript (using ethers.js):

javascript
// Price of token0 in terms of token1: (sqrtPriceX96 / 2^96)^2.
// Number() conversion is fine for illustration; use a fixed-point library
// in production to avoid precision loss on large values.
const rawPrice = Number(sqrtPriceX96) ** 2 / 2 ** 192;
// Adjust for the tokens' decimal difference to get a human-readable price
const price = rawPrice * 10 ** (token0Decimals - token1Decimals);
const token0Amount = Number(amount0) / 10 ** token0Decimals;
const token1Amount = Number(amount1) / 10 ** token1Decimals;
// In V3 Swap events, a negative amount means that token left the pool
let usdVolume;
if (amount0 < 0) { // token0 (e.g., WETH) is the output
  usdVolume = Math.abs(token0Amount) * price; // value expressed in token1
} else {           // token1 is the output
  usdVolume = Math.abs(token1Amount);         // already in token1 units
}
// usdVolume is in USD only when token1 is a USD stablecoin such as USDC

This logic, applied to every swap, forms the basis of your volume dataset.

Ultimately, this tracker provides actionable intelligence. Projects can monitor the success of their liquidity deployments, investors can identify trending protocols before they appear on aggregate sites, and analysts can quantify the market impact of upgrades like Uniswap V4's hook system. By directly linking on-chain actions to economic outcomes, you move beyond simple metrics into causal analysis, which is essential for informed decision-making in the dynamic DeFi sector.

DEX TRACKER SETUP

Frequently Asked Questions

Common technical questions and solutions for developers building or integrating DEX volume and market share analytics.

For accurate DEX volume and market share data, you need to aggregate on-chain data directly. Relying on a single source like The Graph subgraph or a centralized API can lead to discrepancies.

Primary sources include:

  • Direct RPC calls to archive nodes for historical event logs.
  • Subgraphs from The Graph (e.g., Uniswap V3, SushiSwap).
  • Analytics platforms like Dune Analytics or Flipside Crypto for pre-aggregated queries.

Key considerations:

  • Always verify calculations by comparing multiple sources; API-reported volume can differ from on-chain sums.
  • Account for internal transactions and flash loans, which some trackers incorrectly count as volume.
  • Use block explorers (Etherscan, Arbiscan) as a manual verification tool, not a primary data feed.
IMPLEMENTATION SUMMARY

Conclusion and Next Steps

You have now built a foundational DEX volume and market share tracker. This guide covered the essential steps: data sourcing, processing, and visualization.

Your tracker now provides a clear, real-time view of the competitive landscape across major DEXs like Uniswap, PancakeSwap, and Curve. By aggregating data from sources like The Graph and Dune Analytics, you can monitor key metrics such as 24-hour trading volume, total value locked (TVL), and fee generation. This data is crucial for identifying leading protocols, spotting emerging trends like the rise of a new AMM on Arbitrum or Base, and making informed decisions based on concrete on-chain activity rather than speculation.

To enhance your tracker, consider these next steps. First, expand your data sources by integrating direct RPC calls to fetch pool reserves or subscribing to real-time events from contracts. Second, implement historical analysis by storing daily snapshots in a database (e.g., PostgreSQL or TimescaleDB) to calculate weekly/monthly trends and volatility. Third, add cross-chain comparative analytics to normalize volume across different native assets (like ETH vs. SOL) or adjust for layer-2 scaling solutions, providing a more accurate market share picture.

For advanced functionality, explore building predictive models or alert systems. You could use the historical data to train simple ML models for volume forecasting or set up alerts for anomalous activity, such as a sudden 50% drop in a DEX's market share, which could indicate a migration event or a liquidity crisis. The code and architecture from this guide serve as a springboard for these more complex, value-added analytics products tailored to traders, liquidity providers, or protocol developers.
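The alerting idea can start as a one-line rule long before any ML is involved. A sketch of the relative-drop check; the 50% threshold is the example figure above, not a recommendation:

```python
def share_drop_alert(prev_share: float, curr_share: float,
                     drop_fraction: float = 0.5) -> bool:
    """Fire when market share falls by more than drop_fraction, relatively."""
    if prev_share <= 0:
        return False
    return (prev_share - curr_share) / prev_share > drop_fraction

print(share_drop_alert(40.0, 18.0))  # True: a 55% relative drop
```

Wiring this to a notifier (email, Slack webhook) on each daily aggregation run gives a usable early-warning system with almost no extra code.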
