
Setting Up Analytics for Layer 2 Rollup Adoption

This guide provides a technical framework for developers to measure the adoption and performance of their applications on Layer 2 rollups. It covers key metrics, data sources, and code for tracking bridge inflows, transaction economics, user growth, and activity composition to evaluate L2 deployment ROI.
introduction
INTRODUCTION

Setting Up Analytics for Layer 2 Rollup Adoption

This guide explains how to measure and analyze adoption metrics for Layer 2 rollups using on-chain data.

Understanding Layer 2 (L2) rollup adoption requires moving beyond simple transaction counts. Effective analytics track the health, decentralization, and economic activity of a rollup ecosystem. Key metrics include Total Value Locked (TVL), daily active addresses, transaction fee analysis, and the rate of new contract deployments. For developers and researchers, setting up a robust analytics pipeline is essential for making data-driven decisions about protocol development, investment, or user acquisition strategies.

The foundation of any rollup analytics setup is accessing reliable on-chain data. You can use providers like The Graph for indexed subgraphs, Dune Analytics for community-built dashboards, or direct RPC calls to archive nodes. For Arbitrum and Optimism, their respective block explorers offer APIs for transaction and address data. A common starting point is querying daily transaction volumes and unique active addresses to establish a baseline growth trend, which can be visualized using tools like Grafana or Superset.

To gain deeper insights, you must analyze specific activity types. This includes monitoring bridging activity (deposits/withdrawals to L1), DeFi protocol usage (unique interactions with AMMs or lending markets), and NFT minting/trading volume. For example, tracking the weekly net inflow of assets from Ethereum to an Optimism Superchain rollup reveals user confidence and capital migration patterns. Implementing these queries often involves parsing event logs from core bridge and factory contracts.

Advanced analytics involve calculating protocol-specific metrics and network health indicators. You should measure average transaction fees over time to assess scalability claims, monitor sequencer decentralization progress by tracking block proposers, and analyze contract creation trends to gauge developer activity. Setting up alerts for anomalous events—like a sudden 50% drop in daily transactions or a spike in failed bridge withdrawals—can provide early warning signs of network issues or declining user engagement.

Finally, presenting this data effectively is crucial. Build dashboards that compare multiple rollups side-by-side, such as Arbitrum One, OP Mainnet, Base, and zkSync Era. Focus on comparative metrics like cost per transaction, time-to-finality, and developer ecosystem growth. Use this analysis to answer concrete questions: Which rollup is most cost-effective for users? Where are developers deploying new applications? The insights guide everything from infrastructure investment to end-user recommendations.

prerequisites
ANALYTICS FOUNDATION

Prerequisites and Data Sources

Before analyzing Layer 2 adoption, you need the right tools and data. This guide covers the essential software, APIs, and data sources required to build a robust analytics pipeline.

To analyze Layer 2 rollup adoption, you'll need a development environment capable of interacting with blockchain data. The core prerequisites are Node.js (v18 or later) and Python (v3.10+), which provide the runtime for most data-fetching scripts and analysis libraries. Essential packages include ethers.js or web3.py for interacting with RPC nodes, pandas for data manipulation, and visualization tools like matplotlib or plotly. A package manager like npm or pip is required to install these dependencies. Setting up a code editor such as VS Code with relevant extensions will streamline your workflow.

The primary data source is a reliable RPC (Remote Procedure Call) endpoint. You need access to nodes for both the Layer 1 (e.g., Ethereum Mainnet) and the target Layer 2s (e.g., Arbitrum, Optimism, Base). Services like Alchemy, Infura, or QuickNode provide managed RPC access. For broader, aggregated metrics, blockchain explorers like Etherscan and its L2 counterparts (Arbiscan, Optimistic Etherscan) offer APIs for querying transaction histories, contract interactions, and address balances, which are crucial for tracking user migration and contract deployment.

For standardized, queryable datasets, The Graph is indispensable. This decentralized protocol indexes blockchain data into subgraphs, allowing you to query complex information with GraphQL. Key subgraphs for adoption analysis include those for major DEXs (Uniswap, Curve), lending protocols (Aave, Compound), and bridge contracts. Additionally, Dune Analytics provides a massive repository of community-built dashboards and a powerful query engine for on-chain data. You can fork existing queries for metrics like daily active addresses, transaction volume, and Total Value Locked (TVL) across chains to accelerate your analysis.

Beyond raw chain data, specialized data providers offer refined metrics. Platforms like L2BEAT provide trusted, risk-adjusted TVL data and technical summaries for every major rollup. DefiLlama is the definitive source for aggregated TVL and protocol-specific data across all chains. For tracking bridging activity, analyze the official bridge contracts (e.g., Arbitrum's Inbox, Optimism's L1StandardBridge) or use a bridge aggregator API like Socket. These sources help you measure capital flows, a direct indicator of adoption.

Finally, structure your project for reproducibility. Create a dedicated directory with subfolders for scripts/ (data fetching), data/ (raw and processed CSVs/JSON), and analysis/ (Jupyter notebooks or analysis scripts). Use environment variables (a .env file) to securely store your RPC URLs and API keys. Implement basic logging and error handling in your scripts. Start by writing a simple script that fetches the latest block number and native token balance from both an L1 and L2 RPC endpoint to verify your setup is functioning correctly before proceeding to complex queries.
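
As a quick verification, a minimal sketch along these lines (assuming viem v2 and RPC URLs stored in L1_RPC_URL and L2_RPC_URL environment variables) fetches the latest block number and an address's ETH balance from both endpoints:

typescript
// verify-setup.ts: minimal sketch assuming viem v2 and RPC URLs loaded from a .env file
import 'dotenv/config';
import { createPublicClient, http, formatEther } from 'viem';
import { mainnet, arbitrum } from 'viem/chains';

const l1 = createPublicClient({ chain: mainnet, transport: http(process.env.L1_RPC_URL) });
const l2 = createPublicClient({ chain: arbitrum, transport: http(process.env.L2_RPC_URL) });

// Any address works for a connectivity check; replace it with one you actually care about.
const address = '0x0000000000000000000000000000000000000000' as const;

async function main() {
  const [l1Block, l2Block] = await Promise.all([l1.getBlockNumber(), l2.getBlockNumber()]);
  const [l1Balance, l2Balance] = await Promise.all([
    l1.getBalance({ address }),
    l2.getBalance({ address }),
  ]);
  console.log(`L1 block ${l1Block}, balance ${formatEther(l1Balance)} ETH`);
  console.log(`L2 block ${l2Block}, balance ${formatEther(l2Balance)} ETH`);
}

main().catch(console.error);

If both calls succeed and the block numbers look current, your endpoints and environment variables are wired correctly and you can move on to heavier queries.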

key-metrics
LAYER 2 ANALYTICS

Core Adoption Metrics to Track

Quantifying Layer 2 adoption requires tracking specific on-chain data points. This guide covers the essential metrics for developers and researchers to analyze rollup health and user growth.

tracking-bridge-inflows
ANALYTICS GUIDE

Tracking Bridge Inflows and User Migration

A technical guide for developers and analysts to monitor capital and user movement from Ethereum to Layer 2 rollups using on-chain data.

Bridge inflows are the foundational metric for measuring Layer 2 adoption: they quantify the value migrating from Ethereum mainnet to a rollup via its canonical bridge contracts, and they are the primary driver of the rollup's total value locked (TVL). To track them, monitor calls to the bridge's deposit functions (such as depositETH). For example, the Arbitrum One bridge contract sits at 0x8315177aB297bA92A06054cE80a67Ed4DBd7ed3a, while Optimism's L1StandardBridge is at 0x99C9fc46f92E8a1c0deC1b1747d010903E884bE1. By summing the value of all successful deposit transactions, you can calculate the cumulative ETH bridged. This raw inflow data is available from block explorers or by querying an archive node with libraries like ethers.js or viem.

User migration analysis goes beyond raw capital to measure unique wallet adoption. A simple count of unique from addresses interacting with the bridge deposit function provides a baseline. However, for deeper insights, you must track subsequent on-chain activity. A migrated user is typically defined as an address that has both bridged assets and executed at least one transaction on the L2 after a successful deposit. This filters out one-time bridgers or airdrop farmers. Tools like Dune Analytics and Flipside Crypto offer pre-built dashboards for this, but you can build custom queries by cross-referencing bridge events on Ethereum with transaction logs on the rollup's chain.
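
A minimal sketch of that cross-reference, assuming viem and a list of depositor addresses already extracted from L1 bridge events (the placeholder addresses below are purely illustrative), checks each address's transaction count on the L2:

typescript
// migrated-users.ts: sketch assuming viem v2; `depositors` would come from your bridge-event data
import { createPublicClient, http, type Address } from 'viem';
import { optimism } from 'viem/chains';

const l2 = createPublicClient({ chain: optimism, transport: http(process.env.L2_RPC_URL) });

// Hypothetical depositor addresses pulled from L1 bridge deposit events.
const depositors: Address[] = [
  '0x0000000000000000000000000000000000000001',
  '0x0000000000000000000000000000000000000002',
];

async function countMigratedUsers(): Promise<number> {
  let migrated = 0;
  for (const address of depositors) {
    // An address with at least one outgoing L2 transaction (nonce > 0) counts as "migrated".
    const txCount = await l2.getTransactionCount({ address });
    if (txCount > 0) migrated++;
  }
  return migrated;
}

countMigratedUsers().then((n) => console.log(`${n}/${depositors.length} depositors active on L2`));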

To set up a robust analytics pipeline, start by streaming bridge event logs through a provider like Alchemy or QuickNode. For a code-based approach, use a viem client to fetch or subscribe to events: instantiate a client for Ethereum mainnet and filter on the bridge contract's deposit event. Each event log contains the depositor's address and the deposited amount. Store this data in a time-series database like TimescaleDB or ClickHouse for efficient historical querying and trend analysis.
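
A sketch of that collection step, using viem against Optimism's L1StandardBridge on Ethereum mainnet and assuming the ETHDepositInitiated(address indexed from, address indexed to, uint256 amount, bytes extraData) event signature (confirm it against the verified contract ABI before relying on it), looks like this:

typescript
// bridge-inflows.ts: sketch assuming viem v2; verify the event signature against the deployed ABI
import { createPublicClient, http, parseAbiItem, formatEther } from 'viem';
import { mainnet } from 'viem/chains';

const l1 = createPublicClient({ chain: mainnet, transport: http(process.env.L1_RPC_URL) });

const OPTIMISM_L1_STANDARD_BRIDGE = '0x99C9fc46f92E8a1c0deC1b1747d010903E884bE1';
const depositEvent = parseAbiItem(
  'event ETHDepositInitiated(address indexed from, address indexed to, uint256 amount, bytes extraData)'
);

async function sumInflows(fromBlock: bigint, toBlock: bigint) {
  // Many RPC providers cap getLogs block ranges, so chunk requests when backfilling long histories.
  const logs = await l1.getLogs({
    address: OPTIMISM_L1_STANDARD_BRIDGE,
    event: depositEvent,
    fromBlock,
    toBlock,
  });
  const totalWei = logs.reduce((acc, log) => acc + (log.args.amount ?? 0n), 0n);
  const depositors = new Set(logs.map((log) => log.args.from));
  console.log(`${logs.length} deposits, ${depositors.size} unique depositors, ${formatEther(totalWei)} ETH`);
}

const latest = await l1.getBlockNumber();
await sumInflows(latest - 7200n, latest); // roughly the last day of Ethereum blocks

Each decoded log row (depositor, amount, block, timestamp) is what you write into the time-series database for later aggregation.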

Analyzing the data reveals key adoption signals. Look for trends in deposit size distribution (whale vs. retail activity), deposit frequency, and asset composition (ETH vs. ERC-20 tokens). A spike in small, frequent deposits may indicate growing retail engagement, while large, lump-sum inflows could signal institutional movement. Furthermore, correlating bridge deposits with subsequent DeFi interactions on the L2—such as swaps on Uniswap or lending on Aave—provides a retention rate metric. This tells you what percentage of bridged capital is being actively utilized within the rollup's ecosystem versus sitting idle.
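
A small helper like the following can turn the decoded deposit amounts into a rough size distribution; the thresholds are illustrative, not standard definitions of retail or whale activity:

typescript
// deposit-buckets.ts: illustrative thresholds; tune them to your own retail/whale definitions
import { parseEther } from 'viem';

type Buckets = { retail: number; mid: number; whale: number };

function bucketDeposits(amountsWei: bigint[]): Buckets {
  const buckets: Buckets = { retail: 0, mid: 0, whale: 0 };
  for (const amount of amountsWei) {
    if (amount < parseEther('0.5')) buckets.retail++;   // < 0.5 ETH
    else if (amount < parseEther('50')) buckets.mid++;  // 0.5-50 ETH
    else buckets.whale++;                               // >= 50 ETH
  }
  return buckets;
}

// Example: feed in the `amount` values decoded from the bridge deposit logs above.
console.log(bucketDeposits([parseEther('0.1'), parseEther('3'), parseEther('120')]));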

For production-grade monitoring, automate dashboards using the data stack. Services like Dune allow you to publish queries as live dashboards. Alternatively, you can use Grafana connected to your analytical database to visualize metrics like daily unique depositors, weekly net inflow, and user retention cohorts. Setting up alerts for anomalous activity—such as a sudden 50% drop in daily inflows or a single massive deposit—can provide early signals for ecosystem shifts or potential security events. This data-driven approach is essential for protocols, investors, and researchers to objectively gauge rollup growth and health.
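
One way to implement that drop alert, assuming you already aggregate daily inflow totals in your analytics database, is a simple threshold check against a trailing average:

typescript
// inflow-alert.ts: sketch; `recentDailyInflows` would be loaded from your analytics database
function inflowDropped(recentDailyInflows: number[], dropThreshold = 0.5): boolean {
  // Expects oldest-to-newest daily totals; compares the latest day against the trailing 7-day average.
  const today = recentDailyInflows[recentDailyInflows.length - 1];
  const trailing = recentDailyInflows.slice(-8, -1);
  const average = trailing.reduce((a, b) => a + b, 0) / trailing.length;
  return today < average * (1 - dropThreshold);
}

// Hypothetical daily inflow totals in ETH; wire the result into Grafana, PagerDuty, or a webhook.
const inflows = [820, 790, 850, 910, 880, 860, 900, 310];
if (inflowDropped(inflows)) console.log('ALERT: daily bridge inflows fell more than 50% below the 7-day average');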

analyzing-transaction-economics
DATA-DRIVEN INSIGHTS

Analyzing Transaction Cost Savings

A practical guide to measuring and quantifying the financial benefits of migrating your dApp or user activity to Layer 2 rollups using on-chain analytics.

To analyze transaction cost savings, you must first establish a baseline. This involves querying historical transaction data from the base layer (e.g., Ethereum Mainnet) for a representative sample of your application's operations. Key metrics to extract include average gas price (in Gwei), gas used per transaction type (simple transfer, token swap, NFT mint), and the resulting fee in USD or ETH. Tools like The Graph for indexed subgraphs, Dune Analytics dashboards, or direct RPC calls to archive nodes are essential for this historical analysis. This baseline represents your 'cost of doing business' on Layer 1.
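
For the per-transaction baseline, one lightweight approach (a sketch assuming viem and a mainnet RPC) is to pull receipts for a sample of representative transactions and compute the fee each one actually paid:

typescript
// l1-baseline.ts: sketch assuming viem v2; pass hashes of representative mainnet transactions
import { createPublicClient, http, formatEther, type Hash } from 'viem';
import { mainnet } from 'viem/chains';

const l1 = createPublicClient({ chain: mainnet, transport: http(process.env.L1_RPC_URL) });

async function averageFeeEth(txHashes: Hash[]): Promise<number> {
  let totalWei = 0n;
  for (const hash of txHashes) {
    const receipt = await l1.getTransactionReceipt({ hash });
    // Fee actually paid = gas used * effective gas price (EIP-1559 aware).
    totalWei += receipt.gasUsed * receipt.effectiveGasPrice;
  }
  return Number(formatEther(totalWei)) / txHashes.length;
}

// Usage: averageFeeEth(['0x...', '0x...']).then((fee) => console.log(`${fee} ETH per tx`));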

Next, replicate the analysis for the same activity on your target Layer 2, such as Arbitrum, Optimism, or Base. Here, you'll calculate the L2 transaction fee, which includes the L2 execution fee and the pro-rated cost of data publication back to L1. While L2 block explorers provide fee data, for programmatic comparison, you should use the rollup's SDK or RPC methods. For example, you can estimate fees using eth_estimateGas and the current l1GasPrice fetched via an oracle contract like GasPriceOracle on Optimism. Comparing the USD cost per transaction between Layer 1 and Layer 2 reveals the raw savings percentage.
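
On OP Stack chains, the total fee can be approximated as the L2 execution fee plus the L1 data fee. A sketch of that calculation, assuming the GasPriceOracle predeploy at 0x420000000000000000000000000000000000000F exposes getL1Fee(bytes) (check the ABI on your target chain):

typescript
// l2-fee-estimate.ts: sketch for OP Stack chains; verify the GasPriceOracle ABI before relying on it
import { createPublicClient, http, parseAbi, formatEther, type Address, type Hex } from 'viem';
import { optimism } from 'viem/chains';

const l2 = createPublicClient({ chain: optimism, transport: http(process.env.L2_RPC_URL) });

const GAS_PRICE_ORACLE: Address = '0x420000000000000000000000000000000000000F';
const oracleAbi = parseAbi(['function getL1Fee(bytes _data) view returns (uint256)']);

async function estimateTotalFee(account: Address, to: Address, data: Hex) {
  // L2 execution portion: estimated gas * current L2 gas price.
  const [gas, gasPrice] = await Promise.all([
    l2.estimateGas({ account, to, data }),
    l2.getGasPrice(),
  ]);
  // L1 data portion. getL1Fee expects the serialized transaction, so passing calldata
  // alone gives only a rough lower bound on the data-publication cost.
  const l1Fee = await l2.readContract({
    address: GAS_PRICE_ORACLE,
    abi: oracleAbi,
    functionName: 'getL1Fee',
    args: [data],
  });
  const total = gas * gasPrice + l1Fee;
  console.log(`~${formatEther(total)} ETH (execution + L1 data fee)`);
  return total;
}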

Beyond simple per-transaction comparison, implement a Total Cost of Ownership (TCO) model for user journeys. A single DeFi interaction often involves multiple transactions—approval, swap, staking. Calculate the bundled cost of this journey on both layers. Furthermore, factor in indirect savings from improved user experience: lower costs enable micro-transactions, more frequent interactions, and complex batch operations that are prohibitively expensive on L1. Use analytics platforms like Flipside Crypto or custom scripts to segment users by transaction volume and map how reduced fees correlate with increased engagement over time.

For ongoing monitoring, automate your analytics pipeline. Create a script that periodically (e.g., daily) fetches average gas prices from Etherscan's API and equivalent fee data from L2 sequencers, then logs the savings ratio to a database or dashboard. Alert on anomalies—if L1 gas prices crash or L2 congestion temporarily increases fees, your savings analysis should reflect this volatility. Presenting this data as a real-time dashboard, perhaps using Grafana, provides compelling, evidence-based justification for L2 adoption to both your team and end-users, moving the conversation from speculation to quantified financial fact.
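
A minimal scheduled monitor might look like the sketch below; to stay self-contained it compares L1 and L2 gas prices directly over RPC rather than calling the Etherscan API, so treat the ratio as an execution-fee comparison only:

typescript
// savings-monitor.ts: run on a schedule (cron, GitHub Actions); logs the L1/L2 gas-price ratio
import { createPublicClient, http, formatGwei } from 'viem';
import { mainnet, base } from 'viem/chains';

const l1 = createPublicClient({ chain: mainnet, transport: http(process.env.L1_RPC_URL) });
const l2 = createPublicClient({ chain: base, transport: http(process.env.L2_RPC_URL) });

async function logSavingsRatio() {
  const [l1GasPrice, l2GasPrice] = await Promise.all([l1.getGasPrice(), l2.getGasPrice()]);
  // Ratio of execution gas prices only; it ignores the L2's L1 data fee, so it is an upper bound.
  const ratio = Number(l1GasPrice) / Number(l2GasPrice);
  console.log(
    `${new Date().toISOString()} L1 ${formatGwei(l1GasPrice)} gwei, ` +
      `L2 ${formatGwei(l2GasPrice)} gwei, ratio ~${ratio.toFixed(1)}x`
  );
  // In production, persist the reading to your database or push it to a dashboard instead.
}

logSavingsRatio().catch(console.error);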

measuring-user-growth
MEASURING USER AND ACTIVITY GROWTH

Measuring User and Activity Growth

A practical guide to tracking and analyzing key metrics for Layer 2 rollup ecosystems using on-chain data and analytics platforms.

Tracking Layer 2 (L2) adoption requires moving beyond simple transaction counts. To measure genuine user and activity growth, you must analyze on-chain data for metrics like Daily Active Addresses (DAA), Total Value Locked (TVL), and transaction fee economics. Platforms like Dune Analytics, Nansen, and Flipside Crypto provide the foundational dashboards. For example, a Dune dashboard tracking Arbitrum's weekly active users versus its transaction fees can reveal adoption trends independent of speculative airdrop farming.

To build a custom analysis, start by querying core data. Using the Dune SQL engine, you can fetch daily unique addresses. A basic query for Optimism might look like:

sql
SELECT 
    DATE(block_time) as day,
    COUNT(DISTINCT "from") as daily_active_addresses
FROM optimism.transactions
WHERE block_time > NOW() - INTERVAL '30' day
GROUP BY 1
ORDER BY 1;

This provides the raw count, but for deeper insight, segment these addresses into categories like contract deployers, DEX swappers, or bridge users to understand what type of activity is growing.

Beyond user counts, analyze economic activity and retention. Monitor transaction fee revenue paid in ETH (or the native L2 token) to gauge network utility. Compare this with the cost of posting data to Ethereum L1 (the "batch submission cost") to assess the rollup's economic efficiency. Tools like L2BEAT provide aggregated safety and TVL data, while Token Terminal tracks protocol revenue. A sudden spike in new addresses with low transaction frequency may indicate sybil activity, whereas steady growth in addresses with recurring transactions suggests organic adoption.

For application-specific growth, implement event tracking within your smart contracts. Emit standardized events for key actions like swaps, deposits, or NFT mints. This allows you to build precise dashboards that track your dApp's contribution to the overall L2 activity. Combining this with wallet analytics from a provider like Covalent or The Graph can help you understand user journeys and identify friction points in your application's flow.

Finally, contextualize your data by comparing it across chains. Use a multi-chain dashboard to benchmark your target L2 (e.g., Base, zkSync Era) against competitors and Ethereum Mainnet. Look at metrics like cost per transaction, time-to-finality, and the ratio of DAA to TVL. This comparative analysis reveals whether growth is driven by unique value propositions like lower fees or new primitives, or is merely following broader market trends. Consistent measurement across these vectors provides a complete picture of sustainable rollup adoption.

ANALYTICS FOCUS

Comparing On-Chain Activity Composition

Breakdown of transaction types to identify core adoption drivers across different rollups.

Activity Metric                Arbitrum One    Optimism    zkSync Era    Base
DeFi Swap Volume               42%             38%         31%           55%
NFT Minting & Trading          18%             22%         35%           25%
Bridge Deposits/Withdrawals    15%             12%         10%           8%
Social/Gaming Transactions     8%              10%         15%           7%
Contract Deployments           5%              6%          4%            3%
Average TX Fee (USD)           $0.10           $0.08       $0.05         $0.15


calculating-roi
LAYER 2 ANALYTICS

Calculating Deployment ROI and Setting Dashboards

This guide explains how to measure the return on investment for deploying on a Layer 2 rollup and how to set up dashboards to track key adoption metrics.

Deploying a smart contract on a Layer 2 (L2) like Arbitrum, Optimism, or zkSync involves upfront and ongoing costs, including gas fees for the initial deployment and for bridging assets. The primary ROI drivers are reduced transaction fees for users, which can be 10-100x lower than on Ethereum mainnet, and increased user activity driven by the better experience. To calculate a basic ROI, track the cost savings for your users versus mainnet and correlate it with growth in key metrics like daily active addresses and transaction volume. The break-even point is reached when the value of user growth and retained fees surpasses the initial and ongoing operational costs.
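
A deliberately simple break-even model is sketched below; every input is a hypothetical placeholder to be replaced with your own measured costs, savings, and revenue:

typescript
// roi-breakeven.ts: toy model; all numbers are placeholders for your own measured data
interface DeploymentCosts {
  deploymentUsd: number;   // one-off deployment and bridging cost
  monthlyOpsUsd: number;   // infrastructure, monitoring, and incentive spend per month
}

interface MonthlyBenefit {
  txPerMonth: number;          // L2 transactions attributable to your app
  feeSavedPerTxUsd: number;    // measured L1 fee minus measured L2 fee
  protocolRevenueUsd: number;  // fees your protocol earns on the L2, if any
}

function monthsToBreakEven(costs: DeploymentCosts, benefit: MonthlyBenefit): number {
  const monthlyValue =
    benefit.txPerMonth * benefit.feeSavedPerTxUsd + benefit.protocolRevenueUsd - costs.monthlyOpsUsd;
  if (monthlyValue <= 0) return Infinity; // never breaks even at the current run rate
  return costs.deploymentUsd / monthlyValue;
}

// Hypothetical example: 10k tx/month saving $0.50 each, $2k revenue, $1.5k ops, $5k deployment.
console.log(
  monthsToBreakEven(
    { deploymentUsd: 5_000, monthlyOpsUsd: 1_500 },
    { txPerMonth: 10_000, feeSavedPerTxUsd: 0.5, protocolRevenueUsd: 2_000 }
  ).toFixed(2),
  'months to break even'
);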

Effective monitoring requires a dashboard built on core L2 adoption indicators. Essential metrics include Total Value Locked (TVL) in your protocol's contracts, daily/weekly active users, transaction count and volume, and average transaction fee paid by users. For cross-chain projects, it's critical to track the bridge inflow/outflow of assets to and from the L2. Tools like Dune Analytics and Flipside Crypto allow you to build custom dashboards by writing SQL queries against indexed blockchain data. Start by forking an existing dashboard for your chosen L2 and modifying the queries to filter for your specific contract addresses.

Here is a simplified example of a Dune Analytics query to track daily active users for a specific contract on Arbitrum. This query counts unique from addresses interacting with your contract each day.

sql
SELECT
  DATE(block_time) as day,
  COUNT(DISTINCT "from") as daily_active_users
FROM arbitrum.transactions
WHERE
  "to" = '0xYourContractAddressHere'
  AND block_time >= NOW() - INTERVAL '30' day
GROUP BY 1
ORDER BY 1;

Running this query forms the basis of a chart in your dashboard. You can create similar queries for transaction count, volume, and gas fees.

To assess ROI quantitatively, create a dashboard panel that compares costs and benefits. Calculate the cumulative gas savings for your users by estimating the mainnet gas cost for each L2 transaction and summing the difference. Combine this with revenue metrics (if applicable) like protocol fees generated on the L2. A positive trend where cumulative savings and revenue cross the deployment cost line visually demonstrates ROI. Furthermore, track user retention cohorts to see if lower fees lead to higher repeat usage compared to mainnet, indicating improved product-market fit.
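
The cumulative-savings series behind that panel can be computed from per-transaction records, roughly as in the sketch below; the record fields are assumptions about what your pipeline stores, and the counterfactual assumes the same gas usage on mainnet, which only holds approximately for EVM-equivalent rollups:

typescript
// cumulative-savings.ts: sketch; l1GasPriceWei should come from historical mainnet gas data
interface L2TxRecord {
  gasUsed: bigint;        // gas the transaction consumed on the L2
  l2FeeWei: bigint;       // total fee actually paid on the L2 (execution + L1 data portion)
  l1GasPriceWei: bigint;  // prevailing mainnet gas price at the transaction's timestamp
}

function cumulativeSavingsEth(txs: L2TxRecord[]): number {
  let savedWei = 0n;
  for (const tx of txs) {
    // Counterfactual mainnet cost for the same gas usage, minus what was actually paid on the L2.
    const l1CostWei = tx.gasUsed * tx.l1GasPriceWei;
    if (l1CostWei > tx.l2FeeWei) savedWei += l1CostWei - tx.l2FeeWei;
  }
  return Number(savedWei) / 1e18; // wei to ETH for charting
}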

For advanced analysis, integrate off-chain data from your application's backend with on-chain data from your dashboard. Correlate on-chain deposit events with user sign-ups in your database, or track how reductions in gas fees (post-L2 deployment) affect conversion rates at key funnel steps. Set up alerts for metric anomalies, such as a sudden drop in daily transactions, which could indicate network issues or a competitor's launch. Regularly review these dashboards to inform decisions on incentive programs, feature development, and potential expansion to additional L2 networks.

LAYER 2 ANALYTICS

Optimization and Troubleshooting

Practical guidance for developers implementing and debugging analytics to track user adoption and performance on Layer 2 rollups.

If your dashboards show little or no Layer 2 activity, the cause is often querying the wrong data source. Most analytics platforms default to Ethereum Mainnet, so to track Layer 2 activity you must explicitly configure your queries to read from the rollup's specific chain ID and RPC endpoint. Work through the checks below, then run the verification sketch that follows the checklist.

Key checks:

  • Verify your provider (Alchemy, Infura, QuickNode) supports the target L2 (Arbitrum, Optimism, Base, etc.).
  • Confirm you are using the correct Chain ID (e.g., 42161 for Arbitrum One, 10 for Optimism).
  • Ensure your event indexing or subgraph is deployed to and syncing from the L2 network, not a fork or testnet.
  • For cross-chain dApps, segment your analytics by chain_id to avoid conflating Mainnet and L2 user actions.
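
A quick way to catch these misconfigurations at startup, assuming viem and RPC URLs in environment variables, is to assert that each configured endpoint reports the chain ID you expect:

typescript
// verify-chains.ts: sketch assuming viem; fails fast if an RPC URL points at the wrong network
import { createPublicClient, http } from 'viem';

const expected = [
  { name: 'ethereum', chainId: 1, rpcUrl: process.env.L1_RPC_URL! },
  { name: 'arbitrum-one', chainId: 42161, rpcUrl: process.env.ARBITRUM_RPC_URL! },
  { name: 'optimism', chainId: 10, rpcUrl: process.env.OPTIMISM_RPC_URL! },
];

for (const { name, chainId, rpcUrl } of expected) {
  const client = createPublicClient({ transport: http(rpcUrl) });
  const actual = await client.getChainId();
  if (actual !== chainId) {
    throw new Error(`${name}: expected chain ID ${chainId}, RPC returned ${actual}`);
  }
  console.log(`${name}: chain ID ${actual} OK`);
}
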
conclusion
IMPLEMENTATION SUMMARY

Conclusion and Next Steps

You have configured a robust analytics pipeline to track Layer 2 rollup adoption. This guide covered the essential steps from data sourcing to dashboard creation.

Your analytics setup should now provide a clear view of key adoption metrics like daily active addresses, transaction volume, and TVL across rollups like Arbitrum, Optimism, and Base. By connecting to data providers like Dune Analytics, The Graph, and Covalent, you have access to both aggregated dashboards and raw, queryable data. The next step is to establish a regular review cadence—weekly or monthly—to identify trends, such as spikes in activity from new dApp launches or shifts in user behavior post-protocol upgrades.

To deepen your analysis, consider implementing more advanced tracking. Set up alerts for significant metric deviations using tools like Grafana or PagerDuty. Begin correlating on-chain data with off-chain events, such as grant program announcements or major partnership reveals, to understand causality. For developers, instrumenting your own dApp with specific event logging (e.g., using a service like Mixpanel or Amplitude) can provide user journey insights that pure chain data cannot.

The field of L2 analytics is rapidly evolving. Stay updated by monitoring new data tools like Goldsky or Shadow, and protocol-specific data portals like the Optimism Superchain Explorer. Engage with the community in forums like the Dune Discord or Ethereum Magicians to share methodologies. Finally, consider open-sourcing your dashboards or queries to contribute to public knowledge, helping to standardize how the ecosystem measures growth and health across the burgeoning rollup landscape.
