
Launching a Protocol for Monitoring Daily Active Wallets (DAW)

A technical guide for developers to build a system that tracks Daily Active Wallets, a core growth metric for dApps and protocols.
WEB3 ANALYTICS

Introduction to Daily Active Wallets (DAW)

Daily Active Wallets (DAW) is a core metric for measuring user engagement and network health in blockchain ecosystems. This guide explains its definition, calculation, and significance for protocol developers.

Daily Active Wallets (DAW) is a metric that counts the number of unique wallet addresses that have signed at least one on-chain transaction within a 24-hour period. Unlike monthly active users in web2, DAW provides a high-frequency, transparent view of genuine protocol interaction directly from the blockchain ledger. It is a foundational Key Performance Indicator (KPI) for decentralized applications (dApps), Layer 1 blockchains like Ethereum and Solana, and Layer 2 networks such as Arbitrum and Optimism. Accurate DAW tracking helps teams move beyond vanity metrics like total wallets to understand real, recurring usage.

Calculating DAW requires parsing raw blockchain data to filter for unique, interacting addresses. A simple methodology involves querying transaction records, applying a date filter, and counting distinct from addresses. For example, an Ethereum DAW query in SQL might use SELECT COUNT(DISTINCT from_address) FROM transactions WHERE block_timestamp >= CURRENT_DATE. However, this basic count can be refined by excluding certain transaction types, such as simple native token transfers or failed transactions, to focus on meaningful economic activity. Advanced analytics platforms like Dune Analytics or Flipside Crypto provide pre-built dashboards for these metrics.
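A fuller version of that query, runnable from Python, might look like this (a sketch: the table, columns, and connection string are illustrative, not a fixed schema):

python
import psycopg2

# Refined DAW: distinct senders of successful transactions on the current UTC day
QUERY = """
SELECT COUNT(DISTINCT from_address) AS daw
FROM transactions
WHERE block_timestamp >= CURRENT_DATE
  AND status = 1;  -- exclude failed transactions
"""

with psycopg2.connect("dbname=analytics") as conn:  # assumed DSN
    with conn.cursor() as cur:
        cur.execute(QUERY)
        daw, = cur.fetchone()
print("Daily Active Wallets:", daw)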

For protocol teams, monitoring DAW is critical for several reasons. It serves as a direct indicator of product-market fit and user retention. A steadily growing or stable DAW suggests healthy adoption, while a declining trend can signal issues with user experience, network congestion, or competitive pressures. Furthermore, DAW data is essential for informed decision-making in areas like business development (e.g., proving traction to partners), community grants (allocating resources to high-engagement dApps), and marketing (measuring campaign effectiveness). It provides an objective benchmark against competitors in the same vertical.

Launching a protocol-specific DAW monitoring system involves several technical steps. First, you must establish a reliable data pipeline, typically using a node provider (e.g., Alchemy, Infura) or a decentralized network like The Graph to index your protocol's events. Next, you'll need to define your activity criteria: what constitutes an "active" interaction? This could be depositing into a liquidity pool, executing a swap, staking tokens, or voting in governance. Finally, you must build a dashboard for visualization, using tools like Grafana, Superset, or a custom frontend that queries your indexed data, presenting trends and cohort analyses to your team.

DAW MONITORING

Prerequisites and System Architecture

This guide outlines the technical foundation required to build a system for monitoring Daily Active Wallets (DAW), covering essential tools, data sources, and architectural components.

Before querying blockchain data, you need access to it. The core prerequisite is a reliable RPC node provider for the chains you wish to monitor. For production systems, services like Alchemy, Infura, or QuickNode are essential, as they offer high request limits, dedicated endpoints, and WebSocket support for real-time event listening. For development and testing, you can run a local node (e.g., geth for Ethereum) or use a public RPC, though rate limits will apply. You will also need a developer API key from a block explorer like Etherscan or Arbiscan to fetch historical transaction lists for an address and to retrieve verified contract ABIs, neither of which is available through standard RPC calls.
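A quick connectivity check with web3.py confirms your endpoint works before you build on it; a minimal sketch, where the URL is a placeholder for your provider endpoint and key:

python
from web3 import Web3

# Placeholder endpoint; substitute your provider URL and API key
w3 = Web3(Web3.HTTPProvider("https://eth-mainnet.g.alchemy.com/v2/<API_KEY>"))

assert w3.is_connected(), "RPC endpoint unreachable"
print("latest block:", w3.eth.block_number)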

The system architecture for DAW tracking typically follows an ETL (Extract, Transform, Load) pipeline. The Extract layer involves subscribing to new blocks via your RPC provider's WebSocket connection and fetching relevant transaction data. For smart contract interactions, you must filter transactions by the contract's address and decode the input data using its Application Binary Interface (ABI). The eth_getLogs RPC method is crucial for efficiently querying past events emitted by contracts, such as transfers or specific function calls, which directly indicate user activity.
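As a sketch of the Extract step, here is how eth_getLogs looks through web3.py, using the ERC-20 Transfer signature as the filter topic (the endpoint and contract address are placeholders):

python
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example"))  # placeholder endpoint

# topic0 is the keccak hash of the canonical event signature
TRANSFER_TOPIC = Web3.keccak(text="Transfer(address,address,uint256)").hex()

logs = w3.eth.get_logs({
    "fromBlock": w3.eth.block_number - 100,  # example: last ~100 blocks
    "toBlock": "latest",
    "address": "0xYourContract",             # placeholder contract address
    "topics": [TRANSFER_TOPIC],
})
print(f"{len(logs)} Transfer events found")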

In the Transform phase, raw transaction data is processed to identify unique wallet addresses. This involves parsing transaction from and to fields, as well as decoded log data from complex interactions like token swaps on a DEX or deposits into a lending protocol. A critical step is address normalization—converting all addresses to checksummed format (EIP-55) to avoid duplicates. You must also implement logic to filter out contract-created transactions and protocol-owned wallets to ensure you're counting genuine user activity, not system operations.
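Normalization itself is a one-liner; a minimal example with web3.py (the input address is from the EIP-55 test vectors):

python
from web3 import Web3

raw = "0xfb6916095ca1df60bb79ce92ce3ea74c37c5d359"
checksummed = Web3.to_checksum_address(raw)  # EIP-55 mixed-case form
print(checksummed)  # 0xfB6916095ca1df60bB79Ce92cE3Ea74c37c5d359

Storing everything lowercase works equally well; what matters is applying one rule consistently before any comparison or insertion.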

For the Load and analysis layer, you need a persistent data store. A time-series database like TimescaleDB (built on PostgreSQL) or InfluxDB is ideal for storing daily aggregate counts and enabling efficient time-window queries. Your application logic, likely written in Node.js, Python, or Go, will handle the ETL process, calculate the final DAW metric for a given protocol and date, and expose the data via an API. The complete architecture ensures you can track DAW trends, generate reports, and power dashboards for protocol analytics.

FOUNDATION

Step 1: Defining 'Active' for Your Protocol

The first and most critical step in building a Daily Active Wallets (DAW) monitoring protocol is establishing a precise, on-chain definition of user activity. This definition will serve as the core logic for your entire system.

A generic 'active user' metric is useless for on-chain analysis. Your protocol must define activity based on specific, verifiable on-chain actions. Common definitions include:

  • Transaction Initiator: A wallet that signs and submits a transaction, paying gas.
  • Contract Interactor: A wallet that calls a function on a specific smart contract, like a DEX or lending pool.
  • Event Emitter: A wallet that triggers a predefined on-chain event, such as an NFT transfer or a governance vote.

The choice depends on your protocol's focus. A DeFi dashboard might track interactions with a set of liquidity pools, while an NFT analytics platform would monitor transfers and sales.

Your definition must be computable from raw blockchain data. This means writing logic that processes blocks or subscribes to events. For an EVM chain, you might filter transactions by to address (a specific contract) and record the from address. For a definition based on events, you would parse logs for a specific event signature and extract the involved addresses from the indexed topics. The complexity increases for actions involving internal transactions or proxy contracts, requiring deeper trace analysis.

Consider the time window for 'daily' activity. Is a wallet counted if it performs an action at 11:59 PM and another at 12:01 AM? Most systems use UTC calendar days for simplicity, but you could implement rolling 24-hour windows. You must also handle wallet deduplication across multiple actions within the same period to avoid double-counting. A robust system stores a unique wallet address and timestamp for each qualifying action, then performs aggregation.

Here's a simplified conceptual example in JavaScript for tracking DEX swappers:

javascript
// Returns the initiating wallet address when the transaction is a qualifying
// swap on a monitored contract, otherwise null.
function isActiveWallet(tx, targetContracts, swapFunctionSelector) {
  if (!targetContracts.includes(tx.to)) return null;
  // Check if the transaction calls a 'swap' function
  if (!tx.data.startsWith(swapFunctionSelector)) return null;
  return tx.from; // the wallet address that initiated the swap
}

This logic would run for every transaction in a block, collecting unique tx.from addresses that interacted with your listed contracts via a swap.
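For event-based definitions, the same filtering applies to logs rather than transactions. A sketch with web3.py, assuming a Uniswap V2-style Swap event; the endpoint, block range, and pool address are placeholders:

python
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example"))  # placeholder endpoint

# topic0 identifies the event type; this signature follows Uniswap V2's Swap
SWAP_TOPIC = Web3.keccak(
    text="Swap(address,uint256,uint256,uint256,uint256,address)"
).hex()

logs = w3.eth.get_logs({
    "fromBlock": 19_000_000,        # example block range
    "toBlock": 19_000_100,
    "address": "0xYourPool",        # placeholder pool address
    "topics": [SWAP_TOPIC],
})

# The indexed sender sits in topics[1], left-padded to 32 bytes;
# the last 40 hex characters are the address.
active = {"0x" + log["topics"][1].hex()[-40:] for log in logs}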

Finally, document your definition clearly. It should answer: "What exact on-chain signature constitutes an 'action'?" and "Over what time period is activity aggregated?" This clarity is essential for users to trust your metrics and for developers to build on your data. A vague definition leads to inconsistent data and makes your protocol's output unreliable for serious analysis.

METRICS

Common DAW Activity Definitions

Standardized definitions for wallet activity used by analytics platforms to calculate Daily Active Wallets (DAW).

| Activity Type | Definition | Chainscore Default | Common Alternatives |
| --- | --- | --- | --- |
| Transaction Sender | Wallet that signs and submits a transaction to the network. | | |
| Transaction Receiver | Wallet that is the primary recipient in a transaction (e.g., the to address). | | |
| Contract Interactor | Wallet that calls a function on a smart contract. | | |
| Token Transfer | Wallet that sends or receives an ERC-20, ERC-721, or ERC-1155 token. | | |
| Gas Fee Payer | Wallet that pays the gas fees for a transaction, which may differ from the sender (e.g., via a relayer). | | |
| Internal TX Participant | Wallet involved in a transaction triggered by a smart contract (internal call). | | |
| Event Emitter | Wallet associated with an emitted event log, even without a direct transaction. | | |
| Minimum Value Threshold | Filters out activity below a specific USD value (e.g., $1). | $1.00 | Varies by platform |

ARCHITECTURE

Step 2: Building the Data Ingestion Pipeline

A robust data pipeline is the core of any monitoring protocol. This step details how to collect, process, and structure on-chain data for Daily Active Wallet (DAW) analysis.

The first architectural decision is choosing your data source. You can index raw blockchain data directly using an RPC node (e.g., Geth, Erigon) or leverage a specialized indexer like The Graph, SubQuery, or Goldsky. Direct RPC access offers maximum control but requires significant infrastructure to handle historical queries and chain reorganizations. Indexers provide pre-processed, queryable data via GraphQL APIs, drastically reducing engineering overhead at the cost of relying on a third-party service and potential indexing lag.

For a DAW metric, you need to capture all transactions and their signers. Your pipeline must filter for successful transactions (status: 1 on EVM chains) and extract the from address. A critical consideration is handling internal transactions (calls from smart contracts), as they represent user activity but are not in the standard transaction receipt. Services like Etherscan's API or Alchemy's alchemy_getAssetTransfers can provide this data. Implement deduplication logic to ensure a wallet interacting with multiple dApps in a day is counted only once.

Here is a conceptual code snippet for a simple EVM ingestion service using ethers.js and a hypothetical database:

javascript
// ethers v5; `db` is assumed to be your persistence layer
const { ethers } = require("ethers");

const provider = new ethers.providers.JsonRpcProvider(RPC_URL);
const latestBlock = await provider.getBlockNumber();
const block = await provider.getBlockWithTransactions(latestBlock);

// Derive the activity date from the block timestamp (seconds), not the
// wall clock, so backfilled blocks land on the correct day.
const date = new Date(block.timestamp * 1000).toISOString().split('T')[0];

for (const tx of block.transactions) {
    const receipt = await provider.getTransactionReceipt(tx.hash);
    if (receipt.status === 1) { // count only successful transactions
        // Store unique wallet-day pair
        await db.upsertActiveWallet({
            address: tx.from,
            date,
            txHash: tx.hash
        });
    }
}

This naive example polls the latest block; a production system would use a block subscription or backfill from a specific start block.

Data validation and schema design are paramount. Your database schema should efficiently store wallet-activity pairs. A simple table might include wallet_address, date (as a DATE type), chain_id, and last_tx_hash. Use composite unique constraints on (wallet_address, date, chain_id) to prevent duplicates. Consider partitioning tables by date for faster queries on time ranges. For cross-chain protocols, you must normalize addresses (e.g., to checksum format) and tag data with its origin chain.
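A minimal sketch of such a schema and its idempotent insert, assuming PostgreSQL (table and column names are illustrative):

python
import psycopg2

# Composite uniqueness makes re-ingestion idempotent: replaying a block
# cannot double-count a wallet for a given day and chain.
DDL = """
CREATE TABLE IF NOT EXISTS active_wallets (
    wallet_address TEXT    NOT NULL,
    date           DATE    NOT NULL,
    chain_id       INTEGER NOT NULL,
    last_tx_hash   TEXT,
    UNIQUE (wallet_address, date, chain_id)
);
"""

UPSERT = """
INSERT INTO active_wallets (wallet_address, date, chain_id, last_tx_hash)
VALUES (%s, %s, %s, %s)
ON CONFLICT (wallet_address, date, chain_id) DO NOTHING;
"""

conn = psycopg2.connect("dbname=daw")  # connection string is an assumption
with conn, conn.cursor() as cur:
    cur.execute(DDL)
    cur.execute(UPSERT, ("0xfb6916095ca1df60bb79ce92ce3ea74c37c5d359",
                         "2024-05-26", 1, "0xabc123"))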

Finally, plan for resilience. Your pipeline needs error handling for RPC rate limits, intermittent failures, and chain reorgs. Implement retry logic with exponential backoff and a dead-letter queue for problematic transactions. Monitor pipeline health with metrics like blocks processed per second, ingestion latency, and error rates. Tools like Prometheus for metrics and Grafana for dashboards are standard here. The output of this step is a clean, queryable dataset of daily wallet activity, ready for the aggregation and analysis in Step 3.
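A minimal backoff helper makes the retry advice concrete (a sketch; production code would catch provider-specific exceptions and route permanent failures to a dead-letter queue):

python
import random
import time

def with_retries(fn, max_attempts=5, base_delay=1.0):
    """Run fn(), retrying with exponential backoff plus jitter."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # give up; hand the work item to a dead-letter queue
            time.sleep(base_delay * 2 ** attempt + random.random())

# Usage (w3 is a connected web3.py instance):
# block = with_retries(lambda: w3.eth.get_block(19_000_000))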

DATA PIPELINE

Step 3: Address Deduplication and Processing Logic

This step transforms raw blockchain data into a clean dataset of unique daily active wallets (DAW), the core metric for protocol health and user engagement analysis.

After collecting raw transaction logs from the previous step, the data contains significant noise. A single user interacting with your protocol will generate multiple transaction records with the same from address. The primary goal of deduplication is to collapse all activity from a single address within a 24-hour window into a single count. This is typically done by extracting the from field from each transaction, converting timestamps to a standard date format (e.g., UTC day), and creating a unique key combining address and date. For example, if address 0xabc... made 5 swaps and 2 deposits on May 26, 2024, it should count as one active wallet for that day.

The processing logic must also filter for protocol-specific interactions. Not all transactions on a smart contract are equal. You need to identify which function calls or event logs signify a genuine user action. For a DEX, this might be the swap function; for a lending protocol, supply or borrow. Using the contract's Application Binary Interface (ABI), you can decode transaction input data and log events to filter for these specific signatures. This ensures you're counting engaged users, not just failed transactions or internal contract calls.
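With web3.py this decoding is built in; a sketch where POOL_ADDRESS, POOL_ABI, the transaction hash, and record_activity are all placeholders for your own protocol and pipeline:

python
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example"))  # placeholder endpoint
contract = w3.eth.contract(address=POOL_ADDRESS, abi=POOL_ABI)  # your contract

tx = w3.eth.get_transaction("0x...")  # placeholder transaction hash
func, args = contract.decode_function_input(tx["input"])

# Count only function calls that qualify as genuine user actions
if func.fn_name in ("swap", "supply", "borrow"):
    record_activity(tx["from"])  # hypothetical downstream handler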

Implementing this requires a robust data pipeline. A common approach uses a script or service (e.g., in Python or Node.js) that: 1) Fetches raw block data from an RPC provider or indexer like The Graph, 2) Applies the ABI-based filtering, 3) Extracts and hashes address-date pairs, and 4) Inserts only unique pairs into a database table. Here's a simplified Python logic snippet for deduplication:

python
unique_daw_set = set()
for tx in transactions:
    if is_protocol_interaction(tx):
        # Normalize case so the same wallet is never counted twice
        address = tx['from'].lower()
        date = timestamp_to_utc_date(tx['timeStamp'])
        unique_daw_set.add((address, date))
daily_active_wallets = len(unique_daw_set)

Edge cases are critical for accuracy. You must handle:

  • Contract wallets and smart accounts (e.g., Safe, Argent), where the from address is a proxy; the true user-owned address may be in a log.
  • Batch transactions from relayers or bundlers, which may contain multiple user operations in a single TX.
  • Internal transactions (calls between contracts) that shouldn't count as user activity.

Addressing these often requires deeper log analysis beyond the top-level transaction object, potentially tracking msg.sender through call chains.

The output of this step is a clean, time-series dataset. Each record typically contains the date and the count of unique addresses that interacted with the protocol. This dataset powers the next steps: calculating trends (like week-over-week growth), segmenting users (new vs. returning), and feeding into dashboards. Accurate deduplication is non-negotiable; overcounting inflates metrics, while undercounting misses ecosystem vitality. This processed DAW metric becomes a fundamental Key Performance Indicator (KPI) for your protocol.

ANALYTICS LAYER

Step 4: Implementing Segmentation and Trend Analysis

Move beyond basic counts by categorizing wallet activity and identifying behavioral patterns to generate actionable insights.

Raw Daily Active Wallet (DAW) counts provide a high-level health metric, but segmentation reveals the underlying composition of your user base. By applying filters to your on-chain data, you can isolate specific cohorts such as new users (first-time interactors), returning users, high-value users (based on transaction volume or frequency), and contract-specific users (e.g., those only using your staking pool). This is achieved by querying historical transaction data and applying logic like MIN(timestamp) per address or summing value transferred. Tools like the Chainscore API or Dune Analytics enable these queries with SQL or GraphQL.

With cohorts defined, trend analysis tracks their behavior over time. The key is to monitor not just totals, but rates of change and retention. Calculate week-over-week (WoW) or month-over-month (MoM) growth for each segment. A protocol might see total DAW rising, but trend analysis could reveal that growth is solely from low-value, one-time users while core user retention is falling—a critical strategic insight. Implementing this requires time-series data storage (e.g., in a PostgreSQL database) and visualization through dashboards in Grafana or Superset to plot metrics like new_user_retention_day_7.
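For example, week-over-week growth can be computed with a single window function, assuming a daily_daw table holding one row per day (a sketch; the schema is illustrative):

python
# Rows must be contiguous by day; fill gaps with zero-count days first.
# Execute with your PostgreSQL client of choice.
WOW_GROWTH = """
SELECT
    date,
    daw_count,
    daw_count::float / NULLIF(LAG(daw_count, 7) OVER (ORDER BY date), 0) - 1
        AS wow_growth
FROM daily_daw
ORDER BY date;
"""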

For developers, implementing this often involves a scheduled job (e.g., a Cron job or AWS Lambda) that runs a daily analysis script. This script would query your indexed blockchain data, apply segmentation logic, calculate trends against the previous period's data, and write the results to an analytics database. A simple Python pseudocode step might be: segments = {'new': first_tx_date == today, 'returning': first_tx_date < today}. The output is a structured dataset ready for your dashboard or alerting system.
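A self-contained sketch of that segmentation logic using pandas (the sample data and column names are illustrative):

python
import pandas as pd

# One row per transaction: wallet plus UTC activity date (assumed schema)
txs = pd.DataFrame({
    "wallet": ["0xaaa", "0xbbb", "0xaaa", "0xccc"],
    "date": pd.to_datetime(["2024-05-25", "2024-05-25",
                            "2024-05-26", "2024-05-26"]),
})

today = pd.Timestamp("2024-05-26")
first_seen = txs.groupby("wallet")["date"].min()  # first_tx_date per wallet

active_today = txs.loc[txs["date"] == today, "wallet"].unique()
segments = {
    "new": [w for w in active_today if first_seen[w] == today],
    "returning": [w for w in active_today if first_seen[w] < today],
}
print(segments)  # {'new': ['0xccc'], 'returning': ['0xaaa']}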

Actionable outputs from this step include identifying which product features drive returning users, detecting the early decay of a marketing campaign's user cohort, or spotting anomalous growth in a specific segment that may indicate a bot attack. By moving from "we have 10,000 DAW" to "we have 2,000 new users, with 40% retained from last week's campaign, and our core user segment of 500 wallets transacted 5x more this month," you gain the precision needed for data-driven protocol development and marketing.

ANALYTICS FRAMEWORK

Key DAW Segmentation Metrics and Queries

Comparison of core metrics and SQL-like queries for segmenting Daily Active Wallets across different analytical approaches.

| Metric / Query | Basic Count | Segmented Analysis | Advanced Cohort |
| --- | --- | --- | --- |
| Unique Wallets (24h) | COUNT(DISTINCT wallet) | COUNT(DISTINCT wallet) WHERE chain = 'ethereum' | COUNT(DISTINCT wallet) WHERE first_tx_date > NOW() - INTERVAL '30 days' |
| Transaction Volume | SUM(tx_value_usd) | SUM(tx_value_usd) GROUP BY protocol | AVG(tx_value_usd) OVER (PARTITION BY wallet) AS avg_wallet_size |
| New vs. Returning | CASE WHEN first_seen_date = CURRENT_DATE THEN 'new' ELSE 'returning' END | DATEDIFF('day', first_seen_date, CURRENT_DATE) AS user_age_days | |
| Activity Frequency | COUNT(tx_hash) | PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY daily_tx_count) | LAG(active_date, 1) OVER (PARTITION BY wallet ORDER BY active_date) AS last_active |
| Cross-Chain Users | | | |
| Gas Fee Analysis | AVG(gas_paid_usd) | SUM(gas_paid_usd) WHERE tx_status = 'success' | CORR(tx_value_usd, gas_paid_usd) AS value_gas_correlation |
| Protocol Stickiness | COUNT(DISTINCT date) / 30 AS monthly_activity_rate | MAX(sequence_days) FROM wallet_streaks WHERE active = true | |
| Query Latency | < 1 sec | 2-5 sec | 5-30 sec |

DAW MONITORING STACK

Tools and Infrastructure Resources

Essential tools and frameworks for building a robust system to track Daily Active Wallets (DAW) across blockchains.


Implementation Architecture

A typical DAW pipeline involves: 1. Data Ingestion (Node RPCs, Indexers), 2. Event Processing (Filter transactions, deduplicate wallets), 3. Storage (Time-series DB like TimescaleDB), and 4. Serving (API layer).

  • Key Consideration: Define "active"—is it any transaction, a specific contract interaction, or a gas-spending event?
  • Deduplication: Use a daily timestamp window per wallet address across all chains.
  • Tools: Combine The Graph for indexing, a message queue for processing, and PostgreSQL for storage.
DAW PROTOCOL

Optimization, Scaling, and FAQ

Answers to common technical questions and best practices for building and scaling a Daily Active Wallets (DAW) monitoring system.

Why does your DAW count differ from other analytics platforms?

Discrepancies in Daily Active Wallet (DAW) counts are common and stem from methodological differences. Key factors include:

  • Definition of "Active": Your protocol might count a wallet that signs any transaction, while others may require a value transfer or a specific contract interaction.
  • Data Sources: Are you using a full archive node, a provider like Alchemy or QuickNode, or The Graph subgraphs? Each has different indexing speeds and completeness.
  • Filtering Logic: Differences in filtering out bot activity, internal contract calls (e.g., token approvals), or failed transactions will change the final number.
  • Time Window: Using UTC vs. rolling 24-hour windows creates variance.

For consistency, document your methodology clearly and consider calculating multiple metrics (e.g., unique signers, unique receivers) for comparison.
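One lightweight way to make those comparisons is to compute several definitions over the same table in one query (a sketch; the schema is illustrative):

python
# Divergence between these columns usually points to a methodology gap,
# not a data problem.
COMPARE_DEFINITIONS = """
SELECT
    date_trunc('day', block_timestamp) AS day,
    COUNT(DISTINCT from_address) AS unique_signers,
    COUNT(DISTINCT to_address)   AS unique_receivers
FROM transactions
WHERE status = 1
GROUP BY 1
ORDER BY 1;
"""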

NEXT STEPS

Conclusion and Extending the System

This guide has covered building a core system for monitoring Daily Active Wallets (DAW). The final step is launching your protocol and exploring advanced extensions.

To launch your DAW monitoring protocol, you need to deploy the smart contracts to a live network like Ethereum mainnet, Arbitrum, or Polygon. Use a deployment script with Hardhat or Foundry, ensuring you set the correct constructor parameters for your initial threshold and updateInterval. After deployment, verify the contract source code on a block explorer like Etherscan. The final step is to integrate your frontend dashboard with the live contract address and a production RPC provider like Alchemy or Infura, allowing users to interact with the protocol in real-time.

The basic system can be extended in several key ways. Implementing a multi-chain aggregator would allow the protocol to track DAW across multiple Layer 2s and sidechains, providing a unified view of user activity. You could also add role-based access control using OpenZeppelin's AccessControl library, enabling features like a governance council to adjust the activity threshold or an admin to pause the contract in an emergency. For richer data, consider emitting more granular events, such as WalletActivityDetailed, which could log the specific contract interactions or transaction values that triggered the activity.

For advanced analytics, integrate off-chain data indexing. A subgraph on The Graph protocol can index your contract's events to enable complex queries, like calculating weekly active users or tracking cohort retention. Alternatively, you could build a backend service that listens to events and writes aggregated data to a database for custom dashboards. To enhance security and automation, explore using Chainlink Automation or Gelato to trigger the daily snapshotAndReset function reliably without relying on a manual transaction, ensuring the protocol's state updates are always on schedule.