Setting Up On-Chain Analytics for Treasury Monitoring

A technical guide to building a dashboard that aggregates data from block explorers, DeFi protocols, and oracles to monitor treasury health, wallet balances, and yield accrual.
introduction
TUTORIAL

Introduction

A practical guide to building a real-time dashboard for tracking DAO or protocol treasury assets, liabilities, and financial health using on-chain data.

On-chain treasury monitoring provides real-time transparency into a protocol's financial position. Unlike traditional finance, which relies on periodic quarterly reports, blockchain data is public and verifiable at any time. This allows stakeholders to track assets across multiple chains, monitor liabilities like token vesting schedules, and assess risk exposure directly. For DAOs and DeFi protocols, this is critical for governance decisions, budgeting, and maintaining community trust. The foundation is querying data from blockchain nodes and indexing services like The Graph or Covalent.

The first step is defining your treasury's scope. A comprehensive view must account for:

  • Native tokens (e.g., ETH, MATIC) held in wallets
  • ERC-20 tokens, including stablecoins and LP positions
  • Non-fungible tokens (NFTs) representing assets or collateral
  • Liabilities such as outstanding loans or unlocked token grants

You'll need the contract addresses for all assets and the wallet addresses (EOAs or multisigs) controlled by the treasury. Tools like Etherscan or Dune Analytics can help discover these addresses.

To collect the data, you interact with blockchain nodes. For balances, use the eth_getBalance RPC call for native tokens and the balanceOf function on ERC-20 contracts. For historical data and complex queries, use a subgraph on The Graph. Here's a basic JavaScript example using ethers.js to fetch two asset balances:

javascript
const { ethers } = require('ethers'); // ethers v5 API
const provider = new ethers.providers.JsonRpcProvider(RPC_URL); // RPC_URL, addresses defined elsewhere
const ethBalance = await provider.getBalance(treasuryAddress); // native ETH balance
const erc20Abi = ['function balanceOf(address) view returns (uint256)']; // minimal ERC-20 ABI
const contract = new ethers.Contract(usdcAddress, erc20Abi, provider);
const usdcBalance = await contract.balanceOf(treasuryAddress); // raw USDC balance (6 decimals)

Raw balance data needs context to be useful. You must apply pricing oracles like Chainlink's price feeds to convert token amounts into a common unit (e.g., USD). Calculate Total Value Locked (TVL) by summing the USD value of all assets. Track changes over time to visualize inflows and outflows. It's also essential to monitor liquidity depth for treasury assets; holding a large position in a low-liquidity token poses a significant market risk if it needs to be sold.
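
As a minimal sketch of that conversion step (assuming the same ethers v5 setup as above and Chainlink's ETH/USD aggregator on Ethereum Mainnet; verify the feed address against Chainlink's documentation before relying on it):

javascript
// Sketch: price the treasury's native ETH in USD via a Chainlink feed (ethers v5).
const { ethers } = require('ethers');

const provider = new ethers.providers.JsonRpcProvider(RPC_URL); // RPC_URL defined elsewhere
const ETH_USD_FEED = '0x5f4eC3Df9cbd43714FE2740f5E3616155c5b8419'; // Chainlink ETH/USD, Ethereum Mainnet
const feedAbi = [
  'function latestRoundData() view returns (uint80, int256 answer, uint256, uint256 updatedAt, uint80)',
  'function decimals() view returns (uint8)'
];

async function ethPositionUsd(treasuryAddress) {
  const feed = new ethers.Contract(ETH_USD_FEED, feedAbi, provider);
  const [balance, round, decimals] = await Promise.all([
    provider.getBalance(treasuryAddress),
    feed.latestRoundData(),
    feed.decimals()
  ]);
  const ethAmount = Number(ethers.utils.formatEther(balance));
  const ethPrice = Number(ethers.utils.formatUnits(round.answer, decimals));
  return ethAmount * ethPrice; // USD value of the native ETH position
}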

For a production system, automate data ingestion and build a dashboard. Set up a cron job or use a serverless function to periodically fetch balances and prices, storing them in a database. Visualization tools like Grafana or a custom React app can display key metrics:

  • Treasury Composition: a pie chart of assets
  • Net Flow: a line chart of daily change
  • Concentration Risk: exposure to single assets

Always include links to on-chain verification, such as the Etherscan page for the main treasury wallet.

Advanced monitoring includes setting up alerts for significant transactions (e.g., large withdrawals) using services like OpenZeppelin Defender or Tenderly. Regularly audit the treasury's smart contract risk—ensure wallet signers are secure and asset contracts aren't vulnerable. By implementing this system, teams move from reactive accounting to proactive financial management, enabling data-driven decisions on grants, investments, and runway planning based on a live, verifiable view of capital.

prerequisites
TREASURY MONITORING

Prerequisites and Setup

This guide outlines the technical foundation required to build a system for on-chain treasury monitoring, covering essential tools, data sources, and initial configuration.

Effective on-chain treasury monitoring begins with establishing a reliable data pipeline. The core prerequisite is access to blockchain data, typically achieved via a node provider or a blockchain indexer. For Ethereum and EVM-compatible chains, services like Alchemy, Infura, or QuickNode provide high-availability RPC endpoints. For more complex querying of historical data, including specific event logs and aggregated balances, an indexer like The Graph or Covalent is essential. You will need to create an account with one of these services and obtain an API key for programmatic access.

Your development environment must be configured to interact with these data sources. This requires a working knowledge of a programming language like JavaScript/TypeScript (with ethers.js or viem) or Python (with web3.py). Install the necessary packages, for example, npm install ethers or pip install web3. You should also be familiar with basic smart contract interaction, as you will need contract ABIs to decode transaction data and event logs for wallets you are monitoring. Tools like Etherscan's contract verification can provide these ABIs.

Define the scope of your monitoring by identifying the treasury addresses and assets to track. This includes the organization's primary multi-signature wallets (e.g., Safe, formerly Gnosis Safe), DAO treasuries (e.g., governed by Governor contracts), and associated DeFi positions. Create a structured list of these addresses and the tokens they hold, including both native assets (ETH, MATIC) and standard tokens (ERC-20, ERC-721). This list will serve as the input for your data-fetching scripts.

Finally, set up a basic script to test your connectivity and data fetching. A simple initial task is to query the current ETH balance and the latest transactions for a single address. This verifies your RPC connection, API keys, and library setup. From this foundation, you can expand to tracking token balances, parsing specific transaction types, and calculating aggregate metrics across all defined treasury addresses.
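
A minimal version of that check, assuming ethers v5 and a provider RPC URL in an environment variable, might look like the following (a full transaction list usually requires an explorer or indexer API, so the transaction count stands in for it here):

javascript
// Sketch: verify RPC connectivity and pull basic data for one treasury address (ethers v5).
const { ethers } = require('ethers');

const provider = new ethers.providers.JsonRpcProvider(process.env.RPC_URL);
const treasury = '0x...'; // replace with a real treasury address

async function main() {
  const [block, balance, txCount] = await Promise.all([
    provider.getBlockNumber(),              // confirms the endpoint responds
    provider.getBalance(treasury),          // current native ETH balance
    provider.getTransactionCount(treasury)  // outgoing tx count as a rough activity check
  ]);
  console.log(`Block ${block}: ${ethers.utils.formatEther(balance)} ETH, ${txCount} txs sent`);
}

main().catch(console.error);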

architecture-overview
GUIDE

System Architecture Overview

A technical blueprint for building a robust, real-time on-chain analytics system to monitor DAO or protocol treasuries.

An effective on-chain treasury monitoring system is built on a modular architecture that ingests, processes, and visualizes blockchain data. The core components are a data ingestion layer (indexers or RPC nodes), a processing and storage engine (databases like PostgreSQL with TimescaleDB), and an application layer (APIs and dashboards). This separation of concerns ensures scalability, allowing you to swap out components—like moving from a centralized indexer to The Graph's decentralized protocol—without rebuilding the entire system. The primary goal is to transform raw, sequential blockchain data into structured, queryable insights on treasury balances, transaction flows, and asset composition.

The data ingestion layer is the foundation. You can start with direct RPC calls from providers like Alchemy or Infura to fetch current balances and recent transactions for treasury addresses. For historical analysis and complex event tracking, you will need an indexing strategy. This involves listening for specific event logs (e.g., Transfer, Approval) from ERC-20 contracts and decoding them. Libraries like Ethers.js or Viem are essential here. For example, to track USDC inflows to a treasury, you would filter for Transfer events where the to address matches your treasury wallet and the contract address is the USDC token on the relevant chain (e.g., 0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48 on Ethereum Mainnet).
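
A sketch of that filter with ethers v5, using the USDC address quoted above and a hypothetical treasury address variable, might look like this:

javascript
// Sketch: list USDC inflows to a treasury by filtering ERC-20 Transfer events (ethers v5).
const { ethers } = require('ethers');

const provider = new ethers.providers.JsonRpcProvider(process.env.RPC_URL);
const USDC = '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'; // USDC on Ethereum Mainnet
const erc20Abi = ['event Transfer(address indexed from, address indexed to, uint256 value)'];

async function usdcInflows(treasuryAddress, fromBlock, toBlock) {
  const usdc = new ethers.Contract(USDC, erc20Abi, provider);
  // Only Transfer events whose indexed `to` topic matches the treasury
  const filter = usdc.filters.Transfer(null, treasuryAddress);
  const logs = await usdc.queryFilter(filter, fromBlock, toBlock);
  return logs.map((log) => ({
    txHash: log.transactionHash,
    from: log.args.from,
    amount: ethers.utils.formatUnits(log.args.value, 6) // USDC uses 6 decimals
  }));
}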

Once data is ingested, it must be normalized and stored. A time-series database like TimescaleDB (a PostgreSQL extension) is ideal for storing balance snapshots and transaction histories, enabling efficient queries like "show me the daily change in ETH balance over the last 30 days." Your processing logic should calculate derived metrics such as total portfolio value (by fetching real-time prices from an oracle like Chainlink or a DEX API), asset diversification ratios, and net flow (inflows minus outflows). This logic is typically implemented in a backend service (using Node.js, Python, or Go) that periodically runs jobs to update these metrics.
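
The derived-metric logic itself can stay simple; here is a hedged sketch in plain Node.js, where the `positions` and `flows` input shapes are assumptions about what your ingestion layer produces:

javascript
// Sketch: derived treasury metrics over already-normalized inputs (shapes are assumptions).

function portfolioValueUsd(positions) {
  // positions: [{ symbol, amount, priceUsd }]
  return positions.reduce((sum, p) => sum + p.amount * p.priceUsd, 0);
}

function netFlowUsd(flows, sinceTimestamp) {
  // flows: [{ timestamp, direction: 'in' | 'out', valueUsd }]
  return flows
    .filter((f) => f.timestamp >= sinceTimestamp)
    .reduce((sum, f) => sum + (f.direction === 'in' ? f.valueUsd : -f.valueUsd), 0);
}

function concentrationRisk(positions) {
  // Largest single-asset share of total portfolio value
  const total = portfolioValueUsd(positions);
  if (total === 0) return 0;
  return Math.max(...positions.map((p) => p.amount * p.priceUsd)) / total;
}

module.exports = { portfolioValueUsd, netFlowUsd, concentrationRisk };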

The application layer exposes this data for consumption. A REST or GraphQL API built with frameworks like Express.js or FastAPI serves the processed data to frontend dashboards. For visualization, libraries like Recharts or D3.js can create charts for asset allocation, cash flow timelines, and multi-chain balance summaries. Security is critical at this layer; implement authentication and rate limiting for your API. Furthermore, for proactive monitoring, integrate alerting systems (e.g., via Discord or Telegram webhooks) to notify stakeholders of large, unexpected transactions or if a treasury balance falls below a defined threshold.
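
As a sketch of that API layer (Express assumed; the in-memory `db` stand-in is hypothetical, and the paths match the endpoints the dashboard consumes later in this guide):

javascript
// Sketch: minimal Express API serving processed treasury metrics.
const express = require('express');
const app = express();

// db is a stand-in for your metrics store; replace with real queries (e.g. TimescaleDB).
const db = {
  balanceHistory: async (days) => [],   // [{ date, valueUsd }]
  currentAllocation: async () => [],    // [{ symbol, valueUsd, share }]
  transactions: async (page) => []      // decoded, paginated transaction rows
};

app.get('/treasury/history', async (req, res) => {
  const days = Number(req.query.days || 30);
  res.json(await db.balanceHistory(days)); // daily balance snapshots
});

app.get('/treasury/allocation', async (_req, res) => {
  res.json(await db.currentAllocation());
});

app.get('/transactions', async (req, res) => {
  const page = Number(req.query.page || 1);
  res.json(await db.transactions(page));
});

app.listen(3000);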

A practical next step is to deploy this architecture. You can containerize each service (indexer, processor, API) using Docker and orchestrate them with Docker Compose for development or Kubernetes for production. This guide will proceed by detailing the implementation of each layer, providing code snippets for setting up an indexer, designing the database schema, and building a simple dashboard. The final system will provide a real-time, customizable view of treasury health, which is fundamental for informed governance and financial decision-making in any Web3 organization.

data-sources
TREASURY MONITORING

Core Data Sources and APIs

Essential tools and data endpoints for building a comprehensive on-chain treasury dashboard. These sources provide the raw data for tracking assets, transactions, and protocol health.

backend-setup
ARCHITECTURE

Step 1: Backend Data Pipeline Setup

Build a robust data ingestion system to collect, process, and structure on-chain treasury data from multiple sources.

The foundation of any effective treasury monitoring system is a reliable data pipeline. This backend service is responsible for the continuous extraction of raw blockchain data, transforming it into a structured format, and loading it into a queryable database. For a multi-chain treasury, you must ingest data from various sources: direct RPC node connections for real-time block data, indexing services like The Graph for historical event logs, and price oracles for asset valuation. The primary challenge is handling the volume and velocity of blockchain data while ensuring data integrity and consistency across chains.

A common architectural pattern uses a message queue like Apache Kafka or RabbitMQ to decouple data ingestion from processing. Listeners (producers) subscribe to specific on-chain events—such as new blocks on Ethereum Mainnet or transactions on an Arbitrum sequencer—and publish the raw data to the queue, where downstream consumers pick it up for processing. This allows your system to scale processing for different chains independently and provides resilience; if the processor for Polygon data fails, the Avalanche pipeline remains unaffected. Each chain's RPC endpoint and required contract ABIs should be managed in a configuration file, not hardcoded.
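
A hedged example of such a configuration file (the file name, fields, and confirmation counts are illustrative, not prescribed):

javascript
// chains.config.js — illustrative shape; keep RPC URLs, ABIs, and token metadata out of code.
module.exports = {
  ethereum: {
    rpcUrl: process.env.ETH_RPC_URL,
    confirmations: 12, // example reorg-safety depth
    tokens: {
      USDC: { address: '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48', decimals: 6 }
    },
    abiDir: './abis/ethereum'
  },
  arbitrum: {
    rpcUrl: process.env.ARB_RPC_URL,
    confirmations: 20, // example value; tune per chain
    tokens: {},
    abiDir: './abis/arbitrum'
  }
};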

The transformation layer is where raw data becomes actionable. This involves parsing transaction logs using contract ABIs to decode complex events like Transfer, Swap, or Deposit. For treasury analysis, you must normalize this data: converting token amounts using their decimals, applying the correct USD price at the block's timestamp, and mapping contract addresses to human-readable asset names (e.g., 0xA0b8... to USDC). This processed data is then loaded into a time-series database like TimescaleDB or a data warehouse like Google BigQuery, which are optimized for the aggregations and time-window queries essential for financial reporting.
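
A sketch of that decode-and-normalize step with ethers v5 (the token metadata map and the `priceAt` price-lookup helper are hypothetical pieces of your own pipeline):

javascript
// Sketch: decode a raw Transfer log and normalize it for storage (ethers v5).
const { ethers } = require('ethers');

const erc20Iface = new ethers.utils.Interface([
  'event Transfer(address indexed from, address indexed to, uint256 value)'
]);

// tokenMeta: { [contractAddress]: { symbol, decimals } }
// priceAt: hypothetical helper returning the USD price of a token at a given timestamp
async function normalizeTransfer(rawLog, blockTimestamp, tokenMeta, priceAt) {
  const parsed = erc20Iface.parseLog(rawLog);           // decode topics + data via the ABI
  const meta = tokenMeta[rawLog.address.toLowerCase()]; // e.g. 0xA0b8... -> { symbol: 'USDC', decimals: 6 }
  const amount = Number(ethers.utils.formatUnits(parsed.args.value, meta.decimals));
  return {
    token: meta.symbol,
    from: parsed.args.from,
    to: parsed.args.to,
    amount,
    valueUsd: amount * (await priceAt(meta.symbol, blockTimestamp)),
    timestamp: blockTimestamp
  };
}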

Implementing robust error handling and monitoring is non-negotiable. Your pipeline should log processing failures (e.g., a missed block, a failed RPC call) to a dedicated table and include automatic retry logic with exponential backoff. Use a monitoring stack like Prometheus and Grafana to track key metrics: ingestion latency, block processing rate, and database queue depth. For development, you can start with a simpler batch-based approach using the Ethers.js or Viem libraries to fetch logs for a range of blocks, but design with the eventual need for real-time streaming in mind.

frontend-dashboard
IMPLEMENTATION

Step 2: Building the Frontend Dashboard

This guide covers building a React-based dashboard to visualize on-chain treasury data, connecting the backend API to a user interface with charts and real-time metrics.

Start by initializing a new React application using a modern framework like Next.js or Vite. Install essential dependencies for data fetching and visualization: axios or swr for API calls, recharts or victory for charting, and ethers.js or viem for any direct wallet interactions. Structure your project with clear components: a main dashboard layout, individual metric cards, chart components for historical data, and a table for transaction listings. Use a state management library like Zustand or React Query to efficiently cache and sync data from your backend API.

The core of the dashboard is fetching and displaying data from the endpoints built in Step 1. Create a service layer, for example apiService.js, that defines functions to call your backend: fetchTreasuryBalance(), fetchTransactionHistory(), fetchTokenAllocation(). Implement these calls using swr for automatic revalidation or React Query for robust caching and error handling. This ensures the UI reflects near real-time on-chain state without manual refreshes. Display key metrics—like total value, asset distribution, and recent inflow/outflow—in a grid of summary cards at the top of the dashboard.
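
A sketch of that service layer using swr (the base URL environment variable is an assumption; the paths follow the endpoints described in this guide):

javascript
// apiService.js — sketch: thin fetch layer plus swr hooks for the dashboard.
import useSWR from 'swr';

const BASE_URL = process.env.NEXT_PUBLIC_API_URL || 'http://localhost:3000'; // assumed env var
const fetcher = (path) => fetch(`${BASE_URL}${path}`).then((res) => res.json());

export function useTreasuryHistory(days = 30) {
  // refreshInterval keeps the chart tracking near real-time state without manual refreshes
  return useSWR(`/treasury/history?days=${days}`, fetcher, { refreshInterval: 60000 });
}

export function useTokenAllocation() {
  return useSWR('/treasury/allocation', fetcher);
}

export function useTransactions(page = 1) {
  return useSWR(`/transactions?page=${page}`, fetcher);
}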

For data visualization, use a library like Recharts to render time-series charts. Plot historical balance data from the /treasury/history endpoint to show value trends over days or weeks. Create a pie or bar chart to visualize the token allocation data from /treasury/allocation. Ensure charts are interactive, allowing users to hover for precise values. For the transaction table, fetch data from /transactions and display columns for timestamp, transaction hash (linked to a block explorer like Etherscan), value, and counterparty address. Implement pagination or infinite scroll for handling large datasets.
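
A minimal Recharts component for the historical balance chart could look like the following (it reuses the hypothetical hooks sketched above, and the `date`/`valueUsd` field names are assumptions about your API's response shape):

javascript
// TreasuryHistoryChart.jsx — sketch of a time-series chart over /treasury/history data.
import { LineChart, Line, XAxis, YAxis, Tooltip, ResponsiveContainer } from 'recharts';
import { useTreasuryHistory } from './apiService';

export default function TreasuryHistoryChart({ days = 30 }) {
  const { data, error } = useTreasuryHistory(days);
  if (error) return <p>Failed to load treasury history.</p>;
  if (!data) return <p>Loading…</p>;

  return (
    <ResponsiveContainer width="100%" height={300}>
      <LineChart data={data}>
        <XAxis dataKey="date" />
        <YAxis tickFormatter={(v) => `$${(v / 1e6).toFixed(1)}M`} />
        <Tooltip formatter={(v) => `$${Number(v).toLocaleString()}`} />
        <Line type="monotone" dataKey="valueUsd" dot={false} />
      </LineChart>
    </ResponsiveContainer>
  );
}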

To enhance usability, add filtering and time-range selectors. Allow users to filter the transaction table by type (incoming/outgoing) or token. Provide date pickers to adjust the window for historical charts. For a professional finish, implement a dark/light theme toggle using CSS variables or a library like next-themes. Consider adding export functionality, enabling users to download transaction history as a CSV file. Finally, ensure the dashboard is fully responsive, providing a consistent experience on desktop and mobile devices by using a CSS framework like Tailwind CSS.

MONITORING FRAMEWORK

Key Treasury Metrics and Calculation Methods

Essential on-chain metrics for assessing treasury health, liquidity, and risk exposure.

Metric | Calculation Method | Target Range | Monitoring Frequency
Runway (Months) | Treasury Balance / Avg Monthly Burn | > 18 months | Weekly
Liquid Asset Ratio | Liquid Assets / Total Assets | > 30% | Daily
Concentration Risk | Max Token Holding / Total Assets | < 20% | Weekly
Stablecoin Peg Deviation | abs(Market Price - $1) / $1 | < 2% | Real-time
Protocol-Owned Liquidity | Protocol-owned LP / Total TVL | 10-40% | Daily
Debt-to-Assets Ratio | Outstanding Debt / Total Assets | < 15% | Weekly
Treasury Yield (APY) | Annualized Revenue / Avg Treasury Balance | > Risk-Free Rate | Monthly
Gas Cost Efficiency | Treasury Ops Gas / Treasury Volume | < 0.5% | Per Transaction

automation-alerts
OPERATIONALIZING INSIGHTS

Step 3: Implementing Automation and Alerts

Transform raw on-chain data into proactive treasury management by automating analysis and setting up real-time alerts for critical events.

Automation is the core of effective treasury monitoring, moving from manual checks to programmatic oversight. The goal is to create a system that continuously scans for predefined conditions—like a significant drop in DEX liquidity, an unexpected large token transfer, or a change in governance voting power—and triggers an alert. This requires connecting your data pipeline (from Step 2) to a logic engine and notification service. Common tools for this include Cron jobs for scheduled tasks, serverless functions (AWS Lambda, Google Cloud Functions), and dedicated Web3 automation platforms like Gelato Network or Chainlink Automation for on-chain condition checking.

Start by defining your alert logic. For a DAO treasury, critical alerts might include:

  • Large outgoing transfer (>5% of treasury)
  • DEX pool imbalance exceeding a 70/30 ratio
  • Governance proposal creation from a new delegate
  • Staking APR falling below a benchmark

Write this logic in your chosen environment. For example, a Node.js script using Ethers.js could fetch the latest balance from a Multisig contract and compare it to a threshold. For more complex, real-time on-chain events, use indexers like The Graph or RPC providers with websockets to listen for specific contract events without polling.

Here is a basic conceptual example using a scheduled function to check a wallet balance and send a Slack alert if it drops below a threshold:

javascript
const { ethers } = require('ethers');
const { WebClient } = require('@slack/web-api');

const provider = new ethers.providers.JsonRpcProvider(process.env.RPC_URL);
const treasuryAddress = '0x...';
const THRESHOLD_ETH = ethers.utils.parseEther('100');
const slack = new WebClient(process.env.SLACK_TOKEN);

async function checkTreasury() {
  const balance = await provider.getBalance(treasuryAddress);
  if (balance.lt(THRESHOLD_ETH)) {
    await slack.chat.postMessage({
      channel: '#treasury-alerts',
      text: `⚠️ Treasury balance low: ${ethers.utils.formatEther(balance)} ETH`
    });
  }
}
// Run every hour
setInterval(checkTreasury, 3600000);

This script should be deployed to a long-running server or, for a serverless environment, converted to a scheduled function (dropping the setInterval in favor of the platform's cron trigger).

For production systems, consider robustness and cost. Polling an RPC for multiple data points can become expensive and slow. Instead, use a dedicated data platform like Chainscore, Dune Analytics, or Flipside Crypto to run your analytical queries on indexed data and set up email or webhook alerts directly from their dashboards. These services handle the infrastructure and data freshness, allowing you to focus on defining the business logic of your alerts. Always test your alert thresholds in a staging environment to avoid alert fatigue from false positives.

Finally, integrate alerts into your team's workflow. Critical alerts should go to high-priority channels like SMS (via Twilio) or PagerDuty, while informational alerts can go to Slack or Discord. Document each alert with its purpose, threshold rationale, and response procedure. This turns your analytics from a reporting tool into an active risk management system, ensuring your team is notified of treasury events in time to take informed action.

tooling-resources
TREASURY ANALYTICS

Tools and Development Resources

A curated set of tools and frameworks for building a comprehensive on-chain treasury monitoring system. These resources help you track assets, analyze transactions, and automate reporting.

TREASURY ANALYTICS

Frequently Asked Questions

Common technical questions and solutions for developers implementing on-chain treasury monitoring systems.

Why does my dashboard show inconsistent or stale balance data?

Inconsistent balance data is often caused by using a default public RPC endpoint, which can be rate-limited, lag behind the chain tip, or serve requests from non-archival nodes. For reliable treasury monitoring, you need a dedicated, archival-grade RPC connection.

Key solutions:

  • Use a dedicated node provider like Alchemy, Infura, or QuickNode with archival capabilities.
  • Implement data validation by cross-referencing balances across multiple block heights (see the sketch after this list).
  • Handle chain reorganizations by confirming transactions after a sufficient number of block confirmations (e.g., 12+ for Ethereum).
  • Cache historical data locally to reduce API calls and ensure consistency during provider outages.
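
A sketch of that cross-check with ethers v5 (an archival-grade endpoint is assumed for querying older block heights):

javascript
// Sketch: cross-check a balance at the chain head vs. a reorg-safe height (ethers v5).
const { ethers } = require('ethers');

const provider = new ethers.providers.JsonRpcProvider(process.env.ARCHIVE_RPC_URL);

async function confirmedBalance(address, confirmations = 12) {
  const head = await provider.getBlockNumber();
  const safeBlock = head - confirmations; // ignore blocks that could still reorg
  const [atHead, atSafe] = await Promise.all([
    provider.getBalance(address, head),
    provider.getBalance(address, safeBlock) // older heights may require an archival node
  ]);
  if (!atHead.eq(atSafe)) {
    console.warn(`Balance changed within the last ${confirmations} blocks; treat the head value as provisional`);
  }
  return atSafe;
}
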
conclusion-next-steps
IMPLEMENTATION GUIDE

Conclusion and Next Steps

Your on-chain treasury monitoring system is now operational. This section outlines how to maintain, extend, and act upon the insights you've gathered.

You have successfully configured a foundational on-chain analytics pipeline. The core components—data ingestion via The Graph or Covalent, transformation in Python or Node.js, and visualization in Grafana—are in place to track key metrics like wallet balances, token composition, and transaction history. The real value, however, comes from evolving this system from a passive dashboard into an active management tool. Start by establishing regular review cycles, such as weekly or monthly, to analyze the data trends and correlate them with your organization's operational milestones.

To enhance your monitoring, consider implementing alerting. Use your data pipeline to program checks for specific conditions, such as a treasury balance falling below a predefined threshold, a suspicious large outflow, or significant concentration in a single asset exceeding a risk limit. These alerts can be sent via email, Slack, or integrated into incident management platforms like PagerDuty. For treasuries managed by Safe (formerly Gnosis Safe) or similar multisigs, you can connect event listeners to trigger alerts directly from on-chain events using services like OpenZeppelin Defender or Tenderly.

The next logical step is to deepen your analysis. Move beyond simple balance tracking to calculate Portfolio Value-at-Risk (VaR), monitor the health of DeFi positions (e.g., loan health factors on Aave, impermanent loss in Uniswap v3 pools), and track grant or investment distributions. This requires integrating more specialized data sources or subgraphs. Furthermore, explore on-chain governance tracking for DAO treasuries by monitoring proposal states, voter participation, and delegation patterns using subgraphs from Snapshot or Tally.

For teams ready to scale, architecting for modularity is key. Containerize your data fetcher and transformer scripts using Docker to ensure consistent environments. Orchestrate scheduled runs with Apache Airflow or Prefect for reliable, monitored workflows. Consider moving from a simple database to a time-series database like TimescaleDB or InfluxDB for more efficient storage and querying of historical blockchain data, enabling complex longitudinal analysis.

Finally, use your analytics to inform tangible treasury actions. Data should drive decisions on rebalancing assets, diversifying across chains or asset types, optimizing yield through DeFi strategies, and planning runway based on burn rate. The system you've built is not an endpoint but a foundation for proactive, data-driven treasury management that enhances security, transparency, and strategic financial oversight in the Web3 ecosystem.