How to Create a DAO Treasury Reporting and Analytics Dashboard
A step-by-step guide to building a custom dashboard for tracking, analyzing, and reporting on a DAO's on-chain treasury assets and transactions.
A DAO treasury dashboard is a critical tool for governance, providing real-time visibility into the collective assets held across multiple chains and protocols. Unlike a simple wallet balance, a comprehensive dashboard aggregates data from sources like Ethereum, Arbitrum, and Polygon, normalizes token values, and presents key financial metrics. The core components you'll need to build include: a data ingestion layer to fetch on-chain state, a price oracle for valuation, a database for historical tracking, and a frontend for visualization. Popular starting points for developers are subgraphs from The Graph for indexed event data and APIs from Covalent or Alchemy for broader chain coverage.
The first technical step is data ingestion. You must identify and query all treasury addresses, including Gnosis Safe multisigs and DAO-specific vaults like Aragon or DAOhaus. Use Etherscan-like APIs or direct RPC calls to fetch current token balances. For historical analysis, you need to capture all incoming and outgoing transactions. This is where indexing services excel; for example, you can create a subgraph that listens for Transfer events to and from your treasury addresses. Store this raw data in a time-series database (e.g., PostgreSQL or TimescaleDB) to enable trend analysis over weekly or monthly periods.
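As a concrete starting point, here is a minimal sketch of the balance-fetching step, assuming ethers v6, an RPC endpoint supplied via an RPC_URL environment variable, and placeholder lists of treasury and token addresses:

```typescript
import { ethers } from "ethers";

// Minimal sketch: read current ERC-20 balances for each treasury address.
// RPC_URL, the treasury list, and the token list are placeholders.
const provider = new ethers.JsonRpcProvider(process.env.RPC_URL);

const ERC20_ABI = [
  "function balanceOf(address owner) view returns (uint256)",
  "function decimals() view returns (uint8)",
];

async function fetchBalances(treasuries: string[], tokens: string[]) {
  const rows: { treasury: string; token: string; balance: string }[] = [];
  for (const token of tokens) {
    const erc20 = new ethers.Contract(token, ERC20_ABI, provider);
    const decimals = Number(await erc20.decimals());
    for (const treasury of treasuries) {
      const raw: bigint = await erc20.balanceOf(treasury);
      // Store the formatted value alongside a timestamp in your time-series table.
      rows.push({ treasury, token, balance: ethers.formatUnits(raw, decimals) });
    }
  }
  return rows;
}
```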
Next, you must calculate the USD value of the treasury. This requires integrating a price oracle. For major tokens, use decentralized oracle networks like Chainlink. For long-tail or LP tokens, you may need to calculate prices via on-chain DEX pools using the constant product formula x * y = k. Aggregate the value of all assets to get the Total Value Locked (TVL). A key analytical feature is categorizing assets by type: stablecoins (USDC, DAI), volatile assets (ETH, governance tokens), and liquidity pool (LP) positions. This breakdown is essential for the risk assessments presented to token holders.
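For pool-based pricing, a sketch along these lines can read reserves from a Uniswap V2-style pair; the pair address, token ordering, and decimal handling are assumptions you must adapt to your own pools:

```typescript
import { ethers } from "ethers";

// Sketch: derive a spot price from the reserves behind x * y = k.
// Assumes a Uniswap V2-style pair; ignores token-decimal differences for brevity.
const PAIR_ABI = [
  "function getReserves() view returns (uint112 reserve0, uint112 reserve1, uint32 blockTimestampLast)",
];

async function spotPrice(provider: ethers.JsonRpcProvider, pairAddress: string): Promise<number> {
  const pair = new ethers.Contract(pairAddress, PAIR_ABI, provider);
  const [reserve0, reserve1] = await pair.getReserves();
  // Price of token0 quoted in token1; scale by 10 ** (decimals0 - decimals1) in practice.
  return Number(reserve1) / Number(reserve0);
}
```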
For the reporting layer, design charts and metrics that answer governance questions. Key visualizations include: a portfolio allocation pie chart, a TVL trend line over time, and a transaction history table showing large inflows/outflows. Implement alerts for significant events, such as a single transaction exceeding 5% of the treasury. Use a framework like React or Vue.js for the frontend with charting libraries such as D3.js or Recharts. The dashboard should allow users to filter data by time range, asset type, and connected blockchain.
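The alert check itself can be a small pure function; the transfer shape and the 5% threshold below are illustrative:

```typescript
// Sketch: flag any single transfer worth more than a share of total treasury value.
interface TreasuryTransfer {
  hash: string;
  usdValue: number;
}

function findLargeTransfers(
  transfers: TreasuryTransfer[],
  treasuryValueUsd: number,
  threshold = 0.05 // 5% of the treasury, per the alert rule above
): TreasuryTransfer[] {
  return transfers.filter((t) => t.usdValue >= treasuryValueUsd * threshold);
}
```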
Finally, ensure your dashboard is secure and verifiable. All displayed data should be traceable back to on-chain transactions via Etherscan links. Consider implementing a caching strategy to reduce API costs and improve load times, but clearly label the data freshness (e.g., 'Prices updated 5 minutes ago'). For DAOs with complex, multi-chain treasuries, the end product is not just a reporting tool but a foundational piece of infrastructure for transparent, data-driven governance.
Prerequisites and Tech Stack
Before building a treasury dashboard, you need the right tools and data sources. This guide covers the essential software, APIs, and blockchain infrastructure required to track and analyze on-chain treasury assets.
A modern treasury dashboard is a full-stack application. You'll need a frontend framework like React or Vue.js for the user interface, a backend runtime such as Node.js or Python for data processing, and a database (PostgreSQL, TimescaleDB) for storing historical metrics. For on-chain data, you must interact with blockchain nodes via providers like Alchemy, Infura, or a self-hosted node. The core of your stack will be a set of data indexing libraries—ethers.js for Ethereum, viem for EVM chains, or @solana/web3.js for Solana—to query wallet balances and transaction history.
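As an illustration of the query layer, a viem client for a single chain looks roughly like this (the RPC URL and treasury address are placeholders):

```typescript
import { createPublicClient, http, formatEther } from "viem";
import { mainnet } from "viem/chains";

// Sketch: read a treasury's native ETH balance with viem.
const client = createPublicClient({
  chain: mainnet,
  transport: http(process.env.RPC_URL),
});

async function nativeBalance(address: `0x${string}`): Promise<string> {
  const wei = await client.getBalance({ address });
  return formatEther(wei);
}
```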
Your dashboard's accuracy depends on reliable data sources. You'll integrate multiple blockchain explorers and DeFi protocol APIs to aggregate information. For token prices and market data, use APIs from CoinGecko, CoinMarketCap, or decentralized oracles like Chainlink. To track treasury positions across DeFi protocols (e.g., Aave, Compound, Uniswap V3), you need to query their respective smart contracts for user-specific data, such as supplied collateral or LP positions. Services like The Graph can simplify accessing indexed historical data from these protocols.
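A price lookup against CoinGecko's public simple/price endpoint can be as small as the sketch below; it assumes Node 18+ for the global fetch and that you have already mapped your treasury tokens to CoinGecko IDs:

```typescript
// Sketch: fetch USD prices for a list of CoinGecko coin IDs.
async function fetchUsdPrices(ids: string[]): Promise<Record<string, { usd: number }>> {
  const url = `https://api.coingecko.com/api/v3/simple/price?ids=${ids.join(",")}&vs_currencies=usd`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`CoinGecko request failed: ${res.status}`);
  return res.json();
}

// Example: await fetchUsdPrices(["ethereum", "usd-coin", "dai"]);
```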
For robust analytics, implement a dedicated data pipeline. This often involves using a task scheduler (e.g., cron, Celery) to run periodic scripts that fetch on-chain data, process it, and update your database. You'll write scripts to calculate key metrics: Portfolio Value (sum of all asset holdings), Asset Allocation (percentage breakdown by token or protocol), Yield Generated (from staking or lending), and Gas Spend (transaction cost analysis). Using a library like pandas in Python can help with complex financial calculations and time-series analysis.
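The same metric pass can also be expressed in TypeScript if your pipeline runs on Node; the Holding shape below is an assumption about your own schema:

```typescript
// Sketch: portfolio value and allocation breakdown from stored holdings.
interface Holding {
  symbol: string;
  amount: number;
  priceUsd: number;
}

function portfolioMetrics(holdings: Holding[]) {
  const totalUsd = holdings.reduce((sum, h) => sum + h.amount * h.priceUsd, 0);
  const allocation = holdings.map((h) => ({
    symbol: h.symbol,
    valueUsd: h.amount * h.priceUsd,
    share: totalUsd > 0 ? (h.amount * h.priceUsd) / totalUsd : 0,
  }));
  return { totalUsd, allocation };
}
```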
Security is non-negotiable. Never store private keys in your application code or environment variables for read-only queries. Use dedicated treasury wallet addresses that are publicly verifiable. For signing transactions (if your dashboard includes asset management features), integrate a non-custodial wallet connector like MetaMask or WalletConnect, and consider using safe transaction services (Safe{Wallet}) for multi-signature operations. All API keys and sensitive configuration should be managed via environment variables using a tool like dotenv.
Finally, consider deployment and monitoring. Containerize your application with Docker for consistency. Use a cloud provider (AWS, GCP) or a platform like Vercel/Heroku for hosting. Implement logging (Winston, Pino) and error tracking (Sentry) to monitor the health of your data feeds. The end goal is a system that provides real-time, accurate financial reporting, requiring a stack that is both resilient to blockchain reorgs and scalable as your treasury grows across multiple chains like Ethereum, Arbitrum, and Polygon.
Step 1: Aggregating Treasury Data Sources
The first step in building a treasury dashboard is programmatically collecting data from disparate on-chain and off-chain sources. This involves connecting to blockchain nodes, DeFi protocols, and exchange APIs to fetch raw financial data.
A treasury's financial state is distributed across multiple blockchains and centralized platforms. To create a unified view, you must aggregate data from several core sources. On-chain data includes token balances in wallets and smart contracts, positions in DeFi protocols like Aave and Compound, and NFT holdings. Off-chain data encompasses exchange account balances on platforms like Coinbase and Binance, fiat bank account information, and manual transaction records. Each source requires a different connection method and authentication strategy.
For on-chain data, you typically interact with blockchain nodes via JSON-RPC calls or use a node provider service like Alchemy, Infura, or QuickNode. To query token balances, you can use the eth_getBalance method for native ETH or call the balanceOf function on ERC-20 token contracts. For more complex DeFi positions, you need to query the specific protocol's smart contracts. For example, to get a user's collateral and debt on Aave V3, you would call the getUserAccountData function on the Pool contract. Using a multicall contract can batch these requests to improve efficiency.
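For example, a hedged sketch of the Aave V3 query with ethers v6 (the Pool address differs per network and is a placeholder here):

```typescript
import { ethers } from "ethers";

// Sketch: read a treasury's aggregate Aave V3 position from the Pool contract.
const POOL_ABI = [
  "function getUserAccountData(address user) view returns (uint256 totalCollateralBase, uint256 totalDebtBase, uint256 availableBorrowsBase, uint256 currentLiquidationThreshold, uint256 ltv, uint256 healthFactor)",
];

async function aavePosition(
  provider: ethers.JsonRpcProvider,
  poolAddress: string,
  treasury: string
) {
  const pool = new ethers.Contract(poolAddress, POOL_ABI, provider);
  const data = await pool.getUserAccountData(treasury);
  return {
    totalCollateralBase: data.totalCollateralBase as bigint,
    totalDebtBase: data.totalDebtBase as bigint,
    healthFactor: data.healthFactor as bigint,
  };
}
```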
Off-chain data from centralized exchanges is accessed via their REST or WebSocket APIs, which require API keys with read-only permissions. For instance, the Coinbase Prime API provides endpoints like GET /accounts to list all trading and custody balances. It is critical to store these API keys securely using environment variables or a secrets management service, and to implement robust error handling for rate limits and downtime. This aggregated raw data forms the foundational dataset for all subsequent analysis, valuation, and reporting steps in your dashboard.
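The exchange side usually looks like a plain authenticated REST call; the endpoint and header below are hypothetical, and real APIs such as Coinbase Prime additionally require request signing:

```typescript
import axios from "axios";

// Illustrative only: a read-only API key supplied via an environment variable.
// The base URL and header name are hypothetical placeholders.
async function fetchExchangeBalances() {
  const res = await axios.get("https://api.example-exchange.com/v1/accounts", {
    headers: { "X-API-KEY": process.env.EXCHANGE_API_KEY ?? "" },
    timeout: 10_000,
  });
  return res.data;
}
```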
A practical implementation often involves setting up a scheduled job (e.g., using cron or Celery) that runs your data aggregation scripts. A simple Node.js script might use the Ethers.js library to connect to Ethereum and the Axios library to call exchange APIs. You should design your data schema to store timestamps, raw values, and source identifiers, enabling historical tracking and auditability. The output of this step is a structured dataset ready for the next phase: normalization and valuation.
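A minimal scheduling and schema sketch, assuming node-cron and a record shape of your own design:

```typescript
import cron from "node-cron";

// Sketch: one stored row per asset per snapshot, keeping raw values and the
// source identifier for auditability.
interface BalanceRecord {
  capturedAt: string; // ISO timestamp
  source: string;     // e.g. "ethereum-rpc", "exchange-api"
  asset: string;
  rawValue: string;   // raw on-chain units, kept as a string
}

// Run the aggregation job at the top of every hour.
cron.schedule("0 * * * *", async () => {
  // 1. run the on-chain and exchange fetchers
  // 2. map the results into BalanceRecord rows
  // 3. insert the rows into your database
});
```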
Comparison of Treasury Dashboard Tooling Options
Key differences between building a custom dashboard, using SaaS platforms, and deploying self-hosted open-source solutions for on-chain treasury management.
| Feature / Metric | Custom-Built Dashboard | SaaS Platform (e.g., Llama, DeFi Saver) | Self-Hosted Open Source (e.g., Rotki, Zerion Fork) |
|---|---|---|---|
| Initial Development Time | 8-16 weeks | 1-2 hours (setup) | 2-4 weeks (deployment & config) |
| Recurring Cost | Developer salaries | $500-$5k+/month | Infrastructure (~$50-$200/month) |
| Data Source Control | | | |
| Multi-Chain Support | Fully customizable | Pre-defined list (5-15 chains) | Depends on fork/implementation |
| Smart Contract Wallet Integration (Safe) | | | |
| Real-time P&L Calculation | Custom logic required | Limited/plugin-based | |
| Audit Trail & Compliance Reporting | Custom logic required | | |
| Requires Blockchain Node | Yes (RPC endpoints) | | Yes (RPC endpoints) |
Step 2: Building Core Queries with Dune Analytics
Learn how to construct the foundational SQL queries that will power your on-chain treasury dashboard, focusing on balance tracking, transaction categorization, and performance metrics.
The core of any treasury dashboard is the data pipeline. In Dune Analytics, this means writing SQL queries that extract, transform, and aggregate raw blockchain data into actionable insights. Start by identifying your primary data sources: the ethereum.traces table for internal transfers and smart contract calls, ethereum.transactions for standard ETH transfers, and the relevant erc20.ERC20_evt_Transfer tables for token movements. Your first query should establish a complete view of the treasury's wallet addresses, including multi-sigs and DAO treasuries like Gnosis Safe or Governor contracts.
With addresses defined, build a balance snapshot query. This query unions data from multiple sources to calculate net inflows and outflows. A common pattern uses WITH clauses (CTEs) to create temporary tables for inflows, outflows, and internal_transfers. For example, to track ETH, you would sum value from ethereum.traces where the to address matches your treasury, and subtract sums where the from address matches it. Remember to divide by 1e18 to convert from wei. For ERC-20 tokens, you'll join transfer events with price data from prices.usd to get USD values.
Categorizing transactions is critical for analysis. Create a reference table or a CASE WHEN statement in your query to tag transactions by type: Protocol Revenue, Grant Payment, Liquidity Provision, or Operational Expense. You can often derive categories from the to_address or from_address interacting with known protocol contracts (e.g., Uniswap routers, Aave lending pools) or from the function signature in the data field. This categorization enables the treasury_metrics query, which aggregates data into time-series summaries like daily net flow, asset diversification ratios, and runway calculations based on average monthly expenses.
Step 3: Calculating Financial KPIs and Runway
Transform raw treasury data into actionable insights by calculating key performance indicators and runway projections.
With your treasury data aggregated and categorized, the next step is to calculate the Key Performance Indicators (KPIs) that define your DAO's financial health. The most critical metrics include Total Treasury Value (TTV), Runway, and Asset Allocation. TTV is the sum of all assets across all chains and protocols, expressed in a stable denomination like USD. Runway estimates how long the treasury can fund operations at the current burn rate, typically calculated as Runway (months) = TTV / Monthly Burn Rate. Accurate categorization from Step 2 is essential here, as you should calculate runway using only liquid, non-vested assets designated for operations.
To implement this, your dashboard's backend logic needs to perform these calculations. For example, using a TypeScript function with ethers.js for on-chain data and CoinGecko's API for pricing, you can compute the core metrics. The function would iterate through your categorized asset list, fetch current prices, sum values by category, and apply the runway formula. It's crucial to exclude locked or vesting tokens from the runway calculation to avoid overestimating available funds. This logic should run on a schedule (e.g., daily) to keep the dashboard current.
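A simplified version of that calculation, with the asset shape and category flags as assumptions about your Step 2 output:

```typescript
// Sketch: Total Treasury Value and runway from categorized assets.
interface CategorizedAsset {
  symbol: string;
  amount: number;
  priceUsd: number;
  liquid: boolean;      // false for locked or vesting tokens
  operational: boolean; // earmarked for operating expenses
}

function computeKpis(assets: CategorizedAsset[], monthlyBurnUsd: number) {
  const ttv = assets.reduce((sum, a) => sum + a.amount * a.priceUsd, 0);
  const operationalUsd = assets
    .filter((a) => a.liquid && a.operational)
    .reduce((sum, a) => sum + a.amount * a.priceUsd, 0);
  const runwayMonths = monthlyBurnUsd > 0 ? operationalUsd / monthlyBurnUsd : Infinity;
  return { ttv, runwayMonths };
}
```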
Beyond basic totals, advanced KPIs provide deeper insight. Concentration Risk measures exposure to any single asset (e.g., "60% of treasury in our native token"). Liquidity Profile tracks the percentage of assets in stablecoins versus volatile crypto. For DeFi treasuries, calculate Yield Generated from staking, lending, or LP positions. These metrics should be visualized clearly: use a time-series chart for TTV and runway trends, a pie chart for asset allocation, and summary cards for key numbers. This transforms raw data into a strategic dashboard for informed governance proposals and financial planning.
Finally, ensure your calculations are transparent and verifiable. Document the formulas, data sources (e.g., priceSource: 'coingecko'), and categorization logic. Consider publishing a hash of your calculation script on-chain or in a public repository. This builds trust with your community by allowing anyone to audit how the dashboard's financial snapshots are derived from the underlying on-chain reality, turning your analytics from a black box into a foundational tool for decentralized stewardship.
Step 4: Building the Dashboard Frontend
This guide details how to construct a React-based frontend to visualize treasury data, connect to the backend API, and create interactive charts for on-chain analytics.
The frontend is the user-facing interface for your treasury dashboard. We'll use React with TypeScript for type safety and Vite for fast development. The core libraries include wagmi and viem for blockchain interactions, TanStack Query for efficient data fetching from your backend API, and Recharts for building responsive data visualizations. Start by initializing the project with npm create vite@latest treasury-dashboard -- --template react-ts and install the required dependencies.
The application state is managed by connecting to the user's wallet via wagmi, which provides hooks like useAccount and useBalance. Configure the wagmi client to support the networks your treasury operates on (e.g., Ethereum Mainnet, Arbitrum, Optimism). Use TanStack Query's useQuery hook to fetch aggregated data from your backend endpoints, such as GET /api/treasury/overview. This separates on-chain query logic from the frontend, improving performance and caching.
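A rough sketch of this data layer, assuming wagmi v2 and a backend route at /api/treasury/overview:

```typescript
import { createConfig, http } from "wagmi";
import { mainnet, arbitrum, optimism } from "wagmi/chains";
import { useQuery } from "@tanstack/react-query";

// Sketch: multi-chain wagmi config plus a TanStack Query hook for the backend API.
export const wagmiConfig = createConfig({
  chains: [mainnet, arbitrum, optimism],
  transports: {
    [mainnet.id]: http(),
    [arbitrum.id]: http(),
    [optimism.id]: http(),
  },
});

export function useTreasuryOverview() {
  return useQuery({
    queryKey: ["treasury-overview"],
    queryFn: async () => {
      const res = await fetch("/api/treasury/overview");
      if (!res.ok) throw new Error("Failed to load treasury overview");
      return res.json();
    },
  });
}
```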
For the dashboard layout, create key components: a Header with wallet connection, a MetricsGrid for high-level stats (Total Value, Asset Diversity, Monthly Gas Spend), and a ChartContainer for time-series data. Use Recharts to build line charts for historical TVL and bar charts for asset allocation. Implement a TransactionTable component that displays the recent transactions fetched from the /api/transactions endpoint, with columns for timestamp, protocol, value, and chain.
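For instance, a minimal TVL trend chart with Recharts might look like this (the data shape is an assumption about your API response):

```tsx
import { LineChart, Line, XAxis, YAxis, Tooltip, ResponsiveContainer } from "recharts";

// Sketch: historical TVL line chart fed by the backend time series.
type TvlPoint = { date: string; tvlUsd: number };

export function TvlChart({ data }: { data: TvlPoint[] }) {
  return (
    <ResponsiveContainer width="100%" height={300}>
      <LineChart data={data}>
        <XAxis dataKey="date" />
        <YAxis />
        <Tooltip />
        <Line type="monotone" dataKey="tvlUsd" dot={false} />
      </LineChart>
    </ResponsiveContainer>
  );
}
```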
To ensure a responsive design, utilize CSS Grid or Flexbox along with a library like Tailwind CSS. Make the charts interactive by adding tooltips that display precise values on hover and filters to toggle between different time ranges (7D, 30D, 90D). For a professional finish, integrate a dark/light theme toggle using React context and conditionally apply CSS classes. Always handle loading and error states for all data-fetching operations to improve user experience.
Finally, build and deploy the frontend. Run npm run build to create a production-ready bundle. You can deploy the static files to services like Vercel, Netlify, or Cloudflare Pages. Ensure your environment variables for the backend API URL (e.g., VITE_API_BASE_URL) are correctly set in the deployment platform. The live dashboard will now provide real-time, visual insights into your DAO or protocol's treasury health and activity.
Essential Resources and Documentation
These resources help developers design, implement, and maintain a treasury reporting and analytics dashboard with verifiable onchain data, reproducible queries, and auditable accounting logic.
Define Treasury Data Models and Accounting Rules
Before building a dashboard, define a treasury data model that reflects how assets, liabilities, and cash flows are tracked onchain. This prevents inconsistent metrics and broken reports.
Key elements to specify:
- Asset classification: native tokens, ERC-20s, LP tokens, staked positions, NFTs
- Valuation rules: spot price vs TWAP, USD vs ETH base, oracle source
- Time granularity: block-level, hourly, daily snapshots
- Accounting method: realized vs unrealized PnL, cost basis, impairment rules
For example, many DAOs treat governance tokens held by the treasury as non-circulating and exclude them from runway calculations. Document these assumptions in versioned specs so analytics queries remain stable as the treasury evolves.
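One way to pin these choices down is a small, versioned type definition that the rest of the pipeline imports; the field names below are illustrative:

```typescript
// Sketch: a versioned treasury data model capturing classification,
// valuation, and runway-inclusion rules.
type AssetClass = "native" | "erc20" | "lp" | "staked" | "nft";

interface ValuationRule {
  method: "spot" | "twap";
  base: "USD" | "ETH";
  oracle: string; // e.g. "chainlink" or "dex-pool"
}

interface TreasuryAsset {
  symbol: string;
  class: AssetClass;
  valuation: ValuationRule;
  countsTowardRunway: boolean; // e.g. false for treasury-held governance tokens
}
```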
Security Considerations and Automation
This section covers essential security practices and automation strategies for maintaining a reliable, tamper-proof treasury dashboard.
A treasury dashboard is a high-value target. Implement robust access controls and audit logging from day one. Use a multi-sig wallet for any administrative actions, such as updating data sources or smart contract addresses. All API keys, especially for RPC providers like Alchemy or Infura, should be stored as environment variables and never hardcoded. For on-chain data fetching, consider using a service like The Graph for decentralized indexing to avoid single points of failure and potential rate-limiting attacks on your node.
Automation is critical for operational integrity. Use scheduled tasks (e.g., via GitHub Actions, cron jobs, or a dedicated service like Pipedream) to trigger your data ingestion scripts. This ensures your metrics update consistently without manual intervention. For critical alerts—such as a wallet balance falling below a threshold or a suspicious large transfer—set up notifications through Slack, Discord, or Telegram. Tools like OpenZeppelin Defender can automate on-chain monitoring and response for smart contract-based treasuries.
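Pushing an alert to Discord, for example, only needs the incoming-webhook endpoint; the webhook URL is a secret supplied through the environment:

```typescript
// Sketch: send a treasury alert to a Discord channel via an incoming webhook.
async function sendDiscordAlert(message: string): Promise<void> {
  const url = process.env.DISCORD_WEBHOOK_URL;
  if (!url) throw new Error("DISCORD_WEBHOOK_URL is not set");
  await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ content: message }),
  });
}
```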
Data validation is a non-negotiable security layer. Your pipeline should include checksums and sanity checks. For example, compare the total value from individual asset queries against an aggregate query. Implement circuit breakers in your scripts: if data from a source deviates anomalously from historical patterns, pause updates and send an alert. This prevents corrupted or manipulated data from poisoning your analytics. Regularly back up your processed data to a secure, immutable store like Arweave or IPFS for historical integrity.
When visualizing data, prioritize clarity and accuracy. Use libraries like D3.js or Plotly for custom, client-side charts to avoid server-side rendering risks. For dashboards built with frameworks like Streamlit or Dash, ensure all user inputs are sanitized to prevent injection attacks. Consider implementing read-only views for public stakeholders and a separate, secured admin interface for configuration. The principle of least privilege should extend to every component of your dashboard's architecture.
Finally, establish a routine maintenance and incident response plan. Document your data sources, scripts, and dependencies. Use version control for all code and consider containerizing your application with Docker for consistent deployment. Periodically review access logs and conduct penetration testing on the public-facing components. A secure, automated dashboard is not a one-time build but a system that requires ongoing vigilance to protect the assets it monitors.
Frequently Asked Questions
Common technical questions and troubleshooting for building on-chain treasury dashboards using data from Chainscore.
What on-chain data does Chainscore provide for treasury dashboards?
Chainscore aggregates on-chain data from multiple sources to power treasury dashboards. Key data includes:
- Token Balances: Real-time and historical holdings across EVM chains (Ethereum, Arbitrum, Optimism, etc.) and Solana, sourced directly from wallet addresses.
- Transaction History: Inflows, outflows, and internal transfers with decoded contract interactions.
- DeFi Positions: Value and health of positions in protocols like Aave, Compound, Uniswap V3, and Curve.
- NFT Holdings: Floor price and valuation data for collections.
- Token Prices & Market Data: Real-time prices from DEX and CEX oracles.
You can access this data via the Chainscore REST API or GraphQL endpoint, using the portfolio and transactions modules as primary entry points.
Conclusion and Next Steps
You have now built a functional dashboard for on-chain treasury analytics. This final section consolidates key learnings and outlines paths for further development.
This guide has walked through the core components of a Web3 treasury dashboard: connecting to blockchain data via providers like Alchemy or QuickNode, aggregating token balances across multiple wallets and chains, calculating real-time valuations using price oracles, and visualizing the data. The primary goal is operational clarity—transforming raw, fragmented on-chain data into a single source of truth for treasury health, enabling informed decisions on capital allocation, risk management, and reporting.
To extend your dashboard, consider these advanced features. Implement historical tracking to graph treasury value over time and calculate performance metrics. Add multi-signature wallet support for DAOs by listening for ExecutionSuccess events on Safe (formerly Gnosis Safe) contracts. For DeFi positions, integrate protocols like Aave or Compound to track supplied/borrowed assets and health factors. Automated alerting for large transactions or balance thresholds can be built using a monitoring service like OpenZeppelin Defender.
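A sketch of such a Safe listener, assuming ethers v6 over a WebSocket provider and the non-indexed ExecutionSuccess signature used by Safe v1.3.0 (newer Safe versions index txHash, so adjust the ABI fragment accordingly):

```typescript
import { ethers } from "ethers";

// Sketch: watch a Safe for executed transactions via its ExecutionSuccess event.
const SAFE_ABI = ["event ExecutionSuccess(bytes32 txHash, uint256 payment)"];

function watchSafe(provider: ethers.WebSocketProvider, safeAddress: string) {
  const safe = new ethers.Contract(safeAddress, SAFE_ABI, provider);
  safe.on("ExecutionSuccess", (txHash: string, payment: bigint) => {
    console.log(`Safe ${safeAddress} executed tx ${txHash} (payment: ${payment})`);
  });
}
```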
Security and maintenance are critical for a production system. Regularly audit the smart contracts of the oracles and price feeds you integrate. Implement data validation checks to flag discrepancies between sources. Consider using a dedicated server or serverless function for backend aggregation to keep API keys secure, rather than performing all logic client-side. Document your data sources and update mechanisms for team members.
The landscape of treasury management tools is evolving. Explore specialized platforms like Llama for DAO analytics or integrate with accounting software via their APIs. The dashboard you've built serves as a foundational layer; its real power comes from tailoring it to your organization's specific reporting needs, risk parameters, and investment thesis, creating a bespoke financial command center for the on-chain era.