
Setting Up a Whale Transaction Alert System

A technical guide for developers to build a system that detects and alerts on large on-chain transactions using real-time data streams and customizable logic.
Chainscore © 2026
INTRODUCTION


Learn to build a real-time monitoring system to track and analyze large-scale cryptocurrency transactions across blockchains.

A whale transaction alert system is a critical tool for on-chain analysts, traders, and security researchers. It automates the detection of high-value transfers—often exceeding $1 million USD equivalent—involving major assets like ETH, USDC, or WBTC. These transactions can signal market-moving events, such as institutional accumulation, exchange inflows/outflows, or the movement of funds from known entities like protocol treasuries or hacker wallets. By setting up your own system, you move beyond manual blockchain explorers and gain a proactive, customizable view of capital flows.

The core architecture involves three key components: a data source (like a node RPC or indexer API), a filtering and logic layer (your application code), and an alerting mechanism (e.g., Telegram bot, Discord webhook, email). You'll write logic to subscribe to new blocks, decode transaction data, calculate the fiat value of transfers using real-time price feeds, and compare it against a configurable threshold. For Ethereum and EVM chains, you can listen for native ETH transfers and specific ERC-20 Transfer events using libraries like ethers.js or viem.

Here's a basic conceptual flow in pseudocode:

code
1. Connect to a WebSocket endpoint (e.g., wss://mainnet.infura.io/ws/v3/<api-key>).
2. Subscribe to `newHeads` or `logs` for Transfer events.
3. For each new block, fetch transactions and event logs.
4. Decode log data to get `from`, `to`, and `value`.
5. Fetch current token price from an oracle (Chainlink, CoinGecko API).
6. If (token_amount * price) > THRESHOLD, format an alert.
7. Send alert via configured notification channel.

This process must be efficient to handle high-throughput chains without missing blocks.
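
Step 4 of the flow above can be sketched with plain Node.js built-ins. This is a minimal sketch, assuming the log arrives in the standard JSON-RPC log shape (`topics`, `data`); in practice ethers.js or viem would decode this for you.

```javascript
// Minimal sketch of step 4: decoding a raw ERC-20 Transfer log into
// from/to/value using only Node.js built-ins.

// keccak256("Transfer(address,address,uint256)") -- topic0 of every ERC-20 Transfer.
const TRANSFER_TOPIC =
  '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef';

function decodeTransferLog(log) {
  if (log.topics[0] !== TRANSFER_TOPIC) return null; // not a Transfer event
  return {
    // Indexed address topics are 32 bytes; the address is the last 20 bytes.
    from: '0x' + log.topics[1].slice(26),
    to: '0x' + log.topics[2].slice(26),
    value: BigInt(log.data), // raw token amount, not yet decimal-adjusted
  };
}
```

The decoded `value` is still in the token's base units; step 5 (the price lookup) is where it becomes a fiat figure.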

Choosing your data infrastructure is crucial. Running your own archive node offers maximum autonomy but requires significant resources. Services like Alchemy, Infura, or QuickNode provide managed WebSocket endpoints. For historical analysis and richer context, indexers like The Graph or Covalent allow querying aggregated data. Your alert logic can be extended to track specific smart contracts (e.g., DeFi protocol treasuries), monitor known addresses from sanction lists or exploiters, and even analyze transaction patterns to distinguish between simple transfers and complex contract interactions.

Beyond simple value thresholds, advanced systems incorporate on-chain analytics. This includes tracking net flow changes for exchanges (identifying potential buy/sell pressure), clustering addresses to identify entities, and analyzing transaction success/failure states. Integrating with platforms like Etherscan's API or Tenderly can provide immediate contextual data for a flagged transaction, such as the interacting contract's name or the wallet's transaction history. Setting up this system provides not just alerts, but a foundational framework for real-time on-chain surveillance and research.

prerequisites
FOUNDATION

Prerequisites and Architecture

Before building a whale transaction alert system, you need the right tools and a clear architectural blueprint. This section covers the essential components and how they interact.

A robust whale alert system is built on three core pillars: a reliable data source, a processing engine, and an alerting mechanism. For on-chain data, you'll need access to a blockchain node or a node provider API like Alchemy, Infura, or QuickNode. These services provide the raw transaction and block data you'll monitor. For processing logic, you'll use a backend runtime such as Node.js or Python. Finally, you need an alerting service like Discord Webhooks, Telegram Bots, or Twilio to notify users when a significant transaction occurs.

The system architecture follows a publish-subscribe model. Your application subscribes to new blocks from your node provider. As each block is received, you parse its transactions, filtering for those involving addresses of interest (your whale list) or exceeding a defined value threshold. This filtering logic is the core of your application. A common approach is to maintain a database (e.g., PostgreSQL or Redis) to store tracked addresses and thresholds, allowing for dynamic updates without redeploying code.

For Ethereum and EVM chains, you will primarily work with the eth_getBlockByNumber JSON-RPC call; passing true as its second parameter returns full transaction objects rather than just hashes. Each transaction object includes fields like from, to, value (in wei), and input data. Your code must convert the value from wei to a human-readable unit like ETH to apply your threshold check. Token transfers do not appear in the value field at all: an ERC-20 or ERC-721 transfer is encoded in the transaction's input data (and emitted as an event log), so you'll need to decode it using the token's ABI.
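
As a concrete instance of the wei conversion, here is a small sketch using BigInt (the threshold value is an example, not a recommendation):

```javascript
// Sketch: check a transaction's native value (hex-encoded wei, as returned
// by eth_getBlockByNumber) against an ETH-denominated whale threshold.
const WEI_PER_ETH = 10n ** 18n;

function exceedsEthThreshold(valueWeiHex, thresholdEth) {
  // Compare in wei with BigInt to avoid floating-point precision loss.
  return BigInt(valueWeiHex) >= BigInt(thresholdEth) * WEI_PER_ETH;
}
```

Comparing in wei rather than converting to a float first avoids precision loss on large values.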

Setting up the development environment requires initializing a project and installing dependencies. For a Node.js example, you would run npm init and install packages like web3.js or ethers.js for blockchain interaction, and axios for HTTP requests to your node provider. A basic script structure involves an infinite loop that polls for the latest block, processes it, and then sleeps for a few seconds before checking again, ensuring you don't miss any new data.
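
The poll-and-sleep loop described above can be sketched as follows. `fetchLatestBlockNumber` and `processBlock` are hypothetical stand-ins for your provider calls (e.g., via ethers.js); the interval is an example.

```javascript
// Sketch of the polling loop. fetchLatestBlockNumber and processBlock are
// hypothetical stand-ins for your node-provider calls.
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

// Pure helper: which block numbers still need processing. Tracking this
// explicitly ensures no block is skipped if several arrive between polls.
function blocksToProcess(lastProcessed, latest) {
  const pending = [];
  for (let b = lastProcessed + 1; b <= latest; b++) pending.push(b);
  return pending;
}

async function pollLoop(fetchLatestBlockNumber, processBlock, intervalMs = 4000) {
  let lastProcessed = await fetchLatestBlockNumber();
  while (true) {
    const latest = await fetchLatestBlockNumber();
    for (const b of blocksToProcess(lastProcessed, latest)) {
      await processBlock(b);
      lastProcessed = b;
    }
    await sleep(intervalMs);
  }
}
```

Tracking `lastProcessed` explicitly, rather than only fetching "the latest block", is what guarantees you do not miss data when two blocks land between polls.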

Security and reliability are critical. Your node provider API key must be kept secret using environment variables. Implement error handling for rate limits, network timeouts, and malformed data. For production systems, consider using a message queue (e.g., RabbitMQ) to decouple block fetching from transaction processing, making the system more resilient and scalable. This also allows you to replay messages if part of your pipeline fails.

Finally, define what constitutes a 'whale' for your use case. This could be a simple ETH value threshold (e.g., >100 ETH), a list of specific addresses known to be large holders, or a dynamic threshold based on the network's current gas price or transfer volume. Your architecture should be flexible enough to adjust these parameters, as a static threshold may generate too many or too few alerts as market conditions change.

ARCHITECTURE

Key System Components

Building a robust whale alert system requires integrating several core components. This section details the essential tools and concepts for monitoring, analyzing, and responding to significant on-chain transactions.

FOUNDATION

Step 1: Defining Whale Thresholds and Targets

The first and most critical step in building a whale transaction alert system is establishing precise, data-driven criteria for what constitutes a 'whale' and which transactions are worth monitoring.

A 'whale threshold' is the minimum transaction value that triggers an alert. This is not a one-size-fits-all number; it must be contextual to the specific blockchain and asset. For example, a $100,000 transfer can be highly significant for a small-cap token, while the same amount in a deep-liquidity asset like BTC or ETH is routine. Effective thresholds are often defined as a percentage of the asset's daily trading volume (e.g., 0.5% to 5%) or as a fixed, high-value amount in USD terms (e.g., $500k+). Using on-chain data from sources like Dune Analytics or The Graph, you can analyze historical transaction distributions to set a threshold that captures the top 0.1% of transfers by value, filtering out routine activity.
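
The "top 0.1% of transfers" approach can be sketched as a simple percentile calculation over a historical sample (e.g., values exported from Dune):

```javascript
// Sketch: derive a whale threshold from a sample of historical transfer
// values, keeping only the top slice by value.
function percentileThreshold(values, percentile) {
  const sorted = [...values].sort((a, b) => a - b);
  const idx = Math.min(sorted.length - 1, Math.floor(sorted.length * percentile));
  return sorted[idx];
}

// Top 0.1% of transfers => 99.9th percentile of the sample:
// const threshold = percentileThreshold(historicalUsdValues, 0.999);
```

Recomputing this periodically keeps the threshold calibrated as volumes drift, rather than hard-coding a number that goes stale.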

Beyond the raw value, you must define your monitoring targets. This involves specifying the exact smart contracts, token addresses, or wallet clusters you want to watch. Key targets include:

  • Centralized Exchange (CEX) Deposit Addresses: Large inflows to known exchange wallets (e.g., Binance, Coinbase) often precede market sales.
  • DeFi Protocol Treasuries & Governance Contracts: Large deposits or withdrawals can signal protocol investment or instability.
  • Known Entity Wallets: Tracking wallets associated with venture capital funds, foundations, or prominent individuals.
  • Newly Created Contracts: Large, immediate liquidity provisions to new pools can indicate a launch or potential scam.

To implement this programmatically, you'll structure your alert logic around these definitions. For instance, using the Ethers.js library, you can filter blockchain events based on value and recipient. Your system's core logic should check: if (transaction.value > WHALE_THRESHOLD_IN_WEI && TARGET_ADDRESSES.includes(transaction.to)) { triggerAlert(transaction) }. Continuously backtest your thresholds against market movements to calibrate for sensitivity, ensuring your alerts signal genuine market-moving activity rather than creating noise.
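
The inline check above can be fleshed out as a small sketch. The threshold and the watchlist entry are example values, and addresses are normalized to lowercase so checksummed and lowercase forms of the same address are not treated as different wallets:

```javascript
// Sketch of the core check: value threshold AND target-address match.
const WHALE_THRESHOLD_IN_WEI = 500n * 10n ** 18n; // example: 500 ETH
const TARGET_ADDRESSES = new Set(
  // Example watchlist entry; normalize casing once, at load time.
  ['0xAb5801a7D398351b8bE11C439e05C5B3259aeC9B'].map((a) => a.toLowerCase())
);

function shouldAlert(tx) {
  return (
    BigInt(tx.value) >= WHALE_THRESHOLD_IN_WEI &&
    TARGET_ADDRESSES.has((tx.to || '').toLowerCase()) // tx.to is null for contract creation
  );
}
```

The `tx.to || ''` guard matters because contract-creation transactions have a null recipient.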

TUTORIAL

Step 2: Ingesting Real-Time Data with Code

This guide walks through setting up a system to ingest and process real-time blockchain transaction data, focusing on identifying and alerting on large 'whale' transfers.

To build a whale alert system, you need a reliable source of real-time transaction data. While you could run your own node, using a specialized data provider like Chainscore is more efficient for high-throughput, filtered streams. Chainscore provides WebSocket and REST APIs that deliver parsed transaction data directly, allowing you to focus on logic instead of infrastructure. You'll need an API key, which you can obtain by signing up on the Chainscore Dashboard. For this tutorial, we'll use the WebSocket endpoint for live data ingestion.

The core of ingestion is establishing a persistent connection to the data stream and filtering for relevant transactions. We'll use the Chainscore WebSocket API to subscribe to a feed. The key is defining a precise filter to avoid processing irrelevant data. A whale transaction is typically defined by a large value transfer; for Ethereum, we might filter for transfers of 100+ ETH. The following Node.js code snippet establishes a connection and subscribes to a filtered stream for the Ethereum mainnet.

javascript
const WebSocket = require('ws');
const API_KEY = 'your_chainscore_api_key';

const ws = new WebSocket(`wss://api.chainscore.dev/v1/stream/ws?api_key=${API_KEY}`);

ws.on('open', function open() {
  const subscription = {
    jsonrpc: '2.0',
    id: 1,
    method: 'subscribe',
    params: {
      network: 'ethereum',
      event_type: 'transaction',
      filters: [
        { field: 'value', operator: 'gte', value: '100000000000000000000' } // 100 ETH in wei
      ]
    }
  };
  ws.send(JSON.stringify(subscription));
  console.log('Subscribed to 100+ ETH transactions.');
});

ws.on('message', (data) => {
  const event = JSON.parse(data);
  // Hand the parsed transaction to your processing logic (next step).
  console.log('Received:', event);
});

ws.on('error', (err) => console.error('WebSocket error:', err));

Once connected, your application will receive a stream of transaction objects. Each event contains structured data like the transaction hash, from and to addresses, value, and gas details. You must write an event handler to process this data. The immediate step is to parse the incoming JSON, extract the critical fields, and apply any additional logic, such as checking if the recipient is a known exchange address or calculating the USD value using a real-time price feed. This handler is where you prepare the data for the alerting logic covered in the next step.

For production systems, consider implementing reconnection logic, error handling, and data validation. WebSocket connections can drop; your code should automatically attempt to reconnect with exponential backoff. Furthermore, validate all incoming data against a schema before processing to ensure system resilience. Ingesting data is the foundational step—ensuring a clean, reliable stream of high-value transactions enables you to build robust monitoring and alerting features on top of it.
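
The reconnection strategy can be sketched as follows. `connectFn` is a hypothetical function that opens the WebSocket and resolves when the connection drops (rejecting on failure to connect):

```javascript
// Sketch of reconnection with exponential backoff and jitter.
function backoffDelayMs(attempt, baseMs = 1000, maxMs = 60000) {
  // 1s, 2s, 4s, ... capped at maxMs.
  return Math.min(maxMs, baseMs * 2 ** attempt);
}

async function connectWithRetry(connectFn) {
  let attempt = 0;
  while (true) {
    try {
      await connectFn();
      attempt = 0; // connection succeeded, then dropped: reset the backoff
    } catch (err) {
      attempt += 1; // failed to connect: back off further next time
    }
    const delay = backoffDelayMs(attempt) * (0.5 + Math.random() / 2); // jitter
    await new Promise((r) => setTimeout(r, delay));
  }
}
```

The jitter factor spreads reconnect attempts out so many instances of your service do not all hammer the endpoint at the same instant after an outage.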

ALERT TRIGGER COMPARISON

Filtering Logic: Meaningful Actions vs. Noise

Comparison of transaction types to help configure a whale alert system that focuses on strategic moves instead of routine activity.

Transaction type and recommended alert trigger:

  • Large DEX Swap (e.g., Uniswap, Curve): alert on swaps > $1M or > 30% of pool liquidity
  • CEX Deposit/Withdrawal: alert only on withdrawals to a new, non-CEX address
  • Governance Vote Casting: alert on votes on major proposals cast with > 5% of supply
  • NFT Purchase (High-Value): alert on a single NFT purchase > 50 ETH or a floor-breaking bid
  • Token Transfer to New Wallet: ignore unless part of a multi-sig setup or followed by delegation
  • Liquidity Provision/Removal: alert on removals > $5M; ignore standard LP farming deposits
  • Bridge Interaction (e.g., Across, Wormhole): alert on cross-chain transfers > $2M to an L2 or new chain
  • DeFi Position Change (e.g., Aave, Compound): alert on new borrows > $10M; ignore rate-optimizing repay/withdraw cycles

CORE LOGIC

Step 3: Implementing Processing and Filtering Logic

This step transforms raw blockchain data into actionable alerts by defining the rules that identify significant transactions.

The core of your alert system is the processing and filtering logic. This code block sits between the data stream from your indexer and the final alert notification. Its primary function is to analyze incoming transaction data against a set of user-defined rules to determine if an event qualifies as a "whale alert." You will implement this logic in a serverless function (like an AWS Lambda or Vercel Edge Function) or within a dedicated backend service. The function is triggered each time a new transaction block is processed by your data pipeline.

Define your filtering criteria using clear, quantifiable thresholds. Common parameters include transaction value in USD (e.g., value_usd > 1000000), token type (e.g., token_address == '0xA0b869...' for USDC), and wallet addresses (e.g., monitoring specific known entity wallets). You can also filter by transaction type, such as large DEX swaps, NFT purchases, or bridge withdrawals. Use a configuration file or environment variables to make these thresholds easily adjustable without redeploying code.
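
A minimal sketch of loading those thresholds from environment variables, so they can be changed without redeploying (the variable names here are assumptions, not a fixed convention):

```javascript
// Sketch: read adjustable alert parameters from the environment.
function loadAlertConfig(env = process.env) {
  return {
    // Minimum USD value that qualifies as a whale alert.
    minUsdValue: Number(env.MIN_ALERT_USD ?? 1_000_000),
    // Comma-separated token contract addresses to watch, normalized to lowercase.
    watchedTokens: (env.WATCHED_TOKENS ?? '')
      .split(',')
      .filter(Boolean)
      .map((a) => a.toLowerCase()),
  };
}
```

Calling `loadAlertConfig()` at startup (or on a reload signal) keeps the filter layer decoupled from the deployment.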

Here is a simplified JavaScript example of a filtering function using ethers.js to parse transaction logs and check value thresholds:

javascript
// NOTE: getTokenPrice and sendAlert are assumed helper functions
// (a price-feed lookup and a notifier, respectively).
async function processTransaction(tx) {
  // tx.value is assumed to be the decimal-adjusted token amount.
  const price = await getTokenPrice(tx.tokenAddress); // USD price per token
  const usdValue = tx.value * price;
  const MIN_ALERT_VALUE = 1000000; // $1M USD

  if (usdValue >= MIN_ALERT_VALUE) {
    await sendAlert({
      txHash: tx.hash,
      from: tx.from,
      to: tx.to,
      value: usdValue,
      token: tx.tokenSymbol
    });
  }
}

This function calculates the USD value and triggers an alert if it exceeds the defined minimum.

To reduce noise and false positives, implement additional logic layers. This includes address allowlists/denylists (ignoring known exchange or contract addresses), time-based cooldowns (preventing repeated alerts from the same wallet within a short period), and transaction bundling (identifying if multiple large transfers are part of a single intent). For DeFi-specific alerts, you may need to decode complex contract interactions using the transaction's input data and the protocol's ABI to understand the exact action, such as a liquidation or a large liquidity provision.
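
Two of those layers, a denylist and a per-wallet cooldown, can be sketched together. Timestamps are passed in explicitly so the logic stays testable; the cooldown window and denylist contents are example values:

```javascript
// Sketch: suppress alerts from denylisted addresses and repeated alerts
// from the same wallet within a cooldown window.
class AlertGate {
  constructor(cooldownMs, denylist = new Set()) {
    this.cooldownMs = cooldownMs;
    this.denylist = denylist;          // e.g., known exchange hot wallets
    this.lastAlertAt = new Map();      // wallet -> timestamp of last alert
  }

  allow(wallet, nowMs) {
    const addr = wallet.toLowerCase();
    if (this.denylist.has(addr)) return false;
    const last = this.lastAlertAt.get(addr);
    if (last !== undefined && nowMs - last < this.cooldownMs) return false;
    this.lastAlertAt.set(addr, nowMs); // record this alert for future cooldown checks
    return true;
  }
}
```

In production the `lastAlertAt` map would live in Redis or similar so cooldowns survive restarts and are shared across instances.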

Finally, structure the output of your filter into a clean, standardized alert object. This object should contain all necessary data for your notification service, such as a unique alert ID, timestamp, network (Ethereum, Arbitrum, etc.), the involved addresses, the asset amount and value, and a link to a block explorer like Etherscan. This standardized format ensures consistency whether you're sending alerts via Discord webhook, Telegram bot, email, or storing them in a database for historical analysis.
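
A sketch of that standardized alert object, with the fields listed above. The Etherscan URL pattern is real; other networks need their own explorer base URL:

```javascript
// Sketch: build a standardized alert object for any downstream channel.
const EXPLORERS = { ethereum: 'https://etherscan.io/tx/' };

function buildAlert(tx, network, usdValue, nowIso) {
  return {
    id: `${network}:${tx.hash}`,   // unique per network + tx hash
    timestamp: nowIso,
    network,
    from: tx.from,
    to: tx.to,
    amount: tx.amount,             // asset amount, decimal-adjusted
    valueUsd: usdValue,
    explorerUrl: (EXPLORERS[network] ?? '') + tx.hash,
  };
}
```

Because every channel (Discord, Telegram, email, database) consumes the same object, formatting differences live entirely in the delivery layer.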

WHALE ALERT SYSTEM

Step 4: Integrating Alerting and Notification Systems

Configure real-time notifications to track significant on-chain movements and potential market-moving events.

A whale transaction alert system monitors blockchain data for large-value transfers, which can signal market sentiment shifts, fund movements to exchanges, or accumulation by large holders. You can build this by setting up a listener for specific event types on a node or using a data indexing service like The Graph. The core logic involves defining a threshold (e.g., 1,000 ETH or $1M USD equivalent) and filtering transactions or token transfers that exceed it. Services like Chainscore's Real-Time Alerts API provide a managed solution for this, allowing you to define custom triggers without running infrastructure.

To implement a basic system, you need to connect to a WebSocket endpoint from a node provider like Alchemy, Infura, or QuickNode. Listen for pending or confirmed transactions on relevant addresses or for specific ERC-20 tokens. For each transaction, decode the input data to check the transfer value and compare it against your threshold. Here's a simplified Node.js snippet using ethers.js to listen for large ETH transfers:

javascript
// Uses the ethers v5 API (v6 renames these, e.g. ethers.WebSocketProvider
// and ethers.parseEther).
const { ethers } = require('ethers');
const provider = new ethers.providers.WebSocketProvider('YOUR_WS_URL');
const THRESHOLD = ethers.utils.parseEther('1000'); // 1000 ETH

provider.on('block', async (blockNumber) => {
  const block = await provider.getBlockWithTransactions(blockNumber);
  block.transactions.forEach(tx => {
    if (tx.value.gte(THRESHOLD)) {
      console.log(`Whale TX: ${tx.hash} Value: ${ethers.utils.formatEther(tx.value)} ETH`);
      // Trigger notification
    }
  });
});

For more complex tracking—such as following specific whale wallets, monitoring DeFi withdrawals, or calculating net flow to exchanges—you'll need to aggregate data across multiple transactions and tokens. This requires maintaining a state of address balances and using event logs to track ERC-20 transfers, not just native coin transactions. Consider using a dedicated alerting platform to handle the data pipeline, deduplication, and delivery. Key notification channels to integrate include Discord webhooks, Telegram bots, Slack apps, and email. Ensure your system includes rate limiting and deduplication logic to avoid alert fatigue from multiple related transactions in the same block.
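
The exchange net-flow calculation mentioned above can be sketched as a pure aggregation over a batch of decoded transfers and a set of known exchange addresses:

```javascript
// Sketch: net flow to exchanges over a batch of decoded transfers.
// Positive result = more deposited than withdrawn (potential sell pressure);
// negative = net withdrawals from exchanges.
function netExchangeFlow(transfers, exchangeAddresses) {
  let net = 0n;
  for (const t of transfers) {
    if (exchangeAddresses.has(t.to.toLowerCase())) net += t.value;   // inflow
    if (exchangeAddresses.has(t.from.toLowerCase())) net -= t.value; // outflow
  }
  return net;
}
```

Running this per token per time window, rather than per transaction, is what turns isolated transfers into a flow signal.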

When designing your alert logic, focus on actionable intelligence. Instead of just reporting a large transfer, enrich the alert with context:

  • Is the sending address a known exchange cold wallet?
  • Is the receiving address a new contract or a decentralized exchange router?
  • What is the USD value of the transfer at current prices?

You can use Chainscore's Wallet API to fetch wallet labels and transaction history for this context. This transforms a raw data point into a signal that can inform trading, risk management, or investigative decisions.

Finally, test your system thoroughly in a development environment before deploying to production. Use testnets to simulate large transactions and verify your notification triggers. Monitor the system's performance and false-positive rate, adjusting thresholds and filters as needed. A well-tuned whale alert system provides a significant informational edge, but it must be reliable and context-aware to be truly valuable.

WHALE ALERT SYSTEM

Step 5: Deployment, Scaling, and Cost Considerations

This section covers the final steps for launching your on-chain monitoring system, focusing on production deployment, handling high transaction volumes, and managing operational costs.

Deploying your whale alert system to a production environment requires a robust infrastructure setup. For a Node.js-based listener, you can containerize the application using Docker and deploy it on a cloud service like AWS ECS, Google Cloud Run, or a dedicated server. The critical component is ensuring your node provider connection is stable and redundant; consider using a service like Chainscore's RPC Load Balancer to automatically fail over between providers like Alchemy, Infura, and QuickNode to prevent downtime. Set up process management with PM2 to automatically restart the application on crashes and implement comprehensive logging to a service like Datadog or Grafana Loki for monitoring.

As transaction volume increases, your system must scale efficiently. The primary bottleneck is often the RPC provider's rate limits and WebSocket connection stability. To scale horizontally, you can shard monitoring by specific criteria: deploy separate listener instances for different token contracts (e.g., one for USDC, one for WETH) or chain segments. Implement a message queue like RabbitMQ or Redis to decouple the event detection from the alert logic, allowing you to scale alert processors independently. For cost-effective scaling, use serverless functions (AWS Lambda, GCP Cloud Functions) triggered by your core listener to send alerts, as they scale to zero when idle.

Operational costs are driven by RPC requests, data storage, and alert delivery. Optimize RPC usage by batching eth_getLogs calls for multiple addresses and using the fromBlock and toBlock parameters efficiently to avoid redundant historical scans. For persistent data, a managed database like Supabase or AWS DynamoDB is cost-effective for storing alert history and user preferences. Alert delivery via SMS (Twilio) or phone calls can become expensive; implement cost controls by setting daily limits per user and prioritizing email or push notifications (via services like OneSignal) for less critical thresholds. Regularly audit your logs to identify and eliminate inefficient polling patterns or unused alert rules.
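
Using fromBlock/toBlock efficiently usually means chunking a historical scan into bounded windows so each eth_getLogs call stays under the provider's range limit. A sketch of the chunking helper (the 1000-block size is an example; limits vary by provider):

```javascript
// Sketch: split a historical scan into bounded fromBlock/toBlock windows.
function chunkBlockRanges(fromBlock, toBlock, chunkSize) {
  const ranges = [];
  for (let start = fromBlock; start <= toBlock; start += chunkSize) {
    ranges.push({
      fromBlock: start,
      toBlock: Math.min(start + chunkSize - 1, toBlock),
    });
  }
  return ranges;
}
```

Each range then becomes one eth_getLogs request, and the loop can checkpoint after each chunk so an interrupted scan never repeats work.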

WHALE ALERT SYSTEM

Frequently Asked Questions

Common technical questions and troubleshooting for developers building on-chain monitoring and alert systems for large transactions.

What is a whale transaction alert system and how does it work?

A whale transaction alert system is a monitoring tool that tracks and notifies users of large-value transfers on a blockchain. It works by subscribing to real-time blockchain data via a node or RPC provider, parsing transaction data, and applying filters to detect transfers exceeding a defined threshold (e.g., 1000 ETH).

Core components typically include:

  • Data Source: A connection to an archive node or service like Chainscore, Alchemy, or QuickNode.
  • Event Listener: A script using WebSocket subscriptions (e.g., eth_subscribe for newPendingTransactions and newHeads) to capture live data.
  • Filtering Logic: Code that decodes transaction inputs, checks recipient/sender addresses against a watchlist, and evaluates transfer value against your threshold.
  • Alert Mechanism: An integration (e.g., Discord webhook, Telegram Bot API, SMS) to dispatch notifications.
RECAP & FORWARD PATH

Conclusion and Next Steps

You have now built a functional system to monitor and alert on large on-chain transactions. This guide covered the core components from data ingestion to notification delivery.

Your alert system's foundation is a robust data pipeline. By using a WebSocket connection to a node provider like Alchemy or QuickNode, you capture real-time mempool and on-chain data. The filtering logic you implemented (checking for transaction value thresholds, specific token addresses, or interactions with known whale wallets) transforms this raw data stream into actionable intelligence. Remember to handle chain reorganizations, and throttle your own request volume so your RPC provider does not rate-limit you.
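
Reorg handling often reduces to a confirmation-depth check: only fire an alert once a transaction is buried under enough blocks that a reorganization is unlikely. A minimal sketch (the 12-block default is a common heuristic, not a protocol guarantee):

```javascript
// Sketch: gate alerts on confirmation depth to survive shallow reorgs.
function isConfirmed(txBlockNumber, latestBlockNumber, depth = 12) {
  return latestBlockNumber - txBlockNumber >= depth;
}
```

Pending (unconfirmed) detections can still be logged immediately and promoted to alerts once they pass this check.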

To extend this system, consider integrating more sophisticated analysis. You could track wallet clustering to identify entities controlling multiple addresses, implement machine learning models to detect anomalous spending patterns, or cross-reference transactions with off-chain data like exchange inflows/outflows. Adding support for more chains (e.g., Solana via a WebSocket to Helius, or Arbitrum via its native RPC) will make your monitoring cross-chain. Always store filtered transactions in a database like PostgreSQL or TimescaleDB for historical analysis and alert auditing.

The next practical step is to harden your deployment. Containerize your application using Docker for consistent environments and deploy it on a reliable cloud service or dedicated server. Implement proper logging with tools like Winston or Pino and set up monitoring for the health of your WebSocket connection and queue processor. Finally, review the security of your notification endpoints; if using a webhook, validate incoming requests, and ensure any API keys for services like Telegram or Discord are stored securely using environment variables or a secrets manager.