How to Build a Regulatory Reporting Dashboard for Cross-Border Transactions

A technical guide on building an automated dashboard to aggregate, visualize, and report cross-border security token transactions for regulators like the SEC and under MiCA.
INTRODUCTION

A technical guide for developers on constructing a dashboard to monitor and report blockchain-based cross-border payments for compliance with regulations like the Travel Rule.

Cross-border transactions on blockchain networks like Ethereum, Solana, or Polygon introduce significant regulatory complexity. Unlike traditional finance, where a single bank handles reporting, decentralized systems require proactive monitoring. Key regulations include the Financial Action Task Force (FATF) Travel Rule (Recommendation 16), which mandates that Virtual Asset Service Providers (VASPs) share sender and recipient information for transactions over a certain threshold (e.g., $/€1000). Building a dashboard to automate this reporting is critical for any compliant exchange, wallet provider, or financial institution operating in the space.

The core technical challenge is aggregating and enriching raw blockchain data. A basic dashboard must ingest transaction data from node providers (e.g., Alchemy, QuickNode) or indexers (The Graph), then correlate wallet addresses with real-world identity information from your internal KYC system or external VASP directories like the Travel Rule Universal Solution Technology (TRUST) or open protocols like OpenVASP. This involves parsing transaction logs, identifying cross-chain bridge interactions (e.g., via Wormhole, LayerZero), and applying business logic to determine reportable events.

Your dashboard's architecture should separate data ingestion, processing, and presentation. A common stack uses a backend service (in Python/Node.js) to listen for on-chain events, a database (PostgreSQL, TimescaleDB) to store enriched transactions, and a frontend framework (React, Vue.js) for visualization. Critical data points to display per transaction include: transaction hash, amount, asset type, sender/VASP, recipient/VASP, timestamp, and compliance status. You must also log the proof of submission to regulatory bodies or counterparty VASPs.
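
As an illustration of the record your processing layer might hand to the frontend, a single enriched transaction could carry these fields (the names and values below are illustrative, not a fixed schema):

javascript
// Illustrative shape of one enriched, report-ready transaction record (field names are assumptions).
const enrichedTransaction = {
  txHash: "0x9f2e...",                  // on-chain transaction hash
  asset: "USDC",
  amount: "2500.00",                    // string to preserve decimal precision
  fiatValue: { currency: "EUR", amount: "2310.45" },
  originator: { address: "0xAbc...", vasp: "Example VASP A", kycRef: "cust-1029" },
  beneficiary: { address: "0xDef...", vasp: "Example VASP B", kycRef: null },
  timestamp: "2024-03-14T09:21:07Z",
  complianceStatus: "REPORTABLE",       // e.g., NOT_REQUIRED | PENDING | REPORTABLE | SUBMITTED
  submissionProof: null                 // filled in once the report is filed with the regulator or counterparty VASP
};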

For developers, implementing this logic means handling edge cases. For example, a user might send funds through a decentralized exchange (DEX) aggregator like 1inch, which routes through multiple pools; your system must trace the ultimate source and destination. Monitoring tools such as OpenZeppelin Defender can help watch for suspicious patterns. You will also need code for fetching and parsing transactions. Here's a simplified Node.js example using ethers.js to fetch transaction details and filter the receipt logs for transfer events:

javascript
import { ethers } from "ethers"; // ethers v6
const provider = new ethers.JsonRpcProvider(process.env.RPC_URL);
const tx = await provider.getTransaction("0x...");
const receipt = await provider.getTransactionReceipt(tx.hash);
// Keep only logs whose first topic matches the ERC-20 Transfer event signature
const transfers = receipt.logs.filter((l) => l.topics[0] === ethers.id("Transfer(address,address,uint256)"));

Finally, the dashboard must generate standardized reports, such as the IVMS 101 (InterVASP Messaging Standard) data format, for sharing with other VASPs. Automation is key: set up alerts for threshold breaches and schedule daily summary reports. Integrating with secure messaging channels or dedicated Travel Rule solution APIs is the final step. By building this tool, you move from manual, error-prone processes to a scalable, auditable compliance workflow that can adapt to evolving regulations across different jurisdictions.

GETTING STARTED

Prerequisites and Tech Stack

Before building a dashboard for cross-border crypto transactions, you need the right tools and data sources. This guide outlines the essential software, APIs, and architectural decisions required for a robust reporting system.

The foundation of any regulatory reporting dashboard is reliable data ingestion. You'll need to connect to multiple blockchains and centralized exchanges (CEXs) to capture transaction flows. For on-chain data, use providers like The Graph for indexed historical data or run your own archive node with Erigon or Geth. For real-time monitoring, WebSocket connections to node providers like Alchemy or Infura are essential. Off-chain data from CEXs can be accessed via their official REST APIs, which provide detailed transaction and user KYC information necessary for compliance.
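
As a minimal sketch of the real-time path, an ethers.js listener can subscribe over WebSocket and pull every ERC-20 Transfer log per new block; the WS_RPC_URL environment variable is assumed to hold an endpoint from a provider such as Alchemy or Infura:

javascript
import { ethers } from "ethers";

// Assumes a WebSocket endpoint (e.g., from Alchemy or Infura) in WS_RPC_URL.
const provider = new ethers.WebSocketProvider(process.env.WS_RPC_URL);
const transferTopic = ethers.id("Transfer(address,address,uint256)");

provider.on("block", async (blockNumber) => {
  // Fetch all ERC-20 Transfer logs in the new block; downstream services filter for monitored addresses.
  const logs = await provider.getLogs({ fromBlock: blockNumber, toBlock: blockNumber, topics: [transferTopic] });
  for (const log of logs) {
    console.log(`block=${blockNumber} token=${log.address} tx=${log.transactionHash}`);
  }
});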

Your backend tech stack must handle high-throughput, time-series data. A common architecture uses Node.js or Python (with frameworks like FastAPI) for API orchestration, fetching data from the sources above. This data is then processed and stored in a time-series database like TimescaleDB (a PostgreSQL extension) or InfluxDB, which are optimized for the sequential, timestamped nature of financial transactions. For complex aggregation and reporting logic, consider using Apache Kafka or RabbitMQ to create a resilient event-driven pipeline that decouples data ingestion from analysis.
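
A sketch of the ingestion side of that event-driven pipeline using the kafkajs client; the broker address and topic name are placeholders for your own deployment:

javascript
import { Kafka } from "kafkajs";

// Broker address and topic name are placeholders for your own deployment.
const kafka = new Kafka({ clientId: "ingestion-worker", brokers: ["localhost:9092"] });
const producer = kafka.producer();
await producer.connect();

export async function publishRawTransaction(tx) {
  // Key by transaction hash so all events for one transaction land in the same partition.
  await producer.send({
    topic: "raw-transactions",
    messages: [{ key: tx.hash, value: JSON.stringify(tx) }],
  });
}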

The frontend dashboard needs to present complex data clearly. A modern framework like React or Vue.js is recommended, paired with a powerful charting library such as Apache ECharts or Recharts. For the core visualization of transaction graphs, geographic flows, and time-series trends, D3.js offers the most control. State management with Redux or Zustand will help manage the application's complex data layer. Ensure your frontend can securely authenticate with your backend API, typically using JWT tokens, to protect sensitive financial data.

Compliance reporting requires mapping raw transaction data to regulatory schemas. You will need to implement logic that tags transactions with purposes (e.g., trade, remittance, investment) and flags those exceeding thresholds like the FATF Travel Rule (often $/€1000). This involves integrating with blockchain analytics tools like Chainalysis or TRM Labs via their APIs to assess risk scores and wallet clustering. Your code must also handle multiple fiat currency conversions using reliable forex rate APIs to calculate equivalent values at the time of each transaction.
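
A hedged sketch of that tagging logic, assuming a getFxRate helper backed by your forex/price API and a EUR 1,000 threshold that you would adjust per jurisdiction:

javascript
const TRAVEL_RULE_THRESHOLD_EUR = 1000; // FATF-recommended threshold; adjust per jurisdiction

// getFxRate is a placeholder for your forex/price API client (assumption).
async function tagTransaction(tx, getFxRate) {
  const rate = await getFxRate(tx.asset, "EUR", tx.timestamp); // rate at the time of the transaction
  const valueEur = Number(tx.amount) * rate;
  return {
    ...tx,
    valueEur,
    purpose: tx.declaredPurpose ?? "unspecified",             // e.g., trade, remittance, investment
    travelRuleApplies: valueEur >= TRAVEL_RULE_THRESHOLD_EUR, // flag for downstream reporting
  };
}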

Finally, the operational environment is critical. Use Docker to containerize your services for consistency. Orchestrate them with Kubernetes or Docker Compose. Implement rigorous logging with the ELK Stack (Elasticsearch, Logstash, Kibana) and monitoring with Prometheus and Grafana. Since you're handling sensitive data, security is paramount: encrypt data at rest and in transit, use secure secret management (e.g., HashiCorp Vault), and ensure all infrastructure complies with relevant standards like SOC 2. Start by building a minimal viable pipeline for one blockchain (e.g., Ethereum) and one exchange before scaling.

REPORTING INFRASTRUCTURE

Core System Components

A regulatory dashboard requires specialized infrastructure to aggregate, analyze, and report on cross-chain transaction data. These are the foundational components you need to build.

05

Audit & Data Provenance Module

Essential for regulatory examinations. This module provides an immutable record of the reporting process.

  • Data Lineage: Tracks the origin of every data point in a report back to the on-chain block and transaction.
  • Process Logging: Records every action by the rules engine and any manual overrides by compliance staff.
  • Snapshot Integrity: Uses cryptographic hashes (e.g., Merkle roots) to prove the data state at the time of report generation has not been altered.

This is often built using immutable logging databases or by anchoring periodic state hashes onto a public blockchain like Ethereum for timestamping and verification.

06

Jurisdictional Logic Mapper

Cross-border transactions involve multiple regulatory regimes. This component manages the mapping of transaction attributes to applicable rules.

  • VASP Identification: Determines if a counterparty is a regulated Virtual Asset Service Provider (VASP) using directories like the Travel Rule Universal Solution Technology (TRUST) or national lists.
  • Rule Set Selection: Applies the stricter rule when transaction endpoints fall under different jurisdictions (commonly the "sender's jurisdiction" rule).
  • Geo-Location & IP Analysis: Helps infer jurisdiction for non-VASP counterparties using auxiliary data.

Maintaining an accurate, up-to-date database of global VASP licenses and jurisdictional rules is the primary challenge here.
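
One illustrative way to implement the rule-set selection described above is to look up both endpoints and apply the stricter regime; the ruleSets table below is a stand-in for your jurisdiction database, and its values are examples, not legal guidance:

javascript
// Illustrative jurisdiction rule table (values are examples, not legal guidance).
const ruleSets = {
  EU: { reportingThresholdEur: 1000, requiresTravelRule: true },
  US: { reportingThresholdEur: 3000, requiresTravelRule: true },
  UNKNOWN: { reportingThresholdEur: 0, requiresTravelRule: true }, // unknown jurisdiction: treat as strictest
};

function selectRuleSet(senderJurisdiction, recipientJurisdiction) {
  const a = ruleSets[senderJurisdiction] ?? ruleSets.UNKNOWN;
  const b = ruleSets[recipientJurisdiction] ?? ruleSets.UNKNOWN;
  // Apply the stricter regime: the lower threshold wins, and the Travel Rule applies if either side requires it.
  return {
    reportingThresholdEur: Math.min(a.reportingThresholdEur, b.reportingThresholdEur),
    requiresTravelRule: a.requiresTravelRule || b.requiresTravelRule,
  };
}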

SYSTEM ARCHITECTURE AND DATA FLOW

Architecture Overview and Data Flow

This guide outlines the technical architecture for a dashboard that aggregates, analyzes, and reports on cross-chain and cross-border crypto transactions to meet regulatory requirements like the EU's MiCA or the FATF's Travel Rule.

A robust reporting dashboard requires a modular architecture that separates data ingestion, processing, storage, and presentation. The core components are: a data ingestion layer that pulls raw transaction data from blockchains and off-chain sources; a normalization and enrichment engine that standardizes this data and attaches relevant metadata (e.g., identifying VASPs, calculating fiat values); a compliance rules engine that applies jurisdictional logic to flag reportable events; and a secure API and frontend for visualization and report generation. This separation of concerns ensures scalability and maintainability as regulatory frameworks evolve.

The data flow begins with on-chain ingestion. You'll need to run archive nodes or use indexing services like Chainscore or The Graph to index transactions from the relevant blockchains (Ethereum, Solana, etc.). For cross-chain activity, you must also integrate with major bridge and swap protocols (e.g., Wormhole, Axelar, Uniswap). The key is capturing not just the transaction hash and amount, but the sender/receiver addresses, the smart contract interacted with, and any memo fields that may contain Travel Rule information. This raw data is streamed into a message queue (e.g., Apache Kafka, Amazon Kinesis) for asynchronous processing.

Next, the normalization engine processes this stream. It converts all token amounts to a standard unit (e.g., wei to ETH), uses price oracles (Chainlink, Pyth) to attach fiat values at the time of the transaction, and enriches addresses using on-chain analytics (e.g., TRM Labs, Chainalysis) or internal databases to identify Virtual Asset Service Providers (VASPs) or high-risk wallets. A critical step is entity resolution, where you link multiple wallet addresses to a single user or institution based on on-chain heuristics or KYC data, which is essential for accurate reporting thresholds.
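
A minimal sketch of that normalization step with ethers.js utilities; getSpotPrice stands in for your oracle or price-API client:

javascript
import { ethers } from "ethers";

// Normalize a raw ERC-20 Transfer log: convert raw units and attach a fiat value at the block time.
// getSpotPrice(symbol, blockNumber) is a placeholder for your oracle or price-feed client (assumption).
async function normalizeTransfer(log, token, getSpotPrice) {
  const rawAmount = ethers.toBigInt(log.data);                        // unindexed `value` field of the event
  const amount = ethers.formatUnits(rawAmount, token.decimals);       // e.g., 6 decimals for USDC, 18 for ETH
  const priceUsd = await getSpotPrice(token.symbol, log.blockNumber); // price as of the transaction's block
  return {
    txHash: log.transactionHash,
    asset: token.symbol,
    amount,
    valueUsd: Number(amount) * priceUsd,
  };
}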

The processed data is then passed to the compliance rules engine. This is a configurable module where you codify regulatory logic. For instance, you might define a rule: "Flag any cross-border transaction over €1,000 where the counterparty VASP is not in our trusted registry." The engine evaluates each transaction against these rules, generating alerts and creating structured report objects that include all required fields specified by regulations like the Travel Rule (originator/beneficiary info, transaction hash, asset, amount). These reportable events are stored in a dedicated database (e.g., PostgreSQL) with strong audit trails.
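
The quoted rule could be codified as a plain predicate evaluated against each enriched transaction, as in this sketch (the field names and the trustedVasps set are assumptions about your data model):

javascript
// Each rule is a named predicate over an enriched transaction; matching rules produce alerts and report objects.
const rules = [
  {
    id: "travel-rule-untrusted-vasp",
    description: "Cross-border transfer over EUR 1,000 to a VASP outside our trusted registry",
    applies: (tx, ctx) =>
      tx.isCrossBorder &&
      tx.valueEur > 1000 &&
      !ctx.trustedVasps.has(tx.beneficiary.vaspId),
  },
];

function evaluate(tx, ctx) {
  // Return every rule the transaction violates; an empty array means nothing is reportable.
  return rules.filter((rule) => rule.applies(tx, ctx)).map((rule) => ({ ruleId: rule.id, txHash: tx.txHash }));
}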

Finally, the presentation layer exposes this data via a secure REST or GraphQL API to a React/Vue.js dashboard. The frontend should provide real-time monitoring of transaction flows, dashboards showing report status, and tools to generate and submit standardized reports (like IVMS 101 data files) to regulators or counterparty VASPs. Security is paramount: implement strict role-based access control (RBAC), audit logging for all data accesses, and end-to-end encryption for sensitive PII. The entire system should be deployable via infrastructure-as-code (Terraform, Pulumi) on cloud platforms or on-premise.

To start building, use open-source tools like Apache Superset or Grafana for initial visualization, and leverage Celery or Apache Airflow for orchestrating data pipelines. The major challenge is maintaining data consistency across chains and keeping enrichment databases current. A practical first step is to prototype the ingestion and normalization layer for a single chain (such as the Ethereum Sepolia testnet) using the ethers.js library and a simple Node.js service, proving the data flow before scaling to a multi-chain, production-ready system.

DATA PIPELINE FOUNDATION

Step 1: Ingesting On-Chain and Off-Chain Data

The first step in building a regulatory reporting dashboard is establishing a robust data ingestion pipeline. This involves aggregating raw transaction data from both blockchain networks and traditional financial systems into a unified, queryable format.

A regulatory reporting dashboard requires a holistic view of cross-border capital flows. This means ingesting data from two distinct sources: on-chain data from public blockchains like Ethereum, Solana, or Polygon, and off-chain data from traditional banking APIs, payment networks such as SWIFT, and centralized exchanges. On-chain data is transparent and immutable but requires parsing raw transaction logs and smart contract events. Off-chain data is often private, structured, and accessed via secure APIs, but its format and availability vary by institution.

For on-chain ingestion, you need to connect to blockchain nodes. Services like Alchemy, Infura, or QuickNode provide reliable RPC endpoints. Using a library like ethers.js or web3.py, you can listen for specific events or poll for transactions related to your monitored addresses. A critical task is decoding event logs using a contract's Application Binary Interface (ABI) to transform hexadecimal data into human-readable information like token amounts, sender, and receiver addresses.
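
For example, ethers.js can decode a raw Transfer log into named fields from a minimal ABI fragment; extend the ABI for the other events your monitored contracts emit:

javascript
import { ethers } from "ethers";

// Minimal ABI fragment for decoding ERC-20 Transfer events.
const erc20Abi = ["event Transfer(address indexed from, address indexed to, uint256 value)"];
const iface = new ethers.Interface(erc20Abi);

function decodeTransfer(log) {
  const parsed = iface.parseLog({ topics: log.topics, data: log.data });
  if (!parsed) return null; // log is not an ERC-20 Transfer event
  const { from, to, value } = parsed.args;
  return { token: log.address, from, to, value: value.toString() };
}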

Simultaneously, you must ingest off-chain data. This typically involves setting up secure connections to banking APIs (using protocols like OFX or modern REST APIs) and exchange APIs (like Coinbase or Binance). Data here includes fiat transaction records, KYC information, and internal ledger entries. The key challenge is normalization—mapping diverse data schemas into a common model that can correlate an on-chain USDT transfer with an off-chain bank deposit.

To manage this pipeline, consider using a workflow orchestrator like Apache Airflow or Prefect. You can create directed acyclic graphs (DAGs) that schedule and monitor data extraction jobs. For example, one task might fetch the latest Ethereum blocks, another might call a bank's API for yesterday's transactions, and a third would load the cleaned data into a data warehouse like Google BigQuery or Snowflake for analysis.

Always implement idempotent ingestion logic to handle failures and avoid duplicate records. Use incremental extraction based on block numbers or timestamps rather than full historical loads. For scalability, consider using a streaming approach with services like Kafka or Amazon Kinesis for high-volume on-chain data, allowing for near-real-time reporting, which is often a regulatory requirement for monitoring large transactions.
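
A hedged sketch of incremental, idempotent extraction: track a per-chain cursor, upsert by transaction hash, and advance the cursor only after a block is fully stored. The db helpers (getCursor, upsertTransaction, setCursor) are assumptions about your storage layer:

javascript
// Incremental, idempotent ingestion loop (sketch). `db` is assumed to expose getCursor,
// upsertTransaction, and setCursor; re-running after a crash never creates duplicates.
async function ingestNewBlocks(provider, db) {
  const lastProcessed = await db.getCursor("ethereum");
  const head = await provider.getBlockNumber();
  for (let n = lastProcessed + 1; n <= head; n++) {
    const block = await provider.getBlock(n, true); // true = prefetch full transaction objects
    for (const tx of block.prefetchedTransactions) {
      // Upsert keyed on the transaction hash so repeated runs are harmless.
      await db.upsertTransaction({ hash: tx.hash, blockNumber: n, from: tx.from, to: tx.to, value: tx.value.toString() });
    }
    await db.setCursor("ethereum", n); // advance only after the block is fully persisted
  }
}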

ARCHITECTURE

Step 2: Designing the Compliance Data Model

A robust data model is the foundation of any effective reporting system. This step defines the core entities, relationships, and attributes needed to capture, store, and query cross-border transaction data for regulatory compliance.

The primary goal is to structure raw on-chain and off-chain data into a queryable format that maps directly to regulatory requirements like the Travel Rule (FATF Recommendation 16) or MiCA transaction reporting. Your model must capture the full transaction lifecycle, not just final states. Key entities include Transaction, Wallet (VASP or user), Jurisdiction, and ComplianceCheck. Each Transaction record should link to originator and beneficiary Wallet details, which in turn reference their associated Jurisdiction and Virtual Asset Service Provider (VASP).

Critical data fields for a Transaction entity extend beyond basic amount and asset type. You must record: transaction_hash, originator_address, beneficiary_address, asset_type (e.g., ERC-20, BTC), amount, timestamp, and the protocol or bridge used. For compliance, essential metadata includes originator_name (if available via TRISA or other protocol), beneficiary_name, and the jurisdictional_rule_applied. Structuring this with clear foreign keys enables efficient joins for generating reports.

Incorporate status flags and audit trails directly into the model. Add fields like sanctions_screening_result (PASS, HIT, PENDING), risk_score (a calculated integer), and travel_rule_status (NOT_REQUIRED, PENDING, COMPLIED). An AuditLog table linked to the Transaction can track every state change and screening event, which is crucial for audits. This design allows you to filter transactions by risk_score > 70 AND sanctions_screening_result = 'PENDING' to prioritize reviews.

For practical implementation, consider using a schema-first approach with tools like Prisma or Django ORM. Below is a simplified Prisma schema snippet defining core relationships:

prisma
model Transaction {
  id        String   @id @default(cuid())
  hash      String   @unique
  amount    String  // Use String for precision with crypto decimals
  asset     String
  timestamp DateTime
  // Relations
  originator Wallet @relation("OriginatorTransactions", fields: [originatorAddress], references: [address])
  originatorAddress String
  beneficiary Wallet @relation("BeneficiaryTransactions", fields: [beneficiaryAddress], references: [address])
  beneficiaryAddress String
  complianceCheck ComplianceCheck[]
}

model Wallet {
  address     String   @id
  isVasp      Boolean  @default(false)
  vaspId      String?  // Linked to a registered VASP directory
  jurisdiction Jurisdiction @relation(fields: [jurisdictionCode], references: [code])
  jurisdictionCode String
  transactionsAsOriginator Transaction[] @relation("OriginatorTransactions")
  transactionsAsBeneficiary Transaction[] @relation("BeneficiaryTransactions")
}

// Minimal stubs for the referenced models so the schema validates
model Jurisdiction {
  code    String   @id // e.g., ISO 3166-1 alpha-2 country code
  name    String
  wallets Wallet[]
}

model ComplianceCheck {
  id                       String      @id @default(cuid())
  transaction              Transaction @relation(fields: [transactionId], references: [id])
  transactionId            String
  sanctionsScreeningResult String      // PASS, HIT, PENDING
  riskScore                Int
  travelRuleStatus         String      // NOT_REQUIRED, PENDING, COMPLIED
  createdAt                DateTime    @default(now())
}

Finally, plan for extensibility. Regulatory frameworks evolve, and new data points like DeFi protocol interaction details or NFT collection identifiers may become reportable. Use nullable fields for emerging requirements and consider a JSONB or similar column for unstructured but queryable supplemental data. This model, when populated via the ingestion pipelines built in Step 1, becomes the single source of truth for all subsequent reporting, analytics, and alerting modules in your dashboard.

DATA TRANSFORMATION

Mapping Data to Regulatory Report Fields

How to structure on-chain and off-chain data to meet common regulatory report schemas.

Each entry below lists the report field, its on-chain source, its off-chain source, and the transformation required to populate it; a mapping sketch in code follows the list.

  • Transaction Value (USD): On-chain source: amount in the transaction log. Off-chain source: fiat payment gateway API. Transformation: apply a real-time oracle price feed for conversion.
  • Counterparty Address: On-chain source: the to/from fields. Off-chain source: KYC provider customer ID. Transformation: map the customer ID to the on-chain withdrawal address via an internal database.
  • Jurisdiction: On-chain source: KYC proof anchored on IPFS (if available). Off-chain source: user-provided data during onboarding. Transformation: validate against blockchain analytics (e.g., Chainalysis) for high-risk wallets.
  • Asset Type / Token: On-chain source: ERC-20 contract address or native chain asset. Off-chain source: custodian settlement ledger. Transformation: map internal asset codes to public token identifiers (e.g., CoinGecko ID).
  • Transaction Purpose Code: Source: user-declared purpose field in the UI. Transformation: enforce a standardized code list (e.g., ISO 20022 pacs.008).
  • Beneficiary/Owner Name: On-chain source: Decentralized Identity (DID) Verifiable Credential. Off-chain source: traditional KYC record (name, DOB). Transformation: resolve the DID to a legal name and maintain an audit trail of the resolution.
  • Timestamp: On-chain source: block timestamp. Off-chain source: fiat settlement confirmation time. Transformation: synchronize the off-chain event to the nearest block for immutable proof.
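
One way to express these mappings in code is a per-field transformer that takes the matched on-chain and off-chain records and emits a report row; the field names and the toUsd and resolveDid helpers below are assumptions:

javascript
// Build one regulatory report row from an enriched on-chain record and its matched off-chain record.
// toUsd and resolveDid are placeholders for your oracle-conversion and DID-resolution helpers (assumptions).
async function toReportRow(onchain, offchain, { toUsd, resolveDid }) {
  return {
    transactionValueUsd: await toUsd(onchain.asset, onchain.amount, onchain.timestamp),
    counterpartyAddress: onchain.to,
    jurisdiction: offchain.declaredJurisdiction,            // validated upstream against analytics risk data
    assetType: onchain.tokenAddress ?? onchain.nativeAsset, // ERC-20 contract address or native asset symbol
    purposeCode: offchain.purposeCode,                      // from the standardized code list (e.g., ISO 20022)
    beneficiaryName: onchain.beneficiaryDid ? await resolveDid(onchain.beneficiaryDid) : offchain.kycName,
    timestamp: onchain.blockTimestamp,                      // block time, reconciled with fiat settlement time
  };
}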

IMPLEMENTATION

Step 3: Automating Report Generation and Submission

This guide details how to build an automated pipeline for generating and submitting regulatory reports on cross-chain transactions, moving from manual analysis to a production-ready system.

The core of automation is a scheduled job that queries your enriched transaction database. Using a scheduler like cron or a task queue such as Celery, you can trigger a script daily or weekly. This script executes SQL queries or uses an ORM to filter transactions by jurisdiction, threshold amount (e.g., the FATF Travel Rule's USD/EUR 1,000 threshold, or the USD 3,000 threshold under the US FinCEN rule), and asset type. The output is structured into report-ready data objects, typically in JSON or CSV format, containing fields like transaction hash, origin/destination addresses, amounts, asset identifiers, and the involved VASP's LEI or similar identifier.
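
A minimal sketch of such a job using node-cron; the db.findReportableTransactions query and the threshold value are assumptions to adapt to your own schema and jurisdiction:

javascript
import cron from "node-cron";
import { writeFile } from "node:fs/promises";
import { db } from "./db.js"; // your data-access layer (hypothetical module)

// Run daily at 02:00 UTC; findReportableTransactions is a placeholder for your ORM/SQL query (assumption).
cron.schedule("0 2 * * *", async () => {
  const txs = await db.findReportableTransactions({ minValueEur: 1000, since: "24h" });
  const report = txs.map((t) => ({
    transactionHash: t.hash,
    originator: t.originatorAddress,
    beneficiary: t.beneficiaryAddress,
    asset: t.asset,
    amount: t.amount,
    vaspLei: t.counterpartyVaspLei,
  }));
  // Write report-ready objects to disk; the reports/ directory is assumed to exist.
  await writeFile(`reports/${new Date().toISOString().slice(0, 10)}.json`, JSON.stringify(report, null, 2));
}, { timezone: "UTC" });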

For submission, you must integrate with regulatory APIs or secure filing portals. Many jurisdictions provide specific endpoints; for example, the Travel Rule protocol TRISA uses gRPC APIs for secure VASP-to-VASP communication. Your automation script should include a client module that handles authentication (often with mTLS certificates), formats the payload to the required schema (like the IVMS 101 data model), and submits it. Implement robust error handling with retry logic and alerting (e.g., via PagerDuty or Slack webhooks) for failed submissions to ensure compliance deadlines are met.
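
A hedged sketch of a submission client with exponential-backoff retries and an alerting hook; the endpoint, bearer-token auth, and notifyOps webhook are assumptions, and mTLS certificate handling is omitted for brevity:

javascript
// Submit an IVMS 101-style payload with retries; endpoint, auth, and notifyOps are placeholders (assumptions).
async function submitReport(payload, { endpoint, authToken, notifyOps, maxAttempts = 5 }) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      const res = await fetch(endpoint, {
        method: "POST",
        headers: { "Content-Type": "application/json", Authorization: `Bearer ${authToken}` },
        body: JSON.stringify(payload),
      });
      if (res.ok) return await res.json(); // keep the acknowledgement as proof of submission
      throw new Error(`Submission failed with status ${res.status}`);
    } catch (err) {
      if (attempt === maxAttempts) {
        await notifyOps(`Report submission failed after ${maxAttempts} attempts: ${err.message}`);
        throw err;
      }
      await new Promise((r) => setTimeout(r, 2 ** attempt * 1000)); // exponential backoff before retrying
    }
  }
}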

Code maintainability and auditability are critical. Structure your project with clear separation of concerns: a data_fetcher module, a report_generator that applies business logic, and a submission_client for API interactions. Use environment variables for sensitive data like API keys and certificate paths. Log every step—query execution, report generation, and submission attempts—with unique correlation IDs for traceability. This creates an immutable audit trail for regulators. Open-source libraries like web3.py for blockchain queries and requests for HTTP calls are foundational tools for this pipeline.

Finally, before full automation, run the system in a dry-run mode for several cycles. This mode executes all steps except the final API submission, instead writing the would-be payloads to a log or test directory. Review these outputs to verify accuracy against manual reports. Conduct integration tests with sandbox environments provided by regulators or protocols like TRISA's testnet. Only after consistent validation should you enable live submissions. This phased approach minimizes risk and ensures your automated system is reliable and compliant from day one.

DATA INTEGRITY

Step 4: Implementing an Immutable Audit Trail

This step focuses on creating a permanent, tamper-proof record of all cross-border transaction data, which is essential for regulatory compliance and dispute resolution.

An immutable audit trail is a chronological, append-only ledger of all transaction events, metadata, and state changes. For a regulatory dashboard, this means logging every action—from transaction submission and AML screening results to final settlement confirmations and any manual overrides by compliance officers. The core principle is non-repudiation: once an event is recorded, it cannot be altered or deleted, providing a single source of truth for auditors. This is distinct from merely storing raw transaction data; it captures the process and decisions around each transaction.

To implement this, you need to define a structured audit event schema. Each event should include a unique event ID, a timestamp (preferably in UTC), the event type (e.g., TX_SUBMITTED, AML_CHECK_PASSED, SETTLEMENT_CONFIRMED), the acting entity (user ID or system component), the affected transaction ID, and a detailed payload in JSON format. This payload contains the specific data relevant to the event, such as the screening score, the rule that was triggered, or the old and new statuses of the transaction. Using a consistent schema is critical for querying and reconstructing timelines.
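
A concrete example of a single audit event following that schema (all values are illustrative):

javascript
// One structured audit event (illustrative values).
const auditEvent = {
  eventId: "evt_01HTX4D7Q9",         // unique, sortable event ID
  timestamp: "2024-03-14T09:21:07Z", // UTC
  eventType: "AML_CHECK_PASSED",     // e.g., TX_SUBMITTED, AML_CHECK_PASSED, SETTLEMENT_CONFIRMED
  actor: "screening-service",        // user ID or system component that produced the event
  transactionId: "tx_8f3a2c",
  payload: {
    screeningScore: 12,
    ruleTriggered: null,
    previousStatus: "PENDING_SCREENING",
    newStatus: "CLEARED",
  },
};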

The most robust method for immutability is to anchor these audit logs to a public blockchain like Ethereum or a purpose-built ledger like Hedera Hashgraph. After batching events, you generate a cryptographic hash (e.g., SHA-256) of the batch and publish that hash as a transaction on-chain. This creates a cryptographic proof that the logs existed at that point in time and have not been modified since. OpenZeppelin Defender's relayer and automation features can help schedule and submit these anchoring transactions. For internal systems, you can use a write-once-read-many (WORM) storage system or a database with immutable tables, though these offer weaker guarantees than public cryptographic verification.
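
A minimal anchoring sketch using Node's crypto module and ethers.js: hash the serialized batch, then publish the digest in the data field of a zero-value transaction. The signer key, RPC endpoint, and self-transfer pattern are assumptions; a dedicated anchoring contract works equally well:

javascript
import { createHash } from "node:crypto";
import { ethers } from "ethers";

// Hash a batch of audit events and anchor the digest on-chain (sketch; key management is simplified).
async function anchorBatch(events) {
  const digest = createHash("sha256").update(JSON.stringify(events)).digest("hex");

  const provider = new ethers.JsonRpcProvider(process.env.RPC_URL);
  const signer = new ethers.Wallet(process.env.ANCHOR_PRIVATE_KEY, provider); // assumption: dedicated anchoring key
  // A zero-value self-transfer whose data field carries the batch digest; store the tx hash with the batch.
  const tx = await signer.sendTransaction({ to: signer.address, value: 0, data: "0x" + digest });
  await tx.wait();
  return { digest, anchorTxHash: tx.hash };
}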

Your dashboard's backend must expose APIs for both writing audit events and querying them. The write API should be strictly internal, accessible only by your core transaction processing services. The query API powers the audit trail viewer in the dashboard UI. It should allow filtering by transaction ID, date range, event type, and user. For performance with large datasets, consider indexing the transaction_id and timestamp fields in your database. Always return events in chronological order to facilitate timeline reconstruction.

Finally, the frontend component should present this data clearly. A common pattern is a vertical timeline view for a single transaction, showing each event as a node with its timestamp, type, and key details from the payload. For broader audits, a filterable table listing all events is essential. Crucially, for any event anchored on-chain, the UI should display the transaction hash and a link to a block explorer (e.g., Etherscan) so regulators can independently verify the proof of existence. This transparent verifiability transforms your audit log from an internal record into a trusted regulatory artifact.

DEVELOPER FAQ

Frequently Asked Questions

Common technical questions and solutions for building a blockchain-based regulatory reporting dashboard for cross-border transactions.

What data sources does the dashboard need to aggregate?

A robust dashboard must aggregate data from multiple on-chain and off-chain sources. Key sources include:

  • On-Chain Data: Transaction logs, wallet addresses, and smart contract events from relevant blockchains (e.g., Ethereum, Polygon, Arbitrum). Use indexers like The Graph or Covalent for efficient querying.
  • Off-Chain Data: Know Your Customer (KYC) verification status from providers like Synaps or Fractal, and fiat transaction records from payment processors.
  • Oracles: Real-world data feeds for exchange rates and counterparty information via services like Chainlink.
  • Regulatory Lists: Sanctions lists (OFAC) and Politically Exposed Persons (PEP) lists, which require regular updates via APIs.

The primary challenge is normalizing this heterogeneous data into a unified schema for analysis and reporting.

IMPLEMENTATION SUMMARY

Conclusion and Next Steps

You have now built a foundational dashboard for monitoring cross-border crypto transactions against regulatory requirements. This guide covered the core architecture, data sourcing, and visualization logic.

The dashboard you've constructed provides a real-time view of transaction flows, applying rule-based logic to flag potential issues like sanctions list matches, large transfers exceeding jurisdictional thresholds, or transactions involving high-risk VASPs. By combining on-chain data enriched through analytics APIs like Chainalysis or TRM Labs with off-chain KYC/AML databases, you create a single pane of glass for compliance teams. The key technical achievement is the automated alerting engine, which triggers notifications when a transaction's risk score, calculated from its attributes and counterparties, breaches a configurable limit.

To move from a proof-of-concept to a production system, several critical next steps are required. First, enhance data resilience by implementing redundant data pipelines and archival solutions for audit trails. Second, integrate with official regulatory reporting channels, such as the Travel Rule solution for your jurisdiction (e.g., using the IVMS 101 data standard) or direct APIs to financial intelligence units. Third, conduct a security audit of the entire stack, focusing on the privacy of the sensitive Personally Identifiable Information (PII) and transaction data you are processing.

For ongoing development, consider expanding the dashboard's capabilities. Implement machine learning models to detect anomalous transaction patterns that rule-based systems miss. Add support for emerging regulations like the EU's Markets in Crypto-Assets (MiCA) framework by creating new compliance modules. Explore decentralized identity protocols (e.g., Verifiable Credentials) to streamline and cryptographically verify counterparty KYC data, reducing reliance on centralized vendors.

The regulatory landscape for cross-border crypto transactions is evolving rapidly. Maintaining this dashboard requires a commitment to continuous monitoring of regulatory changes from bodies like the Financial Action Task Force (FATF), FinCEN, and local authorities. Subscribe to updates from the Blockchain Association or Global Digital Finance for industry insights. Your dashboard is not a set-and-forget tool; it is a living system that must adapt as laws and typologies change.

Finally, document your architecture, data sources, and rule-sets thoroughly. This documentation is crucial for internal training and for demonstrating your compliance program's robustness to external auditors or regulators. The code and principles from this guide—emphasizing data aggregation, rule-based analysis, and actionable visualization—provide a scalable foundation for building more sophisticated financial surveillance tools in the Web3 ecosystem.