
How to Architect a Regulator-Facing Oracle Data Feed

A step-by-step technical guide for developers building secure, real-time data feeds that stream verified on-chain and off-chain data to regulatory bodies for compliance reporting.
INTRODUCTION

This guide outlines the technical and architectural considerations for building a blockchain oracle data feed designed for regulatory scrutiny and compliance.

A regulator-facing oracle is a specialized data feed that provides verifiable, auditable, and tamper-resistant information to a blockchain for compliance purposes. Unlike standard price oracles, its primary function is to attest to real-world regulatory states, such as a licensed entity's good standing, a jurisdiction's legal status, or a user's verified credentials. The architecture must prioritize data integrity, audit trails, and transparent attestation to satisfy external auditors and legal frameworks. This requires a fundamental shift from optimizing for speed and cost to ensuring verifiability and legal defensibility at every layer.

The core challenge is bridging the trust gap between off-chain legal reality and on-chain smart contract logic. A regulator cannot audit a smart contract's internal state alone; they need proof that the data it acts upon is correct and sourced legitimately. Your architecture must therefore produce a cryptographic proof of provenance for every data point. This involves designing a system where each attestation is signed by a known, accountable entity (or a decentralized set of them), and where the entire history of data submissions and updates is immutably recorded, either on-chain or in a verifiable data structure like a Merkle tree.

Key architectural components include a secure off-chain data sourcing layer with legal agreements, a transparent attestation and signing mechanism, and an on-chain verification and dispute resolution system. For example, data about a business license might be sourced directly from a government API, cryptographically signed by the oracle node operator, and published on-chain with a timestamp. The smart contract consuming this data can verify the signature against a known public key registry. More advanced designs might use zero-knowledge proofs to attest to data validity without revealing the raw data, or decentralized identifier (DID) standards for managing verifiable credentials.

When implementing the on-chain component, smart contract design is critical. Functions should expose not just the current data state but also metadata like the attestor's identity, timestamp of last update, and a pointer to the proof (e.g., an IPFS hash of the signed payload). Consider implementing a time-lock or challenge period for new data, allowing a window for competing oracle nodes or auditors to dispute an attestation before it becomes final. This creates a cryptoeconomic security layer, similar to Optimistic Rollups, where correctness is assumed but can be penalized if proven false.

Finally, operational transparency is non-negotiable. Your system should generate automatic audit reports that map on-chain transaction hashes to off-chain data sources and signing events. Tools like The Graph for indexing or Etherscan's contract verification for source code transparency become part of the compliance toolkit. The goal is to enable a third-party regulator to independently verify, without your assistance, that the on-chain state accurately reflects the authorized off-chain world. This architectural rigor transforms the oracle from a simple data pipe into a credible legal bridge.

PREREQUISITES

Building a data feed for financial regulators requires a fundamentally different architecture than a standard DeFi oracle. This guide covers the core technical and conceptual prerequisites.

A regulator-facing oracle must prioritize data integrity, auditability, and legal compliance over raw speed and low cost. The primary goal is to provide a tamper-evident and immutable record of market data that regulators can independently verify. This shifts the architectural focus from a single on-chain price to a complete data pipeline with cryptographic proofs. Key prerequisites include understanding the data source attestation model used by oracles like Chainlink, where data providers cryptographically sign the data they supply, creating a verifiable chain of custody from source to blockchain.

You must architect for multi-source aggregation with clear provenance. Regulators require transparency into which sources were used, when the data was fetched, and how discrepancies were resolved. This often involves implementing a commit-reveal scheme or using a threshold signature scheme (TSS) among multiple, independent node operators. The system must log metadata such as API endpoints, fetch timestamps, and aggregation logic on-chain or in an immutable ledger. Tools like Chainlink's Off-Chain Reporting (OCR) protocol demonstrate how decentralized nodes can reach consensus on data off-chain before submitting a single, aggregated transaction.
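As a concrete sketch of the provenance metadata this implies, the following TypeScript types capture what a node might log for each aggregation round before anything is committed on-chain; all names are illustrative rather than a standard.

```typescript
// Provenance record kept for every aggregation round; archived off-chain
// and referenced by hash on-chain. All field names are illustrative.
interface SourceObservation {
  sourceId: string;      // e.g. "ecb-fx-api"
  endpoint: string;      // exact URL queried
  publishedAt: number;   // timestamp attached by the source (unix seconds)
  fetchedAt: number;     // when this node observed the value
  value: number;
}

interface AggregationRound {
  feedId: string;
  roundId: number;
  aggregationMethod: "median" | "trimmed-mean";
  aggregatedValue: number;
  observations: SourceObservation[]; // full audit trail of inputs
}

// Assemble the record that is archived and later hashed into the on-chain submission.
function buildRoundRecord(
  feedId: string,
  roundId: number,
  observations: SourceObservation[],
  aggregatedValue: number
): AggregationRound {
  return { feedId, roundId, aggregationMethod: "median", aggregatedValue, observations };
}
```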

The technical stack requires robust identity and attestation mechanisms. Each data provider and oracle node must have a verifiable on-chain identity, often through a decentralized identifier (DID) or a public key infrastructure (PKI). Data submissions should be accompanied by cryptographic attestations, such as digital signatures, which bind the data to a specific provider and timestamp. This creates an audit trail that is resistant to repudiation. Zero-knowledge proofs (ZKPs) are a more advanced but valuable addition, as they allow you to prove data correctness without revealing the raw source data, balancing transparency with privacy.

Finally, you need a clear legal and operational framework. This includes Service Level Agreements (SLAs) with data providers, defined dispute resolution processes, and data licensing that permits regulatory use. The architecture must support regulatory queries, allowing authorized parties to efficiently retrieve and verify historical data points and their proofs. This often necessitates a complementary off-chain data availability layer (like IPFS or a blockchain with cheap storage) to store the full dataset and attestations cost-effectively, while keeping critical hashes and signatures on the main chain.

REGULATOR-FACING ORACLE

Core Architectural Components

Building a data feed for financial regulators requires a distinct architecture focused on auditability, data provenance, and tamper-evident reporting. These components form the foundation.

01. On-Chain Attestation Layer

The immutable core where data commitments are anchored. This involves publishing cryptographic proofs (like Merkle roots) of the source data onto a public blockchain (e.g., Ethereum, Base). This creates a tamper-evident audit trail that regulators can independently verify. Key considerations include the following, with a minimal batching sketch after the list:

  • Data Batching: Aggregating reports to optimize gas costs.
  • Timestamping: Using the block timestamp as a consensus-backed proof of time.
  • Signature Schemes: Employing BLS signatures for efficient multi-signer aggregation.
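A minimal TypeScript sketch of the batching step, assuming each report has already been serialized deterministically; the resulting root (together with a batch identifier and timestamp) is what would be anchored on-chain. Function names are illustrative.

```typescript
import { keccak256, concat, toUtf8Bytes } from "ethers";

// Hash each serialized report into a leaf, then fold pairs until one root remains.
function merkleRoot(serializedReports: string[]): string {
  if (serializedReports.length === 0) throw new Error("empty batch");
  let layer = serializedReports.map((r) => keccak256(toUtf8Bytes(r)));
  while (layer.length > 1) {
    const next: string[] = [];
    for (let i = 0; i < layer.length; i += 2) {
      const left = layer[i];
      const right = layer[i + 1] ?? left; // duplicate the last leaf on odd-sized layers
      next.push(keccak256(concat([left, right])));
    }
    layer = next;
  }
  return layer[0]; // 32-byte hex string to publish on-chain
}
```

Individual reports can later be checked against this root with a standard Merkle inclusion proof, which is what gives regulators per-record verifiability without storing every report on-chain.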
02. Off-Chain Data Pipeline

The secure system for collecting, processing, and preparing raw data for attestation. This is where data provenance is established. The pipeline must:

  • Ingest from Authoritative Sources: Connect directly to regulated entities' APIs or signed data feeds.
  • Apply Business Logic: Transform raw data into the standardized report format required by the regulator.
  • Generate Attestation Package: Create the structured data and cryptographic proofs for the on-chain layer.
  • Ensure High Availability: Use redundant, geographically distributed nodes to meet SLA requirements.
03. Regulator Verification Portal

A dedicated, user-friendly interface for regulators to monitor and audit the data feed. This component bridges the on-chain proofs with human-readable reports. Essential features include:

  • Proof Verification Widget: Allows a regulator to paste a transaction hash or Merkle root and see the corresponding raw data, verifying its integrity.
  • Historical Data Explorer: Tools to query and visualize the complete history of submitted reports.
  • Alerting & Monitoring: Dashboards showing data freshness, node health, and any submission failures.
  • Compliance Documentation: Clear documentation of the attestation methodology and data schema.
04. Decentralized Validator Network

A set of independent, permissioned nodes responsible for signing the attestations. This network provides Byzantine Fault Tolerance and removes single points of failure. Architecture involves:

  • Permissioned Node Set: Nodes operated by pre-vetted, reputable entities (e.g., audit firms, infrastructure providers).
  • Multi-Signature Schemes: Requiring a threshold (e.g., 5-of-7) of signatures to finalize an attestation, preventing any single node from manipulating data.
  • Slashing Conditions: Economic penalties for malicious or unreliable behavior, enforceable via smart contracts.
  • Key Management: Secure, often hardware-based, key management for signing operations.
05. Data Schema & Compliance Logic

The formal specification that defines the structure, semantics, and validation rules for all reported data. This is the legal and technical contract of the feed. It includes the following, with a short validation sketch after the list:

  • Standardized Format: A strict schema (e.g., JSON Schema, Protocol Buffers) for all data points.
  • Validation Rules: Programmatic checks for data consistency, range limits, and business logic (e.g., sum of balances equals total assets).
  • Versioning & Upgrades: A clear governance process for updating the schema without breaking historical auditability.
  • Immutable References: The schema hash should be referenced in on-chain attestations, linking data to its governing rules.
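A short sketch, using JSON Schema with the Ajv validator, of how the validation rules and the on-chain schema reference fit together; the schema itself is invented for illustration.

```typescript
import Ajv from "ajv";
import { keccak256, toUtf8Bytes } from "ethers";

// Illustrative schema for one reported data point.
const reportSchema = {
  type: "object",
  required: ["feedId", "value", "timestamp"],
  additionalProperties: false,
  properties: {
    feedId: { type: "string" },
    value: { type: "number", minimum: 0 },
    timestamp: { type: "integer" },
  },
};

const ajv = new Ajv();
const validateReport = ajv.compile(reportSchema);

// Hash of the schema text; referencing this in each attestation binds the data to
// the exact rules it was validated against. Canonicalize the JSON before hashing
// in production so formatting changes do not alter the hash.
export const schemaHash = keccak256(toUtf8Bytes(JSON.stringify(reportSchema)));

export function checkReport(report: unknown): boolean {
  const ok = validateReport(report);
  if (!ok) console.error(validateReport.errors); // surface the failed rules
  return ok;
}
```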
06. Audit & Monitoring Subsystem

Continuous oversight mechanisms that ensure the system operates as designed. This provides operational transparency for both the operator and the regulator. Components are:

  • Node Performance Metrics: Public dashboards showing uptime, latency, and signing participation for each validator.
  • Attestation Discrepancy Alerts: Automated systems to detect and flag differences in data prepared by different nodes before signing.
  • Incident Response Logs: A transparent, immutable record of any operational incidents and remedial actions taken.
  • Third-Party Audit Feeds: The ability for external auditors to run their own light clients or observers to verify the system's output independently.
ARCHITECTURE

Step 1: Design the Data Sourcing Layer

The foundation of a regulator-facing oracle is its data sourcing layer. This step defines how raw, authoritative data is collected, validated, and prepared for on-chain consumption.

A regulator-facing oracle feed must source data from authoritative primary sources. These are the official, legally recognized entities that publish the data a regulation depends on. Examples include the U.S. Bureau of Labor Statistics for CPI data, the U.K. Financial Conduct Authority for approved price benchmarks, or a national land registry's API for real estate titles. The sourcing layer's first job is to establish secure, programmatic connections to these sources, typically via HTTPS APIs or dedicated data feeds, ensuring the data's provenance is indisputable.

Data must be cryptographically signed at the source whenever possible. For the highest assurance, work with data providers to obtain signatures on their published data using their official keys (e.g., via the OpenAttestation framework). If direct signing isn't available, you must implement a robust system of attestations. This involves running multiple independent sentry nodes that fetch the same data, compare results for consensus, and then cryptographically attest to the data's accuracy and timestamp before it progresses to the aggregation layer.
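A minimal TypeScript sketch of that cross-check, with invented endpoints and field names; signing the agreed observation is deferred to the attestation layer covered in Step 2.

```typescript
// Sentry nodes fetch the same fact from redundant endpoints and compare results
// before anything is attested. Endpoints and response fields are placeholders.
const SOURCES = [
  "https://api.example-registry.gov/v1/licenses/12345",
  "https://mirror.example.org/v1/licenses/12345",
];

interface Observation {
  value: string;       // e.g. "ACTIVE"
  sourceUrl: string;
  observedAt: number;  // unix seconds from the sentry's NTP-synced clock
}

async function observe(url: string): Promise<Observation> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`${url} returned HTTP ${res.status}`);
  const body = await res.json();
  return { value: body.status, sourceUrl: url, observedAt: Math.floor(Date.now() / 1000) };
}

export async function crossCheck(): Promise<Observation[]> {
  const observations = await Promise.all(SOURCES.map(observe));
  const agreed = observations.every((o) => o.value === observations[0].value);
  if (!agreed) throw new Error("sources disagree; hold attestation and alert operators");
  return observations; // passed downstream to the attestation layer (Step 2)
}
```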

The architecture must handle source failure gracefully. Design for redundancy by identifying multiple primary or equivalent secondary sources for the same data point. For example, if a regulatory rule references "the EUR/USD exchange rate," you might source it from both the European Central Bank and a designated benchmark administrator like ICE Benchmark Administration. The sourcing logic should include health checks, fallback mechanisms, and clear alerting for any source degradation, ensuring continuous data availability.

Every data point requires a verifiable timestamp proving when it was published by the source. This is critical for regulations with time-sensitive triggers. Your sourcing nodes should capture the data alongside the source's publication timestamp and the precise time it was observed (using a synchronized clock like NTP). This metadata forms an immutable audit trail, allowing regulators to verify that an on-chain transaction correctly used the data that was officially valid at that moment in time.

Finally, the sourced raw data must be normalized into a standardized schema for processing. Different sources format data differently—one API might deliver a JSON object while another provides an XML feed. The sourcing layer should parse, validate (checking for schema adherence and outliers), and transform all incoming data into a consistent internal format. This prepares clean, structured data packets for the next stage: aggregation and consensus.
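A short sketch of this normalization step, mapping two invented source shapes onto one internal format; an XML feed would simply need a parser in front of the same mapping.

```typescript
// Internal format produced by the sourcing layer, regardless of source shape.
interface NormalizedPoint {
  feedId: string;
  value: number;
  sourceId: string;
  publishedAt: number; // unix seconds, as stated by the source
  observedAt: number;  // unix seconds, when our node fetched it
}

// Source A delivers: { "rate": "5.33", "as_of": "2025-01-01T00:00:00Z" }
function fromSourceA(raw: any, observedAt: number): NormalizedPoint {
  const point = {
    feedId: "FED_FUNDS_RATE",
    value: Number(raw.rate),
    sourceId: "source-a",
    publishedAt: Math.floor(Date.parse(raw.as_of) / 1000),
    observedAt,
  };
  if (!Number.isFinite(point.value)) throw new Error("source-a value failed validation");
  return point;
}

// Source B delivers: { "data": { "value": 5.33, "ts": 1735689600 } }
function fromSourceB(raw: any, observedAt: number): NormalizedPoint {
  return {
    feedId: "FED_FUNDS_RATE",
    value: raw.data.value,
    sourceId: "source-b",
    publishedAt: raw.data.ts,
    observedAt,
  };
}
```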

ARCHITECTURE

Step 2: Implement the Attestation Mechanism

This step details how to design and code the core attestation logic that cryptographically signs and timestamps data for regulatory consumption.

The attestation mechanism is the cryptographic heart of your oracle. Its primary function is to produce a verifiable proof that a specific data point was observed and signed by an authorized entity at a precise moment. This proof, often called an attestation or signed message, must be immutable and publicly verifiable. For regulators, this moves data reporting from a claim to evidence. Common implementations use elliptic curve digital signatures (like those from the secp256k1 or Ed25519 curves) because they are standard, efficient, and widely supported for verification on-chain and off-chain.

Architecturally, the attestation service should be a separate, secure module. It receives validated data from your aggregation logic and outputs a structured payload containing the data, a timestamp, a unique identifier (like a nonce or request ID), and the signature. A typical payload schema in JSON format might look like:

```json
{
  "feedId": "REG_FED_FUNDS_RATE",
  "value": 5.33,
  "timestamp": 1735689600,
  "roundId": 12587,
  "signature": "0x1234abcd..."
}
```

The signature is generated by signing a deterministic hash (e.g., keccak256) of the other fields. This ensures any alteration invalidates the proof.
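A minimal TypeScript sketch of that hashing-and-signing step using ethers, with an assumed field encoding (the value scaled to a fixed-point integer) and a local key used only to keep the example self-contained; in production the signing call is delegated to an HSM or KMS as described below.

```typescript
import { Wallet, keccak256, AbiCoder } from "ethers";

const payload = {
  feedId: "REG_FED_FUNDS_RATE",
  value: 5.33,
  timestamp: 1735689600,
  roundId: 12587,
};

// Deterministic hash over the payload fields; the encoding convention here
// (ABI encoding, value scaled to 2 decimals) is an assumption and must match
// whatever the on-chain verifier reconstructs.
const dataHash = keccak256(
  AbiCoder.defaultAbiCoder().encode(
    ["string", "uint256", "uint256", "uint256"],
    [payload.feedId, Math.round(payload.value * 100), payload.timestamp, payload.roundId]
  )
);

// Sign the raw 32-byte digest so it can be checked with ECDSA.recover on-chain.
const signer = new Wallet(process.env.ORACLE_SIGNER_KEY as string); // placeholder key source
const signature = signer.signingKey.sign(dataHash).serialized;

console.log({ ...payload, signature });
```

Signing the raw digest keeps the signature directly compatible with an on-chain ECDSA.recover over the same hash; if you instead use prefixed signing (wallet.signMessage), the verifier must apply the EIP-191 prefix, for example via OpenZeppelin's MessageHashUtils.toEthSignedMessageHash, before recovering.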

Security of the signing key is paramount. The private key must never be exposed to the public internet or the main application server. Best practices include using a hardware security module (HSM), a cloud-based key management service (like AWS KMS or GCP Cloud KMS), or a dedicated, air-gapped signing server. Your code should call a secure API or use a client library provided by these services to request signatures, never handling the raw private key directly. This isolates the critical secret and provides audit logs for every signing operation.

For on-chain verification, you must deploy a corresponding verification contract. This smart contract holds the public key or address of the authorized signer. When a consumer contract receives an attestation, it calls the verifier to check the signature against the signed data hash. A Solidity snippet for a basic verifier might use OpenZeppelin's ECDSA library:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

import "@openzeppelin/contracts/utils/cryptography/ECDSA.sol";

contract DataVerifier {
    using ECDSA for bytes32;

    // Address of the authorized attestation signer (from the public key registry).
    address public immutable trustedSigner;

    constructor(address signer) {
        trustedSigner = signer;
    }

    // Returns true if `signature` over `dataHash` was produced by the trusted signer.
    function verifyAttestation(
        bytes32 dataHash,
        bytes memory signature
    ) public view returns (bool) {
        return dataHash.recover(signature) == trustedSigner;
    }
}
```

This provides a trustless, automated check that the data is authentic.

Finally, consider attestation lifecycle management. Implement key rotation policies without service disruption, which may involve a multi-signature scheme or a verifiable key update process. Also, design for auditability: every attestation should be logged with metadata (e.g., data source IDs, operator ID, signing key version) and potentially published to an immutable storage layer like IPFS or a blockchain event log. This creates a permanent, tamper-evident record for regulators to inspect, completing the chain of custody from source data to signed proof.

DATA LIFECYCLE

Step 3: Choose Storage and Publishing Models

Selecting how to store and publish data is critical for building a regulator-facing oracle feed that is both trustworthy and compliant.

The storage model defines where and how your oracle's raw and processed data is persisted before publication. For a regulator-facing feed, you must prioritize immutability and provenance. On-chain storage, such as using a data availability layer like Celestia or EigenDA, provides a permanent, tamper-proof record. This creates an immutable audit trail, a key requirement for regulatory scrutiny. Alternatively, you can use decentralized storage solutions like Arweave or Filecoin for cost-effective, long-term archiving of large datasets, with only the essential data hashes or Merkle roots published on-chain for verification.

The publishing model determines how data is transmitted from the oracle's off-chain infrastructure to the blockchain. The two primary models are push and pull. A push model, where the oracle node proactively submits data on-chain at regular intervals (e.g., every block or epoch), is common for high-frequency price feeds. A pull model, where data is only published when explicitly requested by a smart contract, can reduce on-chain costs for less time-sensitive data. For regulatory compliance, you must ensure the publishing mechanism is reliable and provides clear attribution, linking each data point to a specific oracle node and timestamp.

Your architecture must also define the data format and schema. Use standardized, self-describing formats like Protocol Buffers or Avro for serialization, as they enforce a strict schema. This ensures data consistency and makes historical data interpretable by auditors years later. The published on-chain payload should be minimal, often just a key (e.g., "BTC/USD") and a value. The full context (data sources, calculation methodology, and timestamps) should be stored off-chain in the immutable storage layer, with a content identifier (CID) or hash referenced on-chain.
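A small sketch of this split between a minimal on-chain record and the full off-chain context, with a content hash serving as the link (an IPFS CID could replace it after pinning); field names are illustrative.

```typescript
import { keccak256, toUtf8Bytes } from "ethers";

// Full report, archived off-chain (IPFS, Arweave, ...).
const fullReport = {
  feedId: "BTC/USD",
  value: 64250.12,
  sources: ["exchange-a", "exchange-b", "exchange-c"],
  methodology: "median of 3 sources over a 60s window",
  fetchedAt: 1735689600,
  schemaVersion: "1.2.0",
};

const reportHash = keccak256(toUtf8Bytes(JSON.stringify(fullReport)));

// Minimal payload actually published on-chain: key, value, and the pointer
// back to the full context above.
const onChainRecord = {
  feedId: fullReport.feedId,
  value: Math.round(fullReport.value * 1e8), // fixed-point integer
  reportHash,
};
```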

Consider implementing a multi-phase commit for critical updates. For example, a node first publishes an intent to update with a hash of the new data (announce phase). After a challenge period where other nodes or watchdogs can dispute, the actual data is revealed and finalized. This adds a layer of security and procedural transparency, mimicking formal review processes that regulators understand. Frameworks like Chainlink's Off-Chain Reporting (OCR) and Pythnet's pull oracle demonstrate sophisticated implementations of these principles.
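As an illustration of the announce/challenge/finalize flow, the in-process TypeScript sketch below models the phases; an on-chain version enforces the same transitions with contract storage and block timestamps, and the class and field names are invented for this example.

```typescript
import { keccak256, toUtf8Bytes } from "ethers";

type Phase = "announced" | "challenged" | "finalized";

class PendingUpdate {
  phase: Phase = "announced";

  constructor(
    public readonly commitment: string,      // hash of the new data, published first
    public readonly challengeEndsAt: number  // unix seconds when the window closes
  ) {}

  // Any watcher can dispute the announced update during the challenge window.
  challenge(reason: string): void {
    if (this.phase !== "announced") throw new Error("update is not open to challenge");
    this.phase = "challenged";
    console.warn(`update disputed: ${reason}`);
  }

  // After the window, the publisher reveals the data matching the commitment.
  finalize(revealedData: string, now: number): string {
    if (this.phase !== "announced") throw new Error("update was disputed");
    if (now < this.challengeEndsAt) throw new Error("challenge window still open");
    if (keccak256(toUtf8Bytes(revealedData)) !== this.commitment) {
      throw new Error("revealed data does not match the announced commitment");
    }
    this.phase = "finalized";
    return revealedData;
  }
}
```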

Finally, architect for selective disclosure. Regulators may require access to underlying source data or internal attestations without making them public. Your system should support generating zero-knowledge proofs (ZKPs) or using trusted execution environments (TEEs) to cryptographically prove data was processed correctly according to predefined rules, without exposing the raw inputs. This balances transparency with necessary confidentiality for sensitive commercial data sources.

ARCHITECTURE

Step 4: Design the Regulator API and Portal

Build a secure, auditable interface for regulators to access and verify on-chain compliance data in real-time.

A regulator-facing portal is not a standard dashboard; it is a read-only, audit-grade interface designed for transparency and verification. Its primary function is to provide authorized regulatory bodies with direct, immutable access to the compliance data your oracle network attests to on-chain. This architecture shifts the burden of proof from periodic manual reporting to continuous, automated verification. The core components are a secure API layer and a web-based portal that visualizes the data feeds, proof of attestation, and validator signatures.

The Regulator API must be built with strict authentication, authorization, and audit logging. Use API keys or OAuth 2.0 with the client credentials grant, tied to specific regulator entities. All requests should be logged with timestamps, IP addresses, and query parameters to create an immutable access trail. The API endpoints should serve structured data, such as GET /v1/feeds/{feedId}/attestations to retrieve all attestation transactions for a specific compliance rule, or GET /v1/validators/{address}/reputation to show a validator's historical performance and stake.
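A minimal sketch of such an API in TypeScript with Express, assuming a per-regulator API key and a stubbed data layer; the header name, environment variable, and lookupAttestations helper are placeholders.

```typescript
import express from "express";

const app = express();

// Keys provisioned per regulator entity; an empty set rejects everything.
const REGULATOR_KEYS = new Set((process.env.REGULATOR_API_KEYS ?? "").split(",").filter(Boolean));

app.use((req, res, next) => {
  const key = req.header("x-api-key") ?? "";
  if (!REGULATOR_KEYS.has(key)) {
    res.status(401).json({ error: "unauthorized" });
    return;
  }
  // Append-only audit trail of every regulator query.
  console.log(JSON.stringify({ at: new Date().toISOString(), ip: req.ip, path: req.path, query: req.query }));
  next();
});

app.get("/v1/feeds/:feedId/attestations", async (req, res) => {
  const rows = await lookupAttestations(req.params.feedId); // indexer/database stub
  res.json(rows);
});

async function lookupAttestations(feedId: string) {
  // Each row carries the on-chain references the regulator needs to verify it.
  return [{ feedId, value: 5.33, txHash: "0xabc...", blockNumber: 21000000, logIndex: 3 }];
}

app.listen(8443);
```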

Data served by the API must be cryptographically verifiable. Each response should include the on-chain transaction hash, block number, and the raw event logs for any reported data point. This allows regulators to independently verify that the data presented in the portal matches what is permanently recorded on the blockchain. For example, a response for an AML check might include the txHash of the attestation and the logIndex where the ComplianceAttested event was emitted, enabling direct cross-referencing with a block explorer.
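One possible response shape for a single attestation, with illustrative values; the txHash, blockNumber, and logIndex fields are what let a regulator cross-reference the record on a block explorer.

```json
{
  "feedId": "AML_SANCTIONS_CHECK",
  "value": "PASS",
  "attestedAt": 1735689600,
  "txHash": "0x7f3c...e9a1",
  "blockNumber": 21000000,
  "logIndex": 3,
  "event": "ComplianceAttested"
}
```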

The web portal visualizes this data for practical use. Key features include: a live feed of all attestations, detailed views of specific compliance checks (showing triggering addresses, rule parameters, and validator consensus), and validator network health metrics. Crucially, every data point on the portal should have a "Verify on Chain" button that links to the exact transaction on a block explorer like Etherscan or a dedicated archive node. The interface must be clear, devoid of complex Web3 jargon, and focused on audit trails.

Security is paramount. The portal and API should be hosted on infrastructure separate from your core oracle node operations to limit attack surfaces. Implement rate limiting, DDoS protection, and regular security audits. Consider using a zero-trust architecture where every access request is authenticated and authorized, and all data is encrypted in transit using TLS 1.3. Access should be provisioned and revoked programmatically, with clear SLAs for uptime and data freshness to meet regulatory expectations for reliable access.

Finally, design for extensibility and standards. As regulatory requirements evolve, new data feeds and report formats will be needed. Structure your API using a versioned endpoint strategy (e.g., /v1/, /v2/) and consider adopting emerging standards like OpenAPI for documentation. Providing regulators with a well-documented, reliable, and verifiable window into your on-chain attestations builds essential trust and demonstrates a proactive approach to compliant decentralized finance.

REGULATORY COMPLIANCE

Oracle Architecture Pattern Comparison

Comparison of common oracle patterns for building data feeds that meet regulatory requirements for auditability, data provenance, and dispute resolution.

| Architectural Feature | Centralized API Proxy | Multi-Source Aggregation | Signed Data Attestation |
| --- | --- | --- | --- |
| Data Provenance & Audit Trail |  |  |  |
| Real-Time Data Latency | < 1 sec | 2-5 sec | 1-3 sec |
| On-Chain Data Storage Cost | Low | High | Medium |
| Regulatory Audit Support | Limited | Comprehensive | Comprehensive |
| Dispute Resolution Mechanism | Off-chain only | On-chain challenge period | On-chain cryptographic proof |
| Single Point of Failure Risk | High | Medium | Low |
| Implementation Complexity | Low | High | Medium |
| Example Protocol | Chainlink Any API | Chainlink Data Feeds | Pyth Network |

ARCHITECTING FOR REGULATORS

Step 5: Ensure Security and High Availability

Building a data feed for financial regulators requires a system that is not only secure but also consistently available and verifiably correct. This step focuses on the architectural patterns that meet these stringent demands.

A regulator-facing oracle must prioritize data integrity and liveness above all else. Unlike consumer DeFi applications, the tolerance for downtime or incorrect data is near zero. The architecture should be designed with redundancy at every layer: multiple independent data sources, a decentralized network of node operators, and fallback mechanisms for every critical component. This multi-layered approach ensures that a single point of failure, whether a data provider outage or a node compromise, does not halt the feed or corrupt the output.

Security for these systems extends beyond smart contract audits. It requires a robust cryptographic attestation model. Each data point should be signed by the oracle node's private key, creating an immutable, on-chain record of provenance. Regulators can then cryptographically verify that the data they received matches what was committed to the blockchain. Furthermore, implement a slashing mechanism where node operators risk their staked collateral for malicious behavior, such as submitting incorrect data or going offline during a critical update window.

High availability is engineered through active-active redundancy. Deploy identical oracle node clusters across separate cloud providers and geographic regions. Use a load balancer and health checks to route requests only to healthy nodes. For the consensus layer, a Byzantine Fault Tolerant (BFT) protocol, such as Tendermint Core (used by Cosmos-based oracle chains like BandChain), is essential. It guarantees liveness (the network continues) and safety (validators never commit conflicting blocks) as long as less than one-third of the validating power is malicious.

Data sourcing must also be redundant. Connect to multiple primary data vendors (e.g., Bloomberg, Reuters) and aggregate their prices using a median or trimmed mean function to filter out outliers. This aggregation should occur off-chain within the oracle network's secure enclave before a single, validated value is broadcast on-chain. Maintain a publicly accessible attestation registry, perhaps on IPFS or a dedicated subgraph, that logs every data fetch, signature, and on-chain submission for external auditability.
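A minimal sketch of the outlier-filtering aggregation described here, as a trimmed mean with a median fallback; the trim depth is a tunable assumption.

```typescript
// Sort the observations, drop the k highest and k lowest, average the rest.
function trimmedMean(values: number[], trim = 1): number {
  if (values.length === 0) throw new Error("no observations");
  const sorted = [...values].sort((a, b) => a - b);
  if (sorted.length <= 2 * trim) {
    // Too few sources to trim: fall back to the plain median.
    const mid = Math.floor(sorted.length / 2);
    return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
  }
  const kept = sorted.slice(trim, sorted.length - trim);
  return kept.reduce((sum, v) => sum + v, 0) / kept.length;
}

// trimmedMean([101.2, 101.25, 101.3, 180.0]) discards the 180.0 outlier.
```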

Finally, establish a formal incident response plan and service level objective (SLO) documentation. Define clear procedures for manual intervention, key rotation, and disaster recovery. Your SLOs should publicly commit to metrics like 99.99% uptime and sub-second latency for price updates. This transparency and preparedness are critical for passing operational due diligence reviews conducted by regulatory bodies before they integrate your data feed into their monitoring systems.

REGULATORY ORACLES

Frequently Asked Questions

Common technical questions and solutions for building oracle data feeds that meet regulatory compliance requirements.

What is a regulator-facing oracle?

A regulator-facing oracle is a specialized data feed designed to provide verifiable, auditable, and legally recognized data to on-chain applications subject to compliance rules, such as those in Real-World Assets (RWA) or regulated DeFi. Unlike standard oracles that prioritize speed and cost for price feeds, regulator-facing oracles emphasize:

  • Data Provenance: Immutable proof of the original data source and its chain of custody.
  • Audit Trail: A tamper-resistant log of all data submissions, transformations, and access events.
  • Legal Attestation: Data signed or attested by recognized legal entities or trusted third parties.
  • Regulatory Jurisdiction: Explicit mapping of data to specific legal frameworks (e.g., MiCA, SEC rules).

Architecturally, this often requires a multi-layered approach that combines on-chain verification with off-chain attestation services such as Chainlink Proof of Reserve, API3's first-party dAPIs, or custom zero-knowledge proof (ZKP) circuits for data privacy.

ARCHITECTURAL SUMMARY

Conclusion and Next Steps

Building a regulator-facing oracle data feed requires a deliberate architecture that prioritizes auditability, data integrity, and legal compliance over raw speed and cost.

Architecting a regulator-facing oracle is fundamentally different from building for DeFi. The primary design goals shift from low-latency and low-cost to immutable audit trails, proven data provenance, and regulatory-grade attestations. Your system must be able to withstand forensic examination, proving not just what data was delivered, but how and from where it was sourced, and who verified it. This often involves a multi-layered approach combining on-chain data feeds with off-chain legal agreements and compliance frameworks.

A robust implementation typically involves several key components working in concert. The data sourcing layer must use vetted, primary sources with clear licensing. The validation and attestation layer requires trusted, identifiable entities (not anonymous nodes) to cryptographically sign data submissions. The on-chain delivery should use a commit-reveal scheme or zk-proofs to maintain transparency while managing front-running risks. Finally, a persistent off-chain data availability layer, like Arweave or Filecoin, is essential for storing the full attestation packages and source proofs that are too large for on-chain storage.

For next steps, begin by formalizing your data requirements with legal and compliance teams. Identify the specific regulatory reporting standards (e.g., MiCA, SEC rules) your feed must satisfy. Then, prototype the core attestation flow using a framework like Chainlink Functions or a custom Oracle.sol contract with a permissioned node set. Test the entire lifecycle—from data fetch and signing to on-chain delivery and off-chain archiving—in a testnet environment. Document every assumption and design decision, as this documentation will be as critical as the code itself during any regulatory review.

The long-term evolution of these systems points toward increasing automation of compliance. Look to integrate with on-chain KYC/AML frameworks such as Circle's Verite to cryptographically verify the identity of data providers and attestors. Explore zero-knowledge proofs (ZKPs) to allow regulators to verify data correctness and processing without exposing sensitive source data. The end state is a verifiable data pipeline where regulatory compliance is a programmable, auditable feature of the oracle's architecture, not an external afterthought.
