Free 30-min Web3 Consultation
Book Consultation
Smart Contract Security Audits
View Audit Services
Custom DeFi Protocol Development
Explore DeFi
Full-Stack Web3 dApp Development
View App Services

How to Implement Dynamic Compliance Based on Token Type

A developer tutorial for building a system where compliance rules are dynamically applied based on tokenized asset classification, using a registry and factory pattern.
Chainscore © 2026
introduction
GUIDE

How to Implement Dynamic Compliance Based on Token Type

This guide explains how to build smart contracts that enforce different compliance rules depending on whether a token is fungible (ERC-20/ERC-777) or non-fungible (ERC-721/ERC-1155).

Token standards define a token's fundamental properties, and compliance logic must adapt accordingly. A fungible token like an ERC-20 represents interchangeable units, so compliance typically focuses on transfer amounts, sender/receiver addresses, and cumulative volume limits. In contrast, a non-fungible token (NFT) like an ERC-721 represents a unique asset, requiring checks on specific token IDs, collection-level rules, or metadata attributes. A dynamic system uses the token's interface to detect its type and apply the correct rule set.

The core technical challenge is creating a modular compliance engine that can be attached to various token contracts. This is often achieved through a proxy pattern or a modular hook system. For example, an ICompliance interface can define functions like checkTransferCompliance. A master registry then maps token addresses to specific compliance module addresses. When a transfer is initiated, the token contract calls its assigned module, which executes logic tailored to the token type, such as checking an allowlist for an ERC-20 or verifying ownership duration for an ERC-721.
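The dispatch pattern described above can be sketched as follows. The `ICompliance` interface and registry lookup follow the text; the storage layout, access control, and the extra `data` parameter (used to carry NFT metadata context) are illustrative assumptions:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

/// Minimal compliance interface as described above. `data` can carry
/// extra context (e.g., metadata for an NFT) or be empty for ERC-20s.
interface ICompliance {
    function checkTransferCompliance(
        address from,
        address to,
        uint256 amountOrId,
        bytes calldata data
    ) external returns (bool);
}

/// Illustrative master registry mapping token contracts to modules.
contract ComplianceRegistry {
    address public owner;
    mapping(address => ICompliance) public moduleFor;

    constructor() { owner = msg.sender; }

    function setModule(address token, ICompliance module) external {
        require(msg.sender == owner, "not authorized");
        moduleFor[token] = module;
    }

    /// Called by a token contract during its transfer hook.
    function check(address token, address from, address to, uint256 amountOrId)
        external
        returns (bool)
    {
        ICompliance module = moduleFor[token];
        // No module assigned: fall back to a strict default (reject).
        if (address(module) == address(0)) return false;
        return module.checkTransferCompliance(from, to, amountOrId, "");
    }
}
```

Because every module implements the same interface, the registry can dispatch without knowing whether the underlying token is fungible or non-fungible.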

For fungible tokens, implement checks using the transfer's amount and participant addresses. Common rules include per-transaction limits, daily volume caps using timestamps, and sanctions screening against on-chain or off-chain lists (e.g., OFAC). The checkTransferCompliance function for an ERC-20 might look up the sender's total transferred volume in a rolling 24-hour period from a storage mapping and revert if a cap is exceeded.
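A sketch of such a module is below. For gas efficiency it uses a fixed 24-hour window that resets after a sender's first transfer rather than a true rolling sum; the cap values and struct layout are illustrative:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

/// Illustrative ERC-20 compliance module: per-transaction and daily
/// volume caps, with the window state packed into one storage slot.
contract FungibleVolumeCompliance {
    uint256 public constant MAX_PER_TX = 10_000e18;
    uint256 public constant DAILY_CAP = 50_000e18;

    struct Window {
        uint64 start;        // timestamp when the current window opened
        uint192 transferred; // volume accumulated in the window
    }
    mapping(address => Window) private windows;

    function checkTransferCompliance(
        address from,
        address, /* to */
        uint256 amount,
        bytes calldata
    ) external returns (bool) {
        require(amount <= MAX_PER_TX, "per-tx limit exceeded");

        Window storage w = windows[from];
        if (block.timestamp >= w.start + 1 days) {
            // Window expired: open a fresh one.
            w.start = uint64(block.timestamp);
            w.transferred = 0;
        }
        require(w.transferred + amount <= DAILY_CAP, "daily cap exceeded");
        w.transferred += uint192(amount);
        return true;
    }
}
```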

For non-fungible tokens, logic must incorporate the tokenId. Compliance can restrict transfers based on the NFT's traits (e.g., only 'Gold' tier NFTs are transferable), enforce holding periods (using the timestamp of when the tokenId was minted or last transferred), or implement gated community access. An ERC-721 compliance module would receive the tokenId as a parameter and could query a separate metadata contract to evaluate the asset's properties before approving the transfer.
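The holding-period case can be sketched as a module keyed by tokenId. The update hook, constant, and trust assumption (only the governed token may record transfers) are illustrative:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

/// Illustrative ERC-721 compliance module enforcing a minimum holding
/// period per tokenId. The token contract is expected to call
/// recordTransfer after each successful transfer.
contract NftHoldingPeriodCompliance {
    uint256 public constant MIN_HOLD = 30 days;
    address public immutable token; // the ERC-721 this module governs

    mapping(uint256 => uint256) public lastTransferAt; // tokenId => timestamp

    constructor(address token_) { token = token_; }

    function checkTransferCompliance(
        address, /* from */
        address, /* to */
        uint256 tokenId,
        bytes calldata
    ) external view returns (bool) {
        uint256 last = lastTransferAt[tokenId];
        // Never-transferred tokens (fresh mints) pass; otherwise enforce the hold.
        return last == 0 || block.timestamp >= last + MIN_HOLD;
    }

    function recordTransfer(uint256 tokenId) external {
        require(msg.sender == token, "only token");
        lastTransferAt[tokenId] = block.timestamp;
    }
}
```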

To future-proof the system, design it to be extensible for new token standards like ERC-1155 (which can be both fungible and non-fungible) or ERC-3643 (for permissioned tokens). Use abstract contracts and interfaces. A well-designed factory can deploy new compliance module instances for each token, and an upgradeable proxy pattern allows for rule updates without migrating the token contract itself. Always include comprehensive event logging for all compliance decisions to ensure auditability.

Testing is critical. Write unit tests for each compliance module using frameworks like Foundry or Hardhat. Test edge cases: transferring the maximum allowed amount of an ERC-20, transferring a restricted NFT tokenId, and attempting transfers that should fail. Consider integrating with Chainlink Functions or a similar oracle service to fetch real-world compliance data, like fluctuating regulatory lists, ensuring your dynamic system remains current without requiring manual contract upgrades.

prerequisites
DYNAMIC COMPLIANCE

Prerequisites

Before implementing dynamic compliance logic, you must establish the foundational technical environment and understand the core concepts of token classification.

To build a system that applies different rules based on token type, you need a development environment configured for blockchain interaction. This includes installing a Node.js runtime (v18+), a package manager like npm or yarn, and a code editor such as VS Code. You will also need access to a blockchain node or a provider service like Alchemy, Infura, or a local Hardhat/Anvil instance for testing. Essential libraries include a client library like ethers.js (v6) or web3.js (v1.x) for interacting with smart contracts and the JSON-RPC API.

A critical prerequisite is understanding the token standards you will be evaluating. The primary standards on Ethereum and EVM-compatible chains are ERC-20 for fungible tokens, ERC-721 for non-fungible tokens (NFTs), and ERC-1155 for multi-tokens. Your compliance logic will branch based on these standards' interface identifiers. For ERC-721 (0x80ac58cd) and ERC-1155 (0xd9b67a26), querying a contract's supportsInterface function (ERC-165) is the definitive method for on-chain token type detection. ERC-20 predates ERC-165, however, so most ERC-20 contracts do not implement supportsInterface; detection for fungible tokens typically falls back to heuristics such as probing for decimals() or totalSupply().
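A minimal on-chain classifier combining ERC-165 detection with an ERC-20 heuristic might look like the following sketch (the enum, probe order, and contract name are assumptions for illustration):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

interface IERC165 {
    function supportsInterface(bytes4 interfaceId) external view returns (bool);
}

/// Illustrative classifier: ERC-721 and ERC-1155 are detected via ERC-165;
/// ERC-20 is inferred by probing decimals() -- common but not foolproof.
contract TokenClassifier {
    enum TokenType { UNKNOWN, ERC20, ERC721, ERC1155 }

    bytes4 private constant ERC721_ID = 0x80ac58cd;
    bytes4 private constant ERC1155_ID = 0xd9b67a26;

    function classify(address token) external view returns (TokenType) {
        if (_supports(token, ERC721_ID)) return TokenType.ERC721;
        if (_supports(token, ERC1155_ID)) return TokenType.ERC1155;
        // Heuristic ERC-20 probe: a successful decimals() call.
        (bool ok, bytes memory ret) =
            token.staticcall(abi.encodeWithSignature("decimals()"));
        if (ok && ret.length >= 32) return TokenType.ERC20;
        return TokenType.UNKNOWN;
    }

    function _supports(address token, bytes4 id) private view returns (bool) {
        // staticcall tolerates contracts that don't implement ERC-165 at all.
        (bool ok, bytes memory ret) = token.staticcall(
            abi.encodeWithSelector(IERC165.supportsInterface.selector, id)
        );
        return ok && ret.length >= 32 && abi.decode(ret, (bool));
    }
}
```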

You should have a smart contract or backend service architecture in mind where the compliance logic will reside. Common patterns include a modular rule engine, a proxy contract that routes transactions through checks, or an off-chain API service. You will need to design data structures to hold different rule sets—for example, a mapping from tokenType to a struct containing maxTransferAmount, requiredKYCLevel, and allowedJurisdictions. Decisions made here will affect gas costs (for on-chain logic) and latency (for off-chain services).

Finally, ensure you have access to real-world data for testing. Use verified contract addresses for major tokens like USDC (ERC-20), Bored Ape Yacht Club (ERC-721), and The Sandbox ASSETs (ERC-1155), either on a mainnet fork or via testnet deployments on Sepolia (Goerli has been deprecated). Having these addresses allows you to test your detection logic against live contract bytecode. Understanding these prerequisites ensures you can focus on implementing the dynamic business logic rather than troubleshooting environment or data issues.

architecture-overview
DYNAMIC COMPLIANCE

System Architecture Overview

Designing a system that automatically adjusts its security and compliance logic based on the type of token being processed.

A dynamic compliance architecture moves beyond one-size-fits-all rules by classifying tokens and applying tailored logic. The core design involves a token classifier that inspects incoming transactions to determine the token type—such as ERC-20, ERC-721, ERC-1155, or a wrapped asset. This classification triggers a specific compliance module containing the relevant rules. For instance, a FungibleTokenModule might enforce daily transfer limits and OFAC sanctions screening, while a NonFungibleTokenModule could focus on royalty enforcement and creator verification. This modular approach centralizes policy management and makes the system adaptable to new token standards.

Implementation begins with defining a canonical source of truth for token metadata, often an on-chain registry or an off-chain API that maps contract addresses to their properties. The classifier can use the ERC-165 standard's supportsInterface function to programmatically identify a contract's capabilities. A factory pattern is then used to instantiate the correct compliance handler. For example, a ComplianceFactory contract would have a function like getModuleForToken(address tokenAddress) that returns the address of the deployed module responsible for that token's rule set, enabling clean separation of concerns.

Practical applications are evident in cross-chain bridges and institutional custodians. A bridge might apply strict travel rule logic for stablecoins like USDC but only basic anti-fraud checks for a governance token. Code-wise, a module interface ensures consistency:

solidity
interface IComplianceModule {
    function validateTransfer(address from, address to, uint256 amountOrId) external returns (bool);
    function getRequiredChecks() external view returns (string[] memory);
}

Each concrete module implements this interface with its specific validation logic, allowing the main system to call validateTransfer without knowing the underlying token type.

Key challenges include managing gas costs for complex off-chain checks and ensuring the classifier remains up-to-date with emerging token standards like ERC-404. Best practices involve using upgradeable proxy patterns for compliance modules to allow rule updates without migrating the entire system and maintaining a fallback to a strict default policy for unclassified tokens. This architecture future-proofs DeFi applications and custodial services, providing a scalable framework for regulatory adherence across diverse digital assets.

key-concepts
DYNAMIC COMPLIANCE

Key Concepts and Components

Implementing automated compliance rules that adapt based on token classification, jurisdiction, and holder status.

01

Token Classification Frameworks

The foundation of dynamic compliance is accurate token classification. Key frameworks include:

  • FATF's VASP Guidance: Defines Virtual Asset Service Providers and their obligations.
  • SEC's Howey Test: Determines if a token is an "investment contract" (security) in the US.
  • MiCA's Classification (EU): Categorizes tokens as asset-referenced tokens (ARTs), e-money tokens (EMTs), or utility tokens.
  • Technical Standards: ERC-20 (fungible), ERC-721/1155 (non-fungible/NFT). Implement on-chain registries or oracles to tag tokens with these classifications for automated rule application.
02

On-Chain Identity & Credentials

Dynamic rules require verifying user attributes. Solutions include:

  • Decentralized Identifiers (DIDs): Self-sovereign identities (e.g., W3C standard) that users control.
  • Verifiable Credentials (VCs): Tamper-proof attestations (like KYC/AML status or accreditation) issued by trusted entities.
  • Zero-Knowledge Proofs (ZKPs): Allow users to prove compliance (e.g., "I am over 18" or "I am not a sanctioned entity") without revealing underlying data. Protocols like Polygon ID or Veramo provide toolkits for integrating these primitives.
03

Programmable Compliance Modules

Embed logic directly into smart contracts or use modular attachable policies.

  • Smart Contract Rules: Code restrictions like require(isAccreditedInvestor[msg.sender], "Not accredited"); or transfer limits based on token type.
  • Policy Engines: Use OpenZeppelin Defender for admin-managed rules or KYCDAO's SDK for plug-in compliance.
  • Composable Security: Implement the Diamond Standard (EIP-2535) to upgrade compliance logic without migrating the core token contract.
04

Jurisdictional Rule Engines

Compliance rules must vary by user location. Implement:

  • Geolocation Oracles: Services like Chainlink Functions can fetch verified location data (IP-based, with user consent) to trigger jurisdiction-specific rules.
  • Rule Sets as Data: Store country-specific regulations (e.g., US OFAC SDN list, EU travel rule thresholds) in an updatable on-chain or off-chain database.
  • Automated Sanctions Screening: Integrate with APIs from providers like Chainalysis or Elliptic to screen addresses in real-time against global watchlists before permitting transactions.
05

Real-Time Monitoring & Reporting

Dynamic systems need continuous oversight.

  • Event Listening: Use The Graph to index and query transaction events related to restricted token types or flagged addresses.
  • Anomaly Detection: Implement algorithms to detect unusual patterns (e.g., rapid fragmentation of large holdings) that may indicate evasion attempts.
  • Automated Reporting: Generate regulatory reports (e.g., for the EU's Travel Rule) by structuring transaction data with sender/receiver VASP information using standards like IVMS 101.
06

Case Study: Security Token Transfers

A practical example for an ERC-3643 security token:

  1. Pre-Transfer Check: The contract calls a verifyTransfer function.
  2. Dynamic Validation: It checks an on-chain registry: Is this token classified as a security? Is the recipient's address KYC'd and an accredited investor in their jurisdiction?
  3. Rule Application: If checks pass, the transfer proceeds. If not, it routes to a custodian or compliance officer for manual approval.
  4. Audit Trail: All checks and results are immutably logged on-chain for regulators. This ensures only eligible holders can transact, adapting rules per user.
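The four-step flow above can be condensed into a sketch. Real ERC-3643 deployments delegate to separate identity-registry and compliance contracts; here both checks are inlined and the mappings are illustrative:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

/// Simplified sketch of the pre-transfer flow described above, with both
/// eligibility checks inlined for illustration.
contract SecurityTokenChecks {
    mapping(address => bool) public isKycVerified;
    mapping(address => bool) public isAccredited;

    // Audit trail: every decision is logged for regulators and indexers.
    event TransferValidated(address from, address to, uint256 amount, bool approved);

    function verifyTransfer(address from, address to, uint256 amount)
        external
        returns (bool approved)
    {
        // Dynamic validation: recipient must be KYC'd and accredited.
        approved = isKycVerified[to] && isAccredited[to];
        emit TransferValidated(from, to, amount, approved);
    }
}
```

A non-approving result would then be routed off-chain (or via a separate escrow contract) to a custodian or compliance officer for manual review, as described in step 3.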
COMPLIANCE MATRIX

Token Type to Compliance Rule Mapping

Mapping of common token standards to recommended on-chain compliance controls.

The following controls are commonly mapped across ERC-20 (Fungible), ERC-721 (NFT), and ERC-1155 (Semi-Fungible) tokens:

  • Transfer Restrictions (e.g., Blocklists)
  • Transaction Volume Limits
  • Holder Count Limits
  • Royalty Enforcement on Secondary Sales
  • Minting Allowlist/Gatelist
  • Transfer Cooldown Periods (typically 0-24 hours)
  • Required Verifiable Credentials for Holders

step1-registry
FOUNDATION

Step 1: Building the Token Metadata Registry

A dynamic compliance system requires a canonical source of truth for token attributes. This step involves creating an on-chain registry that maps token addresses to their compliance-relevant metadata.

The core of dynamic compliance is the Token Metadata Registry, a smart contract that stores essential attributes for each supported token. This registry acts as the system's single source of truth, enabling downstream rules to query a token's type (e.g., ERC-20, ERC-721, ERC-1155), its associated risk category (e.g., STABLE, GOVERNANCE, MEME), and other flags like isSanctioned or requiresKYC. Without this registry, compliance logic would need hardcoded addresses, making it brittle and impossible to update for new assets.

Implementing the registry typically involves a mapping from address to a struct containing the metadata. A common pattern is to use an upgradeable contract or delegate management to a multi-signature wallet or DAO for governance. For example, a basic struct might include: tokenType, riskTier, issuer, and a metadataURI pointing to an off-chain JSON file for extended details. The OpenZeppelin EnumerableMap library is useful for efficiently storing and iterating over these entries.

Here is a simplified Solidity code snippet illustrating the registry's storage structure:

solidity
struct TokenInfo {
    bytes32 tokenStandard; // e.g., keccak256("ERC20")
    uint8 riskTier; // 0=Low, 1=Medium, 2=High
    address issuer;
    bool isSanctioned;
}
mapping(address => TokenInfo) public tokenMetadata;
address[] public registeredTokens;

This allows other contracts to query tokenMetadata[tokenAddress].riskTier to apply corresponding transfer limits or validation rules.

Populating the registry is a critical operational task. For mainnet deployment, you would seed it with verified data for major assets like USDC, WETH, and popular governance tokens. The process often involves an initial administrative setup followed by a community-driven proposal system for adding new tokens. It's crucial to implement strict access controls, typically using OpenZeppelin's Ownable or AccessControl, to ensure only authorized parties can update sanction status or risk classifications, preventing malicious manipulation.

Finally, the registry must be designed for low-latency queries. Compliance checks often happen within a single transaction, so gas efficiency is key. Consider storing frequently accessed data (like riskTier and isSanctioned) in a single uint256 packed with bitwise operations, and emit events for every metadata update to allow indexers and frontends to stay synchronized. This registry becomes the foundational layer that all subsequent compliance modules, such as transfer validators or wallet screening tools, will depend upon.
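The bit-packing mentioned above can be sketched as a small library. The field layout (riskTier in bits 0-7, the sanctioned flag in bit 8) is an illustrative choice, not a fixed convention:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

/// Illustrative packing of hot-path metadata into a single uint256 slot:
/// bits 0-7 hold riskTier, bit 8 holds the sanctioned flag.
library PackedTokenFlags {
    function pack(uint8 riskTier, bool sanctioned) internal pure returns (uint256) {
        return uint256(riskTier) | (sanctioned ? uint256(1) << 8 : 0);
    }

    function riskTier(uint256 packed) internal pure returns (uint8) {
        return uint8(packed & 0xff);
    }

    function isSanctioned(uint256 packed) internal pure returns (bool) {
        return (packed >> 8) & 1 == 1;
    }
}
```

Reading both fields then costs a single SLOAD instead of two, which matters when the check runs inside every transfer.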

step2-rule-modules
STEP 2

Creating Modular Compliance Rules

Implement dynamic, token-specific compliance logic using Chainscore's rule engine to automate risk assessment and transaction validation.

Modular compliance rules are the core logic units that determine whether a transaction or wallet address is compliant. Unlike static rules, they are designed to be dynamic and context-aware, evaluating data like token type, transaction history, and on-chain behavior. In Chainscore, you define these rules using a combination of pre-built modules and custom JavaScript functions, which are then compiled into verifiable zero-knowledge circuits for private execution. This approach allows protocols to enforce complex policies—such as restricting NFT purchases to verified holders or limiting DeFi interactions based on wallet age—without exposing sensitive user data.

The rule engine evaluates conditions against a standardized Compliance Context Object. This object contains structured data about the transaction being checked, including the tokenType (e.g., ERC20, ERC721, ERC1155), sender and receiver addresses, amount, and relevant on-chain state. Your rule's logic inspects this context to return a boolean (true for compliant, false for non-compliant). For example, a rule for an ERC721 token sale might check if the sender is the verified owner of the token ID in question, while a rule for a governance token might verify the holder's voting power exceeds a minimum threshold.

To implement a rule, you write a function in Chainscore's Rule SDK. Here is a basic example that restricts transfers of a specific ERC721 token to a pre-approved list of addresses:

javascript
// rules/restricted-nft-transfer.js
export const handler = async (context) => {
  const ALLOWED_RECEIVERS = [
    '0x742d35Cc6634C0532925a3b844Bc9e...',
    '0x53f0C9F1b6eA8F...'
  ];
  // Apply rule only to ERC721 tokens
  if (context.tokenType !== 'ERC721') return true;
  // Check if receiver is in allowlist
  return ALLOWED_RECEIVERS.includes(context.receiver.toLowerCase());
};

After writing your rule, you register it with the Chainscore protocol, which generates a ZK proof of its logic. This proof, not the underlying rule data, is what is verified on-chain, ensuring the rule's integrity and privacy.

For more complex scenarios, you can compose multiple conditions and leverage external data via Chainscore's oracle modules. Consider a DeFi protocol that wants to enforce a tiered access system: users must hold a minimum balance of a specific ERC20 token and have completed a KYC process with a partner verifier. This requires checking on-chain token balances (via the TokenBalanceModule) and an off-chain attestation (via the KYCOracleModule). The rule's logic seamlessly integrates these checks, and the resulting proof cryptographically attests that both conditions were satisfied without revealing the user's actual balance or KYC details.
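A composed rule of that shape might look like the sketch below. The context field names (`senderBalance`, `kycVerified`) are hypothetical stand-ins for values the TokenBalanceModule and KYCOracleModule would supply, not the real Chainscore Rule SDK schema:

```javascript
// rules/tiered-access.js -- illustrative composed rule.
// Assumes the runtime pre-populates `context` with module outputs:
//   context.senderBalance (string, smallest units) and context.kycVerified (bool).
const MIN_BALANCE = 1000n; // illustrative threshold in smallest units

export const handler = async (context) => {
  // Gate only fungible-token interactions; other token types pass through.
  if (context.tokenType !== 'ERC20') return true;
  const hasBalance = BigInt(context.senderBalance) >= MIN_BALANCE;
  const hasKYC = context.kycVerified === true;
  // Compliant only when BOTH the balance and KYC checks hold.
  return hasBalance && hasKYC;
};
```

The generated proof would then attest that both conditions held without disclosing the balance or KYC details themselves.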

Finally, deploy and manage your rules through the Chainscore Dashboard or CLI. Each rule is assigned a unique Rule ID and version, allowing for easy updates and audits. You can attach rules to specific token contracts, geographies, or user segments. By building a library of modular rules, you create a flexible compliance layer that can adapt to new regulations, token standards, and business logic without requiring smart contract upgrades, significantly reducing operational overhead and technical debt for your Web3 application.

step3-factory-pattern
ARCHITECTURE

Step 3: Implementing the Token Factory

This section details the core factory contract logic for deploying tokens with embedded, type-specific compliance rules.

The token factory is a smart contract that acts as a blueprint deployer. Its primary function is to instantiate new token contracts based on a predefined template, but with parameters that determine their compliance behavior. Instead of writing separate contracts for a utility token, a security token, and a stablecoin, you create a single, audited factory that stamps out instances configured for each type. This pattern ensures consistency, reduces deployment gas costs, and centralizes upgrade logic for the compliance modules. Key parameters passed during deployment include the token's name, symbol, initialSupply, and a tokenType enum (e.g., UTILITY, SECURITY, STABLECOIN).

The factory's intelligence lies in how it maps the tokenType to specific compliance rules. Upon deployment, it calls an internal function, _attachComplianceModules(tokenType), which configures the new token instance. For a SECURITY type, this might automatically enable a transfer restrictor that checks an on-chain whitelist and a cap table manager. For a UTILITY token, it might attach only a simple blacklist module. This dynamic attachment is achieved using Solidity's delegatecall or by setting immutable variables within the token that point to the logic of different internal functions or external library contracts.

Here is a simplified code snippet illustrating the factory's deployment function:

solidity
function createToken(
    string memory name,
    string memory symbol,
    uint256 initialSupply,
    TokenType tokenType
) public returns (address) {
    Token newToken = new Token(name, symbol, initialSupply);
    _configureTokenByType(address(newToken), tokenType);
    emit TokenCreated(address(newToken), tokenType, msg.sender);
    return address(newToken);
}

function _configureTokenByType(address tokenAddress, TokenType tType) internal {
    IToken token = IToken(tokenAddress);
    if (tType == TokenType.SECURITY) {
        token.setTransferRestrictor(whitelistContract);
        token.setCapTableManager(capTableContract);
    } else if (tType == TokenType.STABLECOIN) {
        token.setPauser(pauserAddress);
        token.setBlacklist(blacklistContract);
    }
    // UTILITY token may have no extra modules set
}

Implementing the factory this way requires careful design of your base token contract's interface. The Token contract must expose configuration functions (like setTransferRestrictor) that can only be called by the factory or a designated owner, typically using the initializer pattern or access control modifiers. This prevents the deployed token's rules from being altered arbitrarily after creation. The compliance modules (whitelist, blacklist, pauser) are often separate contracts referenced by address, promoting modularity and allowing individual modules to be upgraded without redeploying every token.

For production systems, consider extending this pattern with a module registry. Instead of hardcoding module addresses in the factory, the factory can fetch them from a registry contract using the tokenType as a key. This allows for governance-driven updates to the compliance logic for all future tokens of a certain type. Furthermore, emitting a clear TokenCreated event with all parameters is essential for off-chain indexers and dashboards to correctly categorize and monitor the newly deployed compliant assets.

step4-integration-testing
IMPLEMENTATION

Step 4: Integration and Testing Strategy

This guide details the integration of dynamic compliance logic into your smart contract and outlines a robust testing strategy to ensure its security and correctness.

Integrating dynamic compliance begins with defining a clear interface for your rule engine. This is typically a separate contract that your main token contract will query. The core function is checkTransferRestrictions(address from, address to, uint256 amount, bytes memory data), which returns a boolean and an optional error message. Your token's _beforeTokenTransfer hook (OpenZeppelin Contracts v4; v5 replaced it with the _update hook) should call this function and revert if the check fails. This separation of concerns keeps your token logic clean and allows for upgrading the compliance rules independently.
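The wiring can be sketched as follows, using the OpenZeppelin v4 hook; the token name, symbol, and the decision to check mints as well as transfers are illustrative:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

import {ERC20} from "@openzeppelin/contracts/token/ERC20/ERC20.sol";

interface IRuleEngine {
    function checkTransferRestrictions(
        address from,
        address to,
        uint256 amount,
        bytes memory data
    ) external returns (bool allowed, string memory reason);
}

/// Illustrative token that routes every movement through the rule engine.
contract CompliantToken is ERC20 {
    IRuleEngine public immutable ruleEngine;

    constructor(address engine) ERC20("Compliant", "CMP") {
        ruleEngine = IRuleEngine(engine);
    }

    function _beforeTokenTransfer(address from, address to, uint256 amount)
        internal
        override
    {
        super._beforeTokenTransfer(from, to, amount);
        // Mints (from == address(0)) are checked too; relax if desired.
        (bool allowed, string memory reason) =
            ruleEngine.checkTransferRestrictions(from, to, amount, "");
        require(allowed, reason);
    }
}
```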

For testing, a comprehensive strategy is essential. Start with unit tests for the rule engine contract in isolation, verifying each token type's specific logic (e.g., a RESTRICTED token rejecting unverified addresses). Then, write integration tests for the full token contract, simulating cross-chain messages from a bridge or a DAO governance proposal changing a token's classification. Use forked mainnet tests with tools like Foundry's cheatcodes or Hardhat's network forking to simulate real-world conditions, such as interacting with live oracle contracts for KYC status.

Key test scenarios must include: state transitions (e.g., a token moving from UNRESTRICTED to RESTRICTED mid-stream), edge cases like zero-value transfers and contract-to-contract interactions, and failure modes such as a malfunctioning oracle. Employ fuzz testing (with Foundry or Echidna) to send random addresses and amounts to your transfer function, ensuring no unexpected reverts or, worse, silent failures. Log all rule evaluations in events for off-chain monitoring and audit trails.
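A Foundry fuzz test of that style might look like the sketch below. The mock engine and its cap are stand-ins so the test is self-contained; in practice you would deploy your real rule engine in setUp():

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

import {Test} from "forge-std/Test.sol";

/// Stand-in rule engine: allows transfers up to a fixed cap.
contract MockEngine {
    uint256 public constant CAP = 1_000e18;

    function checkTransferRestrictions(address, address, uint256 amount, bytes memory)
        external
        pure
        returns (bool, string memory)
    {
        return (amount <= CAP, amount <= CAP ? "" : "cap exceeded");
    }
}

/// Fuzz test: random senders, receivers, and amounts must never cause an
/// unexpected revert, and the cap must hold for every input.
contract RuleEngineFuzzTest is Test {
    MockEngine engine;

    function setUp() public {
        engine = new MockEngine();
    }

    function testFuzz_CapIsEnforced(address from, address to, uint256 amount) public {
        (bool allowed, ) = engine.checkTransferRestrictions(from, to, amount, "");
        // Includes the zero-value edge case: amount == 0 must be allowed.
        assertEq(allowed, amount <= engine.CAP());
    }
}
```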

Finally, prepare for mainnet deployment with a staged rollout. Deploy the rule engine and token contracts on a testnet first, ideally one that supports the same cross-chain messaging protocol you'll use (like LayerZero's or Axelar's testnets). Execute your full test suite there. Consider implementing a timelock or a multisig for any function that can update the rule engine address or token type mappings, as these are critical governance actions. Your testing should prove the system is secure not just in theory, but under the unpredictable conditions of a live blockchain.

DYNAMIC COMPLIANCE

Frequently Asked Questions

Common questions and solutions for implementing token-type-specific compliance logic in smart contracts.

What is dynamic compliance, and why is it needed?

Dynamic compliance refers to smart contract logic that enforces different rules based on the type of token being transferred. This is critical for protocols that handle multiple asset classes, such as ERC-20s, ERC-721s (NFTs), and ERC-1155s (semi-fungible tokens).

Why it's needed:

  • Regulatory Requirements: Stablecoins (e.g., USDC) may require KYC checks, while utility tokens may not.
  • Protocol Logic: An NFT marketplace may need to enforce royalty payments, while a DEX pool does not.
  • Security: Restricting certain high-value or non-standard tokens from specific functions prevents exploits.

Without dynamic checks, a one-size-fits-all rule either blocks legitimate transactions or creates security/regulatory gaps.

conclusion-next-steps
IMPLEMENTATION SUMMARY

Conclusion and Next Steps

This guide has outlined a modular framework for building dynamic compliance systems that adapt to different token standards and their associated risks.

Implementing dynamic compliance requires a structured approach. The core principle is to separate the compliance logic from the token contract itself, using a registry or factory pattern. A central ComplianceRegistry contract can map token addresses to their designated ComplianceModule. This allows you to upgrade rules for a token type without redeploying the token, and new token deployments can be assigned a pre-audited module. Key functions include getModuleForToken(address token) for lookups and validateTransfer(...) which delegates checks to the appropriate module.

Your next step is to define the specific compliance rules within each module. For an ERC-20 module, you might implement daily transfer limits or geographic restrictions using an oracle. An ERC-721 module could enforce royalty payments on secondary sales via the EIP-2981 standard. For ERC-1155, which handles both fungible and non-fungible assets, your module must handle batch transfers and apply rules based on the token ID, potentially restricting certain NFT collections while allowing free trading of fungible game items. Always use OpenZeppelin's Ownable or AccessControl for administrative functions.
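The ERC-1155 batch case can be sketched as follows; the id-range convention (low ids as fungible game items, high ids as gated NFT collections) and the allowlist are illustrative assumptions:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

/// Illustrative ERC-1155 module: per-id rules applied across a batch.
/// Ids below RESTRICTED_THRESHOLD are freely tradable fungible items;
/// ids at or above it are collections gated by a recipient allowlist.
contract Erc1155ComplianceModule {
    uint256 public constant RESTRICTED_THRESHOLD = 1_000_000;
    mapping(address => bool) public allowlisted;

    function validateBatch(
        address, /* from */
        address to,
        uint256[] calldata ids,
        uint256[] calldata amounts
    ) external view returns (bool) {
        require(ids.length == amounts.length, "length mismatch");
        for (uint256 i = 0; i < ids.length; i++) {
            if (ids[i] >= RESTRICTED_THRESHOLD && !allowlisted[to]) {
                return false; // restricted collection, recipient not allowlisted
            }
        }
        return true;
    }
}
```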

To test your system, use a development framework like Foundry or Hardhat. Write unit tests that deploy mock ERC-20, ERC-721, and ERC-1155 tokens, register them with your compliance registry, and simulate transfers that should pass or fail based on the active rules. Consider edge cases like batch operations and contract interactions. For production, you must thoroughly audit the compliance modules and the registry's upgrade paths. Resources like the OpenZeppelin Contracts Wizard and Solidity by Example are excellent for reference.

Looking forward, you can extend this architecture. Integrate real-time risk oracles from providers like Chainlink to adjust limits based on market volatility. Implement a time-delayed upgrade mechanism using a TimelockController to ensure changes are transparent. For complex DeFi integrations, design modules that can interface with lending protocols or DEX routers to enforce compliance at the point of interaction, not just transfer. The goal is a system that is both secure by default and adaptable to the evolving regulatory and technological landscape of tokenized assets.
