
Setting Up a Verifiable Credentials Framework for Your Web3 Product

A technical guide for developers on implementing a W3C Verifiable Credentials framework that meets legal and compliance requirements, including KYC/AML and data privacy.
Chainscore © 2026
introduction
DEVELOPER GUIDE

A technical guide to implementing a verifiable credentials (VC) system for regulatory compliance and user verification in decentralized applications.

Verifiable Credentials (VCs) are a W3C standard for creating tamper-proof, privacy-preserving digital attestations. In a Web3 context, they enable decentralized applications to request and verify user data—such as KYC status, accredited investor status, or proof of age—without relying on a central database. A VC is a cryptographically signed JSON object containing claims about a subject, issued by a trusted entity (issuer), held by the user (holder), and presented to a relying party (verifier), like your dApp. This creates a trust model based on cryptographic proofs rather than custodial data storage.

The core components of a VC framework are the issuer, holder, and verifier. The issuer (e.g., a KYC provider) creates a VC and signs it with their private key. The holder (your user) stores this credential in a digital wallet. When interacting with your dApp, the user presents the VC. Your application, as the verifier, checks the credential's cryptographic signature against the issuer's public key (often found on a decentralized identifier, or DID) and validates that the claims meet your business rules. This process, defined by the Verifiable Credentials Data Model, ensures data integrity and authenticity.

To implement this, you must choose a DID method for your issuers and verifiers. Common methods include did:ethr (Ethereum), did:key, or did:web. The issuer's DID and its associated public key are crucial for verification. Next, select a signature suite like Ed25519Signature2018 or EcdsaSecp256k1Signature2019 to define the cryptographic proof format. For on-chain verification, you can use libraries like veramo (JavaScript/TypeScript) or ssi (Go) to handle credential creation, signing, and verification logic within your backend services.

A practical implementation involves three steps. First, your backend defines a verification policy, specifying the required credentials (e.g., "type": "KYCAMLAttestation") and the trusted issuer DIDs. Second, you integrate a wallet connection that supports VCs, such as an SSI wallet, and request credentials using a Presentation Exchange protocol. Third, your verification service receives the presented credential, checks the proof, validates the issuer's DID on its blockchain registry, and evaluates the claims. Here's a simplified verification check using a Node.js example with a hypothetical SDK:

javascript
// Verify the credential's signature and issuer against an allow-list,
// then apply business rules to the disclosed claims.
const isValid = await verifier.verifyCredential({
  credential: presentedVC,
  trustedIssuerDIDs: ['did:ethr:0x1234...']
});
if (isValid && presentedVC.credentialSubject.country === 'US') {
  // Grant access to the regulated feature
}

For compliance use cases like financial regulations, you must ensure the issuer is a certified entity (e.g., a licensed KYC provider) and that the credential's evidence field contains audit trails. The framework must also handle credential revocation. Implement checks against a revocation list (like a smart contract or a verifiable status list) to ensure the credential hasn't been invalidated. Storing only the verification result (a boolean) and the credential's minimal disclosure (e.g., a zero-knowledge proof of being over 18) on-chain, rather than the raw data, aligns with privacy-by-design principles and regulations like GDPR.

Adopting a VC framework future-proofs your application for interoperable digital identity. It moves compliance from a one-time, intrusive data scrape to a reusable, user-centric model. Start by integrating with existing VC issuers in your sector, define clear verification policies, and use open-source libraries to handle the complex cryptography. This setup reduces liability, enhances user privacy, and creates a portable identity layer for your Web3 product.

prerequisites
FOUNDATION

Prerequisites and Core Dependencies

Before implementing verifiable credentials, you must establish the core technical stack and understand the underlying standards that enable decentralized identity.

The foundation of any verifiable credentials (VC) system is built on three key standards: Decentralized Identifiers (DIDs), Verifiable Credentials Data Model, and Verifiable Presentations. DIDs, defined by the W3C, are a new type of identifier that enable verifiable, decentralized digital identity. Unlike traditional identifiers (like an email), a DID is controlled by the identity holder, can be resolved to a DID Document, and is independent of any centralized registry. You will need to choose a DID method, such as did:ethr for Ethereum-based identities or did:key for simple key pairs, which dictates how the DID is created, resolved, and managed on its underlying blockchain or network.
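For orientation, here is a minimal example of the kind of DID Document a DID resolves to; the identifier and key values are illustrative, but the field names follow the W3C DID Core data model:

```json
{
  "@context": ["https://www.w3.org/ns/did/v1"],
  "id": "did:example:issuer123",
  "verificationMethod": [{
    "id": "did:example:issuer123#key-1",
    "type": "Ed25519VerificationKey2020",
    "controller": "did:example:issuer123",
    "publicKeyMultibase": "z6Mk..."
  }],
  "authentication": ["did:example:issuer123#key-1"],
  "assertionMethod": ["did:example:issuer123#key-1"]
}
```

A verifier resolves the issuer's DID to a document like this and uses the key referenced under assertionMethod to check credential signatures.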

Your technical stack requires specific libraries to handle the cryptographic operations and data model compliance. For a JavaScript/TypeScript environment, essential npm packages include did-resolver and ethr-did-resolver (or another method-specific resolver) to resolve DIDs to their documents. The veramo framework provides a modular toolkit for issuing and verifying credentials, while @transmute/did-key is useful for generating did:key identifiers. For the core data model, you will work with libraries like vc-js or the modules within Veramo that implement the W3C VC specifications. These handle the creation of JSON-LD or JWT formatted credentials, which are the two primary serialization formats.

A crucial prerequisite is setting up a secure key management system for the issuer, holder, and verifier roles. The issuer (your product) needs a signing key pair (e.g., Ed25519 or secp256k1) to cryptographically sign credentials. The holder (your user) requires a wallet or agent to securely store their private keys, manage their DIDs, and create verifiable presentations. For development, you can start with software key management, but production systems often integrate with Hardware Security Modules (HSMs) or cloud KMS solutions for the issuer's root keys. The @veramo/kms-local plugin can manage local keys, while other plugins connect to external KMS providers.

You must establish a verifiable data registry, which is a system component that DIDs use for recording and resolving DID Documents. For many Web3 projects, this is a public blockchain like Ethereum, Polygon, or a dedicated sidechain. The registry stores the DID Document associated with a DID, which contains public keys, service endpoints, and verification methods. When using did:ethr, for instance, the DID Document is anchored to the Ethereum blockchain via a smart contract (the Ethr-DID-Registry). Your application's resolver will query this blockchain to fetch the current state of a DID, allowing verifiers to check the validity of a signature.

Finally, plan your data schemas and revocation strategy. Define the structure of the credentials you will issue using JSON Schema, ensuring they are clear and minimal. For revocation, decide between status lists (a cryptographically verifiable list of revoked credential IDs), smart contract-based registries, or time-based expiration. Implementing a revocation check is critical for verifiers. A basic flow involves the verifier's agent resolving the issuer's DID, fetching the revocation status from a service endpoint listed in the DID Document, and checking if the specific credential's ID is present on the revocation list.

key-concepts
IMPLEMENTATION GUIDE

Core Components of a VC System

A Verifiable Credentials (VC) framework requires specific technical components to issue, hold, and verify credentials in a decentralized way. This guide covers the essential building blocks.

Holder Wallet & Agent

The holder is the entity that receives and stores VCs. A digital wallet or agent software manages their DIDs, private keys, and credentials.

  • Functions: Receives, stores, and presents credentials upon request.
  • Interoperability: Should support the DIDComm v2 protocol for secure, peer-to-peer messaging with issuers and verifiers.
  • User Experience: Critical for adoption; can be a mobile app, browser extension, or cloud agent.
DATA LAYER

Comparing Verifiable Credential Data Formats

A technical comparison of the primary data formats used to encode and transmit verifiable credentials, focusing on developer implementation.

| Feature / Metric | JSON-LD (W3C) | JWT (SD-JWT) | CBOR (AnonCreds) |
|---|---|---|---|
| Primary Standard | W3C Verifiable Credentials Data Model 2.0 | IETF SD-JWT VC Draft 10 | Hyperledger AnonCreds Specification v1.0 |
| Core Data Structure | Linked Data JSON | Nested JSON Web Token | Concise Binary Object Representation |
| Selective Disclosure / Zero-Knowledge Proof Support | Via external proofs (BBS+) | Limited (Key Binding) | Native (CL-Signatures) |
| Signature Format | Linked Data Proofs (Ed25519, BBS+) | JSON Web Signature (JWS) | COSE Sign1 (EdDSA, ES256) |
| Avg. Credential Size (KB) | 5-15 | 2-8 | 1-4 |
| Schema Binding & Validation | JSON-LD Contexts | JSON Schema | Credential Definition on Ledger |
| Primary Use Case | Interoperable, semantic-rich credentials | Compact, web-friendly bearer credentials | Privacy-preserving, mobile-optimized credentials |

step-issuer-setup
VERIFIABLE CREDENTIALS FRAMEWORK

Step 1: Setting Up a Trusted Issuer

Establish a trusted identity issuer using decentralized identifiers (DIDs) and the W3C Verifiable Credentials data model to create cryptographically secure attestations.

A trusted issuer is the foundational entity in a verifiable credentials (VC) system. It creates and cryptographically signs digital attestations, such as proof of KYC completion or membership status, which can be verified by any third party. In Web3, this role is decentralized using Decentralized Identifiers (DIDs). A DID is a self-sovereign identifier, like did:ethr:0xabc123..., controlled by the issuer's private key and resolvable to a DID Document containing public keys and service endpoints. This setup removes reliance on centralized certificate authorities.

To begin, you must choose and implement a DID method. For Ethereum-based projects, did:ethr (from the Ethr-DID library) is common, while did:key is useful for simple, offline scenarios. Your issuer's identity is established by creating a DID Document and anchoring it to your chosen system, such as the Ethereum blockchain for did:ethr. This document declares the public keys used for signing VCs. The private key associated with this DID is the issuer's root of trust and must be secured via hardware modules or cloud KMS.

Next, define your credential schema. This is a JSON Schema document that structures the data fields within your credentials (e.g., firstName, memberSince, level). Publishing this schema to a persistent, immutable storage layer like IPFS or a blockchain ensures verifiers can fetch it to validate credential structure. A schema referenced by a Content Identifier (CID) like ipfs://QmXoypiz... guarantees data consistency. Use a JSON Schema validator such as Ajv to check credential data against this schema before issuance.
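As an illustration of the pre-issuance check, here is a deliberately minimal, hand-rolled validator covering only `required` fields and primitive `type` constraints; it is a sketch, not a full JSON Schema implementation, and in production you would use a real validator library:

```javascript
// Minimal pre-issuance check against a schema's `required` list and
// `properties[field].type` -- a sketch only, not full JSON Schema.
function checkAgainstSchema(schema, data) {
  for (const field of schema.required || []) {
    if (!(field in data)) {
      return { valid: false, error: `missing required field: ${field}` };
    }
  }
  for (const [field, rule] of Object.entries(schema.properties || {})) {
    if (field in data && typeof data[field] !== rule.type) {
      return { valid: false, error: `field ${field} should be ${rule.type}` };
    }
  }
  return { valid: true };
}

// Hypothetical membership schema matching the example claims below.
const membershipSchema = {
  required: ['tier', 'since'],
  properties: { tier: { type: 'string' }, since: { type: 'string' } }
};

console.log(checkAgainstSchema(membershipSchema, { tier: 'Gold', since: '2023-01-01' }).valid); // true
console.log(checkAgainstSchema(membershipSchema, { tier: 'Gold' }).error); // missing required field: since
```

Rejecting malformed claims before signing prevents issuing credentials that verifiers will later fail to validate against the published schema.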

The issuance process involves creating a Verifiable Credential JSON-LD object. This object includes the credential metadata (issuer DID, issuance date, expiration), the credential subject's DID, and the claims data conforming to your schema. You then sign the credential with the issuer's private key, either as a JWT (JSON Web Token) or with a Linked Data Proof format such as Ed25519Signature2020; this signature is the cryptographic proof of authenticity. (A Verifiable Presentation, by contrast, is created later by the holder and signed with the holder's own key.)

Here is a simplified code example using the did-jwt-vc library to issue a credential:

javascript
import { createVerifiableCredentialJwt } from 'did-jwt-vc';
import { ES256KSigner, hexToBytes } from 'did-jwt';

// did-jwt-vc expects a signer function, not a raw private key.
const issuer = {
  did: 'did:ethr:0x...',
  signer: ES256KSigner(hexToBytes('...')), // issuer's secp256k1 private key
  alg: 'ES256K'
};
const vcPayload = {
  sub: 'did:ethr:0x...', // Holder's DID
  vc: {
    '@context': ['https://www.w3.org/2018/credentials/v1'],
    type: ['VerifiableCredential', 'MembershipCredential'],
    credentialSchema: {
      id: 'ipfs://QmXoypiz...',
      type: 'JsonSchemaValidator2018'
    },
    credentialSubject: {
      id: 'did:ethr:0x...', // same DID as `sub` above
      tier: 'Gold',
      since: '2023-01-01'
    }
  }
};
const vcJwt = await createVerifiableCredentialJwt(vcPayload, issuer);
// vcJwt is the signed credential, ready to deliver to the holder

Finally, expose an issuance API for your application. This endpoint should authenticate the user (the future credential holder), validate their submission data against your schema, and return the signed JWT. Ensure your issuer's DID Document is publicly resolvable via a DID resolver so verifiers can fetch your public key. With this infrastructure, you can issue credentials that are tamper-evident, privacy-respecting, and independently verifiable across the Web3 ecosystem, forming the basis for trusted user interactions.
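The endpoint's core logic can be sketched framework-agnostically. In this sketch, `authenticate`, `validateAgainstSchema`, and `signCredential` are illustrative injected dependencies, standing in for your auth layer, schema validator, and did-jwt-vc signing code respectively:

```javascript
// Core logic of an issuance endpoint: authenticate the requester,
// validate the claims, then sign and return the credential.
// All three dependencies are injected placeholders (illustrative names).
async function handleIssuanceRequest(body, deps) {
  const user = await deps.authenticate(body.token);
  if (!user) return { status: 401, error: 'not authenticated' };
  if (!deps.validateAgainstSchema(body.claims)) {
    return { status: 400, error: 'claims do not match schema' };
  }
  const jwt = await deps.signCredential({ subjectDid: body.holderDid, claims: body.claims });
  return { status: 200, credential: jwt };
}

// Exercise the handler with stub dependencies.
const stubs = {
  authenticate: async (token) => (token === 'secret' ? { id: 'user-1' } : null),
  validateAgainstSchema: (claims) => typeof claims.tier === 'string',
  signCredential: async ({ subjectDid }) => `jwt-for-${subjectDid}`
};

handleIssuanceRequest(
  { token: 'secret', holderDid: 'did:ethr:0xholder', claims: { tier: 'Gold' } },
  stubs
).then((res) => console.log(res.status, res.credential)); // 200 jwt-for-did:ethr:0xholder
```

Keeping the handler pure like this makes it straightforward to mount behind any HTTP framework and to unit-test the issuance policy in isolation.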

step-credential-design
ARCHITECTURE

Step 2: Designing Compliant Credential Schemas

A well-defined schema is the foundation of any verifiable credential system. This step covers how to design data models that are interoperable, privacy-preserving, and legally sound.

A credential schema is the formal data model that defines the structure of a Verifiable Credential (VC). It specifies the claims (or attributes) that can be issued, their data types, and any constraints. For example, a KYCClaim schema might define attributes like givenName (string), dateOfBirth (date), and idDocumentNumber (string). Using a standardized schema ensures that issuers, holders, and verifiers share a common understanding of the data, which is critical for interoperability across different platforms and jurisdictions. The W3C Verifiable Credentials Data Model recommends using JSON Schema for this purpose.

Compliance begins with schema design. You must map your data requirements to relevant regulatory frameworks. For a financial credential, this means aligning with Travel Rule (FATF Recommendation 16) attributes or Anti-Money Laundering (AML) data points. For a professional license, it involves referencing the specific accreditation body's requirements. Design schemas to collect only the minimum necessary data for the intended verification purpose. This principle, often called data minimization, is a core tenet of regulations like the GDPR and reduces liability and attack surfaces. Store the schema on a persistent, verifiable medium, such as an on-ledger schema registry (as used in Hyperledger Indy-based networks) or a decentralized storage network like IPFS, to ensure its immutability and public verifiability.

In practice, you define your schema as a JSON Schema document. Here is a simplified example for a proof-of-age credential:

json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "name": "Over18Credential",
  "type": "object",
  "properties": {
    "isOver18": {
      "type": "boolean"
    },
    "birthYear": {
      "type": "integer",
      "minimum": 1900
    }
  },
  "required": ["isOver18"],
  "additionalProperties": false
}

Notice that birthYear is collected but not required; the credential can satisfy the verifier with just the boolean isOver18, demonstrating selective disclosure. The additionalProperties: false setting is crucial for ensuring issuers cannot add unexpected, potentially sensitive data.

For complex use cases, consider using modular and composite schemas. Instead of one monolithic credential, issue several linked credentials. A user might hold a base IdentityCredential and a separate CreditScoreCredential. A verifier can then request proof of identity and a score above a threshold without learning the exact score. This approach, supported by protocols like W3C's Verifiable Credentials and DIF's Presentation Exchange, enhances user privacy and flexibility. Always version your schemas (e.g., Over18Credential-v1.2) and establish a clear deprecation process to manage updates without breaking existing credentials in circulation.

Finally, document the semantic meaning of each attribute. Define what countryOfResidence means: is it legal tax residence, primary physical address, or nationality? Use established ontologies or linked data vocabularies like schema.org or the W3C Credentials Community Group's contexts to align with broader ecosystems. This documentation should be accessible alongside the schema itself, providing clarity for developers and auditors. A well-designed schema is not just a technical spec; it is a legal and operational contract between all parties in the trust triangle.

step-revocation-mechanism
VERIFIABLE CREDENTIALS FRAMEWORK

Step 3: Implementing a Revocation Mechanism

Learn how to implement a secure and decentralized revocation mechanism for your verifiable credentials, a critical component for maintaining trust and compliance.

A revocation mechanism is essential for any production-grade verifiable credentials (VC) system. It allows an issuer to invalidate a credential before its natural expiration, addressing scenarios like a user losing access to a private key, a credential being compromised, or a user's status changing (e.g., a certification being suspended). Without revocation, a VC is permanently valid, which poses a significant security and compliance risk. In Web3, revocation must be handled in a privacy-preserving manner, meaning the verifier can check a credential's status without learning which specific credential is being presented.

The most common standard for revocation is the W3C Status List 2021. This approach uses a bitstring status list, a cryptographically signed list where each bit represents the revocation status of a single credential. A value of 0 indicates the credential is not revoked, and 1 indicates it is revoked. The credential's metadata includes a credentialStatus field pointing to the status list and the index of its specific bit. This design is highly efficient, allowing a single status list JSON file to manage the revocation status for tens of thousands of credentials with minimal data overhead.

To implement this, you first need to generate and host the status list. As the issuer, you create a JSON document containing the encoded bitstring and sign it. This file must be hosted at a publicly accessible, immutable URL. Many projects use IPFS or Arweave for decentralized hosting, ensuring the list's integrity and availability. The credentialStatus object in the issued VC would look like this:

json
{
  "id": "https://example.com/status-list.json#94567",
  "type": "StatusList2021Entry",
  "statusPurpose": "revocation",
  "statusListIndex": "94567",
  "statusListCredential": "https://example.com/status-list.json"
}

This tells a verifier exactly where to fetch the list and which bit to check.

When a verifier receives a VC, they must perform a revocation check. The process involves: fetching the status list credential from the URL in statusListCredential, verifying its issuer's signature to ensure authenticity, decoding the compressed bitstring, and checking the bit at the statusListIndex. This check returns a simple boolean: revoked or not. Libraries like veramo or vc-js provide built-in methods for this. It's crucial to cache status lists appropriately to avoid performance bottlenecks during high-volume verification.

For advanced use cases, consider selective disclosure with revocation. When using BBS+ signatures for zero-knowledge proofs, you must ensure the revocation check can be performed on the derived, partially disclosed credential. The status list index must be a disclosed attribute in the proof. Alternatively, explore revocation registries used in Indy-based systems or smart contract-based revocation where a registry on-chain (e.g., an Ethereum smart contract or a Solana program) maintains a mapping of credential IDs to their status, enabling real-time, permissionless updates by the issuer.

COMPLIANCE ARCHITECTURE

Mapping VC Flows to Regulatory Requirements

How different credential issuance and verification patterns align with key regulatory frameworks like GDPR, eIDAS, and AML/CFT.

| Regulatory Requirement | Centralized Issuance | Self-Sovereign Issuance | Hybrid/Delegated Issuance |
|---|---|---|---|
| GDPR Right to Erasure (Article 17) | Partial | | |
| eIDAS Qualified Electronic Signature (QES) | | | |
| AML/CFT Identity Verification (KYC) | Central Ledger | Selective Disclosure | Verifier-Managed |
| Data Minimization (GDPR Article 5) | Low | High | Medium |
| Issuer Liability & Non-Repudiation | High | Low | Medium |
| Cross-Border Recognition (eIDAS) | Certificate-Based | Schema-Based | Bridge-Based |
| User Consent & Portability (GDPR) | Provider-Dependent | User-Controlled | Consent Receipts |
| Audit Trail & Proof of Process | Central Log | Verifiable Log (VC) | Hybrid Attestation |

step-verifier-integration
IMPLEMENTATION

Step 4: Building the Verifier and Presentation Exchange

This step focuses on the verifier's role, detailing how to request, receive, and validate verifiable presentations from a holder to enable trustless verification in your application.

The verifier is the entity that requests proof from a user (the holder). Its core function is to define which credentials it needs to see and to cryptographically verify the resulting presentation. This process, defined by the DIF Presentation Exchange (PE) specification, replaces ad-hoc API calls with a standardized, privacy-preserving flow. Instead of asking for raw data, a verifier creates a Presentation Definition: a machine-readable set of rules specifying the required credential types, trusted issuers, and necessary data fields.

To implement this, you will use a Verifiable Credentials (VC) SDK like veramo or didkit. First, configure your verifier agent with a Decentralized Identifier (DID) and associated cryptographic keys. This DID represents your application's identity in the trust graph. The agent uses these keys to create the presentation request and later to verify the cryptographic proofs. You must also configure resolvers for the DID methods (e.g., did:ethr, did:key) and verification methods you expect to encounter from holders and issuers.

The critical technical artifact is the Presentation Definition. It's a JSON object that specifies input_descriptors. Each descriptor outlines requirements for a single credential, including its type (e.g., "VerifiableCredential, UniversityDegreeCredential"), constraints on the issuer's DID, and the specific claims needed (like "degree.name"). You can also set policies such as requiring a signature from the holder (holder binding) to prevent presentation replay attacks. This definition is sent to the holder's wallet, typically via a Deep Link or a QR code.
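A minimal Presentation Definition illustrating these pieces (the IDs, issuer DID, and field paths are illustrative):

```json
{
  "id": "degree-check-1",
  "input_descriptors": [{
    "id": "university_degree",
    "constraints": {
      "fields": [
        {
          "path": ["$.type"],
          "filter": { "type": "array", "contains": { "const": "UniversityDegreeCredential" } }
        },
        {
          "path": ["$.issuer"],
          "filter": { "type": "string", "const": "did:ethr:0xTrustedUniversity" }
        },
        { "path": ["$.credentialSubject.degree.name"] }
      ]
    }
  }]
}
```

Each fields entry selects claims with a JSONPath expression and optionally constrains the value with an embedded JSON Schema filter.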

Upon receiving a Verifiable Presentation (VP) back from the holder, your verifier agent must perform a multi-step validation: 1) Verify the VP's cryptographic signature, 2) Verify the signatures of all included VCs, 3) Check that all VC issuers' DIDs are trusted and not revoked, 4) Validate that the presentation fulfills the original Presentation Definition, and 5) Check the status of each VC (e.g., against a revocation registry). Only if all checks pass should your application grant access or approve the transaction.
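The five-step validation above can be sketched as a pipeline. The individual checks are injected as functions (illustrative names) so the sketch stays SDK-agnostic; in practice each one wraps your VC library and revocation registry:

```javascript
// Multi-step Verifiable Presentation validation pipeline.
// Each check is an injected function standing in for real SDK calls.
async function validatePresentation(vp, checks) {
  // 1) Verify the VP's own signature (holder binding).
  if (!(await checks.verifyPresentationSignature(vp))) {
    return { ok: false, reason: 'invalid presentation signature' };
  }
  for (const vc of vp.verifiableCredential) {
    // 2) Verify each embedded credential's signature.
    if (!(await checks.verifyCredentialSignature(vc))) {
      return { ok: false, reason: 'invalid credential signature' };
    }
    // 3) Check the issuer against the trusted-issuer list.
    if (!checks.isTrustedIssuer(vc.issuer)) {
      return { ok: false, reason: `untrusted issuer: ${vc.issuer}` };
    }
    // 5) Check revocation status.
    if (await checks.isRevoked(vc)) {
      return { ok: false, reason: 'credential revoked' };
    }
  }
  // 4) Confirm the presentation satisfies the original definition.
  if (!checks.fulfillsDefinition(vp)) {
    return { ok: false, reason: 'presentation definition not satisfied' };
  }
  return { ok: true };
}

// Demo run with stub checks standing in for real cryptography.
const stubChecks = {
  verifyPresentationSignature: async () => true,
  verifyCredentialSignature: async () => true,
  isTrustedIssuer: (did) => did === 'did:ethr:0xTrusted',
  isRevoked: async () => false,
  fulfillsDefinition: () => true
};

validatePresentation(
  { verifiableCredential: [{ issuer: 'did:ethr:0xTrusted' }] },
  stubChecks
).then((result) => console.log(result.ok)); // true
```

Returning a reason string for each failure mode also gives you a clean audit log of why access was denied.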

For development, you can test the flow using holder wallets like ssi-snap for MetaMask or creds CLI tools. A common integration pattern is to expose a backend endpoint that generates a unique presentation request for each user session. The frontend then retrieves this request and passes it to the user's wallet software via the DIF Wallet Rendering protocol. After verification, your backend logic uses the validated claims—such as a KYC status or professional accreditation—to execute application-specific business rules.

DEVELOPER FAQ

Frequently Asked Questions on VC Implementation

Common technical questions and solutions for developers implementing Verifiable Credentials in Web3 applications.

How does a Verifiable Credential differ from an NFT?

A Verifiable Credential (VC) and an NFT serve fundamentally different purposes. An NFT is a unique, on-chain token representing ownership of a specific digital asset, like art or a collectible. Its primary function is provable scarcity and transferability.

A VC is a cryptographically signed attestation about a subject (like a user's identity, qualifications, or reputation). The credential itself is typically stored off-chain, with only its cryptographic proof (like a hash or a zero-knowledge proof) being referenced on-chain. The core value is verifiability and data minimization, not ownership. For example, an NFT proves you own a diploma, while a VC proves you earned the degree, without revealing the diploma's contents.

conclusion
IMPLEMENTATION SUMMARY

Conclusion and Next Steps

You have now explored the core components for building a verifiable credentials framework. This section outlines key takeaways and provides a roadmap for further development.

Implementing a Verifiable Credentials (VC) framework requires integrating several key components: an issuer (like a smart contract or trusted service), a holder (a user-controlled wallet), and a verifier (your application's logic). The core technical flow involves issuing a signed credential as a JSON-LD or JWT, storing its Decentralized Identifier (DID) and proof, and allowing the holder to present a Verifiable Presentation for verification. Using standards like the W3C VC Data Model and DID methods (e.g., did:ethr, did:key) ensures interoperability across the ecosystem.

For next steps, begin by defining the specific credentials your product needs. Will they attest to KYC status, proof-of-humanity, guild membership, or skill certifications? Next, select your DID method and signature suite. For Ethereum-based projects, did:ethr with EIP-712 signatures is a common choice. You'll need to write or integrate an issuer service. A simple Solidity issuer example uses a mapping to store credential status: mapping(bytes32 credentialHash => bool isValid) public credentials;. The issuer function would sign a structured message containing the credential data and store the hash.

The verification step happens in your application's backend or directly in a smart contract. A verifier must check the credential's cryptographic proof against the issuer's public DID, confirm it hasn't been revoked, and validate that the presented claims satisfy your business rules. Libraries like Veramo, SpruceID's Credible, or DIF's universal resolver can abstract much of this complexity. Always implement a revocation registry, such as a smart contract with a revocation list or a StatusList2021 credential, to manage credential lifecycle.

Finally, consider the user experience. How will users discover, store, and present their VCs? Integrating with wallet connectors that support VCs, like MetaMask Snaps or SpruceID's Kepler, is crucial. For further learning, explore the W3C Verifiable Credentials Implementation Guidelines and the Decentralized Identity Foundation's working groups. Building with verifiable credentials positions your product at the forefront of user-owned identity, enabling trust-minimized and privacy-preserving interactions in Web3.
