Setting Up a Verifiable Credentials Infrastructure

A practical guide to building a foundational system for issuing, holding, and verifying on-chain credentials using decentralized identifiers and digital signatures.

On-chain verifiable credentials (VCs) are tamper-evident digital attestations anchored to a blockchain. Unlike traditional credentials, they are cryptographically signed, privacy-preserving, and can be verified by anyone without contacting the original issuer. The core components of a VC infrastructure are the Decentralized Identifier (DID) for the subject, the credential schema defining the data structure, and the digital proof (typically a digital signature) that binds the issuer to the credential data. This setup enables trustless verification of claims about identities, qualifications, or memberships.
The first step is establishing decentralized identities for all participants. Each entity—issuer, holder, and verifier—needs a DID. A DID is a URI that points to a DID Document containing public keys and service endpoints. For Ethereum-based systems, you can use the did:ethr method, where a DID is derived from an Ethereum account (e.g., did:ethr:0xabc123...). The associated DID Document, often managed by a registry like the Ethereum DID Registry, holds the public key used for signing and verifying credentials. This creates a cryptographically verifiable link between an identity and its controlling keys.
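To make this concrete, here is a minimal sketch of deriving a did:ethr identifier from a freshly generated key pair using the ethers library; the random wallet and logging are illustrative, and production keys would live in a wallet, KMS, or HSM.

```typescript
import { Wallet } from 'ethers';

// Generate a key pair; in production the key would live in a wallet, KMS, or HSM.
const wallet = Wallet.createRandom();

// A did:ethr identifier is derived directly from the Ethereum address.
const did = `did:ethr:${wallet.address}`;

console.log(did); // e.g., did:ethr:0xAbC1...
// The corresponding DID Document is resolved via the Ethereum DID Registry by a
// did:ethr resolver; no on-chain transaction is needed until keys are rotated
// or delegates are added.
```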
Next, define the structure of your credential using a JSON-LD schema. This schema specifies the expected properties and data types. For a university diploma credential, the schema might define fields for degreeName, issuanceDate, and recipient. Schemas are typically published to a public repository or referenced via a content identifier (CID) on IPFS to ensure immutability. Here's a simplified example of a credential payload before signing:
json{ "@context": ["https://www.w3.org/2018/credentials/v1"], "type": ["VerifiableCredential", "UniversityDegreeCredential"], "issuer": "did:ethr:0xissuer123", "issuanceDate": "2023-10-26T10:00:00Z", "credentialSubject": { "id": "did:ethr:0xholder456", "degreeName": "Bachelor of Science" } }
The issuer cryptographically signs the credential to generate a verifiable data integrity proof. Using the JSON Web Signature (JWS) or Linked Data Proofs standard, the issuer signs a hash of the credential data with their private key, corresponding to the public key in their DID Document. This proof is attached to the credential, making it a Verifiable Credential. The holder stores this signed VC, often in a digital wallet. For on-chain verification, critical elements like the issuer's DID, the schema hash, or a credential status index (e.g., for revocation) can be stored on-chain, providing a universal trust anchor.
To verify a credential, a verifier performs several checks: 1) Validate the signature against the issuer's public key in their DID Document, 2) Check the credential's status (e.g., against a revocation registry on-chain), and 3) Ensure the credential schema matches expectations. On-chain components, like a smart contract storing revocation lists, allow for permissionless, real-time status checks. This infrastructure enables use cases like Sybil-resistant governance, under-collateralized lending with credit scores, and authenticated access to gated content or DAOs, moving beyond simple token-gating.
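A hedged sketch of these checks using the did-jwt-vc verifier is shown below; the resolver configuration, the isRevoked helper, and the expected credential type are assumptions standing in for your own registry and schema choices.

```typescript
import { verifyCredential } from 'did-jwt-vc';
import { Resolver } from 'did-resolver';
import { getResolver as ethrDidResolver } from 'ethr-did-resolver';

// Resolves did:ethr issuers; the Infura project ID is a placeholder.
const resolver = new Resolver(ethrDidResolver({ infuraProjectId: '<YOUR_PROJECT_ID>' }));

// Hypothetical helper: query your revocation registry (on-chain or otherwise).
async function isRevoked(credentialId?: string): Promise<boolean> {
  return false; // replace with a real status lookup
}

async function verifyDegreeCredential(vcJwt: string): Promise<boolean> {
  // 1) Validate the signature against the issuer's DID Document
  const { verifiableCredential } = await verifyCredential(vcJwt, resolver);

  // 2) Check the credential's revocation status
  if (await isRevoked(verifiableCredential.id)) return false;

  // 3) Check that the credential type matches expectations
  return verifiableCredential.type.includes('UniversityDegreeCredential');
}
```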
Prerequisites and Tech Stack
Building a verifiable credentials (VC) system requires a foundational understanding of decentralized identity concepts and a specific set of tools. This guide outlines the essential knowledge and software stack needed to begin development.
Before writing any code, you must understand the core components of the Self-Sovereign Identity (SSI) model. This includes the roles of Issuers (entities that create credentials), Holders (individuals who receive and store them), and Verifiers (entities that request and validate proofs). The system relies on Decentralized Identifiers (DIDs) as persistent, cryptographically verifiable identifiers not tied to a central registry. You should be familiar with the W3C Verifiable Credentials Data Model specification, which defines the standard JSON-LD structure for credentials and presentations.
Your development environment needs a modern Node.js runtime (v18 or later) and a package manager like npm or yarn. Core cryptographic libraries are essential; we recommend @noble/curves for elliptic curve operations and the @stablelib packages for hashing and key handling. For handling DIDs and VCs, you will use SDKs such as did-jwt-vc (maintained by the Veramo team) or the @veramo/core agent framework (the successor to the older daf-core) for agent-based architectures. A basic understanding of TypeScript is highly beneficial for working with these libraries and their type definitions.
You will need a method for DID resolution and key management. For development, you can start with did:key or did:ethr methods, which are simpler to implement. The Veramo CLI tool is invaluable for generating keys, creating DIDs, and issuing test credentials from your terminal. For persistent storage of DIDs, private keys, and credentials, configure a database. SQLite is perfect for local development, while PostgreSQL or MongoDB are used for production systems, often integrated via TypeORM or other ORMs supported by your chosen framework.
To simulate a complete flow, set up at least three separate agent projects: one for the Issuer, one for the Holder, and one for the Verifier. Each agent will manage its own DID, private keys, and message routing. Use the did:web method for Issuers that need a publicly resolvable endpoint, or explore did:ion for a scalable, Sidetree-based DID method on Bitcoin. For blockchain interactions, such as anchoring DIDs or checking revocation status, an Ethereum testnet RPC URL (e.g., from Alchemy or Infura) and a small amount of test ETH may be required.
Finally, plan your credential schema. Define the structure of your credentials using JSON Schema and publish them to a trusted context repository or a decentralized network like the Cheqd network or Element's Trust Registry. This ensures verifiers can understand the data they are checking. Your tech stack is now ready to issue a credential, store it in a Holder's digital wallet (like Veramo Agent or Trinsic's ecosystem), and create a cryptographic proof for verification.
Core Concepts: W3C VC Data Model and DIDs
A technical guide to the foundational standards for creating, issuing, and verifying digital credentials on the web.
The W3C Verifiable Credentials (VC) Data Model is the core specification for creating cryptographically secure, privacy-preserving digital credentials. It defines a standard data format for representing claims—such as a university degree or proof of age—in a way that is machine-verifiable, tamper-evident, and can be shared without relying on a central authority. A VC is a JSON-LD or JWT-based package containing metadata, the credential subject's claims, and cryptographic proofs. This model enables a paradigm shift from centralized identity silos to user-centric, portable digital credentials.
Decentralized Identifiers (DIDs) are the essential counterpart to VCs, providing the mechanism for verifiable, self-sovereign identity. A DID is a unique, persistent identifier (e.g., did:ethr:0xabc123...) that is controlled by the identity holder, not a centralized registry. It resolves to a DID Document, a JSON file that contains public keys, verification methods, and service endpoints. This document is the root of trust, allowing the holder to cryptographically prove control over their DID—a prerequisite for signing VCs or creating verifiable presentations. DIDs are registered on verifiable data registries, which can be blockchains, distributed ledgers, or other decentralized networks.
To set up a basic VC infrastructure, you need three core roles: an Issuer (creates and signs VCs), a Holder (stores and presents VCs), and a Verifier (requests and cryptographically checks VCs). The flow begins with the Issuer creating a VC, signing it with a private key linked to their DID, and transmitting it to the Holder. The Holder stores this VC in a digital wallet. When a Verifier requires proof, the Holder creates a Verifiable Presentation, selectively disclosing credentials and proving control of their DID. The Verifier checks the signatures against the Issuer's and Holder's public keys, which are fetched from their respective DID Documents on the registry.
For developers, implementing this stack involves choosing DID methods (e.g., did:ethr for Ethereum, did:key for simple keys) and signature suites (e.g., Ed25519Signature2018, EcdsaSecp256k1RecoverySignature2020). A practical first step is using a library like did-jwt-vc or vc-js to create a credential. The code snippet below demonstrates creating a simple VC for a did:key holder, signed as a JWT with an Ed25519 (EdDSA) key, ready for issuance.
```javascript
// Example using the did-jwt-vc library
import { createVerifiableCredentialJwt } from 'did-jwt-vc';

const vcPayload = {
  sub: 'did:key:z6Mk...', // Holder's DID
  vc: {
    '@context': ['https://www.w3.org/2018/credentials/v1'],
    type: ['VerifiableCredential', 'UniversityDegreeCredential'],
    credentialSubject: {
      id: 'did:key:z6Mk...',
      degree: { type: 'BachelorDegree', name: 'Computer Science' }
    }
  }
};

// `issuerSigner` is an Issuer object ({ did, signer }) created beforehand
// from the issuing entity's key pair.
const vcJwt = await createVerifiableCredentialJwt(vcPayload, issuerSigner);
// vcJwt is a signed, base64url-encoded JWT
```
Key considerations for production infrastructure include revocation mechanisms (like status lists or smart contracts), selective disclosure for privacy (using BBS+ signatures or zero-knowledge proofs), and interoperability across different DID methods and wallet implementations. The ecosystem is evolving with frameworks like Sphereon's SSI-SDK and Microsoft's ION for Bitcoin. By building on these open W3C standards, developers create systems that are portable, secure by design, and resistant to vendor lock-in, forming the backbone of decentralized identity for Web3, enterprise, and government applications.
VC Implementation Patterns: On-Chain vs. Off-Chain
Comparison of core technical and operational characteristics for implementing Verifiable Credentials using blockchain registries versus traditional off-chain methods.
| Feature | On-Chain Registry (e.g., Ethereum, Polygon) | Off-Chain Registry (e.g., HTTP, IPFS) | Hybrid (Status List on-chain, VC off-chain) |
|---|---|---|---|
| Credential Status/Revocation | | | |
| Decentralized Identifier (DID) Resolution | | | |
| Immutable Audit Trail | | | |
| Gas/Cost per Status Update | $2-10 (varies) | $0 | $0.50-2 (one-time) |
| Verification Latency | ~15 sec (block time) | < 1 sec | ~15 sec (status only) |
| Data Privacy (VC content exposure) | None (only hashes/status) | Full (if compromised) | None (only status on-chain) |
| Censorship Resistance | Partial | | |
| Infrastructure Dependency | Public Blockchain RPC | Issuer's HTTP Server | Public Blockchain RPC |
Step 1: Build the Credential Registry Smart Contract
The on-chain registry is the foundational component of a verifiable credentials system, acting as a tamper-proof ledger for credential schemas and status.
A Credential Registry is a smart contract that serves as the single source of truth for your credentialing system. Its primary functions are to register credential schemas (defining the data structure of a credential) and manage credential status (like revocation or suspension). By anchoring these operations on-chain, you create a decentralized, transparent, and non-repudiable record. Popular choices for deployment include EVM-compatible chains like Ethereum, Polygon, or Arbitrum, or other ecosystems like Solana using the Anchor framework, depending on your requirements for cost, speed, and finality.
The core of the registry is the credential schema. This is a blueprint that defines the fields, data types, and structure of the verifiable credentials you will issue. For example, a "KYC Credential" schema might include fields like fullName (string), dateOfBirth (uint256), and countryCode (string). Storing the schema's unique identifier (like a schemaId hash) on-chain allows any verifier to confirm the expected structure of a credential presented to them, ensuring data integrity and interoperability.
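As an illustration, a JSON Schema for the hypothetical KYC Credential above might look like the following; the $id URI and the idea of pinning the serialized schema to IPFS and using its hash as the on-chain schemaId are assumptions, not a fixed convention.

```typescript
// Hypothetical JSON Schema for the "KYC Credential" described above. One common
// approach is to pin the serialized schema to IPFS and register its hash/CID as
// the on-chain schemaId.
const kycCredentialSchema = {
  $schema: 'https://json-schema.org/draft/2020-12/schema',
  $id: 'ipfs://<SCHEMA_CID>', // placeholder
  title: 'KYCCredential',
  type: 'object',
  properties: {
    fullName: { type: 'string' },
    dateOfBirth: { type: 'integer' }, // e.g., a Unix timestamp, mirroring the uint256 field
    countryCode: { type: 'string' }
  },
  required: ['fullName', 'dateOfBirth', 'countryCode']
} as const;
```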
Beyond schemas, the registry must manage credential status. The most common status is a revocation registry, often implemented as a bitmap or a mapping that allows the original issuer (or a designated controller) to revoke a credential by its unique identifier without revealing the credential's contents. For more complex status logic—like expirable credentials, suspension lists, or status based on external conditions—you can implement a Status List following the W3C Status List 2021 specification, storing a compressed bitstring on-chain.
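To make the status-list idea concrete, here is a minimal sketch of the bit lookup a verifier performs once it has fetched and decompressed the bitstring; the gzip/base64url handling required by the Status List 2021 encoding is omitted, and the bit ordering shown is an assumption.

```typescript
// Check whether the credential at `statusListIndex` is flagged in a decompressed
// status bitstring (1 = revoked/suspended, 0 = valid). Assumes most-significant-
// bit-first ordering within each byte.
function isFlagged(bitstring: Uint8Array, statusListIndex: number): boolean {
  const byteIndex = Math.floor(statusListIndex / 8);
  const bitIndex = statusListIndex % 8;
  return ((bitstring[byteIndex] >> (7 - bitIndex)) & 1) === 1;
}
```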
Here is a simplified example of a registry contract skeleton in Solidity, outlining key state variables and functions:
```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract CredentialRegistry {
    struct Schema {
        address issuer;
        string schemaDefinition; // JSON string or IPFS CID
        uint256 createdAt;
    }

    // Maps schemaId hash to schema metadata
    mapping(bytes32 => Schema) public schemas;

    // Maps credentialId to a boolean revocation status
    mapping(bytes32 => bool) public revokedCredentials;

    // Maps statusListId to a bitstring (e.g., as a bytes data URI)
    mapping(bytes32 => bytes) public statusLists;

    function registerSchema(bytes32 schemaId, string memory schemaDef) external {
        // Register a new schema definition (add duplicate/ownership checks in production)
        schemas[schemaId] = Schema(msg.sender, schemaDef, block.timestamp);
    }

    function revokeCredential(bytes32 credentialId) external {
        // Revoke a specific credential (restrict to the issuer or a role in production)
        revokedCredentials[credentialId] = true;
    }
}
```
When designing your contract, critical security considerations include access control for sensitive functions (using OpenZeppelin's Ownable or role-based AccessControl), ensuring event emission for all state changes to enable off-chain indexing, and planning for upgradability if schema formats or status mechanisms may evolve. For production systems, consider building on established components such as the ethr-did / Ethereum DID Registry libraries used by Veramo, or the Ethereum Attestation Service (EAS) schema registry, to leverage audited, community-tested patterns.
Step 2: Integrate Decentralized Identifiers (DIDs)
Decentralized Identifiers (DIDs) are the cryptographic anchors for verifiable credentials, providing a self-sovereign, portable identity layer for users and entities.
A Decentralized Identifier (DID) is a globally unique, persistent identifier that an individual, organization, or device controls without reliance on a central registry. Unlike traditional usernames or email addresses, a DID is cryptographically verifiable. It is expressed as a URI, such as did:ethr:0xabc123..., and resolves to a DID Document. This document contains the public keys, authentication mechanisms, and service endpoints necessary to interact with the DID's controller. This architecture enables trust without centralized intermediaries.
The DID Document is the core technical specification. It's a JSON-LD file that describes how to use the DID. Key components include the id (the DID itself), verificationMethod entries for public keys, and service endpoints for interacting with the holder (like a credential repository). For example, a simple DID Document for the Ethereum mainnet might define a verificationMethod using the EcdsaSecp256k1RecoveryMethod2020 type, allowing signatures to be verified against an Ethereum account. The document is typically stored on a verifiable data registry, like a blockchain or a distributed ledger.
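For orientation, an abridged did:ethr DID Document might look like the sketch below; the address and exact property set are placeholders, and in practice the document is produced by a resolver rather than authored by hand.

```typescript
// Illustrative did:ethr DID Document (abridged); all values are placeholders.
const didDocument = {
  '@context': ['https://www.w3.org/ns/did/v1'],
  id: 'did:ethr:0xabc123...',
  verificationMethod: [
    {
      id: 'did:ethr:0xabc123...#controller',
      type: 'EcdsaSecp256k1RecoveryMethod2020',
      controller: 'did:ethr:0xabc123...',
      blockchainAccountId: 'eip155:1:0xabc123...'
    }
  ],
  authentication: ['did:ethr:0xabc123...#controller']
};
```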
To implement DIDs, you must choose a DID Method. This is the specification defining how a DID is created, resolved, updated, and deactivated on a specific network. Common methods include did:ethr (Ethereum), did:key (simple key pairs), did:web (web domains), and did:ion (Bitcoin/Sidetree). Your choice depends on your requirements for decentralization, cost, and tooling. For instance, did:ethr leverages Ethereum's security and smart contracts for document updates, while did:key is a lightweight method suitable for static, self-contained identifiers.
For developers, libraries like did-jwt-vc (JavaScript/TypeScript) or ssi (Go) provide essential utilities. A basic flow involves: 1) Creating a DID and its keys, 2) Publishing the DID Document to a registry (if required by the method), and 3) Resolving a DID to fetch its current document. Here's a minimal example creating a did:key DID using the dids library with an Ed25519 key provider:
```javascript
import { DID } from 'dids';
import { Ed25519Provider } from 'key-did-provider-ed25519';
import { getResolver } from 'key-did-resolver';
import { randomBytes } from 'crypto';

// Generate or load a 32-byte seed (persist and protect this in production)
const seed = new Uint8Array(randomBytes(32));
const provider = new Ed25519Provider(seed);
const did = new DID({ provider, resolver: getResolver() });

await did.authenticate(); // Derives and authenticates the did:key DID
console.log(did.id); // e.g., did:key:z6Mk...
```
Integrating DIDs establishes the foundation for issuing and verifying credentials. The issuer and holder each have their own DIDs. When an issuer creates a Verifiable Credential, they sign it with the private key corresponding to a verification method in their DID Document. The verifier can then resolve the issuer's DID to obtain the public key and cryptographically verify the signature's authenticity. This creates a trust chain rooted in the decentralized identifier, enabling interoperable and portable digital credentials across different platforms and ecosystems without centralized identity providers.
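A small resolution sketch using the generic did-resolver interface (here with the did:key resolver from the example above) shows how a verifier fetches the verification methods it needs; swap in ethr-did-resolver or another method-specific resolver as required.

```typescript
import { Resolver } from 'did-resolver';
import { getResolver as keyDidResolver } from 'key-did-resolver';

const resolver = new Resolver(keyDidResolver());

// Fetch the verification methods a verifier would check a signature against.
async function getVerificationMethods(did: string) {
  const { didDocument } = await resolver.resolve(did);
  return didDocument?.verificationMethod ?? [];
}

// Usage (truncated placeholder DID):
// const methods = await getVerificationMethods('did:key:z6Mk...');
```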
Step 3: Issue a Verifiable Credential Off-Chain
This step details the core process of creating and signing a W3C-compliant Verifiable Credential using a private key, before it is anchored on-chain.
Issuing a Verifiable Credential (VC) is the act of a trusted entity, the issuer, creating a digitally signed attestation about a subject (e.g., a user or entity). The VC is a JSON-LD document following the W3C Verifiable Credentials Data Model. It contains the credential's metadata, the claim data itself, and a cryptographic proof. For off-chain issuance, you generate this proof using a private key, typically stored in a secure environment like a Hardware Security Module (HSM) or a backend wallet. The resulting VC is a self-contained, portable JSON object that the holder can store and present.
A minimal VC JSON structure includes essential fields: @context defines the linked data vocabularies, id is a unique URI for the credential, type specifies the credential schema (e.g., ["VerifiableCredential", "UniversityDegreeCredential"]), issuer is the DID of the issuing entity, issuanceDate is a timestamp, credentialSubject contains the actual claims (like "degreeType": "Bachelor of Science"), and proof holds the cryptographic signature. The proof field's structure depends on the chosen signature suite, such as Ed25519Signature2018 or JsonWebSignature2020, which dictates how the credential is cryptographically bound to the issuer's DID.
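As an illustration of that proof object, a signed credential with an embedded Linked Data proof might carry a block like the one below; the timestamp, key reference, and jws value are placeholders.

```typescript
// Illustrative embedded proof for an Ed25519Signature2018-style suite;
// the timestamp, verificationMethod, and jws values are placeholders.
const proof = {
  type: 'Ed25519Signature2018',
  created: '2023-10-26T10:00:00Z',
  verificationMethod: 'did:key:z6Mk...#z6Mk...',
  proofPurpose: 'assertionMethod',
  jws: 'eyJhbGciOiJFZERTQSIs...'
};
```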
To sign the credential, you must create a canonicalized version of the JSON-LD data. This canonicalization (for example, the URDNA2015 algorithm), followed by hashing, transforms the document into a deterministic byte array, ensuring the same hash is generated by any compliant verifier. You then sign this hash with the issuer's private key corresponding to the public key listed in their DID Document. Libraries like jsonld-signatures or platform-specific SDKs (e.g., veramo, ssi-sdk) handle this complexity. The resulting signature, along with the proof type and verification method (the public key ID), is embedded into the VC's proof object, completing the issuance.
For developers, using a framework like Veramo streamlines this process. Here is a simplified TypeScript example using Veramo to create and sign a VC:
```typescript
import { createAgent } from '@veramo/core';
import { CredentialIssuer, ICredentialIssuer } from '@veramo/credential-w3c';

// ... agent setup with DID resolver and key management plugins

const credential = await agent.createVerifiableCredential({
  credential: {
    issuer: { id: 'did:ethr:mainnet:0x...' },
    credentialSubject: {
      id: 'did:ethr:mainnet:0x...',
      achievement: 'Completed Chainscore Developer Course',
      date: '2024-01-15'
    }
  },
  proofFormat: 'jwt', // or 'lds' for a JSON-LD proof
});

console.log(JSON.stringify(credential, null, 2));
```
This code outputs a signed JWT or JSON-LD credential ready for the holder to receive.
The off-chain issued VC is now a holder-bound asset. The subject (holder) can store it in a digital wallet. Its verifiability stems from the cryptographic proof, which any party can check by resolving the issuer's DID to their public key and verifying the signature against the canonicalized data. This off-chain model offers privacy and portability, as the credential content is not broadcast to a public ledger. The subsequent step involves creating a Verifiable Presentation for sharing selective claims and, optionally, registering a cryptographic commitment (like a hash) of the VC on-chain to enable revocation checks and enhance trust without exposing the underlying data.
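The presentation step mentioned here can be sketched with did-jwt-vc as well; the holder object and the vcJwt string are assumed to come from earlier steps, and the payload shape is a minimal example rather than a complete implementation.

```typescript
import { createVerifiablePresentationJwt, Issuer } from 'did-jwt-vc';

// `holder` is an Issuer-shaped object ({ did, signer }) controlled by the holder,
// and `vcJwt` is a previously issued JWT credential (both assumed to exist).
async function presentCredential(holder: Issuer, vcJwt: string): Promise<string> {
  const vpPayload = {
    vp: {
      '@context': ['https://www.w3.org/2018/credentials/v1'],
      type: ['VerifiablePresentation'],
      verifiableCredential: [vcJwt]
    }
  };
  // The holder signs the presentation, proving control of their DID.
  return createVerifiablePresentationJwt(vpPayload, holder);
}
```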
Step 4: Build Verifier Logic for Conditional Access
This step details how to programmatically verify credentials to enforce access control, the core of a trustless, decentralized system.
The verifier is the logic that checks a user's credentials against a predefined policy before granting access to a resource. This is where your application's business rules are encoded. A policy defines the required credentials—such as holding a specific NFT, being above a certain age, or having a KYC attestation—and the conditions under which they are considered valid. The verification process is performed off-chain by your application server or a smart contract, depending on your architecture. It does not require interaction with the credential issuer, relying instead on the cryptographic proofs embedded in the Verifiable Credential (VC) or Verifiable Presentation (VP).
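Before the verification code, it helps to pin down what a policy might look like; the shape below is purely hypothetical and simply mirrors the minimumAge condition used in the example later in this step.

```typescript
// Hypothetical access policy consumed by the verifier logic in this step.
interface AccessPolicy {
  trustedIssuers: string[]; // DIDs of issuers you accept
  requiredTypes: string[];  // e.g., ['VerifiableCredential', 'KYCCredential']
  minimumAge?: number;      // claim-level condition
}

const ageGatePolicy: AccessPolicy = {
  trustedIssuers: ['did:ethr:0xissuer123'],
  requiredTypes: ['VerifiableCredential'],
  minimumAge: 21
};
```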
Verification involves three key checks: 1) Proof Validity: Cryptographically verifying the digital signature on the credential or presentation to ensure it was issued by the claimed issuer and hasn't been tampered with. 2) Credential Status: Checking that the credential has not been revoked by the issuer, typically by querying a status registry or a revocation list. 3) Policy Compliance: Ensuring the credential's claims (e.g., "age" > 21, "membershipType" == "gold") satisfy the access rules you've defined. Libraries like veramo or did-jwt-vc provide built-in methods for these verification steps.
Here is a simplified Node.js example using a hypothetical SDK to verify a user's presentation for age-gated access:
```javascript
async function verifyAccess(presentation, policy) {
  // 1. Verify the cryptographic proof
  const proofValid = await sdk.verifyPresentation(presentation);
  if (!proofValid) throw new Error('Invalid proof');

  // 2. Check credential status (e.g., against a revocation list)
  const isRevoked = await checkStatus(presentation.credentialId);
  if (isRevoked) throw new Error('Credential revoked');

  // 3. Evaluate policy against credential claims
  const claims = presentation.verifiableCredential[0].credentialSubject;
  if (claims.age < policy.minimumAge) {
    throw new Error('Age requirement not met');
  }

  // Additional checks for issuer DID, credential type, etc.
  return { success: true, claims };
}
```
For decentralized applications, this logic can be embedded directly into a smart contract verifier. Projects like Verax on Linea or EAS (Ethereum Attestation Service) provide on-chain registries and libraries for storing and verifying attestations. A smart contract can hold the policy and use a precompile or library to verify the ZK-proof or signature associated with an off-chain credential. This creates a powerful pattern where access to a token-gated Discord server, a DeFi pool, or a physical event can be controlled by a transparent, autonomous contract that trusts only the cryptographic proof.
The output of a successful verification is a boolean decision and, often, the extracted claims. Your application uses this result to gate the next action: minting an access token, unlocking content, or recording an on-chain transaction. By decoupling the verification logic from the issuance process, you create systems that are interoperable (any issuer your policy trusts can provide credentials) and privacy-preserving (users can share minimal, selective proofs). This step transforms static credentials into dynamic, programmable permissions.
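As a usage sketch, the verifyAccess function from the example above could sit behind an HTTP endpoint that gates content; the Express app, route, and inline policy here are assumptions, not part of any particular SDK.

```typescript
import express from 'express';

// verifyAccess() is the verifier function from the example above (assumed in scope).
declare function verifyAccess(
  presentation: unknown,
  policy: { minimumAge: number }
): Promise<{ success: boolean; claims: Record<string, any> }>;

const app = express();
app.use(express.json());

// Gate an endpoint behind credential verification.
app.post('/unlock', async (req, res) => {
  try {
    const { claims } = await verifyAccess(req.body.presentation, { minimumAge: 21 });
    // Verification succeeded: issue a session, mint an access token, etc.
    res.json({ access: 'granted', subject: claims.id });
  } catch (err) {
    res.status(403).json({ access: 'denied', reason: (err as Error).message });
  }
});

app.listen(3000);
```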
Implementation Examples by Use Case
Self-Sovereign Identity (SSI) Wallets
Implementing verifiable credentials for user-controlled identity. A common pattern uses W3C Decentralized Identifiers (DIDs) anchored on-chain (e.g., Ethereum, Polygon) and W3C Verifiable Credentials (VCs) issued by trusted entities.
Key Components:
- Issuer: A smart contract or off-chain service that signs credentials (e.g., a university issuing a diploma).
- Holder: A user's wallet (like MetaMask with SSI extensions or a specialized wallet) that stores and manages VCs.
- Verifier: A dApp or service that requests and cryptographically verifies a VC's signature and status.
Example Flow:
- User (Holder) generates a DID (`did:ethr:0x123...`).
- Issuer creates a VC containing the claim (e.g., `{"degree": "BSc Computer Science"}`), signs it, and sends it to the Holder.
- Holder presents the VC to a Verifier (e.g., a job platform).
- Verifier checks the Issuer's DID document for the public key, verifies the signature, and queries an on-chain registry (like Ethereum's ERC-1056 or ERC-3643) to ensure the credential hasn't been revoked.
Frequently Asked Questions (FAQ)
Common technical questions and solutions for developers implementing a verifiable credentials (VC) infrastructure using decentralized identifiers (DIDs) and blockchain.
What is the difference between a DID and a Verifiable Credential?

A Decentralized Identifier (DID) is a globally unique, cryptographically verifiable identifier for a subject (person, organization, or thing). It's controlled by the subject via a private key and resolves to a DID Document containing public keys and service endpoints.

A Verifiable Credential (VC) is a tamper-evident, cryptographically signed attestation (like a digital driver's license) issued about a subject. The VC contains claims and is signed with a key controlled by the issuer's DID. The subject's DID appears as the id of the credentialSubject in the credential.
In practice, a DID is the identity anchor, while a VC is a portable, signed statement about that identity. You need a DID to issue or receive a VC.
Essential Resources and Tools
These resources cover the core building blocks required to design, implement, and operate a production-grade Verifiable Credentials (VC) infrastructure, focusing on concrete standards and tools developers actively use in real deployments.
Conclusion and Next Steps
You have now established the core components of a verifiable credentials (VC) infrastructure. This foundation enables you to issue, hold, and verify digital credentials in a privacy-preserving and interoperable manner.
Your infrastructure should now include: an issuer service using a library like didkit or veramo to sign VCs, a holder wallet (such as a mobile app with walt.id or credential-handler-polyfill), and a verifier service that can check proofs and credential status. You have configured a Decentralized Identifier (DID) method, likely did:web or did:key for development, and understand the flow of creating a VerifiablePresentation for sharing claims. The core standards—W3C Verifiable Credentials Data Model and Decentralized Identifiers—ensure your system can interact with others in the ecosystem.
For production, several critical next steps are required. First, audit your cryptographic implementations and key management. Move from test DIDs to a robust method like did:ethr or did:ion. Implement credential status checking using a revocation list (e.g., StatusList2021) or a smart contract. You must also design your credential schemas carefully, publishing them to a trusted registry, and establish clear governance around who can issue credentials and under what policies. Security reviews are non-negotiable.
To deepen your understanding, explore advanced patterns. Study Selective Disclosure (e.g., BBS+ signatures) for revealing specific attributes without sharing the entire credential. Experiment with zero-knowledge proofs for complex claim verification. Integrate with identity hubs or cloud wallets for better key recovery and storage. The Decentralized Identity Foundation and W3C VC Working Group are essential resources for specifications and best practices.
Finally, consider the application landscape. Your infrastructure can be used for KYC/AML compliance, educational certificates, professional licenses, access credentials for DAOs, and supply chain provenance. Start with a pilot project that addresses a clear user need. The true test of your system is its usability and reliability for both issuers and holders, ensuring credentials are not just technically verifiable but also practically useful and trusted.