Verity embeds a protocol of cryptographically enforced conditions that govern the proofs associated with data as it flows from source to destination. Developers can extend this protocol to suit their use cases, designing custom data pipelines that impose their own cryptographic guarantees over data integrity.

Verity’s Data Flow Proofs (DFPs) are composed of recursive sub-proofs, each layer corresponding to a specific stage of the data pipeline. The base protocol abstracts proof generation and management mechanics, allowing developers to focus on defining data-specific protocols.
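The layered structure described above can be sketched as a chain of sub-proofs, each committing to its own output and to the proof of the previous stage. This is a minimal illustrative model, not Verity's actual proof format; the `SubProof` type, field names, and SHA-256 commitments are assumptions made for the sketch.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class SubProof:
    """One pipeline stage's proof: binds the stage's output commitment
    to the commitment of the preceding sub-proof (hypothetical model)."""
    stage: str
    output_commitment: str
    prev_commitment: str

    def commitment(self) -> str:
        payload = f"{self.stage}|{self.output_commitment}|{self.prev_commitment}"
        return hashlib.sha256(payload.encode()).hexdigest()

def build_dfp(stages):
    """Chain sub-proofs so each layer attests to the one before it.
    Returns the sub-proofs and a final commitment over the whole pipeline."""
    prev = "genesis"
    proofs = []
    for stage, output in stages:
        sp = SubProof(stage, hashlib.sha256(output.encode()).hexdigest(), prev)
        prev = sp.commitment()
        proofs.append(sp)
    return proofs, prev

dfp, root = build_dfp([("sourcing", "tls-transcript"),
                       ("processing", "median=42"),
                       ("output", "attested-value")])
```

Because each sub-proof embeds the previous commitment, verifying the final commitment transitively checks every stage of the pipeline.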

1. Data Sourcing

Data sourcing involves obtaining third-party data from TLS-enabled endpoints and combining it with first-party data.

Third-party data sources include:

  1. Government APIs
  2. Financial APIs
  3. Weather APIs
  4. Social media APIs

First-party data sources include:

  1. Private user data
  2. Systems data
  3. Custom data

TLS Proof Generation

Verity generates TLS proofs for data sourced from these endpoints, allowing developers to aggregate proofs across multiple requests into a single, cryptographically verifiable proof.
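One way to picture aggregating per-request proofs into a single verifiable value is a Merkle tree over the individual proof digests. This is a generic sketch of that idea, not Verity's aggregation scheme; the function names and the duplicate-last-node rule for odd levels are assumptions.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Fold per-request TLS proof digests into one root commitment (sketch)."""
    level = [h(leaf) for leaf in leaves]
    if not level:
        return h(b"")
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate the last node on odd levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

proofs = [b"tls-proof-request-1", b"tls-proof-request-2", b"tls-proof-request-3"]
root = merkle_root(proofs)  # a single commitment over all requests
```

A verifier holding the root can then check any individual request's proof with a logarithmic-size inclusion path, rather than re-verifying every request.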

Verity leverages a fork of TLSNotary, an MPC-TLS implementation supported by Ethereum’s Privacy and Scaling Explorations (PSE).

See examples of TLS proofs.

2. Data Processing

At this stage, data is processed and/or transformed into a format compatible with the destination system. This transformation must be verifiable to ensure the integrity of the applied logic.

  • Aggregation examples include computing median or mean values, as seen in Decentralised Oracle Networks.
  • Adherence to the computational logic must be provable, which is achieved using a Verifiable Data Processing Environment (VDPE).

TLS Proof Aggregation

The Prover orchestrates the aggregation of TLS proofs within a VDPE, ensuring verifiability of transformations.

A VDPE can be:

  • zkVM: For privacy-preserving computation.
  • Replicated Compute Platform (e.g., Internet Computer): For faster processing with simpler proofs (e.g., tECDSA).

Verity natively supports the following platforms for verifiable data processing:

  • RiscZero zkVM (default ZK VDPE).
  • Internet Computer (replicated compute VDPE).

zkTLS Flow & Optimisation

In Verity, “zkTLS” refers to a data flow where a zkVM-based VDPE is used to aggregate and process MPC-TLS proofs.

Essentially, zkTLS represents a TLS data flow through ZK compute; Verity therefore uses the terms zkTLS proof and Data Flow Proof (DFP) interchangeably.

To enhance zkTLS, Verity offloads pre-computation to replicated compute platforms, significantly reducing zkVM proving times from hours to minutes.
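The offloading idea above can be sketched as a two-phase pipeline: the replicated platform digests the bulky MPC-TLS transcripts cheaply, and the zkVM then proves over the small digests instead of the raw data. The function names and the digest scheme are assumptions for illustration; they do not reflect Verity's internal interfaces.

```python
import hashlib

def precompute_on_replicated(transcripts):
    """Hypothetical offload phase: a replicated compute platform (e.g. the
    Internet Computer) reduces raw MPC-TLS transcripts to small digests."""
    return [hashlib.sha256(t.encode()).hexdigest() for t in transcripts]

def prove_in_zkvm(digests):
    """Hypothetical proving phase: the zkVM works over compact digests
    rather than full transcripts, shrinking the proving workload."""
    combined = hashlib.sha256("".join(digests).encode()).hexdigest()
    return {"kind": "zk-dfp", "commitment": combined}

digests = precompute_on_replicated(["transcript-a", "transcript-b"])
proof = prove_in_zkvm(digests)
```

The speedup comes from moving the heavy, non-private verification work out of the zkVM, leaving only a small commitment computation inside the expensive proving environment.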

3. Output

The output is a succinct, verifiable value ready for delivery to the destination system, typically a Smart Contract.

zkSNARK or tECDSA Proof

  • zkSNARK: Ensures privacy, recommended for private TLS data.
  • tECDSA: Efficient, suitable for public TLS data.

The choice of proof format ensures compatibility with nearly all programmable blockchains.
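The selection rule above reduces to a simple branch on data sensitivity. The helper below is a hypothetical convenience, not part of Verity's API; only the zkSNARK/tECDSA mapping comes from the text.

```python
def select_proof_format(data_is_private: bool) -> str:
    """Heuristic from the docs: zkSNARK for private TLS data,
    tECDSA for public TLS data (hypothetical helper)."""
    return "zkSNARK" if data_is_private else "tECDSA"

assert select_proof_format(True) == "zkSNARK"
assert select_proof_format(False) == "tECDSA"
```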

4. Proof & Data Delivery

Verity equips developers with tools for orchestrating proof generation, aggregation, and delivery. The delivery mechanism, often referred to as the Orchestrator, manages:

  1. Proof generation timing (e.g., accounting for blockchain re-orgs).
  2. Data authentication.
  3. Follow-on actions upon proof delivery.

Example: A zkVM-based Verity pipeline might wait a predetermined period after detecting a blockchain event before generating and delivering the ZK-DFP, mitigating the risk of blockchain re-orgs.
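The re-org guard in the example can be sketched as a confirmation-depth wait: the Orchestrator holds off proving until the chain has advanced a fixed number of blocks past the observed event. The function and parameter names are assumptions for illustration; the default confirmation count is arbitrary.

```python
import time

def deliver_after_confirmations(event_block: int, latest_block,
                                confirmations: int = 12,
                                poll_secs: float = 1.0) -> bool:
    """Hypothetical re-org guard: block until the chain is `confirmations`
    blocks past the event, then signal it is safe to prove and deliver."""
    while latest_block() < event_block + confirmations:
        time.sleep(poll_secs)
    return True  # safe to generate and deliver the ZK-DFP
```

Waiting for confirmation depth trades latency for safety: the deeper the event is buried, the less likely a re-org invalidates the data the proof was generated over.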

Oracle Integrations

Usher Labs collaborates with Decentralised Oracle Networks (DONs) and Data Delivery Networks (DDNs) to integrate zkTLS proofs into blockchain systems. These partnerships simplify the delivery of private data on-chain while preserving security. Learn more about integrations.

Recursion

A unique aspect of Verity’s protocol is its recursive compatibility. A proof from one data pipeline can serve as an input for another, enabling layered pipelines with independently verifiable stages.

While this feature is currently experimental, it demonstrates Verity’s potential to scale trust across complex data systems.
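Recursive composition can be illustrated by feeding one pipeline's final commitment into another as an ordinary input, so the second proof transitively attests to the first pipeline's entire data flow. This is a toy model with hash digests standing in for real proofs; `run_pipeline` is a hypothetical name.

```python
import hashlib

def run_pipeline(inputs):
    """Toy pipeline: returns a single commitment over its inputs,
    standing in for a full Data Flow Proof."""
    return hashlib.sha256(b"|".join(inputs)).hexdigest()

# Recursion: pipeline A's proof becomes an input to pipeline B, so
# verifying B's proof also (transitively) covers A's stages.
proof_a = run_pipeline([b"source-data"])
proof_b = run_pipeline([proof_a.encode(), b"additional-data"])
```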