To grasp the scope of Verity, it’s important to understand each stage of a provable data flow.
Verity embeds a protocol of cryptographically enforced conditions that govern the proofs associated with data as it flows from source to destination. Developers can extend this protocol to suit their use cases, designing custom data pipelines that impose their own cryptographic guarantees over data integrity.
Verity’s Data Flow Proofs (DFPs) are composed of recursive sub-proofs, each layer corresponding to a specific stage of the data pipeline.
The base protocol abstracts proof generation and management mechanics, allowing developers to focus on defining data-specific protocols.
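To make the base-protocol/data-specific split concrete, here is a minimal sketch in Python. All class and method names are hypothetical illustrations, not Verity’s actual API: a generic base collects one sub-proof per pipeline stage, while the developer-defined subclass supplies only the data-specific stages.

```python
# Hypothetical sketch of a data-specific protocol built on a generic base.
# The base handles proof collection and ordering; the subclass defines stages.
# All names here are illustrative, not Verity's real API.
from dataclasses import dataclass

@dataclass
class StageProof:
    stage: str      # pipeline stage this sub-proof covers
    payload: bytes  # opaque proof bytes produced by that stage's prover

class BasePipeline:
    """Base protocol: runs each stage in order and collects its sub-proof."""
    stages: list = []

    def run(self, data):
        proofs = []
        for stage in self.stages:
            data, proof = getattr(self, stage)(data)
            proofs.append(StageProof(stage, proof))
        return data, proofs

class PriceFeedPipeline(BasePipeline):
    """Data-specific protocol: a source stage followed by a transform stage."""
    stages = ["source", "transform"]

    def source(self, data):
        return data, b"tls-proof"       # placeholder for an MPC-TLS proof

    def transform(self, data):
        return data * 100, b"zk-proof"  # placeholder for a zkVM receipt

result, proofs = PriceFeedPipeline().run(42)
print(result, [p.stage for p in proofs])  # 4200 ['source', 'transform']
```

The point of the split is that `PriceFeedPipeline` never touches proof bookkeeping; it only declares stages, mirroring how developers focus on data-specific protocols.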
Verity generates TLS proofs for data sourced from these endpoints, allowing developers to aggregate proofs across multiple requests into a single, cryptographically verifiable proof. Verity leverages a fork of TLSNotary, an MPC-TLS implementation supported by Ethereum’s Privacy and Scaling Explorations (PSE). See examples of TLS proofs.
At this stage, data is processed and/or transformed into a format compatible with the destination system. This transformation must be verifiable to ensure the integrity of the applied logic.
Aggregation examples include computing median or mean values, as seen in Decentralised Oracle Networks.
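A short, self-contained illustration of why Decentralised Oracle Networks favour the median for this aggregation step (the report values below are made up):

```python
# Illustrative aggregation over values reported across several verified
# requests. The median tolerates a minority of skewed reports, which is
# why DONs often prefer it over the mean.
from statistics import mean, median

reports = [100.2, 99.8, 100.1, 250.0, 100.0]  # one outlier report

print(mean(reports))    # dragged up by the outlier (~130.02)
print(median(reports))  # 100.1 — unaffected by a single bad report
```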
Adherence to this computational logic must itself be provable, which Verity achieves using a Verifiable Data Processing Environment (VDPE).
In Verity, “zkTLS” refers to a data flow in which a zkVM-based VDPE aggregates and processes MPC-TLS proofs. Essentially, zkTLS is TLS data flowing through ZK compute, so Verity uses the terms zkTLS proof and Data Flow Proof (DFP) interchangeably. To enhance zkTLS, Verity offloads pre-computation to replicated compute platforms, significantly reducing zkVM proving times from hours to minutes.
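The pre-computation offload can be sketched as follows. This is a minimal illustration of the general pattern, not Verity’s implementation: heavy parsing runs on a cheap replicated host, and only a commitment check plus the light final computation would run inside the zkVM guest, shrinking the statement the zkVM must prove.

```python
# Hypothetical sketch of offloading pre-computation outside the zkVM.
# Function names and the JSON shape are illustrative assumptions.
import hashlib
import json

def host_precompute(raw_responses):
    """Replicated compute: parse raw response bodies outside the zkVM."""
    values = [json.loads(r)["price"] for r in raw_responses]
    digest = hashlib.sha256(json.dumps(values).encode()).hexdigest()
    return values, digest

def guest_program(values, expected_digest):
    """zkVM guest stand-in: re-commit to inputs, then do only light math."""
    digest = hashlib.sha256(json.dumps(values).encode()).hexdigest()
    assert digest == expected_digest, "host-supplied inputs were tampered with"
    return sum(values) / len(values)

raw = ['{"price": 100.0}', '{"price": 102.0}']
values, digest = host_precompute(raw)
print(guest_program(values, digest))  # 101.0
```

The design choice is that the guest never re-parses raw TLS data; it only verifies that the pre-computed inputs match a commitment, which is far cheaper to prove.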
Verity equips developers with tools for orchestrating proof generation, aggregation, and delivery. The delivery mechanism, often referred to as the Orchestrator, manages:
Proof generation timing (e.g., accounting for blockchain re-orgs).
Data authentication.
Follow-on actions upon proof delivery.
Example: A zkVM-based Verity pipeline might wait for a predetermined period post-blockchain event detection before generating and delivering the ZK-DFP to mitigate risks in blockchain re-orgs.
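The re-org mitigation described above amounts to a confirmation-depth check before proving. A hedged sketch, where the function name and the depth threshold are illustrative (the safe depth is chain-specific in practice):

```python
# Sketch of the Orchestrator's re-org guard: only trigger ZK-DFP
# generation once the observed event is buried under enough blocks.
CONFIRMATION_DEPTH = 12  # assumed safety margin; chain-specific in practice

def ready_to_prove(event_block: int, current_head: int) -> bool:
    """Return True once the event is unlikely to be re-orged away."""
    return current_head - event_block >= CONFIRMATION_DEPTH

print(ready_to_prove(event_block=100, current_head=105))  # False: too fresh
print(ready_to_prove(event_block=100, current_head=112))  # True: 12 deep
```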
Usher Labs collaborates with Decentralised Oracle Networks (DONs) and Data Delivery Networks (DDNs) to integrate zkTLS proofs into blockchain systems. These partnerships simplify the delivery of private data on-chain while preserving security. Learn more about integrations.
A unique aspect of Verity’s protocol is its recursive compatibility. A proof from one data pipeline can serve as an input for another, enabling layered pipelines with independently verifiable stages. While this feature is currently experimental, it demonstrates Verity’s potential to scale trust across complex data systems.
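The recursive composition described above can be pictured as each stage proof committing to the proof it builds on. In this sketch, hash-chaining stands in for real recursive proof verification, and all stage names are hypothetical:

```python
# Illustrative model of recursive compatibility: the output proof of one
# pipeline is an input to the next, so every link can be checked on its own.
# SHA-256 chaining here is a stand-in for recursive proof verification.
import hashlib

def prove_stage(stage_name: str, inner_proof: bytes) -> bytes:
    """Produce a stage proof that commits to the proof it builds on."""
    return hashlib.sha256(stage_name.encode() + inner_proof).digest()

tls_proof = prove_stage("mpc-tls", b"")          # pipeline A: source proof
agg_proof = prove_stage("aggregate", tls_proof)  # pipeline A: compute proof
bridge_proof = prove_stage("bridge", agg_proof)  # pipeline B consumes A's DFP

# Any party holding the intermediate proofs can re-derive and check each link.
assert bridge_proof == prove_stage("bridge", prove_stage("aggregate", tls_proof))
print(bridge_proof.hex()[:16])
```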