
Zero Knowledge: SNARKs vs. STARKs

Introduction

With the rise of cryptocurrencies like Ethereum, zero-knowledge proof technology is growing in popularity due to the variety of its use cases, such as verifiable computation and privacy preservation. In this article, we review the class of zero-knowledge proof constructions introduced by Ben-Sasson, Bentov, Horesh and Riabzev (BBHR18) in 2018 that overcomes the weaknesses of earlier systems discussed below: ZK-STARKs (Zero-Knowledge Scalable Transparent Arguments of Knowledge).

ZK-STARKs are the class of argument systems that satisfy the following properties:

  • Zero-Knowledge: Private inputs are shielded
  • Scalable: Proofs for computational integrity lasting [latex]T[/latex] cycles are generated with time quasi-linear in [latex]T[/latex] (roughly [latex]T[/latex] cycles) and verified with time exponentially faster than [latex]T[/latex] (roughly [latex]\log T[/latex] cycles)
  • Transparent: No trusted setup, verifier messages are random coins
  • Argument of Knowledge: An efficient procedure can extract the secrets from a prover (more informally, proofs can only be generated by parties with knowledge of the private inputs)

ZK-SNARKs

Proposed by Joe Kilian and Silvio Micali in the 1990s, Zero-Knowledge Succinct Non-Interactive Arguments of Knowledge (ZK-SNARKs) are a class of computationally-sound ZK proof systems that were long considered intriguing yet impractical, and invoked mainly for their theoretical applications. In particular, the feasibility of the prover and verifier runtimes was a major concern.

Recently, new ideas and schemes have led to both improved theoretical foundations and fully working ZK-SNARK implementations, such as the cryptocurrency Zcash. These recent ZK-SNARKs are instantiated differently from the ones originally conceived by Kilian and Micali: they primarily rely on the Knowledge of Exponent (KEA1) assumption, and they involve a reduction to circuits (and constraints over them), a further reduction to polynomials, the construction of a linear PCP (discussed later), and finally the realization of that PCP as a ZK-SNARK.

The ZK-STARK construction discussed in this article has its roots in the early ZK-SNARK system described by Kilian and Micali, with large improvements in feasibility and efficiency obtained by replacing the original proofs (“PCPs”) with a generalization known as IOPs (further extended with IOPPs), both of which will be discussed later. Although ZK-STARKs share more of their construction with the Kilian-Micali ZK-SNARK, we aim to compare them against the ZK-SNARKs that have concrete implementations, namely those based on KEA1, and show how ZK-STARKs offer concrete improvements in performance and security guarantees. Hence, from this point forth we will refer to all such KEA1-based ZK-SNARKs simply as ZK-SNARKs. Such ZK-SNARKs have three primary weaknesses that we wish to examine:

  • Reliance on a trusted setup mechanism, such as pre-generated keys or a multiparty computation
  • Limited scalability in generating and verifying proofs of computational integrity
  • Reliance on elliptic curves, pairings and the Knowledge-of-Exponent assumption, making them susceptible to quantum attacks

I will give an overview of the primary novel units (and the intuition behind them) that form the basis of ZK-STARKs. The novel construction presented in the paper is a scalable and transparent ZK system in the IOP model, realized as a ZK-STARK, which consists of the following high-level components:

  • Arithmetization to Reed-Solomon (RS) Proximity Testing (RPT) Problems
  • Interactive Oracle Proof (IOP) of Knowledge via the Fast Reed-Solomon IOP of Proximity (FRI)
  • Argument of Knowledge (ZK-STARK)

Motivation for ZK-STARKs

In the BBHR18 paper, a DNA Profile Match (DPM) scenario is used to illustrate the motivation for (and an application of) ZK-STARKs. In short, the DPM scenario is as follows: suppose a candidate has been offered a position as a high-ranking government official, conditioned on having committed no serious crimes. The police maintain a large database of DNA records for criminals, and the candidate argues that if they were guilty, their DNA would appear in it. If we accept the candidate’s argument, then we would need to check that every criminal DNA entry in the database differs from the candidate’s DNA. The candidate’s DNA sample would have to be exposed for public verification (no privacy preserved), and checking the database would take a very long time due to its size.

The DPM scenario poses several problems that motivate ZK-STARKs:

  • Computational Integrity: A prover [latex]P[/latex] executing a public computation [latex]C[/latex] on a dataset [latex]D[/latex] may be dishonest. How do we ensure [latex]P[/latex] reports the correct output [latex]\alpha_P=C(D)[/latex]?
  • Scalable Verification: With a public dataset [latex]D[/latex], any verifier [latex]V[/latex] can verify that [latex]\alpha_P=C(D)[/latex] naively, but it does not scale because [latex]V[/latex] must read [latex]D[/latex] in its entirety and the verifier runtime is equal to the runtime of [latex]C[/latex]. How do we make the verification scalable?
  • Confidentiality: If [latex]D[/latex] is confidential, then [latex]D[/latex] cannot be made public and any broker [latex]P[/latex] computing [latex]C(D)[/latex] may violate computational integrity under the guise of confidentiality. How do we achieve confidentiality without compromising computational integrity?

Probabilistically-Checkable Proofs (PCPs)

ZK proof and argument systems remove the need for a third-party “auditor” to guarantee computational integrity over confidential data. More formally:

A Zero-Knowledge (ZK) system [latex]S[/latex] for a publicly computable function [latex]C[/latex] is a pair of randomized algorithms [latex]S=(P,V)[/latex], where the prover [latex]P[/latex] is the algorithm/procedure used to generate proofs of computational integrity, and the verifier [latex]V[/latex] is the algorithm used to check such proofs. For true statements, [latex]P[/latex] can efficiently generate proofs that [latex]V[/latex] accepts; for false statements, any proof is accepted by [latex]V[/latex] with only negligible probability.
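Stated as the usual completeness and soundness conditions (a standard formulation, not quoted verbatim from the paper):

  • Completeness: for every true claim [latex]x[/latex], [latex]\Pr[\langle P,V\rangle(x)=\text{accept}]=1[/latex]
  • Soundness: for every false claim [latex]x[/latex] and every cheating prover [latex]P^*[/latex], [latex]\Pr[\langle P^*,V\rangle(x)=\text{accept}]\le\text{negl}(|x|)[/latex]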

Early ZK systems with scalable verification used Probabilistically Checkable Proofs (PCPs). The PCP setup is to prove some [latex]x \in L[/latex] for some language [latex]L\in\textbf{NTIME}(T(n))[/latex] where:

  • Verifier has oracle access to a PCP [latex]\pi[/latex] and runs in time [latex]\text{poly}(n+\log(T(n)))[/latex]
  • If [latex]x\in L[/latex] then there exists a proof [latex]\pi[/latex] that is accepted with probability 1
  • If [latex]x\notin L[/latex] then every proof [latex]\pi^*[/latex] is rejected with probability more than [latex]1/2[/latex]

This offers a trade-off: the prover runtime scales polynomially in the naive computation time of [latex]C(D)[/latex], while the verifier runtime is exponentially smaller than that computation time – i.e. [latex]T_P=T_C^{\mathcal{O}(1)}, T_V=\log^{\mathcal{O}(1)}T_C[/latex]. Such ZK systems based on PCPs (“ZK-PCPs”) offer six primary guarantees:

  • Post-Quantum Resilience. The collision-resistant hash functions and common-access random functions (in the Random Oracle Model) are not known to be vulnerable to attacks by quantum computers.
  • Proof-of-Knowledge (POK). ZK-PCPs provide a POK, or when realized in the DPM scenario, an Argument-of-Knowledge (ARK).
  • Transparency. This is achieved via the verifier’s public randomness, which removes the need for an external trusted setup phase. This is particularly important because even very powerful cheating provers are severely limited as long as the randomness remains public. Such transparency is realized through Arthur-Merlin protocols.
  • Universality. ZK-PCPs can be applied to any efficient publicly computable [latex]C[/latex].
  • Confidentiality. Private inputs are not compromised.
  • Scalable Verification, as discussed in the trade-off above.

Hence, ZK-PCPs are very useful for publicly trusted computational integrity over confidential data. However, until the ZK-STARK paper, no ZK-PCP had been realized in code because the proofs were too convoluted to construct in practice. More recent implementations of general-purpose ZK systems fail to achieve all of the abovementioned properties.

Interactive Oracle Proofs (IOPs)

In a bid to enhance the scalability of the prover while maintaining the properties of ZK-PCPs, a generalization of the PCP (and other similar models) was suggested: the Interactive Oracle Proof (IOP). By definition, PCP systems are transparent 1-round IOP systems. As with a PCP, the IOP verifier does not need to read the entire prover message but instead samples random locations; unlike a PCP, the prover and verifier interact over several rounds. An IOP is set up to prove, for some language [latex]L\in\textbf{NTIME}(T(n))[/latex], that some [latex]x\in L[/latex]:

  • Verifier ([latex]V[/latex]) has oracle access to first oracle proof [latex]\pi_1[/latex]
  • [latex]V[/latex] sends public random token [latex]r_1[/latex]
  • [latex]V[/latex] has oracle access to second oracle proof [latex]\pi_2[/latex]
  • [latex]\ldots[/latex]
  • [latex]V[/latex] has oracle access to [latex]n[/latex]-th oracle proof [latex]\pi_n[/latex]
  • [latex]V[/latex] sends public random token [latex]r_n[/latex]
  • Verifier runtime is [latex]\text{poly}(n+\log(T(n)))[/latex]
  • If [latex]x\in L[/latex] then there exist proofs [latex]\pi_1,\pi_2,\ldots,\pi_n[/latex] that are accepted with probability 1
  • If [latex]x\notin L[/latex] then all proofs [latex]\overrightarrow{\pi}[/latex] are rejected with probability at least [latex]1/2[/latex]

IOPs with perfect ZK and scalable verifiers for languages in the classes [latex]\textbf{NP}[/latex] and [latex]\textbf{NEXP}[/latex] were recently illustrated, and the prover running time is shown to be bounded by [latex]T_C\cdot\log^{\mathcal{O}(1)}T_C[/latex] (referred to in the ZK-STARK paper as “scalable proving time”).

Scalable Transparent IOP of Knowledge (STIK)

Scalable ZK System: A ZK system is scalable if the prover runtime is bounded by the quasi-linear factor [latex]T_C\cdot\log^{\mathcal{O}(1)}T_C[/latex] and the verifier runtime is poly-logarithmic in [latex]T_C[/latex], for some publicly computable function [latex]C[/latex] that runs in time [latex]T_C[/latex].

ZK-STIK: A ZK system in the Scalable Transparent IOP of Knowledge (STIK) model for a language [latex]L\in\textbf{NTIME}(T(n))[/latex] is

  • Scalable, as per the definition above
  • Transparent, through the use of public random coins for all verifier messages
  • An IOP of Knowledge, if there is some poly-time extractor that can extract a witness [latex]w[/latex] for membership [latex]x\in L[/latex] from any prover that convinces the verifier

Basically, a ZK system in the IOP model (ZK-IOP) that has the six primary guarantees of a ZK-PCP as well as the scalability described above (Scalable ZK) is called a Scalable Transparent IOP of Knowledge (ZK-STIK).

Theoretical constructions of ZK-STIKs have surfaced recently, but the ZK-STARK paper is the only one so far that demonstrates their efficiency and applicability.

Arithmetization

Arithmetization is a commonly used technique in numerous modern ZK systems, which involves the reduction of computational problems to algebraic ones. Such algebraic problems involve a polynomial over some finite field [latex]\mathbb{F}[/latex], where the degree of the polynomial is much smaller than the size of [latex]\mathbb{F}[/latex] (low-degree).

Why do we need such reductions? We rely on math similar to the polynomial oversampling used in erasure coding (which makes data fault-tolerant). If we encode some data set as a polynomial of degree [latex]d=10^6[/latex], we can sample [latex]2d[/latex] points from the polynomial. This way, any [latex]d+1[/latex] points allow us to recover the original data set via polynomial interpolation. However, any single error in the original data set is amplified: it changes at least [latex]d[/latex] of the [latex]2d[/latex] sampled points. In a similar fashion, algorithms in ZK-STARKs make use of polynomial oversampling and interpolation to achieve error amplification.
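To make the oversampling idea concrete, here is a minimal Python sketch over a small prime field. The modulus, degree and data values are arbitrary illustrative choices; actual ZK-STARKs work over large binary fields with far larger degrees.

```python
# Toy illustration of polynomial oversampling (Reed-Solomon-style encoding)
# over the small prime field F_97 -- a sketch of the principle only.
P = 97  # toy prime modulus

def poly_eval(coeffs, x):
    """Evaluate a polynomial (coefficients ordered low-to-high) at x, mod P."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def lagrange_eval(points, x):
    """Evaluate at x the unique low-degree polynomial through `points` (mod P)."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

data = [12, 7, 30, 5]               # "data set" = coefficients of a degree-3 polynomial
domain = list(range(1, 9))          # oversample: 2*(d+1) = 8 evaluation points
codeword = [poly_eval(data, x) for x in domain]

# Any d+1 = 4 points recover every other point of the codeword by interpolation.
subset = list(zip(domain[:4], codeword[:4]))
assert all(lagrange_eval(subset, x) == y for x, y in zip(domain, codeword))

# Error amplification: corrupt a single data item and most evaluations change.
corrupted = [13, 7, 30, 5]
corrupted_codeword = [poly_eval(corrupted, x) for x in domain]
print(sum(a != b for a, b in zip(codeword, corrupted_codeword)), "of", len(domain), "points differ")
```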

In a ZK system, arithmetization begins with a computational integrity claim that some prover [latex]P[/latex] wishes to prove, e.g. “[latex]\alpha[/latex] is the result of executing [latex]T[/latex] steps of [latex]C[/latex] on input [latex]x[/latex]”. The claim is converted into a polynomial (or algebraic intermediate representation/AIR), and the task becomes determining whether it is proximal to a polynomial of the expected low degree. The problem is easy if we can look at every point on the polynomial, but that is not viable for large polynomials in practice – hence the number of queries is limited. More concretely, for a very large claimed degree bound [latex]d[/latex], we want to verify the bound with far fewer than [latex]d[/latex] queries. This is very difficult to do directly; however, if we sample a small subset of points and perform a proximity test (there’s always the possibility that most points lie on the low-degree polynomial but some do not), we can reduce the false positive rate to a negligible value. The exact arithmetization is much more complex, but this hopefully provides some high-level intuition.
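As a toy illustration of the sampling intuition (not the actual test used in ZK-STARKs, which is FRI over a finite field), the following Python sketch interpolates a few random samples and spot-checks additional positions. The degree bound, domain and corruption pattern are arbitrary assumptions for the example.

```python
# Naive "sample-and-check" low-degree proximity test, for intuition only.
# Real systems use far more query-efficient tests (FRI) over finite fields;
# floats are used here purely for brevity.
import numpy as np

rng = np.random.default_rng(0)
d = 3                                   # claimed degree bound
domain = np.arange(32.0)                # the oracle's evaluation domain

honest = np.polyval([1.0, -2.0, 0.0, 5.0], domain)  # evaluations of a degree-3 polynomial
cheating = honest.copy()
cheating[::4] += 7.0                    # corrupt 1/4 of positions: far from every degree-3 polynomial

def proximity_test(oracle, trials=20):
    """Interpolate d+1 random samples, then spot-check one more random position."""
    for _ in range(trials):
        idx = rng.choice(len(domain), size=d + 2, replace=False)
        fit_idx, check_idx = idx[:d + 1], idx[d + 1]
        coeffs = np.polyfit(domain[fit_idx], oracle[fit_idx], d)
        if not np.isclose(np.polyval(coeffs, domain[check_idx]), oracle[check_idx]):
            return False                # samples inconsistent with one degree-d polynomial: reject
    return True

print(proximity_test(honest))    # True: every spot check is consistent
print(proximity_test(cheating))  # almost always False: corruption is caught with high probability
```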

The Reed-Solomon Proximity Testing problem is the reduction target of computational problems in ZK-STARKs.

Reed-Solomon Codes: For an [latex]n[/latex]-element evaluation set [latex]S[/latex] in a finite field [latex]\mathbb{F}[/latex] ([latex]S\subset\mathbb{F}, |S|=n[/latex]) and rate parameter [latex]\rho\in(0,1][/latex], [latex]\textbf{RS}[\mathbb{F},S,\rho][/latex] is the set of codewords [latex]f:S\rightarrow\mathbb{F}[/latex] that are evaluations of polynomials of degree [latex]d<\rho n[/latex].

Reed-Solomon Proximity (RP) Problem: Given oracle access to a function [latex]f:S\rightarrow\mathbb{F}[/latex], the Reed-Solomon Proximity Problem asks a verifier [latex]V[/latex] to distinguish between two cases with high probability: (1) [latex]f\in \textbf{RS}[\mathbb{F},S,\rho][/latex], and (2) [latex]f[/latex] is [latex]\delta[/latex]-far in relative Hamming distance from every [latex]f^\prime\in\textbf{RS}[\mathbb{F},S,\rho][/latex].

Reed-Solomon Proximity Testing (RPT) Problem: RPT extends the RP Problem with access to a proof of proximity (such as an interactive oracle proof of proximity (IOPP)) and its oracle.

What’s novel in the BBHR18 paper is that the authors reduced the many computational bottlenecks in arithmetization, which is key to making the system scalable:

  • A new protocol is used to solve the RPT problem, called the Fast RS IOPP or FRI. With FRI, the asymptotic and concrete runtimes of previous RPT solutions (which require quasi-linear prover arithmetic complexity) are improved significantly, to strictly linear prover arithmetic complexity. (A simplified sketch of one FRI folding round appears after this list.)
  • The binary field [latex]\mathbb{F}_{2^{64}}[/latex] is chosen for its efficiency in arithmetic operations, in conjunction with cryptographic hash functions that are efficient on binary fields (e.g. SHA2, AES128-DM).
  • Reducing the maximal degree of the resultant “arithmetized” polynomial. The main bottleneck dominating prover space and time complexity is the cost of performing polynomial interpolations and evaluations, which in turn is dominated by the maximal degree of the polynomials in question. The reduction is achieved through linear-algebraic tricks and computation-structure-specific optimizations, and gave a large multiplicative improvement factor ([latex]>5000[/latex]x) over existing arithmetization methods in ZK systems.
  • Using a more recent and novel additive FFT for polynomial interpolation and evaluation, which has strictly quasi-linear arithmetic complexity.
  • Using the Davies-Meyer hash with AES (AES-DM) as the hash function of choice for the commit-reveal scheme, since AES is available as a native instruction set on many modern processors.
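To give a flavour of how FRI repeatedly halves the degree of the tested polynomial, here is a minimal Python sketch of a single FRI folding round over a toy prime field. The field, domain size, polynomial and challenge value are arbitrary illustrative choices; the actual protocol in [BBHR17] works over large binary fields, commits to each round’s oracle (e.g. with Merkle trees), and iterates the fold until only a constant remains.

```python
# One FRI "folding" round over the toy prime field F_257 (2^8 divides 257-1,
# so power-of-two evaluation domains exist). Sketch only, not the real protocol.
P = 257
N = 16                                     # evaluation domain size
w = pow(3, (P - 1) // N, P)                # generator of the order-N multiplicative subgroup
domain = [pow(w, i, P) for i in range(N)]
inv2 = pow(2, P - 2, P)

def poly_eval(coeffs, x):
    acc = 0
    for c in reversed(coeffs):             # coefficients ordered low-to-high
        acc = (acc * x + c) % P
    return acc

def fri_fold(evals, domain, beta):
    """Split f into even/odd parts and mix them with the verifier's challenge beta:
    f'(x^2) = f_even(x^2) + beta * f_odd(x^2), halving both degree and domain size."""
    half = len(domain) // 2
    folded, new_domain = [], []
    for i in range(half):
        x, fx, f_minus_x = domain[i], evals[i], evals[i + half]   # domain[i+half] == -domain[i]
        f_even = (fx + f_minus_x) * inv2 % P
        f_odd = (fx - f_minus_x) * inv2 * pow(x, P - 2, P) % P
        folded.append((f_even + beta * f_odd) % P)
        new_domain.append(pow(x, 2, P))
    return folded, new_domain

coeffs = [5, 1, 0, 2, 7, 3, 0, 1]          # a degree-7 polynomial (degree < N)
evals = [poly_eval(coeffs, x) for x in domain]
beta = 42                                  # stands in for the verifier's random challenge
folded, new_domain = fri_fold(evals, domain, beta)

# Sanity check: the folded evaluations come from g(y) = f_even(y) + beta*f_odd(y),
# a polynomial of half the degree (here deg(g) <= 3).
g = [(coeffs[2 * i] + beta * coeffs[2 * i + 1]) % P for i in range(4)]
assert all(poly_eval(g, y) == v for y, v in zip(new_domain, folded))
```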

From STIK to STARK

After some computational integrity claim is converted into an algebraic representation, an IOP is set up for the proximity proof of that algebraic representation (polynomial). Such a setup is also known as an Interactive Oracle Proof of Proximity, or IOPP. It differs from an IOP in that instead of proving some [latex]x\in L, L\in\textbf{NTIME}(T(n))[/latex], we want to prove that some oracle [latex]f[/latex] (in this case, a purported Reed-Solomon (RS) codeword) is proximal to a code [latex]\mathbb{W}\subseteq\mathbb{F}^n[/latex] (such as [latex]\textbf{RS}[\mathbb{F},S,\rho][/latex]). If [latex]f\notin \mathbb{W}[/latex] then all proofs [latex]\overrightarrow{\pi}[/latex] are rejected with probability at least [latex]s(\Delta(f, \mathbb{W}))[/latex], where [latex]\Delta[/latex] is a distance function and [latex]s[/latex] is a soundness function that we wish to maximize. Once we are done with all the transformations to a scalable and transparent IOPP model (a STIK), there are two main ways to transform the system into a STARK:

  • Interactive STARK (ZK-iSTARK): Collision-resistant hash functions can be used to convert a STIK into an interactive argument system (as shown by Kilian); a toy Merkle-commitment sketch of this idea follows the list. If the STIK has perfect ZK (as defined before) then the argument system has computational ZK.
  • Non-Interactive STARK (ZK-nSTARK): Via the compilation techniques developed for PCPs and IOPs, any STIK can be compiled into a non-interactive argument system in the random oracle model (ROM).
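The following Python sketch illustrates the Kilian-style hash-based commitment mentioned in the first bullet: the prover Merkle-commits to an oracle string, and only the positions the verifier actually queries are opened with short authentication paths. All names and the toy oracle contents are illustrative assumptions, not the implementation from the paper.

```python
# Toy Merkle commitment: commit to an oracle with one hash, open queried positions.
import hashlib

def H(*parts):
    return hashlib.sha256(b"".join(parts)).digest()

def merkle_tree(leaves):
    """Build all layers bottom-up; layers[-1][0] is the root (len(leaves) must be a power of two)."""
    layers = [[H(leaf) for leaf in leaves]]
    while len(layers[-1]) > 1:
        prev = layers[-1]
        layers.append([H(prev[i], prev[i + 1]) for i in range(0, len(prev), 2)])
    return layers

def open_position(layers, index):
    """Authentication path: the sibling hash at every level."""
    path = []
    for layer in layers[:-1]:
        path.append(layer[index ^ 1])
        index //= 2
    return path

def verify_opening(root, index, leaf, path):
    node = H(leaf)
    for sibling in path:
        node = H(node, sibling) if index % 2 == 0 else H(sibling, node)
        index //= 2
    return node == root

# Prover commits to an 8-symbol "oracle"; the verifier later queries position 5.
oracle = [bytes([b]) for b in b"STARKIOP"]
layers = merkle_tree(oracle)
root = layers[-1][0]                    # this short commitment is all the verifier needs up front

proof = open_position(layers, 5)
assert verify_opening(root, 5, oracle[5], proof)     # honest opening accepted
assert not verify_opening(root, 5, b"X", proof)      # tampered symbol rejected
```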

ZK-STARKs vs. ZK-SNARKs

ZK-SNARKs have some problems that prevent them from gaining traction in mainstream cryptographic use, such as in blockchain technology and cryptocurrencies:

  • There is a trusted setup phase that can be compromised. For example, the cryptocurrency Zcash, which uses ZK-SNARKs, requires its public parameters to be generated in an initial setup ceremony, and the assumption is that this ceremony was performed honestly.
  • The scalability can be improved in terms of proof generation and verification.
  • For reasons we will see in a bit, ZK-SNARKs are vulnerable to quantum attacks.

Transparency

The largest problem with ZK-SNARKs is that the user base has to trust the initial setup phase (and, in turn, the parties involved in setting up the system honestly). The users will never know if the setup phase was compromised from the get-go (or will be in the future). Contrast this with ZK-STARKs, where there is no external trusted setup phase: all randomness is publicly verifiable, so no party can covertly obtain the setup parameters and generate false accepting proofs.

Scalability

ZK-STARKs exhibit much better scalability than ZK-SNARKs (initial benchmarks from the paper show promising results). ZK-STARKs improve on many of the bottlenecks in ZK-SNARKs:

  • Arithmetic complexity. The reduction to algebraic representations and their relevant computations are key to proof generation and verification, and the authors have made many optimizations to reduce the computational and arithmetic complexity.
  • Prover and verifier complexity. Due to reductions in arithmetic complexity, ZK-STARKs are faster than ZK-SNARKs by a factor of 8-10x with respect to computation size for proof generation. The usage of the FRI algorithm and a newer additive FFT substantially sped up the verifier runtime in ZK-STARKs.
  • Communication complexity. The number of messages required between [latex]n[/latex] parties to solve a problem (in this case, verifying a proof) grows linearly in [latex]n[/latex] for a ZK-SNARK; in contrast, for ZK-STARKs it grows at a much slower rate.

While ZK-STARKs have the above scalability benefits, ZK-SNARKs have a much smaller proof size. In this regard, ZK-STARKs may have their own scalability limitation.

Post-Quantum Resilience

ZK-SNARKs rely on public-key cryptographic assumptions, such as elliptic-curve pairings and the related discrete-logarithm problem, to generate their keys and proofs. With the (upcoming) advent of quantum computing, there exist algorithms that break such cryptosystems: for example, Shor’s algorithm greatly speeds up the derivation of private keys from their public keys by solving the integer factoring and discrete logarithm problems. In contrast, ZK-STARKs adopt collision-resistant hash functions for ZK-iSTARKs, and the random oracle model for ZK-nSTARKs. In the IOPP of a ZK-STARK, Merkle trees (which are believed to be post-quantum resilient) are used. All these cryptographic primitives make ZK-STARKs resistant to known quantum attacks.

Conclusion

ZK-STARKs are still very new and in the process of commercialization. If realized, they could be leveraged across many use cases. The added benefits of scalability, transparency and universal applicability create better trust in the technology. ZK-STARKs provide a safer alternative to ZK-SNARKs, with a simpler structure in terms of the underlying cryptographic assumptions.

References

[BBHR18] Eli Ben-Sasson, Iddo Bentov, Yinon Horesh and Michael Riabzev. Scalable, transparent, and post-quantum secure computational integrity. In Cryptology ePrint Archive, Report 2018/046. https://eprint.iacr.org/2018/046

[Kil92] Joe Kilian. A note on efficient zero-knowledge proofs and arguments (extended abstract). In STOC ’92: Proceedings of the twenty-fourth annual ACM symposium on Theory of Computing, pages 723-732. May 1992.

[Mic94] Silvio Micali. CS proofs (extended abstract). In 35th FOCS, pages 436–453. IEEE Computer Society Press, November 1994.

[BBHR17] Eli Ben-Sasson, Iddo Bentov, Yinon Horesh, and Michael Riabzev. Fast Reed-Solomon interactive oracle proofs of proximity (2nd revision). Electronic Colloquium on Computational Complexity (ECCC), 24:134, 2017.
