
*Holographic proof, transparent proof, instantly checkable proof,
probabilistically checkable proof, PCP* are names of a form in which every
proof or record of computation can be presented. This form has a remarkable
property: the presence of any errors (essential deviations from the form or
other requirements) is instantly apparent after checking just a negligible
fraction of bits of the proof. Traditional proofs are verifiable in time that
is a constant power (say, quadratic) of the proof length $n$. Verifying
holographic proofs takes a *polylogarithmic* number of bit operations, i.e., a
constant power of the *logarithm* of $n$. This is a tiny fraction: the binary
logarithm of the number of atoms in the known Universe is under 300.

(Of the names in the header, the phrase "probabilistically checkable" is somewhat misleading, since both holographic and traditional proofs can be checked either deterministically or probabilistically, though randomness does not speed up the checking of traditional proofs.)

The transformation of an ordinary proof into the holographic form takes a
number of bit operations only slightly superlinear in $n$. There are, however,
four caveats. First, the verification is a so-called *Monte-Carlo* algorithm.
It makes certain random choices, and any single round of verification has a
chance of overlooking essential errors due to an unlucky choice. This chance
never exceeds 50%, regardless of the nature of the errors, and vanishes as
$1/2^k$ with $k$ independent rounds.
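The amplification by independent rounds can be simulated in a few lines. This is a toy model, assuming only that each round catches a flawed proof with probability at least 1/2; all names are illustrative:

```python
import random

def verify_once(proof_has_error, catch_prob=0.5):
    # One round: a flawed proof is caught with probability catch_prob;
    # a correct proof always passes.
    if proof_has_error:
        return random.random() >= catch_prob  # True means the error was missed
    return True

def verify(proof_has_error, k):
    # Accept only if all k independent rounds accept: a flawed proof
    # then survives with probability at most (1 - catch_prob)**k.
    return all(verify_once(proof_has_error) for _ in range(k))

random.seed(0)
trials, k = 100_000, 10
missed = sum(verify(True, k) for _ in range(trials))
print(missed / trials)  # close to 2**-10, about 0.001
```

Note that a correct proof is never rejected; only the error-detection side is probabilistic.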

Second, only *essential* errors (i.e., those not correctable from the context)
have this probabilistic detection guarantee. There is a *proofreader*
procedure, also running in polylogarithmic Monte-Carlo time, that confirms or
corrects any given bit in any proof accepted by the verifier. Of course, the
instant verification can only assure the success of this proofreading; it has
no time to actually perform it for all bits. An enhanced version can be made
very tolerant: it can guarantee that any errors affecting a small, say 5%,
fraction of bits will be inessential, i.e., correctable by the proofreader and
tolerated by the verifier with high probability.
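The local-correction idea behind the proofreader can be illustrated with a toy repetition code. The actual construction uses low-degree polynomial codes, not repetition; this sketch and its names are merely an analogy for recovering a bit by sampling few positions:

```python
import random
from collections import Counter

def corrupt(word, fraction, rng):
    # Flip the given fraction of positions.
    w = list(word)
    for i in rng.sample(range(len(w)), int(fraction * len(w))):
        w[i] ^= 1
    return w

def proofread_bit(word, samples, rng):
    # Recover the encoded bit by majority vote over a few randomly
    # sampled positions: a toy local decoder.
    votes = Counter(word[i] for i in rng.choices(range(len(word)), k=samples))
    return votes.most_common(1)[0][0]

rng = random.Random(1)
encoded = [1] * 1000                 # repetition encoding of the bit 1
noisy = corrupt(encoded, 0.05, rng)  # 5% of positions are wrong
print(proofread_bit(noisy, 101, rng))  # 1: the bit survives 5% corruption
```

The decoder reads 101 of 1000 positions, yet errors confined to a 5% fraction are overruled by the majority with overwhelming probability.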

The third caveat is trivial but often overlooked. The claim must be *formal*
and *self-contained*: one cannot just write a mathematical theorem in English.
The translation to a formal system, e.g., Zermelo-Fraenkel set theory, may be
quite involved. Suppose one wants to state the Jordan theorem that every
closed curve breaks the plane into two disconnected parts. One must introduce
a number of concepts from advanced calculus just to explain what a curve is.
This requires some background from topology, algebra, geometry, etc. One must
add some set theory to formalize this background. Throw in the necessary
logic, parsing, and syntax procedures, and one obtains a book instead of the
above informal one-line formulation.

Fourth, the claim which the proof is to support (or the input/output whose matching the computation is to confirm) must also be given in error-correcting form. Otherwise one could supply a remarkable claim with a perfect proof of its useless variation (obtained by altering one random bit). Were the claim given in a plain format, such tiny but devastating discrepancies could not be noticed without reading a significant fraction of it, for which the instant verifier has no time. The error-correcting form does not have to be special: any classical (e.g., Reed-Solomon) code, correcting a constant fraction of errors in nearly linear time, will do. The verifier then confirms that the code is within the required distance of a unique codeword encoding the claim supported by the given perfect (when filtered through the proofreader) proof.
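The distance property that makes such codes suitable can be sketched with a toy Reed-Solomon-style encoding over a small prime field (the field size, claim, and evaluation points below are illustrative assumptions): altering a single bit of the claim moves its codeword far from the original.

```python
P = 10007  # a prime; symbols live in the field Z_P

def rs_encode(message, n):
    # Reed-Solomon-style encoding: view the message as coefficients
    # of a polynomial and list its values at n field points.
    return [sum(c * pow(x, i, P) for i, c in enumerate(message)) % P
            for x in range(n)]

claim = [3, 1, 4, 1, 5]  # the claim, as five field elements
variant = claim.copy()
variant[2] ^= 1          # alter a single bit of one symbol: 4 -> 5

a, b = rs_encode(claim, 100), rs_encode(variant, 100)
print(sum(x != y for x, y in zip(a, b)))  # 99: the codewords are far apart
```

Two distinct polynomials of degree below 5 can agree on fewer than 5 of the 100 points, so even a one-bit change in the claim is visible in almost every position of the codeword.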

Despite these caveats the result is surprising. Some known mathematical proofs are so huge that no single human has been able to verify them. Examples are the four color theorem (verified with the aid of a computer, see [AHK]), the classification of finite simple groups (broken into many pieces, each supposedly verified by one member of a large group of researchers, see [Go]), and others. Even more problematic seems the verification of large computations. In holographic form, however, the verification time barely grows at all, even if the proof fills up the whole Universe.

The graphs are taken from a standard family: only sizes and colorings can differ. An example is the family of shuffle-exchange graphs. Their nodes are enumerated by binary strings (addresses) $x$. The neighbors of $x$ are obtained by simple manipulations of its bits: adding 1 to the first (or last) bit, or shifting the bits left. These operations (or constant combinations of them) define the edges of the graph. The graphs are finite: the length of $x$ is fixed for each. In the actual construction, $x$ is broken into several variables, so it is convenient to shuffle bits just within the first variable and permute the variables (cyclic shifts of all variables and of the first two suffice). These graphs may be replaced by any other family with edges expressed as linear transformations of variables, as long as it has sufficient connectivity to implement an efficient sorting network.
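One plausible reading of these bit manipulations can be sketched as follows. The exact set of operations is a hypothetical choice here; variants of the shuffle-exchange family differ in detail:

```python
def neighbors(x, m):
    # Neighbors of an m-bit address x under three sample operations:
    # flip the last bit, flip the first bit, and cyclically shift
    # the bits one position to the left.
    flip_last = x ^ 1
    flip_first = x ^ (1 << (m - 1))
    shift_left = ((x << 1) | (x >> (m - 1))) & ((1 << m) - 1)
    return {flip_last, flip_first, shift_left}

print(sorted(neighbors(0b1010, 4)))  # [2, 5, 11]
```

Each node thus has constant degree, and the edges are simple bit-level transformations of the address, which is what the construction exploits.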

A proof system is an algorithm that verifies a proof (given as input) and outputs the proven statement. Such an algorithm can be efficiently simulated, first on a special form of a random access machine and then on a sorting network. This allows one to reduce the problem of finding a proof in any particular proof system to the above standard domino problem.

Then comes the arithmetization stage. The coloring is a function from nodes
of the graph to colors. Nodes are interpreted as elements of a field (a finite
field or a segment of the field of rationals) and the coloring is a polynomial
on it. This function is then extended from its original domain to a larger one
(a larger field or a segment of rationals). The extension is done using the
same expression, i.e., **without increasing the degree of the coloring
polynomial**.
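The extension step can be illustrated by Lagrange interpolation over a small prime field. The field size and coloring below are illustrative, not the actual parameters:

```python
P = 101  # a small prime field Z_P (size is illustrative)

def extension(values, x):
    # Evaluate at x the unique polynomial of degree < len(values)
    # taking values[i] at point i (Lagrange interpolation mod P):
    # the coloring extends to all of Z_P with no increase in degree.
    d = len(values)
    total = 0
    for i, v in enumerate(values):
        num = den = 1
        for j in range(d):
            if j != i:
                num = num * (x - j) % P
                den = den * (i - j) % P
        total = (total + v * num * pow(den, P - 2, P)) % P
    return total

coloring = [2, 0, 1, 2]  # colors of the original nodes 0..3
print([extension(coloring, i) for i in range(4)])  # [2, 0, 1, 2]
print(extension(coloring, 77))  # also defined off the original domain
```

On the original domain the extension reproduces the coloring exactly; elsewhere it is determined by the same low-degree expression.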

The condition that all dominos are restricted to the given types is also expressed as the equality to $0$ of a low-degree polynomial $P(x)$ of a node $x=x_1,...,x_k$, its neighbors, and their colors. Over the rationals, we need to check that $P^2(x)$ sums to $0$ when the $x_i$ vary over the original domain. (In finite fields, constructing the equation to check is trickier.) A transparent proof is the collection of values of this expression where the summation is taken only over the first several variables. The other variables take all values that exist in the extended domain.
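A minimal sketch of such a table of partial sums, over the rationals, with two variables and toy domain sizes (the constraint polynomials and all sizes here are illustrative assumptions):

```python
H = range(4)     # original domain of each variable
EXT = range(16)  # extended domain (both sizes are illustrative)

def partial_sums(p):
    # Tabulate sum over x1 in H of p(x1, x2)**2 for every x2 in the
    # extended domain; such tables make up the transparent proof.
    return [sum(p(x1, x2) ** 2 for x1 in H) for x2 in EXT]

def final_check(table):
    # Summing the remaining variable over H completes the check.
    return sum(table[x2] for x2 in H)

good = lambda x1, x2: 0  # constraint polynomial: all dominos legal
bad = lambda x1, x2: 1 if (x1, x2) == (2, 3) else 0  # one violation

print(final_check(partial_sums(good)), final_check(partial_sums(bad)))  # 0 1
```

Squaring makes every violation contribute positively, so over the rationals a single illegal domino anywhere forces the total away from $0$.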

The verification consists of statistically checking that all partial sums (allowing a small fraction of deviating points) are polynomials of low degree relative to the extended domain. Then the consistency of the sums with their successors (having one more variable summed over) is checked too. This is easy to do, since low-degree polynomials cannot differ in only a small fraction of points: e.g., two different straight lines must differ in all points but one. Of course, all parameters must be finely tuned and many other details addressed. The above description is just a general sketch.
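The fact used here, that distinct low-degree polynomials disagree almost everywhere, can be checked directly for straight lines over a small prime field (the field and coefficients are illustrative):

```python
P = 97  # a small prime field

def line(a, b):
    # The degree-1 polynomial a*x + b over Z_P.
    return lambda x: (a * x + b) % P

f, g = line(3, 5), line(4, 1)  # two distinct straight lines
agreements = [x for x in range(P) if f(x) == g(x)]
print(agreements)  # [4]: distinct lines agree in at most one point
```

In general, two distinct polynomials of degree $d$ agree on at most $d$ points, so a small fraction of deviations cannot turn one low-degree polynomial into another.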

- [AHK] K. Appel, W. Haken, J. Koch. Every Planar Map is Four Colorable. Part II: Reducibility. *Illinois J. Math.*, 21:491-567, 1977.
- [Go] D. Gorenstein. The Enormous Theorem. *Scientific American*, 253(6):104-115, December 1985.
- [L] L. Levin. One-Way Functions and Pseudorandom Generators. *Combinatorica*, 7(4):357-363, 1987.
- [BFL] L. Babai, L. Fortnow, C. Lund. Non-Deterministic Exponential Time Has Two-Prover Interactive Protocols. *31st IEEE Symp. on Foundations of Computer Science*, St. Louis, October 1990.
- [LFKN] C. Lund, L. Fortnow, H. Karloff, N. Nisan. Algebraic Methods for Interactive Proof Systems. *Ibid.*
- [Shamir] A. Shamir. IP=PSPACE. *Ibid.*
- [BFLS] L. Babai, L. Fortnow, L. Levin, M. Szegedy. Checking Computation in Polylogarithmic Time. *23rd ACM Symp. on Theory of Computing*, New Orleans, May 1991.
- [FGLSS] U. Feige, S. Goldwasser, L. Lovasz, S. Safra, M. Szegedy. Approximating Clique is Almost NP-complete. *32nd IEEE Symp. on Foundations of Computer Science*, San Juan, October 1991.