Computational entropy and information leakage (MA Thesis)
Date
2011-01-07
Authors
Fuller, Benjamin
Reyzin, Leonid
Citation
Fuller, Benjamin; Reyzin, Leonid. "Computational Entropy and Information Leakage (MA Thesis)", Technical Report BUCS-TR-2011-001, Computer Science Department, Boston University, January 7, 2011. [Available from: http://hdl.handle.net/2144/11358]
Abstract
We investigate how information leakage reduces the computational entropy of a random variable X. Recall that HILL and metric computational entropy are parameterized by quality (how distinguishable X is from a variable Z that has true entropy) and quantity (how much true entropy Z has). We prove an intuitively natural result: conditioning on an event of probability p reduces the quality of metric entropy by a factor of p and the quantity of metric entropy by log 1/p (note that the reduction in quantity and quality is the same, because the quantity of entropy is measured on a logarithmic scale). Our result improves previous bounds of Dziembowski and Pietrzak (FOCS 2008), where the loss in the quantity of entropy was related to its original quality. The use of metric entropy tightens the result of Reingold et al. (FOCS 2008) and makes it easy to measure entropy even after conditioning on several events. Further, we simplify dealing with information leakage by investigating conditional metric entropy. We show that, conditioned on leakage of λ bits, metric entropy is reduced by a factor of 2^λ in quality and by λ in quantity. Our formulation allows us to state a "chain rule" for leakage on computational entropy: conditioning on λ bits of leakage reduces conditional metric entropy by λ bits, the same loss as when leaking from unconditional metric entropy. This result makes it easy to measure entropy even after several rounds of information leakage.
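As a rough sketch of the two quantitative statements above (here ε denotes the quality parameter, s the circuit size of the distinguishers, and k the amount of entropy; these symbols are our own labels and may differ from the report's exact notation), the bounds can be written as

\[ H^{\mathrm{metric}}_{\epsilon,\,s}(X) \ge k \;\Longrightarrow\; H^{\mathrm{metric}}_{\epsilon/p,\;s'}(X \mid E) \ge k - \log\tfrac{1}{p} \]

for an event E of probability p, and

\[ H^{\mathrm{metric}}_{\epsilon,\,s}(X \mid Y) \ge k \;\Longrightarrow\; H^{\mathrm{metric}}_{\epsilon \cdot 2^{\lambda},\;s'}(X \mid Y, Z) \ge k - \lambda \]

for a leakage value Z of length λ bits, where s' is slightly smaller than s.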