Information projection

In information theory, the information projection or I-projection of a probability distribution q onto a set of distributions P is

$$p^* = \underset{p \in P}{\arg\min}\; D_{\mathrm{KL}}(p \parallel q),$$
where $D_{\mathrm{KL}}(p \parallel q)$ is the Kullback–Leibler divergence from q to p. Viewing the Kullback–Leibler divergence as a measure of distance, the I-projection $p^*$ is the "closest" distribution to q among all the distributions in P.

The I-projection is useful in setting up information geometry, notably because of the following inequality, valid when P is convex:[1]

$$D_{\mathrm{KL}}(p \parallel q) \;\geq\; D_{\mathrm{KL}}(p \parallel p^*) + D_{\mathrm{KL}}(p^* \parallel q) \quad \text{for all } p \in P.$$

This inequality can be interpreted as an information-geometric version of Pythagoras' triangle-inequality theorem, where KL divergence is viewed as squared distance in a Euclidean space.

It is worthwhile to note that since $D_{\mathrm{KL}}(p \parallel q) \geq 0$ and is continuous in p, if P is closed and non-empty, then there exists at least one minimizer to the optimization problem framed above. Furthermore, if P is convex, then the optimal distribution is unique.

The reverse I-projection, also known as moment projection or M-projection, is

$$p^* = \underset{p \in P}{\arg\min}\; D_{\mathrm{KL}}(q \parallel p).$$
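A minimal numerical sketch (not from the article) may help make the definition and the Pythagorean inequality concrete. Here q is a distribution on the four-point support {0, 1, 2, 3}, P is the convex set of distributions with mean exactly 2, and the I-projection is found with a generic constrained optimizer; the specific q, the mean constraint, and the use of scipy are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def kl(p, q):
    """Kullback-Leibler divergence D_KL(p || q) for discrete distributions."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

x = np.arange(4)                       # support {0, 1, 2, 3}
q = np.array([0.4, 0.3, 0.2, 0.1])     # distribution to be projected (assumed example)

# Convex set P: all distributions p on {0, 1, 2, 3} with mean exactly 2.
constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},
    {"type": "eq", "fun": lambda p: p @ x - 2.0},
]
bounds = [(1e-12, 1.0)] * 4
p0 = np.full(4, 0.25)

# I-projection: minimize D_KL(p || q) over p in P.
res = minimize(lambda p: kl(p, q), p0, bounds=bounds, constraints=constraints)
p_star = res.x / res.x.sum()

# Check the Pythagorean inequality against an arbitrary member p of P.
p = np.array([0.1, 0.2, 0.3, 0.4])     # also has mean 2, so p lies in P
lhs = kl(p, q)
rhs = kl(p, p_star) + kl(p_star, q)
print("I-projection p* =", np.round(p_star, 4))
print(f"D(p||q) = {lhs:.4f} >= D(p||p*) + D(p*||q) = {rhs:.4f}")
```

Because P here is defined by linear constraints, the inequality in fact holds with near-equality for the computed minimizer, which is the information-geometric analogue of the foot of a perpendicular in Euclidean geometry.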
Since the KL divergence is not symmetric in its arguments, the I-projection and the M-projection exhibit different behavior. For the I-projection, $p^*$ will typically under-estimate the support of $q$ and will lock onto one of its modes. This is because $p(x) = 0$ is required whenever $q(x) = 0$ in order to keep the KL divergence finite. For the M-projection, $p^*$ will typically over-estimate the support of $q$. This is because $p(x) > 0$ is required whenever $q(x) > 0$ in order to keep the KL divergence finite. The reverse I-projection plays a fundamental role in the construction of optimal e-variables.
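The contrast between the two projections can be seen in a small numerical sketch (an illustrative assumption, not taken from the article): q is a bimodal mixture on a grid, P is the family of discretized Gaussians parameterized by (mu, sigma), and both objectives are minimized by a crude grid search. The grid, the target q, and the search ranges are all assumed for illustration.

```python
import numpy as np

x = np.linspace(-6, 6, 601)

def normalize(w):
    return w / w.sum()

def gaussian(mu, sigma):
    w = normalize(np.exp(-0.5 * ((x - mu) / sigma) ** 2))
    return np.maximum(w, 1e-300)       # floor to keep logs finite numerically

def kl(p, q):
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Bimodal target: an equal mixture of two well-separated Gaussians.
q = normalize(0.5 * gaussian(-2.0, 0.5) + 0.5 * gaussian(2.0, 0.5))

# Grid search over the Gaussian family P.
mus = np.linspace(-3, 3, 121)
sigmas = np.linspace(0.2, 3, 57)
best_I, best_M = None, None
for mu in mus:
    for sigma in sigmas:
        p = gaussian(mu, sigma)
        dI = kl(p, q)   # I-projection objective: D_KL(p || q)
        dM = kl(q, p)   # M-projection objective: D_KL(q || p)
        if best_I is None or dI < best_I[0]:
            best_I = (dI, mu, sigma)
        if best_M is None or dM < best_M[0]:
            best_M = (dM, mu, sigma)

# Expected outcome: the I-projection locks onto one mode (mu near +/-2, small sigma),
# while the M-projection spreads over both modes (mu near 0, large sigma).
print("I-projection: mu = %.2f, sigma = %.2f" % best_I[1:])
print("M-projection: mu = %.2f, sigma = %.2f" % best_M[1:])
```

The mode-seeking result for the I-projection and the mass-covering result for the M-projection reflect exactly the support arguments above: minimizing $D_{\mathrm{KL}}(p \parallel q)$ penalizes placing mass of p where q is small, while minimizing $D_{\mathrm{KL}}(q \parallel p)$ penalizes leaving any region with q-mass uncovered by p.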