How to find a joint probability distribution of minimum entropy (almost) given the marginals
Cicalese, Ferdinando
2017-01-01
Abstract
Given two discrete random variables X and Y, with probability distributions p = (p_1, ..., p_n) and q = (q_1, ..., q_m), respectively, denote by C(p, q) the set of all joint distributions of X and Y that have p and q as marginals. In this paper, we study the problem of finding the joint probability distribution in C(p, q) of minimum entropy (equivalently, since the marginal entropies are fixed and H(X, Y) = H(X) + H(Y) - I(X; Y), the joint probability distribution that maximizes the mutual information between X and Y), and we discuss several situations where the need for this kind of optimization naturally arises. Since the optimization problem is known to be NP-hard, we give an efficient algorithm that finds a joint probability distribution in C(p, q) with entropy exceeding the minimum possible by at most 1, thus providing an approximation algorithm with an additive approximation guarantee of 1. We also discuss some related consequences of our findings.
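
To make the coupling problem concrete, the following minimal Python sketch implements a simple greedy heuristic: repeatedly pair the largest remaining masses of the two marginals and assign their minimum to a joint cell. This is an illustrative sketch only, not the paper's algorithm, and it carries no additive-1 guarantee; the names greedy_coupling and entropy are hypothetical helpers introduced here. Since H(X, Y) >= max(H(X), H(Y)) for any coupling, max(H(p), H(q)) serves as a lower bound to compare against.

```python
import heapq
import math

def greedy_coupling(p, q):
    """Greedy pairing: repeatedly match the largest remaining masses of the
    two marginals and place their minimum on a joint cell (illustrative
    heuristic; not the paper's additive-1 algorithm)."""
    hp = [(-pi, i) for i, pi in enumerate(p)]   # max-heap over p via negation
    hq = [(-qj, j) for j, qj in enumerate(q)]   # max-heap over q via negation
    heapq.heapify(hp)
    heapq.heapify(hq)
    joint = {}      # maps (i, j) -> probability mass assigned to that cell
    eps = 1e-12     # tolerance for floating-point residue
    while hp and hq:
        npi, i = heapq.heappop(hp)
        nqj, j = heapq.heappop(hq)
        pi, qj = -npi, -nqj
        m = min(pi, qj)                          # mass placed on cell (i, j)
        joint[(i, j)] = joint.get((i, j), 0.0) + m
        if pi - m > eps:                         # push back leftover p-mass
            heapq.heappush(hp, (-(pi - m), i))
        if qj - m > eps:                         # push back leftover q-mass
            heapq.heappush(hq, (-(qj - m), j))
    return joint

def entropy(masses):
    """Shannon entropy in bits of an iterable of probability masses."""
    return -sum(v * math.log2(v) for v in masses if v > 0)

p = [0.5, 0.25, 0.25]
q = [0.6, 0.4]
joint = greedy_coupling(p, q)
print(sorted(joint.items()))
print(entropy(joint.values()))       # entropy of the heuristic coupling
print(max(entropy(p), entropy(q)))   # lower bound on the minimum entropy
```

Any table produced this way has the required marginals: each step removes identical mass from one row total and one column total, so every row i accumulates exactly p_i and every column j exactly q_j.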