
Minimum-Entropy Couplings and Their Applications

Cicalese, Ferdinando;
2019-01-01

Abstract

Given two discrete random variables X and Y, with probability distributions p = (p(1), ..., p(n)) and q = (q(1), ..., q(m)), respectively, let us denote by C(p, q) the set of all couplings of p and q, that is, the set of all bivariate probability distributions that have p and q as marginals. In this paper, we study the problem of finding a joint probability distribution in C(p, q) of minimum entropy (equivalently, a coupling that maximizes the mutual information between X and Y), and we discuss several situations where the need for this kind of optimization naturally arises. Since the optimization problem is known to be NP-hard, we give an efficient algorithm to find a joint probability distribution in C(p, q) with entropy exceeding the minimum possible by at most 1 bit, thus providing an approximation algorithm with an additive gap of at most 1 bit. Leveraging this algorithm, we extend our result to the problem of finding a minimum-entropy joint distribution of arbitrary k >= 2 discrete random variables X_1, ..., X_k, consistent with the known k marginal distributions of the individual random variables X_1, ..., X_k. In this case, our algorithm has an additive gap of at most log k from optimum. We also discuss several related applications of our findings and extensions of our results to entropies different from the Shannon entropy.
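To make the coupling problem concrete, the following is a minimal sketch of a simple greedy heuristic for building a low-entropy coupling: repeatedly match the largest remaining masses of p and q and place their minimum in a joint cell. This is an illustrative heuristic only, not necessarily the algorithm analyzed in the paper; all function names are hypothetical.

```python
import heapq
import math

def greedy_coupling(p, q):
    """Greedy heuristic: repeatedly pair the largest remaining mass of p
    with the largest remaining mass of q, assign min of the two to a joint
    cell, and push back any leftover. Illustrative sketch only."""
    # Max-heaps simulated by negating values; entries carry original indices.
    hp = [(-pi, i) for i, pi in enumerate(p)]
    hq = [(-qj, j) for j, qj in enumerate(q)]
    heapq.heapify(hp)
    heapq.heapify(hq)
    joint = {}  # (i, j) -> probability mass of the coupling
    while hp and hq:
        neg_pi, i = heapq.heappop(hp)
        neg_qj, j = heapq.heappop(hq)
        pi, qj = -neg_pi, -neg_qj
        m = min(pi, qj)
        if m > 1e-12:
            joint[(i, j)] = joint.get((i, j), 0.0) + m
        # Push back whatever mass is left over on either side.
        if pi - m > 1e-12:
            heapq.heappush(hp, (-(pi - m), i))
        if qj - m > 1e-12:
            heapq.heappush(hq, (-(qj - m), j))
    return joint

def entropy(masses):
    """Shannon entropy (in bits) of a collection of probability masses."""
    return -sum(m * math.log2(m) for m in masses if m > 0)

# Example: p majorizes the joint, so H(joint) is at least max(H(p), H(q)).
p = [0.5, 0.25, 0.25]
q = [0.5, 0.5]
J = greedy_coupling(p, q)
```

By construction the row and column sums of the returned joint distribution recover p and q, so it is a valid element of C(p, q); the heuristic tends to concentrate mass and thus keep the joint entropy low, though it carries no optimality guarantee here.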
Keywords: Entropy minimization; mutual information maximization; coupling; majorization
Files in this record:
File: Minimization-revised-18Jan2019.pdf (authorized users only)
Type: Post-print document
License: Restricted access
Size: 479.69 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/1011976
Citations
  • Scopus: 17
  • Web of Science (ISI): 13