Bounds on the Entropy of a Function of a Random Variable and their Applications

Ferdinando Cicalese
2018-01-01

Abstract

It is well known that the entropy H(X) of a discrete random variable X is always greater than or equal to the entropy H(f(X)) of a function f of X, with equality if and only if f is one-to-one. In this paper, we give tight bounds on H(f(X)) when the function f is not one-to-one, and we illustrate a few scenarios where this matters. As an intermediate step towards our main result, we derive a lower bound on the entropy of a probability distribution, when only a bound on the ratio between the maximal and minimal probabilities is known. The lower bound improves on previous results in the literature, and it could find applications outside the present scenario.
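As a quick numeric illustration of the inequality H(f(X)) ≤ H(X) stated in the abstract, the following Python sketch computes both entropies for a small discrete distribution pushed through a non-one-to-one function. The distribution and the function f are hypothetical examples chosen for illustration; they are not taken from the paper.

```python
import math
from collections import defaultdict

def entropy(probs):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical distribution for X over the outcomes 0..3.
p_x = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}

def f(x):
    # Non-injective: merges outcomes 2 and 3 into the single value 2.
    return min(x, 2)

# Push the distribution of X forward through f to get the distribution of f(X).
p_fx = defaultdict(float)
for x, p in p_x.items():
    p_fx[f(x)] += p

h_x = entropy(p_x.values())
h_fx = entropy(p_fx.values())
print(f"H(X)    = {h_x:.4f} bits")   # 1.7500
print(f"H(f(X)) = {h_fx:.4f} bits")  # 1.5000, strictly smaller since f merges outcomes
```

Because f merges two outcomes of X, the inequality is strict here; with an injective f the two printed values would coincide.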
Keywords: Approximation algorithms, Entropy, Frequency modulation, NP-hard problem, Probability distribution, Random variables
Files associated with this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11562/974799
Citations
  • PMC: not available
  • Scopus: 15
  • Web of Science (ISI): 15