
A Mutual Information Inequality and Some Applications

Bibliographic Details
Published in: IEEE Transactions on Information Theory, 2023-10, Vol. 69 (10), p. 1-1
Main Authors: Lau, Chin Wa, Nair, Chandra, Ng, David
Format: Article
Language:English
Description
Summary: In this paper we derive an inequality relating linear combinations of mutual information between subsets of mutually independent random variables and an auxiliary random variable. One choice of a family of auxiliary variables leads to a new proof of a Stam-type inequality regarding the Fisher information of sums of independent random variables. Another choice of a family of auxiliary random variables leads to new results, as well as new proofs of known results, relating to strong data processing constants and maximal correlation between sums of independent random variables. Other results obtained include convexity of Kullback-Leibler divergence over a parameterized path along pairs of binomial and Poisson distributions, as well as a new duality-based argument relating the Stam-type inequality and the entropy power inequality.
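Background note (not part of this record): the summary refers to the classical Stam inequality and the entropy power inequality; a minimal statement of these standard results, with the usual (assumed) notation J for Fisher information, h for differential entropy, and N for entropy power, is sketched below for context.

\[
\frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)},
\qquad
J(X) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial x}\log f_X(x)\Big|_{x=X}\right)^{2}\right]
\quad \text{(Stam)},
\]
\[
N(X+Y) \;\ge\; N(X) + N(Y),
\qquad
N(X) = \frac{1}{2\pi e}\, e^{2h(X)}
\quad \text{(entropy power inequality)},
\]
for independent random variables \(X\) and \(Y\) with densities, with equality when \(X\) and \(Y\) are Gaussian. The paper's abstract indicates that one choice of auxiliary variables recovers a Stam-type inequality of this form and that a duality-based argument connects it to the entropy power inequality.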
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2023.3285928