Information gains from Monte Carlo Markov Chains

Bibliographic Details
Published in: The European Physical Journal Plus, 2020-05, Vol. 135(5), p. 393, Article 393
Main Authors: Mehrabi, Ahmad; Ahmadi, A.
Format: Article
Language: English
Description
Summary: In this paper, we present a novel method to compute the relative entropy, as well as the expected relative entropy, from an MCMC chain. The relative entropy from information theory can be used to quantify differences between the posterior distributions of a pair of experiments. In cosmology, the relative entropy has been proposed as a useful tool for model selection, experiment design, forecasting, and measuring the information gain from subsequent experiments. Unlike in the Gaussian case, these quantities are not available analytically in general, and one must resort to numerical methods, which are computationally very expensive. We propose a method, together with a Python package implementing it, to estimate the relative entropy as well as the expected relative entropy from an MCMC sample. We use the linear Gaussian model to check the accuracy of our code. Our results indicate that the relative error is below 0.2% for sample sizes larger than 10^5 in the linear Gaussian model. In addition, we study the robustness of our code in estimating the expected relative entropy in the Gaussian case.
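
To make the Gaussian accuracy check concrete, the sketch below shows a minimal Monte Carlo estimate of the relative entropy between two one-dimensional Gaussians, compared against the closed-form Gaussian result. This is an illustrative sketch only, not the authors' package or its API; all names and parameter values (mu_p, sig_p, mu_q, sig_q, the sample size n) are hypothetical choices for the example.

```python
# Minimal sketch (not the authors' code): Monte Carlo estimate of the
# relative entropy D(p || q) between two 1-D Gaussians, checked against
# the analytic Gaussian formula. All parameter values are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Two Gaussian "posteriors": p and q.
mu_p, sig_p = 0.0, 1.0
mu_q, sig_q = 0.5, 1.5

# Draw a sample from p, standing in for an MCMC chain.
n = 10**5
x = rng.normal(mu_p, sig_p, size=n)

# Monte Carlo estimator: D(p || q) = E_p[log p(x) - log q(x)].
log_p = stats.norm.logpdf(x, mu_p, sig_p)
log_q = stats.norm.logpdf(x, mu_q, sig_q)
d_mc = np.mean(log_p - log_q)

# Analytic relative entropy between two 1-D Gaussians, for comparison:
# D = ln(sig_q/sig_p) + (sig_p^2 + (mu_p - mu_q)^2) / (2 sig_q^2) - 1/2.
d_exact = (np.log(sig_q / sig_p)
           + (sig_p**2 + (mu_p - mu_q)**2) / (2 * sig_q**2) - 0.5)

print(f"MC estimate: {d_mc:.5f} nats")
print(f"analytic:    {d_exact:.5f} nats")
print(f"rel. error:  {abs(d_mc - d_exact) / d_exact:.2%}")
```

With n = 10^5 samples the relative error of this toy estimator is typically at the sub-percent level, which is the kind of Gaussian consistency check the abstract describes; the paper's method itself targets the general case where the posterior densities are not known analytically.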
ISSN: 2190-5444