Approximation of the measured Kullback–Leibler divergence as a generalized chi-squared random variable

    Research output: Contribution to journal › Article › peer-review

    Abstract

    The Kullback–Leibler (KL) divergence between a pair of parameterized probability distributions becomes a random variable when the parameters are equipped with a prior distribution. In the case of exponential family distributions, the canonical coordinates can be drawn from a conjugate prior distribution, itself also in the exponential family. Using the Bayesian Central Limit Theorem, we show that, as long as the certainty associated with the conjugate prior is high, the measured KL divergence between two sampled exponential family distributions is approximately distributed as a generalized chi-squared random variable. Furthermore, when the two distributions are drawn from the same conjugate prior, the KL divergence is approximately chi-squared distributed. We show that similar results hold when one of the likelihood distributions is held fixed or is projected onto a submanifold via information projection. The chi-squared approximations are demonstrated for normal and categorical distributions. Additionally, these results are used to formulate a generic form of chi-squared hypothesis testing for exponential family distributions, which reduces to the standard G-test as a special case.
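
    To illustrate the flavor of the result, consider the simplest case not covered by the paper's general machinery but consistent with it: two univariate normal likelihoods with the same known variance sigma2, whose means are drawn independently from the same normal (conjugate) prior with variance tau2. Here the KL divergence has a closed form, and the rescaled divergence (sigma2 / tau2) * KL is exactly chi-squared with one degree of freedom, whereas the paper's statement for general exponential families is an approximation valid when the prior is concentrated. The Monte Carlo sketch below is ours, not code from the paper, and the scaling constant sigma2 / tau2 is specific to this toy case.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    sigma2 = 1.0   # known variance of both normal likelihoods
    tau2 = 0.05    # prior variance on the mean; small tau2 = a "certain" prior
    n = 100_000    # Monte Carlo sample size

    # Draw the two means independently from the same conjugate (normal) prior.
    mu1 = rng.normal(0.0, np.sqrt(tau2), size=n)
    mu2 = rng.normal(0.0, np.sqrt(tau2), size=n)

    # Closed form: KL( N(mu1, sigma2) || N(mu2, sigma2) ) = (mu1 - mu2)^2 / (2 * sigma2).
    kl = (mu1 - mu2) ** 2 / (2.0 * sigma2)

    # Since mu1 - mu2 ~ N(0, 2 * tau2), the rescaled divergence
    # (sigma2 / tau2) * KL = ((mu1 - mu2) / sqrt(2 * tau2))^2
    # is a squared standard normal, i.e. chi-squared with 1 degree of freedom.
    scaled = (sigma2 / tau2) * kl
    print(stats.kstest(scaled, stats.chi2(df=1).cdf))  # large p-value expected
    ```

    The categorical case connects to the G-test mentioned in the abstract: the classical statistic G = 2 * sum_i O_i * ln(O_i / E_i) equals 2n * KL(p_hat || p_0), where p_hat is the empirical distribution over n observations and p_0 the null, so a chi-squared law for the measured KL divergence recovers the usual chi-squared reference distribution for G.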

    Original language: English (US)
    Pages (from-to): 231-264
    Number of pages: 34
    Journal: Information Geometry
    Volume: 8
    Issue number: 2
    DOIs
    State: Published - Nov 2025

    All Science Journal Classification (ASJC) codes

    • Statistics and Probability
    • Geometry and Topology
    • Computer Science Applications
    • Computational Theory and Mathematics
    • Applied Mathematics
