Equivalence of distance-based and RKHS-based statistics in hypothesis testing

Dino Sejdinovic, Bharath Sriperumbudur, Arthur Gretton, Kenji Fukumizu

Research output: Contribution to journal › Article › peer-review

347 Scopus citations

Abstract

We provide a unifying framework linking two classes of statistics used in two-sample and independence testing: on the one hand, the energy distances and distance covariances from the statistics literature; on the other, maximum mean discrepancies (MMD), that is, distances between embeddings of distributions into reproducing kernel Hilbert spaces (RKHS), as established in machine learning. In the case where the energy distance is computed with a semimetric of negative type, a positive definite kernel, termed the distance kernel, may be defined such that the MMD corresponds exactly to the energy distance. Conversely, for any positive definite kernel, we can interpret the MMD as an energy distance with respect to some negative-type semimetric. This equivalence readily extends to distance covariance using kernels on the product space. We determine the class of probability distributions for which the test statistics are consistent against all alternatives. Finally, we investigate the performance of the family of distance kernels in two-sample and independence tests: we show in particular that the energy distance most commonly employed in statistics is just one member of a parametric family of kernels, and that other choices from this family can yield more powerful tests.
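To make the equivalence concrete, the following is a minimal numerical sketch, not taken from the paper; the Gaussian samples, the Euclidean semimetric, and the base point x0 are illustrative assumptions. With rho(x, y) = ||x - y||, the distance-induced kernel k(x, y) = (rho(x, x0) + rho(y, x0) - rho(x, y)) / 2 satisfies MMD^2 = (1/2) * energy distance, so the two V-statistic estimates below agree up to floating-point error.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))   # sample from P (illustrative)
Y = rng.normal(0.5, 1.0, size=(200, 2))   # sample from Q (illustrative)

def pairwise_dist(A, B):
    """Euclidean distance matrix rho(a_i, b_j)."""
    return np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)

def energy_distance(X, Y):
    """V-statistic estimate of 2 E rho(X,Y) - E rho(X,X') - E rho(Y,Y')."""
    return (2 * pairwise_dist(X, Y).mean()
            - pairwise_dist(X, X).mean()
            - pairwise_dist(Y, Y).mean())

def distance_kernel(A, B, x0):
    """Distance-induced kernel k(a,b) = (rho(a,x0) + rho(b,x0) - rho(a,b)) / 2."""
    da = np.linalg.norm(A - x0, axis=1)
    db = np.linalg.norm(B - x0, axis=1)
    return (da[:, None] + db[None, :] - pairwise_dist(A, B)) / 2

def mmd_squared(X, Y, x0):
    """V-statistic estimate of E k(X,X') + E k(Y,Y') - 2 E k(X,Y)."""
    return (distance_kernel(X, X, x0).mean()
            + distance_kernel(Y, Y, x0).mean()
            - 2 * distance_kernel(X, Y, x0).mean())

x0 = np.zeros(2)                     # arbitrary base point for the kernel
print(energy_distance(X, Y))        # equals 2 * MMD^2 ...
print(2 * mmd_squared(X, Y, x0))    # ... up to floating-point error

The identity holds for any choice of base point x0: changing x0 changes the distance kernel, but the resulting MMD is determined by the semimetric alone.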

Original language: English (US)
Pages (from-to): 2263-2291
Number of pages: 29
Journal: Annals of Statistics
Volume: 41
Issue number: 5
DOIs
State: Published - Oct 2013

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
