Nyström Kernel Stein Discrepancy

Research output: Contribution to journal › Conference article › peer-review

Abstract

Kernel methods underpin many of the most successful approaches in data science and statistics, and they allow one to represent probability measures as elements of a reproducing kernel Hilbert space without loss of information. Recently, the kernel Stein discrepancy (KSD), which combines Stein's method with the flexibility of kernel techniques, has gained considerable attention. Through the Stein operator, KSD allows the construction of powerful goodness-of-fit tests where it is sufficient to know the target distribution up to a multiplicative constant. However, the typical U- and V-statistic-based KSD estimators suffer from a quadratic runtime complexity, which hinders their application in large-scale settings. In this work, we propose a Nyström-based KSD acceleration, with runtime O(mn + m³) for n samples and m ≪ n Nyström points, show its √n-consistency under a classical sub-Gaussian assumption, and demonstrate its applicability to goodness-of-fit testing on a suite of benchmarks. We also show the √n-consistency of the quadratic-time KSD estimator.
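The construction summarized in the abstract can be illustrated with a small NumPy sketch. This is not the authors' implementation: it assumes a Langevin Stein kernel built from an RBF base kernel with a known score function s_p(x) = ∇ log p(x), and uses uniform subsampling for the m Nyström points. The quadratic-time V-statistic averages the full n×n Stein kernel matrix; the Nyström variant only ever forms the n×m and m×m blocks, giving the O(mn + m³) cost mentioned above.

```python
import numpy as np

def stein_kernel(X, Y, score, sigma=1.0):
    """Langevin Stein kernel h_p(x, y) for an RBF base kernel
    k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).

    X: (n, d), Y: (m, d) sample arrays; score(x) = grad log p(x),
    applied row-wise. Returns the (n, m) Stein kernel matrix."""
    sx, sy = score(X), score(Y)                  # (n, d), (m, d)
    diff = X[:, None, :] - Y[None, :, :]         # (n, m, d)
    sq = np.sum(diff ** 2, axis=-1)              # squared distances
    K = np.exp(-sq / (2 * sigma ** 2))
    d = X.shape[1]
    term1 = sx @ sy.T                            # s_p(x)^T s_p(y)
    term2 = np.einsum('id,ijd->ij', sx, diff) / sigma ** 2
    term3 = -np.einsum('jd,ijd->ij', sy, diff) / sigma ** 2
    term4 = d / sigma ** 2 - sq / sigma ** 4     # trace term
    return K * (term1 + term2 + term3 + term4)

def nystrom_ksd(X, score, m=50, sigma=1.0, rng=None):
    """Nystrom-accelerated V-statistic KSD^2 estimate, O(nm + m^3).

    Approximates (1/n^2) 1^T H 1 by
    (1/n^2) 1^T H_nm H_mm^+ H_nm^T 1, so the full n x n Stein
    kernel matrix H is never formed."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)   # uniform Nystrom points
    Xm = X[idx]
    H_nm = stein_kernel(X, Xm, score, sigma)     # (n, m) block, O(nm)
    H_mm = stein_kernel(Xm, Xm, score, sigma)    # (m, m) block
    b = H_nm.mean(axis=0)                        # (1/n) H_nm^T 1
    # H_mm is PSD, so its pseudoinverse is PSD and the estimate
    # b^T H_mm^+ b is nonnegative.
    return b @ np.linalg.pinv(H_mm) @ b
```

As a sanity check, for samples drawn from a standard normal target (score(x) = -x) the estimate should be close to zero, while mean-shifted samples should yield a clearly larger value; the bandwidth and m here are illustrative choices, not tuned values from the paper.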

Original language: English (US)
Pages (from-to): 388-396
Number of pages: 9
Journal: Proceedings of Machine Learning Research
Volume: 258
State: Published - 2025
Event: 28th International Conference on Artificial Intelligence and Statistics, AISTATS 2025 - Mai Khao, Thailand
Duration: May 3, 2025 - May 5, 2025

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
