Abstract
Kernel methods underpin many of the most successful approaches in data science and statistics, and they allow representing probability measures as elements of a reproducing kernel Hilbert space without loss of information. Recently, the kernel Stein discrepancy (KSD), which combines Stein's method with the flexibility of kernel techniques, has gained considerable attention. Through the Stein operator, KSD allows the construction of powerful goodness-of-fit tests where it is sufficient to know the target distribution up to a multiplicative constant. However, the typical U- and V-statistic-based KSD estimators suffer from a quadratic runtime complexity, which hinders their application in large-scale settings. In this work, we propose a Nyström-based KSD acceleration with runtime O(mn + m³) for n samples and m ≪ n Nyström points, show its √n-consistency under a classical sub-Gaussian assumption, and demonstrate its applicability for goodness-of-fit testing on a suite of benchmarks. We also show the √n-consistency of the quadratic-time KSD estimator.
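The abstract's runtime claim can be illustrated with a minimal sketch of the general Nyström recipe applied to KSD: the empirical Stein feature mean is projected onto the span of m landmark features, reducing the quadratic-time V-statistic to an O(mn + m³) computation. This is an illustrative sketch only, not the paper's exact estimator; it assumes a standard-normal target (score s(t) = −t), an RBF base kernel, and hypothetical helper names such as `ksd2_nystrom`.

```python
import numpy as np

def stein_kernel_gauss(x, y, sigma=1.0):
    """Stein (Langevin) kernel for a standard-normal target with an RBF base
    kernel k(x, y) = exp(-(x - y)^2 / (2 sigma^2)); returns h_p(x_i, y_j).
    Uses the score of N(0, 1), s(t) = -t, and the closed-form RBF derivatives."""
    d = x[:, None] - y[None, :]
    k = np.exp(-d**2 / (2 * sigma**2))
    return (x[:, None] * y[None, :]        # s(x) s(y) k
            - d**2 / sigma**2              # cross terms s(x) k_y + s(y) k_x
            + 1 / sigma**2 - d**2 / sigma**4) * k  # trace of k_xy

def ksd2_v(x, sigma=1.0):
    """Quadratic-time V-statistic KSD^2: mean of h over all n^2 sample pairs."""
    return stein_kernel_gauss(x, x, sigma).mean()

def ksd2_nystrom(x, m=50, sigma=1.0, seed=None):
    """Nystrom-style sketch with runtime O(mn + m^3): project the empirical
    Stein feature mean onto m landmark features and take its squared norm,
    b^T H_mm^+ b with b = (1/n) H_mn 1_n."""
    rng = np.random.default_rng(seed)
    xm = rng.choice(x, size=m, replace=False)          # m landmark points
    b = stein_kernel_gauss(xm, x, sigma).mean(axis=1)  # O(mn)
    H_mm = stein_kernel_gauss(xm, xm, sigma)           # m x m block
    return float(b @ np.linalg.pinv(H_mm) @ b)         # O(m^3)

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)  # samples drawn from the target: KSD^2 near 0
print(ksd2_v(x), ksd2_nystrom(x, m=100, seed=1))
```

Since these samples are drawn from the target itself, both estimates should be close to zero; under a mismatched sample distribution they would grow, which is what drives the goodness-of-fit test.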
| Original language | English (US) |
|---|---|
| Pages (from-to) | 388-396 |
| Number of pages | 9 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 258 |
| State | Published - 2025 |
| Event | 28th International Conference on Artificial Intelligence and Statistics (AISTATS 2025), Mai Khao, Thailand |
| Duration | May 3, 2025 → May 5, 2025 |
All Science Journal Classification (ASJC) codes
- Software
- Control and Systems Engineering
- Statistics and Probability
- Artificial Intelligence