Abstract
We develop two approaches for analyzing the approximation error bound for the Nyström method, which approximates a positive semidefinite (PSD) matrix by sampling a small set of columns: one based on a concentration inequality for integral operators, and one based on random matrix theory. We show that the approximation error, measured in the spectral norm, can be improved from O(N/√m) to O(N/m^{1−ρ}) in the case of a large eigengap, where N is the total number of data points, m is the number of sampled data points, and ρ ∈ (0, 1/2) is a positive constant that characterizes the eigengap. When the eigenvalues of the kernel matrix follow a p-power law, our analysis based on random matrix theory further improves the bound to O(N/m^{p−1}) under an incoherence assumption. We present a kernel classification approach based on the Nyström method and derive its generalization performance using the improved bound. We show that when the eigenvalues of the kernel matrix follow a p-power law, we can reduce the number of support vectors to N^{2p/(p²−1)}, which is sublinear in N when p > 1 + √2, without seriously sacrificing its generalization performance.
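As a concrete illustration of the technique the abstract analyzes, the following is a minimal Python sketch (not code from the paper) of the standard Nyström construction: sample m columns of a PSD kernel matrix K, form the column block C and the intersection block W, and approximate K ≈ C W⁺ Cᵀ, measuring the error in the spectral norm. The function name `nystrom_approximation`, the uniform sampling scheme, and the RBF test kernel are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def nystrom_approximation(K, m, seed=None):
    """Nystrom approximation of a PSD matrix K (N x N) from m sampled columns.

    Returns K_hat = C @ pinv(W) @ C.T, where C collects the sampled columns
    and W is the m x m block of K indexed by the sampled rows/columns.
    """
    rng = np.random.default_rng(seed)
    N = K.shape[0]
    idx = rng.choice(N, size=m, replace=False)   # uniform column sampling (illustrative)
    C = K[:, idx]                                # N x m column block
    W = K[np.ix_(idx, idx)]                      # m x m intersection block
    return C @ np.linalg.pinv(W) @ C.T

# Example: RBF kernel on synthetic data, spectral-norm approximation error.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / 2.0)                      # PSD RBF kernel matrix

K_hat = nystrom_approximation(K, m=50, seed=1)
err = np.linalg.norm(K - K_hat, ord=2)           # spectral norm ||K - K_hat||_2
print(f"spectral-norm error with m=50 sampled columns: {err:.4f}")
```

The bounds in the abstract describe how this spectral-norm error scales with N and m under the stated eigengap or p-power-law eigenvalue assumptions.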
Original language | English (US)
---|---
Article number | 6547995
Pages (from-to) | 6939-6949
Number of pages | 11
Journal | IEEE Transactions on Information Theory
Volume | 59
Issue number | 10
DOIs |
State | Published - 2013
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences