Abstract
In this paper, the joint mean-covariance estimation problem is considered in the scenario where the number of samples is small relative to the problem dimension. The samples are assumed to be drawn independently from a heavy-tailed distribution of the elliptical family, which can model scenarios where the commonly adopted Gaussian assumption is violated either because of the data-generating process or because of contamination by outliers. Under the assumption that prior knowledge of the mean and covariance matrix is available, we propose a regularized estimator defined as the minimizer of a penalized loss function, which combines the prior information with the information provided by the samples. The loss function is chosen to be the negative log-likelihood of the Cauchy distribution, as a conservative representative of heavy-tailed distributions, and the penalty term is constructed so that the prior is its global minimizer. The resulting regularized estimator shrinks the mean and the covariance matrix toward the prior target. The existence and uniqueness of the estimator for finite samples are established under certain regularity conditions. Numerical algorithms with guaranteed convergence are derived for the estimator based on the majorization-minimization framework, and simulation results demonstrate that the proposed estimator achieves better estimation accuracy than the benchmark estimators.
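To fix ideas, the sketch below illustrates the general flavor of such a majorization-minimization scheme: a fixed-point iteration with Cauchy (Student-t, one degree of freedom) weights and shrinkage of the mean and covariance toward prior targets. It is a minimal illustrative sketch only; the shrinkage weight `alpha`, the particular form of the regularized updates, and the stopping rule are assumptions made here for illustration, not the estimator or algorithm derived in the paper.

```python
# Illustrative sketch: MM-style fixed-point iteration with Cauchy weights and
# shrinkage toward prior targets. The update forms and `alpha` are assumptions
# for illustration, not the paper's exact regularized estimator.
import numpy as np

def regularized_cauchy_mm(X, mu0, Sigma0, alpha=0.5, n_iter=200, tol=1e-8):
    """X: (n, d) samples; mu0, Sigma0: prior targets; alpha >= 0: shrinkage weight."""
    n, d = X.shape
    mu, Sigma = mu0.copy(), Sigma0.copy()
    for _ in range(n_iter):
        diff = X - mu
        # Mahalanobis distances under the current covariance estimate
        sol = np.linalg.solve(Sigma, diff.T).T          # (n, d)
        dist2 = np.einsum('ij,ij->i', diff, sol)        # (n,)
        # Cauchy weights: heavy-tailed down-weighting of outlying samples
        w = (d + 1.0) / (1.0 + dist2)
        # Weighted mean, shrunk toward the prior target mu0
        mu_new = (w @ X + alpha * mu0) / (w.sum() + alpha)
        diff = X - mu_new
        # Weighted scatter, shrunk toward the prior target Sigma0
        S = (diff * w[:, None]).T @ diff / n
        Sigma_new = (S + alpha * Sigma0) / (1.0 + alpha)
        if (np.linalg.norm(mu_new - mu) < tol
                and np.linalg.norm(Sigma_new - Sigma) < tol):
            mu, Sigma = mu_new, Sigma_new
            break
        mu, Sigma = mu_new, Sigma_new
    return mu, Sigma
```

In this sketch, `alpha = 0` reduces to a plain Cauchy-loss fixed-point iteration, while larger `alpha` pulls the estimates toward the hypothetical prior targets `mu0` and `Sigma0`.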
| Original language | English (US) |
| --- | --- |
| Article number | 7069228 |
| Pages (from-to) | 3096-3109 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Signal Processing |
| Volume | 63 |
| Issue number | 12 |
| DOIs | |
| State | Published - Jun 15 2015 |
All Science Journal Classification (ASJC) codes
- Signal Processing
- Electrical and Electronic Engineering