Regularized robust estimation of mean and covariance matrix under heavy-tailed distributions

Ying Sun, Prabhu Babu, Daniel P. Palomar

Research output: Contribution to journal › Article › peer-review



In this paper, the joint mean-covariance estimation problem is considered under the scenario that the number of samples is small relative to the problem dimension. The samples are assumed to be drawn independently from a heavy-tailed distribution of the elliptical family, which can model scenarios where the commonly adopted Gaussian assumption is violated, either because of the data-generating process or because of contamination by outliers. Under the assumption that prior knowledge of the mean and covariance matrix is available, we propose a regularized estimator defined as the minimizer of a penalized loss function, which combines the prior information with the information provided by the samples. The loss function is chosen to be the negative log-likelihood function of the Cauchy distribution, as a conservative representative of heavy-tailed distributions, and the penalty term is constructed so that the prior is its global minimizer. The resulting regularized estimator shrinks the mean and the covariance matrix toward the prior target. The existence and uniqueness of the estimator for finite samples are established under certain regularity conditions. Numerical algorithms with guaranteed convergence are derived for the estimator based on the majorization-minimization framework, and simulation results demonstrate that the proposed estimator achieves better estimation accuracy than the benchmark estimators.
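The kind of iteration the abstract describes can be sketched as a fixed-point (MM-style) update: each step down-weights samples with large Mahalanobis distance via the Cauchy weight (d + 1)/(1 + q), then shrinks the weighted mean and scatter toward the prior targets. The sketch below is illustrative only: the convex-combination shrinkage with hyperparameter `rho`, the stopping tolerance, and the function name are assumptions, not the paper's exact penalized updates.

```python
import numpy as np

def regularized_cauchy_estimator(X, mu0, Sigma0, rho=0.1, n_iter=200, tol=1e-8):
    """Illustrative MM-style fixed-point iteration for joint mean/covariance
    estimation under a Cauchy loss, shrinking toward prior targets (mu0, Sigma0).
    The shrinkage step is a simple convex combination with weight rho — an
    assumption for this sketch, not the paper's exact penalty."""
    n, d = X.shape
    mu, Sigma = mu0.astype(float).copy(), Sigma0.astype(float).copy()
    for _ in range(n_iter):
        diff = X - mu                                     # (n, d) centered samples
        # Per-sample squared Mahalanobis distance q_i = diff_i' Sigma^{-1} diff_i
        Q = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma), diff)
        w = (d + 1) / (1.0 + Q)                           # Cauchy weights: outliers are down-weighted
        mu_new = (1 - rho) * (w @ X) / w.sum() + rho * mu0
        S = (diff * w[:, None]).T @ diff / n              # weighted scatter matrix
        Sigma_new = (1 - rho) * S + rho * Sigma0          # shrink toward the prior target
        converged = (np.linalg.norm(mu_new - mu)
                     + np.linalg.norm(Sigma_new - Sigma, 'fro')) < tol
        mu, Sigma = mu_new, Sigma_new
        if converged:
            break
    return mu, Sigma

# Usage sketch: heavy-tailed (Student-t, 2 degrees of freedom) samples in d = 3
rng = np.random.default_rng(0)
X = rng.standard_t(df=2, size=(50, 3))
mu_hat, Sigma_hat = regularized_cauchy_estimator(X, np.zeros(3), np.eye(3))
```

Because the scatter term is a weighted sum of outer products and the prior target is positive definite, each iterate `Sigma_new` stays symmetric positive definite, which is what makes the fixed-point iteration well defined at every step.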

Original language: English (US)
Article number: 7069228
Pages (from-to): 3096-3109
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Issue number: 12
State: Published - Jun 15 2015

All Science Journal Classification (ASJC) codes

  • Signal Processing
  • Electrical and Electronic Engineering

