An identity for the Fisher information and Mahalanobis distance

Abram Kagan, Bing Li

Research output: Contribution to journal › Article › peer-review



Consider a mixture problem consisting of k classes. Suppose we observe an s-dimensional random vector X whose distribution is specified by the relations P(X ∈ A | Y = i) = Pi(A), where Y is an unobserved class identifier defined on {1, ..., k} with distribution P(Y = i) = pi. Assuming the distributions Pi have a common covariance matrix, elegant identities are presented that connect the matrix of Fisher information in Y on the parameters p1, ..., pk, the matrix of linear information in X, and the Mahalanobis distances between the pairs of Pi's. Since the parameters are not free, the information matrices are singular and the technique of generalized inverses is used. A matrix extension of the Mahalanobis distance and its invariant forms are introduced that are of interest in their own right. In terms of parameter estimation, the results provide a parameter-independent upper bound on the loss of accuracy incurred by estimating p1, ..., pk from a sample of Xs, as compared with the ideal estimator based on a random sample of Ys.
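As a hedged illustration of the setting described above (not the paper's actual derivation), the sketch below instantiates the simplest case, k = 2, with Gaussian class distributions sharing a covariance matrix. It computes the classical Mahalanobis distance between the two class distributions and the Fisher information in a single observation of Y about the one free mixing parameter p1 (since p2 = 1 − p1). All numerical values and variable names are illustrative assumptions.

```python
import numpy as np

# Illustrative two-class (k = 2) mixture: class distributions P1, P2 are
# taken to be Gaussians with a common covariance matrix Sigma, and the
# unobserved label Y = i has probability p_i. All values are assumptions.
p = np.array([0.3, 0.7])                           # mixing probabilities P(Y = i)
mu = [np.array([0.0, 0.0]), np.array([2.0, 1.0])]  # class means (s = 2)
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])                     # common covariance matrix

# Mahalanobis distance between P1 and P2:
#   D^2 = (mu1 - mu2)' Sigma^{-1} (mu1 - mu2)
diff = mu[0] - mu[1]
D2 = diff @ np.linalg.solve(Sigma, diff)

# Fisher information in one observation of Y about the free parameter p1
# (the parameters are not free: p2 = 1 - p1, so only p1 varies):
#   I_Y(p1) = 1 / (p1 * (1 - p1))
I_Y = 1.0 / (p[0] * (1.0 - p[0]))

print(f"Mahalanobis D^2 = {D2:.4f}")
print(f"Fisher information in Y about p1 = {I_Y:.4f}")
```

The paper's identities relate quantities of exactly these two kinds (together with the linear information in X), in matrix form for general k, where the singularity induced by the constraint p1 + ... + pk = 1 is handled via generalized inverses.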

Original language: English (US)
Pages (from-to): 3950-3959
Number of pages: 10
Journal: Journal of Statistical Planning and Inference
Issue number: 12
State: Published - Dec 1 2008

All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
  • Applied Mathematics

