Abstract
Consider a mixture problem consisting of k classes. Suppose we observe an s-dimensional random vector X whose distribution is specified by the relations P(X ∈ A | Y = i) = Pi(A), where Y is an unobserved class identifier defined on {1, ..., k} with distribution P(Y = i) = pi. Assuming that the distributions Pi share a common covariance matrix, elegant identities are presented that connect the matrix of Fisher information in Y on the parameters p1, ..., pk, the matrix of linear information in X, and the Mahalanobis distances between the pairs of Pi's. Since the parameters are not free, the information matrices are singular and the technique of generalized inverses is used. A matrix extension of the Mahalanobis distance and its invariant forms are introduced that are of interest in their own right. In terms of parameter estimation, the results provide an upper bound, independent of the parameters, on the loss of accuracy incurred by estimating p1, ..., pk from a sample of X's, as compared with the ideal estimator based on a random sample of Y's.
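The flavor of the result can be illustrated numerically in the simplest two-class case. The sketch below is an illustration under stated assumptions, not the paper's derivation: for k = 2 with means mu1, mu2, common covariance Sigma, and mixing probability p (all hypothetical names), it compares the Fisher information in Y on p with the linear information in X computed from the first two moments of X, and checks that the difference of the corresponding asymptotic variances equals the reciprocal of the squared Mahalanobis distance, independently of p.

```python
import numpy as np

# Minimal sketch for the two-class case (k = 2); names are illustrative.
rng = np.random.default_rng(0)
s = 3                                   # dimension of X
mu1, mu2 = rng.normal(size=s), rng.normal(size=s)
A = rng.normal(size=(s, s))
Sigma = A @ A.T + s * np.eye(s)         # common covariance matrix
p = 0.3                                 # mixing probability P(Y = 1)

delta = mu1 - mu2
maha2 = delta @ np.linalg.solve(Sigma, delta)   # squared Mahalanobis distance

# Fisher information in Y on the free parameter p (binomial case).
I_Y = 1.0 / (p * (1.0 - p))

# Linear information in X on p: (dE[X]/dp)' Cov(X)^{-1} (dE[X]/dp),
# where E[X] = p*mu1 + (1-p)*mu2 and Cov(X) = Sigma + p(1-p) delta delta'.
V = Sigma + p * (1.0 - p) * np.outer(delta, delta)
I_X = delta @ np.linalg.solve(V, delta)

# Loss of accuracy (difference of asymptotic variances) equals 1/maha2,
# independent of p -- the two printed numbers agree.
print(1.0 / I_X - 1.0 / I_Y, 1.0 / maha2)
```

In this two-class sketch the loss of accuracy is exactly the reciprocal of the squared Mahalanobis distance; the paper's matrix identities and generalized inverses handle the general k-class case, where the constraint p1 + ... + pk = 1 makes the information matrices singular.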
| Original language | English (US) |
|---|---|
| Pages (from-to) | 3950-3959 |
| Number of pages | 10 |
| Journal | Journal of Statistical Planning and Inference |
| Volume | 138 |
| Issue number | 12 |
| DOIs | |
| State | Published - Dec 1 2008 |
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Statistics, Probability and Uncertainty
- Applied Mathematics