TY - JOUR
T1 - Asymptotic performance of vector quantizers with a perceptual distortion measure
AU - Li, Jia
AU - Chaddha, Navin
AU - Gray, Robert M.
N1 - Funding Information:
Manuscript received May 1, 1997; revised September 1, 1998. This work was supported in part by the National Science Foundation under Grant NSF MIP-9016974. The material in this paper was presented in part at the IEEE International Symposium on Information Theory, Ulm, Germany, June 1997. J. Li and R. M. Gray are with the Information Systems Laboratory, Department of Electrical Engineering, Stanford University, Stanford, CA 94305 USA (e-mail: [email protected]; [email protected]). N. Chaddha is with Microsoft Corporation, Redmond, WA 98052 USA (e-mail: [email protected]). Communicated by R. Laroia, Associate Editor for Source Coding. Publisher Item Identifier S 0018-9448(99)03552-X.
PY - 1999
Y1 - 1999
AB - Gersho's bounds on the asymptotic performance of vector quantizers are valid for vector distortions which are powers of the Euclidean norm. Yamada, Tazaki, and Gray generalized the results to distortion measures that are increasing functions of the norm of their argument. In both cases, the distortion is uniquely determined by the vector quantization error, i.e., the Euclidean difference between the original vector and the codeword into which it is quantized. We generalize these asymptotic bounds to input-weighted quadratic distortion measures and measures that are approximately output-weighted-quadratic when the distortion is small, a class of distortion measures often claimed to be perceptually meaningful. An approximation of the asymptotic distortion based on Gersho's conjecture is derived as well. We also consider the problem of source mismatch, where the quantizer is designed using a probability density different from the true source density. The resulting asymptotic performance in terms of distortion increase in decibels is shown to be linear in the relative entropy between the true and estimated probability densities.
UR - http://www.scopus.com/inward/record.url?scp=0032643189&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0032643189&partnerID=8YFLogxK
U2 - 10.1109/18.761252
DO - 10.1109/18.761252
M3 - Article
AN - SCOPUS:0032643189
SN - 0018-9448
VL - 45
SP - 1082
EP - 1091
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 4
ER -