Gradient algorithms for designing predictive vector quantizers

Pao Chi Chang, Robert M. Gray

Research output: Contribution to journal › Article › peer-review

79 Citations (Scopus)

Abstract

A predictive vector quantizer (PVQ) is a vector extension of a predictive quantizer. It consists of two parts: a conventional memoryless vector quantizer (VQ) and a vector predictor. Two gradient algorithms for designing a PVQ are developed in this paper: the steepest descent (SD) algorithm and the stochastic gradient (SG) algorithm. Both have the property of improving both the quantizer and the predictor in the sense of minimizing the distortion as measured by the average mean-squared error. The two design approaches differ in the period and the step size used in each iteration to update the codebook and predictor. The SG algorithm updates once for each input training vector and uses a small step size, while the SD algorithm updates only once over a long period, possibly one pass over the entire training sequence, and uses a relatively large step size. Code designs and tests are simulated both for Gauss-Markov sources and for sampled speech waveforms, and the results are compared to codes designed using techniques that attempt to optimize only the quantizer for the predictor and not vice versa.
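The closed-loop SG update described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a linear vector predictor (a matrix applied to the previous reconstruction), a small fixed step size, and an AR(1) Gauss-Markov training source; the variable names (`A`, `C`, `step_q`, `step_p`) are hypothetical. For each training vector, the residual is quantized with the nearest codeword, and both the selected codeword and the predictor are nudged along the (negative) gradient of the squared quantization error.

```python
import numpy as np

rng = np.random.default_rng(0)

k = 2           # vector dimension (assumed for illustration)
N = 16          # codebook size (assumed)
step_q = 0.05   # quantizer step size: small, per-vector, as in SG
step_p = 0.01   # predictor step size (assumed)

# Gauss-Markov (AR(1), coefficient 0.9) training source, blocked into k-vectors
a = 0.9
samples = np.empty(20000)
samples[0] = rng.standard_normal()
for t in range(1, samples.size):
    samples[t] = a * samples[t - 1] + rng.standard_normal()
X = samples.reshape(-1, k)

A = np.zeros((k, k))                    # linear vector predictor
C = 0.1 * rng.standard_normal((N, k))   # residual codebook
prev = np.zeros(k)                      # previous reconstruction (closed loop)
dists = []

for x in X:
    pred = A @ prev                              # vector prediction
    e = x - pred                                 # prediction residual
    i = np.argmin(((C - e) ** 2).sum(axis=1))    # nearest-codeword search
    recon = pred + C[i]                          # decoder's reconstruction
    grad = e - C[i]                              # residual quantization error
    C[i] += step_q * grad                        # SG codeword update
    A += step_p * np.outer(grad, prev)           # SG predictor update
    dists.append(float(grad @ grad))
    prev = recon

first = np.mean(dists[:1000])    # early average distortion
last = np.mean(dists[-1000:])    # late average distortion
```

On this synthetic source the late-run average distortion falls well below the early-run value, since both the codebook and the predictor are adapted jointly; an SD-style variant would instead accumulate the gradients over a full pass before applying one larger update.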

Original language: English
Pages (from-to): 679-690
Number of pages: 12
Journal: IEEE Transactions on Acoustics, Speech, and Signal Processing
Volume: 34
Issue number: 4
DOIs
Publication status: Published - Aug 1986
