We propose an iterative algorithm that incorporates model selection into entropy-constrained vector quantization. Two model selection steps are added to the classic Lloyd algorithm as additional necessary conditions for optimality. Codewords are pruned using a Lagrangian that combines entropy and codebook-size constraints. Relevant features are identified using partitioned vector quantization, with relevant and irrelevant features modelled independently. Moreover, we model the irrelevant features by a single global probability density function, making them independent of the partition cells. This avoids the difficulty of comparing the performance of vector quantizers that operate in spaces of different dimension. As the Lagrangian decreases, we not only obtain a locally optimal codebook, but also reduce the codebook size and identify the relevant features.
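The core iteration described above — a Lloyd loop whose assignment step minimizes a Lagrangian of distortion plus an entropy penalty, with codewords pruned as their cells empty — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the function name `ecvq_lloyd`, the squared-error distortion, the empty-cell pruning rule, and all parameter names are assumptions for the sketch; the paper's additional model-selection steps (codebook-size constraint and partitioned feature selection) are not reproduced here.

```python
import numpy as np

def ecvq_lloyd(X, K=8, lam=0.1, iters=20, seed=0):
    """Entropy-constrained Lloyd iteration with codeword pruning (sketch).

    Approximately minimizes a Lagrangian J = D + lam * H, where D is the
    mean squared distortion and H is the codeword entropy in bits.
    Codewords whose cells become empty are pruned, so the codebook can
    shrink as the Lagrangian decreases.
    """
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), K, replace=False)]   # initial codebook
    p = np.full(K, 1.0 / K)                       # codeword probabilities
    for _ in range(iters):
        # Assignment step: squared distortion plus entropy-coded rate penalty.
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        cost = d2 + lam * (-np.log2(np.maximum(p, 1e-12)))
        labels = cost.argmin(1)
        # Pruning step: drop codewords with empty cells and relabel.
        keep = np.isin(np.arange(len(C)), labels)
        C, old = C[keep], np.flatnonzero(keep)
        relabel = {o: n for n, o in enumerate(old)}
        labels = np.array([relabel[l] for l in labels])
        # Update step: centroids and codeword probabilities.
        for i in range(len(C)):
            C[i] = X[labels == i].mean(0)
        p = np.bincount(labels, minlength=len(C)) / len(X)
    return C, p, labels
```

Larger values of `lam` penalize rate more heavily, which tends to concentrate probability mass on fewer codewords and accelerates pruning.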