In this paper, several distance measures for hidden Markov models (HMMs) are compared. The most commonly used distance measure between two HMMs is the Kullback-Leibler divergence (KLD). Since the KLD between HMMs has no closed-form solution, the Monte Carlo method is usually applied to estimate it. However, the computational cost of Monte Carlo estimation may be prohibitive in practical applications, which has motivated researchers to propose new distance measures for HMMs. Numerical examples are presented comparing three such distance measures against the Monte Carlo method. Results show that it is possible to approximate the KLD with a computational saving of several hundredfold.
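To make the baseline concrete, the following is a minimal sketch of Monte Carlo estimation of the KLD rate between two discrete-observation HMMs: sequences are sampled from the first model, and the per-symbol log-likelihood ratio is averaged. All function names, the sequence length `T`, and the number of sequences `n_seq` are illustrative choices, not notation from the paper.

```python
import numpy as np

def sample_hmm(pi, A, B, T, rng):
    """Sample an observation sequence of length T from a discrete HMM.

    pi: initial state distribution, A: state transition matrix,
    B: emission matrix (rows = states, columns = observation symbols).
    """
    states = np.arange(len(pi))
    symbols = np.arange(B.shape[1])
    s = rng.choice(states, p=pi)
    obs = np.empty(T, dtype=int)
    for t in range(T):
        obs[t] = rng.choice(symbols, p=B[s])  # emit from current state
        s = rng.choice(states, p=A[s])        # then transition
    return obs

def log_likelihood(obs, pi, A, B):
    """Log-likelihood of obs via the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    ll = np.log(c)
    alpha /= c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        ll += np.log(c)
        alpha /= c
    return ll

def mc_kld(hmm1, hmm2, T=100, n_seq=200, seed=0):
    """Monte Carlo estimate of the per-symbol KLD D(HMM1 || HMM2).

    Each HMM is a (pi, A, B) tuple. Cost grows with n_seq * T, which is
    why Monte Carlo estimation can become prohibitive in practice.
    """
    rng = np.random.default_rng(seed)
    pi1, A1, B1 = hmm1
    total = 0.0
    for _ in range(n_seq):
        obs = sample_hmm(pi1, A1, B1, T, rng)
        total += log_likelihood(obs, *hmm1) - log_likelihood(obs, *hmm2)
    return total / (n_seq * T)
```

The estimate of a model's divergence from itself is exactly zero, while the divergence from a model with markedly different emissions comes out positive; the `n_seq * T` factor in the running time is the cost that the cheaper approximate distance measures aim to avoid.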