Monday, January 12, 2009

Hidden Markov Models

Based on the last post, I was thinking some more about the implementation of Hidden Markov Models (HMMs), i.e. http://www.answers.com/topic/hidden-markov-model, for both knowledge discovery and clustering. Remember that an HMM is basically a directed graph over N hidden states, with emissions drawn from either a discrete or a continuous pdf. Churbanov and Winters-Hilt (2008) have an interesting article on implementing the Baum-Welch algorithm, which is based on Expectation Maximization (EM), along with the Viterbi algorithm. The authors use Java and the JAHMM library, i.e. http://www.run.montefiore.ulg.ac.be/~francois/software/jahmm/, for their analysis and found that a linear-memory procedure can offer practical advantages in both memory use and speed.
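To make the EM connection a little more concrete, here is a minimal forward-recursion sketch in plain Python (not the JAHMM API); the two-state "Rainy"/"Sunny" model and all its probabilities are made up purely for illustration. The forward pass is the workhorse of the E-step in Baum-Welch, and the fact that the naive forward-backward recursion stores a table of O(N*T) values is exactly the cost that linear-memory variants try to avoid.

```python
# Forward recursion for a discrete HMM: P(observations | model).
# Plain-Python sketch; the two-state model below is a hypothetical
# toy example, not taken from the paper or from JAHMM.

states = ('Rainy', 'Sunny')
start_p = {'Rainy': 0.6, 'Sunny': 0.4}
trans_p = {'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
           'Sunny': {'Rainy': 0.4, 'Sunny': 0.6}}
emit_p = {'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
          'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1}}

def forward(obs):
    """Total likelihood of obs, summing over all hidden-state paths."""
    # alpha[s] = P(obs[0..t], state at time t is s)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[p] * trans_p[p][s] for p in states) * emit_p[s][o]
                 for s in states}
    return sum(alpha.values())

print(forward(('walk', 'shop', 'clean')))
```

Note that this only keeps the current column of alpha values, so the likelihood itself is already linear in memory; it is the backward pass and the expected-count accumulation in full Baum-Welch that make linear-memory implementations nontrivial.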

A tutorial on using HMMs for speech recognition is at http://www.ece.ucsb.edu/Faculty/Rabiner/ece259/Reprints/tutorial%20on%20hmm%20and%20applications.pdf; examples of the Baum-Welch and Viterbi algorithms, along with a list of C and MATLAB packages, are at http://www.accenture.com/NR/rdonlyres/6EA0F25D-7FA7-4B43-AFDB-CBA983F1347F/0/HMMTutorialPart2.pdf .
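For comparison with the forward pass, Viterbi decoding replaces the sum over paths with a max, and keeps back-pointers so the single best hidden-state path can be recovered. Here is a small plain-Python sketch (again not the JAHMM API); the two-state weather model is a hypothetical toy example, not from the tutorials.

```python
# Viterbi decoding for a discrete HMM: the most likely hidden-state
# path for an observation sequence. Plain-Python sketch; the model
# below is a made-up two-state example, not from the tutorials.

states = ('Rainy', 'Sunny')
start_p = {'Rainy': 0.6, 'Sunny': 0.4}
trans_p = {'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
           'Sunny': {'Rainy': 0.4, 'Sunny': 0.6}}
emit_p = {'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
          'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1}}

def viterbi(obs):
    """Return the single most probable state sequence for obs."""
    # V[t][s] = (best path probability ending in state s at time t,
    #            predecessor state on that path)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({s: max((V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                         for p in states)
                  for s in states})
    # Backtrack from the most probable final state.
    path = [max(states, key=lambda s: V[-1][s][0])]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))

print(viterbi(('walk', 'shop', 'clean')))  # → ['Sunny', 'Rainy', 'Rainy']
```

In practice both recursions are usually run in log space to avoid underflow on long sequences, which the tutorial linked above also discusses.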


References

Churbanov, A. and S. Winters-Hilt (2008). "Implementing EM and Viterbi algorithms for Hidden Markov Model in linear memory," BMC Bioinformatics 9:224.
