Summary:
Hidden Markov Models address the problem of recognizing a sequence of observations by comparing it against a signal model that characterizes the source of the symbols. A first attempt suggests fitting linear system models over several short time segments to approximate the signal itself. That approach is rejected in favor of the Hidden Markov Model.
A Hidden Markov Model consists of a number of states. Each state is associated with a state-transition probability distribution that determines which state to move to next. As each state is entered, an observation is recorded, drawn from a fixed, state-specific "observation probability distribution". In addition, there is an initial state distribution which describes the likelihood of being in each state when observations begin.
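For concreteness, here is a minimal sketch of those elements for a toy two-state model; the states, symbols, and probability values are illustrative assumptions of mine, not numbers from the paper.

# Toy HMM: two hidden states, two observable symbols.
# All parameter values below are made up for illustration.
import random

states = ["S1", "S2"]

# Initial state distribution (pi), state-transition matrix (A),
# and per-state observation distribution (B).
pi = {"S1": 0.6, "S2": 0.4}
A = {"S1": {"S1": 0.7, "S2": 0.3},
     "S2": {"S1": 0.4, "S2": 0.6}}
B = {"S1": {"a": 0.9, "b": 0.1},
     "S2": {"a": 0.2, "b": 0.8}}

def sample(dist):
    # Draw one outcome from a discrete distribution given as a dict.
    r, total = random.random(), 0.0
    for outcome, p in dist.items():
        total += p
        if r < total:
            return outcome
    return outcome  # guard against floating-point round-off

def generate(T):
    # Walk the hidden state sequence and emit T observations.
    state = sample(pi)
    obs = []
    for _ in range(T):
        obs.append(sample(B[state]))  # emit a symbol from the current state
        state = sample(A[state])      # then transition to the next state
    return obs

print(generate(10))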
Three types of problems associated with HMMs are outlined and their solutions are given. The first problem deals with finding the probability of an observation sequence given a model. The second problem is concerned with finding the optimal sequence of states for a given observation sequence; one criterion chooses the state that is individually most likely at each time, although the single best state sequence as a whole is what is usually sought. The third problem concentrates on how to adjust the model parameters to best describe how the observed sequence occurs. The three problems can be solved using the forward-backward procedure, the Viterbi algorithm, and the Baum-Welch method, respectively.
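As an illustration of the first problem, the sketch below implements the forward part of the forward-backward procedure on the toy model defined above; the function name and the test sequence are assumptions made here for illustration. The recursion is alpha_t(j) = [sum_i alpha_{t-1}(i) * a_ij] * b_j(o_t), and P(O | model) is the sum of the final alphas.

def forward(obs_seq):
    # Initialization: alpha_1(i) = pi_i * b_i(o_1)
    alpha = {s: pi[s] * B[s][obs_seq[0]] for s in states}
    # Induction over the remaining observations
    for o in obs_seq[1:]:
        new_alpha = {j: sum(alpha[i] * A[i][j] for i in states) * B[j][o]
                     for j in states}
        alpha = new_alpha
    # Termination: P(O | model) = sum_i alpha_T(i)
    return sum(alpha.values())

print(forward(["a", "b", "a"]))

Replacing the sum in the induction step with a max (and keeping back-pointers to recover the path) turns the same recursion into the Viterbi algorithm for the second problem.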
The paper lists recognition results for three different HMM observation models (all of which are above 97%) when applied to isolated word recognition on a vocabulary of 10 digits.
Discussion:
Hidden Markov Models are used in speech recognition, but the paper did not mention how an HMM could be applied to gesture recognition. I thought the coin-tossing example was more confusing than simply defining the model. My introduction to Markov chains was in the context of weather prediction; giving examples a context with inherent practical motivation seems to make more sense than an unmotivated one, like the outcome of tossing coins.