Hidden Markov Model (HMM)

Code for this note

Some parts of this note are based on the following slides: http://www.cs.cmu.edu/~aarti/Class/10701/slides/Lecture17.pdf

Markov Assumption

The current observation depends only on the past $m$ observations:

$$ p({\bf X}) = \prod_{i=1}^n p(X_i \mid X_{i-1}, \ldots, X_{i-m}) $$

*(slide: Markov chain illustration)*
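
For the common first-order case ($m = 1$), this factorization reduces to

$$ p({\bf X}) = p(X_1) \prod_{i=2}^n p(X_i \mid X_{i-1}) $$

so the chain is fully specified by an initial distribution over $X_1$ and a one-step transition model.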

Hidden Markov Models

*(slides: Hidden Markov Model definition and graphical structure)*
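
To fix notation for the rest of this note (standard HMM notation, which may differ slightly from the slides): hidden states $z_t \in \{1, \ldots, K\}$, observations $x_t$, and parameters

$$ \pi_k = p(z_1 = k), \qquad a_{jk} = p(z_t = k \mid z_{t-1} = j), \qquad b_k(x) = p(x_t = x \mid z_t = k) $$

i.e. an initial distribution $\pi$, a transition matrix $A$, and emission distributions $B$.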

An example of HMM problem

A casino has two dice:

  • fair
    • $ P(1) = P(2) = ... = P(6) = \frac{1}{6}$
  • loaded
    • $ P(1) = P(2) = ... = P(5) = \frac{1}{10}$
    • $ P(6) = \frac{1}{2} $

*(slide: casino dice example)*

Observed sequence

*(slide: observed roll sequence)*

Hidden state transitions

*(slide: hidden state transition diagram)*

HMM model

*(slide: the full HMM for the casino)*
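
As a concrete sketch, here is the casino model in code. The emission rows come from the dice probabilities listed above; the initial and transition probabilities are illustrative values I am assuming, since the actual numbers were on the slide:

```python
import numpy as np

# Hidden states: 0 = fair die, 1 = loaded die
pi = np.array([0.5, 0.5])        # assumed initial distribution

# Assumed switching probabilities (illustrative, not from the slide)
A = np.array([[0.95, 0.05],      # fair -> fair,   fair -> loaded
              [0.10, 0.90]])     # loaded -> fair, loaded -> loaded

# Emission probabilities for faces 1..6 (from the dice above)
B = np.array([[1/6] * 6,                 # fair die
              [1/10] * 5 + [1/2]])       # loaded die
```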

Solve problems

*(slide: the three HMM problems)*

In the learning problem, $\theta$ here represents the emission probabilities.
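
For reference, the three canonical HMM problems can be stated as:

  • Evaluation: compute the likelihood $p(x_{1:n} \mid \theta)$ of an observed sequence
  • Decoding: find the most probable hidden states, e.g. $\arg\max_{z_{1:n}} p(z_{1:n} \mid x_{1:n}, \theta)$
  • Learning: find the parameters $\arg\max_{\theta} p(x_{1:n} \mid \theta)$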

Algorithms

*(slide: overview of HMM algorithms)*


Forward Algorithm

Intuition

A naive evaluation sums over all $K^n$ possible hidden state paths. The forward algorithm gets the same answer by building up the sum incrementally, one time step at a time.

Evaluation problem

*(slide: evaluation problem)*

Forward probability

*(slide: forward probability definition)*
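
In standard notation, the forward probability and its recursion are

$$ \alpha_t(k) = p(x_1, \ldots, x_t, z_t = k), \qquad \alpha_t(k) = b_k(x_t) \sum_{j=1}^K \alpha_{t-1}(j)\, a_{jk} $$

with base case $\alpha_1(k) = \pi_k\, b_k(x_1)$; the likelihood is then $p(x_{1:n}) = \sum_k \alpha_n(k)$.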

Forward algorithm

*(slide: forward algorithm)*
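
A minimal sketch of the forward pass, using the $\pi$, $A$, $B$ parameterization above. It is unscaled for readability, so it will underflow on long sequences; practical implementations rescale $\alpha_t$ or work in log-space:

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward algorithm: returns alpha and the likelihood p(x_1..x_n).

    pi: (K,) initial distribution, A: (K, K) transition matrix,
    B: (K, M) emission matrix, obs: sequence of observation indices.
    """
    n, K = len(obs), len(pi)
    alpha = np.zeros((n, K))
    alpha[0] = pi * B[:, obs[0]]                       # base case
    for t in range(1, n):
        alpha[t] = B[:, obs[t]] * (alpha[t - 1] @ A)   # recursion
    return alpha, alpha[-1].sum()

# e.g. with the casino model (die faces indexed 0..5):
# alpha, likelihood = forward(pi, A, B, [5, 5, 5, 0, 2])
```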


Backward Algorithm

Decoding problem

*(slide: decoding problem)*

Here the joint probability $p(x_{1:n}, z_t = k)$ is split into two parts: everything observed up to time $t$, and everything observed after it.
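
Concretely, conditioning on $z_t$ makes the observations before and after time $t$ independent, so

$$ p(x_{1:n}, z_t = k) = \underbrace{p(x_{1:t}, z_t = k)}_{\alpha_t(k)} \, \underbrace{p(x_{t+1:n} \mid z_t = k)}_{\beta_t(k)} $$

which is exactly the product of the forward and backward probabilities.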

Backward algorithm

*(slides: backward algorithm definition and recursion)*
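
A matching sketch of the backward pass, under the same conventions as the forward sketch above (unscaled, so subject to the same underflow caveat):

```python
import numpy as np

def backward(A, B, obs):
    """Backward algorithm: beta[t, k] = p(x_{t+1}..x_n | z_t = k)."""
    n, K = len(obs), A.shape[0]
    beta = np.zeros((n, K))
    beta[-1] = 1.0                                         # base case
    for t in range(n - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])     # recursion
    return beta
```

Combining the two gives the state posterior $p(z_t = k \mid x_{1:n}) = \alpha_t(k)\,\beta_t(k) / p(x_{1:n})$.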


Viterbi Algorithm

Decoding problem 2

*(slide: decoding the most probable state path)*

Viterbi Decoding

*(slide: Viterbi decoding)*
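
In standard notation, Viterbi decoding replaces the sum in the forward recursion with a max:

$$ V_t(k) = b_k(x_t) \max_{j} V_{t-1}(j)\, a_{jk}, \qquad V_1(k) = \pi_k\, b_k(x_1) $$

where $V_t(k)$ is the probability of the best state path ending in state $k$ at time $t$; recording each argmax as a backpointer lets the best full path be read off at the end.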

Viterbi Algorithm

*(slide: Viterbi algorithm)*
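
A sketch of the Viterbi recursion in log-space (which avoids underflow without rescaling); each step maximizes over all $K \times K$ transitions, giving the $O(K^2 n)$ cost discussed next:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most probable hidden state path, computed in log-space."""
    n, K = len(obs), len(pi)
    logV = np.zeros((n, K))
    back = np.zeros((n, K), dtype=int)
    logV[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, n):
        scores = logV[t - 1][:, None] + np.log(A)   # [from j, to k]
        back[t] = scores.argmax(axis=0)
        logV[t] = scores.max(axis=0) + np.log(B[:, obs[t]])
    # Follow the backpointers from the best final state
    path = [int(logV[-1].argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```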

Computational complexity

*(slide: computational complexity)*

Intuition

Because the path probability is a product of per-step terms, the best path ending in state $k$ at time $t$ must extend the best path ending in some state at time $t-1$. Keeping only these $K$ best prefixes at each step is what makes the search tractable.


EM Algorithm

Learning problem

*(slide: learning problem)*

Baum-Welch (EM) Algorithm

*(slide: Baum-Welch algorithm)*

then,

*(slide: Baum-Welch update equations)*
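
A minimal single-sequence Baum-Welch sketch, assuming the categorical-emission setup used throughout this note. It is unscaled for readability (real implementations rescale or use logs), and it estimates all of $\pi$, $A$, $B$; restricting learning to the emission probabilities, as in the learning problem above, just means skipping the $\pi$ and $A$ updates:

```python
import numpy as np

def baum_welch(obs, K, M, n_iter=50, seed=0):
    """EM for an HMM with K states and M observation symbols."""
    rng = np.random.default_rng(seed)
    obs = np.asarray(obs)
    n = len(obs)
    # Random row-stochastic initialization
    pi = rng.dirichlet(np.ones(K))
    A = rng.dirichlet(np.ones(K), size=K)
    B = rng.dirichlet(np.ones(M), size=K)
    for _ in range(n_iter):
        # E-step: forward-backward under current parameters
        alpha = np.zeros((n, K))
        beta = np.zeros((n, K))
        alpha[0] = pi * B[:, obs[0]]
        for t in range(1, n):
            alpha[t] = B[:, obs[t]] * (alpha[t - 1] @ A)
        beta[-1] = 1.0
        for t in range(n - 2, -1, -1):
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        likelihood = alpha[-1].sum()
        gamma = alpha * beta / likelihood          # p(z_t = k | x)
        # xi[t, j, k] = p(z_t = j, z_{t+1} = k | x)
        xi = (alpha[:-1, :, None] * A[None, :, :] *
              (B[:, obs[1:]].T * beta[1:])[:, None, :]) / likelihood
        # M-step: re-estimate parameters from expected counts
        pi = gamma[0]
        A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
        for m in range(M):
            B[:, m] = gamma[obs == m].sum(axis=0)
        B /= gamma.sum(axis=0)[:, None]
    return pi, A, B
```

On rolls sampled from the casino model, this should recover the transition and emission structure up to a relabeling of the two states.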

Intuition

If the hidden states were observed, estimation would reduce to counting transitions and emissions. EM therefore alternates between computing expected counts under the current parameters (the E-step, via forward-backward) and re-estimating the parameters from those expected counts (the M-step).


Conclusion

*(slide: summary)*