Tutorial 05


Markov Chains

Tutorial #5

Ydo Wexler & Dan Geiger


Statistical Parameter Estimation
Reminder

The basic paradigm: a data set and a model whose parameters are estimated from the data (MLE / Bayesian approach).

Input data: a series of observations X1, X2, ..., Xt

We assumed the observations were i.i.d. (independent and identically distributed).

Coin example: Heads with probability P(H), Tails with probability 1 - P(H)
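
A minimal Python sketch of the coin-toss MLE, assuming the observations come as a list of "H"/"T" symbols (the 0.7 below is an illustrative value, not from the slides):

import random

def mle_coin(observations):
    """Maximum-likelihood estimate of P(H) from i.i.d. coin flips."""
    return observations.count("H") / len(observations)

# Simulate 1000 i.i.d. flips of a coin with true P(H) = 0.7,
# then recover the parameter from the data.
true_p_heads = 0.7
data = ["H" if random.random() < true_p_heads else "T" for _ in range(1000)]
print(mle_coin(data))   # close to 0.7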

Markov Process
Markov Property: The state of the system at time t+1 depends only
on the state of the system at time t

Pr(X_{t+1} = x_{t+1} | X_1 = x_1, ..., X_t = x_t) = Pr(X_{t+1} = x_{t+1} | X_t = x_t)

Chain diagram: X1 → X2 → X3 → X4 → X5

Stationary Assumption: Transition probabilities are independent of time (t):

Pr(X_{t+1} = b | X_t = a) = p_ab

Bounded memory transition model
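
A minimal Python sketch of such a stationary (time-independent) transition model; the two states and the p_ab values below are hypothetical:

import random

# Hypothetical transition probabilities p_ab = Pr(X_{t+1} = b | X_t = a).
P = {
    "a": {"a": 0.7, "b": 0.3},
    "b": {"a": 0.4, "b": 0.6},
}

def step(state):
    """Sample X_{t+1} given X_t = state, using the time-independent p_ab."""
    r, cumulative = random.random(), 0.0
    for next_state, prob in P[state].items():
        cumulative += prob
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

# Generate X1, ..., X10 starting from "a"; only the previous state matters.
x = "a"
path = [x]
for _ in range(9):
    x = step(x)
    path.append(x)
print(path)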

Markov Process
Simple Example
Weather:
raining today: 40% rain tomorrow, 60% no rain tomorrow
not raining today: 20% rain tomorrow, 80% no rain tomorrow
Stochastic FSM: two states, "rain" and "no rain"; edge weights: rain→rain 0.4, rain→no rain 0.6, no rain→rain 0.2, no rain→no rain 0.8.
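
A short Python simulation of this weather chain (the 10,000-day horizon is an arbitrary choice); in the long run roughly 25% of days are rainy:

import random

# Transition probabilities from the weather example.
P_RAIN_GIVEN_RAIN = 0.4      # raining today -> 40% rain tomorrow
P_RAIN_GIVEN_NO_RAIN = 0.2   # not raining today -> 20% rain tomorrow

def tomorrow(raining_today):
    """Sample tomorrow's weather given only today's state."""
    p_rain = P_RAIN_GIVEN_RAIN if raining_today else P_RAIN_GIVEN_NO_RAIN
    return random.random() < p_rain

# Simulate 10,000 days starting from a rainy day.
raining = True
rainy_days = 0
for _ in range(10_000):
    raining = tomorrow(raining)
    rainy_days += raining
print(rainy_days / 10_000)   # roughly 0.25 in the long run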
Markov Process
Simple Example
Weather:
raining today: 40% rain tomorrow, 60% no rain tomorrow
not raining today: 20% rain tomorrow, 80% no rain tomorrow

The transition matrix:

P = | 0.4  0.6 |
    | 0.2  0.8 |

Stochastic matrix: rows sum up to 1
Doubly stochastic matrix: rows and columns sum up to 1
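
A small Python check of these two properties for the weather matrix:

# The weather transition matrix, checked for the (doubly) stochastic property.
P = [[0.4, 0.6],
     [0.2, 0.8]]

rows_ok = all(abs(sum(row) - 1.0) < 1e-9 for row in P)
cols_ok = all(abs(sum(P[i][j] for i in range(len(P))) - 1.0) < 1e-9
              for j in range(len(P[0])))

print("stochastic (rows sum to 1):", rows_ok)            # True
print("doubly stochastic (rows and columns):", cols_ok)  # False: columns sum to 0.6 and 1.4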

Markov Process
Gambler's Example
Gambler starts with $10
- At each play we have one of the following:
  Gambler wins $1 with probability p
  Gambler loses $1 with probability 1-p

Game ends when gambler goes broke, or gains a fortune of $100


(Both 0 and 100 are absorbing states)
State diagram: states 0, 1, 2, ..., 99, 100; each play moves one state up (win) with probability p or one state down (loss) with probability 1-p; the walk starts at state 10 ($10), and 0 and 100 are absorbing.
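
A minimal Python simulation of the gambler's chain; with a fair game (p = 0.5, an assumption for the example) the chance of reaching $100 from $10 is about 0.1:

import random

def play(start=10, goal=100, p=0.5):
    """Simulate one gambler's-ruin game; return True if the gambler reaches the goal."""
    money = start
    while 0 < money < goal:
        money += 1 if random.random() < p else -1
    return money == goal

# Estimate the probability of reaching $100 before going broke, starting at $10.
wins = sum(play() for _ in range(2_000))
print(wins / 2_000)   # near 0.1 for a fair game (p = 0.5)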
Markov Process
Markov process - described by a stochastic FSM
Markov chain - a random walk on this graph
(distribution over paths)

Edge weights give us Pr(X_{t+1} = b | X_t = a) = p_ab

We can ask more complex questions, like: what is Pr(X_{t+2} = a | X_t = b)?

(Same state diagram as before: states 0, ..., 100; win probability p, lose probability 1-p; start at $10.)
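
Such two-step probabilities sum over the intermediate state: Pr(X_{t+2} = a | X_t = b) = Σ_c p_bc · p_ca, i.e. the (b, a) entry of P². A Python sketch with a small hypothetical chain:

# Two-step transition probability: Pr(X_{t+2} = a | X_t = b) = sum over c of p_bc * p_ca,
# which is the (b, a) entry of P^2. The states and numbers here are hypothetical.
states = ["s0", "s1", "s2"]
P = {
    "s0": {"s0": 0.5, "s1": 0.5, "s2": 0.0},
    "s1": {"s0": 0.2, "s1": 0.6, "s2": 0.2},
    "s2": {"s0": 0.0, "s1": 0.3, "s2": 0.7},
}

def two_step(b, a):
    """Probability of being in state a two steps after being in state b."""
    return sum(P[b][c] * P[c][a] for c in states)

print(two_step("s1", "s0"))   # 0.2*0.5 + 0.6*0.2 + 0.2*0.0 = 0.22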

Markov Process
Coke vs. Pepsi Example
Given that a person's last cola purchase was Coke, there is a 90% chance that his next cola purchase will also be Coke.
If a person's last cola purchase was Pepsi, there is an 80% chance that his next cola purchase will also be Pepsi.

The transition matrix:

P = | 0.9  0.1 |
    | 0.2  0.8 |

Stochastic FSM: coke→coke 0.9, coke→pepsi 0.1, pepsi→coke 0.2, pepsi→pepsi 0.8.
Markov Process
Coke vs. Pepsi Example (cont)

Given that a person is currently a Pepsi purchaser, what is the probability that he will purchase Coke two purchases from now?

Pr[Pepsi → ? → Coke] = Pr[Pepsi → Coke → Coke] + Pr[Pepsi → Pepsi → Coke]
                     = 0.2 * 0.9 + 0.8 * 0.2 = 0.34

      | 0.9  0.1 |   | 0.9  0.1 |   | 0.83  0.17 |
P^2 = | 0.2  0.8 | * | 0.2  0.8 | = | 0.34  0.66 |

The (Pepsi, Coke) entry of P^2 is 0.34: Pepsi → ? → Coke.
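
A Python check of this matrix squaring (plain lists, no external libraries):

# Squaring the Coke/Pepsi transition matrix reproduces the 0.34 above:
# it is the (Pepsi, Coke) entry of P^2.
P = [[0.9, 0.1],   # row 0: Coke  -> (Coke, Pepsi)
     [0.2, 0.8]]   # row 1: Pepsi -> (Coke, Pepsi)

def matmul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

P2 = matmul(P, P)
print(P2)          # [[0.83, 0.17], [0.34, 0.66]] up to floating-point rounding
print(P2[1][0])    # 0.34 = Pr[Coke two purchases after Pepsi]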
Markov Process
Coke vs. Pepsi Example (cont)

Given that a person is currently a Coke purchaser, what is the probability that he will purchase Pepsi three purchases from now?

      | 0.9  0.1 |   | 0.83  0.17 |   | 0.781  0.219 |
P^3 = | 0.2  0.8 | * | 0.34  0.66 | = | 0.438  0.562 |

The (Coke, Pepsi) entry of P^3 is 0.219.

Markov Process
Coke vs. Pepsi Example (cont)

Assume each person makes one cola purchase per week.
Suppose 60% of all people now drink Coke, and 40% drink Pepsi.
What fraction of people will be drinking Coke three weeks from now?

P = | 0.9  0.1 |        P^3 = | 0.781  0.219 |
    | 0.2  0.8 |              | 0.438  0.562 |
Pr[X3=Coke] = 0.6 * 0.781 + 0.4 * 0.438 = 0.6438

Qi - the distribution in week i
Q0 = (0.6, 0.4) - initial distribution
Q3 = Q0 * P^3 = (0.6438, 0.3562)
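
A Python sketch propagating the initial distribution through three weekly purchases:

# Propagating the initial distribution: Q_i = Q_0 * P^i.
P = [[0.9, 0.1],
     [0.2, 0.8]]
Q0 = [0.6, 0.4]   # 60% Coke, 40% Pepsi today

def vecmat(q, M):
    """Row vector times matrix."""
    return [sum(q[i] * M[i][j] for i in range(len(q))) for j in range(len(M[0]))]

Q = Q0
for _ in range(3):        # three weekly purchases
    Q = vecmat(Q, P)
print(Q)                  # approximately (0.6438, 0.3562)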
Markov Process
Coke vs. Pepsi Example (cont)
Simulation:

Plot: Pr[Xi = Coke] as a function of week i, for the Coke/Pepsi FSM (coke→coke 0.9, coke→pepsi 0.1, pepsi→coke 0.2, pepsi→pepsi 0.8). The curve converges to the stationary distribution:

             | 0.9  0.1 |
(2/3, 1/3) * | 0.2  0.8 | = (2/3, 1/3)
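
A short Python simulation of this convergence: iterating Q_{i+1} = Q_i · P drives Pr[Coke] to 2/3 regardless of the starting distribution:

# The stationary distribution: iterate Q_{i+1} = Q_i * P until it stops changing.
P = [[0.9, 0.1],
     [0.2, 0.8]]
Q = [0.6, 0.4]   # any starting distribution works for this chain

for week in range(100):
    Q = [sum(Q[i] * P[i][j] for i in range(2)) for j in range(2)]
print(Q)   # converges to (2/3, 1/3): Pr[Coke] -> 2/3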
Hidden Markov Models - HMM

Diagram: hidden states H1 → H2 → ... → Hi → ... → HL-1 → HL form a Markov chain; each hidden state Hi emits one symbol of the observed data X1, X2, ..., Xi, ..., XL-1, XL.
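
A minimal Python sketch of generating data from such a model; the two hidden states, the emission alphabet, and all probabilities below are hypothetical:

import random

def sample(dist):
    """Draw a key from a {value: probability} dictionary."""
    r, cumulative = random.random(), 0.0
    for value, prob in dist.items():
        cumulative += prob
        if r < cumulative:
            return value
    return value  # guard against floating-point rounding

# Hypothetical two-state HMM: hidden chain H_1..H_L, emitted observations X_1..X_L.
transition = {"s": {"s": 0.8, "t": 0.2}, "t": {"s": 0.3, "t": 0.7}}
emission   = {"s": {"x": 0.9, "y": 0.1}, "t": {"x": 0.2, "y": 0.8}}

def generate(L, start="s"):
    """Generate hidden and observed sequences of length L, starting from `start`."""
    hidden, observed = [], []
    h = start
    for _ in range(L):
        hidden.append(h)
        observed.append(sample(emission[h]))   # X_i depends only on H_i
        h = sample(transition[h])              # H_{i+1} depends only on H_i
    return hidden, observed

print(generate(10))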

Hidden Markov Models - HMM
Coin-Tossing Example
Transition probabilities: fair→fair 0.9, fair→loaded 0.1, loaded→loaded 0.9, loaded→fair 0.1.
Emission probabilities: fair coin: H 1/2, T 1/2; loaded coin: H 3/4, T 1/4.

Diagram: hidden states Hi ∈ {Fair, Loaded}; observed symbols Xi ∈ {Head, Tail}.
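
A Python sketch of the joint probability Pr(H, X) for this HMM, using the slide's transition and emission values; the uniform start distribution is an added assumption:

# Joint probability of a hidden path and observations for the fair/loaded coin HMM:
# Pr(H, X) = Pr(H_1) * prod_i Pr(H_i | H_{i-1}) * prod_i Pr(X_i | H_i).
transition = {"fair":   {"fair": 0.9, "loaded": 0.1},
              "loaded": {"fair": 0.1, "loaded": 0.9}}
emission   = {"fair":   {"H": 0.5,  "T": 0.5},
              "loaded": {"H": 0.75, "T": 0.25}}
start      = {"fair": 0.5, "loaded": 0.5}   # assumed uniform start, not given on the slide

def joint_probability(hidden, observed):
    """Probability of one hidden coin sequence together with the observed tosses."""
    prob = start[hidden[0]] * emission[hidden[0]][observed[0]]
    for i in range(1, len(hidden)):
        prob *= transition[hidden[i - 1]][hidden[i]]   # Markov chain over hidden coins
        prob *= emission[hidden[i]][observed[i]]       # each toss depends on the current coin
    return prob

print(joint_probability(["fair", "fair", "loaded"], ["H", "T", "H"]))
# 0.5*0.5 * 0.9*0.5 * 0.1*0.75 = 0.0084375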

Hidden Markov Models - HMM
C-G Islands Example
C-G islands: Genome regions which are very rich in C and G
Diagram: two nucleotide models over the states {A, C, G, T}: a "Regular DNA" model and a "C-G island" model in which C and G carry higher probability; "change" transitions (with probabilities involving p and q) switch between the two models.
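
A hedged Python sketch of the underlying idea: score a DNA window under the two nucleotide models and compare. The frequencies below are illustrative assumptions, not the values from the figure:

import math

# Assumed per-nucleotide frequencies for the two models (illustrative only).
regular = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}
island  = {"A": 0.15, "C": 0.35, "G": 0.35, "T": 0.15}

def log_likelihood_ratio(sequence):
    """log Pr(seq | island) - log Pr(seq | regular); positive favours a C-G island."""
    return sum(math.log(island[base]) - math.log(regular[base]) for base in sequence)

print(log_likelihood_ratio("GCGCGGTGC"))   # positive: C/G rich
print(log_likelihood_ratio("ATATTAATA"))   # negative: A/T rich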

Hidden Markov Models - HMM
C-G Islands Example
Diagram: hidden states Hi ∈ {C-G island, Regular}, with "change" transitions between the two; each hidden state emits one nucleotide Xi ∈ {A, C, G, T}.

To be continued...
