What do i and j represent in Markov chains?

elvis0206

In Markov chains, the conditional (transition) probability is
P{X_{t+1} = j | X_t = i} = P_ij

I don't know what i and j represent here. Do they mean a step from i to j, or from j to i?
Can anyone explain this to me? Thanks a lot!
 

Yep - it's a homogeneous Markov chain, since any notation of time dependence has been dropped.
see
Probability, Markov chains, queues ... - Google Books

Markov chains are only concerned with the present state and the possible next states -
like a random walk, where at any given time you're in one single place.
Think of a drunk staggering: he's somewhere now, but his next step is governed by probability.
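
To make that staggering-drunk picture concrete, here's a minimal Python sketch (the step probabilities are just made up for illustration): at each step the walker moves one unit left or right at random, and the next position depends only on the current one, not on how he got there.

```python
import random

# Minimal 1-D random walk: the next position depends only on where the
# walker is now, not on the path taken to get there.
position = 0
for step in range(10):
    position += random.choice([-1, +1])  # step left or right with equal probability
    print(f"step {step + 1}: position = {position}")
```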

Your expression essentially reads:
for all time (i.e. homogeneous), the probability of being in state j at the next step, given that we have just been in state i, is termed Pij,
and it is constant (no time dependence) for any occurrence of the transition from state i to j.

You know it's i to j (not j to i) because the chain is in state i at time t (the conditioning event), whereas it is in state j at t+1.
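
To make the indexing concrete, here's a small Python sketch with a hypothetical 3-state transition matrix (the states and the numbers in P are invented for illustration). Row i holds the probabilities of leaving state i at time t, and column j is the probability of landing in state j at time t+1, so each row sums to 1.

```python
import random

# Hypothetical 3-state chain: P[i][j] = P{X_{t+1} = j | X_t = i}.
# Row i describes transitions FROM state i, so each row sums to 1.
P = [
    [0.5, 0.3, 0.2],  # from state 0
    [0.1, 0.6, 0.3],  # from state 1
    [0.2, 0.2, 0.6],  # from state 2
]

def next_state(i):
    """Sample the next state j, using row i of P as the weights."""
    return random.choices(range(len(P[i])), weights=P[i])[0]

# Simulate a few transitions starting from state 0.
state = 0
for t in range(5):
    new = next_state(state)
    print(f"t={t}: {state} -> {new}  (probability used: P[{state}][{new}] = {P[state][new]})")
    state = new
```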
 

Can you go further into the physical implementation side of it?
 
