
# Working out probability in a continuous time markov chain


I can do questions 1-3 no problem, but I'm lost with part 4. Can anyone here explain the steps I need to go through to get this answer in more detail?

Brodudedoodebrodude  Jun 19, 2015


#1

Are you sure the answer given for part 1 is correct?  Looks to me like the 3 on the bottom line of Q should be -3.

If you then calculate P(0.1), you get the same numbers in the first column as given in the answer to part 4.


Alan  Jun 19, 2015
#2

These are the answers provided by my lecturer, but yes, I also think it's a typo and should be -3 rather than 3. Forgot to mention that!

My problem is with the fundamentals: what does the calculation for P(0.1) look like? Once I see it I can make sense of it (I hope).

Brodudedoodebrodude  Jun 19, 2015
#3

Does this help?

[attached image: worked calculation of P(0.1)]

Alan  Jun 19, 2015
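(A numerical sketch of the P(0.1) = e^{0.1 Q} computation Alan is describing. The generator Q below is made up, since the question's actual matrix isn't reproduced in this thread; the structure is the point, not the numbers.)

```python
# Made-up 3-state generator matrix Q (the question's actual Q isn't
# shown here).  Off-diagonal entries are transition rates; each
# diagonal entry is minus its row's off-diagonal sum, which is why
# the "3" in the bottom row of the question's Q should really be -3.
Q = [[-2.0,  1.0,  1.0],
     [ 1.0, -3.0,  2.0],
     [ 2.0,  1.0, -3.0]]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(M, terms=40):
    # Matrix exponential via a truncated Taylor series,
    # exp(M) = I + M + M^2/2! + ...; fine here since ||0.1*Q|| is small.
    n = len(M)
    result = [[float(i == j) for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]          # term holds M^k / k!
    for k in range(1, terms):
        term = [[x / k for x in row] for row in mat_mul(term, M)]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

t = 0.1
P = mat_exp([[q * t for q in row] for row in Q])   # P(t) = e^(Qt)

# Each row of P(0.1) is a probability distribution over the 3 states,
# so every row sums to 1.
for row in P:
    print([round(p, 4) for p in row], "row sum =", round(sum(row), 6))
```

In practice you'd use a library routine (e.g. SciPy's `scipy.linalg.expm`) rather than a hand-rolled series, but the idea is the same: exponentiate Qt to get the transition matrix P(t).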
#4

Yeah it does, thanks Alan. What's still causing some confusion for me is: what is the point of mentioning that the initial distribution for the Markov chain is (1/3, 1/3, 1/3)?

Brodudedoodebrodude  Jun 19, 2015
#5

The state of the Markov chain after a time t depends on the initial state. The probability of being in state 0 at time t is the sum, over every starting state i, of (initial probability of state i) × P_i0(t), where the P_i0(t) make up the first column of P(t). With the uniform initial distribution, each term carries the same factor of 1/3, so the occupancy of state 0 is 1/3 times the sum of the first column of P(t).

Notice that the sum of the terms in the first column of P(t) given in the answer to part 4 is divided by 3 (i.e. multiplied by 1/3).

Alan  Jun 19, 2015
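(The identity Alan describes can be checked with any stochastic matrix. The P(0.1) below is a placeholder with made-up entries, not the matrix from the question; all that matters is that each row sums to 1.)

```python
# Placeholder transition matrix P(0.1) (made-up numbers; the real one
# comes from the question's part 1).  Each row sums to 1.
P = [[0.83, 0.09, 0.08],
     [0.09, 0.76, 0.15],
     [0.16, 0.09, 0.75]]

pi0 = [1/3, 1/3, 1/3]              # uniform initial distribution

# Distribution after time t:  pi(t) = pi(0) P(t)
pi_t = [sum(pi0[i] * P[i][j] for i in range(3)) for j in range(3)]

# With a uniform start, the state-0 occupancy equals 1/3 times the
# sum of the first column of P(t) -- the division by 3 in part 4.
col0_sum = sum(P[i][0] for i in range(3))
print(pi_t[0], col0_sum / 3)       # equal, up to floating-point rounding
```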
#6

I understand it now, thanks a lot Alan =)

Brodudedoodebrodude  Jun 19, 2015

