1. Introduction

Before we give the definition of a Markov process, we will look at an example:

Example 1: Suppose that bus ridership in a city is studied. After examining several years of data, it was found that 30% of the people who regularly ride the bus in a given year do not regularly ride the bus the next year. It was also found that 20% of the people who do not regularly ride the bus in a given year begin to ride the bus regularly the next year. If 5000 people ride the bus and 10,000 do not ride the bus in a given year, what is the distribution of riders/non-riders in the next year? In 2 years? In n years?

First we determine how many people will ride the bus next year. Of the people who currently ride the bus, 70% will continue to do so. Of the people who do not ride the bus, 20% will begin to ride it. Thus:

5000(0.7) + 10,000(0.2) = the number of people who ride the bus next year = b_1.

By the same argument as above, we see that:

5000(0.3) + 10,000(0.8) = the number of people who do not ride the bus next year = b_2.

This system of equations is equivalent to the matrix equation Mx = b, where

M = \begin{pmatrix} 0.7 & 0.2 \\ 0.3 & 0.8 \end{pmatrix}, \quad
x = \begin{pmatrix} 5000 \\ 10{,}000 \end{pmatrix}, \quad
b = \begin{pmatrix} b_1 \\ b_2 \end{pmatrix}.

Carrying out the multiplication gives b_1 = 5500 and b_2 = 9500. To compute the result after 2 years, we use the same matrix M, but with b = (5500, 9500)^T in place of x. Thus the distribution after 2 years is Mb = M^2 x. In fact, after n years, the distribution is given by M^n x.

The foregoing example is an example of a Markov process. Now for some formal definitions:

Definition 1. A stochastic process is a sequence of events in which the outcome at any stage depends on some probability.

Definition 2. A Markov process is a stochastic process with the following properties:
(a.) The number of possible outcomes or states is finite.
(b.) The outcome at any stage depends only on the outcome of the previous stage.
(c.) The probabilities are constant over time.

If x_0 is a vector which represents the initial state of a system,...
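The bus-ridership computation above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original text: the function names `mat_vec` and `after_n_years` are our own, and the matrix M and initial vector x are taken directly from the example.

```python
def mat_vec(M, v):
    """Multiply a 2x2 matrix M by a 2-vector v."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def after_n_years(M, x, n):
    """Apply M to x a total of n times, giving the distribution M^n x."""
    v = x
    for _ in range(n):
        v = mat_vec(M, v)
    return v

# Transition matrix from the example: column 1 = riders, column 2 = non-riders.
M = [[0.7, 0.2],
     [0.3, 0.8]]
# Initial state: 5000 riders, 10,000 non-riders.
x = [5000, 10000]

print(after_n_years(M, x, 1))  # approximately [5500, 9500], matching b above
print(after_n_years(M, x, 2))  # the distribution after 2 years, M^2 x
```

Repeated application of `mat_vec` plays the role of the matrix power M^n; for larger problems one would typically use a linear-algebra library instead of hand-written loops.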