SOME MARKOV CHAIN EXAMPLES

Example 1
Three players are tossing a frisbee. Player 1 always throws to player 2, who in turn always throws to player 3. Player 3, however, is equally likely to throw to player 1 or player 2.
a. Show the transition matrix for this process.
b. Over the long run, what proportion of the throws will be made to each of the players?

Example 2
Suppose a car rental agency has three locations in Ottawa: a Downtown location (labeled A), an East end location (labeled B), and a West end location (labeled C). The agency has a group of delivery drivers who serve all three locations. The agency's statistician has determined the following:
1. Of the calls to the Downtown location, 30% are delivered in the Downtown area, 30% in the East end, and 40% in the West end.
2. Of the calls to the East end location, 40% are delivered in the Downtown area, 40% in the East end, and 20% in the West end.
3. Of the calls to the West end location, 50% are delivered in the Downtown area, 30% in the East end, and 20% in the West end.
After making a delivery, a driver goes to the nearest location to make the next delivery. In this way, the location of a specific driver is determined only by his or her previous location.
a. Write the transition matrix to model this system.
b. If a driver starts at the East end location, what is the probability that he or she is at the West end location after four deliveries are completed?
c. Find the long-run behaviour of the drivers' locations.

Example 3
If you have lived in Ottawa for a while, you have surely realized that the weather is a major concern of the population. An unofficial study of the city's weather in early spring yields these observations:
1. It is almost impossible to have two nice days in a row.
2. If we have a nice day, we are equally likely to have snow or rain the next day.
3. If we have snow or rain, there is an even chance of having the same weather the next day.
4. If there is a change from snow or rain, only half of the time is this a change to a nice day.
Since tomorrow's weather depends only on today's, this is a Markov chain process.
a. Write the transition matrix to model this system.
b. If it is nice today, what is the probability that it will be nice one week from now?
c. Find the long-run behaviour of the weather.

Example 4
A panel of three judges must reach a unanimous decision either to approve or to reject a wiretap request. If the number voting for approval on a given ballot is 0 or 3, the matter is settled. If the vote is split, either 1 to 2 or 2 to 1, then on the next ballot there is a 20% chance that the dissenter will give in to the majority, a 10% chance that he will win exactly one of the others over to his position, and a 70% chance that the vote will remain the same. If the initial vote is one for approval and two against:
a. On average, how many more ballots will be needed?
b. What is the probability that the wiretap will eventually be approved?

Numerical sketches for all four examples follow the references.

References:
Joseph Khoury, "Applications to Markov Chains," University of Ottawa, 2011.
Daniel Gallin, "Finite Mathematics," Scott-Foresman and Company, 1984.
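
Sketch for Example 1. The sketches below are not taken from the cited texts; they are minimal numerical checks written in Python with NumPy (an assumed choice of tools, not prescribed by the handout). This first one builds the frisbee transition matrix and solves for the stationary distribution, which gives the long-run proportion of throws received by each player.

# Example 1 sketch: frisbee-tossing chain (assumes NumPy is available).
import numpy as np

# Rows = current thrower, columns = next thrower (i.e. who receives the throw).
P = np.array([[0.0, 1.0, 0.0],    # player 1 always throws to player 2
              [0.0, 0.0, 1.0],    # player 2 always throws to player 3
              [0.5, 0.5, 0.0]])   # player 3 throws to player 1 or 2 equally

# Long-run proportions: the stationary distribution pi with pi P = pi, sum(pi) = 1.
# Solve the linear system given by (P^T - I) pi = 0 plus a normalisation row.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("long-run proportion of throws to each player:", pi)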
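
Sketch for Example 2. The same tools answer part (b) by raising the transition matrix to the fourth power and reading off the East-to-West entry, and part (c) by computing the stationary distribution.

# Example 2 sketch: rental locations A (Downtown), B (East end), C (West end).
import numpy as np

P = np.array([[0.3, 0.3, 0.4],   # from A: 30% to A, 30% to B, 40% to C
              [0.4, 0.4, 0.2],   # from B
              [0.5, 0.3, 0.2]])  # from C

# (b) Start at B; the distribution after four deliveries is the B-row of P^4.
P4 = np.linalg.matrix_power(P, 4)
print("P(West end after 4 deliveries | start East end):", P4[1, 2])

# (c) Long-run behaviour: stationary distribution pi with pi P = pi.
A_sys = np.vstack([P.T - np.eye(3), np.ones(3)])
b_sys = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A_sys, b_sys, rcond=None)
print("long-run proportions at A, B, C:", pi)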
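
Sketch for Example 4. This is an absorbing chain on the number of votes for approval (0 and 3 absorbing, 1 and 2 transient), so the sketch uses the standard fundamental-matrix calculation N = (I - Q)^(-1): the row sums of N give the expected number of further ballots, and N R gives the absorption probabilities.

# Example 4 sketch: absorbing chain on the number of judges voting to approve.
import numpy as np

# From a split vote, the next ballot keeps the split with probability 0.7,
# moves one vote toward the majority with probability 0.2 (dissenter gives in),
# and moves one vote toward the dissenter with probability 0.1.
Q = np.array([[0.7, 0.1],    # from state 1: stay at 1, or move to 2
              [0.1, 0.7]])   # from state 2: move to 1, or stay at 2
R = np.array([[0.2, 0.0],    # from state 1: absorbed at 0 (reject), 3 (approve)
              [0.0, 0.2]])   # from state 2: absorbed at 0, 3

# Fundamental matrix N = (I - Q)^(-1).
N = np.linalg.inv(np.eye(2) - Q)

# (a) Expected number of further ballots from state 1 = sum of the first row of N.
print("expected additional ballots from a 1-2 split:", N[0].sum())

# (b) Absorption probabilities B = N R; entry [0, 1] is eventual approval
# starting from one vote for approval and two against.
B = N @ R
print("probability the wiretap is eventually approved:", B[0, 1])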