# Northwest Corner and Banded Matrix Approximations to a Countable Markov Chain∗

Yiqiang Q. Zhao, W. John Braun, Wei Li†
Department of Mathematics and Statistics, University of Winnipeg, Winnipeg, MB, Canada R3B 2E9

September 2, 2004

**Abstract.** In this paper, we consider approximations to countable-state Markov chains and provide a simple, direct proof of the convergence of certain probabilistic quantities when a northwest corner or a banded matrix approximation to the original probability transition matrix is used.

## 1 Introduction

Consider a Markov chain with a countable state space and stochastic matrix P. In this paper, we provide simple proofs of convergence when a northwest corner or a banded matrix is used to approximate certain measures associated with P. Our treatment is unified in the sense that these probabilistic quantities are well defined for both ergodic and nonergodic Markov chains, and all results are valid for both approximation methods. Our proofs are simple in the sense that they depend on only one theorem from analysis: Weierstrass' M-test. Our results include the convergence of stationary probability distributions when the Markov chain is ergodic. This work was directly motivated by the need to compute stationary probabilities for infinite-state Markov chains, but applications need not be limited to this.

Computationally, when we solve for the stationary distribution of a countable-state Markov chain, the transition probability matrix has to be: i) truncated in some way to a finite matrix; or ii) banded in some way so that the computer implementation is finite. The second method is highly recommended for preserving the properties of structured probability transition matrices. Two questions naturally arise here: i) in which ways can we truncate or band the transition matrix? and ii) for a selected truncation or banded restriction, does the solution approximate the original probability distribution?

---
∗ This work has been supported by grants from the Natural Sciences and Engineering Research Council of Canada.
† Also with the Institute of Applied Mathematics, Chinese Academy of Sciences.
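As a concrete sketch of approach i), the snippet below builds the n-by-n northwest corner of a simple random walk on the nonnegative integers and approximates its stationary distribution by power iteration. The chain, the function names, and the solver are illustrative assumptions of this note, not constructions taken from the paper:

```python
def nw_corner(n, p=0.3):
    """n-by-n northwest corner of an infinite transition matrix.

    Illustrative chain (not from the paper): a random walk on {0, 1, 2, ...}
    with P[0][0] = 1 - p, P[i][i+1] = p, P[i][i-1] = 1 - p.  The corner is
    substochastic: row n-1 has row sum 1 - p.
    """
    q = 1.0 - p
    P = [[0.0] * n for _ in range(n)]
    P[0][0] = q
    for i in range(n):
        if i + 1 < n:
            P[i][i + 1] = p      # step up
        if i >= 1:
            P[i][i - 1] = q      # step down
    return P


def corner_stationary(P, iters=2000):
    """Approximate the stationary vector of a (sub)stochastic corner by
    renormalised power iteration x <- xP (a hypothetical helper; the paper
    does not prescribe a particular solver)."""
    n = len(P)
    x = [1.0 / n] * n
    for _ in range(iters):
        y = [sum(x[i] * P[i][j] for i in range(n)) for j in range(n)]
        s = sum(y)               # renormalise: the corner loses mass
        x = [v / s for v in y]
    return x
```

For p = 0.3 this walk is ergodic with geometric stationary distribution π_i = (1 − r)r^i, r = p/(1 − p), so the corner solution can be checked against the exact answer as n grows.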


By truncation methods, we refer to all methods in which the northwest corner of the infinite transition matrix is considered and appropriate values (which may all be 0) are added to certain entries of the northwest corner. If the resulting matrix is required to be stochastic, the methods are often referred to as augmentations. If the resulting matrix is the northwest corner itself, the method is called the northwest corner approximation. Approximating the stationary probabilities of an infinite Markov chain in terms of augmentations was initially studied by Seneta [18]. Most of the current results by him, or from his collaboration with other researchers, are included in a paper by Gibson and Seneta [2]. Other researchers include Wolf [24], who used a different approach from that of Seneta et al.; Heyman [5], who provided a probabilistic treatment of the problem; and Grassmann and Heyman [3], who justified convergence for a class of block-structured Markov chains.

There are relatively few references comparing the various methods with respect to truncation error or convergence speed. In many cases, the last-column augmentation produces the minimal truncation error [2] or a faster rate of convergence [3], while the first-column augmentation does the worst. That is the main reason why the last-column augmentation is often preferred. However, the convergence of the probabilities cannot always be guaranteed (see, for example, [2] and [24]). The censoring method provides the minimal error in all cases [25] but is usually difficult to implement.

This paper introduces a new alternative: the northwest corner approximation. We prove that the convergence results hold for all irreducible Markov chains. We do not make any assumptions on the northwest corner; for example, the irreducibility of the truncation which is...
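The first- and last-column augmentations discussed above can be sketched as follows. The helper name and the toy substochastic corner are illustrative assumptions of this note; only the augmentation rule (add each row's missing mass to a chosen column) follows the scheme described in the literature:

```python
def augment(P, col):
    """Column augmentation of a substochastic northwest corner: each row's
    deficient mass 1 - sum(row) is added to entry (i, col).  col = 0 gives
    the first-column augmentation, col = n - 1 the last-column one."""
    Q = [row[:] for row in P]
    for i, row in enumerate(P):
        Q[i][col] += 1.0 - sum(row)
    return Q


# A 3-by-3 substochastic corner (toy numbers, not from the paper):
corner = [[0.5, 0.3, 0.0],
          [0.4, 0.2, 0.3],
          [0.0, 0.4, 0.2]]

first = augment(corner, 0)                 # deficit goes to column 0
last = augment(corner, len(corner) - 1)    # deficit goes to the last column
```

Both results are stochastic matrices; they differ only in where the truncated probability mass is reinserted, which is exactly what drives the differing truncation errors cited above.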