# Gauss–Markov Theorem

**Topics:** Gauss–Markov theorem, Matrix, Linear function

**Pages:** 1 (295 words)

**Published:** January 8, 2013

In the model $\mathbf{y} = X\boldsymbol{\beta} + \mathbf{e}$, suppose that the following two conditions on the random vector $\mathbf{e}$ are met:

1. $E(\mathbf{e}) = \mathbf{0}$

2. $\operatorname{Cov}(\mathbf{e}) = \sigma^2 I$

Then the best (minimum variance) linear (linear functions of the $y_i$) unbiased estimator of $\boldsymbol{\beta}$ is given by the least squares estimator; that is, $\hat{\boldsymbol{\beta}} = (X'X)^{-1}X'\mathbf{y}$ is the best linear unbiased estimator (BLUE) of $\boldsymbol{\beta}$.

Proof:
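Before the proof, a quick numerical sketch may help (my own illustration, not part of the original text; the simulated design, noise level, and variable names are all assumptions). It computes the least squares estimator by solving the normal equations $X'X\,\mathbf{b} = X'\mathbf{y}$ and cross-checks it against NumPy's built-in least-squares routine.

```python
import numpy as np

# Illustrative sketch: compute beta_hat = (X'X)^{-1} X'y on simulated data.
rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Least squares estimator: solve the normal equations X'X b = X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against NumPy's own least-squares routine.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(beta_hat, beta_lstsq))
```

In practice one solves the linear system rather than explicitly inverting $X'X$, which is cheaper and numerically better behaved.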

Let $A$ be any $p \times n$ constant matrix and let $\tilde{\boldsymbol{\beta}} = A\mathbf{y}$; $\tilde{\boldsymbol{\beta}}$ is a general linear function of $\mathbf{y}$, which we shall take as an estimator of $\boldsymbol{\beta}$. We must specify the elements of $A$ so that $\tilde{\boldsymbol{\beta}}$ will be the best unbiased estimator of $\boldsymbol{\beta}$. Let $A = (X'X)^{-1}X' + B$. Since $(X'X)^{-1}X'$ is known, we must find $B$ in order to be able to specify $A$. For unbiasedness, we have

$$E(\tilde{\boldsymbol{\beta}}) = E(A\mathbf{y}) = AX\boldsymbol{\beta} = \bigl[(X'X)^{-1}X' + B\bigr]X\boldsymbol{\beta} = \boldsymbol{\beta} + BX\boldsymbol{\beta}.$$

But, to be unbiased, $E(\tilde{\boldsymbol{\beta}})$ must equal $\boldsymbol{\beta}$, and this implies that $BX\boldsymbol{\beta} = \mathbf{0}$ for all $\boldsymbol{\beta}$. Thus, unbiasedness specifies that $BX = 0$. For the property of "best" we must find the matrix $A$ that minimizes $\operatorname{Var}(\tilde{\beta}_i)$ for each $i$, where $\tilde{\beta}_i$ is the $i$th element of $\tilde{\boldsymbol{\beta}}$, subject to the restriction $BX = 0$. To examine this, consider the covariance

$$\operatorname{Cov}(\tilde{\boldsymbol{\beta}}) = \operatorname{Cov}(A\mathbf{y}) = A(\sigma^2 I)A' = \sigma^2\bigl[(X'X)^{-1} + (X'X)^{-1}X'B' + BX(X'X)^{-1} + BB'\bigr].$$

Let $BX = 0$. Then $\operatorname{Cov}(\tilde{\boldsymbol{\beta}}) = \sigma^2\bigl[(X'X)^{-1} + BB'\bigr]$. The diagonal elements of $\sigma^2\bigl[(X'X)^{-1} + BB'\bigr]$ are the respective variances of the $\tilde{\beta}_i$. To minimize each $\operatorname{Var}(\tilde{\beta}_i)$, we must, therefore, minimize each diagonal element of $BB'$. Since $\sigma^2$ and $(X'X)^{-1}$ are constants, we must find a matrix $B$ such that each diagonal element of $BB'$ is a minimum. But $BB'$ is positive semidefinite; hence $(BB')_{ii} \ge 0$. Thus the diagonal elements of $BB'$ will attain their minimum when $(BB')_{ii} = 0$ for all $i$. But $(BB')_{ii} = \sum_j b_{ij}^2$. Therefore, if $(BB')_{ii}$ is to equal 0 for all $i$, it must be true that $b_{ij} = 0$ for all $i$ and $j$. This implies that $B = 0$. The condition $B = 0$ is compatible with the condition of unbiasedness, $BX = 0$. Therefore, $A = (X'X)^{-1}X'$ and $\tilde{\boldsymbol{\beta}} = (X'X)^{-1}X'\mathbf{y} = \hat{\boldsymbol{\beta}}$. This completes the proof.
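The variance comparison at the heart of the proof can also be checked numerically. The sketch below is my own construction (the random design, $\sigma^2$, and all names are assumptions): it builds a matrix $B$ satisfying $BX = 0$ by projecting a random matrix onto the orthogonal complement of the column space of $X$, then verifies that the excess covariance of the competing estimator $A\mathbf{y}$ over least squares equals $\sigma^2 BB'$ and has nonnegative diagonal.

```python
import numpy as np

# Sketch under stated assumptions: compare the covariance of an
# arbitrary linear unbiased estimator Ay with that of least squares.
rng = np.random.default_rng(2)
n, p, sigma2 = 30, 4, 2.0
X = rng.normal(size=(n, p))
XtX_inv = np.linalg.inv(X.T @ X)

# Any B with BX = 0 has rows orthogonal to col(X); build one by
# projecting a random matrix with I - H, where H is the hat matrix.
H = X @ XtX_inv @ X.T
B = rng.normal(size=(p, n)) @ (np.eye(n) - H)
A = XtX_inv @ X.T + B               # a general linear unbiased estimator

cov_ols = sigma2 * XtX_inv          # Cov of the least squares estimator
cov_alt = sigma2 * (A @ A.T)        # Cov(Ay) = sigma^2 * A A'

# The cross terms vanish because BX = 0, leaving sigma^2 * BB' as the
# excess covariance; its diagonal (the variance gaps) is nonnegative.
excess = cov_alt - cov_ols
print(np.diag(excess).min() >= -1e-10)
```

Setting $B = 0$ makes the excess vanish, which is exactly the conclusion of the proof: least squares attains the smallest variance in every component.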
