List of figures and tables ................................. xii
Preface ................................................... xiii
1 Multivariate analysis and the linear regression model
1.1 Introduction ............................................ 1
1.2 Existence of a solution to the normal equation .......... 7
1.3 The concept of wide-sense conditional expectation ...... 10
1.4 Conditional expectation with normal variables .......... 14
1.5 The relation between wide-sense and strict-sense
conditional expectation ................................ 15
1.6 Conditional means and minimum mean-square error ........ 17
1.7 Bayes estimation ....................................... 20
1.8 The relation between Bayes and Gauss-Markov
estimation in the case of a single independent
variable ............................................... 23
1.9 Exercises .............................................. 27
2 Least-squares and Gauss-Markov theory
2.1 Least-squares theory ................................... 30
2.2 Principles of estimation ............................... 31
2.3 The concept of a generalized inverse of a matrix ....... 33
2.4 The matrix Cauchy-Schwarz inequality and an
extension .............................................. 35
2.5 Gauss-Markov theory .................................... 37
2.6 The relation between Gauss-Markov and least-squares
estimators ............................................. 41
2.7 Minimum-bias estimation ................................ 49
2.8 Multicollinearity and the imposition of dummy linear
restrictions ........................................... 51
2.9 Specification error .................................... 55
2.10 Exercises .............................................. 60
3 Multicollinearity and reduced-rank estimation
3.1 Introduction ........................................... 65
3.2 Singular-value decomposition of a matrix ............... 65
3.3 The condition number of a matrix ....................... 68
3.4 The Eckart-Young theorem ............................... 70
3.5 Reduced-rank estimation ................................ 81
3.6 Exercises .............................................. 86
4 The treatment of linear restrictions
4.1 Estimation subject to linear restrictions .............. 88
4.2 Linear aggregation and duality ......................... 92
4.3 Testing linear restrictions ........................... 101
4.4 Reduction of mean-square error by imposition of
linear restrictions ................................... 106
4.5 Uncertain linear restrictions ......................... 108
4.6 Properties of the generalized ridge estimator ......... 109
4.7 Comparison of restricted and generalized ridge
estimators ............................................ 112
4.A Appendix (to Section 4.4): Guide to the computation
of percentage points of the noncentral F
distribution .......................................... 115
4.8 Exercises ............................................. 122
5 Stein estimation
5.1 Stein's theorem and the regression model .............. 126
5.2 Lemmas underlying the James-Stein theorem ............. 132
5.3 Some further developments of Stein estimation ......... 138
5.4 Exercises ............................................. 141
6 Autocorrelation of residuals - 1
6.1 The first-order autoregressive model .................. 143
6.2 Efficiency of trend estimation: the ordinary
least-squares estimator ............................... 147
6.3 Efficiency of trend estimation: the Cochrane-Orcutt
estimator ............................................. 154
6.4 Efficiency of trend estimation: the Prais-Winsten
weighted-difference estimator ......................... 157
6.5 Efficiency of trend estimation: the Prais-Winsten
first-difference estimator ............................ 161
6.6 Discussion of the literature .......................... 162
6.7 Exercises ............................................. 165
7 Autocorrelation of residuals - 2
7.1 Anderson models ...................................... 167
7.2 Testing for autocorrelation: Anderson's theorem and
the Durbin-Watson test ................................ 177
7.3 Distribution and beta approximation of the Durbin-
Watson statistic ...................................... 189
7.4 Bias in estimation of sampling variances .............. 196
7.5 Exercises ............................................. 200
8 Simultaneous-equations estimation
8.1 The identification problem ............................ 202
8.2 Anderson and Rubin's "limited-information maximum-
likelihood" (LIML) method, 1: the handling of linear
restrictions .......................................... 210
8.3 Anderson and Rubin's "limited-information maximum-
likelihood" method, 2: constrained maximization of
the likelihood function ............................... 215
8.4 The contributions of Basmann and Theil ................ 223
8.5 Exact properties of simultaneous-equations
estimators ............................................ 238
8.6 Approximations to finite-sample distributions ......... 251
8.7 Recursive models ...................................... 268
8.8 Exercises ............................................. 283
9 Solutions to the exercises
9.1 Chapter 1 ............................................. 287
9.2 Chapter 2 ............................................. 294
9.3 Chapter 3 ............................................. 304
9.4 Chapter 4 ............................................. 309
9.5 Chapter 5 ............................................. 318
9.6 Chapter 6 ............................................. 323
9.7 Chapter 7 ............................................. 329
9.8 Chapter 8 ............................................. 334
Notes ...................................................... 349
Bibliography ............................................... 357
Index ...................................................... 385