Introduction to Linear Regression Analysis, 4th Edition

by Douglas C. Montgomery; Elizabeth A. Peck; G. Geoffrey Vining
Edition: 4th
Format: Hardcover
Pub. Date: 2006-07-01
Publisher(s): Wiley-Interscience
  • This Item Qualifies for Free Shipping!*

    *Excludes marketplace orders.

List Price: $157.50


New Textbook: Sold Out

Used Textbook: Sold Out

eTextbook: Not Available

How Marketplace Works:

  • This item is offered by an independent seller and is not shipped from our warehouse.
  • Item details like edition and cover design may differ from our description; see seller's comments before ordering.
  • Sellers must confirm and ship within two business days; otherwise, the order will be cancelled and refunded.
  • Marketplace purchases cannot be returned to eCampus.com. Contact the seller directly for inquiries; if no response within two days, contact customer service.
  • Additional shipping costs apply to Marketplace purchases. Review shipping costs at checkout.

Summary

A comprehensive and up-to-date introduction to the fundamentals of regression analysis.

The Fourth Edition of Introduction to Linear Regression Analysis describes both the conventional and less common uses of linear regression in the practical context of today's mathematical and scientific research. This popular book blends theory and application to equip the reader with an understanding of the basic principles necessary to apply regression model-building techniques in a wide variety of application environments. It assumes a working knowledge of basic statistics and a familiarity with hypothesis testing and confidence intervals, as well as the normal, t, χ², and F distributions.

Illustrating all of the major procedures employed by the contemporary software packages MINITAB®, SAS®, and S-PLUS®, the Fourth Edition begins with a general introduction to regression modeling, including typical applications. A host of technical tools are outlined, such as basic inference procedures, introductory aspects of model adequacy checking, and polynomial regression models and their variations. The book discusses how transformations and weighted least squares can be used to resolve problems of model inadequacy, and how to deal with influential observations. Subsequent chapters discuss:

  • Indicator variables and the connection between regression and analysis-of-variance models
  • Variable selection and model-building techniques and strategies
  • The multicollinearity problem: its sources, effects, diagnostics, and remedial measures
  • Robust regression techniques, such as M-estimators, and the properties of robust estimators
  • The basics of nonlinear regression
  • Generalized linear models
  • Using SAS® for regression problems

This book is a robust resource that offers solid methodology for statistical practitioners and professionals in engineering, the physical and chemical sciences, economics, management, the life and biological sciences, and the social sciences. Both the accompanying FTP site, which contains data sets, extensive problem solutions, software hints, and PowerPoint® slides, and the book's revised presentation of topics in increasing order of complexity facilitate its use in a classroom setting. With its new exercises and structure, this book is highly recommended for upper-undergraduate and beginning graduate students in mathematics, engineering, and the natural sciences. Scientists and engineers will also find it an excellent choice for reference and self-study.

Author Biography

DOUGLAS C. MONTGOMERY is ASU Foundation Professor of Engineering and Professor of Statistics at Arizona State University.

ELIZABETH A. PECK is Logistics Modeling Specialist at the Coca-Cola Company in Atlanta, Georgia.

G. GEOFFREY VINING is Professor and Head of the Department of Statistics at Virginia Polytechnic Institute and State University. All three authors have published extensively in both journals and books.

Table of Contents

Preface xiii
1. Introduction 1
1.1 Regression and Model Building, 1
1.2 Data Collection, 5
1.3 Uses of Regression, 9
1.4 Role of the Computer, 10
2. Simple Linear Regression 12
2.1 Simple Linear Regression Model, 12
2.2 Least-Squares Estimation of the Parameters, 13
2.2.1 Estimation of β0 and β1, 13
2.2.2 Properties of the Least-Squares Estimators and the Fitted Regression Model, 17
2.2.3 Estimation of σ², 20
2.2.4 Alternate Form of the Model, 21
2.3 Hypothesis Testing on the Slope and Intercept, 22
2.3.1 Use of t Tests, 22
2.3.2 Testing Significance of Regression, 23
2.3.3 Analysis of Variance, 25
2.4 Interval Estimation in Simple Linear Regression, 28
2.4.1 Confidence Intervals on β0, β1, and σ², 28
2.4.2 Interval Estimation of the Mean Response, 30
2.5 Prediction of New Observations, 33
2.6 Coefficient of Determination, 35
2.7 Using SAS for Simple Linear Regression, 36
2.8 Some Considerations in the Use of Regression, 37
2.9 Regression Through the Origin, 41
2.10 Estimation by Maximum Likelihood, 47
2.11 Case Where the Regressor x is Random, 49
2.11.1 x and y Jointly Distributed, 49
2.11.2 x and y Jointly Normally Distributed: Correlation Model, 49
Problems, 54
3. Multiple Linear Regression 63
3.1 Multiple Regression Models, 63
3.2 Estimation of the Model Parameters, 66
3.2.1 Least-Squares Estimation of the Regression Coefficients, 66
3.2.2 Geometrical Interpretation of Least Squares, 74
3.2.3 Properties of the Least-Squares Estimators, 75
3.2.4 Estimation of σ², 76
3.2.5 Inadequacy of Scatter Diagrams in Multiple Regression, 77
3.2.6 Maximum-Likelihood Estimation, 79
3.3 Hypothesis Testing in Multiple Linear Regression, 80
3.3.1 Test for Significance of Regression, 80
3.3.2 Tests on Individual Regression Coefficients, 84
3.3.3 Special Case of Orthogonal Columns in X, 89
3.3.4 Testing the General Linear Hypothesis, 90
3.4 Confidence Intervals in Multiple Regression, 93
3.4.1 Confidence Intervals on the Regression Coefficients, 93
3.4.2 Confidence Interval Estimation of the Mean Response, 94
3.4.3 Simultaneous Confidence Intervals on Regression Coefficients, 96
3.5 Prediction of New Observations, 99
3.6 Using SAS for Basic Multiple Linear Regression, 101
3.7 Hidden Extrapolation in Multiple Regression, 101
3.8 Standardized Regression Coefficients, 105
3.9 Multicollinearity, 109
3.10 Why Do Regression Coefficients Have the Wrong Sign?, 112
Problems, 114
4. Model Adequacy Checking 122
4.1 Introduction, 122
4.2 Residual Analysis, 123
4.2.1 Definition of Residuals, 123
4.2.2 Methods for Scaling Residuals, 123
4.2.3 Residual Plots, 129
4.2.4 Partial Regression and Partial Residual Plots, 134
4.2.5 Using MINITAB and SAS for Residual Analysis, 137
4.2.6 Other Residual Plotting and Analysis Methods, 138
4.3 PRESS Statistic, 141
4.4 Detection and Treatment of Outliers, 142
4.5 Lack of Fit of the Regression Model, 145
4.5.1 Formal Test for Lack of Fit, 145
4.5.2 Estimation of Pure Error from Near Neighbors, 149
Problems, 153
5. Transformations and Weighting to Correct Model Inadequacies 160
5.1 Introduction, 160
5.2 Variance-Stabilizing Transformations, 161
5.3 Transformations to Linearize the Model, 164
5.4 Analytical Methods for Selecting a Transformation, 171
5.4.1 Transformations on y: The Box–Cox Method, 171
5.4.2 Transformations on the Regressor Variables, 174
5.5 Generalized and Weighted Least Squares, 176
5.5.1 Generalized Least Squares, 177
5.5.2 Weighted Least Squares, 179
5.5.3 Some Practical Issues, 180
Problems, 183
6. Diagnostics for Leverage and Influence 189
6.1 Importance of Detecting Influential Observations, 189
6.2 Leverage, 190
6.3 Measures of Influence: Cook's D, 193
6.4 Measures of Influence: DFFITS and DFBETAS, 195
6.5 A Measure of Model Performance, 197
6.6 Detecting Groups of Influential Observations, 198
6.7 Treatment of Influential Observations, 199
Problems, 199
7. Polynomial Regression Models 201
7.1 Introduction, 201
7.2 Polynomial Models in One Variable, 201
7.2.1 Basic Principles, 201
7.2.2 Piecewise Polynomial Fitting (Splines), 207
7.2.3 Polynomial and Trigonometric Terms, 213
7.3 Nonparametric Regression, 214
7.3.1 Kernel Regression, 214
7.3.2 Locally Weighted Regression (Loess), 215
7.3.3 Final Cautions, 219
7.4 Polynomial Models in Two or More Variables, 220
7.5 Orthogonal Polynomials, 226
Problems, 231
8. Indicator Variables 237
8.1 General Concept of Indicator Variables, 237
8.2 Comments on the Use of Indicator Variables, 249
8.2.1 Indicator Variables versus Regression on Allocated Codes, 249
8.2.2 Indicator Variables as a Substitute for a Quantitative Regressor, 250
8.3 Regression Approach to Analysis of Variance, 251
Problems, 256
9. Variable Selection and Model Building 261
9.1 Introduction, 261
9.1.1 Model-Building Problem, 261
9.1.2 Consequences of Model Misspecification, 262
9.1.3 Criteria for Evaluating Subset Regression Models, 265
9.2 Computational Techniques for Variable Selection, 270
9.2.1 All Possible Regressions, 270
9.2.2 Stepwise Regression Methods, 277
9.3 Strategy for Variable Selection and Model Building, 283
9.4 Case Study: Gorman and Toman Asphalt Data Using SAS, 286
Problems, 300
10. Validation of Regression Models 305
10.1 Introduction, 305
10.2 Validation Techniques, 306
10.2.1 Analysis of Model Coefficients and Predicted Values, 306
10.2.2 Collecting Fresh Data – Confirmation Runs, 308
10.2.3 Data Splitting, 310
10.3 Data from Planned Experiments, 318
Problems, 319
11. Multicollinearity 323
11.1 Introduction, 323
11.2 Sources of Multicollinearity, 323
11.3 Effects of Multicollinearity, 326
11.4 Multicollinearity Diagnostics, 331
11.4.1 Examination of the Correlation Matrix, 333
11.4.2 Variance Inflation Factors, 334
11.4.3 Eigensystem Analysis of X'X, 335
11.4.4 Other Diagnostics, 340
11.4.5 SAS Code for Generating Multicollinearity Diagnostics, 341
11.5 Methods for Dealing with Multicollinearity, 341
11.5.1 Collecting Additional Data, 341
11.5.2 Model Respecification, 342
11.5.3 Ridge Regression, 344
11.5.4 Principal-Component Regression, 355
11.5.5 Comparison and Evaluation of Biased Estimators, 360
11.6 Using SAS to Perform Ridge and Principal-Component Regression, 363
Problems, 365
12. Robust Regression 369
12.1 Need for Robust Regression, 369
12.2 M-Estimators, 372
12.3 Properties of Robust Estimators, 384
12.3.1 Breakdown Point, 385
12.3.2 Efficiency, 385
12.4 Survey of Other Robust Regression Estimators, 386
12.4.1 High-Breakdown-Point Estimators, 386
12.4.2 Bounded Influence Estimators, 389
12.4.3 Other Procedures, 391
12.4.4 Computing Robust Regression Estimators, 392
Problems, 393
13. Introduction to Nonlinear Regression 397
13.1 Linear and Nonlinear Regression Models, 397
13.1.1 Linear Regression Models, 397
13.1.2 Nonlinear Regression Models, 398
13.2 Origins of Nonlinear Models, 399
13.3 Nonlinear Least Squares, 403
13.4 Transformation to a Linear Model, 405
13.5 Parameter Estimation in a Nonlinear System, 408
13.5.1 Linearization, 408
13.5.2 Other Parameter Estimation Methods, 414
13.5.3 Starting Values, 415
13.5.4 Computer Programs, 416
13.6 Statistical Inference in Nonlinear Regression, 417
13.7 Examples of Nonlinear Regression Models, 419
13.8 Using SAS PROC NLIN, 420
Problems, 423
14. Generalized Linear Models 427
14.1 Introduction, 427
14.2 Logistic Regression Models, 428
14.2.1 Models with a Binary Response Variable, 428
14.2.2 Estimating the Parameters in a Logistic Regression Model, 430
14.2.3 Interpretation of the Parameters in a Logistic Regression Model, 433
14.2.4 Statistical Inference on Model Parameters, 436
14.2.5 Diagnostic Checking in Logistic Regression, 444
14.2.6 Other Models for Binary Response Data, 446
14.2.7 More Than Two Categorical Outcomes, 447
14.3 Poisson Regression, 449
14.4 The Generalized Linear Model, 454
14.4.1 Link Functions and Linear Predictors, 455
14.4.2 Parameter Estimation and Inference in the GLM, 456
14.4.3 Prediction and Estimation with the GLM, 460
14.4.4 Residual Analysis in the GLM, 461
14.4.5 Overdispersion, 464
Problems, 465
15. Other Topics in the Use of Regression Analysis 475
15.1 Regression Models with Autocorrelated Errors, 475
15.1.1 Source and Effects of Autocorrelation, 475
15.1.2 Detecting the Presence of Autocorrelation, 476
15.1.3 Parameter Estimation Methods, 479
15.2 Effect of Measurement Errors in the Regressors, 486
15.2.1 Simple Linear Regression, 486
15.2.2 Berkson Model, 488
15.3 Inverse Estimation: The Calibration Problem, 488
15.4 Bootstrapping in Regression, 493
15.4.1 Bootstrap Sampling in Regression, 494
15.4.2 Bootstrap Confidence Intervals, 494
15.5 Classification and Regression Trees (CART), 500
15.6 Neural Networks, 502
15.7 Designed Experiments for Regression, 505
Problems, 507
APPENDIX A. Statistical Tables 511
APPENDIX B. Data Sets for Exercises 529
APPENDIX C. Supplemental Technical Material 546
C.1 Background on Basic Test Statistics, 546
C.2 Background from the Theory of Linear Models, 548
C.3 Important Results on SSR and SSRes, 552
C.4 Gauss–Markov Theorem, Var(ε) = σ²I, 558
C.5 Computational Aspects of Multiple Regression, 560
C.6 Result on the Inverse of a Matrix, 562
C.7 Development of the PRESS Statistic, 562
C.8 Development of S²(i), 564
C.9 Outlier Test Based on R-Student, 565
C.10 Independence of Residuals and Fitted Values, 568
C.11 The Gauss–Markov Theorem, Var(ε) = V, 569
C.12 Bias in MSRes When the Model Is Underspecified, 571
C.13 Computation of Influence Diagnostics, 572
C.14 Generalized Linear Models, 573
APPENDIX D. Introduction to SAS 584
D.1 Basic Data Entry, 584
D.2 Creating Permanent SAS Data Sets, 589
D.3 Importing Data from an EXCEL File, 590
D.4 Output Command, 591
D.5 Log File, 591
D.6 Adding Variables to an Existing SAS Data Set, 593
References 594
Index 609

An electronic version of this book is available through VitalSource.

This book is viewable on PC, Mac, iPhone, iPad, iPod Touch, and most smartphones.

By purchasing, you will be able to view this book online, as well as download it, for the chosen number of days.

Digital License

You are licensing a digital product for a set duration. Durations are set forth in the product description, with "Lifetime" typically meaning five (5) years of online access and permanent download to a supported device. All licenses are non-transferable.


A downloadable version of this book is available through the eCampus Reader or compatible Adobe readers.

Applications are available on iOS, Android, PC, Mac, and Windows Mobile platforms.

Please view the compatibility matrix prior to purchase.