Applied Linear Regression.

By: Weisberg, Sanford
Series: Wiley Series in Probability and Statistics
Publisher: Somerset : John Wiley & Sons, Incorporated, 2013
Copyright date: ©2014
Edition: 4th ed.
Description: 1 online resource (370 pages)
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9781118594858
Subject(s): Linear regression; Mathematics; Regression analysis
Genre/Form: Electronic books
Additional physical formats: Print version: Applied Linear Regression
DDC classification: 519.5/36
LOC classification: QA278.2.W457 2014eb
Contents:
Cover -- Title page -- Copyright page -- Dedication -- Contents -- Preface to the Fourth Edition -- CHAPTER 1: Scatterplots and Regression -- 1.1 Scatterplots -- 1.2 Mean Functions -- 1.3 Variance Functions -- 1.4 Summary Graph -- 1.5 Tools for Looking at Scatterplots -- 1.5.1 Size -- 1.5.2 Transformations -- 1.5.3 Smoothers for the Mean Function -- 1.6 Scatterplot Matrices -- 1.7 Problems -- CHAPTER 2: Simple Linear Regression -- 2.1 Ordinary Least Squares Estimation -- 2.2 Least Squares Criterion -- 2.3 Estimating the Variance σ2 -- 2.4 Properties of Least Squares Estimates -- 2.5 Estimated Variances -- 2.6 Confidence Intervals and t-Tests -- 2.6.1 The Intercept -- 2.6.2 Slope -- 2.6.3 Prediction -- 2.6.4 Fitted Values -- 2.7 The Coefficient of Determination, R2 -- 2.8 The Residuals -- 2.9 Problems -- CHAPTER 3: Multiple Regression -- 3.1 Adding a Regressor to a Simple Linear Regression Model -- 3.1.1 Explaining Variability -- 3.1.2 Added-Variable Plots -- 3.2 The Multiple Linear Regression Model -- 3.3 Predictors and Regressors -- 3.4 Ordinary Least Squares -- 3.4.1 Data and Matrix Notation -- 3.4.2 The Errors e -- 3.4.3 Ordinary Least Squares Estimators -- 3.4.4 Properties of the Estimates -- 3.4.5 Simple Regression in Matrix Notation -- 3.4.6 The Coefficient of Determination -- 3.4.7 Hypotheses Concerning One Coefficient -- 3.4.8 t-Tests and Added-Variable Plots -- 3.5 Predictions, Fitted Values, and Linear Combinations -- 3.6 Problems -- CHAPTER 4: Interpretation of Main Effects -- 4.1 Understanding Parameter Estimates -- 4.1.1 Rate of Change -- 4.1.2 Signs of Estimates -- 4.1.3 Interpretation Depends on Other Terms in the Mean Function -- 4.1.4 Rank Deficient and Overparameterized Mean Functions -- 4.1.5 Collinearity -- 4.1.6 Regressors in Logarithmic Scale -- 4.1.7 Response in Logarithmic Scale -- 4.2 Dropping Regressors.
4.2.1 Parameters -- 4.2.2 Variances -- 4.3 Experimentation versus Observation -- 4.3.1 Feedlots -- 4.4 Sampling from a Normal Population -- 4.5 More on R2 -- 4.5.1 Simple Linear Regression and R2 -- 4.5.2 Multiple Linear Regression and R2 -- 4.5.3 Regression through the Origin -- 4.6 Problems -- CHAPTER 5: Complex Regressors -- 5.1 Factors -- 5.1.1 One-Factor Models -- 5.1.2 Comparison of Level Means -- 5.1.3 Adding a Continuous Predictor -- 5.1.4 The Main Effects Model -- 5.2 Many Factors -- 5.3 Polynomial Regression -- 5.3.1 Polynomials with Several Predictors -- 5.3.2 Numerical Issues with Polynomials -- 5.4 Splines -- 5.4.1 Choosing a Spline Basis -- 5.4.2 Coefficient Estimates -- 5.5 Principal Components -- 5.5.1 Using Principal Components -- 5.5.2 Scaling -- 5.6 Missing Data -- 5.6.1 Missing at Random -- 5.6.2 Imputation -- 5.7 Problems -- CHAPTER 6: Testing and Analysis of Variance -- 6.1 F-Tests -- 6.1.1 General Likelihood Ratio Tests -- 6.2 The Analysis of Variance -- 6.3 Comparisons of Means -- 6.4 Power and Non-Null Distributions -- 6.5 Wald Tests -- 6.5.1 One Coefficient -- 6.5.2 One Linear Combination -- 6.5.3 General Linear Hypothesis -- 6.5.4 Equivalence of Wald and Likelihood-Ratio Tests -- 6.6 Interpreting Tests -- 6.6.1 Interpreting p-Values -- 6.6.2 Why Most Published Research Findings Are False -- 6.6.3 Look at the Data, Not Just the Tests -- 6.6.4 Population versus Sample -- 6.6.5 Stacking the Deck -- 6.6.6 Multiple Testing -- 6.6.7 File Drawer Effects -- 6.6.8 The Lab Is Not the Real World -- 6.7 Problems -- CHAPTER 7: Variances -- 7.1 Weighted Least Squares -- 7.1.1 Weighting of Group Means -- 7.1.2 Sample Surveys -- 7.2 Misspecified Variances -- 7.2.1 Accommodating Misspecified Variance -- 7.2.2 A Test for Constant Variance -- 7.3 General Correlation Structures -- 7.4 Mixed Models -- 7.5 Variance Stabilizing Transformations.
7.6 The Delta Method -- 7.7 The Bootstrap -- 7.7.1 Regression Inference without Normality -- 7.7.2 Nonlinear Functions of Parameters -- 7.7.3 Residual Bootstrap -- 7.7.4 Bootstrap Tests -- 7.8 Problems -- CHAPTER 8: Transformations -- 8.1 Transformation Basics -- 8.1.1 Power Transformations -- 8.1.2 Transforming One Predictor Variable -- 8.1.3 The Box-Cox Method -- 8.2 A General Approach to Transformations -- 8.2.1 The 1D Estimation Result and Linearly Related Regressors -- 8.2.2 Automatic Choice of Transformation of Predictors -- 8.3 Transforming the Response -- 8.4 Transformations of Nonpositive Variables -- 8.5 Additive Models -- 8.6 Problems -- CHAPTER 9: Regression Diagnostics -- 9.1 The Residuals -- 9.1.1 Difference between ê and e -- 9.1.2 The Hat Matrix -- 9.1.3 Residuals and the Hat Matrix with Weights -- 9.1.4 Residual Plots When the Model Is Correct -- 9.1.5 The Residuals When the Model Is Not Correct -- 9.1.6 Fuel Consumption Data -- 9.2 Testing for Curvature -- 9.3 Nonconstant Variance -- 9.4 Outliers -- 9.4.1 An Outlier Test -- 9.4.2 Weighted Least Squares -- 9.4.3 Significance Levels for the Outlier Test -- 9.4.4 Additional Comments -- 9.5 Influence of Cases -- 9.5.1 Cook's Distance -- 9.5.2 Magnitude of Di -- 9.5.3 Computing Di -- 9.5.4 Other Measures of Influence -- 9.6 Normality Assumption -- 9.7 Problems -- CHAPTER 10: Variable Selection -- 10.1 Variable Selection and Parameter Assessment -- 10.2 Variable Selection for Discovery -- 10.2.1 Information Criteria -- 10.2.2 Stepwise Regression -- 10.2.3 Regularized Methods -- 10.2.4 Subset Selection Overstates Significance -- 10.3 Model Selection for Prediction -- 10.3.1 Cross-Validation -- 10.3.2 Professor Ratings -- 10.4 Problems -- CHAPTER 11: Nonlinear Regression -- 11.1 Estimation for Nonlinear Mean Functions -- 11.2 Inference Assuming Large Samples -- 11.3 Starting Values.
11.4 Bootstrap Inference -- 11.5 Further Reading -- 11.6 Problems -- CHAPTER 12: Binomial and Poisson Regression -- 12.1 Distributions for Counted Data -- 12.1.1 Bernoulli Distribution -- 12.1.2 Binomial Distribution -- 12.1.3 Poisson Distribution -- 12.2 Regression Models For Counts -- 12.2.1 Binomial Regression -- 12.2.2 Deviance -- 12.3 Poisson Regression -- 12.3.1 Goodness of Fit Tests -- 12.4 Transferring What You Know about Linear Models -- 12.4.1 Scatterplots and Regression -- 12.4.2 Simple and Multiple Regression -- 12.4.3 Model Building -- 12.4.4 Testing and Analysis of Deviance -- 12.4.5 Variances -- 12.4.6 Transformations -- 12.4.7 Regression Diagnostics -- 12.4.8 Variable Selection -- 12.5 Generalized Linear Models -- 12.6 Problems -- Appendix -- A.1 Website -- A.2 Means, Variances, Covariances, and Correlations -- A.2.1 The Population Mean and E Notation -- A.2.2 Variance and Var Notation -- A.2.3 Covariance and Correlation -- A.2.4 Conditional Moments -- A.3 Least Squares for Simple Regression -- A.4 Means and Variances of Least Squares Estimates -- A.5 Estimating E(Y|X) Using a Smoother -- A.6 A Brief Introduction to Matrices and Vectors -- A.6.1 Addition and Subtraction -- A.6.2 Multiplication by a Scalar -- A.6.3 Matrix Multiplication -- A.6.4 Transpose of a Matrix -- A.6.5 Inverse of a Matrix -- A.6.6 Orthogonality -- A.6.7 Linear Dependence and Rank of a Matrix -- A.7 Random Vectors -- A.8 Least Squares Using Matrices -- A.8.1 Properties of Estimates -- A.8.2 The Residual Sum of Squares -- A.8.3 Estimate of Variance -- A.8.4 Weighted Least Squares -- A.9 The QR Factorization -- A.10 Spectral Decomposition -- A.11 Maximum Likelihood Estimates -- A.11.1 Linear Models -- A.11.2 Logistic Regression -- A.12 The Box-Cox Method for Transformations -- A.12.1 Univariate Case -- A.12.2 Multivariate Case.
A.13 Case Deletion in Linear Regression -- References -- Author Index -- Subject Index.
Summary: Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." (International Statistical Institute). The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illustrates how to develop estimation, confidence, and testing procedures primarily through the use of least squares regression. While maintaining the accessible appeal of each previous edition, Applied Linear Regression, Fourth Edition features: graphical methods stressed in the initial exploratory, analysis, and summarization phases of an analysis; in-depth coverage of parameter estimates in both simple and complex models, transformations, and regression diagnostics; newly added material on topics including testing, ANOVA, and variance assumptions; and updated methodology, such as bootstrapping, cross-validation, binomial and Poisson regression, and modern model selection methods. Applied Linear Regression, Fourth Edition is an excellent textbook for upper-undergraduate and graduate-level students, as well as an appropriate reference guide for practitioners and applied statisticians in engineering, business administration, economics, and the social sciences.
Holdings

Item type | Current library     | Status    | Barcode
Ebrary    | Ebrary Afghanistan  | Available | EBKAF00088060
Ebrary    | Ebrary Algeria      | Available |
Ebrary    | Ebrary Cyprus       | Available |
Ebrary    | Ebrary Egypt        | Available |
Ebrary    | Ebrary Libya        | Available |
Ebrary    | Ebrary Morocco      | Available |
Ebrary    | Ebrary Nepal        | Available | EBKNP00088060
Ebrary    | Ebrary Sudan        | Available |
Ebrary    | Ebrary Tunisia      | Available |

Total holds: 0


Description based on publisher supplied metadata and other sources.

Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2019. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.
