An R Companion to Applied Regression, 3rd Edition, by John Fox and Sanford Weisberg (Ebook PDF)

Product details:
ISBN-10: 1544336470
ISBN-13: 978-1544336473
Authors: John Fox, Sanford Weisberg
An R Companion to Applied Regression is a broad introduction to the R statistical computing environment in the context of applied regression analysis. John Fox and Sanford Weisberg provide a step-by-step guide to using the free statistical software R, with an emphasis on integrating statistical computing in R with the practice of data analysis, coverage of generalized linear models, and substantial web-based support materials.
The Third Edition has been reorganized and includes a new chapter on mixed-effects models, new and updated data sets, and a de-emphasis on statistical programming, while retaining a general introduction to basic R programming. The authors have substantially updated both the car and effects packages for R for this edition, introducing additional capabilities and making the software more consistent and easier to use. They also advocate an everyday data-analysis workflow that encourages reproducible research. To this end, they provide coverage of RStudio, an interactive development environment for R that allows readers to organize and document their work in a simple and intuitive fashion, and then easily share their results with others. Also included is coverage of R Markdown, showing how to create documents that mix R commands with explanatory text.
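As a taste of the R Markdown workflow described above, here is a minimal sketch of our own (not reproduced from the book) of a document that mixes explanatory text with an executable R code chunk; rendering it in RStudio (the Knit button, or rmarkdown::render()) produces a self-contained report. The file title and chunk label are hypothetical.

    ---
    title: "Duncan regression notes"
    output: html_document
    ---

    The Duncan occupational-prestige data are supplied by the carData package.

    ```{r duncan-model}
    library(carData)                        # makes the Duncan data frame available
    m <- lm(prestige ~ income + education, data = Duncan)
    summary(m)                              # coefficients, standard errors, R-squared
    ```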
“An R Companion to Applied Regression continues to provide the most comprehensive and user-friendly guide to estimating, interpreting, and presenting results from regression models in R.”
–Christopher Hare, University of California, Davis
An R Companion to Applied Regression, 3rd Edition: Table of Contents
1 Getting Started With R and RStudio
1.1 Projects in RStudio
1.2 R Basics
1.2.1 Interacting With R Through the Console
1.2.2 Editing R Commands in the Console
1.2.3 R Functions
1.2.4 Vectors and Variables
1.2.5 Nonnumeric Vectors
1.2.6 Indexing Vectors
1.2.7 User-Defined Functions
1.3 Fixing Errors and Getting Help
1.3.1 When Things Go Wrong
1.3.2 Getting Help and Information
1.4 Organizing Your Work in R and RStudio and Making It Reproducible
1.4.1 Using the RStudio Editor With R Script Files
1.4.2 Writing R Markdown Documents
1.5 An Extended Illustration: Duncan’s Occupational-Prestige Regression
1.5.1 Examining the Data
1.5.2 Regression Analysis
1.5.3 Regression Diagnostics
1.6 R Functions for Basic Statistics
1.7 Generic Functions and Their Methods*
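To give a flavor of the extended illustration in Section 1.5, the following is a minimal sketch of a Duncan occupational-prestige regression; it assumes the car package (which attaches carData, the home of the Duncan data frame) is installed, and the specific calls are our own illustration rather than the book's code.

    library("car")            # also attaches carData, which supplies Duncan
    head(Duncan)              # first rows: type, income, education, prestige
    summary(Duncan$prestige)  # basic summary statistics
    duncan.model <- lm(prestige ~ income + education, data = Duncan)
    summary(duncan.model)     # the generic summary() dispatches to its "lm" method
    qqPlot(duncan.model)      # car: quantile-comparison plot of studentized residuals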
2 Reading and Manipulating Data
2.1 Data Input
2.1.1 Accessing Data From a Package
2.1.2 Entering a Data Frame Directly
2.1.3 Reading Data From Plain-Text Files
2.1.4 Files and Paths
2.1.5 Exporting or Saving a Data Frame to a File
2.1.6 Reading and Writing Other File Formats
2.2 Other Approaches to Reading and Managing Data Sets in R
2.3 Working With Data Frames
2.3.1 How the R Interpreter Finds Objects
2.3.2 Missing Data
2.3.3 Modifying and Transforming Data
2.3.4 Binding Rows and Columns
2.3.5 Aggregating Data Frames
2.3.6 Merging Data Frames
2.3.7 Reshaping Data
2.4 Working With Matrices, Arrays, and Lists
2.4.1 Matrices
2.4.2 Arrays
2.4.3 Lists
2.4.4 Indexing
2.5 Dates and Times
2.6 Character Data
2.7 Large Data Sets in R*
2.7.1 How Large Is “Large”?
2.7.2 Reading and Saving Large Data Sets
2.8 Complementary Reading and References
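The sketch below illustrates the style of data input and manipulation treated in Chapter 2; the file name duncan.csv is hypothetical, and the calls are standard base-R tools rather than code taken from the book.

    # Reading a plain-text, comma-separated file (hypothetical file name):
    Duncan2 <- read.csv("duncan.csv", stringsAsFactors = TRUE)

    # Working with a data frame supplied by a package instead:
    library("carData")
    summary(Duncan)                           # variable summaries, counts of NAs if any
    Duncan$log.income <- log(Duncan$income)   # adding a transformed variable
    prof <- subset(Duncan, type == "prof")    # selecting rows by a condition
    aggregate(prestige ~ type, data = Duncan, FUN = mean)  # aggregating by a factor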
3 Exploring and Transforming Data
3.1 Examining Distributions
3.1.1 Histograms
3.1.2 Density Estimation
3.1.3 Quantile-Comparison Plots
3.1.4 Boxplots
3.2 Examining Relationships
3.2.1 Scatterplots
3.2.2 Parallel Boxplots
3.2.3 More on the plot() Function
3.3 Examining Multivariate Data
3.3.1 Three-Dimensional Plots
3.3.2 Scatterplot Matrices
3.4 Transforming Data
3.4.1 Logarithms: The Champion of Transformations
3.4.2 Power Transformations
3.4.3 Transformations and Exploratory Data Analysis
3.4.4 Transforming Restricted-Range Variables
3.4.5 Other Transformations
3.5 Point Labeling and Identification
3.5.1 The identify() Function
3.5.2 Automatic Point Labeling
3.6 Scatterplot Smoothing
3.7 Complementary Reading and References
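As an illustration of the graphical and transformation tools surveyed in Chapter 3, here is a short sketch of our own using the Prestige data from carData; all of the functions shown are exported by base R or by the car package.

    library("car")                                   # also attaches carData (Prestige)
    hist(Prestige$income)                            # histogram
    densityPlot(Prestige$income)                     # car: nonparametric density estimate
    Boxplot(income ~ type, data = Prestige)          # car: parallel boxplots, labeling outliers
    scatterplot(prestige ~ income, data = Prestige)  # car: scatterplot with fitted smooths
    summary(powerTransform(Prestige$income))         # estimates a normalizing power transformation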
4 Fitting Linear Models
4.1 The Linear Model
4.2 Linear Least-Squares Regression
4.2.1 Simple Linear Regression
4.2.2 Multiple Linear Regression
4.2.3 Standardized Regression Coefficients
4.3 Predictor Effect Plots
4.4 Polynomial Regression and Regression Splines
4.4.1 Polynomial Regression
4.4.2 Regression Splines*
4.5 Factors in Linear Models
4.5.1 A Linear Model With One Factor: One-Way Analysis of Variance
4.5.2 Additive Models With Numeric Predictors and Factors
4.6 Linear Models With Interactions
4.6.1 Interactions Between Numeric Predictors and Factors
4.6.2 Shortcuts for Writing Linear-Model Formulas
4.6.3 Multiple Factors
4.6.4 Interactions Between Numeric Predictors*
4.7 More on Factors
4.7.1 Dummy Coding
4.7.2 Other Factor Codings
4.7.3 Ordered Factors and Orthogonal-Polynomial Contrasts
4.7.4 User-Specified Contrasts*
4.7.5 Suppressing the Intercept in a Model With Factors*
4.8 Too Many Regressors*
4.9 The Arguments of the lm() Function
4.9.1 formula
4.9.2 data
4.9.3 subset
4.9.4 weights
4.9.5 na.action
4.9.6 method, model, x, y, qr*
4.9.7 singular.ok*
4.9.8 contrasts
4.9.9 offset
4.10 Complementary Reading and References
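A minimal sketch of the linear-model syntax developed in Chapter 4, again using the Prestige data; predictorEffects() assumes a recent version of the effects package, and the particular formulas are our own examples.

    library("car")
    library("effects")
    # Additive model with two numeric predictors and a factor:
    m1 <- lm(prestige ~ income + education + type, data = Prestige)
    # Interaction between a numeric predictor and a factor:
    m2 <- lm(prestige ~ income*type + education, data = Prestige)
    S(m2)                         # car's compact alternative to summary()
    plot(predictorEffects(m2))    # predictor effect plots, one per predictor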
5 Coefficient Standard Errors, Confidence Intervals, and Hypothesis Tests
5.1 Coefficient Standard Errors
5.1.1 Conventional Standard Errors of Least-Squares Regression Coefficients
5.1.2 Robust Regression Coefficient Standard Errors
5.1.3 Using the Bootstrap to Compute Standard Errors
5.1.4 The Delta Method for Standard Errors of Nonlinear Functions*
5.2 Confidence Intervals
5.2.1 Wald Confidence Intervals
5.2.2 Bootstrap Confidence Intervals
5.2.3 Confidence Regions and Data Ellipses*
5.3 Testing Hypotheses About Regression Coefficients
5.3.1 Wald Tests
5.3.2 Likelihood-Ratio Tests and the Analysis of Variance
5.3.3 Sequential Analysis of Variance
5.3.4 The Anova() Function
5.3.5 Testing General Linear Hypotheses*
5.4 Complementary Reading and References
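The sketch below collects the kinds of inference calls Chapter 5 discusses, applied to the Duncan regression; Anova(), linearHypothesis(), hccm(), and Boot() are all exported by the car package, though the exact arguments are our own illustration.

    library("car")
    m <- lm(prestige ~ income + education, data = Duncan)
    confint(m)                                  # Wald confidence intervals
    Anova(m)                                    # Type II tests for each term
    linearHypothesis(m, "income = education")   # test of a general linear hypothesis
    sqrt(diag(hccm(m)))                         # robust (sandwich) coefficient standard errors
    b <- Boot(m, R = 999)                       # case-resampling bootstrap of the coefficients
    confint(b)                                  # bootstrap confidence intervals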
6 Fitting Generalized Linear Models
6.1 Review of the Structure of GLMs
6.2 The glm() Function in R
6.3 GLMs for Binary Response Data
6.3.1 Example: Women’s Labor Force Participation
6.3.2 Example: Volunteering for a Psychological Experiment
6.3.3 Predictor Effect Plots for Logistic Regression
6.3.4 Analysis of Deviance and Hypothesis Tests for Logistic Regression
6.3.5 Fitted and Predicted Values
6.4 Binomial Data
6.5 Poisson GLMs for Count Data
6.6 Loglinear Models for Contingency Tables
6.6.1 Two-Dimensional Tables
6.6.2 Three-Dimensional Tables
6.6.3 Sampling Plans for Loglinear Models
6.6.4 Response Variables
6.7 Multinomial Response Data
6.8 Nested Dichotomies
6.9 The Proportional-Odds Model
6.9.1 Testing for Proportional Odds
6.10 Extensions
6.10.1 More on the Anova() Function
6.10.2 Gamma Models
6.10.3 Quasi-Likelihood Estimation
6.10.4 Overdispersed Binomial and Poisson Models
6.11 Arguments to glm()
6.11.1 weights
6.11.2 start, etastart, mustart
6.11.3 offset
6.11.4 control
6.11.5 model, method, x, y
6.12 Fitting GLMs by Iterated Weighted Least Squares*
6.13 Complementary Reading and References
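To illustrate the glm() usage covered in Chapter 6, here is a sketch of a binary logistic regression for women's labor force participation, using the Mroz data shipped with carData; the specification is a standard one for these data rather than code copied from the book.

    library("car")                      # attaches carData, which contains Mroz
    mroz.mod <- glm(lfp ~ k5 + k618 + age + wc + hc + lwg + inc,
                    family = binomial, data = Mroz)
    summary(mroz.mod)                   # coefficients on the log-odds (logit) scale
    Anova(mroz.mod)                     # likelihood-ratio tests for each term
    exp(coef(mroz.mod))                 # odds-ratio interpretation of the coefficients
    head(predict(mroz.mod, type = "response"))  # fitted probabilities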
7 Fitting Mixed-Effects Models
7.1 Background: The Linear Model Revisited
7.1.1 The Linear Model in Matrix Form*
7.2 Linear Mixed-Effects Models
7.2.1 Matrix Form of the Linear Mixed-Effects Model*
7.2.2 An Application to Hierarchical Data
7.2.3 Wald Tests for Linear Mixed-Effects Models
7.2.4 Examining the Random Effects: Computing BLUPs
7.2.5 An Application to Longitudinal Data
7.2.6 Modeling the Errors
7.2.7 Sandwich Standard Errors for Least-Squares Estimates
7.3 Generalized Linear Mixed Models
7.3.1 Matrix Form of the GLMM*
7.3.2 Example: Minneapolis Police Stops
7.4 Complementary Reading
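Chapter 7 turns to mixed-effects models; the sketch below uses the lme4 package's lmer() with its built-in sleepstudy data (not one of the book's data sets) simply to show the random-intercept-and-slope formula syntax.

    library("lme4")                       # lmer() for linear mixed-effects models
    data("sleepstudy", package = "lme4")
    # Random intercept and random slope for Days, varying by Subject:
    mm <- lmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)
    summary(mm)                           # fixed effects and variance components
    fixef(mm)                             # estimated fixed-effect coefficients
    ranef(mm)                             # BLUPs of the random effects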
8 Regression Diagnostics for Linear, Generalized Linear, and Mixed-Effects Models
8.1 Residuals
8.2 Basic Diagnostic Plots
8.2.1 Plotting Residuals
8.2.2 Marginal-Model Plots
8.2.3 Added-Variable Plots
8.2.4 Marginal-Conditional Plots
8.3 Unusual Data
8.3.1 Outliers and Studentized Residuals
8.3.2 Leverage: Hat-Values
8.3.3 Influence Measures
8.4 Transformations After Fitting a Regression Model
8.4.1 Transforming the Response
8.4.2 Predictor Transformations
8.5 Nonconstant Error Variance
8.5.1 Testing for Nonconstant Error Variance
8.6 Diagnostics for Generalized Linear Models
8.6.1 Residuals and Residual Plots
8.6.2 Influence Measures
8.6.3 Graphical Methods: Added-Variable Plots, Component-Plus-Residual Plots, and Effect Plots With Partial Residuals
8.7 Diagnostics for Mixed-Effects Models
8.7.1 Mixed-Model Component-Plus-Residual Plots
8.7.2 Influence Diagnostics for Mixed Models
8.8 Collinearity and Variance Inflation Factors
8.9 Additional Regression Diagnostics
8.10 Complementary Reading and References
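The following sketch gathers several of the car-package diagnostic functions named in Chapter 8 and applies them to the Duncan regression; all of these functions exist in car, but the choice and order shown here are our own.

    library("car")
    m <- lm(prestige ~ income + education, data = Duncan)
    residualPlots(m)        # residuals against each predictor and against fitted values
    outlierTest(m)          # Bonferroni test for the largest studentized residual
    influenceIndexPlot(m)   # Cook's distances, hat-values, and studentized residuals
    avPlots(m)              # added-variable plots
    crPlots(m)              # component-plus-residual plots
    ncvTest(m)              # score test for nonconstant error variance
    vif(m)                  # variance inflation factors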
9 Drawing Graphs
9.1 A General Approach to R Graphics
9.1.1 Defining a Coordinate System: plot()
9.1.2 Graphics Parameters: par()
9.1.3 Adding Graphical Elements: axis(), points(), lines(), text(), et al.
9.1.4 Specifying Colors
9.2 Putting It Together: Explaining Local Linear Regression
9.2.1 Finer Control Over Plot Layout
9.3 Other R Graphics Packages
9.3.1 The lattice Package
9.3.2 The ggplot2 Package
9.3.3 Maps
9.3.4 Other Notable Graphics Packages
9.4 Complementary Reading and References
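As a sketch of the base-R graph-building approach described in Chapter 9, the code below first defines a coordinate system with plot() and then adds elements to it; the data are simulated here purely for illustration.

    set.seed(123)
    x <- sort(runif(100, 0, 10))
    y <- 2 + 0.5*x + rnorm(100)
    plot(x, y, type = "n", xlab = "x", ylab = "y")   # empty plot establishes the coordinates
    points(x, y, pch = 16, col = "gray40")           # add the observations
    lines(lowess(x, y), lwd = 2, col = "blue")       # add a local-regression smooth
    text(2, 8, "lowess smooth", adj = 0)             # add a text label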
10 An Introduction to R Programming
10.1 Why Learn to Program in R?
10.2 Defining Functions: Preliminary Examples
10.2.1 Lagging a Variable
10.2.2 Creating an Influence Plot
10.3 Working With Matrices*
10.3.1 Basic Matrix Arithmetic
10.3.2 Matrix Inversion and the Solution of Linear Simultaneous Equations
10.3.3 Example: Linear Least-Squares Regression
10.3.4 Eigenvalues and Eigenvectors
10.3.5 Miscellaneous Matrix Computations
10.4 Program Control With Conditionals, Loops, and Recursion
10.4.1 Conditionals
10.4.2 Iteration (Looping)
10.4.3 Recursion
10.5 Avoiding Loops: apply() and Its Relatives
10.5.1 To Loop or Not to Loop?
10.6 Optimization Problems*
10.6.1 Zero-Inflated Poisson Regression
10.7 Monte-Carlo Simulations*
10.7.1 Testing Regression Models Using Simulation
10.8 Debugging R Code*
10.9 Object-Oriented Programming in R*
10.10 Writing Statistical-Modeling Functions in R*
10.11 Organizing Code for R Functions
10.12 Complementary Reading and References
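Finally, Chapter 10 introduces programming; the sketch below (our own, using the built-in mtcars data) shows a small user-defined function, the sapply() alternative to an explicit loop, and the matrix formulation of least squares with solve(), in the spirit of Section 10.3.3.

    # A simple user-defined function with a default argument:
    meanOrMedian <- function(x, robust = FALSE) {
        if (robust) median(x, na.rm = TRUE) else mean(x, na.rm = TRUE)
    }
    meanOrMedian(c(1, 2, 100), robust = TRUE)      # returns 2

    # Avoiding an explicit loop with sapply():
    sapply(mtcars[, c("mpg", "hp", "wt")], meanOrMedian)

    # Least-squares regression "by hand" with matrix operations, checked against lm():
    X <- cbind(1, mtcars$hp, mtcars$wt)            # model matrix with an intercept column
    y <- mtcars$mpg
    b <- solve(t(X) %*% X, t(X) %*% y)             # solves (X'X) b = X'y
    cbind(b, coef(lm(mpg ~ hp + wt, data = mtcars)))  # the two sets of estimates agree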


