In this lesson, we make our first (and last?!) major jump in the course. We move from the simple linear regression model with one predictor to the multiple linear regression model with two or more predictors. That is, we use the adjective "simple" to denote that our model has only one predictor, and we use the adjective "multiple" to indicate that our model has at least two predictors.

In the multiple regression setting, because of the potentially large number of predictors, it is more efficient to use matrices to define the regression model and the subsequent analyses. This lesson considers some of the more important multiple regression formulas in matrix form. If you're unsure about any of this, it may be a good time to take a look at this Matrix Algebra Review.

The good news is that everything you learned about the simple linear regression model extends, with at most minor modifications, to the multiple linear regression model. Think about it: you don't have to forget all of that good stuff you learned! In particular:

- The models have similar "LINE" assumptions. The only real difference is that whereas in simple linear regression we think of the distribution of errors at a fixed value of the single predictor, with multiple linear regression we have to think of the distribution of errors at a fixed set of values for all the predictors.
- All of the model-checking procedures we learned earlier are useful in the multiple linear regression framework, although the process becomes more involved since we now have multiple predictors.
- The use and interpretation of \(R^2\) in the context of multiple linear regression remains the same. We'll explore this issue further in Lesson 7.
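To make the matrix formulation concrete, here is a minimal sketch in Python with NumPy. The data values are made up purely for illustration; the point is that the least squares estimate for a model with any number of predictors is \(b = (X'X)^{-1}X'y\), and \(R^2\) is computed and interpreted exactly as in the simple regression case.

```python
import numpy as np

# Illustrative data: n = 5 observations, an intercept column,
# and two predictors x1 and x2 (values are invented for this sketch).
X = np.array([
    [1.0,  2.0, 3.0],
    [1.0,  4.0, 1.0],
    [1.0,  6.0, 5.0],
    [1.0,  8.0, 2.0],
    [1.0, 10.0, 7.0],
])
y = np.array([5.0, 7.0, 13.0, 12.0, 20.0])

# Least squares estimate b = (X'X)^{-1} X'y, computed via a
# numerically stable solver rather than an explicit inverse.
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Fitted values and R^2, with the same meaning as in simple regression:
# the proportion of variation in y explained by the predictors.
y_hat = X @ b
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print("coefficients:", b)
print("R^2:", r2)
```

Adding more predictors only widens the matrix `X`; the formula and the code do not change, which is exactly why the matrix form is preferred in the multiple regression setting.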