Multiple Regression

In our daily lives, we come across variables that are related to each other. To study the degree of relationship between such variables, we use correlation; to study the nature of that relationship, we have another measure, known as regression. Together, correlation and regression allow us to build equations with which we can estimate the value of one variable when the values of the other variables are given.


Multiple Regression Definition

Multiple regression analysis is a statistical technique that analyses the relationship between two or more variables and uses that information to estimate the value of the dependent variable. In multiple regression, the objective is to develop a model that relates a dependent variable y to more than one independent variable.

Multiple Regression Formula

In simple linear regression, only one independent variable and one dependent variable are involved. In multiple regression, there is a set of independent variables that helps us better explain or predict the dependent variable y.

The multiple regression equation is given by

y = a + b1x1 + b2x2 + … + bkxk

where x1, x2, …, xk are the k independent variables and y is the dependent variable.
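For illustration, here is a minimal sketch in Python (assuming NumPy is available, and using synthetic data) that estimates a, b1 and b2 for a model with two independent variables by least squares:

```python
import numpy as np

# Synthetic data: y depends on two independent variables plus random noise
rng = np.random.default_rng(42)
x1 = rng.normal(size=50)
x2 = rng.normal(size=50)
y = 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(scale=0.3, size=50)

# Design matrix with a leading column of ones for the intercept a
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares estimates of a, b1 and b2
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b1, b2 = coef
print(f"y = {a:.2f} + {b1:.2f}*x1 + {b2:.2f}*x2")

# Estimate y for new values x1 = 0.5 and x2 = -1.0
y_new = a + b1 * 0.5 + b2 * (-1.0)
print(y_new)
```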


Multiple Regression Analysis Definition

Multiple regression analysis allows us to control explicitly for many other factors that simultaneously influence the dependent variable. The objective of regression analysis is to model the relationship between a dependent variable and one or more independent variables. Let k represent the number of independent variables, denoted by x1, x2, x3, …, xk. Such an equation is useful for predicting the value of y when the values of the x variables are known.
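As a sketch of how this looks in practice (assuming the pandas and statsmodels libraries, with made-up synthetic data), each estimated coefficient bi measures the effect of xi on y while holding the other predictors fixed, and the fitted model can then predict y when the x values are known:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data set with three independent variables
rng = np.random.default_rng(1)
df = pd.DataFrame(rng.normal(size=(80, 3)), columns=["x1", "x2", "x3"])
df["y"] = 1.0 + 0.6 * df["x1"] - 1.2 * df["x2"] + 0.1 * df["x3"] + rng.normal(scale=0.5, size=80)

# Fit y = a + b1*x1 + b2*x2 + b3*x3 by ordinary least squares
X = sm.add_constant(df[["x1", "x2", "x3"]])
model = sm.OLS(df["y"], X).fit()

print(model.params)               # estimated a, b1, b2, b3
print(model.predict(X.iloc[:3]))  # predicted y for the first three rows
```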

Stepwise Multiple Regression

Stepwise regression is a step-by-step process that begins by developing a regression model with a single predictor variable and then adds or deletes predictor variables one at a time. Stepwise multiple regression is thus a method of determining a regression equation that starts with a single independent variable and adds independent variables one by one. It is also known as the forward selection method, because we begin with no independent variables and add one independent variable to the regression equation at each iteration. There is another method, called the backward elimination method, which begins with the entire set of variables and eliminates one independent variable at each iteration.
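The following Python sketch illustrates the forward-selection idea (an assumed, simplified implementation built on statsmodels, using adjusted R² as the entry criterion; real stepwise procedures often use significance tests instead):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_select(X, y):
    """Add one predictor per iteration, keeping the candidate that most
    improves adjusted R-squared; stop when no candidate improves it."""
    remaining = list(X.columns)
    selected = []
    best_adj_r2 = -np.inf
    while remaining:
        scores = []
        for candidate in remaining:
            cols = selected + [candidate]
            fit = sm.OLS(y, sm.add_constant(X[cols])).fit()
            scores.append((fit.rsquared_adj, candidate))
        top_r2, top_candidate = max(scores)
        if top_r2 <= best_adj_r2:
            break                      # no further improvement, so stop
        best_adj_r2 = top_r2
        selected.append(top_candidate)
        remaining.remove(top_candidate)
    return selected

# Example usage with synthetic data: x1 and x2 carry the signal, x3 is pure noise
rng = np.random.default_rng(7)
X = pd.DataFrame(rng.normal(size=(100, 3)), columns=["x1", "x2", "x3"])
y = 3.0 + 2.0 * X["x1"] - 1.0 * X["x2"] + rng.normal(scale=0.5, size=100)
print(forward_select(X, y))   # predictors chosen, in order of entry
```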

Residual: The variation in the dependent variable that is not explained by the regression model is called the residual or error variation. A residual is the difference between an observed value of y and the value predicted by the regression equation; it is also known as random error, or sometimes just “error”, and it reflects random variation due to sampling and to factors not captured by the model.
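As a small illustration (with made-up numbers), each residual is simply an observed value of y minus the value predicted by the fitted regression equation:

```python
import numpy as np

y_observed = np.array([3.1, 4.0, 5.2, 6.1])
y_predicted = np.array([3.0, 4.2, 5.0, 6.3])   # values from a fitted regression equation

residuals = y_observed - y_predicted            # variation left unexplained by the model
print(residuals)
```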

Advantages of Stepwise Multiple Regression

  • Only independent variables with non-zero regression coefficients are included in the regression equation.
  • The changes in the multiple standard error of estimate and in the coefficient of determination are shown at each step.
  • The stepwise multiple regression is efficient in finding the regression equation with only significant regression coefficients.
  • The steps involved in developing the regression equation are clear.

Multivariate Multiple Regression

So far, statistical inference has mostly been kept at the bivariate level. Inferential statistical tests have also been developed for multivariate analyses, which examine the relationships among more than two variables. A commonly used extension of correlation analysis to multivariate inference is multiple regression analysis, which shows the relationship between the dependent variable and each of the independent variables.

Multicollinearity

Multicollinearity is the term used to describe the case in which the inter-correlation among the predictor variables is high.

Signs of Multicollinearity

  • High correlation between pairs of predictor variables (a quick check is sketched after this list).
  • The magnitudes or signs of the regression coefficients do not make good physical sense.
  • Statistically non-significant regression coefficients on predictors that are known to be important.
  • Extreme sensitivity of the magnitude or sign of the regression coefficients to the insertion or deletion of a predictor variable.
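A quick way to check the first sign above is to inspect the pairwise correlations of the predictors and their variance inflation factors. Here is a sketch assuming the pandas and statsmodels libraries, with synthetic data in which x2 is almost a copy of x1:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Synthetic predictors: x2 is nearly a copy of x1, so the two are highly inter-correlated
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = 0.95 * x1 + rng.normal(scale=0.1, size=100)
x3 = rng.normal(size=100)
X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})

# Sign 1: high correlation between pairs of predictor variables
print(X.corr().round(2))

# Variance inflation factors: values far above roughly 5-10 flag multicollinearity
exog = np.column_stack([np.ones(len(X)), X.values])   # include the intercept column
for i, name in enumerate(X.columns, start=1):
    print(name, round(variance_inflation_factor(exog, i), 1))
```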
