With Regression Diagnostics, researchers now have an accessible explanation of the techniques needed for exploring problems that compromise a regression analysis and for determining whether certain assumptions appear reasonable. The book covers such topics as the problem of collinearity in multiple regression, dealing with outlying and influential data, non-normality of errors, non-constant error variance, and the problems and opportunities presented by discrete data. In addition, sophisticated diagnostics based on maximum-likelihood methods, score tests, and constructed variables are introduced.
Multiple regression is a powerful and useful statistical technique, and it and its derivative techniques are workhorses in the social sciences. Much of my own quantitative research is based on regression. The method has real strengths: the results have a pretty straightforward interpretation; you can control for a wide variety of variables; and software (such as SPSS) makes it easy as pie to run.
However, there are problems with multiple regression that the user needs to be aware of. For example, if two independent variables are highly intercorrelated, you may get strange results. High intercorrelations are indicators of the dreaded problem of multicollinearity, so when running a regression one would want to test for it. Thankfully, programs like SPSS allow one to test for multicollinearity in a variety of ways. Likewise, outliers (extreme scores) can throw off results, and the researcher can check whether any are present. Other problems for regression discussed in this slender volume: non-normality, heteroscedasticity, and nonlinearity. The good thing is that contemporary statistical software allows one to check for all of these, as sketched below.
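For readers who work outside SPSS, here is a minimal sketch of the same checks in Python with statsmodels. The simulated variables (x1, x2, y) and the rule-of-thumb cutoffs in the comments are my own illustrative assumptions, not anything prescribed by the book:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor
    from statsmodels.stats.diagnostic import het_breuschpagan
    from statsmodels.stats.stattools import jarque_bera

    # Simulated data with two deliberately intercorrelated predictors.
    rng = np.random.default_rng(0)
    n = 200
    x1 = rng.normal(size=n)
    x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)   # nearly collinear with x1
    y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

    X = sm.add_constant(pd.DataFrame({"x1": x1, "x2": x2}))
    model = sm.OLS(y, X).fit()

    # 1. Multicollinearity: variance inflation factors
    #    (VIF above roughly 10 is a common red flag).
    for i, name in enumerate(X.columns):
        if name != "const":
            print(f"VIF for {name}: {variance_inflation_factor(X.values, i):.1f}")

    # 2. Outliers and influence: studentized residuals and Cook's distance.
    influence = model.get_influence()
    student_resid = influence.resid_studentized_external
    cooks_d = influence.cooks_distance[0]
    print("largest |studentized residual|:", np.abs(student_resid).max())
    print("largest Cook's distance:", cooks_d.max())

    # 3. Heteroscedasticity: Breusch-Pagan test
    #    (a small p-value suggests non-constant error variance).
    bp_stat, bp_pvalue, _, _ = het_breuschpagan(model.resid, X)
    print("Breusch-Pagan p-value:", bp_pvalue)

    # 4. Non-normality of errors: Jarque-Bera test on the residuals.
    jb_stat, jb_pvalue, skew, kurtosis = jarque_bera(model.resid)
    print("Jarque-Bera p-value:", jb_pvalue)

Because x2 is built almost entirely from x1, the VIFs come out very large, which is exactly the kind of warning sign the book teaches you to look for before trusting the coefficient estimates.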
So, a good resource for anyone wanting to run multiple regression while making sure that problems in the data do not confound the results.