Open Access Dissertation
Linear regression is a widely used method of analysis that is well understood across many disciplines. Using linear regression requires that a number of assumptions be met; two of these, normality and homoscedasticity of the error distribution, can at best be met only approximately with real data. Quantile regression requires fewer assumptions, which offers a potential advantage over linear regression. In this simulation study, we compare the performance of linear (least squares) regression to that of quantile regression when these assumptions are violated, in order to investigate under what conditions quantile regression becomes the more advantageous method of analysis. Statistical power and coverage percentage were calculated for all simulations, and potential bias was investigated for both quantile regression and linear regression. When errors are skewed, there is a threshold beyond which quantile regression surpasses linear regression in statistical power. When heteroscedasticity is introduced, linear regression does not accurately describe the relationship between the predictor and response variables at the tails of the conditional distribution. When errors are both skewed and heteroscedastic, quantile regression performs drastically better than linear regression: not only does coverage percentage in linear regression suffer, but linear regression also yields misleading results.
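The kind of Monte Carlo power calculation the abstract describes can be sketched as follows. This is an illustrative sketch only, not the author's actual simulation design: the sample size, error distributions, and effect sizes are assumptions, and only the least-squares arm is shown (the quantile-regression arm, e.g. via statsmodels' `QuantReg`, is omitted for brevity). Power is estimated as the fraction of simulated datasets in which the null hypothesis of zero slope is rejected.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_slope_se(x, y):
    # Closed-form OLS slope and its standard error for simple
    # linear regression y = b0 + b1 * x + error.
    n = len(x)
    xc = x - x.mean()
    slope = (xc @ (y - y.mean())) / (xc @ xc)
    intercept = y.mean() - slope * x.mean()
    resid = y - intercept - slope * x
    sigma2 = (resid @ resid) / (n - 2)       # residual variance estimate
    return slope, np.sqrt(sigma2 / (xc @ xc))

def power_sim(beta1, n=50, reps=2000, skewed=False):
    # Empirical power of the OLS slope test at roughly the 5% level
    # (the normal critical value 1.96 is used as an approximation to
    # the t critical value). All settings here are illustrative.
    rejections = 0
    for _ in range(reps):
        x = rng.uniform(0.0, 10.0, n)
        if skewed:
            err = rng.exponential(1.0, n) - 1.0   # right-skewed, mean 0
        else:
            err = rng.normal(0.0, 1.0, n)         # normal benchmark
        y = 1.0 + beta1 * x + err
        slope, se = ols_slope_se(x, y)
        if abs(slope / se) > 1.96:
            rejections += 1
    return rejections / reps
```

Under the null (`beta1 = 0`) the rejection rate should sit near the nominal 5% level; for a nonzero slope it gives the empirical power, which can then be compared with the corresponding rate from a quantile-regression fit on the same simulated datasets.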
Howard, M. (2018). Comparison of the Performance of Simple Linear Regression and Quantile Regression with Non-Normal Data: A Simulation Study. (Doctoral dissertation). Retrieved from https://scholarcommons.sc.edu/etd/4517