Professor Ehsanes Saleh
MEAAD ABDULLAH AHMED ALDABAL
2022-06-05
2021-03-09
2022-06-05
95477
https://drepo.sdl.edu.sa/handle/20.500.14154/67103
The focus of this thesis is to review three basic penalty estimators, namely the ridge regression estimator, the LASSO, and the elastic net estimator, in light of the deficiencies of the least-squares estimator. An ill-conditioned design matrix is the major source of the problem in this case. Ridge regression was developed to overcome this problem, and it opened the door for penalty estimators; its impact is visible in various linear and non-linear models. A remarkable discovery in the class of subset-selection methods is the LASSO (Least Absolute Shrinkage and Selection Operator), which selects subsets and estimates the coefficients simultaneously. Finally, we consider the elastic net penalty estimator, which combines the L$_{1}$ and L$_{2}$ penalty functions; the resulting estimator is a LASSO weighted by a ridge factor. We obtain the L$_{2}$-risk expressions and compare them with those of pre-test and Stein-type estimators. For the location model, we find that the naive elastic net is better than the elastic net estimator, contrary to the conclusion in the current literature. For the regression model, on the other hand, the elastic net performs reasonably well compared with the LASSO and ridge regression.
49
en
A Comparative Study of Ridge, LASSO and Elastic net Estimators
Thesis
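For reference, a minimal sketch of the penalized objectives discussed in the abstract, written in standard notation for the linear model $y = X\beta + \varepsilon$; the tuning parameters $\lambda$, $\lambda_{1}$, $\lambda_{2}$ and the rescaling in the last line follow the usual Zou–Hastie convention and may differ from the thesis's own notation:

\begin{align*}
\hat{\beta}^{\mathrm{ridge}}(\lambda) &= \operatorname*{arg\,min}_{\beta}\ \|y - X\beta\|_{2}^{2} + \lambda\|\beta\|_{2}^{2},\\
\hat{\beta}^{\mathrm{lasso}}(\lambda) &= \operatorname*{arg\,min}_{\beta}\ \|y - X\beta\|_{2}^{2} + \lambda\|\beta\|_{1},\\
\hat{\beta}^{\mathrm{nEN}}(\lambda_{1},\lambda_{2}) &= \operatorname*{arg\,min}_{\beta}\ \|y - X\beta\|_{2}^{2} + \lambda_{1}\|\beta\|_{1} + \lambda_{2}\|\beta\|_{2}^{2} \quad \text{(naive elastic net)},\\
\hat{\beta}^{\mathrm{EN}}(\lambda_{1},\lambda_{2}) &= (1+\lambda_{2})\,\hat{\beta}^{\mathrm{nEN}}(\lambda_{1},\lambda_{2}) \quad \text{(elastic net)}.
\end{align*}

The $(1+\lambda_{2})$ rescaling corrects the extra shrinkage introduced by the ridge part of the combined penalty; this is the sense in which the elastic net can be viewed as a LASSO weighted by a ridge factor.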