1. Hassan, Mawal. Some Statistical Properties of Spectral Regression Estimators.

Degree: MS, School of Mathematical Sciences (COS), 2019, Rochester Institute of Technology

In this thesis we explore different Spectral Regression Estimators in order to solve the problem in regression where multiple columns are linearly dependent. We explore two scenarios:

• Scenario 1: p << n, where there exist at least two columns, xj and xk, that are nearly linearly dependent; this indicates collinearity, and X⊤X becomes near singular.

• Scenario 2: n << p; since there are more predictors than observations, some columns must be linear combinations of other columns, which indicates linear dependence.

Both scenarios yield an ill-conditioned matrix X⊤X (when solving the normal equations): the collinearity makes the matrix singular, which renders the least-squares estimate unstable or impossible to compute. In this thesis we explore different methods (variable selection, regularization, compression, and dimensionality reduction) that address this issue. For variable selection, we use Stepwise Selection regression as well as Best Subset Selection regression. Two approaches to Stepwise Selection are assessed: Forward Selection and Backward Elimination. Performance of our regression models is assessed using criterion-based procedures such as AIC, BIC, R², adjusted R², and Mallows's Cp statistic. In chapter three we introduce the concepts of general regularization and Ridge Regression, as well as subsequent shrinkage methods such as the Lasso, the Bayesian Lasso, and the Elastic Net. Chapter five looks at compression and dimensionality-reduction procedures, outlined via SVD (Singular Value Decomposition) and eigenvector decomposition. Hard thresholding is subsequently introduced via SPCA (Sparse Principal Component Analysis) and a novel approach using RPCA (Robust Principal Component Analysis); RPCA is also shown to aid with data and image compression.
The study concludes with an empirical exploration of all the methods outlined above, using several performance indicators on simulated and real data sets. Assessment of the data sets is done via cross-validation: we determine the optimal values of the tuning settings and then evaluate the predictive and explanatory performance. Advisors/Committee Members: Ernest Fokoue, Robert Parody, Joseph Voelkel.
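The core problem the abstract describes — nearly dependent columns making X⊤X near singular, and a ridge penalty restoring invertibility — can be sketched in a few lines of NumPy. This is an illustrative toy (the column names, sample sizes, and penalty value λ = 1 are my own assumptions, not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3

# Scenario 1 from the abstract: two columns that are nearly linearly dependent
x1 = rng.normal(size=n)
x2 = x1 + 1e-8 * rng.normal(size=n)   # x2 is almost a copy of x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = X @ np.array([1.0, 1.0, 2.0]) + 0.1 * rng.normal(size=n)

# The Gram matrix X^T X from the normal equations is ill-conditioned:
XtX = X.T @ X
print(np.linalg.cond(XtX))            # huge (order 1e15 or more)

# Ridge regression adds lambda * I to the diagonal, restoring invertibility
lam = 1.0
beta_ridge = np.linalg.solve(XtX + lam * np.eye(p), X.T @ y)
print(beta_ridge)                     # finite, stable coefficients
```

Because x1 and x2 are nearly identical, ridge splits their combined effect between them (their coefficients sum to roughly 2 here), while a plain inverse of X⊤X would be numerically unreliable.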

Subjects/Keywords: None provided



APA (6th Edition):

Hassan, M. (2019). Some Statistical Properties of Spectral Regression Estimators. (Master's thesis). Rochester Institute of Technology. Retrieved from

Chicago Manual of Style (16th Edition):

Hassan, Mawal. “Some Statistical Properties of Spectral Regression Estimators.” 2019. Master's Thesis, Rochester Institute of Technology. Accessed May 25, 2019.

MLA Handbook (7th Edition):

Hassan, Mawal. “Some Statistical Properties of Spectral Regression Estimators.” 2019. Web. 25 May 2019.

Vancouver:

Hassan M. Some Statistical Properties of Spectral Regression Estimators. [Internet] [Master's thesis]. Rochester Institute of Technology; 2019. [cited 2019 May 25]. Available from:

Council of Science Editors:

Hassan M. Some Statistical Properties of Spectral Regression Estimators. [Master's Thesis]. Rochester Institute of Technology; 2019. Available from: