Statistical Mechanics of Regularized Least Squares

Document Type: Doctoral Thesis
Author: Bereyhi, Ali

Various practical problems are mathematically described by a regression model in which a linear relation between possibly noisy outputs and their corresponding inputs is to be learned. An effective approach to linear regression is the regularized form of the least-squares method, commonly abbreviated as RLS. This scheme specifies the linear relation such that a penalized version of the error given by the linear model is minimized. The penalty describes information known a priori about the model.
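As a sketch of the scheme described above, RLS can be written as the following minimization; the notation here (measurement matrix A, observation y, regularizer u, and penalty weight λ) is generic and chosen for illustration, not necessarily the dissertation's own:

```latex
\hat{\boldsymbol{x}} \;=\; \operatorname*{argmin}_{\boldsymbol{x}} \;
\bigl\| \boldsymbol{y} - \mathbf{A}\boldsymbol{x} \bigr\|_2^2
\;+\; \lambda \, u(\boldsymbol{x})
```

Here the quadratic term is the least-squares error of the linear model, and the regularizer u encodes the prior information; for instance, u(x) = ||x||_1 expresses a sparsity prior.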

Due to the growing interest in high-dimensional applications, recent studies often seek to characterize RLS in the large-system limit. For various forms of RLS, however, basic analytical tools fail to address this task. For these forms, numerical simulation also suffers from high computational complexity, so assessing the performance numerically is intractable as well. The main objective of this dissertation is to address this issue using analytical tools from statistical mechanics. To this end, the RLS solution of a linear regression model is interpreted as the ground state of a spin glass whose macroscopic parameters are derived via the replica method. Following the relation between the spin glass and its corresponding regression problem, the asymptotic distortion achieved by RLS is derived analytically for an arbitrary distortion function. This asymptotic characterization of RLS leads to a general form of the decoupling principle, which states that the regression outputs given by RLS in a high-dimensional linear model are statistically described by a decoupled scalar setting. When the corresponding spin glass exhibits replica symmetry, this decoupled setting reduces to a scalar system with independent additive Gaussian noise and recovers earlier results in the literature. For cases in which replica symmetry breaks, the Gaussianity and independence of the decoupled noise are violated. Using the derivations via the replica method, the decoupled noise term is further characterized for cases with broken replica symmetry and an arbitrary number of breaking steps.
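Under replica symmetry, the decoupled scalar setting can be sketched as follows; the scalar noise variance and penalty weight below are placeholders for the parameters that the replica fixed-point equations determine, and the symbols are illustrative rather than the dissertation's exact notation:

```latex
\hat{x}_j \;=\; \operatorname*{argmin}_{v \in \mathbb{R}} \;
\bigl( x_j + z_j - v \bigr)^2 \;+\; \lambda_{\mathrm{s}} \, u(v),
\qquad z_j \sim \mathcal{N}\!\left(0, \sigma_{\mathrm{s}}^2\right)
```

That is, each entry of the high-dimensional RLS output is statistically equivalent to a scalar RLS estimator applied to the true entry corrupted by independent additive Gaussian noise; when replica symmetry breaks, this Gaussian, independent form of the noise no longer holds.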

Linear regression has a diverse scope of applications, from statistics and communications to information theory, signal processing and machine learning. To illustrate applications of this study, two examples of linear regression via RLS are investigated in this dissertation. The first example considers the problem of sparse recovery. Using the asymptotic results, the large-system performance of various recovery algorithms is studied and compared to results in the literature. For algorithms with tractable computational complexity, these investigations confirm the validity of the derivations. The results further characterize those recovery algorithms whose performance had remained unknown due to their intractable complexity. The second example investigates multiple-input multiple-output (MIMO) precoding. Here, the problem of MIMO precoding is first interpreted as linear regression, and an RLS-based scheme is proposed. The large-system performance of this scheme is then investigated by means of the asymptotic results and compared to the benchmark. The investigations show close agreement between the asymptotic characterization and finite-dimensional simulations, and demonstrate significant performance enhancements in some particular cases.
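To make the sparse-recovery example concrete, the sketch below solves one tractable instance of RLS, the l1-regularized least squares (LASSO), via iterative soft thresholding (ISTA). This is a minimal illustration of the class of algorithms the dissertation analyzes, not its method; the problem sizes, regularization weight, and step size are hypothetical choices for this example.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (entrywise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """Minimize ||y - A x||^2 / 2 + lam * ||x||_1 by iterative soft thresholding."""
    step = 1.0 / np.linalg.norm(A, ord=2) ** 2  # inverse Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # gradient step on the least-squares term, then the l1 proximal step
        x = soft_threshold(x + step * A.T @ (y - A @ x), step * lam)
    return x

# Hypothetical sparse-regression instance: 10-sparse signal, noisy linear measurements.
rng = np.random.default_rng(0)
n, k = 80, 200
A = rng.standard_normal((n, k)) / np.sqrt(n)
x_true = np.zeros(k)
x_true[rng.choice(k, size=10, replace=False)] = rng.standard_normal(10)
y = A @ x_true + 0.01 * rng.standard_normal(n)

x_hat = ista(A, y, lam=0.02)
```

In the large-system limit, the per-entry distortion of such an estimator is exactly what the replica analysis characterizes, so finite-dimensional runs of this kind can be compared against the asymptotic predictions.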
