Structured matrix computations via randomization, their analysis, and applications

Yuanzhe Xi, Purdue University

Abstract

This dissertation presents several fast and stable algorithms for both dense and sparse matrices based on rank-structured matrix techniques. In recent years, researchers have found that in many cases dense matrices have off-diagonal blocks of low (numerical) rank, and can thus be represented or approximated by compact rank-structured forms. Matrix operations involving these compact forms are often very efficient. In particular, hierarchically semiseparable (HSS) representations have a binary tree structure and are widely used in the development of novel direct solvers.

First, we propose several randomized HSS algorithms. We integrate randomized sampling techniques into the HSS construction procedures and greatly improve their efficiency compared with classical HSS algorithms. Two different HSS construction schemes are proposed. Let r be the HSS rank and n the matrix size. One method requires O(r) matrix-vector multiplications together with O(n) entries of the matrix; the other dynamically detects the numerical rank and relies on only O(r²) matrix-vector multiplications without accessing any entry of the matrix. We then study the inherent structures in the HSS generators produced by the randomized construction schemes and present specialized ULV-type algorithms for linear system solution, one URV and two structured normal-equation algorithms for linear least squares solution, and a structured LDL factorization and update algorithm for symmetric eigenvalue problems. All of these algorithms require roughly O(n) flops.

Second, we provide rigorous analysis of classical HSS algorithms. We show that the approximation errors in the HSS construction procedures are controllable under either classical or randomized compression. In addition, both HSS ULV/URV factorizations and HSS matrix-vector multiplications are backward stable. We also prove that HSS ULV linear system solutions and HSS URV linear least squares solutions are structured backward stable. Through a concrete example, we demonstrate that the HSS inversion algorithm is not as numerically stable as the HSS ULV algorithms. Moreover, we show that inertia can be estimated accurately with low-accuracy structured HSS forms; this is justified analytically for a special matrix and supported numerically for a general one. Based on this, we propose a fast eigenvalue solver built on a bisection scheme. We also study the rank structures hidden in general sparse matrix inverses: even though the inverse of a sparse matrix is fully dense, all of its diagonal blocks have HSS structures and its off-diagonal blocks are in low-rank form.

Third, we apply the newly developed rank-structured techniques to several important applications. For example, to solve Toeplitz matrix problems we transform them into Fourier space and solve the resulting Cauchy-like systems with the randomized HSS algorithms, which have nearly linear complexity. We can further transform the Cauchy-like matrix into a Cauchy matrix; this Cauchy matrix is independent of the entries of the original Toeplitz matrix and can be approximated with O(n log³ n) flops via a strong rank-revealing LU factorization, after which the HSS approximation to the Cauchy-like matrix is obtained at O(n log n) cost. We also incorporate HSS matrix techniques into sparse schemes such as the multifrontal method to replace dense operations with structured ones.
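The randomized compression step that drives these HSS constructions can be illustrated with a minimal, self-contained sketch. The function name, the oversampling parameter, and the dense NumPy setup below are illustrative assumptions for exposition, not the dissertation's implementation; the actual algorithms apply this idea hierarchically to the off-diagonal blocks, so a single set of products with the whole matrix serves all blocks.

```python
import numpy as np

def randomized_range(matvec, n, r, oversample=10, seed=0):
    """Orthonormal basis for the numerical range of an n-by-n operator,
    accessed only through its action on blocks of vectors (the sampling
    idea behind matrix-free rank-structured construction)."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((n, r + oversample))  # random test matrix
    Y = matvec(Omega)             # r + oversample matrix-vector products
    Q, _ = np.linalg.qr(Y)        # basis of the sampled range
    return Q

# Toy check on an exactly rank-8 matrix: the residual should be near zero.
n, rank = 500, 8
G = np.random.default_rng(1).standard_normal((n, rank))
A = G @ G.T
Q = randomized_range(lambda X: A @ X, n, r=rank)
print(np.linalg.norm(A - Q @ (Q.T @ A)))
```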
On top of nested dissection, we define ordering rules for the separator pieces that not only reduce fill-in in the factors but also preserve the low-rank property of the intermediate dense matrices. We then generalize the aggressive truncation theorem for dense eigenvalue problems and propose a structured sparse eigensolver using bisection and approximate LDL factorizations; this is especially useful for finding a finite number of eigenvalues within given intervals. We also present a structured selected inversion scheme to compute all the diagonal entries of a symmetric sparse matrix inverse. When the matrices have the low-rank property, the selected inversion costs nearly O(n) with about O(n) memory for both 2D and 3D problems.
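As a plain dense illustration of the bisection idea behind both the dense and the sparse structured eigensolvers: by Sylvester's law of inertia, an LDL factorization of the shifted matrix A - sigma*I reveals how many eigenvalues lie below sigma, and bisecting on that count isolates individual eigenvalues. The sketch below uses an exact dense factorization (scipy.linalg.ldl) where the dissertation uses fast, low-accuracy structured factorizations; the names and parameters are illustrative, not the dissertation's code.

```python
import numpy as np
from scipy.linalg import ldl

def eigs_below(A, sigma):
    """Count the eigenvalues of the symmetric matrix A below sigma: by
    Sylvester's law of inertia this equals the number of negative
    eigenvalues of D in the LDL^T factorization of A - sigma*I."""
    _, D, _ = ldl(A - sigma * np.eye(A.shape[0]))
    return int(np.sum(np.linalg.eigvalsh(D) < 0))  # D has 1x1/2x2 blocks

def kth_eigenvalue(A, k, lo, hi, tol=1e-10):
    """Bisect [lo, hi] on the inertia count to locate the k-th smallest
    eigenvalue of A, assumed to lie inside the interval."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if eigs_below(A, mid) >= k:
            hi = mid   # at least k eigenvalues sit below mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Toy check against a direct dense eigensolver.
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
A = B + B.T
print(kth_eigenvalue(A, k=3, lo=-30.0, hi=30.0))
print(np.sort(np.linalg.eigvalsh(A))[2])  # should agree to about 1e-10
```

The dissertation's observation that inertia remains accurate even under low-accuracy HSS approximation is what allows each bisection step to use a cheap structured factorization in place of the exact one above.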

Degree

Ph.D.

Advisors

Jianlin Xia, Purdue University.

Subject Area

Applied Mathematics; Mathematics
