Doubly separable models and distributed parameter estimation

Hyokun Yun, Purdue University

Abstract

It is well known that stochastic optimization algorithms are both theoretically and practically well-motivated for parameter estimation in large-scale statistical models. Unfortunately, they have generally been considered difficult to parallelize, especially in distributed-memory environments. To address this problem, we first identify that stochastic optimization algorithms can be efficiently parallelized when the objective function is doubly separable; lock-free, decentralized, and serializable algorithms are proposed for stochastically finding minimizers or saddle points of doubly separable functions. We then argue for the usefulness of these algorithms in a statistical context by showing that a large class of statistical models can be formulated as doubly separable functions; this class includes important models such as matrix completion and regularized risk minimization. Motivated by the optimization techniques developed for doubly separable functions, we also propose a novel model for latent collaborative retrieval, an important problem that arises in recommender systems.
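
As a rough sketch of the central definition (the notation here is assumed for illustration, not quoted from the dissertation): a function $f$ of two parameter blocks $W = (w_1, \dots, w_m)$ and $H = (h_1, \dots, h_n)$ is doubly separable when it decomposes as

\[
f(W, H) \;=\; \sum_{i=1}^{m} \sum_{j=1}^{n} f_{ij}(w_i, h_j),
\]

so that each term couples only one component of each block. Matrix completion fits this template with, for example, $f_{ij}(w_i, h_j) = (A_{ij} - \langle w_i, h_j \rangle)^2$ over observed entries $A_{ij}$; because disjoint sets of $(i, j)$ pairs touch disjoint parameters, they can be updated in parallel without locking.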

Degree

Ph.D.

Advisors

Vishwanathan, Purdue University.

Subject Area

Statistics | Artificial intelligence | Computer science
