Statistical machine learning in the t-exponential family of distributions

Nan Ding, Purdue University

Abstract

The exponential family of distributions plays an important role in statistics and machine learning; it underlies numerous models such as logistic regression and probabilistic graphical models. However, probabilistic models based on the exponential family are vulnerable to outliers. This dissertation designs machine learning models based on a more general family of distributions, namely the t-exponential family, and shows that efficient inference algorithms exist for these models. We first focus on the classification problem and propose t-logistic regression, which replaces the exponential family in logistic regression with a t-exponential family and is more robust in the presence of label noise. Second, inspired by variational inference in the exponential family, we define a new t-entropy, which is the Fenchel conjugate of the log-partition function of the t-exponential family. By minimizing the t-divergence, the Bregman divergence induced by the t-entropy, between the approximate and the true distribution, we develop efficient variational inference approaches for graphical models based on the t-exponential family. Using our inference procedure, we generalize conditional random fields (CRFs) to t-CRFs and show how a t-divergence based mean-field approach can be used to approximate the log-partition function. Finally, the t-divergence is combined with t-logistic regression to obtain a generalized family of convex and non-convex loss functions for classification. Empirical evaluation of our models on a variety of datasets demonstrates their advantages.
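For context, a minimal sketch of the t-exponential family, following the standard definition used in the Tsallis-statistics literature (the exact parameterization adopted in the dissertation may differ):

\[
\exp_t(x) \;=\;
\begin{cases}
\exp(x) & \text{if } t = 1,\\[4pt]
\big[\,1 + (1-t)\,x\,\big]_+^{\,1/(1-t)} & \text{otherwise,}
\end{cases}
\qquad
p(x \mid \theta) \;=\; \exp_t\!\big( \langle \phi(x), \theta \rangle - g_t(\theta) \big),
\]

where \(\phi(x)\) denotes the sufficient statistics and \(g_t(\theta)\) is the log-partition function that normalizes the distribution. Setting \(t = 1\) recovers the ordinary exponential family, while \(t \neq 1\) yields heavier-tailed distributions, which is the source of the robustness to outliers discussed above.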

Degree

Ph.D.

Advisors

Vishwanathan, Purdue University.

Subject Area

Statistics | Artificial intelligence | Computer science
