Bayesian robustness with shape-constrained priors and mixture priors

Sudip Kumar Bose, Purdue University

Abstract

A common concern with Bayesian analysis is uncertainty in the specification of the prior distribution. The robust Bayesian solution to this difficulty is to work with a class of prior distributions, reflecting uncertainty in elicitation, instead of a single prior. Interest then focuses on the range of the posterior expectations of certain parametric functions as the prior varies over the class being considered; if this range is small, one has robustness. Typical posterior expectations of interest are posterior moments and the posterior probability that the parameter belongs to a given interval. One of the classes of priors that has been considered is the density-ratio bounded (DRB) class of DeRobertis and Hartigan (1981). We look at DRB classes of unimodal and symmetric priors and obtain results for minimising and maximising certain posterior expectations of a single parameter. Because of the shape constraints on the priors, the degree of the final numerical minimisation to which the problem reduces depends on the number of peaks and troughs of a ratio function involving the parametric function and the likelihood. These results do not generalise well to the case of several parameters. Other methods are considered which work for the multiparameter as well as the single-parameter case: mixture classes of various sorts. We look at mixtures of uniforms on different sets, for example rectangles and spheres, and mixtures of spherically symmetric densities. We state theorems for the cases where the mixing distribution lies in a DRB class or in a DB class, and also for the class of $\epsilon$-contaminations by mixture densities. Such classes are quite flexible, allowing various types of priors. In several dimensions there are a number of definitions of symmetry and unimodality; in Chapter 3 we consider these definitions in successive sections. The main difficulty lies in transforming available prior information to construct a suitable mixture class of priors. Some possibilities are suggested for modelling prior beliefs by such classes, for instance by modelling (ranges of) tail thickness.
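
For concreteness, the standard formulation of the robust Bayesian problem for a DRB class can be sketched as follows; the notation ($l$, $u$, $g$, $f$, $\alpha$) is illustrative and is not necessarily that of the dissertation. The DRB class determined by nonnegative bounding functions $l \le u$ is
\[
\Gamma_{l,u} \;=\; \bigl\{\pi : \; l(\theta) \le \alpha\,\pi(\theta) \le u(\theta) \ \text{for all } \theta, \text{ for some } \alpha > 0 \bigr\},
\]
the posterior expectation of a parametric function $g$ under prior $\pi$ and likelihood $f(x \mid \theta)$ is
\[
\rho_g(\pi) \;=\; \frac{\int g(\theta)\, f(x \mid \theta)\, \pi(\theta)\, d\theta}{\int f(x \mid \theta)\, \pi(\theta)\, d\theta},
\]
and robustness is judged by the width of the range
\[
\Bigl[\,\inf_{\pi \in \Gamma_{l,u}} \rho_g(\pi),\ \sup_{\pi \in \Gamma_{l,u}} \rho_g(\pi)\,\Bigr].
\]
The $\epsilon$-contamination classes mentioned above have the analogous generic form $\Gamma_\epsilon = \{\pi = (1-\epsilon)\,\pi_0 + \epsilon\, q : q \in \mathcal{Q}\}$, where $\pi_0$ is an elicited prior and $\mathcal{Q}$ is a class of contaminations (here, mixture densities).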

Degree

Ph.D.

Advisors

Berger, Purdue University.

Subject Area

Statistics
