Sequential tests for Markov dependence
Abstract
A sequential likelihood ratio test for Markov dependence in a sequence of observations is considered, and an expression for the asymptotic values of the type-I error probabilities is derived. We study renewal theory for Markov chains and derive an alternative expression for the limiting joint distribution of the overshoots and the hitting positions. We also extend Lai and Siegmund's nonlinear renewal theory to Markov chains. The maximal log-likelihood ratio statistics $\Lambda_n$, $n \geq 1$, play a central role in the test procedure. By means of the implicit function theorem, we write $\Lambda_n$ as a twice continuously differentiable function of a suitable observable statistic, decompose $\Lambda_n$ as the sum of an appropriate Markov random walk and a slowly changing stochastic process, and then apply the nonlinear Markov renewal theorem to obtain the asymptotic distribution of the overshoot at the first passage. From these results we obtain an expression for the asymptotic type-I error probabilities. We also carry out numerical computations, providing tables of type-I error probabilities obtained both from Monte Carlo estimates and from the derived expression.
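The following is a minimal illustrative sketch, not the dissertation's actual procedure, of a Monte Carlo estimate of the type-I error of a sequential likelihood ratio test for first-order Markov dependence in a binary sequence. The stopping boundary `b`, the truncation point `max_n`, the grid of sample sizes at which the statistic is checked, and the omission of the initial-state term are all assumptions made for illustration.

```python
import numpy as np

def max_loglik_iid(x):
    """Maximized log-likelihood of an i.i.d. Bernoulli model for a 0/1 sequence."""
    n = len(x)
    p = x.mean()
    if p in (0.0, 1.0):
        return 0.0
    return n * (p * np.log(p) + (1 - p) * np.log(1 - p))

def max_loglik_markov(x):
    """Maximized log-likelihood of a two-state first-order Markov chain,
    based on transition counts only (the initial state contributes O(1))."""
    counts = np.zeros((2, 2))
    for a, b in zip(x[:-1], x[1:]):
        counts[a, b] += 1
    ll = 0.0
    for a in (0, 1):
        row = counts[a].sum()
        for b in (0, 1):
            if counts[a, b] > 0:
                ll += counts[a, b] * np.log(counts[a, b] / row)
    return ll

def type_I_error_mc(n_rep=2000, max_n=500, b=5.0, p=0.5, seed=0):
    """Monte Carlo estimate of P(Lambda_n crosses b before max_n) under
    H0: the observations are i.i.d. Bernoulli(p)."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_rep):
        x = (rng.random(max_n) < p).astype(int)
        for n in range(20, max_n + 1, 10):  # evaluate Lambda_n on a grid of n
            lam = max_loglik_markov(x[:n]) - max_loglik_iid(x[:n])
            if lam >= b:
                rejections += 1
                break
    return rejections / n_rep

if __name__ == "__main__":
    print("estimated type-I error:", type_I_error_mc())
```

Here $\Lambda_n$ is computed as the difference between the maximized log-likelihoods of the Markov and i.i.d. models, and the estimated type-I error is the fraction of replications in which $\Lambda_n$ crosses the boundary before truncation.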
Degree
Ph.D.
Advisors
Lalley, Purdue University.
Subject Area
Statistics|Mathematics