Detecting Human Machine Interaction Fingerprints in Continuous Event Data

Audrey E Reinert, Purdue University

Abstract

There is a problem facing human factors and human-computer interaction researchers. While laboratory studies can provide direct measures of human performance, these methods are insufficient for determining whether similar changes in human performance are observable in high volumes of continuous event data. Continuous event data does not contain direct measures of human performance, but it may contain indirect measures. It is not known whether indirect measures of human performance present in continuous event data can be used to predict the delay in responding to an unexpected event or to assess the operator's workload. By developing an interface with distinct difficulty levels that correlated with different measures of experienced workload, we show that a set of variables exists that enables difficulty and response delay to be classified with 95% and 72% accuracy, respectively. Finally, there is evidence to suggest that predictive accuracy is influenced by the sampling rate of the data and the size of the training set.
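
The abstract does not specify the feature set or classifier used. As an illustration only, the sketch below shows one way indirect measures derived from a continuous event log (e.g., event rate and inter-event interval statistics over a window) might be fed to an off-the-shelf classifier to label task difficulty. The features, synthetic data, and scikit-learn model choice are assumptions for demonstration, not the dissertation's method.

```python
# Illustrative sketch only: classifying task difficulty from indirect,
# event-log-derived features. Features, data, and model are assumptions,
# not the method reported in the dissertation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def window_features(timestamps):
    """Indirect measures from one window of event timestamps:
    event rate, mean and std of inter-event intervals."""
    gaps = np.diff(timestamps)
    duration = timestamps[-1] - timestamps[0]
    return [len(timestamps) / duration, gaps.mean(), gaps.std()]

# Synthetic event windows: assume higher difficulty produces slower,
# more variable operator input.
X, y = [], []
for difficulty in (0, 1, 2):              # hypothetical difficulty levels
    for _ in range(200):
        mean_gap = 0.5 + 0.4 * difficulty
        gaps = rng.exponential(mean_gap, size=50)
        X.append(window_features(np.cumsum(gaps)))
        y.append(difficulty)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("difficulty classification accuracy:",
      accuracy_score(y_test, clf.predict(X_test)))
```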

Degree

Ph.D.

Advisors

Landry, Purdue University.

Subject Area

Artificial intelligence|Management|Marketing
