Methods for Analyzing Natural Patterns and Physical Ergonomics of Human Gestures in Mid-Air Interaction
Recent advances in sensing technologies have produced low-cost human motion tracking methods such as depth-sensing cameras and hand-held/wearable motion sensors. These methods are being leveraged to study and adopt natural human gestures for the design of mid-air interaction in various environments, including virtual/augmented reality, video gaming, and 3D shape modeling. To evaluate and optimize user experience during such interaction, it is necessary to analyze various human factors of mid-air interfaces, such as naturalness, intuitiveness, ease of use, and physical ergonomics. This thesis focuses on analyzing two fundamental human factors in mid-air interaction: (1) natural gesture inputs to control the interfaces and (2) physical fatigue during the interaction. A common method for observing and understanding how users express gestures during mid-air interaction is the elicitation study. Such studies are useful for identifying the most acceptable gesture patterns among candidate user groups and for gaining insights into the design of a natural gesture vocabulary. However, they commonly require time-consuming analysis of user data to identify and categorize gesture patterns. In addition, manual analysis cannot describe gestures in as much detail as data-based representations of motion features can. Quantifying cumulative arm fatigue is another critical factor in evaluating and optimizing user experience during prolonged mid-air interactions. A reasonably accurate estimation of fatigue requires an estimate of an individual's strength; however, in human-computer interaction (HCI) research, there is no easy-to-access method for measuring individual strength that accommodates inter-individual differences. Moreover, fatigue is influenced by both psychological and physiological factors, yet no current HCI model provides good estimates of cumulative subjective fatigue.
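The contrast drawn above, between manual categorization of elicited gestures and data-based representations of motion features, can be illustrated with a minimal sketch. Everything here is hypothetical and not from the thesis: two toy features (path length and bounding-box diagonal) summarize each tracked trajectory, and a greedy similarity grouping stands in for the machine learning techniques the thesis integrates into its visual analytics systems.

```python
import math

def motion_features(trajectory):
    """Summarize a gesture trajectory (a list of (x, y, z) wrist positions)
    as a small feature vector: total path length and bounding-box diagonal.
    These two features are illustrative placeholders, not the thesis's features."""
    path = sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))
    lo = [min(axis) for axis in zip(*trajectory)]
    hi = [max(axis) for axis in zip(*trajectory)]
    return (path, math.dist(lo, hi))

def group_by_similarity(gestures, tol=0.5):
    """Greedy clustering: assign each gesture to the first existing group
    whose representative feature vector lies within `tol` (Euclidean)."""
    groups = []  # list of (representative_features, member_indices)
    for i, gesture in enumerate(gestures):
        feats = motion_features(gesture)
        for rep, members in groups:
            if math.dist(rep, feats) <= tol:
                members.append(i)
                break
        else:
            groups.append((feats, [i]))
    return groups

# Two near-identical small swipes and one large, sweeping gesture:
swipe_a = [(0, 0, 0), (0.10, 0, 0), (0.20, 0, 0)]
swipe_b = [(0, 0, 0), (0.11, 0, 0), (0.21, 0, 0)]
big = [(0, 0, 0), (1, 1, 0), (2, 0, 0), (1, -1, 0)]
groups = group_by_similarity([swipe_a, swipe_b, big])
print(len(groups))  # 2: the swipes merge, the large gesture stands alone
```

The point of the sketch is only that feature vectors make similarity quantitative, whereas a human coder must judge it by inspection; a real pipeline would use richer kinematic features and proper clustering.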
This thesis proposes methods to address the aforementioned challenges of analyzing the naturalness and physical ergonomics of mid-air gestural interaction. First, we present visual analytics systems that integrate interactive data visualizations and machine learning techniques to aid HCI researchers in analyzing and understanding natural gesture behavior. We also discuss the usability of such analytics systems for performing pattern analysis of gestures. The other part of this thesis proposes a framework for evaluating arm fatigue during mid-air interaction. We first present a simple and effective method to estimate subject-specific arm strength. Then, we introduce a cumulative fatigue model informed by subjective and biomechanical measures. Unlike existing fatigue evaluation methods, which commonly require expensive and impractical measurements, our framework provides easily adaptable and implementable methods for HCI research. Finally, we conclude the thesis by summarizing research contributions and discussing directions for future work.
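The role that subject-specific strength plays in such a fatigue framework can be sketched with a toy model. This is not the thesis's model: the exponential growth/recovery form, the rate constants, and the function names are all illustrative assumptions, loosely in the spirit of compartment-style fatigue models. It shows only why the same task load produces different cumulative fatigue once individual maximum strength enters the estimate.

```python
def cumulative_fatigue(torques, max_strength, dt=1.0,
                       k_fatigue=0.05, k_recover=0.02):
    """Toy cumulative-fatigue estimate (illustrative only).

    torques      : required shoulder torque at each time step (N*m)
    max_strength : the individual's maximum shoulder torque (N*m), the
                   subject-specific quantity the framework estimates
    Fatigue grows with relative effort and decays toward rest; the
    rates k_fatigue and k_recover are arbitrary illustrative constants.
    Returns the fatigue level in [0, 1] at each step.
    """
    fatigue, levels = 0.0, []
    for tau in torques:
        effort = min(abs(tau) / max_strength, 1.0)   # relative load
        fatigue += dt * (k_fatigue * effort - k_recover * fatigue)
        fatigue = max(0.0, min(1.0, fatigue))
        levels.append(fatigue)
    return levels

# For the same mid-air task, a stronger user accumulates less fatigue:
task = [10.0] * 60  # 60 s of holding ~10 N*m at the shoulder
weak = cumulative_fatigue(task, max_strength=20.0)
strong = cumulative_fatigue(task, max_strength=40.0)
print(weak[-1] > strong[-1])  # True
```

Because the curve depends directly on `max_strength`, any practical fatigue framework needs an accessible way to estimate that per-subject quantity, which is exactly the gap the thesis addresses.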
Ramani, Purdue University.
Mechanical engineering; Computer science