Date of Award

12-2017

Degree Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Mechanical Engineering

Committee Chair

Karthik Ramani

Committee Member 1

David Cappelleri

Committee Member 2

Yung-Hsiang Lu

Committee Member 3

Alexander Quinn

Abstract

Advances in electrical components and computing power have introduced new types of sensing capabilities and interfaces. To support emerging interfaces such as augmented and virtual reality (AR/VR), previous studies have suggested building on natural correspondences such as human motions and behaviors [1]. Conventional input methods, however, still rely on built-in motion and touch sensors. Although these approaches are effective and efficient, they do not fully accommodate natural human motions and behaviors as a meaningful input metaphor. Sensing techniques capable of understanding human motions and behaviors will therefore become essential to supporting emerging interfaces. Recent work has explored new input metaphors such as eyes-free [2], around-device [3], and object [4] interactions, enabled by improved or novel sensing techniques. Still, there is room to improve the performance, richness, and accessibility of existing input methods. By investigating novel sensing techniques, we can propose input methods that accommodate these upcoming interfaces.

This thesis introduces novel sensing techniques with hand-driven inputs for natural soft and tangible interfaces. We select the hand as the main interaction medium on the human side because it expresses human intent precisely and with high flexibility [5]. We investigate and design hand-driven inputs while proposing novel magnetic, multimodal, and soft-matter sensing techniques, and we introduce new form factors, a user-customization workflow, instant deployment, larger interaction volumes, and rich input types. Overall, this thesis contributes toward improving the performance, richness, and accessibility of interpreting natural human actions as input intents.

First, magnetic sensing is explored in depth to provide real-time 3D position tracking around a mobile device with a stylus form factor. Magnetic sensing is then extended to bring interactivity to plain objects by embedding a small magnet: using a finger-worn device, we implement an instant, customizable user interface on any given object. This lets the interface be deployed easily while supporting various input types, including continuous and discrete inputs. By turning the 3D volume around the mobile device into an interactive space, we improve the richness of the physical input space; by providing an easily customizable and deployable interface while improving baseline position-tracking performance, we advance accessibility.

Subsequently, we developed a smart textile with multimodal sensing (finger pressing and bending) to support eyes-free input based on the somatosensory tactility of the finger. The prototype improved response time and accuracy in eyes-free interaction use cases. Rich input types were provided by forming an input vocabulary from finger presses and bends, and the wearable form factor keeps users as comfortable as ordinary clothing, which encourages accessibility.

In the final stage of this work, we introduce novel soft-matter sensors. High performance and rich interactions become feasible through a novel sensing algorithm that works with existing electrical impedance tomography (EIT) sensing.
We also propose a multimodal sensing pipeline that combines computational and learning-based sensing methods. This pipeline makes it possible to customize easily and learn instantly a user's physical motions as input intents, which greatly enhances the accessibility of existing soft sensors.
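To make the magnetic-tracking idea concrete, the following is a minimal sketch of estimating a magnet's 3D position from several magnetometer readings by fitting a point-dipole field model with nonlinear least squares. The sensor layout, magnet strength, and the `scipy` optimizer choice are illustrative assumptions, not the dissertation's actual implementation.

```python
# Minimal sketch: estimate a magnet's 3D position from multiple
# magnetometer readings by fitting a point-dipole field model.
# Sensor positions, moment magnitude, and optimizer settings are
# illustrative assumptions, not the dissertation's implementation.
import numpy as np
from scipy.optimize import least_squares

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def dipole_field(p, m, sensor_pos):
    """Flux density at sensor_pos from a dipole at p with moment m."""
    r = sensor_pos - p
    d = np.linalg.norm(r)
    r_hat = r / d
    return MU0 / (4 * np.pi * d**3) * (3 * np.dot(m, r_hat) * r_hat - m)

def residuals(x, sensors, readings):
    """Stack field-prediction errors over all sensors.
    x = [px, py, pz, mx, my, mz]: magnet position and moment."""
    p, m = x[:3], x[3:]
    pred = np.array([dipole_field(p, m, s) for s in sensors])
    return (pred - readings).ravel()

# Four 3-axis magnetometers at the corners of a phone-sized
# rectangle (meters, assumed layout).
sensors = np.array([[0.00, 0.00, 0], [0.07, 0.00, 0],
                    [0.00, 0.14, 0], [0.07, 0.14, 0]])

# Synthetic readings from a known magnet pose, for demonstration.
true_p = np.array([0.03, 0.05, 0.04])
true_m = np.array([0.0, 0.0, 0.1])  # A*m^2, assumed magnet strength
readings = np.array([dipole_field(true_p, true_m, s) for s in sensors])

# Solve from a coarse initial guess; in a live tracker the previous
# frame's estimate would seed the next solve for real-time rates.
x0 = np.array([0.035, 0.07, 0.05, 0.0, 0.0, 0.05])
sol = least_squares(residuals, x0, args=(sensors, readings))
print("estimated position:", sol.x[:3])
```

With four 3-axis sensors there are twelve measurements for six unknowns, so the pose is over-determined; warm-starting each solve from the previous frame is what would keep such a tracker real-time.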
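The smart-textile input vocabulary can likewise be sketched as a mapping from two sensor channels (pressure and bend) to discrete input tokens. The thresholds and gesture labels below are hypothetical placeholders for whatever per-user calibration the actual prototype used.

```python
# Minimal sketch: map a (pressure, bend) sample from a smart textile
# to a discrete input token. Thresholds and labels are hypothetical;
# a real prototype would calibrate them per user and per garment.
from dataclasses import dataclass

@dataclass
class TextileSample:
    pressure: float  # normalized 0..1 from the pressure-sensing layer
    bend: float      # normalized 0..1 from the bend-sensing layer

PRESS_ON = 0.5  # assumed press-detection threshold
BEND_ON = 0.4   # assumed bend-detection threshold

def classify(sample: TextileSample) -> str:
    """Combine the two modalities into a small input vocabulary."""
    pressed = sample.pressure > PRESS_ON
    bent = sample.bend > BEND_ON
    if pressed and bent:
        return "press+bend"  # e.g., a mode-switching chord
    if pressed:
        return "press"       # discrete selection
    if bent:
        return "bend"        # continuous scroll when tracked over time
    return "idle"

print(classify(TextileSample(pressure=0.8, bend=0.1)))  # -> "press"
```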
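Finally, the two-stage soft-sensor pipeline that pairs a computational EIT reconstruction with a learned classifier can be sketched roughly as below. The one-step Tikhonov-regularized reconstruction, the random stand-in sensitivity matrix, and the scikit-learn classifier are all assumptions standing in for the dissertation's actual algorithm.

```python
# Minimal sketch of a two-stage soft-sensor pipeline: a linearized
# (one-step Tikhonov) EIT reconstruction followed by a learned gesture
# classifier. The sensitivity matrix J, electrode count, and classifier
# choice are assumptions, not the dissertation's actual algorithm.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

N_MEAS, N_PIX = 104, 256  # e.g., 16 electrodes; coarse conductivity grid
J = rng.normal(size=(N_MEAS, N_PIX))  # stand-in for a FEM-derived Jacobian

def reconstruct(dv, lam=1e-2):
    """One-step Tikhonov solve: conductivity change from voltage change,
    sigma = (J^T J + lam*I)^-1 J^T dv (linearized EIT reconstruction)."""
    A = J.T @ J + lam * np.eye(N_PIX)
    return np.linalg.solve(A, J.T @ dv)

# Instant per-user training: a few labeled demonstrations per gesture.
# Synthetic voltage frames stand in for real boundary measurements.
labels = ["press", "pinch", "idle"] * 5
frames = [rng.normal(loc=i % 3, size=N_MEAS) for i in range(15)]
features = [reconstruct(f) for f in frames]

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(features, labels)

# At run time: reconstruct a conductivity image, then classify it.
print(clf.predict([reconstruct(rng.normal(loc=1, size=N_MEAS))])[0])
```

The learned stage is what makes such an interface instantly customizable: a handful of user demonstrations retrains the classifier without touching the physics-based reconstruction.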
