Abstract
As computing becomes more ubiquitous, there is a need for distributed intelligent human-computer interfaces that can perceive and interpret a user's actions through sensors that see, hear and feel. A perceptually intelligent interface enables a more natural interaction between a user and a machine in the sense that the user can look at, talk to or touch an object instead of using a machine language. Although research on haptic (i.e., touch-based) interfaces has received less attention in the past as compared to that on visual and auditory interfaces, it is emerging as a new interdisciplinary field that holds much promise for the future. The goal of the sensing chair project is to enable a computer to track, in real time, the sitting postures of a user through the use of surface-mounted contact sensors. Given the similarity between a pressure distribution map from the contact sensors and a gray-level image, we propose to adapt computer vision and pattern recognition algorithms for the analysis of sitting pressure data. Work in three areas is proposed: (1) data collection for a sitting pressure distribution database, (2) development of a real-time sitting posture tracking system, and (3) performance evaluation of the tracking system. The realization of a robust, real-time tracking system will lead to many exciting applications such as automatic control of airbag deployment forces, ergonomics of furniture design, and biometric authentication for computer security.
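To make the gray-level-image analogy concrete, the sketch below (illustrative only, not the authors' implementation) shows how raw contact-sensor pressures might be normalized into an 8-bit grayscale image so that standard vision and pattern recognition code can operate on it. The 42x48 grid size, the pressure range, and the function name are assumptions for illustration.

```python
# Minimal sketch: treating a chair's pressure distribution map as a
# gray-level image. Grid size and pressure range are assumed values,
# not taken from the report.
import numpy as np

def pressure_to_gray(pressure_map: np.ndarray, p_max: float = 255.0) -> np.ndarray:
    """Normalize raw sensor pressures to an 8-bit gray-level image."""
    clipped = np.clip(pressure_map, 0.0, p_max)          # guard against out-of-range readings
    return (clipped / p_max * 255.0).astype(np.uint8)    # scale to 0-255 grayscale

# Example: a hypothetical 42x48 contact-sensor sheet with random readings.
raw = np.random.uniform(0.0, 200.0, size=(42, 48))
gray = pressure_to_gray(raw)
print(gray.shape, gray.dtype, gray.min(), gray.max())
```

Once the pressure map is in image form, off-the-shelf image-processing operations (smoothing, feature extraction, template matching) can be applied directly, which is the adaptation the abstract proposes.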
Date of this Version
January 2000