Integrated sensors in robotic assembly tasks

Yihchih Tang, Purdue University

Abstract

This dissertation investigates sensory fusion and the coordination of integrated sensors in robotic assembly tasks. A computationally efficient framework for multisensor fusion is proposed to represent, and reliably and consistently fuse, sensory measurements of assembly parts obtained from diverse sensors. Sensory information represented in this fusion framework supports two sensory operations in assembly tasks: recognizing assembly parts and monitoring assembly mating procedures. Together, these results constitute an integrated-sensor system that enables and facilitates the execution of assembly tasks in an unknown and uncertain environment.

The generic framework for multisensor fusion employs a unified, feature-based world model, the Geometric Feature Relation Graph (GFRG), to represent various sensory measurements at a common level of abstraction. Sensory fusion based on GFRGs is achieved by integrating the irregular GFRGs constructed by individual sensors into a single consistent, regular GFRG. In this integration, the correspondence problem of identifying coincident measurements of geometric features and the maintenance of consistency in the network of geometric relations are both solved in the presence of sensory uncertainty.

For efficient utilization of sensors, sensor coordination for recognition follows an intelligent sensing strategy carried out in two steps: the information critical for recognition is first compiled in, and identified from, a CAndidate DIscriminating Graph (CADIG) by a systematic procedure, and the sensors are then coordinated optimally, subject to their dynamics, to acquire that critical information. This strategic recognition of objects is made robust to sensory noise by applying the Dempster-Shafer theory of belief functions.

In sensory monitoring of mating procedures, sensors are employed to ensure that the required mating constraints are satisfied with sufficient reliability in the presence of both sensory uncertainty and manufacturing uncertainty of the mating parts. Uncertainties in sensory monitoring are represented as partial constraints associated with Dempster-Shafer belief functions, and the satisfaction of mating constraints is examined through the manipulation of symbolic constraints on a mating constraint network (MCN). The coordination of sensors for achieving satisfactory sensory monitoring of the mating procedure is likewise optimized subject to the sensors' dynamics.

The feasibility and performance of the proposed framework, strategies, and procedures are verified by computer simulations. Although focused on assembly tasks, these results are general enough to apply to a wide range of sensor-based tasks.
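To illustrate the kind of evidence combination the abstract refers to, the sketch below applies Dempster's rule of combination to two mass functions over candidate part hypotheses. It is a minimal, hedged example of the standard Dempster-Shafer rule, not the dissertation's implementation; the sensor names, candidate parts, and mass values are hypothetical.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two Dempster-Shafer mass functions whose focal elements are frozensets.
    Standard Dempster's rule: intersect focal elements, accumulate products of masses,
    and renormalize by (1 - K), where K is the total mass assigned to the empty set."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # conflicting evidence (empty intersection)
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence; combination undefined")
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

# Hypothetical example: two sensors report evidence over candidate parts
m_vision = {frozenset({"peg", "shaft"}): 0.6,
            frozenset({"peg", "shaft", "bracket"}): 0.4}
m_tactile = {frozenset({"peg"}): 0.7,
             frozenset({"peg", "shaft", "bracket"}): 0.3}
print(dempster_combine(m_vision, m_tactile))
# -> {peg}: 0.70, {peg, shaft}: 0.18, {peg, shaft, bracket}: 0.12
```

In this toy example, the combined belief concentrates on the "peg" hypothesis once the two sources of evidence agree, which is the effect the strategic recognition step relies on when fusing noisy sensor reports.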

Degree

Ph.D.

Advisors

Lee, Purdue University.

Subject Area

Electrical engineering
