Sensory-based robot control for automatic manipulations

Nasser Houshangi, Purdue University

Abstract

Intelligent robots require sensor-based control to perform complex operations and to deal with uncertainty in the environment. Feedback from the work environment can come from different sensors, for example force/torque, visual, and tactile sensors. Many applications of robot manipulators require controlling not only the position of the gripper but also the force exerted by the tool on the object. An adaptive controller has been designed to control the position and contact force in the Cartesian coordinate system. A successful simulation and lab experimentation illustrate the applicability of the approach.

Vision systems usually provide more global information about the environment than force/torque sensors. Visual feedback represents a typical sensing system in which camera images provide feedback information, for example, for grasping a moving object with a robot manipulator. Because image processing is time-consuming, information about the target position is delayed and not available instantaneously to the controller. Therefore, the present and future positions have to be predicted in real time. Since the dynamics of the target are assumed to be unknown, prediction of the object position is accomplished using a model such as an auto-regressive discrete-time model. The predicted values and the current end-effector position determine the desired trajectory point (subgoal) for the motion. The planner adapts to changes in the target position on-line. The desired trajectory is tracked by the end-effector, which is controlled by a self-tuner. A simulation study and lab experiments are presented to demonstrate the grasping of a moving target by a manipulator by means of visual feedback.

Contact can be sensed by force/torque and tactile sensors. Tactile sensors measure forces at specific points between the object and tactile pads.
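The auto-regressive prediction step described above can be illustrated with a minimal sketch: fit the AR coefficients to the recorded target positions by least squares, then iterate the model to predict ahead. The function names, the AR order, and the use of `numpy.linalg.lstsq` are illustrative choices, not details taken from the dissertation.

```python
import numpy as np

def fit_ar(history, order=2):
    """Least-squares fit of AR coefficients a_i so that
    x[k] ~= a_1*x[k-1] + ... + a_p*x[k-p] over the history."""
    X = np.array([history[k - order:k][::-1]
                  for k in range(order, len(history))])
    y = np.array(history[order:])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict(history, coeffs, steps=1):
    """Predict `steps` samples ahead by iterating the AR model,
    feeding each prediction back in as the newest sample."""
    h = list(history)
    order = len(coeffs)
    for _ in range(steps):
        h.append(float(np.dot(coeffs, h[-1:-order - 1:-1])))
    return h[-1]
```

For a target moving at constant velocity, an AR(2) model recovers the exact extrapolation (coefficients 2 and -1), which is why a low-order model can compensate for the image-processing delay: the controller aims at the predicted position rather than the stale measured one.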
A specific task of placing a parallelepiped object on an unknown flat surface using information from force/torque and tactile sensors is considered. Three types of contact (point, line, and plane) can occur between the object and the surface. The type of contact is determined from the force/torque sensor readings under the assumption that the object shape is known. The manipulator servoing strategy depends on the type of contact. During line contact, the angle between the object base and the contact surface is calculated using the information received from the tactile sensor. The desired final position and orientation of the end-effector that place the object on the planar surface are determined from the current end-effector position and orientation and the computed angle. Experiments were performed that successfully demonstrate the approach by calculating the point and type of contact, and the angle between the object base and the flat surface.
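A simple way to picture the contact-type decision is a thresholding rule on the tangential moment components measured at the wrist: when the parallelepiped's base lies flat (plane contact) both moments are near zero; an edge touching (line contact) produces a moment about one tangential axis; a corner touching (point contact) produces moments about both. This is a hypothetical sketch of such a rule, not the dissertation's actual decision procedure, and the threshold `eps` is an assumed value.

```python
def classify_contact(mx, my, eps=0.05):
    """Classify contact between a parallelepiped base and a flat
    surface from the tangential moments (mx, my) in N*m reported
    by a wrist force/torque sensor (illustrative rule only)."""
    moment_about_x = abs(mx) > eps
    moment_about_y = abs(my) > eps
    if moment_about_x and moment_about_y:
        return "point"   # corner contact: torques about both axes
    if moment_about_x or moment_about_y:
        return "line"    # edge contact: torque about one axis
    return "plane"       # base flat on the surface
```

In practice the rule would be combined with the known object geometry and force readings, but the three-way split above conveys why the servoing strategy branches on contact type: point and line contacts call for a corrective rotation before the final placement move.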

Degree

Ph.D.

Advisors

Koivo, Purdue University.

Subject Area

Electrical engineering
