University of Wisconsin-Madison College of Engineering | News archive, 1998
Giving robots a sense of touch

Just 20 years ago, the idea of a personal computer was foreign to all but a niche of hobbyists and industry professionals. Now, personal computers are ubiquitous. The idea of a personal robot today may not be as foreign as the thought of a personal computer once was, but with advances in computer science, engineering and artificial intelligence, robots may well be as common in 20 years as computers are today. But before a robotic assistant can dust the house, do the laundry and groom the cat, there are some development issues to be addressed. Nicola J. Ferrier, an assistant professor of mechanical engineering, is leading a team of researchers in an effort to make robots more coordinated and dexterous.

"Right now robots are really clumsy," Ferrier says. "Eighty to 90 percent of the robots in use do not come into contact with anything. They are spray painters, arc welders and spot welders." And in those cases, Ferrier says the objects worked on by robots have to be placed in the same place every time so that the robot can find the work.


Assistant Professor Nicola J. Ferrier (right) is working with postdoctoral researcher Kyunghwan Kim (left) to give robots a sense of touch.

By incorporating force and shape sensors embedded within deformable robotic fingertips, or "squishy fingers," Ferrier has developed a method of sensing an object's shape and the distribution of forces required to manipulate the object. With both shape and force information, a robotic hand can operate more dexterously. Currently, a typical robot hand might include two rigid fingers made of parallel plates. If an object is flat and its placement known, the robot can successfully pick it up. But try finding and picking up an egg with the same robotic hand and the task becomes problematic. Relying on friction alone to grip the egg would most likely destroy it.

"If you pick up a square box by the edges and then by the flat faces, you could have an equivalent force acting on your fingers, but the distribution of the force over the finger is very different. You need a sensor that can capture both of those aspects. With a more general-purpose robotic hand you don't need a specialized tool and a tool changer for every task," Ferrier says. "Squishy fingers would allow a robot to pick up and rotate an object. You cannot do that with parallel plates."
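The distinction Ferrier draws can be illustrated with a small sketch. The two 3x3 pressure grids below are invented readings (arbitrary units) for a fingertip sensor: one grasp concentrates force along an edge, the other spreads the same total force over a flat face. The total force is identical, but the distribution, here summarized by peak pressure, is not, which is why a sensor that reports only total force cannot tell the two grasps apart.

```python
# Hypothetical fingertip pressure grids: two grasps with the same total
# force but very different distributions over the finger surface.

edge_grasp = [  # force concentrated along one edge of the fingertip
    [3.0, 0.0, 0.0],
    [3.0, 0.0, 0.0],
    [3.0, 0.0, 0.0],
]
face_grasp = [  # same total force spread across the whole flat face
    [1.0, 1.0, 1.0],
    [1.0, 1.0, 1.0],
    [1.0, 1.0, 1.0],
]

def total_force(grid):
    """Sum of all cell pressures: what a single force sensor reports."""
    return sum(sum(row) for row in grid)

def peak_pressure(grid):
    """Largest single-cell pressure: one way the distribution differs."""
    return max(max(row) for row in grid)

print(total_force(edge_grasp), total_force(face_grasp))      # 9.0 9.0
print(peak_pressure(edge_grasp), peak_pressure(face_grasp))  # 3.0 1.0
```

The grids and units here are made up for illustration; the point is only that capturing distribution requires sensing per-region pressure, not just a single force value.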

To sense the shape of an object, Ferrier's robotic finger has a grid of dots drawn at computed locations on the membrane tip. When the tip comes into contact with an object, the membrane deforms and the dots move. The new coordinates of the dots on the membrane are compared with their undeformed locations in order to calculate the three-dimensional shape of the object.
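A greatly simplified sketch of this comparison step follows. It is not Ferrier's actual reconstruction algorithm; the dot labels, coordinates, and the displacement-magnitude proxy are all invented. It shows only the core idea: subtracting each dot's known undeformed position from its observed position yields a displacement field that carries shape information about the contact.

```python
import math

# Known dot locations on the flat, undeformed membrane (mm) --
# hypothetical values for illustration.
undeformed = {
    "a": (0.0, 0.0), "b": (2.0, 0.0), "c": (0.0, 2.0), "d": (2.0, 2.0),
}
# Hypothetical observed positions after the tip presses on an object.
observed = {
    "a": (0.1, 0.1), "b": (1.8, 0.0), "c": (0.0, 1.9), "d": (1.7, 1.8),
}

def displacement_field(before, after):
    """Per-dot displacement vectors (dx, dy) between the two grids."""
    return {k: (after[k][0] - before[k][0], after[k][1] - before[k][1])
            for k in before}

field = displacement_field(undeformed, observed)

# Crude proxy: a dot that moved farther sits in a region of the
# membrane that is pressed harder against the object.
deformation = {k: math.hypot(dx, dy) for k, (dx, dy) in field.items()}
```

In the real system the comparison feeds a full 3-D reconstruction of the contact surface rather than this scalar proxy, but the input, a before-and-after dot grid, is the same.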

To facilitate locating an object, Ferrier's research group will combine the force and shape sensing system with a visual sensory system. The combination will give the robot eye/hand coordination. Before that can be done, however, Ferrier's team must figure out how to instruct the robot to manage the various sensory information so that the robot will know when to look and move in order to successfully find and manipulate an object.

"As you are approaching the object, you use vision to guide where you are positioning your hand," Ferrier says. "And when you are actually grasping something you need a sensor to tell you that you are in contact and another sensor to tell you about force, pressure distribution of the finger and the shape of contact. How to transition between those, how to get that sensory information and transition this control loop is really what we are looking at."
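The handoff Ferrier describes, vision guiding the approach, then tactile sensing taking over at contact, can be sketched as a toy mode selector. Everything here is invented for illustration (the threshold, the readings, the mode names); how to manage this transition well is precisely the open problem her team is studying.

```python
# Toy sensor-handoff sketch: use vision while approaching, switch to
# tactile (force/shape) sensing once contact is detected. The threshold
# and readings are hypothetical.

CONTACT_THRESHOLD = 0.05  # minimum force (N) treated as real contact

def select_mode(distance_mm, contact_force_n):
    """Pick which sensing mode should drive the control loop."""
    if contact_force_n > CONTACT_THRESHOLD:
        return "tactile"  # in contact: pressure distribution and shape
    return "vision"       # still approaching: vision positions the hand

# A pretend approach sequence: the hand closes in, then touches down.
readings = [(50.0, 0.0), (10.0, 0.0), (1.0, 0.02), (0.0, 1.2)]
modes = [select_mode(d, f) for d, f in readings]
print(modes)  # ['vision', 'vision', 'vision', 'tactile']
```

A real controller would blend sensors rather than switch abruptly, and deciding when and how to make that transition is the research question the quote above describes.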