On a mission to improve robotics, researchers have developed a sensor-packed glove that can learn the signatures of the human grasp.
The aim of the development, which comes from the Massachusetts Institute of Technology, is to pinpoint the signals that help a neural network identify objects by touch. In the future, it is hoped the system could aid robotics as well as prosthetics design, assisting with new structures for amputees. As things currently stand, a robot picking up an egg or a baseball is unlikely to be able to differentiate between the two. The new research aims to change this.

The sensor signals were identified by the researchers themselves wearing the specially designed glove. Wearing the sensor-packed glove, different researchers explored the environment and handled a variety of objects. The information gathered was then used to construct a large dataset, from which an artificial intelligence system learned to recognize the same objects through touch alone.

The glove itself does not appear all that remarkable at first look. It takes the form of a low-cost knitted glove, termed a “scalable tactile glove”. On closer inspection, however, the glove can be seen to be equipped with about 550 tiny sensors, placed across the entire hand. The glove is laminated with an electrically conductive polymer, designed to change its resistance under applied pressure.

Each sensor captures pressure signals as the researchers touch the various objects, and in different ways. Objects ranged from a chalkboard eraser to a tennis ball, and from a small cat statue to a stapler, with a coffee mug in between. In all, 26 different objects were used and manipulated.
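To make the sensing step concrete, here is a minimal sketch (with invented resistance values and ranges) of how a piezoresistive reading might be turned into a normalized pressure value: the conductive polymer's resistance drops as pressure rises, so lower resistance maps to higher pressure. The function name and the resistance bounds are hypothetical, not from the MIT work.

```python
import math

def resistances_to_pressure_map(resistances, r_min=1e3, r_max=1e6):
    """Map each sensor's resistance (ohms) to a pressure value in [0, 1].

    Lower resistance -> higher pressure (piezoresistive behaviour).
    r_min and r_max are assumed sensor limits, chosen for illustration.
    """
    pressures = []
    for r in resistances:
        r = min(max(r, r_min), r_max)  # clamp to the assumed sensor range
        # Log-scale normalization: r_max -> 0.0 (no touch), r_min -> 1.0 (hard press)
        p = (math.log(r_max) - math.log(r)) / (math.log(r_max) - math.log(r_min))
        pressures.append(p)
    return pressures

# One toy "frame": an untouched sensor, a lightly pressed one, a fully pressed one.
frame = resistances_to_pressure_map([1e6, 5e4, 1e3])
```

Repeating this over all ~550 sensors at each time step would yield the kind of pressure "frame" that the learning system consumes.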
The signal from each sensor was sent to a neural network, which constructed a tactile map, “learned” from the array of pressure-signal patterns, and associated these back to specific objects.

The trials showed that the system can now draw on the dataset to classify the objects and predict their weights by feel alone (tactile sensing), with no visual input required. The learning can now be applied to develop robots and artificial limbs for humans, as one of the researchers, Subramanian Sundaram, notes: “There’s been a lot of hope that we’ll be able to understand the human grasp someday and this will unlock our potential to create this dexterity in robots…We’ve always wanted robots to do what humans can do, like doing the dishes or other chores. If you want robots to do these things, they must be able to manipulate objects really well.”

The research has been published in the journal Nature. The research paper is titled “Learning the signatures of the human grasp using a scalable tactile glove.”
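The idea of associating pressure-signal patterns with objects can be sketched in miniature. The actual work trains a neural network on full tactile frames; here a simple nearest-neighbour lookup over tiny hand-made pressure maps stands in for that idea, with all labels and values invented for illustration.

```python
def classify(frame, dataset):
    """Return the label of the stored frame closest (squared L2 distance) to `frame`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(dataset, key=lambda item: dist(item[1], frame))[0]

# Toy dataset: (label, flattened pressure map) pairs -- values are invented,
# standing in for frames recorded while handling each object.
dataset = [
    ("mug",         [0.9, 0.1, 0.8, 0.2]),
    ("tennis ball", [0.5, 0.5, 0.5, 0.5]),
    ("stapler",     [0.1, 0.9, 0.1, 0.9]),
]

# A new frame resembling the recorded mug grasp is classified by proximity.
label = classify([0.85, 0.15, 0.75, 0.25], dataset)
```

A deep network replaces this lookup with learned features, but the principle is the same: similar grasp patterns point back to the same object.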