Soft Bionics

This AI-powered smart glove can identify objects by touch

Image credit: MIT CSAIL

Scientists at MIT have developed a smart glove that can recognize objects by touch alone.

Even in the dark, humans can identify an object such as a pair of glasses or a phone just by touching it. For years, scientists have been trying to teach robots how to grip different objects without crushing or dropping them.

“Humans can identify and handle objects well because we have tactile feedback. As we touch objects, we feel around and realize what they are. Robots don’t have that rich feedback,” says Subramanian Sundaram PhD ’18, a former CSAIL graduate student. “We’ve always wanted robots to do what humans can do, like doing the dishes or other chores. If you want robots to do these things, they must be able to manipulate objects really well.”

In a paper published in Nature, the researchers describe a low-cost device called the Scalable Tactile Glove (STAG). The smart glove is equipped with about 550 tiny sensors spread across nearly the entire hand. Each sensor captures pressure signals as humans interact with objects in various ways. A neural network processes these signals to "learn" a dataset of pressure-signal patterns associated with specific objects. The system then uses that dataset to classify objects and predict their weights by feel alone, with no visual input needed, according to MIT CSAIL.
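The pipeline described above can be sketched as a classifier that maps a frame of glove pressure readings to an object label. The sketch below is a minimal illustration under stated assumptions: the sensor and object counts come from the article, but the toy data, the single softmax layer, and all variable names are illustrative stand-ins, not the paper's actual network or dataset.

```python
# Minimal sketch: classify objects from tactile pressure frames.
# The single softmax layer is an assumed stand-in for the paper's
# neural network; the training data here is random toy data.
import numpy as np

NUM_SENSORS = 550   # approximate sensor count reported for STAG
NUM_OBJECTS = 26    # number of objects in the dataset

rng = np.random.default_rng(0)

# Toy data: each sample is one frame of per-sensor pressure readings.
X = rng.random((200, NUM_SENSORS))
y = rng.integers(0, NUM_OBJECTS, size=200)

# Softmax-regression parameters, initialized to zero.
W = np.zeros((NUM_SENSORS, NUM_OBJECTS))
b = np.zeros(NUM_OBJECTS)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr = 0.5
for _ in range(100):
    grad = softmax(X @ W + b)
    grad[np.arange(len(y)), y] -= 1.0     # cross-entropy gradient w.r.t. logits
    W -= lr * X.T @ grad / len(y)
    b -= lr * grad.mean(axis=0)

pred = np.argmax(X @ W + b, axis=1)       # predicted object label per frame
```

In the real system, sequences of such frames (recorded while handling an object from many angles) are fed to the network, which is what lets it identify objects from touch alone.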

The researchers used STAG to compile a dataset covering 26 common objects, including a soda can, scissors, a tennis ball, a spoon, a pen, and a mug. Using this dataset, the system predicted the objects' identities with up to 76 percent accuracy. It can also predict the correct weights of most objects to within about 60 grams.

Similar sensor-based gloves in use today can cost thousands of dollars and often contain only around 50 sensors that capture less information. Even though STAG produces very high-resolution data, it’s made from commercially available materials totaling around $10.

MIT Computer Science and Artificial Intelligence Lab (Image credit: CSAIL)

The tactile sensing system could be combined with traditional computer vision and image-based datasets to give robots a more human-like understanding of how to interact with objects, according to the MIT CSAIL report.

Source: www.wearable-technologies.com