Motion control of robots with wearable skin-like sensor
Bioinspired sensors control robotic devices by detecting and responding to strain and motion, without the need for voice control.
What if you could direct a robot to clean your car with a wave of your hand?
Now, a team of researchers has developed an electronic, skin-like sensor that controls robotic devices with hand gestures.
Dr Zheng Yan is one of the primary researchers on the project, a collaboration between the UTS Australian Artificial Intelligence Institute (AAII) and Singapore’s Nanyang Technological University.
“We capture high-quality somatosensory data from hand gestures with the new stretchable and wearable sensor,” Yan said. “Then, our brain-inspired machine-learning architecture recognises the movements and learns how to respond to the visual and somatosensory information.”
The transparent sensors, worn directly on the skin, detect not only strain but also motion flow, and can operate in daylight or in complete darkness.
Like many smart inventions that mimic the natural world, the sensor drew its main inspiration from the human brain.
“High perceptual activities in the brain, such as thinking, planning and inspiration, do not depend on specific sensory information,” he said. “They are derived from the integration of multi-sensory information from diverse sensors, and this is what inspired us to combine visual and sensory information with high-precision gesture recognition.”
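To give a rough sense of what combining visual and somatosensory information for gesture recognition can look like, here is a minimal PyTorch sketch of a two-branch fusion classifier. It is illustrative only: the paper’s actual brain-inspired architecture is not described here, and every layer size, input dimension, and name below is an assumption.

```python
# Illustrative sketch only; not the architecture from the Nature Electronics paper.
# Two modality-specific encoders (visual features and strain-sensor readings)
# feed a shared classifier, loosely echoing how the brain integrates
# multi-sensory information. All dimensions are assumed for illustration.
import torch
import torch.nn as nn

class GestureFusionNet(nn.Module):
    def __init__(self, visual_dim=128, strain_dim=16, num_gestures=10):
        super().__init__()
        # Encode each modality into a common-size embedding.
        self.visual_encoder = nn.Sequential(nn.Linear(visual_dim, 64), nn.ReLU())
        self.strain_encoder = nn.Sequential(nn.Linear(strain_dim, 64), nn.ReLU())
        # Fusion head: concatenate both embeddings, then classify the gesture.
        self.classifier = nn.Sequential(
            nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, num_gestures)
        )

    def forward(self, visual_features, strain_signal):
        fused = torch.cat(
            [self.visual_encoder(visual_features),
             self.strain_encoder(strain_signal)],
            dim=-1,
        )
        return self.classifier(fused)

# Example: classify one gesture from dummy camera and strain-sensor inputs.
model = GestureFusionNet()
logits = model(torch.randn(1, 128), torch.randn(1, 16))
print(logits.argmax(dim=-1))  # predicted gesture index
```

The key design idea this toy model captures is that neither modality alone decides the gesture; the classifier only sees the fused embedding, so visual and strain cues can compensate for each other, for example when the camera signal degrades in darkness.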
To demonstrate the technology in a real-world setting, Yan and his team used the wearable sensor to successfully navigate a quadruped (four-legged) robot through a labyrinth in a dark environment with an illuminance of just 10 lux, chosen to resemble an open parking lot at night.
Now, with their research paper published this month in the prestigious journal Nature Electronics, Yan and his collaborators plan to take the smart wearable technology to the next level by enabling the device itself to run the AI models directly.