
Scientists use shadows to teach robots to “feel” touch

A team of scientists at Cornell University has developed a simple approach, using shadow-imaging cameras, to let robots know when they are being touched.

Known as ShadowSense, the experimental system consists of an ordinary USB camera connected to a laptop, placed beneath a non-electronic, translucent “skin” inside a soft-bodied robot, New Atlas explains.

When a person reaches toward the robot, ambient light casts a shadow of their hand on the skin. The camera tracks that shadow from the other side of the skin, inside the robot, and machine-learning algorithms determine when the hand is actually touching the skin, which area of the skin it is touching and what gesture it is making.
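As a rough, hypothetical illustration of that pipeline (not the researchers’ actual code), the sketch below captures frames from a USB camera, converts them to grayscale shadow images and feeds them to a small, untrained neural-network classifier; the camera index, image size and network architecture are all assumptions made for this example.

# Illustrative sketch only, not the ShadowSense implementation: a minimal
# shadow-to-gesture pipeline. The camera index, image size and the network
# below are assumptions, and the model is untrained.
import cv2
import numpy as np
from tensorflow import keras

GESTURES = ["palm", "punch", "two_hands", "hug", "point", "no_touch"]  # classes reported in the article

def build_classifier(input_shape=(64, 64, 1), n_classes=len(GESTURES)):
    # A small CNN mapping a grayscale shadow image to gesture probabilities.
    return keras.Sequential([
        keras.layers.Conv2D(16, 3, activation="relu", input_shape=input_shape),
        keras.layers.MaxPooling2D(),
        keras.layers.Conv2D(32, 3, activation="relu"),
        keras.layers.MaxPooling2D(),
        keras.layers.Flatten(),
        keras.layers.Dense(n_classes, activation="softmax"),
    ])

def preprocess(frame, size=(64, 64)):
    # Grayscale, resize and normalize a frame from the camera under the skin.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, size)
    return gray.astype(np.float32)[..., np.newaxis] / 255.0

def run(model):
    cap = cv2.VideoCapture(0)  # assumed: the USB camera mounted inside the robot
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        probs = model.predict(preprocess(frame)[np.newaxis, ...], verbose=0)[0]
        print("detected gesture:", GESTURES[int(np.argmax(probs))])
    cap.release()

if __name__ == "__main__":
    run(build_classifier())  # untrained model: this only demonstrates the data flow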

In this way, ShadowSense can not only tell when and where the robot is being touched, but also assign different commands to different touch gestures.

The current robot prototype is able to differentiate between touching with the palm of the hand, punching, touching with both hands, hugging, pointing and not touching. The technology can do this with 87.5% to 96% accuracy, depending on the intensity and direction of the lighting.
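To make the gesture-to-command idea concrete, a lookup like the one below could bind each recognized gesture class to a robot behavior; the command names here are invented for illustration and do not come from the paper.

# Hypothetical mapping from recognized gesture classes to robot commands.
# The command names are illustrative, not taken from the ShadowSense paper.
GESTURE_COMMANDS = {
    "palm": "pause_motion",
    "punch": "back_away",
    "two_hands": "power_toggle",
    "hug": "play_greeting",
    "point": "turn_toward_user",
    "no_touch": "idle",
}

def dispatch(gesture: str) -> str:
    # Fall back to "idle" for any unrecognized label.
    return GESTURE_COMMANDS.get(gesture, "idle")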

The researchers point out that the applications of the technology are not limited to robotics, since it could also be used in touch screens and other electronic devices.

ShadowSense still has some limitations: not only is a light source needed, but the camera must also have a line of sight to the interactive part of the skin. The use of mirrors or lenses could help solve this problem.

“Touch is such an important mode of communication for most organisms, but it has been virtually absent from human-robot interaction,” said lead scientist Guy Hoffman in a statement. “One reason is that full-body touch used to require a large number of sensors and was therefore not practical to implement. This research offers a low-cost alternative.”

The study was published in December in the scientific journal Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.

Maria Campos, ZAP //
