Listening skills bring human-like touch to robots

Los Angeles CA (SPX) Oct 23, 2024 - Imagine sitting in a dark movie theater, shaking your drink to hear how much soda is left. Now, researchers at Duke University are working to bring this type of human sensory ability to robots with a new system called SonicSense. This technology allows robots to "feel" and understand the world around them by interpreting acoustic vibrations, similar to how humans use sound to assess objects.

Scheduled to be presented at the Conference on Robot Learning (CoRL 2024) in Munich, Germany, SonicSense enables robots to interact with objects using sound-based feedback. The system features a robotic hand equipped with contact microphones embedded in its fingertips. These microphones detect vibrations when the robot taps, grasps, or shakes an object, allowing the robot to tune out background noise and focus on the specific item it's handling.
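
The article attributes the background-noise rejection mainly to the contact microphones themselves, which pick up vibrations through physical contact rather than through the air. Purely as an illustration, a software-side cleanup stage for one fingertip's signal might look like the band-pass filter sketched below; the sample rate and band edges are assumptions for the example, not values from the SonicSense paper.

```python
# Hypothetical sketch: isolating contact vibrations recorded by one fingertip
# microphone. The 48 kHz sample rate and the 100 Hz - 8 kHz band are assumed
# for illustration only; they are not taken from the SonicSense paper.
import numpy as np
from scipy.signal import butter, sosfiltfilt

SAMPLE_RATE = 48_000  # Hz, assumed capture rate of the contact microphones

def isolate_contact_vibration(raw: np.ndarray,
                              low_hz: float = 100.0,
                              high_hz: float = 8_000.0) -> np.ndarray:
    """Band-pass the raw fingertip signal so that slow arm motion and
    electrical hum are attenuated, keeping the band where tap and shake
    vibrations carry most of their energy."""
    sos = butter(4, [low_hz, high_hz], btype="bandpass",
                 fs=SAMPLE_RATE, output="sos")
    return sosfiltfilt(sos, raw)

# Usage: filter a one-second synthetic recording from a single fingertip.
t = np.linspace(0.0, 1.0, SAMPLE_RATE, endpoint=False)
raw = np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 1_500 * t)
clean = isolate_contact_vibration(raw)
print(clean.shape)  # (48000,)
```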

"Robots today mostly rely on vision to interpret the world," said Jiaxun Liu, lead author of the study and a Ph.D. student at Duke. "We wanted to create a solution that could work with complex and diverse objects found on a daily basis, giving robots a much richer ability to 'feel' and understand the world."

SonicSense uses the collected vibration data to analyze the object's material and shape. If the system has never encountered the object before, it may take up to 20 interactions to identify it. However, for objects stored in its database, it can make accurate identifications in as few as four interactions.
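
As a rough illustration of the interaction budget described above (up to 20 probes for an unfamiliar object, as few as four for one already in the database), the loop below keeps probing the object and stops once a classifier is confident enough. The helper names `perform_tap` and `classify` and the 0.9 confidence threshold are hypothetical placeholders, not part of the published SonicSense system.

```python
# Hypothetical sketch of an interact-until-confident loop. perform_tap() and
# classify() stand in for the real robot action and the learned model; neither
# name, nor the 0.9 threshold, comes from the SonicSense paper.
from typing import Callable, List, Tuple

def identify_object(perform_tap: Callable[[], list],
                    classify: Callable[[List[list]], Tuple[str, float]],
                    confidence_threshold: float = 0.9,
                    max_interactions: int = 20) -> Tuple[str, int]:
    """Tap/grasp/shake the object, re-estimating its identity from all
    vibration clips gathered so far, and stop as soon as the classifier's
    confidence clears the threshold or the interaction budget runs out."""
    recordings: List[list] = []
    label, confidence = "unknown", 0.0
    for n in range(1, max_interactions + 1):
        recordings.append(perform_tap())          # one physical interaction
        label, confidence = classify(recordings)  # use all data collected so far
        if confidence >= confidence_threshold:
            return label, n                       # known objects often stop early
    return label, max_interactions
```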

"SonicSense gives robots a new way to hear and feel, much like humans, which can transform how current robots perceive and interact with objects," said Boyuan Chen, professor of mechanical engineering and materials science at Duke, and supervisor of the research.

The researchers demonstrated SonicSense's capabilities by performing tasks such as counting dice in a box, determining the liquid level in a bottle, and building a 3D model of an object's shape and material through taps. The system's combination of multiple fingers, touch-based microphones, and AI techniques enables it to outperform previous methods, especially with objects that have complex surfaces or are made from multiple materials.

A critical advantage of SonicSense is its affordability. Built from low-cost parts such as 3D-printed components and the kind of contact microphones commonly used by musicians, the system costs just over $200 to construct.

Looking ahead, the research team aims to improve SonicSense by integrating object-tracking algorithms, enabling robots to handle cluttered environments. Future iterations of the system will also explore advanced robotic hands with enhanced dexterity, making robots capable of performing more nuanced tasks.

"This is only the beginning," added Chen. "We're excited to explore how this technology can be further developed to integrate multiple sensory modalities, such as pressure and temperature, for even more complex interactions."

Research Report: SonicSense: Object Perception from In-Hand Acoustic Vibration