A World’s First: A Robot with a Natural Auditory Sensor

Researchers at Tel Aviv University set out to examine how biological systems can be integrated into technological ones, and in doing so achieved a first: a robot that can 'hear' through a natural sensor, the ear of a dead locust. The ear is connected to the robot, which receives the ear's electrical signals and acts accordingly. As a result, the robot moves forward when you clap once and backward when you clap twice.
The multidisciplinary study was led by Idan Fishel, a joint master's student, under the joint supervision of Dr. Ben M. Maoz of the Iby and Aladar Fleischman Faculty of Engineering and the Sagol School of Neuroscience, together with Prof. Yossi Yovel and Prof. Amir Ayali, experts from the School of Zoology and the Sagol School of Neuroscience, and with Dr. Anton Sheinin, Yoni Amit, and Neta Shavil. The results of the study were published in the journal Sensors.

The robot they developed is capable of responding to auditory signals from its environment. In a multidisciplinary collaboration, the researchers created an ear-on-a-chip: a microfluidic chip connected to a custom-designed suction electrode. The chip keeps the locust ear functional by supplying the organ with oxygen and nutrients, while allowing its electrical signals to be carried out, amplified, and transmitted to the robot. In this way, the researchers found a means of picking up the signals received by the locust's ear in a form the robot can use.
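The control loop this describes, turning the ear's amplified signal into motion commands based on the number of claps, might be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the threshold value, the refractory window, and the command names are all assumptions made for the example.

```python
# Illustrative sketch (not the authors' code): convert an amplified
# auditory signal into simple motion commands by counting claps.

def detect_claps(samples, threshold=0.5, refractory=100):
    """Count upward threshold crossings in a sampled signal,
    ignoring samples inside a refractory window so that one
    clap's burst isn't counted more than once."""
    claps = 0
    cooldown = 0
    for s in samples:
        if cooldown > 0:
            cooldown -= 1
        elif s > threshold:
            claps += 1
            cooldown = refractory  # skip the rest of this burst
    return claps

def command_for(claps):
    """Map a clap count to a robot command (names are assumed)."""
    if claps == 1:
        return "forward"
    if claps == 2:
        return "backward"
    return "stop"

# One clap: a single burst above threshold, then silence.
one_clap = [0.0] * 50 + [0.9] * 20 + [0.0] * 200
# Two claps: two bursts separated by more than the refractory window.
two_claps = one_clap + [0.9] * 20 + [0.0] * 50

print(command_for(detect_claps(one_clap)))   # forward
print(command_for(detect_claps(two_claps)))  # backward
```

In practice the signal from a biological sensor is noisy spike activity rather than a clean waveform, so a real system would need filtering and calibration before a simple threshold like this could work.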

So far, the call and response made possible by the locust-eared robot are basic ones. The demonstration of combining robotic platforms with biological elements, however, opens the door to an exciting set of new applications. "The significance of this work is to show a proof of concept that we are able to integrate biological elements with robotic platforms. More specifically, we show that we can take the hearing system, which can serve as an input for sound, to navigate a robot," according to Dr. Ben M. Maoz.