FEATURED ARTICLE
The Evolution of Bat Robots
Rolf Müller and Roman Kuc
Introduction
The ability of bats to sense their surroundings with ultrasound has fascinated scientists ever since the discovery of the animals’ biosonar system in the late 1930s (reviewed in Simmons, 2017). The riddle of how bats can extract so much information from a bunch of brief ultrasonic echoes has attracted researchers from varied backgrounds who have looked into the behavior, anatomy, and physiology of bats to unravel the mechanisms behind the animals’ enviable acoustic capabilities.
Over the past 30 years, engineers have been drawn into the fray and have tried to mimic the bats’ biosonar systems by designing a veritable zoo of “bat robots.” Indeed, reproducing the biosonar abilities of bats in a man-made system would do far more than just satisfy a scientist’s intellectual curiosity or an engineer’s play instinct. Engineers working in the broad area of autonomous systems, in other words, machines that can act by themselves, have long dreamed of taking the capabilities of their creations to new levels, and bats could be just the right model system to accomplish this.
Up to now, the impact of man-made autonomy has been limited to environments that are carefully designed to support it, for example, robots working on manufacturing lines or in warehouses. Semiconstrained environments such as public roads have long been seen as the logical next step for autonomous systems, with reliable self-driving cars lingering “just around the corner” for many years. However, the fact that the developers of self-driving cars continue to struggle with the unpredictability of driving conditions speaks volumes about how difficult it remains to move autonomous systems beyond carefully constrained environments.
Being able to mimic the biosonar-based autonomous navigation skills of bats could leapfrog the semiconstrained road environments and open up the final frontier for autonomy: venturing out into complex natural environments that are completely free from any man-made constraints. Hence, autonomous bat robots could one day be used in areas such as precision agriculture and forestry or environmental surveillance and cleanup.
The creation of autonomous systems for the complex natural world is not just a matter of replicating the bats’ biosonar; it requires an integration of multiple capabilities, namely sensing, interpretation, mobility, and control. Two of these areas, mobility and interpretation, are currently being targeted by two powerful technical trends. Our ability to interpret complex signals is being taken to new levels by the ongoing deep-learning revolution, which makes it possible to find patterns and relationships in data that were previously completely inaccessible. With respect to mobility, the popularity of small drones (“micro air vehicles”) has been driving the development of new concepts for powered flight. Because the size of many of these drones puts them in a similar aerodynamic range as bats or birds, mimicking the flapping flight of these animals is an attractive proposition.
If the revolutions in deep learning and drone technology deliver on their respective promises, they will give us highly maneuverable and energy-efficient aerial platforms equipped with “brains” that can analyze even the most complicated inputs. In a future world with these technologies, robotic reproductions of the bats’ biosonar system could fill a critical gap between these mobility and data-analytics capabilities. However, a drone that has the mobility to master complex environments on the wing and the intelligence to make sense of complex patterns still lacks one important function, namely the ability to encode sensory information in a form that can be processed by the deep-learning stage and then used to control the mobility of the system. Past and present work on bat robots is an almost perfect fit
https://doi.org/10.1121/AT.2020.16.4.30