Fog, near-darkness, and colored lights create a nightclub-like atmosphere in Worcester Polytechnic Institute’s (WPI) drone testing area. As a palm-sized drone hovers through artificial smoke and snow, it approaches a plexiglass wall autonomously and turns back, all without a single camera. This is the PeAR Bat, a revolutionary bio-inspired drone that “sees” the world through sound rather than sight.
Recently, Dronelife visited Professor Nitin Sanket and his team at WPI’s Perception and Autonomous Robotics Group (PeAR) to witness their bat-inspired drone in action. Guided entirely by the same ultrasonic distance-sensing technology found in automatic faucets, the PeAR Bat demonstrates how nature-inspired engineering can solve critical challenges facing first responders and drone professionals.
With a prestigious $705,000 National Science Foundation grant supporting the three-year project, Sanket and his students, including undergraduate researcher Colin Balfour and PhD candidate Deepak Singh, are pioneering ultrasonic navigation systems that could save lives in earthquakes, tsunamis, and building fires where darkness, smoke, and dust render traditional visual sensors useless.


Learning from Nature’s Expert Flyers
The genesis of the PeAR Bat comes from a fundamental insight: bats navigate effortlessly in complete darkness using echolocation, while robots that depend on cameras are left blind.
Because light penetrates smoke and dust poorly, many drone systems fail in dark, dusty environments where crucial visual data is unavailable. Bats, by contrast, emit high-frequency ultrasonic pulses and analyze the returning echoes to build detailed environmental maps. Since sound penetrates smoke, fog, and dust where light fails, this biological capability provides the perfect blueprint for autonomous disaster response. “We talk a lot about stealing from nature’s blueprint,” Professor Sanket explains. “Millions of years of genetic evolution, we can’t do better than that.”
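To give a sense of the principle at work, the sketch below shows how an echolocation-style range estimate falls out of a single echo’s time of flight. It is an illustration only, not the PeAR team’s code; the 343 m/s speed of sound and the example timing are standard textbook assumptions.

```python
# Minimal sketch of the time-of-flight principle behind echolocation.
# Illustrative only; the PeAR Bat's actual signal processing is far more involved.

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at room temperature

def range_from_echo(time_of_flight_s: float) -> float:
    """Estimate distance to an obstacle from a single ultrasonic echo.

    The pulse travels out to the obstacle and back, so the one-way
    distance is half of speed times time.
    """
    return SPEED_OF_SOUND_M_S * time_of_flight_s / 2.0

# Example: an echo returning after ~0.3 milliseconds implies an obstacle
# roughly 5 centimeters away, about the Bat's reported minimum range.
print(f"{range_from_echo(0.0003) * 100:.1f} cm")  # ~5.1 cm
```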
Engineering Extreme Miniaturization
Replicating bat echolocation in a tiny, affordable aerial robot presents formidable challenges. The PeAR Bat weighs less than 100 grams and measures less than 100 millimeters across. The whole unit can currently be manufactured for around $300 and, if commercialized at scale, could cost as little as $50, allowing for quick scalability.
Using the same ultrasonic sensor found in automatic water faucets, the drone detects obstacles as close as 5 centimeters across a 120-by-60-degree field of view while drawing just 0.6 milliwatts per sensor, roughly 1,000 times less power than a typical USB camera.
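For context, the back-of-the-envelope comparison below works through that efficiency figure. The camera draw is an assumed value chosen to be consistent with the article’s roughly 1,000x claim, not a measurement.

```python
# Rough power comparison between the ultrasonic sensor and a hypothetical USB camera.
# The camera figure is an assumption matching the ~1,000x efficiency claim.

ULTRASONIC_SENSOR_MW = 0.6      # reported draw per ultrasonic sensor
ASSUMED_USB_CAMERA_MW = 600.0   # assumed draw for a small USB camera (~0.6 W)

ratio = ASSUMED_USB_CAMERA_MW / ULTRASONIC_SENSOR_MW
print(f"The camera draws about {ratio:.0f}x more power than one ultrasonic sensor.")
```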
However, this miniaturization creates unique challenges. Propeller noise interferes with ultrasonic signals, requiring the team to design 3D-printed metamaterial shells to reduce acoustic interference. “Imagine you’re talking to your friend with a jet next to you – that’s what it’s like for the sensor,” Sanket explains, describing how vibration between the carbon-fiber airframe and 3D-printed parts compounds the problem.
Sanket’s solution combines hardware innovation with physics-informed deep learning to clean up the ultrasonic data, fusion with inertial sensors, and hierarchical reinforcement learning for navigation. During the exclusive Dronelife demonstration, the drone repeatedly detected and avoided the transparent barrier, a significant advantage over vision-based systems, which struggle with clear obstacles.
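To illustrate the behavior on display, a reactive avoidance loop can reduce to something like the sketch below. This is a simplified stand-in, not the team’s hierarchical reinforcement learning controller; the function names, threshold, and speeds are hypothetical.

```python
# Minimal sketch of reactive obstacle avoidance from a single ultrasonic range reading.
# Not the PeAR Bat's actual controller; all names and values are hypothetical.

from dataclasses import dataclass

@dataclass
class VelocityCommand:
    forward_m_s: float
    yaw_rate_rad_s: float

SAFE_DISTANCE_M = 0.30  # hypothetical stand-off distance from an obstacle

def avoid_obstacle(range_m: float, cruise_speed_m_s: float = 0.5) -> VelocityCommand:
    """Cruise forward, or turn back when an obstacle (even a transparent one,
    since sound reflects off plexiglass) comes within the safety threshold."""
    if range_m < SAFE_DISTANCE_M:
        # Stop forward motion and yaw away from the obstacle.
        return VelocityCommand(forward_m_s=0.0, yaw_rate_rad_s=1.0)
    return VelocityCommand(forward_m_s=cruise_speed_m_s, yaw_rate_rad_s=0.0)

# Example: a plexiglass wall detected 0.12 m ahead triggers the turn-back behavior.
print(avoid_obstacle(0.12))
```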
Nature-Inspired Design Philosophy
PeAR’s research approach has long centered on inspiration from nature. During the workshop tour, Professor Sanket and his research assistants showed a range of earlier vision-based prototypes, including designs inspired by the eyes of cuttlefish and honeybees.
Advanced applications extend beyond obstacle avoidance: ultrasound itself could identify breath signatures and gunshots for rescue and anti-poaching operations. Future iterations of the Bat drone could add an efficient, hummingbird-inspired event camera optimized for low light, along with additional ultrasonic sensors to enable triangulation and faster imaging. Collaboration with WPI’s fire lab and drone swarming experts could make the solution applicable across even more use cases.


Task-Centric Innovation Over Human-Inspired Design
This philosophy of parsimonious AI, using the least processing power needed to solve the task at hand, guides the research. Rather than building expensive, complex systems, the team focuses on elegant, nature-inspired solutions with minimal computational overhead, modeling animal brains and behavior rather than human ones.
Sanket emphasizes rethinking robot design around actual mission requirements. “Most robotic agents are based on humans, and that’s not the best way to go about things,” he explains. “I think it should be task-centric, what’s the best way to get this done? Nature is often the best way to solve problems.”
A Future of Affordable Autonomous Rescue
As Sanket and his team continue refining the PeAR Bat, they envision tiny aerial robots becoming an important tool for first responders and rescue teams, saving lives through affordable, autonomous deployment where traditional rescue technology cannot reach. In laboratories like those at WPI, the line between biological inspiration and practical innovation grows increasingly blurred, and increasingly promising for disaster response.
More information on the PeAR lab is available here.


Ian McNabb is a journalist covering drone technology and lifestyle content at Dronelife. He is based between Boston and New Hampshire and, when not writing, enjoys hiking and Boston-area sports.

