BatMobility: Towards Flying Without Seeing for Autonomous Drones

Emerson Sie     Zikun Liu     Deepak Vasisht
University of Illinois Urbana-Champaign
MobiCom 2023

Doppler shift enables autonomous navigation in challenging environments lacking visual or geometric features.

Abstract

Unmanned aerial vehicles (UAVs) rely on optical sensors such as cameras and lidar for autonomous operation. However, optical sensors fail under bad lighting, are occluded by debris and adverse weather conditions, struggle in featureless environments, and easily miss transparent surfaces and thin obstacles. In this paper, we question the extent to which optical sensors are sufficient or even necessary for full UAV autonomy. Specifically, we ask: can UAVs autonomously fly without seeing? We present BatMobility, a lightweight mmWave radar-only perception system for autonomous UAVs that completely eliminates the need for any optical sensors. BatMobility enables vision-free autonomy through two key functionalities: radio flow estimation (a novel FMCW radar-based alternative to optical flow based on surface-parallel Doppler shift) and radar-based collision avoidance. We build BatMobility using inexpensive commodity sensors and deploy it as a real-time system on a small off-the-shelf quadcopter, showing its compatibility with existing flight controllers. Surprisingly, our evaluation shows that BatMobility achieves comparable or better performance than commercial-grade optical sensors across a wide range of scenarios.

Surface-Parallel Doppler Shift

How can we stabilize a drone using only a downward-facing radar? Although obtaining altitude is straightforward, measuring ground-parallel motion is challenging. Our key insight is that at mmWave frequencies, most ground surfaces appear diffuse to radar. This means ground-parallel motion induces unique patterns in the Doppler-angle plane. We can make such patterns apparent even on very low-resolution antenna arrays by extending the Doppler axis, i.e., capturing more chirps per frame for finer Doppler resolution.


Figure: Top: the amplitude is correlated with the magnitude of the velocity. Bottom: with a 2-D antenna array, we can recover the direction of motion.
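To make this concrete, here is a minimal NumPy sketch of the standard FFT chain (range FFT over samples, Doppler FFT over chirps, angle FFT over receive antennas) that turns one FMCW frame into a Doppler-angle heatmap. The function name, array shape, and fixed ground range bin are illustrative assumptions, not the exact pipeline from the paper.

import numpy as np

def doppler_angle_heatmap(cube, range_bin, angle_bins=64):
    # cube: complex IF samples of shape (num_chirps, num_rx, num_samples).
    rng = np.fft.fft(cube, axis=2)                          # range FFT (fast time)
    dop = np.fft.fftshift(np.fft.fft(rng, axis=0), axes=0)  # Doppler FFT (slow time)
    ground = dop[:, :, range_bin]                           # keep the bin containing the ground
    # A zero-padded FFT across the RX elements yields coarse angle bins even
    # for a very small antenna array; more chirps per frame extend the
    # Doppler axis and make motion patterns easier to resolve.
    ang = np.fft.fftshift(np.fft.fft(ground, n=angle_bins, axis=1), axes=1)
    return np.abs(ang)                                      # (num_chirps, angle_bins)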



Radio Flow Estimation

The previous simulated toy examples assume a perfectly diffuse surface. In practice, however, Doppler-angle heatmaps look different under real-world conditions due to specularity, multipath, noise, etc. We collect data over a variety of real-world surfaces and train a lightweight CNN to regress velocity from these imperfect Doppler-angle heatmaps. We also convert the predicted velocities to angular flow values for plug-and-play compatibility with existing flight controllers.

Figure: measured Doppler-angle heatmaps over a rough surface (grass) and a smooth surface (concrete).
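As a rough illustration, the sketch below shows the kind of lightweight regressor this could be; the architecture, layer sizes, and the to_angular_flow helper are assumptions for illustration, not the trained model from the paper. The final division by altitude reflects how optical flow sensors report angular rates (roughly linear velocity divided by height above ground), which is what makes the output plug-and-play.

import torch
import torch.nn as nn

class RadioFlowNet(nn.Module):
    # Hypothetical lightweight CNN: Doppler-angle heatmap -> planar velocity (vx, vy).
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 2)

    def forward(self, heatmap):
        # heatmap: (batch, 1, doppler_bins, angle_bins)
        return self.head(self.features(heatmap).flatten(1))

def to_angular_flow(v_xy, altitude):
    # Angular flow (rad/s) ~= linear velocity / height above ground, so the
    # radar-measured altitude converts the regressed velocity into the same
    # units an optical flow sensor would report.
    return v_xy / altitude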



Position Hold

We compare the loitering behavior of a UAV using (a) a commercial optical flow sensor and (b) BatMobility. On the left, the optical-flow UAV is unable to hold its position in featureless or dark environments. On the right, BatMobility uses surface-parallel Doppler shift to remain stable despite the lack of visual or geometric ground features.

Figure: position hold with optical flow versus radio flow, shown in a featureless environment and in the dark.
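To see why a velocity estimate alone suffices for loitering, consider integrating it into a drift estimate and commanding velocity against that drift. The sketch below is a minimal illustration of such a loop; in practice this logic lives inside the existing flight controller, and the gains, loop rate, and names here are assumptions.

def position_hold_step(v_est, drift, dt=0.02, kp=0.8, kv=0.4):
    # v_est: (vx, vy) radio-flow velocity estimate; drift: accumulated displacement.
    drift = (drift[0] + v_est[0] * dt, drift[1] + v_est[1] * dt)  # dead-reckoned drift
    v_cmd = (-kp * drift[0] - kv * v_est[0],  # pull back toward the hold point,
             -kp * drift[1] - kv * v_est[1])  # with damping on current velocity
    return v_cmd, drift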



Velocity Control

BatMobility enables feedback-based velocity control in harsh environments, such as in the dark. This provides the key action primitives (e.g., move forward, move left, turn right, stop) needed for vision-free navigation in unstructured environments.
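As a sketch of what closing this loop could look like, the fragment below runs one step of a PI controller that tilts the vehicle until the radar-estimated velocity matches the commanded setpoint. The gains, loop rate, and interface are illustrative assumptions rather than BatMobility's actual flight-controller integration.

KP, KI = 0.35, 0.05  # illustrative gains
DT = 0.02            # assumed 50 Hz control loop

def velocity_control_step(v_set, v_est, integ):
    # PI feedback on the error between commanded and radar-estimated velocity.
    err = (v_set[0] - v_est[0], v_set[1] - v_est[1])
    integ = (integ[0] + err[0] * DT, integ[1] + err[1] * DT)
    pitch_cmd = KP * err[0] + KI * integ[0]  # forward/back tilt
    roll_cmd = KP * err[1] + KI * integ[1]   # left/right tilt
    return (pitch_cmd, roll_cmd), integ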

BibTeX

@inproceedings{sie2023batmobility,
  author    = {Sie, Emerson and Liu, Zikun and Vasisht, Deepak},
  title     = {BatMobility: Towards Flying Without Seeing for Autonomous Drones},
  booktitle = {ACM International Conference on Mobile Computing and Networking (MobiCom)},
  year      = {2023},
  doi       = {10.1145/3570361.3592532},
}

Acknowledgements

We thank the reviewers and our anonymous shepherd for their insightful comments and suggestions on improving this paper. This work was supported in part by NSF RINGS Award 2148583. This work was carried out in part in the Intelligent Robotics Laboratory, University of Illinois Urbana-Champaign. We thank John M. Hart for his help with the flying arena. We thank Shahab Nikkhoo for his guidance and suggestions. We thank Kris Hauser for letting us use his 3D printer. We are grateful to Jayanth Shenoy, Bill Tao, Maleeha Masood, Ishani Janveja, and Om Chabra for their feedback on initial drafts.