HawkEye: Practical In-Flight Obstacle Avoidance with Event Camera and LiDAR Fusion

ACM MobiCom 2025 Demo Submission #63

Wenhua Ding1, Zhengli Zhang1, Xin Xu1, Haoyang Wang1, Yinan Zhu2,
Shilong Ji1, Xin Zhou3, Jingao Xu4, Dongyue Huang1, Xinlei Chen1
1SIGS, Tsinghua University, 2South China University of Technology,
3Hong Kong University of Science and Technology, 4Carnegie Mellon University

Indoor Experiments

We conducted indoor tests by randomly throwing objects such as a soccer ball and a badminton shuttlecock.

Diagram of indoor experiment scenarios with drone maneuvers

Situation I: Reactive Avoidance. As shown in the lower part of the figure, when an object is detected at a safe distance (e.g., a badminton shuttlecock at >1 m), HawkEye autonomously plans and executes a smooth "Turn Right" trajectory, bypassing the obstacle without interrupting its flight path.

Situation II: Emergency Braking. In the upper part of the figure, upon detecting a fast-approaching object (e.g., a soccer ball) within a critical proximity (<1 m, marked by the red arrow), HawkEye autonomously triggers an immediate "Brake" command to halt the UAV and prevent a collision.
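The two situations reduce to a simple decision over estimated range and closing speed. The Python sketch below illustrates that logic; the 1 m brake threshold comes from the scenario above, while the names (choose_maneuver, TrackedObject) and the exact rule are hypothetical illustrations, not HawkEye's actual onboard code.

    # Illustrative decision logic for Situations I and II (names are hypothetical).
    from dataclasses import dataclass

    BRAKE_DISTANCE_M = 1.0  # critical proximity from the scenario above

    @dataclass
    class TrackedObject:
        distance_m: float         # current range to the object
        closing_speed_mps: float  # > 0 when the object approaches the UAV

    def choose_maneuver(obj: TrackedObject) -> str:
        """Select the evasive action for a tracked dynamic obstacle."""
        if obj.closing_speed_mps <= 0.0:
            return "continue"     # object is receding or stationary: keep flying
        if obj.distance_m < BRAKE_DISTANCE_M:
            return "brake"        # Situation II: emergency braking
        return "turn_right"       # Situation I: reactive avoidance

    # Example: a ball closing fast at 0.8 m triggers braking,
    # while a shuttlecock detected at 2.5 m triggers a turn.
    print(choose_maneuver(TrackedObject(distance_m=0.8, closing_speed_mps=6.0)))  # brake
    print(choose_maneuver(TrackedObject(distance_m=2.5, closing_speed_mps=3.0)))  # turn_right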


Abstract

Drones are increasingly used in applications such as last-mile delivery and infrastructure inspection, but their safe operation, especially in high-speed scenarios, remains a critical challenge. Existing vision- and LiDAR-based obstacle localization methods suffer from motion blur, latency, and low spatio-temporal resolution, making them inadequate for detecting and tracking fast-moving objects. In this work, we present HawkEye, a drone obstacle avoidance system that fuses event cameras and LiDAR to achieve high-frequency, accurate 3D tracking of dynamic objects. By leveraging the complementary strengths of both sensors, HawkEye enables robust real-time sensing and safe evasive maneuvers, addressing a key requirement for the large-scale deployment of autonomous drones.

System Overview

  • Stage 1: Event-Based 2D Motion Tracking.
    The pipeline begins by processing the raw event stream to extract a dense optical flow field that captures fine-grained motion information. This enables reliable segmentation of dynamic objects, continuous tracking of their 2D trajectories, and high-frequency velocity estimation (a simplified stand-in is sketched after the pipeline figure below).
  • Stage 2: Event-Guided LiDAR Motion Compensation.
    Sparse LiDAR points, each associated with a unique timestamp, are then corrected for motion distortion. This de-smearing process is guided by velocity vectors estimated from the event stream, yielding a physically accurate point cloud of the object at a unified reference time.
  • Stage 3: 3D Trajectory Generation.
    Finally, the 2D trajectory is fused with the motion-compensated point cloud. A depth value is assigned to each 2D track point by associating it with the nearest 3D points in the cloud, resulting in a high-fidelity 3D trajectory that accurately captures the object's spatial position and motion over time. A minimal numerical sketch of Stages 2 and 3 also follows the pipeline figure below.
HawkEye system pipeline diagram showing the three stages of fusion
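To make Stage 1 concrete, the following simplified stand-in bins events into two count images and runs a classical dense optical flow (Farneback) between them. HawkEye's actual event-based flow estimator does not work on accumulated frames; this sketch only illustrates the input and output of the step, and the helper names and bin layout are assumptions.

    # Simplified stand-in for Stage 1: dense motion estimation from events.
    # Events are assumed to arrive as an (N, 4) array with columns (x, y, t, polarity).
    import numpy as np
    import cv2

    H, W = 720, 1280  # EVK4 HD / IMX636 resolution

    def events_to_frame(events, h=H, w=W):
        """Accumulate events into a per-pixel count image, normalized to 8 bits."""
        frame = np.zeros((h, w), dtype=np.float32)
        np.add.at(frame, (events[:, 1].astype(int), events[:, 0].astype(int)), 1.0)
        return cv2.normalize(frame, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

    def dense_flow(events_prev, events_curr):
        """Dense optical flow field (H x W x 2, pixels per bin) between two event bins."""
        f0 = events_to_frame(events_prev)
        f1 = events_to_frame(events_curr)
        # args: prev, next, flow, pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
        return cv2.calcOpticalFlowFarneback(f0, f1, None, 0.5, 3, 15, 3, 5, 1.2, 0)

Stages 2 and 3 can likewise be sketched as two short NumPy routines: one shifts each LiDAR point to a common reference time using the event-derived velocity, and one assigns a depth to the tracked pixel from the nearest projected points. The constant-velocity assumption, the intrinsics matrix K, the assumption that points are already expressed in the event-camera frame, and the averaging over the k nearest points are illustrative choices, not HawkEye's calibrated values.

    # Minimal sketch of Stages 2-3: event-guided motion compensation followed by
    # depth assignment to the 2D track (all constants are illustrative assumptions).
    import numpy as np

    K = np.array([[800.0,   0.0, 640.0],   # assumed pinhole intrinsics (fx, fy, cx, cy)
                  [  0.0, 800.0, 360.0],
                  [  0.0,   0.0,   1.0]])

    def compensate_points(points, timestamps, t_ref, velocity):
        """Stage 2: shift each LiDAR point to the reference time t_ref.

        points     : (N, 3) object points in the camera frame [m]
        timestamps : (N,)   per-point capture times [s]
        velocity   : (3,)   object velocity estimated from the event stream [m/s]
        """
        dt = (t_ref - timestamps)[:, None]          # (N, 1) time offset per point
        return points + dt * velocity[None, :]      # constant-velocity de-smearing

    def assign_depth(track_uv, points_ref, k_nearest=3):
        """Stage 3: give the 2D track point a 3D position from the compensated cloud.

        track_uv   : (2,)   tracked pixel position at t_ref
        points_ref : (N, 3) motion-compensated points (camera frame)
        """
        proj = (K @ points_ref.T).T                 # project points into the image
        uv = proj[:, :2] / proj[:, 2:3]
        dist = np.linalg.norm(uv - track_uv[None, :], axis=1)
        nearest = np.argsort(dist)[:k_nearest]      # nearest projected points
        depth = points_ref[nearest, 2].mean()       # average their depth (z)
        # back-project the pixel to a 3D position at the estimated depth
        xy = (track_uv - K[:2, 2]) / np.array([K[0, 0], K[1, 1]]) * depth
        return np.array([xy[0], xy[1], depth])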

Implementation and Experimental Setup

Our system is deployed on a custom-built UAV with the hardware configuration shown below. Onboard computation is handled by an Intel NUC 12. The perception module integrates two primary sensors: (1) a Prophesee EVK4 HD event camera built around a 1280×720 Sony IMX636ES sensor, and (2) a MID-360 LiDAR that provides depth measurements over a 360° horizontal field of view. As shown in the right panel of the figure, the dynamic target objects (a ball and a die) carry reflective markers to facilitate ground-truth validation.

Hardware setup showing the custom-built UAV with event camera, LiDAR, and target objects
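For reference, the sensing configuration above can be summarized as a single structure. The models, resolution, and field of view are taken from the text; the extrinsics entry is a placeholder for values that would be obtained by calibration.

    # Onboard sensor suite as described above (values from the text; extrinsics placeholder).
    SENSOR_SUITE = {
        "compute": "Intel NUC 12",
        "event_camera": {
            "model": "Prophesee EVK4 HD",
            "sensor": "Sony IMX636ES",
            "resolution_px": (1280, 720),
        },
        "lidar": {
            "model": "MID-360",
            "horizontal_fov_deg": 360.0,
        },
        "extrinsics_lidar_to_event_camera": None,  # placeholder: set after calibration
    }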

Outdoor Evaluation

In these tests, we threw balls at various speeds and angles against a complex forest background; HawkEye achieved a 100% success rate in obstacle depth estimation within its field of view.

3D object trajectory plot from outdoor evaluation

Experiment Video