
What campus are you from?

Daytona Beach

Authors' Class Standing

Rylan Malarchick, Senior; Jose Castelblanco, Graduate Student; Chirag Kumar, Sophomore; Graysen Brinkman, Senior; Enrique Amaya, Junior; Carmen DiMario, Senior; Kiwoon Yoon, Sophomore

Lead Presenter's Name

Rylan Malarchick

Faculty Mentor Name

Dr. Sergey Drakunov

Abstract

The rise of small, agile Unmanned Aerial Vehicles (UAVs) presents a significant challenge for autonomous detection and tracking systems, especially in cluttered, real-world environments. Standard machine learning approaches are prone to failure when faced with noisy, corrupted, high-dimensional data streams, particularly those from onboard sensors on resource-constrained hardware. This work presents AIRHOUND, a UAV platform designed to address these challenges through a robust, low-latency perception and control pipeline. Our primary methodology centers on a “yaw-to-target” capability, wherein the UAV autonomously adjusts its heading to maintain a visual lock on a given target UAV. The system leverages a YOLOv8 object detection model, optimized with NVIDIA’s TensorRT for high-throughput inference on an embedded Jetson computer. To achieve robustness against intermittent detections caused by occlusion or sensor noise, we introduce a lightweight predictive state filter. This module uses successful detections to continuously update a motion model of the target, estimating its angular velocity relative to the camera frame. During frames where the detector fails, the system uses this model to predict the target’s position, ensuring the yaw controller receives an uninterrupted stream of setpoints. The resulting 2D coordinates, whether measured or predicted, are fused with intrinsic camera parameters within a ROS 2 framework to calculate a precise yaw command. This command is then streamed to a PX4 flight controller, enabling the vehicle to actively track the target despite transient data loss. Our approach demonstrates a practical solution for dynamic active sensing, achieving stable target acquisition under difficult real-world scenarios. It advances applications in security and remote sensing, and serves as a foundational step for more advanced predictive control systems.
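The abstract's two core computations, converting a detection's pixel coordinate into a yaw error via the camera intrinsics, and coasting on an estimated angular velocity when the detector drops a frame, can be sketched as below. This is a minimal illustration, not the AIRHOUND implementation: the class and function names are hypothetical, and a simple constant-angular-velocity model stands in for the paper's predictive state filter.

```python
import math


def pixel_to_yaw_error(u: float, fx: float, cx: float) -> float:
    """Map a horizontal pixel coordinate to a yaw error in radians
    using pinhole-camera intrinsics (focal length fx, principal point cx)."""
    return math.atan2(u - cx, fx)


class ConstantVelocityYawPredictor:
    """Stand-in for the predictive state filter: on each successful
    detection, re-estimate the target's angular velocity; on a missed
    frame, extrapolate the last yaw error forward in time."""

    def __init__(self) -> None:
        self.angle = None  # last measured yaw error (rad)
        self.rate = 0.0    # estimated angular velocity (rad/s)
        self.t = None      # timestamp of last measurement (s)

    def update(self, angle: float, t: float) -> float:
        """Called when the detector succeeds; refresh the motion model."""
        if self.angle is not None and t > self.t:
            self.rate = (angle - self.angle) / (t - self.t)
        self.angle, self.t = angle, t
        return angle

    def predict(self, t: float):
        """Called when the detector fails; extrapolate a setpoint so the
        yaw controller still receives an uninterrupted command stream."""
        if self.angle is None:
            return None  # no detection yet; nothing to extrapolate
        return self.angle + self.rate * (t - self.t)
```

In a ROS 2 node, the measured or predicted yaw error would then be turned into a heading setpoint and streamed to PX4 (e.g. over the offboard-control interface); that plumbing is omitted here.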

Did this research project receive funding support from the Office of Undergraduate Research?

No


Robust Real-Time UAV Target Tracking with Onboard Vision-Based Yaw Controller


 
