Document Type
Article
Abstract
This paper presents DroneChase, an automated sensing system that monitors acoustic and visual signals captured from a nearby flying drone to track its trajectory in both line-of-sight and non-line-of-sight conditions under mobility settings. Although drone monitoring has been an active research topic, most existing monitoring systems focus only on line-of-sight conditions and do not perform well under blockage. Inspired by the human ability to localize objects in the environment using both visual and auditory cues, we develop a mobile system that integrates information from multiple modalities into a reference scenario and performs real-time drone detection and trajectory monitoring. Our system, controlled by a Raspberry Pi platform, collects acoustic signals from six microphone channels arranged in a hexagonal pattern with 5 cm spacing, along with video from an HD RGB camera. The monitoring system is mounted in a moving vehicle and can track a drone even when it is flying or hovering behind bushes or trees. Furthermore, the portability of the system enables continuous chasing of the drone, allowing uninterrupted monitoring and tracking even while on the move. In addition, the proposed system performs reliably in both day and night conditions.
Publication Date
6-19-2023
Language
English
License
This work is licensed under a Creative Commons Attribution 4.0 International License.
Recommended Citation
Vora, Neel R.; Wu, Yi; Liu, Jian; and Nguyen, Phuc, "DroneChase: A Mobile and Automated Cross-Modality System for Continuous Drone Tracking" (2023). Association of Computing Machinery Open Access Agreement Publications. 77.
https://mavmatrix.uta.edu/utalibraries_acmoapubs/77