Hefei No. 6 High School Cambridge, Hefei, Anhui, China
Email: 3243507741@qq.com (Y.W.Z.)
Manuscript received August 7, 2024; revised September 14, 2024; accepted October 14, 2024; published October 29, 2024.
Abstract—Drones have developed rapidly in recent years across various fields, such as rescue, military, and agriculture. It has recently been proposed that drones can be used to rescue people in harsh environments. Drones fall into three types: fixed-wing drones, bi-rotor drones, and multi-rotor drones. Because bi-rotor and multi-rotor drones are small, they maneuver easily and are not limited in the locations they can photograph; thus, the drones usually used to search for lost people are bi-rotor or multi-rotor drones. After an extensive literature review, different technologies were compared. A Jetson TX1 image processing system combined with SLAM technology can make the entire search and rescue operation faster and more responsive. In this article, we explore the use of more accurate cameras and sensors on Unmanned Aerial Vehicles (UAVs). In addition, by combining a single variable-frame-rate camera mounted on a rotating turret with fixed low- and wide-angle cameras, the effective resolution of the UAV can be increased to further improve Search And Rescue (SAR) efficiency. Furthermore, security and privacy issues must be taken into consideration during the search and rescue process. Physical damage, such as vandalism and weather-related problems, is a major safety concern at the perception layer. Moreover, private images may be unintentionally captured during the search and rescue process, such as images of private vehicles, house numbers, etc. Overall, the use of these techniques can greatly improve the efficiency and accuracy of police SAR in extreme environments.
Keywords—visual tracking, Unmanned Aerial Vehicles (UAVs), missing people, image processing, face tracking, HDTV cameras
Cite: Yawen Zhang, "Visual Tracking of Drones in Missing Persons: A Research Survey,"
International Journal of Engineering and Technology, vol. 16, no. 4, pp. 211-216, 2024.
Copyright © 2024 by the authors. This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.