Human detection based on deep learning YOLO-v2 for real-time UAV applications

Bibliographic Details
Published in: Journal of Experimental & Theoretical Artificial Intelligence, 2022-05, Vol. 34 (3), p. 527-544
Main Authors: Boudjit, Kamel, Ramzan, Naeem
Format: Article
Language: English
Description
Summary: Recent advancements in the field of Artificial Intelligence (AI) have provided an opportunity to create autonomous devices, robots, and machines characterised particularly by the ability to make decisions and perform tasks without human mediation. One such device, the Unmanned Aerial Vehicle (UAV) or drone, is widely used to perform tasks such as surveillance, search and rescue, object detection, and target tracking. Efficient real-time object detection in aerial videos is an urgent need, especially given the increasing use of UAVs in various fields, and the sensitivity of such tasks demands that drones be efficient and reliable. This paper presents our research progress in developing applications for the identification and detection of persons from a drone's camera using the YOLO-v2 convolutional neural network (CNN). The position and state of the person are determined with deep-learning-based computer vision. The person detection results show that YOLO-v2 detects and classifies objects with a high level of accuracy. For real-time tracking, the tracking algorithm responds faster than conventionally used approaches, efficiently following the detected person without losing sight of them.
ISSN: 0952-813X, 1362-3079
DOI: 10.1080/0952813X.2021.1907793
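
The record does not include the authors' implementation. As a rough illustration of the kind of pipeline the abstract describes, the sketch below runs a pre-trained Darknet YOLO-v2 model over video frames with OpenCV's DNN module and keeps only "person" detections. The file names, confidence threshold, and the choice of OpenCV are placeholder assumptions for this sketch, not details taken from the paper.

    import cv2
    import numpy as np

    # Hypothetical file names: a pre-trained Darknet YOLO-v2 config/weights
    # pair and a recorded video stand in for the paper's live drone feed.
    CFG, WEIGHTS, VIDEO = "yolov2.cfg", "yolov2.weights", "drone_footage.mp4"
    PERSON_CLASS_ID = 0    # "person" is class 0 in the COCO label list
    CONF_THRESHOLD = 0.5   # assumed threshold, not taken from the paper

    net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
    cap = cv2.VideoCapture(VIDEO)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        h, w = frame.shape[:2]

        # YOLO-v2 takes a square 416x416 input scaled to [0, 1].
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                     swapRB=True, crop=False)
        net.setInput(blob)

        # The region layer emits one row per candidate box:
        # [cx, cy, bw, bh, objectness, class scores...], all normalised.
        for row in net.forward():
            scores = row[5:]
            class_id = int(np.argmax(scores))
            confidence = float(scores[class_id])
            if class_id == PERSON_CLASS_ID and confidence > CONF_THRESHOLD:
                cx, cy, bw, bh = row[:4] * np.array([w, h, w, h])
                x, y = int(cx - bw / 2), int(cy - bh / 2)
                cv2.rectangle(frame, (x, y), (x + int(bw), y + int(bh)),
                              (0, 255, 0), 2)

        cv2.imshow("YOLO-v2 person detection", frame)
        if cv2.waitKey(1) == 27:  # Esc to stop
            break

    cap.release()
    cv2.destroyAllWindows()

For the real-time tracking stage the abstract mentions, a tracker would typically be initialised from one of these detected boxes and updated on subsequent frames, with the detector re-run periodically to recover from drift.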