Efficient Fire Detection for Uncertain Surveillance Environment

Bibliographic Details
Published in: IEEE Transactions on Industrial Informatics, 2019-05, Vol. 15 (5), pp. 3113-3122
Main Authors: Muhammad, Khan; Khan, Salman; Elhoseny, Mohamed; Ahmed, Syed Hassan; Baik, Sung Wook
Format: Article
Language:English
Description
Summary: The Tactile Internet can combine multiple technologies by enabling intelligence via mobile edge computing and data transmission over a 5G network. Recently, several convolutional neural network (CNN) based methods using edge intelligence have been applied to fire detection in certain environments, with reasonable accuracy and running time. However, these methods fail to detect fire in uncertain Internet of Things (IoT) environments containing smoke, fog, or snow. Furthermore, achieving good accuracy with reduced running time and model size is challenging for resource-constrained devices. Therefore, in this paper, we propose an efficient CNN-based system for fire detection in videos captured in uncertain surveillance scenarios. Our approach uses lightweight deep neural networks with no dense fully connected layers, making it computationally inexpensive. Experiments conducted on benchmark fire datasets reveal the better performance of our approach compared to the state of the art. Considering the accuracy, false alarm rate, size, and running time of our system, we believe it is a suitable candidate for fire detection in uncertain IoT environments for mobile and embedded vision applications during surveillance.
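The abstract attributes the system's small model size to dropping dense fully connected layers. A back-of-the-envelope parameter count (with assumed feature-map dimensions, not figures taken from the paper) sketches why such a design choice shrinks a CNN classifier:

```python
# Hypothetical illustration (not the paper's actual architecture): compare
# the parameter count of a classic flatten-then-dense classification head
# against a global-average-pooling head for a binary fire / no-fire output.
# Assumed final conv stage: 512 channels over a 7x7 spatial grid.

channels, h, w, num_classes = 512, 7, 7, 2

# Classic head: flatten the feature map, then one dense layer to the classes.
dense_params = (channels * h * w) * num_classes + num_classes

# Lightweight head: global average pooling collapses each channel to a single
# value (zero parameters), then a small linear layer maps 512 -> 2.
gap_params = channels * num_classes + num_classes

print(dense_params)  # 50178
print(gap_params)    # 1026
```

Under these assumed dimensions, the pooling-based head needs roughly 50x fewer parameters than the dense head, which is consistent with the abstract's claim of reduced model size and running time on resource-constrained IoT devices.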
ISSN:1551-3203
1941-0050
DOI:10.1109/TII.2019.2897594