In recent years, advances in smart home technologies have underscored the need for early fire and smoke detection systems that enhance safety and security. Traditional fire detection methods relying on thermal or smoke sensors are limited in response time and environmental adaptability. To address these issues, this paper introduces the multi-scale information transformer–DETR (MITI-DETR) model, which incorporates multi-scale feature extraction and transformer-based attention mechanisms tailored specifically to fire detection in smart homes. MITI-DETR achieves a precision of 99.00%, a recall of 99.50%, and a mean average precision (mAP) of 99.00% on a custom dataset designed to reflect the diverse lighting and spatial conditions of smart homes. Extensive experiments demonstrate that MITI-DETR outperforms state-of-the-art models on these metrics, especially under challenging environmental conditions. This work provides a robust solution for early fire detection in smart homes, combining high accuracy with real-time deployment feasibility.
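The abstract names MITI-DETR's two architectural ingredients, multi-scale feature extraction and transformer-based attention, without giving implementation details. The PyTorch sketch below shows one plausible way to wire those ingredients together; the stage widths, number of scales, query count, layer depths, and the omission of positional encodings and the matching loss are illustrative assumptions, not the authors' architecture.

import torch
import torch.nn as nn


class MultiScaleTransformerDetector(nn.Module):
    def __init__(self, d_model=256, num_classes=2, num_queries=25):
        super().__init__()
        # Three convolutional stages yield feature maps at 1/4, 1/8, and 1/16 scale.
        self.stage1 = nn.Sequential(nn.Conv2d(3, 64, 3, stride=4, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())
        self.stage3 = nn.Sequential(nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.ReLU())
        # 1x1 projections bring every scale to a common embedding width.
        self.proj = nn.ModuleList([nn.Conv2d(c, d_model, 1) for c in (64, 128, 256)])
        # Tokens from all scales are concatenated and encoded with self-attention.
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=4)
        # Learned object queries are decoded against the encoded tokens (DETR-style).
        self.queries = nn.Parameter(torch.randn(num_queries, d_model))
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=4)
        self.class_head = nn.Linear(d_model, num_classes + 1)  # fire, smoke, no-object
        self.box_head = nn.Linear(d_model, 4)                  # normalized (cx, cy, w, h)

    def forward(self, images):
        f1 = self.stage1(images)
        f2 = self.stage2(f1)
        f3 = self.stage3(f2)
        # Flatten each scale to a token sequence and fuse them along the token axis.
        tokens = torch.cat(
            [p(f).flatten(2).transpose(1, 2) for p, f in zip(self.proj, (f1, f2, f3))],
            dim=1,
        )
        memory = self.encoder(tokens)
        q = self.queries.unsqueeze(0).expand(images.size(0), -1, -1)
        hs = self.decoder(q, memory)
        return self.class_head(hs), self.box_head(hs).sigmoid()


# Usage: one forward pass on a dummy batch of two 128x128 images.
logits, boxes = MultiScaleTransformerDetector()(torch.randn(2, 3, 128, 128))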
In this study, we propose an advanced object detection model for fire and smoke detection in maritime environments, leveraging the DETR (DEtection TRansformer) framework. To address the specific challenges of shipboard fire and smoke detection, such as varying lighting conditions, occlusions, and the complex structure of ships, we enhance the baseline DETR model by integrating EfficientNet-B0 as the backbone. This modification aims to improve detection accuracy while maintaining computational efficiency. We utilize a custom dataset of fire and smoke images captured from diverse shipboard environments, incorporating a range of data augmentation techniques to increase model robustness. The proposed model is evaluated against the baseline DETR and YOLOv5 variants, showing significant improvements in Average Precision (AP), especially in detecting small and medium-sized objects. Our model achieves a superior AP score of 38.7 and outperforms alternative models across multiple IoU thresholds (AP50, AP75), particularly in scenarios requiring high precision for small and occluded objects. The experimental results highlight the model's efficacy in early fire and smoke detection, demonstrating its potential for deployment in real-time maritime safety monitoring systems. These findings provide a foundation for future research aimed at enhancing object detection in challenging maritime environments.
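The central modification described here, swapping the DETR backbone for EfficientNet-B0 while keeping a transformer encoder-decoder head, can be sketched in a few lines of PyTorch. The head dimensions, query count, and the omission of positional encodings and the Hungarian matching loss are simplifications assumed for illustration, not the paper's exact configuration.

import torch
import torch.nn as nn
from torchvision.models import efficientnet_b0


class EfficientNetDETR(nn.Module):
    def __init__(self, d_model=256, num_classes=2, num_queries=50):
        super().__init__()
        # EfficientNet-B0 feature extractor; its final stage outputs 1280 channels.
        self.backbone = efficientnet_b0().features
        self.input_proj = nn.Conv2d(1280, d_model, kernel_size=1)
        # DETR-style encoder-decoder over the flattened backbone feature map.
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=8,
            num_encoder_layers=6, num_decoder_layers=6,
            batch_first=True,
        )
        self.queries = nn.Parameter(torch.randn(num_queries, d_model))
        self.class_head = nn.Linear(d_model, num_classes + 1)  # fire, smoke, no-object
        self.box_head = nn.Linear(d_model, 4)                  # normalized (cx, cy, w, h)

    def forward(self, images):
        feats = self.input_proj(self.backbone(images))  # (B, d_model, H, W)
        tokens = feats.flatten(2).transpose(1, 2)        # (B, H*W, d_model)
        q = self.queries.unsqueeze(0).expand(images.size(0), -1, -1)
        hs = self.transformer(tokens, q)                 # (B, num_queries, d_model)
        return self.class_head(hs), self.box_head(hs).sigmoid()


# Usage: a single forward pass on a dummy 480x480 image.
logits, boxes = EfficientNetDETR()(torch.randn(1, 3, 480, 480))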
Globally, fire incidents cause significant social, economic, and environmental destruction, making early detection and rapid response essential for minimizing such devastation. While various traditional machine learning and deep learning techniques have been proposed, their detection performance remains poor, particularly due to low-resolution data and ineffective feature selection methods. Therefore, this study develops a novel framework for accurate fire detection, especially in challenging environments, organized in two distinct phases: preprocessing and model initialization. In the preprocessing phase, super-resolution is applied to the input data using LapSRN to effectively enhance data quality. In the subsequent phase, the proposed network utilizes an attention-based deep neural network (DNN), Xception, for detailed feature selection at reduced computational cost, followed by adaptive spatial attention (ASA) to further focus the model on relevant spatial features in the training data. Additionally, we contribute a medium-scale custom fire dataset comprising high-resolution, imbalanced, and visually similar fire/non-fire images. Moreover, this study conducts extensive experiments exploring various pretrained DNN backbones with attention modules and compares the proposed network with several state-of-the-art techniques on both the custom dataset and a standard benchmark. The experimental results demonstrate that our network achieves the best precision, recall, F1-score, and accuracy among the competing techniques, proving its suitability for real-time deployment on edge devices.
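The pipeline described above combines LapSRN super-resolution preprocessing, an Xception backbone, and an adaptive spatial attention (ASA) module. The sketch below illustrates only the spatial-attention idea (channel-wise pooling, a small convolution, and sigmoid gating of the feature map) attached to a placeholder backbone; the paper's Xception network and LapSRN stage, and the exact form of ASA, are not reproduced here, and the kernel size and placeholder layers are assumptions.

import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        # Describe each spatial location by its channel-wise mean and max,
        # then learn a per-location weight in [0, 1] and rescale the features.
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        weights = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * weights


class FireClassifier(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        # Placeholder backbone; the paper's Xception network would slot in here.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.attention = SpatialAttention()
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, num_classes)
        )

    def forward(self, x):
        return self.head(self.attention(self.backbone(x)))


# Usage: fire / non-fire logits for a dummy batch of 224x224 images.
logits = FireClassifier()(torch.randn(4, 3, 224, 224))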