Article — Subjects > Engineering
Universidad Europea del Atlántico > Research > Articles and Books
Fundación Universitaria Internacional de Colombia > Research > Scientific Output
Universidad Internacional Iberoamericana México > Research > Scientific Output
Universidad Internacional do Cuanza > Research > Scientific Output
Open Access — English

Abstract: A new artificial-intelligence-based approach is proposed: a deep learning (DL) model for identifying people who violate the face-mask protocol in public places. To achieve this goal, a private dataset was created containing face images with and without masks. The proposed model was trained to detect face masks in real-time surveillance videos. The proposed face mask detection model (FMDNet) achieved a promising detection accuracy of 99.0% for identifying violations (no face mask) in public places. The model showed better detection capability than other recent DL models such as FSA-Net, MobileNet V2, and ResNet, by 24.03%, 5.0%, and 24.10%, respectively. Moreover, the model is lightweight and achieved a confidence score of 99.0% in a resource-constrained environment, performing detection in real-time environments at 41.72 frames per second (FPS). Thus, the developed model can help governments enforce the rules of the SOP (standard operating procedure) protocol.

Metadata
Authors: Benifa, J. V. Bibal; Chola, Channabasava; Muaad, Abdullah Y.; Hayat, Mohd Ammar Bin; Bin Heyat, Md Belal; Mehrotra, Rajat; Akhtar, Faijan; Hussein, Hany S.; Ramírez-Vargas, Debora L.; Kuc Castilla, Ángel Gabriel; Díez, Isabel de la Torre; Khan, Salabat
Email: debora.ramirez@unini.edu.mx (not specified for the other authors)
Citation: (2023) FMDNet: An Efficient System for Face Mask Detection Based on Lightweight Model during COVID-19 Pandemic in Public Areas. Sensors, 23 (13), p. 6090. ISSN 1424-8220