Edge-assisted framework for instant anomaly detection and cloud-based anomaly recognition in smart surveillance
Sejong University, Seoul, South Korea.
Yonsei University, Seoul, South Korea.
Umeå University, Faculty of Science and Technology, Department of Computing Science.
Sejong University, Seoul, South Korea.
Show others and affiliations
2025 (English). In: Engineering Applications of Artificial Intelligence, ISSN 0952-1976, E-ISSN 1873-6769, Vol. 160, article id 111936. Article in journal (Refereed). Published.
Abstract [en]

Anomaly detection in surveillance systems, i.e., identifying events such as fighting, shooting, or vandalism, remains challenging: anomalous events are rare, which limits model generalization and accuracy, and anomalies vary far more than normal actions, which complicates precise detection. Current state-of-the-art methods typically process every frame captured by surveillance cameras in resource-intensive, centralized systems, incurring computational costs and network bandwidth usage unsuitable for real-time, edge-based Artificial Intelligence of Things environments. Unclear anomaly definitions and computationally demanding real-time analysis further hinder effective monitoring. To address these challenges, we propose an efficient Artificial Intelligence of Things-based anomaly detection framework employing a two-phase approach. First, a lightweight neural architecture search network classifies events as normal or anomalous directly on the edge device. When an anomaly is detected, the edge device alerts the relevant authorities and transmits the relevant video frames to the cloud for further evaluation. In the second phase, a next-generation convolutional neural network extracts spatial features, which are refined by Spatial Attention Modules and further processed by a transformer-based multi-head attention network to capture contextual relationships; a Multi-Scale Feature module then classifies the anomalies into specific types. Extensive experiments on the University of Central Florida Crime, Large-scale Anomaly Detection-2000, and Real-world Fighting-2000 datasets show that our method achieves accuracies of 53.60%, 80.02%, and 94.05%, improvements of 1.8%, 1.2%, and 0.8%, respectively. We also report area under the curve scores of 0.76, 0.98, and 0.98 on the three datasets, highlighting the method's effectiveness and robustness in anomaly detection and recognition.
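The two-phase control flow described in the abstract (a cheap on-edge binary check, followed by cloud-side recognition only for flagged clips) can be sketched as below. This is a minimal illustrative sketch, not the authors' implementation: the class names, the threshold, and the stub scoring and classification functions are all assumptions standing in for the paper's neural architecture search network and its CNN/attention recognizer.

```python
# Hypothetical sketch of the edge/cloud anomaly pipeline from the abstract.
# The stub models below are placeholders for the paper's actual networks.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class EdgeDetector:
    """Phase 1: lightweight binary classifier running on the edge device."""
    score_fn: Callable[[List[float]], float]  # anomaly score in [0, 1]
    threshold: float = 0.5

    def is_anomalous(self, frames: List[float]) -> bool:
        return self.score_fn(frames) >= self.threshold

@dataclass
class CloudRecognizer:
    """Phase 2: heavier cloud model that names the anomaly type."""
    classify_fn: Callable[[List[float]], str]

    def recognize(self, frames: List[float]) -> str:
        return self.classify_fn(frames)

def process_clip(frames: List[float], edge: EdgeDetector,
                 cloud: CloudRecognizer, alerts: list) -> str:
    """Run the edge check; only on an anomaly do we alert and escalate."""
    if not edge.is_anomalous(frames):
        return "normal"                 # normal clips never leave the edge
    alerts.append("anomaly detected")   # notify authorities from the edge
    return cloud.recognize(frames)      # only flagged frames go to the cloud

# Toy usage: mean frame intensity as a stand-in anomaly score.
alerts: list = []
edge = EdgeDetector(score_fn=lambda f: sum(f) / len(f))
cloud = CloudRecognizer(classify_fn=lambda f: "fighting")
print(process_clip([0.9, 0.8], edge, cloud, alerts))  # escalated: "fighting"
print(process_clip([0.1, 0.2], edge, cloud, alerts))  # stays on edge: "normal"
```

The key design point the sketch captures is bandwidth saving: only clips the edge flags as anomalous are transmitted upstream, so normal traffic never reaches the cloud.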

Place, publisher, year, edition, pages
Elsevier, 2025. Vol. 160, article id 111936
Keywords [en]
Anomaly detection, Anomaly recognition, Attention mechanism, Sequential learning, Smart surveillance system, Surveillance videos
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:umu:diva-243425
DOI: 10.1016/j.engappai.2025.111936
Scopus ID: 2-s2.0-105012968061
OAI: oai:DiVA.org:umu-243425
DiVA, id: diva2:1990941
Available from: 2025-08-21. Created: 2025-08-21. Last updated: 2025-08-21. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Khan, Zulfiqar Ahmad

Search in DiVA

By author/editor
Khan, Zulfiqar Ahmad
By organisation
Department of Computing Science
In the same journal
Engineering applications of artificial intelligence
Computer Sciences

Search outside of DiVA

Google
Google Scholar

Total: 96 hits