Multi-Task Representation Learning with Temporal Attention for Zero-Shot Time Series Anomaly Detection

International Joint Conference on Neural Networks (IJCNN), IEEE, 2024 (accepted).

Abstract

Ensuring the reliability of critical industrial systems across various sectors is crucial: deviations from regular behaviour must be detected to mitigate disruptions and preserve infrastructure integrity. However, accurately labelling anomaly datasets is challenging due to the rarity of anomalies and the subjectivity of manual annotation. The conventional approach of training a separate model for each dataset entity further complicates model development. This paper presents a novel Multi-task Learning framework combining an LSTM Autoencoder with a temporal attention mechanism (MTL-LATAM) for effective time series anomaly detection. Multi-task learning models improve adaptability and generalizability, reducing runtime and compute requirements while supporting zero-shot evaluation, and they offer flexibility in detecting emerging anomalies. Additionally, we introduce a dynamic thresholding mechanism that incorporates temporal context into anomaly detection, and we provide visualizations of attention weights to enhance interpretability. The study compares MTL-LATAM with other multi-task models, evaluates multi-task against single-task models, and assesses the performance of the proposed framework in zero-shot learning scenarios. The findings indicate MTL-LATAM's effectiveness across real-world and open-source datasets, achieving 95% and 97% task synergy, respectively. The results underscore the superior performance of multi-task models on zero-shot tasks compared to individual models trained exclusively on their respective datasets.
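The abstract mentions a dynamic thresholding mechanism that uses temporal context when flagging anomalies in the reconstruction errors. The paper's exact rule is not given here; a common instance of this idea, sketched below under that assumption, is a moving threshold of mean plus k standard deviations over a trailing window of recent errors (the function name, window size, and k are illustrative, not from the paper):

```python
from collections import deque
from statistics import mean, stdev

def dynamic_threshold_flags(errors, window=64, k=3.0):
    """Flag points whose reconstruction error exceeds a moving
    mean + k * std threshold over the trailing window of errors.
    Hypothetical sketch; MTL-LATAM's actual rule may differ."""
    history = deque(maxlen=window)  # trailing temporal context
    flags = []
    for e in errors:
        if len(history) >= 2:  # need >= 2 points for a sample stdev
            mu, sigma = mean(history), stdev(history)
            flags.append(e > mu + k * sigma)
        else:
            flags.append(False)  # not enough context yet
        history.append(e)
    return flags

# Usage: low, stable errors followed by one large spike.
errors = [0.1, 0.12, 0.11, 0.1, 0.13] * 10 + [5.0]
flags = dynamic_threshold_flags(errors, window=32, k=3.0)
```

Because the threshold adapts to the local error distribution, a regime with naturally noisier reconstructions raises the bar automatically instead of triggering a flood of false positives, which is the usual motivation for dynamic over static thresholds.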