Density Estimation
What is density estimation?
- estimating the probability distribution of a random variable: its possible values and the probability of each
- density = probability density; the goal is to estimate the pdf, which gives the relative likelihood of each value
- 1. parametric: assume a family of distributions and estimate its parameters (the approach the models below take)
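A minimal sketch of the parametric approach: assume the data come from a single Gaussian and estimate its two parameters by maximum likelihood. The data here are synthetic, generated only so the estimates can be checked:

```python
import numpy as np

# Synthetic data from a known Gaussian, so we can check the estimates.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=10_000)

# Parametric density estimation: assume a Gaussian family and
# fit its parameters by maximum likelihood (sample mean and std).
mu_hat = data.mean()
sigma_hat = data.std()

def gaussian_pdf(x, mu, sigma):
    """Estimated pdf: relative likelihood of each possible value."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

print(mu_hat, sigma_hat)  # close to the true 2.0 and 0.5
```

Non-parametric alternatives (histograms, kernel density estimation) avoid the distribution-family assumption entirely.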
What is a Gaussian Mixture Model?
- a weighted mixture of several Gaussian distributions
- a way of (soft) clustering; k (= the number of clusters) must be provided
- k determines the number of Gaussian components the model is split into
- each Gaussian's parameters are found with the EM (Expectation-Maximization) algorithm
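A bare-bones 1-D version of EM for a two-component GMM, in pure NumPy. The initial values and iteration count are illustrative; a real implementation would also monitor the log-likelihood for convergence:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two well-separated clusters; k = 2 must be chosen in advance.
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])

mu = np.array([-1.0, 1.0])   # initial component means
var = np.array([1.0, 1.0])   # initial component variances
pi = np.array([0.5, 0.5])    # initial mixing weights

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: responsibility of each component for each point
    dens = pi * normal_pdf(x[:, None], mu, var)          # shape (n, k)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities
    nk = resp.sum(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    pi = nk / len(x)

print(np.sort(mu))  # recovered means, close to -3 and 3
```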
DAGMM (Deep Auto-encoding Gaussian Mixture Model)
- 1. compression network (deep autoencoder): dimensionality reduction of the input
- 2. estimation network (mixture model): predicts the likelihood under a GMM, i.e. performs the density estimation
- the two networks are trained jointly, in an end-to-end fashion
- result: up to 14% improvement in F1-score
- joint training prevents the information loss and optimization difficulties of training the two stages separately
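To show the shape of the pipeline only: the sketch below replaces DAGMM's jointly trained networks with a non-joint stand-in, PCA for the compression step and a single Gaussian energy for the estimation step, keeping DAGMM's idea of feeding both the low-dimensional code and the reconstruction error into the density model. DAGMM itself trains both networks together, which this decoupled version deliberately does not do:

```python
import numpy as np

rng = np.random.default_rng(2)
# Normal data lies near a 2-D subspace of a 5-D space.
latent = rng.normal(size=(500, 2))
W = rng.normal(size=(2, 5))
X = latent @ W + 0.05 * rng.normal(size=(500, 5))

# "Compression network" stand-in: linear PCA via SVD.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T                          # low-dimensional code
recon = Z @ Vt[:2]                         # reconstruction
err = np.linalg.norm(Xc - recon, axis=1)   # reconstruction-error feature

# "Estimation network" stand-in: Gaussian energy on [code, error].
feats = np.column_stack([Z, err])
mu = feats.mean(axis=0)
cov = np.cov(feats, rowvar=False) + 1e-6 * np.eye(3)
inv = np.linalg.inv(cov)
energy = np.einsum('ij,jk,ik->i', feats - mu, inv, feats - mu)

# A point far from the normal subspace gets a high energy (anomalous).
x_anom = np.array([[5.0, 5.0, 5.0, 5.0, 5.0]]) - X.mean(axis=0)
z_a = x_anom @ Vt[:2].T
e_a = np.linalg.norm(x_anom - z_a @ Vt[:2], axis=1)
f_a = np.column_stack([z_a, e_a])
energy_a = np.einsum('ij,jk,ik->i', f_a - mu, inv, f_a - mu)
```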
Dimensionality Reduction
SPREAD (Sparse Recurrent Neural Network based Anomaly Detection)
- dimensionality reduction with an encoder-decoder architecture
- “SPREAD combines the advantages of dimensionality reduction as well as temporal encoding to learn a robust temporal model of high-dimensional time series”, “leading to better regularization and a robust temporal model”
- encoder: uses sparse inputs (a feedforward layer)
- decoder: reconstructs all of the original input dimensions
- sparsity constraints are added on the weights
- trained with the Adam optimizer
- works well in high dimensions, even without full knowledge of all dimensions
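The notes above say only that sparsity constraints are placed on the weights. One standard way to impose an L1 penalty during training is a proximal (soft-thresholding) step after each gradient update, shown here in isolation; the exact constraint SPREAD uses may differ:

```python
import numpy as np

def soft_threshold(w, lam):
    """Proximal step for an L1 penalty: shrink every weight toward
    zero by lam, and set weights smaller than lam exactly to zero.
    Repeated application during training yields sparse weights."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w = np.array([0.8, -0.05, 0.3, 0.02, -0.6])
print(soft_threshold(w, 0.1))  # the two small weights become exactly 0
```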
Prediction
LSTM-NDT (Long Short Term Memory-Nonparametric Dynamic Thresholding)
- Method 1: Telemetry Value Prediction with LSTMs. Compare the model's predicted value with the observed value to check for anomalies
- Method 2: Dynamic Error Thresholds. Smooth the prediction errors with an exponentially weighted average, then set the threshold from the smoothed errors
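A toy sketch of the smoothing-then-thresholding idea. The mean + z * std rule below is a simplified stand-in for the paper's unsupervised threshold selection (which also prunes anomalies and searches over candidate thresholds), and z = 1.5 is tuned to this tiny series:

```python
import numpy as np

def dynamic_threshold(errors, alpha=0.3, z=1.5):
    """Smooth raw prediction errors with an exponentially weighted
    average, then flag points whose smoothed error exceeds
    mean + z * std of the smoothed series."""
    s = np.empty(len(errors))
    s[0] = errors[0]
    for t in range(1, len(errors)):
        s[t] = alpha * errors[t] + (1 - alpha) * s[t - 1]
    threshold = s.mean() + z * s.std()
    return s, threshold, np.where(s > threshold)[0]

# Prediction errors with a spike at indices 5-6.
errors = np.array([0.1, 0.2, 0.1, 0.15, 0.1, 10.0, 8.0, 0.1, 0.2])
smoothed, threshold, anomalies = dynamic_threshold(errors)
print(anomalies)
```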
Reconstruction
MSCRED (Multi-Scale Convolutional Recurrent Encoder-Decoder)
- characterizes system status at multiple scales over time using signature matrices
- convolutional encoder: captures spatial patterns (inter-sensor correlations)
- ConvLSTM (Convolutional Long Short-Term Memory) layer: captures temporal patterns
- convolutional decoder: reconstructs the input signature matrices
- anomalies are detected from the residuals between the input and reconstructed signature matrices
- less affected by noise than ARMA and LSTM-ED
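A signature matrix itself is simple to compute: for a window of length w ending at time t, entry (i, j) is the inner product of sensor i's and sensor j's segments, divided by w. Synthetic data below, with two nearly identical sensors and one independent one:

```python
import numpy as np

def signature_matrix(X, t, w):
    """Signature matrix at time t: pairwise inner products of the
    sensor series over the last w steps, normalized by w.
    Captures inter-sensor correlations. X: (n_sensors, T)."""
    seg = X[:, t - w:t]
    return seg @ seg.T / w

rng = np.random.default_rng(3)
base = rng.normal(size=100)
X = np.vstack([
    base,                                # sensor 0
    base + 0.01 * rng.normal(size=100),  # sensor 1: nearly identical to 0
    rng.normal(size=100),                # sensor 2: independent
])
S = signature_matrix(X, t=100, w=50)
# S[0, 1] is large (correlated sensors); S[0, 2] is near zero.
```

MSCRED builds these at several window lengths w, which is where the "multi-scale" in the name comes from.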
USAD (Unsupervised Anomaly Detection)
- great stability
- fast training
- not easily affected by the choice of parameters
- needs enough normal data to distinguish anomalies from normal behavior
Variational Autoencoder
OmniAnomaly
- a stochastic recurrent neural network for multivariate time series
- scores points by reconstruction probability
- intuitive and effective
- with time series data, the model must accurately capture the temporal patterns
- considers anomalies across the different dimensions
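The reconstruction-probability idea, shown schematically: instead of a plain reconstruction error, each point is scored by the log-likelihood of the input under the decoder's output distribution. The decoder means and variances below are hypothetical stand-ins for what a trained VAE would produce:

```python
import numpy as np

def reconstruction_log_prob(x, dec_mu, dec_var):
    """Log-likelihood of the input under the decoder's Gaussian
    output distribution N(dec_mu, dec_var). Low values = improbable
    reconstructions = likely anomalies."""
    return -0.5 * (np.log(2 * np.pi * dec_var) + (x - dec_mu) ** 2 / dec_var)

x = np.array([1.0, 1.1, 9.0])            # third point is an outlier
dec_mu = np.array([1.0, 1.0, 1.2])       # hypothetical decoder means
dec_var = np.array([0.1, 0.1, 0.1])      # hypothetical decoder variances
scores = reconstruction_log_prob(x, dec_mu, dec_var)
# The lowest score marks the most anomalous point.
```

In a real VAE this is averaged over several samples from the latent posterior, which this single-shot sketch omits.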
GAN
MAD-GAN (Multivariate Anomaly Detection-Generative Adversarial Networks)
- trains LSTM-RNNs as the generator and the discriminator
- uses both the generator and the discriminator at detection time
- DR-score (a combined discriminator/generator anomaly score) to detect anomalies
- anomaly detection loss
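The intuition behind combining the two signals can be sketched as follows; the weighting and exact functional form here are illustrative, not the paper's:

```python
def dr_score(recon_error, disc_real_prob, lam=0.5):
    """Schematic DR-score: a high generator reconstruction error or a
    low discriminator 'real' probability both raise the anomaly score."""
    return lam * recon_error + (1 - lam) * (1 - disc_real_prob)

# Normal point: reconstructs well, discriminator is confident it is real.
normal = dr_score(recon_error=0.05, disc_real_prob=0.9)
# Anomaly: reconstructs poorly, discriminator thinks it is fake.
anomalous = dr_score(recon_error=2.0, disc_real_prob=0.2)
print(normal, anomalous)  # the anomaly scores much higher
```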
RSM-GAN (Robust Seasonal Multivariate Generative Adversarial Network)
- improvements for seasonal and contaminated multivariate data, as well as in identifying anomalies
- extends a GAN with convolutional LSTM layers
- data contamination is handled by an additional encoder
- compared to existing models: lowest false-positive rate, and precision improved by 30% on real-world data
- well suited to handling complicated, seasonal real-world data
Hybrid
MTAD-GAT (Multivariate Time-series Anomaly Detection via Graph Attention Network)
- captures the relationships between the different series in multivariate data (via graph attention)
- incorporates a joint optimization strategy
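The attention-over-features idea, heavily simplified: each series is a node, and its new representation is a weighted sum of all series' representations. MTAD-GAT's actual graph attention layer has learned parameters and a LeakyReLU scoring function; this parameter-free dot-product version only shows the weighting mechanism:

```python
import numpy as np

def graph_attention(H):
    """One simplified attention step over feature nodes.
    H: (n_features, d), one representation vector per series.
    Attention weights come from a softmax over pairwise similarities;
    each row of alpha sums to 1."""
    logits = H @ H.T
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    alpha = np.exp(logits)
    alpha /= alpha.sum(axis=1, keepdims=True)
    return alpha, alpha @ H

rng = np.random.default_rng(4)
H = rng.normal(size=(6, 8))   # 6 feature series, 8-dim representations
alpha, H_new = graph_attention(H)
```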
THOC (Temporal Hierarchical One-Class Network)
- hierarchical clustering over time (multi-resolution temporal features)
- Multiscale Vector Data Description (MVDD) loss
- trained end-to-end