One-Class SVM for Novelty Detection
One-Class Support Vector Machines (One-Class SVMs) are a widely used technique for novelty detection, particularly when only normal data is available during training. Unlike outlier detection methods, which tolerate some contamination in the training set, a One-Class SVM used for novelty detection is trained on data assumed to be clean: it learns a boundary that encloses the normal region and classifies any point lying outside that boundary as a novelty. This makes it well suited to use cases such as fraud detection, equipment failure prediction, and rare disease diagnosis: situations where anomalous examples are extremely rare or entirely unavailable during training.
One-Class SVM is a kernel-based method that can model nonlinear boundaries, giving it considerable flexibility in separating normal instances from unseen anomalies. In this recipe, we train the model on a synthetic dataset.
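For reference, here is the standard ν-parameterized formulation that underlies the method (a general recap, not code specific to this recipe). Given n training points x_1, ..., x_n mapped into a feature space by φ, the model solves

\[
\min_{w,\,\xi,\,\rho}\; \frac{1}{2}\lVert w\rVert^{2} + \frac{1}{\nu n}\sum_{i=1}^{n}\xi_i - \rho
\quad \text{subject to} \quad \langle w, \phi(x_i)\rangle \ge \rho - \xi_i,\;\; \xi_i \ge 0,
\]

and, via the kernel trick with kernel k, the learned decision function is

\[
f(x) = \operatorname{sign}\!\Big(\sum_{i=1}^{n} \alpha_i\, k(x_i, x) - \rho\Big),
\]

where f(x) = +1 for points inside the learned region (normal) and f(x) = −1 for points outside it (novelties). The parameter ν is an upper bound on the fraction of training points allowed to fall outside the boundary and a lower bound on the fraction of support vectors.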
Getting ready
We’ll create a dataset with normal training data and include anomalies only at test time to demonstrate novelty detection.
Load the libraries:
import numpy as np
from sklearn.svm import OneClassSVM
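Combined with the imports above, a minimal sketch of the setup described in this recipe might look like the following. Names such as X_train and X_test, and the nu and gamma values, are illustrative assumptions rather than the recipe's exact code; scikit-learn's OneClassSVM is assumed as the implementation.

# Reproducible random state (illustrative choice of seed)
rng = np.random.RandomState(42)

# Normal training data: a single Gaussian cluster around (2, 2)
X_train = 0.5 * rng.randn(200, 2) + 2.0

# Test data: normal points from the same distribution plus a few anomalies
X_test_normal = 0.5 * rng.randn(50, 2) + 2.0
X_test_anomalies = rng.uniform(low=-4, high=8, size=(10, 2))
X_test = np.vstack([X_test_normal, X_test_anomalies])

# Fit the One-Class SVM on normal data only
model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
model.fit(X_train)

# predict returns +1 for points inside the learned boundary, -1 for novelties
pred = model.predict(X_test)
print("Flagged as novelties:", np.sum(pred == -1), "of", len(X_test))

Here nu controls how tight the boundary is drawn around the training data, and gamma controls the flexibility of the RBF kernel; both typically need tuning for real datasets.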