# Neural Slow Feature Analysis (SFA)
This small experiment explores **Neural Slow Feature Analysis** – learning an
embedding *y* that
1. changes **slowly** over time,
2. is **decorrelated** across dimensions, and
3. has **unit variance**.
We train the network on the image sequence stored in `frames/`, which contains
consecutive frames of a video. The approach follows the classic SFA objective
```
L_sfa = L_slowness + L_variance + L_correlation
L_slowness = mean(||y_t - y_{t-1}||²)
L_variance = mean( (diag(C) - 1)² )
L_correlation = mean( (C ⊙ (1 − I))² )
C = (Yᵀ Y) / N (covariance of zero-mean batch embeddings)
```
`⊙` denotes element-wise multiplication and `I` is the identity matrix.
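A minimal PyTorch sketch of this composite loss might look like the following (the function name and exact reduction details are illustrative, not necessarily the repository's actual implementation):

```python
import torch

def sfa_loss(y, y_prev):
    """Composite SFA loss for batches of embeddings y_t and y_{t-1}, shape (N, D)."""
    # Slowness: mean squared temporal difference between consecutive embeddings.
    l_slow = ((y - y_prev) ** 2).sum(dim=1).mean()

    # Zero-mean the current embeddings before estimating the covariance C = Yᵀ Y / N.
    y_centered = y - y.mean(dim=0, keepdim=True)
    cov = (y_centered.T @ y_centered) / y.shape[0]

    eye = torch.eye(cov.shape[0], device=y.device)
    # Variance term: push diagonal entries of C toward 1 (unit variance).
    l_var = ((torch.diagonal(cov) - 1.0) ** 2).mean()
    # Correlation term: push off-diagonal entries of C toward 0 (decorrelation).
    l_corr = ((cov * (1.0 - eye)) ** 2).mean()

    return l_slow + l_var + l_corr
```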
## Encoder Architecture
```
Input image (3×H×W)
│
├─ 3× Conv-BN-ReLU blocks (stride 2) # ↓ spatial resolution, ↑ channels
│ (channels: 32 → 64 → 128)
│
├─ Global Average Pooling # 128-D feature vector *h*
│
└─ Recurrent MLP head                  # predicts embedding y_t
├─ Concatenate [h, y_{t-1}] # 128+emb_dim
├─ Linear → ReLU (256 units)
└─ Linear (emb_dim)
Output: embedding y_t (emb_dim = 32 by default)
```
The **MLP head** receives the CNN features of the *current* frame together
with the embedding produced for the *previous* frame (`y_{t-1}`). This simple
recurrence helps the network explicitly reason about temporal continuity.
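A sketch of such an encoder in PyTorch follows the diagram above; the class name, kernel sizes, and other details not shown in the diagram are assumptions rather than the repository's exact definitions:

```python
import torch
import torch.nn as nn

class SFAEncoder(nn.Module):
    def __init__(self, emb_dim=32):
        super().__init__()

        def block(c_in, c_out):
            # Conv-BN-ReLU block with stride 2: halves spatial resolution, grows channels.
            return nn.Sequential(
                nn.Conv2d(c_in, c_out, kernel_size=3, stride=2, padding=1),
                nn.BatchNorm2d(c_out),
                nn.ReLU(inplace=True),
            )

        self.conv = nn.Sequential(block(3, 32), block(32, 64), block(64, 128))
        self.pool = nn.AdaptiveAvgPool2d(1)  # global average pooling -> 128-D feature h
        # Recurrent MLP head: consumes [h, y_{t-1}] and predicts y_t.
        self.head = nn.Sequential(
            nn.Linear(128 + emb_dim, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, emb_dim),
        )

    def forward(self, x, y_prev):
        h = self.pool(self.conv(x)).flatten(1)           # (N, 128)
        return self.head(torch.cat([h, y_prev], dim=1))  # (N, emb_dim)
```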
## Running the experiment
```
python train_sfa.py --data-dir frames --img-size 64 --epochs 20
```
All important hyper-parameters, such as batch size, embedding dimension, and
learning rate, can be configured via command-line flags; run
`python train_sfa.py --help` for the full list.
## File overview
- `neural_sfa.py` – model definition and composite loss.
- `train_sfa.py` – minimal training loop for consecutive frame pairs (sketched below).
- `./frames/` – video frames that serve as the training data.
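As a rough illustration of the per-epoch logic in `train_sfa.py` (a minimal sketch only; data loading, flag parsing, and checkpointing are omitted, and the decision to detach the recurrent input is an assumption, not necessarily what the script does):

```python
import torch

def train_epoch(model, pairs, optimizer, emb_dim=32, device="cpu"):
    """Assumed setup: `model` is the recurrent encoder, `pairs` yields batches of
    consecutive frames (x_prev, x_curr), and `sfa_loss` is the composite loss above."""
    model.train()
    for x_prev, x_curr in pairs:
        x_prev, x_curr = x_prev.to(device), x_curr.to(device)
        # Start the recurrence from a zero embedding for the first frame of the pair.
        y0 = torch.zeros(x_prev.shape[0], emb_dim, device=device)
        y_prev = model(x_prev, y0)
        # Feed the previous embedding back in (detached here to keep the graph shallow).
        y_curr = model(x_curr, y_prev.detach())
        loss = sfa_loss(y_curr, y_prev)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```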