Timeband is an LSTM-GAN based model for the simultaneous detection and correction of missing values and outliers in multivariate time series data.
Step 1. Set up the configuration
Step 2. Prepare the time series dataset
Step 3. Process the time series dataset
  - Preprocessing: normalizing / scaling / ...
  - Processing: holdout / windowing / ...
Step 4. Prepare the input/output data structure
  - Real Dataset => Encoder => Context Space => Decoder => Target Dataset
Step 5. Train the LSTM-GAN based model
Step 6. Evaluate the models
Step 7. Get the outputs
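The preprocessing and windowing in Step 3 can be sketched roughly as follows. This is a minimal NumPy illustration, not Timeband's actual API; the function names, window size, and stride here are hypothetical choices for the example.

```python
import numpy as np

def minmax_scale(data):
    """Scale each column of a (T, F) array into [0, 1] (illustrative preprocessing)."""
    lo = data.min(axis=0)
    hi = data.max(axis=0)
    return (data - lo) / (hi - lo)

def sliding_windows(data, window, stride=1):
    """Cut a (T, F) series into overlapping windows of shape (N, window, F)."""
    T = data.shape[0]
    starts = range(0, T - window + 1, stride)
    return np.stack([data[s:s + window] for s in starts])

# Hypothetical multivariate series: 100 timesteps, 3 features.
series = np.random.rand(100, 3)
scaled = minmax_scale(series)
windows = sliding_windows(scaled, window=10, stride=5)
print(windows.shape)  # (19, 10, 3)
```

Windows like these become the per-sample inputs that the encoder maps into the context space in Step 4.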
Requirements:

- Nvidia device with CUDA
- Python 3.8+
- PyTorch 1.9+
- NumPy / pandas / ...

Install the Python packages with:

```shell
pip install -r requirements.txt
```
- Create a virtual environment.

  ```shell
  python -m venv .venv
  ```

- Install PyTorch.

  ```shell
  pip install torch==1.9.0
  # or, with CUDA 11.1:
  pip install torch==1.9.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
  ```

- Get Timeband.

  ```shell
  git clone https://github.com/handal95/Timeband.git
  ```

- Install Python packages.

  ```shell
  pip install -r requirements.txt
  ```

- Set the default configuration.
  - Copy `config.sample.json` to `config.json`.
- copy
