🧠 High-gamma and Low-frequency ECoG signal Fusion Network (HiLoFuseNet) for continuous finger movement decoding
This repository provides the official PyTorch implementation of the finger movement decoding framework detailed in the paper:
Sun et al., "Spectro-Temporal Fusion of High-Gamma and Low-Frequency ECoG Signals for Intracranial Finger Movement Decoding," 2025 (under review).
The model achieved SOTA decoding performance on the public BCI Competition IV and Stanford datasets, with Pearson correlation coefficients between true and predicted finger movement trajectories of 0.631 and 0.534, representing improvements of 5.0% and 11.9%, respectively, over the previous best methods.
- Our original submission to TNNLS was rejected. The reviewers raised concerns regarding the state-of-the-art claim and the necessity of fusing low-frequency signals. We are currently improving the decoding framework and conducting thorough comparisons against the latest decoders.
We benchmarked a large number of decoders across the two datasets. Since most of them do not provide open-source code, we reimplemented them ourselves; despite our best efforts to replicate the reported results, the discrepancies we observed were substantial. If you find that any of our implementations is incorrect, please email us!
The proposed framework is characterized by (a) a streamlined ECoG feature extraction pipeline and (b) a compact neural network for learning spectro-temporal information.
| Functionality | Implementation Path |
|---|---|
| HGA and LFS Feature Extraction | finger_regression/models/prepareDataset.py/HGALFS_feature_extractor |
| HiLoFuseNet Architecture | finger_regression/models/nn_regressors.py/HiLoFuseNet |
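As a rough illustration of point (b), a two-branch PyTorch module that encodes HGA and LFS features separately and fuses them for five-finger regression might look like the sketch below. All names, layer sizes, and the pooling scheme here are invented for illustration only; the actual architecture is in finger_regression/models/nn_regressors.py.

```python
import torch
import torch.nn as nn

class TwoBranchFusionSketch(nn.Module):
    """Illustrative only: encode HGA and LFS features in separate branches,
    then fuse them for 5-finger trajectory regression. The real HiLoFuseNet
    lives in finger_regression/models/nn_regressors.py."""
    def __init__(self, n_channels=48, hidden=32, n_fingers=5):
        super().__init__()
        self.hga_enc = nn.Sequential(
            nn.Conv1d(n_channels, hidden, kernel_size=3, padding=1), nn.ReLU())
        self.lfs_enc = nn.Sequential(
            nn.Conv1d(n_channels, hidden, kernel_size=3, padding=1), nn.ReLU())
        self.head = nn.Linear(2 * hidden, n_fingers)

    def forward(self, hga, lfs):  # each: (batch, channels, time)
        h = self.hga_enc(hga).mean(dim=-1)  # temporal average pooling
        l = self.lfs_enc(lfs).mean(dim=-1)
        return self.head(torch.cat([h, l], dim=-1))  # (batch, n_fingers)

model = TwoBranchFusionSketch()
out = model(torch.randn(2, 48, 100), torch.randn(2, 48, 100))
print(out.shape)  # → torch.Size([2, 5])
```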
- BCIIV: https://www.bbci.de/competition/iv/#dataset4
- Stanford-FingerFlex: https://searchworks.stanford.edu/view/zk881ps0522
Raw ECoG signals were preprocessed using MATLAB FieldTrip-20230926.
- BCIIV Preprocessing: data_preprocessing/data_preprocessing_BCI4.m
- Stanford Preprocessing: data_preprocessing/data_preprocessing_Stanford.m
Features were extracted using MNE-Python (v1.8.0).
- Script: finger_regression/prepare_taskFormatedData.py
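For orientation, the two-band feature extraction can be sketched generically as below: a band-pass filter plus Hilbert envelope for high-gamma activity (HGA), and a low-pass filter for the low-frequency signal (LFS). The band edges, filter orders, and windowing here are illustrative assumptions; the exact settings used by HGALFS_feature_extractor may differ.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def extract_hga_lfs(ecog, fs, hga_band=(70.0, 170.0), lfs_cut=4.0):
    """Split ECoG (channels x samples) into high-gamma amplitude and
    low-frequency signal features. Band edges are illustrative only."""
    # HGA: band-pass filter, then Hilbert amplitude envelope.
    b, a = butter(4, hga_band, btype="bandpass", fs=fs)
    hga = np.abs(hilbert(filtfilt(b, a, ecog, axis=-1), axis=-1))
    # LFS: low-pass filter keeps the slow movement-related component.
    b, a = butter(4, lfs_cut, btype="lowpass", fs=fs)
    lfs = filtfilt(b, a, ecog, axis=-1)
    return hga, lfs

fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
# Synthetic single channel: slow 2 Hz drift plus a 100 Hz component.
x = np.sin(2 * np.pi * 2 * t) + 0.3 * np.sin(2 * np.pi * 100 * t)
hga, lfs = extract_hga_lfs(x[None, :], fs)
print(hga.shape, lfs.shape)  # → (1, 2000) (1, 2000)
```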
Configure the PyTorch environment via finger_regression/environment.yml. The following table summarizes the scripts used to reproduce the paper's findings. Raw output files are provided in finger_regression/results. Each .slurm file provides the code to submit the corresponding job to a supercomputing cluster. If you run a script locally, please change the inputs in the .py file according to the settings in the corresponding .slurm file.
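A typical local setup might look like the following; note that the environment name is assumed here to be hilofusenet, so check environment.yml for the actual name.

```shell
# Create and activate the conda environment
# (environment name "hilofusenet" is an assumption; see environment.yml).
conda env create -f finger_regression/environment.yml
conda activate hilofusenet
# Run a script locally instead of via Slurm; adjust the inputs in the
# .py file according to the settings in the matching .slurm file.
python finger_regression/regression_o5_nn.py
```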
| Experiment | Execution Script(s) | Results Folder |
|---|---|---|
| DNN Multi-Output Regression | regression_o5_nn.py, submit_o5_nn.slurm | finger_regression/results/o5/varyingSeed |
| ML Multi-Output Regression | regression_o5_ml.py, submit_o5_ml.slurm | finger_regression/results/o5 |
| Model Interpretation | regression_o5_nn_interpretModel.py, submit_o5_nn_interpretModel.slurm | finger_regression/results/o5/interpretModel |
| Ablation Study | regression_o5_nn_ablation.py, submit_o5_nn_ablation.slurm | finger_regression/results/o5/ablation |
| Hyperparameter Test | regression_o5_nn_hyperparameter.py, submit_o5_nn_hyperparameter.slurm | finger_regression/results/o5/hyperparameter |
| DNN Single-Output Regression | regression_o1_nn.py, submit_o1_nn.slurm | finger_regression/results/o1/varyingSeed |
A sincere thanks to the code contributors for BTTR and HOPLS!
- BTTR: https://github.com/TheAxeC/block-term-tensor-regression
- HOPLS: https://github.com/arthurdehgan/HOPLS
We hope this model helps your research. We would appreciate it if you cite our work.
To appear...

