
🧠 High-gamma and Low-frequency ECoG signal Fusion Network (HiLoFuseNet) for continuous finger movement decoding

This repository provides the official PyTorch implementation of the finger movement decoding framework detailed in the paper:

Sun et al., "Spectro-Temporal Fusion of High-Gamma and Low-Frequency ECoG Signals for Intracranial Finger Movement Decoding," 2025 (under review).

The model achieved state-of-the-art (SOTA) decoding performance on the public BCI Competition IV and Stanford datasets, with Pearson correlation coefficients between the true and predicted finger movement trajectories of 0.631 and 0.534, improvements of 5.0% and 11.9%, respectively, over the previous best methods.
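The evaluation metric above is the Pearson correlation between true and predicted trajectories, averaged across fingers. A minimal NumPy sketch (function and variable names are illustrative, not the repository's actual evaluation code):

```python
import numpy as np

def mean_pearson_r(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Average Pearson correlation across fingers.

    y_true, y_pred: arrays of shape (n_samples, n_fingers) holding the
    true and predicted finger movement trajectories.
    """
    rs = []
    for k in range(y_true.shape[1]):
        t, p = y_true[:, k], y_pred[:, k]
        rs.append(np.corrcoef(t, p)[0, 1])  # Pearson r for one finger
    return float(np.mean(rs))

# Sanity check: any positive linear transform of the target gives r = 1
y = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
print(mean_pearson_r(y, 2 * y + 3))  # scale/offset invariant, so r is 1
```

Note that Pearson r is invariant to scale and offset, so a high r indicates a well-shaped trajectory but says nothing about absolute amplitude error.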

Comparison with previous studies on the BCIIV (blue) and Stanford (red) datasets.

📢📢📢 News

  • Our original submission to TNNLS was rejected. The reviewers raised concerns regarding the state-of-the-art claim and the necessity of fusing low-frequency signals. We are currently improving the decoding framework and conducting thorough comparisons against the latest decoders.

⚠️⚠️⚠️ We need you!

We benchmarked a large number of decoders across the two datasets. Since most of them did not open-source their code, we had to re-implement them, and despite our best efforts to replicate the reported results, we observed substantial discrepancies. If you find that any of our implementations is incorrect, please email us!


🛠️ Decoding Framework

The proposed framework is characterized by (a) a streamlined ECoG feature extraction pipeline and (b) a compact neural network for learning spectro-temporal information.

The HiLoFuseNet model architecture.

Core Functions

| Functionality | Implementation Path |
| --- | --- |
| HGA and LFS Feature Extraction | `finger_regression/models/prepareDataset.py` / `HGALFS_feature_extractor` |
| HiLoFuseNet Architecture | `finger_regression/models/nn_regressors.py` / `HiLoFuseNet` |
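To illustrate the fusion idea at a glance — not the actual `HiLoFuseNet` implementation (see `nn_regressors.py` for that) — here is a toy NumPy sketch in which the two spectral streams are combined and mapped to per-finger outputs; all shapes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative shapes: (batch, channels, time) feature windows for the
# high-gamma amplitude (HGA) and low-frequency signal (LFS) streams.
hga = rng.standard_normal((8, 16, 50))
lfs = rng.standard_normal((8, 16, 50))

def fuse_and_readout(hga: np.ndarray, lfs: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Toy stand-in for the learned network: concatenate the two
    spectral streams along the channel axis, flatten the resulting
    spectro-temporal features, and apply a linear readout to five
    finger trajectories."""
    x = np.concatenate([hga, lfs], axis=1)  # (batch, 2*channels, time)
    x = x.reshape(x.shape[0], -1)           # flatten spectro-temporal features
    return x @ w                            # (batch, 5)

w = rng.standard_normal((16 * 2 * 50, 5)) * 0.01
out = fuse_and_readout(hga, lfs, w)
print(out.shape)  # (8, 5): one predicted position per finger
```

The real model replaces the linear readout with a compact neural network that learns spectro-temporal structure jointly from both streams.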

💾 Quick Start & Reproducibility

1. Download Datasets

2. Signal Preprocessing

Raw ECoG signals were preprocessed using MATLAB FieldTrip-20230926.

  • BCIIV Preprocessing: data_preprocessing/data_preprocessing_BCI4.m
  • Stanford Preprocessing: data_preprocessing/data_preprocessing_Stanford.m

3. Feature Extraction

Features were extracted using MNE-Python (v1.8.0).

  • Script: finger_regression/prepare_taskFormatedData.py
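The repository performs this step with MNE-Python (v1.8.0); the underlying idea — a band-pass to the high-gamma band followed by a Hilbert amplitude envelope for HGA, and a low-pass for LFS — can be sketched with SciPy. Band edges and names here are illustrative assumptions, not the paper's exact settings; see `prepare_taskFormatedData.py` for the actual pipeline:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def extract_hga_lfs(ecog: np.ndarray, fs: float,
                    hg_band=(70.0, 170.0), lf_cut=4.0):
    """Toy HGA/LFS extraction from a (channels, samples) ECoG array.

    HGA: band-pass to the high-gamma band, then take the Hilbert
    amplitude envelope. LFS: low-pass the raw signal.
    The cutoffs are illustrative, not the paper's exact values.
    """
    nyq = fs / 2.0
    # High-gamma amplitude envelope
    b, a = butter(4, [hg_band[0] / nyq, hg_band[1] / nyq], btype="bandpass")
    hg = filtfilt(b, a, ecog, axis=-1)          # zero-phase band-pass
    hga = np.abs(hilbert(hg, axis=-1))          # instantaneous amplitude
    # Low-frequency signal
    b, a = butter(4, lf_cut / nyq, btype="lowpass")
    lfs = filtfilt(b, a, ecog, axis=-1)
    return hga, lfs

# Synthetic check: 2 Hz "movement" component plus a 100 Hz "gamma" burst
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
ecog = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 100 * t)
hga, lfs = extract_hga_lfs(ecog[np.newaxis, :], fs)
print(hga.shape, lfs.shape)  # both (1, 2000)
```

Away from the edges, the HGA envelope recovers the 0.5 amplitude of the 100 Hz component while the LFS branch keeps only the slow 2 Hz signal.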

4. Run Experiments

Configure the PyTorch environment via finger_regression/environment.yml. The following table summarizes the scripts used to reproduce the paper's findings. Raw output files are provided in finger_regression/results. The .slurm files contain the code to submit jobs to a supercomputing cluster. If you run a script locally, please change the inputs in the .py file according to the settings in the corresponding .slurm file.

| Experiment | Execution Script(s) | Results Folder |
| --- | --- | --- |
| DNN Multi-Output Regression | `regression_o5_nn.py`, `submit_o5_nn.slurm` | `finger_regression/results/o5/varyingSeed` |
| ML Multi-Output Regression | `regression_o5_ml.py`, `submit_o5_ml.slurm` | `finger_regression/results/o5` |
| Model Interpretation | `regression_o5_nn_interpretModel.py`, `submit_o5_nn_interpretModel.slurm` | `finger_regression/results/o5/interpretModel` |
| Ablation Study | `regression_o5_nn_ablation.py`, `submit_o5_nn_ablation.slurm` | `finger_regression/results/o5/ablation` |
| Hyperparameter Test | `regression_o5_nn_hyperparameter.py`, `submit_o5_nn_hyperparameter.slurm` | `finger_regression/results/o5/hyperparameter` |
| DNN Single-Output Regression | `regression_o1_nn.py`, `submit_o1_nn.slurm` | `finger_regression/results/o1/varyingSeed` |

Acknowledgement

Sincere thanks to the code contributors of BTTR and HOPLS!

Citation

We hope this model helps your research. We would appreciate it if you cite our work.

To appear...
