
Medication versus Globus Pallidus Internus Deep Brain Stimulation in Parkinson’s Disease: A Deep Learning-based Video Analysis of the Finger Tapping Task

This repository contains the code for the analysis described in the paper titled "Medication versus Globus Pallidus Internus Deep Brain Stimulation in Parkinson’s Disease: A Deep Learning-based Video Analysis of the Finger Tapping Task".

The code processes the output of a deep learning model (Mesh Graphormer) that reconstructs a 3D hand pose and mesh for each video frame. It covers data preprocessing, feature extraction, and machine learning model training/prediction to predict the MDS-UPDRS Part 3 Finger Tapping (FT) rating.

Requirements

To run this code, you need to have Docker installed. The necessary dependencies are managed within a Docker image and the provided requirements.txt file.

  1. Docker Image: Pull the specified Docker image:
    docker pull docker.io/library/python:3.8.5-slim-buster

  2. System Dependencies: Install the necessary system packages within the Docker environment:
    apt-get update && apt-get install -y build-essential cmake libgomp1

  3. Python Dependencies: Place the provided requirements.txt file in your project's root directory and install the required Python packages:
    pip install -r requirements.txt

Code Structure

The repository contains the following main scripts:

  • preprocessing.py : Takes the .npy files containing hand joint data (obtained from Mesh Graphormer) and a .csv file with video metadata (video name and fps) as input. It calculates the 3D Euclidean distance between the thumb and index fingertips for each frame and generates a .txt file indicating the frames where the two fingers meet and separate (a minimal sketch of this step follows the list).

  • feature_extract.py : Uses the .npy files and the .txt files generated by preprocessing.py to extract 21 distinct feature values, which are saved to a .csv file.

  • machinelearning.py : Uses the feature values from the .csv file to predict the MDS-UPDRS Part 3 FT rating for each video with a machine learning model (a sketch of these last two steps also follows the list).

  • run_pipeline.py : Provides an end-to-end script that runs the preprocessing, feature extraction, and machine learning steps sequentially.
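For orientation, the core of preprocessing.py can be sketched as follows. This is a minimal sketch rather than the repository's exact implementation: the fingertip joint indices (4 and 8, following the common 21-keypoint hand convention) and the fixed distance threshold are assumptions, and the actual meeting/separation criterion may differ.

import numpy as np

# Thumb and index fingertip indices, assuming the common 21-keypoint
# hand convention; the Mesh Graphormer output may order joints differently.
THUMB_TIP, INDEX_TIP = 4, 8

def fingertip_distance(joints):
    """Per-frame 3D Euclidean thumb-index distance.

    joints: array of shape (n_frames, 21, 3) with 3D joint positions,
    as loaded from one of the .npy files.
    """
    diff = joints[:, THUMB_TIP, :] - joints[:, INDEX_TIP, :]
    return np.linalg.norm(diff, axis=-1)

def tap_events(distance, threshold=0.02):
    """Frames where the fingers meet (distance drops below the threshold)
    and separate (distance rises back above it); the threshold value is a
    placeholder."""
    closed = (distance < threshold).astype(int)
    transitions = np.diff(closed)
    meet_frames = np.where(transitions == 1)[0] + 1
    separate_frames = np.where(transitions == -1)[0] + 1
    return meet_frames, separate_frames

Each tap cycle then runs from a meet frame to the following separate frame, which is the information the generated .txt file records.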
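In the same spirit, here is a minimal sketch of the feature-extraction and prediction steps. The three features shown are illustrative stand-ins for the 21 features computed by feature_extract.py, and the random forest, the file name features.csv, and the column names video_name / ft_rating are placeholders, not the repository's actual model or schema.

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def example_features(distance, meet_frames, fps):
    """A few illustrative per-video features derived from the distance
    signal; not the 21 features computed by feature_extract.py."""
    duration_s = len(distance) / fps
    # Peak finger opening between consecutive taps.
    amplitudes = [float(distance[a:b].max())
                  for a, b in zip(meet_frames[:-1], meet_frames[1:])]
    return {
        "tap_frequency": len(meet_frames) / duration_s if duration_s > 0 else 0.0,
        "mean_amplitude": float(np.mean(amplitudes)) if amplitudes else 0.0,
        # Shrinking amplitude over the task is a typical bradykinesia marker.
        "amplitude_decrement": amplitudes[0] - amplitudes[-1] if len(amplitudes) >= 2 else 0.0,
    }

# Fit a classifier on the extracted feature table and predict FT ratings.
features = pd.read_csv("features.csv")                   # hypothetical file name
X = features.drop(columns=["video_name", "ft_rating"])   # hypothetical columns
y = features["ft_rating"]                                # MDS-UPDRS Part 3 FT rating
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
predicted_ratings = model.predict(X)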

How to Run

You can run the entire pipeline using the run_pipeline.py script. Ensure you have your data organized as follows:

  • video_name_and_fps.csv: A CSV file containing video names and their corresponding frames per second (fps); an illustrative example follows this list.
  • hand_joint_data: A directory containing the hand joint data files (e.g., .npy files) obtained from Mesh Graphormer.
  • pipeline_results_directory: An output directory where the results (e.g., intermediate .txt files, feature .csv, prediction results) will be saved.
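For example, the metadata CSV might look like the following; the column names here are illustrative, so match whatever preprocessing.py expects:

video_name,fps
patient01_ft_left.mp4,30
patient01_ft_right.mp4,60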

Execute the script with the following command, replacing the placeholder paths with your actual data paths:

python3 run_pipeline.py \
  --metadata_csv /path/to/your/video_name_and_fps.csv \
  --hand_joint_data_dir /path/to/your/hand_joint_data \
  --pipeline_output_base_dir /path/to/your/pipeline_results_directory

Paper

A link to the full paper will be added here once it is published.

Acknowledgement

Our code builds on the code released at https://github.com/ROC-HCI/finger-tapping-severity. We thank the authors for making their code publicly available.

Contact

For questions or inquiries regarding this project, please contact:

Heeyeon Kwon - khy8387@gmail.com
