Li-ZK/MCMD-CFSL-2025
This is a code demo for the paper "A Multimodal Prototype Correction and Multidimensional Knowledge Distillation Framework for Cross-Domain Few-Shot Hyperspectral Image Classification". IEEE Transactions on Geoscience and Remote Sensing, doi: 10.1109/TGRS.2025.3601337

[1] Jing Tian, Runze Wan, Zhaokui Li, Yan Wang, Zhuoqun Fang, Jiaxu Guo, Mingtai Qi, “A Multimodal Prototype Correction and Multidimensional Knowledge Distillation Framework for Cross-Domain Few-Shot Hyperspectral Image Classification,” IEEE Transactions on Geoscience and Remote Sensing, doi: 10.1109/TGRS.2025.3601337.

Requirements

python = 3.9.19

torchmetrics = 0.10.3

torch = 2.0.0+cu118

scikit-learn = 1.5.1

scipy = 1.13.1
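The pinned versions above can be collected into a requirements.txt; this is a sketch (the pip package names are assumptions, and Python itself is not pip-installable). PyTorch built against CUDA 11.8 is typically installed from the PyTorch wheel index, as noted in the comment:

```text
# requirements.txt (sketch) — assumes Python 3.9.19 is already installed.
# Install torch separately for CUDA 11.8:
#   pip install torch==2.0.0 --index-url https://download.pytorch.org/whl/cu118
torchmetrics==0.10.3
scikit-learn==1.5.1
scipy==1.13.1
```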

Datasets

  • source domain dataset

    • Chikusei
  • target domain datasets

    • Indian Pines
    • Houston
    • Salinas
    • WHU-Hi-LongKou

You can download the source and target datasets listed above at https://pan.baidu.com/s/1vCKXNxH120CA3gVmPROtWQ?pwd=jemc and move them to the datasets folder. In particular, for the source dataset Chikusei, you can either download it in .mat format and then process it with the utils/chikusei_imdb_128.py file to obtain the patch size you want, or directly use the preprocessed source dataset Chikusei_imdb_128.pickle, which has a patch size of 9 $\times$ 9. An example datasets folder has the following structure:

datasets
├── Chikusei_imdb_128.pickle
├── Chikusei_raw_mat
│   ├── HyperspecVNIR_Chikusei_20140729.mat
│   └── HyperspecVNIR_Chikusei_20140729_Ground_Truth.mat
├── IP
│   ├── indian_pines_corrected.mat
│   └── indian_pines_gt.mat
├── Houston
│   ├── data.mat
│   ├── mask_train.mat
│   └── mask_test.mat
├── salinas
│   ├── salinas_corrected.mat
│   └── salinas_gt.mat
└── WHU-Hi-LongKou
    ├── WHU_Hi_LongKou.mat
    └── WHU_Hi_LongKou_gt.mat
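The patch-based preprocessing mentioned above (cutting fixed-size spatial windows out of the hyperspectral cube) can be sketched as follows. This is an illustrative reimplementation, not the actual utils/chikusei_imdb_128.py; the function name and the mirror-padding choice at image borders are assumptions.

```python
import numpy as np

def extract_patch(cube, row, col, patch_size=9):
    """Extract a patch_size x patch_size spatial patch centred on (row, col)
    from an H x W x B hyperspectral cube, mirror-padding at the borders so
    that edge pixels still yield full-size patches."""
    half = patch_size // 2
    padded = np.pad(cube, ((half, half), (half, half), (0, 0)), mode="reflect")
    # (row, col) in the original cube maps to (row + half, col + half) in padded.
    return padded[row:row + patch_size, col:col + patch_size, :]

# Toy cube: 10 x 10 pixels with 128 spectral bands (a Chikusei-like band count).
cube = np.random.rand(10, 10, 128).astype(np.float32)
patch = extract_patch(cube, 0, 0)  # a corner pixel still gives a 9 x 9 patch
print(patch.shape)  # (9, 9, 128)
```

The same window size (9 $\times$ 9) is what the provided Chikusei_imdb_128.pickle was preprocessed with.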

Pretrain model

You can download the pre-trained BERT-Base model, bert-base-uncased, at https://pan.baidu.com/s/1RwcZlxOA-EX0O3ciPBLqow?pwd=1shx and move it to the pretrain-model folder.

An example pretrain-model folder has the following structure:

pretrain-model
└── bert-base-uncased
    ├── config.json
    ├── pytorch_model.bin
    ├── tokenizer.json
    ├── tokenizer_config.json
    └── vocab.txt
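Before running the scripts, it can help to confirm the downloaded model folder matches the layout above. The following helper is not part of the repository, just a small sketch that checks the five expected files:

```python
from pathlib import Path

# Files expected inside pretrain-model/bert-base-uncased (per the tree above).
REQUIRED_FILES = [
    "config.json",
    "pytorch_model.bin",
    "tokenizer.json",
    "tokenizer_config.json",
    "vocab.txt",
]

def missing_bert_files(model_dir):
    """Return the expected BERT files that are absent from model_dir."""
    root = Path(model_dir)
    return [name for name in REQUIRED_FILES if not (root / name).is_file()]

if missing_bert_files("pretrain-model/bert-base-uncased"):
    print("bert-base-uncased is incomplete; re-download it.")
```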

Usage

  1. Download the required source and target datasets and move them to the datasets folder.
  2. Download the required BERT-Base pre-trained model and move it to the pretrain-model folder.
  3. Run MCMD-CFSC-(IP/HT/LK/SA).py.
