
Quant-MIA


Description

Quant-MIA is an experimental deep learning framework designed for studying model quantization and membership inference attacks (MIA). It integrates multiple quantization algorithms and privacy evaluation tools, enabling researchers to reproduce and compare results across different datasets and attack settings.

Main Features:

  • Supports multiple weight quantization methods (OBC, BRECQ, AdaRound, etc.)
  • Provides an end-to-end training / quantization / privacy attack evaluation pipeline (Codes/main_workflow.py)
  • Includes analysis and visualization tools for reproducing experiments and generating reports
  • Supports low-bit quantization, including an experimental 1.58-bit implementation (see the sketch after this list)
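
None of the repository's quantization code is reproduced here, but for orientation, per-tensor asymmetric fake quantization (the wasym option that appears in the example results below) can be sketched as follows. The ternary branch only illustrates why the experimental grid is called "1.58-bit" (log2 3 ≈ 1.585); the actual OBC/BRECQ/AdaRound methods additionally optimize rounding using calibration data.

    import torch

    def fake_quantize(w: torch.Tensor, wbits: float) -> torch.Tensor:
        """Quantize-dequantize a weight tensor on an asymmetric uniform grid."""
        if wbits == 1.58:
            # Ternary grid {-1, 0, +1}: log2(3) ~ 1.58 bits per weight.
            scale = w.abs().mean()
            return torch.clamp(torch.round(w / scale), -1, 1) * scale
        n_levels = 2 ** int(wbits) - 1
        w_min, w_max = w.min(), w.max()
        scale = (w_max - w_min).clamp(min=1e-8) / n_levels  # grid step size
        zero_point = torch.round(-w_min / scale)            # asymmetric offset
        q = torch.clamp(torch.round(w / scale) + zero_point, 0, n_levels)
        return (q - zero_point) * scale                     # back to float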

Quick Start

Run the following commands. By default, datasets are stored in ./data/ and CIFAR-100 is used.
The script trains a ResNet18 target model and 64 shadow models, performs asymmetric weight quantization at 1.58, 2, and 4 bits, and evaluates both the quantized and full-precision models under MIA, reporting AUC, accuracy, and true positive rate (TPR).

cd quant-mia/
python main_workflow.py
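
The attack names in the results below ("Online", "Offline", "Global", and their fixed-variance variants) match the shadow-model likelihood-ratio attack (LiRA) implemented in lira/. As a simplified illustration only, not the repository's code, the core per-example online test looks roughly like this:

    import numpy as np
    from scipy.stats import norm

    def lira_online_score(target_conf, in_confs, out_confs):
        """Simplified per-example online LiRA test.

        target_conf: logit-scaled confidence of the target model on one example
        in_confs:    confidences from shadow models trained WITH the example
        out_confs:   confidences from shadow models trained WITHOUT it
        Returns a log-likelihood ratio; higher means "more likely a member".
        """
        mu_in, sd_in = np.mean(in_confs), np.std(in_confs) + 1e-8
        mu_out, sd_out = np.mean(out_confs), np.std(out_confs) + 1e-8
        return (norm.logpdf(target_conf, mu_in, sd_in)
                - norm.logpdf(target_conf, mu_out, sd_out))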

The configuration file is located at config.yaml.
Command-line arguments override parameters in the config file. You can modify the file or append command-line arguments to customize your experiments.

python main_workflow.py --legend cifar10 --root path/to/cifar10
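
The repository's actual merging logic is not shown in this README; the following is only a minimal sketch of the usual PyYAML-plus-argparse pattern, reusing the --legend and --root flags from the example above. All other details are illustrative assumptions.

    import argparse

    import yaml  # PyYAML

    def load_config(path="config.yaml"):
        """Load defaults from the YAML file; CLI flags that were passed win."""
        with open(path) as f:
            cfg = yaml.safe_load(f)
        parser = argparse.ArgumentParser()
        parser.add_argument("--legend", default=None)  # dataset name, e.g. cifar10
        parser.add_argument("--root", default=None)    # dataset directory
        args = parser.parse_args()
        for key, value in vars(args).items():
            if value is not None:      # only override what was actually given
                cfg[key] = value
        return cfg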

Project Structure (Overview)

Quant-MIA/
 ├─ lira/            # Membership inference attacks (LiRA)
 ├─ quantize/        # AdaRound, BRECQ, OBC quantization
 ├─ utils/           # Utility functions
 ├─ main_workflow.py # Entry point for the experiment pipeline
 ├─ train.py         # Training script (full-precision models)
 └─ config.yaml      # Experiment configuration file

Example Experimental Results

The accuracy.json file generated during a run records all experimental settings and results.
An example entry is shown below:

    {
        "{'legend': 'cifar100', 'wbits': 2, 'Type': 'OBC quantized model'}": {
            "23033032_W2A32": {
                "remarks": "{'seed': 90649, 'root': './data/cifar100', 'n_samples': 1024, 'n_queries': 2, 'batch_size': 128, 'keep_file': '/home/qtx/workspace/quant-mia/utils/../experiments/models/cifar100/23033032.npy', 'aquant': False, 'rel_damp': 0.0, 'abits': 32, 'actsym': False, 'aminmax': False, 'set_last_layer': None, 'wperweight': False, 'wasym': True, 'wminmax': False, 'wquant': True, 'nrounds': 10, 'bnt_batches': 100}",
                "scores": 68.65999698638916,
                "attack results": "{'Online': {'AUC': 0.8968539247999998, 'ACC': 0.80574, 'TPR': 0.15808}, 'Online Fixed': {'AUC': 0.9107036911999999, 'ACC': 0.81348, 'TPR': 0.26428}, 'Offline': {'AUC': 0.7496179328000001, 'ACC': 0.70106, 'TPR': 0.03072}, 'Offline Fixed': {'AUC': 0.7495806592, 'ACC': 0.70566, 'TPR': 0.06908}, 'Global': {'AUC': 0.7852372656000001, 'ACC': 0.77636, 'TPR': 0.00024}}"
            }
        }
    }

Generated figures (if available): accuracy_vs_bits.pdf
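
To post-process these results, note that the outer keys and the "remarks"/"attack results" fields are stored as Python-repr strings rather than nested JSON, so they need ast.literal_eval rather than json.loads. A small sketch (not repository code) that prints the online TPR per run:

    import ast
    import json

    with open("accuracy.json") as f:
        results = json.load(f)

    for setting_key, runs in results.items():
        setting = ast.literal_eval(setting_key)   # e.g. {'legend': 'cifar100', ...}
        for run_id, entry in runs.items():
            attacks = ast.literal_eval(entry["attack results"])
            print(f"{setting['legend']} W{setting['wbits']} ({run_id}): "
                  f"score={entry['scores']:.2f}, "
                  f"online TPR={attacks['Online']['TPR']:.5f}")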

Acknowledgments & Citation

This project integrates and builds upon several third-party implementations.
Special thanks to the original authors of the OBC, BRECQ, AdaRound, and LiRA code for their open-source contributions.

When using or publishing results based on this project, please comply with the licenses and citation requirements of the respective subprojects.

Major modifications and extensions made in this repository include:

  • Updated dataset processing to fit the experimental pipeline
  • Added/improved evaluation methods for privacy attack analysis
  • Experimentally implemented ultra-low-bit quantization (e.g., 1.58-bit)

Contact & Contribution

If you wish to contribute, report bugs, or request new features, feel free to open a Pull Request or create an issue.
Minimal reproducible examples and experiment setups greatly speed up debugging and merging.
