Welcome to the repository for the ECCV 2024 paper *Source-Free Domain-Invariant Performance Prediction*.
📄 Read the Paper
🖼️ View the Poster
This work focuses on predicting model performance on a target domain without access to the source data.
The method predicts classifier accuracy on the target domain through the following steps (a minimal sketch follows the list):

- **Calculate GMM statistics:** Compute the mean and covariance of the logits for each class, based on the target predictions.
- **Calibration:** Replace softmax with a GMM-based generative model to calibrate the final probabilities of each test sample.
- **Assess the correctness of the prediction:** Calculate two cross-entropy losses: one to the one-hot encoding of the most probable class, and one to a uniform distribution (a random prediction). Backpropagate through the last layer (i.e., the classifier) only, and compare the gradient norms:
  - If the gradient norm toward the predicted class is less than or equal to the gradient norm toward a random prediction → the prediction is deemed correct.
  - Otherwise → incorrect.
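To make these steps concrete, here is a minimal PyTorch sketch of the pipeline. It is illustrative only: the names (`feature_extractor`, `classifier`, `loader`, and the helper functions) are assumptions rather than this repository's actual API, and details such as the diagonal covariance regularizer are standard stabilizations, not taken from the paper.

```python
# Illustrative sketch only: `feature_extractor`, `classifier` (an nn.Linear
# head), and `loader` are assumed names, not this repository's actual API.
import torch
import torch.nn.functional as F


@torch.no_grad()
def fit_class_gaussians(feature_extractor, classifier, loader, num_classes):
    """Step 1: mean and covariance of the logits per class, using the
    model's own predictions as pseudo-labels."""
    logits = torch.cat([classifier(feature_extractor(x)) for x, _ in loader])
    preds = logits.argmax(dim=1)
    means, covs = [], []
    for c in range(num_classes):
        z = logits[preds == c]  # assumes every class is predicted a few times
        means.append(z.mean(dim=0))
        # Small diagonal term keeps the covariance invertible (an assumption,
        # not a detail from the paper).
        covs.append(torch.cov(z.T) + 1e-6 * torch.eye(logits.size(1)))
    return torch.stack(means), torch.stack(covs)


def gmm_log_posterior(logits, means, covs):
    """Step 2: replace softmax with the GMM posterior p(class | logits),
    here under a uniform class prior."""
    log_lik = torch.stack(
        [torch.distributions.MultivariateNormal(m, covariance_matrix=s).log_prob(logits)
         for m, s in zip(means, covs)],
        dim=-1,
    )  # (batch, num_classes)
    return F.log_softmax(log_lik, dim=-1)


def head_grad_norm(classifier, loss):
    """Norm of the loss gradient w.r.t. the classifier weight only."""
    (g,) = torch.autograd.grad(loss, classifier.weight, retain_graph=True)
    return g.norm()


def estimate_accuracy(feature_extractor, classifier, loader, means, covs, num_classes):
    """Step 3: a sample is deemed correct when the gradient toward its
    predicted class is no larger than the gradient toward a uniform target;
    the accuracy estimate is the fraction of such samples."""
    n_correct, n_total = 0, 0
    uniform = torch.full((num_classes,), 1.0 / num_classes)
    for x, _ in loader:
        with torch.no_grad():
            feats = feature_extractor(x)  # freeze everything but the head
        for f in feats:
            log_post = gmm_log_posterior(classifier(f.unsqueeze(0)), means, covs)
            loss_pred = F.nll_loss(log_post, log_post.argmax(dim=-1))  # CE to one-hot
            loss_rand = -(uniform * log_post).sum()                    # CE to uniform
            n_correct += int(head_grad_norm(classifier, loss_pred)
                             <= head_grad_norm(classifier, loss_rand))
            n_total += 1
    return n_correct / n_total
```

With a LeNet-style model split into a feature extractor and a final linear head, `estimate_accuracy` would return the predicted target-domain accuracy as the fraction of samples the gradient-norm test deems correct.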
Before running the code, install the required dependencies using:
```
pip install -r requirements.txt
```

Then download the MNIST-trained model `lenet_mnist.pt` and place it in the `logs/` folder.
Run the following command to reproduce results on the SVHN and USPS datasets:
```
python main.py
```
This code will:
- Automatically download the SVHN and USPS datasets and save them in the `logs/` folder.
- Store the features of the target datasets in the `logs/` folder.
- Predict the model's performance on each target dataset.
Explore the `all_experiments/` folder for other experiments and baseline comparisons.
If you use this code, please cite our paper:
```bibtex
@inproceedings{khramtsova2025sourcefreepp,
  title={Source-Free Domain-Invariant Performance Prediction},
  author={Khramtsova, Ekaterina and Baktashmotlagh, Mahsa and Zuccon, Guido and Wang, Xi and Salzmann, Mathieu},
  editor={Leonardis, Ale{\v{s}} and Ricci, Elisa and Roth, Stefan and Russakovsky, Olga and Sattler, Torsten and Varol, G{\"u}l},
  booktitle={Proceedings of The 18th European Conference on Computer Vision (ECCV)},
  year={2024},
  publisher={Springer Nature Switzerland},
  pages={99--116},
  isbn={978-3-031-72989-8}
}
```