Accepted as Spotlight at Learning on Graphs (LoG) 2024
For conda, using the environment.yml file:

```bash
conda env create -f environment.yml
```

For pip, using requirements.txt:

```bash
pip install -r requirements.txt
```

Alternatively, you can install the packages manually via conda:

```bash
conda create -n copt python=3.10
conda activate copt
conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia -y
conda install pyg -c pyg
conda install pytorch-scatter pytorch-sparse pytorch-cluster pytorch-spline-conv -c pyg
# might need to install latest torch-sparse via pip instead
pip install git+https://github.com/rusty1s/pytorch_sparse.git
conda install lightning -c conda-forge
pip install yacs einops loguru dwave-networkx ogb performer-pytorch wandb
```
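After installing, a quick optional sanity check can confirm that PyTorch, CUDA, and PyG import correctly (the versions printed will depend on your setup):

```bash
# Verify that the core dependencies are importable and CUDA is visible
python -c "import torch; print('torch', torch.__version__, '| CUDA available:', torch.cuda.is_available())"
python -c "import torch_geometric; print('PyG', torch_geometric.__version__)"
```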
This codebase is built on top of PyG GraphGym and uses configuration files to run experiments.

The default GCON configurations are found in `configs/benchmarks/{TASK_NAME}`. E.g., to run GCON for MaxClique on RB-small:

```bash
python main.py --cfg configs/benchmarks/maxclique/maxclique_rb_small.yaml
```

To use the non-decoupled architecture, override the GCON layer in the default config with the `hybridconv` convolution:
```bash
python main.py --cfg configs/benchmarks/maxclique/maxclique_rb_small.yaml gnn.layer_type=hybridconv
```
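GraphGym-style `key=value` overrides can also be chained on a single command line; for instance, a purely illustrative combination of two overrides shown elsewhere in this README:

```bash
# Illustrative only: non-decoupled architecture with local-only logging
python main.py --cfg configs/benchmarks/maxclique/maxclique_rb_small.yaml \
    gnn.layer_type=hybridconv wandb.use=False
```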
The Erdos' GNN configurations are denoted with an `-erdos` suffix:

```bash
python main.py --cfg configs/benchmarks/maxclique/maxclique_rb_small-erdos.yaml
```

To use Erdos' GNN with entropy annealing, use the `optim.entropy.enable` flag:
```bash
python main.py --cfg configs/benchmarks/maxclique/maxclique_rb_small-erdos.yaml optim.entropy.enable=True
```

You can use your WandB account for logging by setting `wandb.entity` to your own entity. You can also use local-only logging by setting `wandb.use=False`.
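For example (the entity name below is a placeholder; substitute your own):

```bash
# Log runs to your own WandB entity (replace "my-team" with your entity)
python main.py --cfg configs/benchmarks/maxclique/maxclique_rb_small.yaml wandb.entity=my-team

# Or disable WandB entirely and keep logging local
python main.py --cfg configs/benchmarks/maxclique/maxclique_rb_small.yaml wandb.use=False
```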
If you find this work useful, please cite our paper:
```bibtex
@misc{wenkel2024generalgnnframeworkcombinatorial,
      title={Towards a General GNN Framework for Combinatorial Optimization},
      author={Frederik Wenkel and Semih Cantürk and Michael Perlmutter and Guy Wolf},
      year={2024},
      eprint={2405.20543},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2405.20543},
}
```