Code and data for "Open-Set Knowledge-Based Visual Question Answering with Inference Paths".
Using conda, create an environment and install the dependencies listed in requirements.txt; a minimal setup sketch is shown below.
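A minimal sketch, assuming an environment named `okvqa` on Python 3.8 (both the name and the Python version are placeholders, not pinned by this repo):

```bash
# Create and activate a fresh environment (name and Python version are illustrative).
conda create -n okvqa python=3.8 -y
conda activate okvqa

# Install the project dependencies.
pip install -r requirements.txt
```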
Download the OK-VQA dataset, as sketched below.
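A hedged download sketch: the annotation URLs follow the official OK-VQA download page (https://okvqa.allenai.org/download.html) and the MS-COCO site, while the target directory `data/okvqa` is only an assumed layout; adjust both to match what this repo expects.

```bash
# Assumed data directory; change to wherever the code expects the dataset.
mkdir -p data/okvqa && cd data/okvqa

# OK-VQA questions and annotations (verify file names on the OK-VQA download page).
wget https://okvqa.allenai.org/static/data/OpenEnded_mscoco_train2014_questions.json.zip
wget https://okvqa.allenai.org/static/data/mscoco_train2014_annotations.json.zip
wget https://okvqa.allenai.org/static/data/OpenEnded_mscoco_val2014_questions.json.zip
wget https://okvqa.allenai.org/static/data/mscoco_val2014_annotations.json.zip

# OK-VQA questions are posed over MS-COCO 2014 images.
wget http://images.cocodataset.org/zips/train2014.zip
wget http://images.cocodataset.org/zips/val2014.zip

unzip '*.zip' && cd -
```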
Train from scratch, or use the pre-trained model parameters provided here.
Download the pre-processed ConceptNet-annotated KBVQA data, or generate the schema graphs with the provided script. Please refer to Preprocess.md for more implementation details; a rough illustration of the schema-graph step is sketched below.
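For orientation only, here is a hypothetical sketch of what ConceptNet schema-graph construction typically involves: grounding question and candidate-answer concepts, then keeping the short paths that connect them. The function name and triple format are illustrative, not this repo's API; the actual procedure is the one documented in Preprocess.md.

```python
# Hypothetical sketch of schema-graph construction over ConceptNet.
# Not this repo's implementation; see Preprocess.md for the real pipeline.
import networkx as nx

def build_schema_graph(question_concepts, answer_concepts, conceptnet_triples, max_hops=2):
    """Return the ConceptNet subgraph spanned by paths of at most `max_hops`
    edges between grounded question concepts and candidate answer concepts."""
    graph = nx.Graph()
    for head, relation, tail in conceptnet_triples:
        graph.add_edge(head, tail, relation=relation)

    kept_nodes = set()
    for q in question_concepts:
        for a in answer_concepts:
            if q in graph and a in graph:
                for path in nx.all_simple_paths(graph, q, a, cutoff=max_hops):
                    kept_nodes.update(path)
    return graph.subgraph(kept_nodes).copy()

# Toy usage with three hand-written triples.
triples = [("dog", "IsA", "animal"), ("animal", "RelatedTo", "pet"), ("dog", "RelatedTo", "leash")]
print(build_schema_graph(["dog"], ["pet"], triples).edges(data=True))
```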
An example fine-tuning command (evaluation disabled via `--evaluate 0`):

```bash
python ft_gather.py --save_name test_gather --seed 7777 --evaluate 0 \
  --backbone_model bert --use_concept 0 --use_wiki 0 --use_image 0 \
  --use_split_name 8505 --add_answer_emb 0 --ablation ptm path lstm prune node_type \
  --from_pretrained ./save/bert_base_6layer_6conect-step1-128/pytorch_model.bin --vil_dim 128
```
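Presumably the same script handles evaluation when `--evaluate 1` is passed and `--from_pretrained` points at a fine-tuned checkpoint; the command below is extrapolated from the flags above (including the guessed checkpoint path), not a documented invocation.

```bash
# Assumed evaluation run; mirrors the training flags, and the checkpoint path is a guess.
python ft_gather.py --save_name test_gather_eval --seed 7777 --evaluate 1 \
  --backbone_model bert --use_concept 0 --use_wiki 0 --use_image 0 \
  --use_split_name 8505 --add_answer_emb 0 --ablation ptm path lstm prune node_type \
  --from_pretrained ./save/test_gather/pytorch_model.bin --vil_dim 128
```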