Unable to reproduce the performance in Table1 of re-trained model following configs #9
When I used your trained checkpoint files/clipzyme_model.ckpt to run analysis/Results.ipynb, I got the same results as in Table 1:
# Test (without removing exact-match uniprot_id)
* mean BEDROC_85: 0.44694237653821683
* mean BEDROC_20: 0.6298173546254426
* mean EF_0.05: 14.085124553319226
* mean EF_0.1: 8.060006004418565
* mean percentile: 95.10050213381196
* median percentile: 99.26118813166505
# Test (removing exact-match uniprot_id)
* mean BEDROC_85: 0.36846560413957946
* mean BEDROC_20: 0.5723366739501967
* mean EF_0.05: 13.135639054964582
* mean EF_0.1: 7.702268103727128
* mean percentile: 94.22180948902022
* median percentile: 98.66822956240192
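(For reference, here is my understanding of the two metrics above as a pure-Python sketch: BEDROC following the Truchon & Bayly formula, and EF as the hit rate in the top fraction over the overall hit rate. analysis/Results.ipynb may compute them differently, so treat this as an assumption on my part.)

```python
import math

def bedroc(ranked_labels, alpha=85.0):
    """BEDROC (Truchon & Bayly 2007) from 0/1 activity labels sorted by
    predicted score, best first."""
    N = len(ranked_labels)
    n = sum(ranked_labels)
    ra = n / N
    # Sum of exp(-alpha * rank / N) over the actives' 1-indexed ranks.
    s = sum(math.exp(-alpha * (i + 1) / N)
            for i, y in enumerate(ranked_labels) if y)
    rie = (s / n) / ((1.0 / N) * (1 - math.exp(-alpha)) / (math.exp(alpha / N) - 1))
    return (rie * ra * math.sinh(alpha / 2)
            / (math.cosh(alpha / 2) - math.cosh(alpha / 2 - alpha * ra))
            + 1 / (1 - math.exp(alpha * (1 - ra))))

def enrichment_factor(ranked_labels, fraction):
    """EF@fraction: hit rate in the top fraction over the overall hit rate."""
    N = len(ranked_labels)
    n = sum(ranked_labels)
    top = max(1, int(round(N * fraction)))
    return (sum(ranked_labels[:top]) / top) / (n / N)
```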
But when I tried to train from scratch on 4 GPUs, following the instructions in README.md:
For example, to run the first row in Table 1, run:
python scripts/dispatcher.py -c configs/train/clip_egnn.json -l ./logs/
I got poor performance when evaluating (low val_clip_accuracy and high val_clip_loss):
Epoch 29: 100%|██████████| 1075/1075 [14:39<00:00, 1.22it/s, v_num=7, train_clip_accuracy=0.938, train_clip_quantile=0.966, train_clip_loss=0.122, val_clip_accuracy=0.211, val_clip_quantile=0.784, val_clip_loss=2.830, val_loss=2.830, train_loss=0.141]
Saving args to ./logs/0337abd6ab11244491b4200e9ae37d8a.args
and poor results in analysis/Results.ipynb (I only changed the path to my best checkpoint):
* mean BEDROC_85: 0.01979012780672189
* mean BEDROC_20: 0.06457525835087777
* mean EF_0.05: 1.273378323980406
* mean EF_0.1: 1.2462138252576678
* mean percentile: 59.4702234656787
* median percentile: 62.850935637459095
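For context on how bad these numbers are: a completely random ranking should give EF around 1 and a mean percentile around 50, so the retrained model is barely above chance. A quick simulation (my own sketch, one active among 100 candidates per target):

```python
import random

random.seed(0)

def ef_at(ranked_labels, fraction):
    """EF@fraction: hit rate in the top fraction over the overall hit rate."""
    N = len(ranked_labels)
    n = sum(ranked_labels)
    top = max(1, int(round(N * fraction)))
    return (sum(ranked_labels[:top]) / top) / (n / N)

trials = 2000
ef_sum = 0.0
pct_sum = 0.0
for _ in range(trials):
    # One true pair among 100 candidates, ranked at random.
    labels = [1] + [0] * 99
    random.shuffle(labels)
    ef_sum += ef_at(labels, 0.1)
    # Percentile of the true pair: rank 0 (best) -> 100, rank 99 -> 1.
    rank = labels.index(1)
    pct_sum += 100.0 * (100 - rank) / 100

ef_avg = ef_sum / trials    # expected around 1.0
pct_avg = pct_sum / trials  # expected around 50.5
print(ef_avg, pct_avg)
```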
Could you help check configs/train/clip_egnn.json? Thanks.
For example, I tried adding
"train_esm_with_graph": [true],
but the performance was still poor.
Can you help re-check the training process? Thanks very much.