Add flag to skip per-node projection for torch models. #55
base: main
Conversation
@cfifty please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.
@microsoft-github-policy-service agree
Thanks for the review! Please consider merging; I'm seeing an "Only those with write access to this repository can merge pull requests." message when trying to merge.
Add a flag to optionally skip the node projection in GNN models.
This enables researchers to use pre-trained node embeddings rather than learning node embeddings on the FS-Mol dataset.
To use pre-trained embeddings, replace the FS-Mol dataset field MoleculeDatapoint.graph.node_features with pre-trained embeddings whose dimension equals the model's internal node representation dimension.
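The replacement step above could be sketched roughly as follows. This is a minimal, hypothetical illustration: the stand-in `Graph` and `MoleculeDatapoint` classes, the `use_pretrained_embeddings` helper, and the `embedding_fn` argument are all assumptions for the sketch, not part of the FS-Mol API (the real `MoleculeDatapoint` lives in the fs_mol package and carries more fields).

```python
from dataclasses import dataclass
import numpy as np

# Hypothetical stand-ins for the FS-Mol data classes, reduced to the
# single field this sketch touches.
@dataclass
class Graph:
    node_features: np.ndarray  # shape: [num_nodes, feature_dim]

@dataclass
class MoleculeDatapoint:
    graph: Graph

def use_pretrained_embeddings(datapoint, embedding_fn, hidden_dim):
    """Overwrite raw node features with pre-trained embeddings.

    `embedding_fn` is a hypothetical callable mapping one raw node
    feature vector to its pre-trained embedding; the result must match
    the model's internal node representation dimension so the per-node
    projection can be skipped.
    """
    new_feats = np.stack(
        [embedding_fn(f) for f in datapoint.graph.node_features]
    )
    assert new_feats.shape[1] == hidden_dim, (
        "embedding size must equal the model's internal node dimension"
    )
    datapoint.graph.node_features = new_feats
    return datapoint

# Example: map 32-dim raw features to a 128-dim "pre-trained" embedding
# with a fixed random linear map standing in for a real pre-trained model.
rng = np.random.default_rng(0)
W = rng.standard_normal((32, 128))
dp = MoleculeDatapoint(Graph(node_features=rng.standard_normal((5, 32))))
dp = use_pretrained_embeddings(dp, lambda f: f @ W, hidden_dim=128)
print(dp.graph.node_features.shape)  # (5, 128)
```

With features already at the model's hidden dimension, the new flag lets the GNN consume them directly instead of projecting them per node.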