# ChEBai
ChEBai is a deep learning library designed for the integration of deep learning methods with chemical ontologies, particularly ChEBI.

The library emphasizes the incorporation of the semantic qualities of the ontology into the learning process.
## Installation
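The package is installed from a local checkout of the repository with `pip install .`; pip's standard editable mode (`-e`, a generic pip flag, not ChEBai-specific) is a common alternative during development:

```
# Run from the root of the cloned repository.
# For development, `pip install -e .` (standard pip flag) installs in
# editable mode so local changes take effect without reinstalling.
pip install .
```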
## Usage
Training and inference are abstracted using PyTorch Lightning modules.

Here are some CLI commands for the standard functionalities: pretraining, ontology extension, fine-tuning for toxicity, and prediction.

For further details, see the [wiki](https://github.com/ChEB-AI/python-chebai/wiki).

If you face any problems, please open a new [issue](https://github.com/ChEB-AI/python-chebai/issues/new).
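The CLI follows PyTorch Lightning's `LightningCLI` conventions, so the available subcommands and their options can be listed with the standard help flags (generic Lightning behaviour, shown here as a sketch rather than ChEBai-specific documentation):

```
# List the available subcommands, then the options of the fit subcommand.
python -m chebai --help
python -m chebai fit --help
```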
### Pretraining
```
python -m chebai fit --data.class_path=chebai.preprocessing.datasets.pubchem.PubchemChem --model=configs/model/electra-for-pretraining.yml --trainer=configs/training/pretraining_trainer.yml
```
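Individual values from the YAML configs can typically be overridden directly on the command line, following standard `LightningCLI` semantics; the `--trainer.max_epochs` override below is a generic Lightning trainer option used for illustration, not a documented ChEBai flag:

```
# Sketch: layer a generic Lightning trainer override on top of the configs.
python -m chebai fit \
  --data.class_path=chebai.preprocessing.datasets.pubchem.PubchemChem \
  --model=configs/model/electra-for-pretraining.yml \
  --trainer=configs/training/pretraining_trainer.yml \
  --trainer.max_epochs=100
```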
### Finetuning for predicting classes given SMILES strings
The resulting predictions contain one row for each SMILES string and one column for each class.
## Evaluation
An example of evaluating a model trained on the ontology extension task is given in `tutorials/eval_model_basic.ipynb`.

The notebook takes the fine-tuned model as input and performs the evaluation.
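Assuming Jupyter is installed in the same environment, the notebook can be opened from the repository root:

```
# Launch the evaluation notebook (requires a Jupyter installation).
jupyter notebook tutorials/eval_model_basic.ipynb
```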
## Cross-validation
You can do inner k-fold cross-validation, i.e., train models on k train-validation splits that all use the same test set.