kuznetsov-insilico committed
Commit bf7503b
Parent: b97ce95

readme changes

Files changed (1): README.md (+5 -7)
README.md CHANGED
@@ -1,7 +1,6 @@
 ---
 tags:
 - chemistry
-- medical
 widget:
 - text: <LIGAND>
   example_title: Generate molecule
@@ -21,8 +20,8 @@ The model was pretrained on the [Uni-Mol](https://github.com/deepmodeling/Uni-Mo
 the distribution of the GEOM-DRUGS datasets.
 We also expose pretrained and finetuned models:
 
-- For the pretrained model, visit [huggingface.co/insilicomedicine/BindGPT](https://huggingface.co/insilicomedicine/BindGPT)
-- For the model finetuned with Reinforcement Learning on CrossDocked, visit [huggingface.co/insilicomedicine/BindGPT-RL](https://huggingface.co/insilicomedicine/BindGPT-RL)
+- For the pretrained model, visit [huggingface.co/insilicomedicine/bindgpt_pretrained](https://huggingface.co/insilicomedicine/bindgpt_pretrained)
+- The model finetuned with Reinforcement Learning on CrossDocked is coming soon
 
 
 ## Unconditional generation
@@ -33,15 +32,14 @@ sampling molecules from the model. It only depends on
 and it's not meant to reproduce the sampling speed reported
 in the paper (e.g. it does not use flash-attention, mixed precision,
 and large batch sampling).
-To reproduce sampling speed, please use the code from our repository:
-https://github.com/insilicomedicine/bindgpt
+To reproduce sampling speed, please use the code from our repository (code coming soon).
 
 ```python
 # Download model from Hugging Face:
 from transformers import AutoTokenizer, AutoModelForCausalLM
 
-tokenizer = AutoTokenizer.from_pretrained("artemZholus/BindGPT")
-model = AutoModelForCausalLM.from_pretrained("artemZholus/BindGPT").cuda()
+tokenizer = AutoTokenizer.from_pretrained("insilicomedicine/bindgpt_finetuned")
+model = AutoModelForCausalLM.from_pretrained("insilicomedicine/bindgpt_finetuned").cuda()
 
 # Generate 10 tokenized molecules without condition
 NUM_SAMPLES = 10
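
The diff hunk cuts the README's snippet off at `NUM_SAMPLES = 10`, so the generation call itself is not shown here. For context only, below is a minimal sketch of how the snippet could be completed with standard `transformers` APIs. The `<LIGAND>` prompt token is taken from the widget config in the YAML header above; the decoding parameters (`max_new_tokens`, sampling settings) are assumptions for illustration, not the authors' published settings.

```python
# Minimal sketch (not part of this commit): unconditional molecule sampling
# with the checkpoint referenced in the diff. Decoding settings are assumed.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("insilicomedicine/bindgpt_finetuned")
model = AutoModelForCausalLM.from_pretrained("insilicomedicine/bindgpt_finetuned").cuda()

NUM_SAMPLES = 10

# <LIGAND> is the prompt token shown in the model card's widget config.
prompt = tokenizer("<LIGAND>", return_tensors="pt").input_ids.cuda()

with torch.no_grad():
    samples = model.generate(
        prompt.repeat(NUM_SAMPLES, 1),  # one prompt row per requested sample
        do_sample=True,                 # stochastic sampling rather than greedy decoding
        max_new_tokens=512,             # assumed token budget for one molecule
        pad_token_id=tokenizer.eos_token_id,
    )

# Decode token IDs back into the model's textual molecule representation.
molecules = tokenizer.batch_decode(samples, skip_special_tokens=False)
for m in molecules:
    print(m)
```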