qq8933 committed
Commit eced2cb
1 Parent(s): 095e3a7

Update README.md

Files changed (1):
  1. README.md +32 -0
README.md CHANGED
@@ -10,7 +10,39 @@ Chepybara-7B-Chat, The First Open-source Specialised LLM for Chemistry and Molec
  ## News
  - Chepybara online demo released: https://chemllm.org/ [2024-1-18]
  - Chepybara-7B-Chat ver. 1.0 open-sourced. [2024-1-17]
+ ## Usage
+ Try the [online demo](https://chemllm.org/) instantly, or...
 
+ Install `transformers`:
+ ```
+ pip install transformers
+ ```
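+ The example below also imports `torch`; if it is not already present in your environment (an assumption here), install it as well:
+ ```
+ pip install torch
+ ```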
+ Load `Chepybara-7B-Chat` and run:
+ ```
+ from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig
+ import torch
+
+ model_name_or_id = "AI4Chem/Chepybara-7B-Chat"
+
+ # Load the model in half precision on the GPU, together with its tokenizer.
+ model = AutoModelForCausalLM.from_pretrained(model_name_or_id, torch_dtype=torch.float16, device_map="cuda")
+ tokenizer = AutoTokenizer.from_pretrained(model_name_or_id)
+
+ prompt = "What is the molecule of Ibuprofen?"
+
+ # Tokenize the prompt and move the tensors to the GPU.
+ inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
+
+ # Generation settings for sampling.
+ generation_config = GenerationConfig(
+     do_sample=True,
+     top_k=1,
+     temperature=0.9,
+     max_new_tokens=500,
+     repetition_penalty=1.5,
+     pad_token_id=tokenizer.eos_token_id
+ )
+
+ outputs = model.generate(**inputs, generation_config=generation_config)
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
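+ To print tokens to the console as they are generated rather than waiting for the full completion, a minimal sketch using `transformers`' `TextStreamer` (it reuses the `model`, `tokenizer`, `inputs`, and `generation_config` objects from the snippet above):
+ ```
+ from transformers import TextStreamer
+
+ # Stream decoded tokens to stdout as they are produced, skipping the echoed prompt.
+ streamer = TextStreamer(tokenizer, skip_prompt=True)
+ outputs = model.generate(**inputs, generation_config=generation_config, streamer=streamer)
+ ```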
  ## Dataset
 
  | Section | Dataset | Link |