leejunhyeok committed 113b76d (parent: e17cd57): Update README.md

README.md:
---
language:
- en
library_name: peft
---

# **Introduction**
This is an SFT version of the QWEN/QWEN-72B model, llamafied for leaderboard submission.

## Details

### Used Libraries
- torch
- peft (see the adapter-setup sketch below)
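
As a rough illustration of how these two libraries fit together, here is a minimal LoRA adapter setup with peft. The base checkpoint and every hyperparameter below are illustrative assumptions, not the actual training configuration of this model:

```python
# Minimal LoRA setup sketch with peft.
# NOTE: the base checkpoint and all hyperparameters are illustrative
# assumptions, not this model's actual training configuration.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-72B", trust_remote_code=True  # assumed base model
)
lora_config = LoraConfig(
    r=16,                                 # adapter rank (illustrative)
    lora_alpha=32,                        # scaling factor (illustrative)
    target_modules=["q_proj", "v_proj"],  # attention projections (illustrative)
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the small LoRA matrices are trainable
```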
14 |
### Used Datasets
|
15 |
+
- Open-Orca/SlimOrca
|
16 |
+
- No other dataset was used
|
|
|
17 |
- No benchmark test set or the training set are used
|
18 |
+
- TBU: data contamination test results (with https://github.com/swj0419/detect-pretrain-code-contamination)
|
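
For reference, the training data can be inspected with the `datasets` library. A minimal sketch; the `conversations` field name follows the published SlimOrca schema:

```python
# Peek at the SlimOrca training data (requires: pip install datasets)
from datasets import load_dataset

slim_orca = load_dataset("Open-Orca/SlimOrca", split="train")
print(len(slim_orca))                  # number of training examples
print(slim_orca[0]["conversations"])   # each row is a list of chat turns
```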

### Used Environments
- AMD MI250 & MoAI platform
- Please visit https://moreh.io/product for more information about the MoAI platform
- Or contact us directly at [contact@moreh.io](mailto:contact@moreh.io)

## License
TBU

## How to use

```python
# pip install transformers==4.35.2
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("moreh/MoMo-70B-LoRA-V1.4")
model = AutoModelForCausalLM.from_pretrained(
    "moreh/MoMo-70B-LoRA-V1.4",
    torch_dtype=torch.float16,  # half precision; a 70B model needs ~140 GB of memory in fp16
)
```
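
Once loaded, text can be generated with the standard transformers API. A minimal sketch, continuing from the snippet above; the prompt and generation settings are illustrative:

```python
# Generate a short completion (prompt and settings are illustrative)
inputs = tokenizer("What is low-rank adaptation?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```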