chillymiao committed "Update README.md" (commit f4c1548, parent 327aac8)

README.md CHANGED
@@ -32,14 +32,19 @@ Training required approximately 20.6GB of VRAM without any quantization (default
# Evaluation Results

## CMMLU
<img src="./pics/cmmlu.png" alt="CMMLU results"/>

## C-eval
<img src="./pics/ceval.png" alt="C-eval results"/>

## TC-eval by MediaTek Research
<img src="./pics/tc-eval.png" alt="TC-eval results"/>

## MT-bench
<img src="./pics/dashB.png" alt="MT-bench results"/>

## LLM-eval by NTU Miu Lab
<img src="./pics/llmeval.png" alt="LLM-eval results"/>

## Bailong Bench
@@ -74,6 +79,7 @@ Please review his marvellous works!
Download model

Here is an example of downloading Hyacinth6B with Hugging Face Transformers:

```
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("chillymiao/Hyacinth6B")
model = AutoModelForCausalLM.from_pretrained("chillymiao/Hyacinth6B")
```
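Once the model is downloaded, inference follows the standard Transformers `generate` API. The sketch below is an illustration, not part of the README: the `build_prompt`/`generate_reply` helpers, the `USER:`/`ASSISTANT:` prompt format, and the generation settings are all assumptions, since the README does not specify Hyacinth6B's chat template.

```python
def build_prompt(instruction: str) -> str:
    # Hypothetical plain instruction format; Hyacinth6B's actual chat
    # template is not documented in this README.
    return f"USER: {instruction}\nASSISTANT:"


def generate_reply(instruction: str, max_new_tokens: int = 128) -> str:
    # Imported lazily so the prompt helper above can be used without
    # pulling in the heavy torch/transformers dependencies.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("chillymiao/Hyacinth6B")
    model = AutoModelForCausalLM.from_pretrained(
        "chillymiao/Hyacinth6B", torch_dtype="auto"
    )
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    with torch.no_grad():
        output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the tokens generated after the prompt.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Calling `generate_reply("你好，請用繁體中文自我介紹。")` downloads the full 6B-parameter weights on first use, so expect a sizeable download and memory footprint.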
### Citation

```
@misc{song2024hyacinth6b,
      title={Hyacinth6B: A large language model for Traditional Chinese},
@@ -92,4 +99,4 @@ model = AutoModelForCausalLM.from_pretrained("chillymiao/Hyacinth6B")
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```