pooja-ganesh committed: Update README.md

README.md CHANGED

@@ -9,7 +9,7 @@ tags:
base_model: THUDM/chatglm3-6b
---

-# chatglm3-6b-awq-w-int4-asym-gs128-
+# chatglm3-6b-awq-w-int4-asym-gs128-a-fp16-onnx-ryzen-strix-hybrid
## Introduction
This model was created by applying [Quark](https://quark.docs.amd.com/latest/index.html) with calibration samples from the Pile dataset, and then applying the [onnxruntime-genai model builder](https://github.com/microsoft/onnxruntime-genai/tree/main/src/python/py/models) to convert the quantized model to ONNX.
## Quantization Strategy
@@ -19,10 +19,6 @@ base_model: THUDM/chatglm3-6b
## Quick Start
For a quick start, refer to the AMD [RyzenAI-SW-EA](https://account.amd.com/en/member/ryzenai-sw-ea.html) package (to be updated).

-## Evaluation
-Quark currently uses perplexity (PPL) as the evaluation metric for accuracy loss before and after quantization. The specific PPL algorithm can be referenced in quantize_quark.py.
-The quantization evaluation results are conducted in pseudo-quantization mode, which may slightly differ from the actual quantized inference accuracy. These results are provided for reference only.
-
#### License
Modifications copyright (c) 2024 Advanced Micro Devices, Inc. All rights reserved.
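
The Introduction describes a two-step flow: quantize the base model with Quark using Pile calibration samples, then export the result to ONNX with the onnxruntime-genai model builder. As a minimal sketch of the second step only, the snippet below invokes the documented model-builder CLI from Python; the folder names, precision, and execution provider are illustrative assumptions, not the exact settings used to produce this model, and the Ryzen AI hybrid target has its own flow in the RyzenAI-SW package.

```python
import subprocess

# Sketch of the ONNX-export step from the Introduction, using the documented
# onnxruntime-genai model builder CLI. Folder names, precision, and execution
# provider below are placeholders, not the settings used for this model.
subprocess.run(
    [
        "python", "-m", "onnxruntime_genai.models.builder",
        "-i", "./chatglm3-6b-quark-awq",  # hypothetical folder holding the Quark-quantized checkpoint
        "-o", "./chatglm3-6b-awq-onnx",   # output folder for the generated ONNX model
        "-p", "int4",                     # 4-bit weight precision
        "-e", "cpu",                      # execution provider used when building the graph
    ],
    check=True,
)
```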
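
The Quick Start entry points to the AMD RyzenAI-SW-EA package for running this model on Ryzen AI hardware, which is not reproduced here. As a generic illustration of loading an onnxruntime-genai ONNX folder from Python, a rough sketch follows; the model path and prompt are placeholders, the generator API differs slightly between onnxruntime-genai releases, and the hybrid Ryzen AI build may need additional vendor-specific setup.

```python
import onnxruntime_genai as og

# Load the exported ONNX model folder (placeholder path).
model = og.Model("./chatglm3-6b-awq-onnx")
tokenizer = og.Tokenizer(model)
stream = tokenizer.create_stream()

params = og.GeneratorParams(model)
params.set_search_options(max_length=256)

generator = og.Generator(model, params)
generator.append_tokens(tokenizer.encode("What does AWQ quantization do?"))

# Generate token by token and stream the decoded text.
while not generator.is_done():
    generator.generate_next_token()
    print(stream.decode(generator.get_next_tokens()[0]), end="", flush=True)
print()
```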
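
The Evaluation text removed by this update describes perplexity (PPL) as Quark's metric for accuracy loss before and after quantization, with the exact algorithm living in quantize_quark.py. The snippet below is only a generic illustration of how perplexity is commonly computed for a causal language model, as the exponential of the average per-token negative log-likelihood; it is not the quantize_quark.py implementation, and the checkpoint loading and evaluation text are placeholders.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Generic perplexity sketch: PPL = exp(mean negative log-likelihood per token).
# The checkpoint name, loading flags, and evaluation text are placeholders.
model_id = "THUDM/chatglm3-6b"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True).eval()

text = "..."  # held-out evaluation corpus, concatenated into one string
ids = tokenizer(text, return_tensors="pt").input_ids

chunk_len = 2048
nll_sum, token_count = 0.0, 0
with torch.no_grad():
    for start in range(0, ids.size(1) - 1, chunk_len):
        chunk = ids[:, start : start + chunk_len]
        if chunk.size(1) < 2:
            continue
        # With labels == inputs, the model returns the mean NLL over the
        # predicted tokens in this chunk (inputs shifted internally by one).
        out = model(input_ids=chunk, labels=chunk)
        n_predicted = chunk.size(1) - 1
        nll_sum += out.loss.item() * n_predicted
        token_count += n_predicted

ppl = torch.exp(torch.tensor(nll_sum / token_count)).item()
print(f"perplexity: {ppl:.2f}")
```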