pooja-ganesh committed
Commit 0c23658
Parent: 2a056e6

Update README.md

Files changed (1): README.md (+2, -2)
README.md CHANGED
@@ -14,9 +14,9 @@ tags:
 - ## Introduction
 This model was created using Quark Quantization, followed by OGA Model Builder, and finalized with post-processing for NPU deployment.
 - ## Quantization Strategy
-- AWQ / Group 128 / Asymmetric / BF16 activations
+- AWQ / Group 128 / Asymmetric / BF16 activations / UINT4 Weights
 - ## Quick Start
-For quickstart, refer to AMD [RyzenAI-SW-EA](https://account.amd.com/en/member/ryzenai-sw-ea.html)
+For quickstart, refer to npu-llm-artifacts_1.3.0.zip available in [RyzenAI-SW-EA](https://account.amd.com/en/member/ryzenai-sw-ea.html)

 #### Evaluation scores
 The perplexity measurement is run on the wikitext-2-raw-v1 (raw data) dataset provided by Hugging Face. Perplexity score measured for prompt length 2k is 12.02787.
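
The model card only states the dataset (wikitext-2-raw-v1), the 2k prompt length, and the resulting score (12.02787); it does not include the evaluation script. Below is a minimal sketch of how such a sliding-window perplexity number is typically computed with Hugging Face `datasets` and `transformers`. The model ID, device handling, and non-overlapping 2048-token stride are assumptions for illustration, not details taken from this commit or from AMD's NPU deployment flow.

```python
# Sketch: sliding-window perplexity on wikitext-2-raw-v1 with a 2k context window.
# Placeholder model ID below; substitute the model actually under test.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/opt-125m"  # placeholder, not the model from this repo
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to(device).eval()

# Concatenate the raw test split into one long token stream.
test = load_dataset("wikitext", "wikitext-2-raw-v1", split="test")
encodings = tokenizer("\n\n".join(test["text"]), return_tensors="pt")

max_length = 2048  # "prompt length 2k" from the model card
stride = 2048      # non-overlapping windows (an assumption; smaller strides are also common)
seq_len = encodings.input_ids.size(1)

nlls, n_tokens = [], 0
for begin in range(0, seq_len, stride):
    end = min(begin + max_length, seq_len)
    input_ids = encodings.input_ids[:, begin:end].to(device)
    if input_ids.size(1) < 2:  # nothing to predict in a one-token window
        break
    with torch.no_grad():
        # Labels are shifted inside the model; loss is mean NLL per predicted token.
        loss = model(input_ids, labels=input_ids).loss
    num_predicted = input_ids.size(1) - 1
    nlls.append(loss.double() * num_predicted)
    n_tokens += num_predicted
    if end == seq_len:
        break

ppl = torch.exp(torch.stack(nlls).sum() / n_tokens)
print(f"Perplexity (2k windows): {ppl.item():.5f}")
```

Note that perplexity depends on the windowing choice: overlapping windows (stride < max_length) give each token more context and usually a lower score, so exact reproduction of the reported 12.02787 would require matching the original evaluation settings.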