pooja-ganesh committed on
Commit
d99b822
1 Parent(s): 1db2bb3

Update README.md

Files changed (1): README.md +2 -2
README.md CHANGED
@@ -18,9 +18,9 @@ tags:
 - ## Introduction
 This model was created using Quark Quantization, followed by OGA Model Builder, and finalized with post-processing for NPU deployment.
 - ## Quantization Strategy
-- AWQ / Group 128 / Asymmetric / BF16 activations
+- AWQ / Group 128 / Asymmetric / BF16 activations / UINT4 Weights
 - ## Quick Start
-For quickstart, refer to AMD [RyzenAI-SW-EA](https://account.amd.com/en/member/ryzenai-sw-ea.html)
+For quickstart, refer to npu-llm-artifacts_1.3.0.zip available in [RyzenAI-SW-EA](https://account.amd.com/en/member/ryzenai-sw-ea.html)
 
 #### Evaluation scores
 The perplexity measurement is run on the wikitext-2-raw-v1 (raw data) dataset provided by Hugging Face. The perplexity score measured for a prompt length of 2k is 7.153726.
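The added quantization line specifies AWQ with group size 128, asymmetric quantization, and UINT4 weights. As a rough illustration of what a group-wise asymmetric UINT4 scheme does to a weight tensor, here is a minimal numpy sketch; the function names and implementation details are assumptions for exposition, not AMD Quark's actual kernels.

```python
import numpy as np

def quantize_uint4_groups(w, group_size=128):
    # Reshape into groups of `group_size` weights; each group gets its own
    # scale and zero point (the "Group 128 / Asymmetric" part of the recipe).
    g = w.reshape(-1, group_size)
    w_min = g.min(axis=1, keepdims=True)
    w_max = g.max(axis=1, keepdims=True)
    scale = np.maximum((w_max - w_min) / 15.0, 1e-8)  # 4 bits -> levels 0..15
    zero = np.round(-w_min / scale)                   # asymmetric zero point
    q = np.clip(np.round(g / scale) + zero, 0, 15).astype(np.uint8)
    return q, scale, zero

def dequantize(q, scale, zero):
    # Recover approximate weights from the codes and per-group parameters.
    return (q.astype(np.float32) - zero) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=512).astype(np.float32)
q, scale, zero = quantize_uint4_groups(w)
w_hat = dequantize(q, scale, zero).reshape(-1)
# Per-element reconstruction error is bounded by half a quantization step
# of the element's group.
max_err = float(np.abs(w - w_hat).max())
```

The asymmetric zero point lets each group use all 16 levels even when its values are not centered on zero, which is why asymmetric schemes typically lose less accuracy than symmetric ones at the same bit width.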