---
license: mit
language:
- multilingual
base_model:
- microsoft/Phi-3.5-mini-instruct
pipeline_tag: text-generation
library_name: transformers
tags:
- nlp
- code
- onnx
- amd
---

# microsoft/Phi-3.5-mini-instruct
- ## Introduction
  This model was created by applying [Quark](https://quark.docs.amd.com/latest/index.html) with calibration samples from the Pile dataset, and then applying the [onnxruntime-genai model builder](https://github.com/microsoft/onnxruntime-genai/tree/main/src/python/py/models) to convert it to ONNX.
- ## Quantization Strategy
  - ***Quantized Layers***: TBD
  - ***Weight***: TBD
- ## Quick Start
For a quick start, refer to AMD [RyzenAI-SW-EA](https://account.amd.com/en/member/ryzenai-sw-ea.html); a minimal loading sketch is shown below.
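
The ONNX model is intended to be consumed through onnxruntime-genai. The sketch below is a generic illustration, not AMD's Ryzen AI setup (which is covered by the RyzenAI-SW-EA materials): it loads a local copy of the model with the onnxruntime-genai Python API and streams a completion. The `model_dir` path and the generation settings are placeholders, and the exact API surface varies between onnxruntime-genai releases.

```python
# Minimal generation sketch with onnxruntime-genai (API details vary by release).
# `model_dir` is a placeholder for a local download of this ONNX model.
import onnxruntime_genai as og

model_dir = "./Phi-3.5-mini-instruct-onnx"   # placeholder path
model = og.Model(model_dir)
tokenizer = og.Tokenizer(model)
stream = tokenizer.create_stream()

# Phi-3.5 chat-style prompt
prompt = "<|user|>\nExplain quantization in one sentence.<|end|>\n<|assistant|>\n"
input_ids = tokenizer.encode(prompt)

params = og.GeneratorParams(model)
params.set_search_options(max_length=256)

generator = og.Generator(model, params)
generator.append_tokens(input_ids)           # newer releases; older ones set params.input_ids instead

# Stream tokens as they are generated
while not generator.is_done():
    generator.generate_next_token()
    print(stream.decode(generator.get_next_tokens()[0]), end="", flush=True)
print()
```

On Ryzen AI hardware, the execution-provider configuration comes from the RyzenAI-SW-EA packages rather than this generic setup.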

#### Evaluation scores
Perplexity is measured on the wikitext-2-raw-v1 (raw data) dataset provided by Hugging Face. The perplexity score measured at a prompt length of 2k is 7.12716. A sketch of a comparable measurement follows.
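
Perplexity is the exponential of the average negative log-likelihood of the test text under the model. The sketch below is one plausible recipe, not the exact harness behind the reported score: it tokenizes wikitext-2-raw-v1, scores non-overlapping 2k-token windows with the original Hugging Face checkpoint (standing in for the quantized ONNX model, whose token log-probabilities would instead be pulled from onnxruntime), and exponentiates the mean token loss.

```python
# Rough perplexity recipe on wikitext-2-raw-v1 with 2k-token windows.
# Uses the original FP checkpoint as a stand-in; the reported 7.12716 comes from
# AMD's evaluation of the quantized ONNX model and may use a different harness.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3.5-mini-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)
model.eval()

text = "\n\n".join(load_dataset("wikitext", "wikitext-2-raw-v1", split="test")["text"])
ids = tokenizer(text, return_tensors="pt").input_ids

window = 2048                                   # "prompt length 2k"
nlls, counted = [], 0
for start in range(0, ids.size(1) - window, window):
    chunk = ids[:, start : start + window].to(model.device)
    with torch.no_grad():
        loss = model(chunk, labels=chunk).loss  # mean NLL over the window
    nlls.append(loss * window)
    counted += window

print("perplexity:", torch.exp(torch.stack(nlls).sum() / counted).item())
```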

#### License
Modifications copyright (c) 2024 Advanced Micro Devices, Inc. All rights reserved.

License: MIT