---
base_model: Qwen/Qwen2.5-3B-Instruct
language:
- en
library_name: transformers
license: other
license_name: qwen-research
license_link: https://huggingface.co/Qwen/Qwen2.5-3B-Instruct/blob/main/LICENSE
pipeline_tag: text-generation
tags:
- chat
- openvino
- openvino-export
---
This model was converted to OpenVINO from Qwen/Qwen2.5-3B-Instruct using optimum-intel via the export space.
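
If you prefer to reproduce the conversion locally instead of using the export space, a minimal sketch with optimum-intel's Python export API looks like the following; the output directory name is illustrative.

```python
from optimum.intel import OVModelForCausalLM

# Convert the original PyTorch checkpoint to OpenVINO IR on the fly (export=True)
# and save the result locally; the output directory name is arbitrary.
ov_model = OVModelForCausalLM.from_pretrained("Qwen/Qwen2.5-3B-Instruct", export=True)
ov_model.save_pretrained("Qwen2.5-3B-Instruct-openvino")
```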
First make sure you have optimum-intel installed:

```bash
pip install optimum[openvino]
```
To load your model you can do as follows:

```python
from optimum.intel import OVModelForCausalLM

model_id = "HelloSun/Qwen2.5-3B-Instruct-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
```
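
Once loaded, the model behaves like a regular transformers causal LM, so you can pair it with the tokenizer from the same repository and call `generate`. The sketch below assumes the tokenizer files are present in this repository (as the export space normally copies them); the prompt and generation settings are only illustrative.

```python
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "HelloSun/Qwen2.5-3B-Instruct-openvino"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)

# Build a chat-formatted prompt with the model's chat template (example prompt).
messages = [{"role": "user", "content": "Give me a short introduction to OpenVINO."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```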