configuration_phi3_v.py
I have tried using the ONNX docs as guidance, but I keep receiving the following error when running the model. Thanks for any help or thoughts.
Error:
microsoft/Phi-3-vision-128k-instruct-onnx-cpu does not appear to have a file named configuration_phi3_v.py. Checkout 'https://huggingface.co/microsoft/Phi-3-vision-128k-instruct-onnx-cpu/tree/main' for available files.
Are you following the Phi-3 vision ONNX tutorial to run with ONNX Runtime GenAI? If so, which step is raising this error? The published ONNX tutorial should not require configuration_phi3_v.py.
@kvaishnavi, yes, I am following the ONNX tutorial.
I haven't looked at the issue in a while, so I can't point to the exact step yet.
I'm using Transformers here and getting the same issue. @claytonOps @kvaishnavi, any solutions?
Does the below code work?
# Load model directly
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("microsoft/Phi-3-vision-128k-instruct-onnx-cpu", trust_remote_code=True)
Hugging Face's Transformers library only works with PyTorch models, not ONNX models, so this code will not work. The uploaded Phi-3 vision ONNX models work with ONNX Runtime GenAI. You can follow the Phi-3 vision ONNX tutorial to run the ONNX models.
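For reference, here is a minimal sketch of the ONNX Runtime GenAI flow (untested; the calls follow the published phi3v.py example, the model and image paths are placeholders, and the API may differ between onnxruntime-genai versions):

import onnxruntime_genai as og

# Placeholder: path to the downloaded ONNX model folder
model_path = "./cpu-int4-rtn-block-32-acc-level-4"

model = og.Model(model_path)
processor = model.create_multimodal_processor()
tokenizer_stream = processor.create_stream()

# Phi-3 vision chat template with one image placeholder tag
prompt = "<|user|>\n<|image_1|>\nDescribe this image.<|end|>\n<|assistant|>\n"
image = og.Images.open("example.jpg")  # placeholder image path
inputs = processor(prompt, images=image)

params = og.GeneratorParams(model)
params.set_inputs(inputs)
params.set_search_options(max_length=3072)

# Stream generated tokens to stdout
generator = og.Generator(model, params)
while not generator.is_done():
    generator.compute_logits()
    generator.generate_next_token()
    print(tokenizer_stream.decode(generator.get_next_tokens()[0]), end="", flush=True)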
Alternatively, you can directly download the script here to run inference with the ONNX model: https://github.com/microsoft/onnxruntime-genai/blob/main/examples/python/phi3v.py
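If you also need the ONNX model files locally, they can be pulled with huggingface_hub. A sketch, assuming the int4 CPU variant lives in the cpu-int4-rtn-block-32-acc-level-4 subfolder of the repo (check the repo's file listing for the current folder name):

from huggingface_hub import snapshot_download

# Download only the int4 CPU variant of the Phi-3 vision ONNX model
snapshot_download(
    repo_id="microsoft/Phi-3-vision-128k-instruct-onnx-cpu",
    allow_patterns=["cpu-int4-rtn-block-32-acc-level-4/*"],
    local_dir=".",
)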