Could you please share an export script and an inference script for this model?
When I try to export an ONNX model, I get this error:
optimum-cli export onnx --model ./llava-1.5-7b-hf onnx_model/llava-v1.5-7b --task image-to-text-with-past --trust-remote-code
Traceback (most recent call last):
  File "/opt/conda/bin/optimum-cli", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.10/site-packages/optimum/commands/optimum_cli.py", line 208, in main
    service.run()
  File "/opt/conda/lib/python3.10/site-packages/optimum/commands/export/onnx.py", line 265, in run
    main_export(
  File "/opt/conda/lib/python3.10/site-packages/optimum/exporters/onnx/main.py", line 365, in main_export
    onnx_export_from_model(
  File "/opt/conda/lib/python3.10/site-packages/optimum/exporters/onnx/convert.py", line 1048, in onnx_export_from_model
    raise ValueError(
ValueError: Trying to export a llava model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as custom_onnx_configs. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type llava to be supported natively in the ONNX export.
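From the error, this optimum version has no built-in ONNX config for llava, so the export has to go through the Python API with a config passed as custom_onnx_configs, following the pattern from the docs page linked in the traceback. Below is a minimal sketch of that pattern as far as I understand it; the LlavaOnnxConfig class, its input/output axes, and the "model" key are my own guesses for illustration, not a verified working export (a real LLaVA config would presumably also need past-key-values handling and model-specific dummy input generators):

```python
# Minimal sketch of optimum's custom-export pattern, applied to llava.
# Assumptions (untested): the input/output axes below, the "model" key,
# and that the stock dummy generators produce usable sample inputs.
from transformers import AutoConfig
from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.base import OnnxConfig
from optimum.utils import (
    DummyTextInputGenerator,
    DummyVisionInputGenerator,
    NormalizedTextConfig,
)


class LlavaOnnxConfig(OnnxConfig):  # hypothetical config, for illustration only
    # Generators that build the dummy tensors used to trace the model.
    DUMMY_INPUT_GENERATOR_CLASSES = (DummyTextInputGenerator, DummyVisionInputGenerator)
    NORMALIZED_CONFIG_CLASS = NormalizedTextConfig  # may need remapping to text_config

    @property
    def inputs(self):
        # Dynamic axes for the exported graph (assumed, not verified).
        return {
            "input_ids": {0: "batch_size", 1: "sequence_length"},
            "attention_mask": {0: "batch_size", 1: "sequence_length"},
            "pixel_values": {0: "batch_size", 1: "num_channels", 2: "height", 3: "width"},
        }

    @property
    def outputs(self):
        return {"logits": {0: "batch_size", 1: "sequence_length"}}


config = AutoConfig.from_pretrained("./llava-1.5-7b-hf")
main_export(
    "./llava-1.5-7b-hf",
    output="onnx_model/llava-v1.5-7b",
    task="image-to-text",  # "-with-past" would need an OnnxConfigWithPast subclass
    custom_onnx_configs={"model": LlavaOnnxConfig(config, task="image-to-text")},
    trust_remote_code=True,
)
```

Even with something like this, an official export + inference script would be very welcome, since the custom-config route seems easy to get wrong.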