This is the ONNX variant of the bge-small-en-v1.5 embedding model, created with the DeepSparse Optimum integration.

To export the model to ONNX, first install the DeepSparse Optimum integration:

```bash
pip install git+https://github.com/neuralmagic/optimum-deepsparse.git
```

Then export and save the checkpoint:

```python
from pathlib import Path

from optimum.deepsparse import DeepSparseModelForFeatureExtraction
from transformers.onnx.utils import get_preprocessor

model_id = "BAAI/bge-small-en-v1.5"

# Load the model and convert it to ONNX
model = DeepSparseModelForFeatureExtraction.from_pretrained(model_id, export=True)
tokenizer = get_preprocessor(model_id)

# Save the ONNX checkpoint and tokenizer
onnx_path = Path("bge-small-en-v1.5-dense")
model.save_pretrained(onnx_path)
tokenizer.save_pretrained(onnx_path)
```
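
After export, the saved checkpoint can be loaded back for embedding inference. The sketch below is not part of the original card: it assumes the `bge-small-en-v1.5-dense` directory created above and the standard Optimum feature-extraction output (`last_hidden_state`), and follows the BGE convention of taking the L2-normalized `[CLS]` embedding as the sentence vector.

```python
# Minimal inference sketch (assumptions noted above; adapt to your setup).
import torch
from transformers import AutoTokenizer
from optimum.deepsparse import DeepSparseModelForFeatureExtraction

onnx_path = "bge-small-en-v1.5-dense"
tokenizer = AutoTokenizer.from_pretrained(onnx_path)
model = DeepSparseModelForFeatureExtraction.from_pretrained(onnx_path)

sentences = [
    "DeepSparse runs transformer inference on CPUs.",
    "BGE produces sentence embeddings.",
]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
outputs = model(**inputs)

# BGE uses the [CLS] token embedding, L2-normalized, as the sentence embedding.
# torch.as_tensor handles either torch or numpy outputs from the runtime.
cls_embeddings = torch.as_tensor(outputs.last_hidden_state)[:, 0]
embeddings = torch.nn.functional.normalize(cls_embeddings, p=2, dim=1)

print(embeddings.shape)  # expected (2, 384): bge-small-en-v1.5 has a 384-dim hidden size
```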