Optimized and quantized version of the original model.
Optimization format: ONNX
Quantization: int8
The original model is available at intfloat/multilingual-e5-small.
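As a usage sketch (not an official snippet for this repository): the ONNX weights would typically be loaded with `optimum.onnxruntime`'s `ORTModelForFeatureExtraction` together with the original model's tokenizer, inputs prefixed with `"query: "` or `"passage: "` as the intfloat/multilingual-e5-small card prescribes, and sentence embeddings produced by masked mean pooling over the token embeddings. The pooling step, which is the only model-independent part, can be sketched in plain NumPy:

```python
import numpy as np

def average_pool(last_hidden_state: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Masked mean pooling over token embeddings (the E5 convention).

    last_hidden_state: (batch, seq_len, hidden) token embeddings from the model.
    attention_mask:    (batch, seq_len) with 1 for real tokens, 0 for padding.
    """
    # Zero out padding positions, then average only over the real tokens.
    mask = attention_mask[..., None].astype(last_hidden_state.dtype)
    summed = (last_hidden_state * mask).sum(axis=1)
    counts = mask.sum(axis=1)
    return summed / counts

# Toy example: batch of 1, three tokens where the last one is padding, hidden size 2.
hidden = np.array([[[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]])
mask = np.array([[1, 1, 0]])
print(average_pool(hidden, mask))  # -> [[2. 3.]]
```

The padded position is excluded from both the sum and the divisor, so its (garbage) values never leak into the sentence embedding.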
Evaluation results
All scores are self-reported on the MTEB AmazonCounterfactualClassification test set.

| Language | Metric   | Value  |
|----------|----------|--------|
| en       | accuracy | 73.791 |
| en       | ap       | 37.000 |
| en       | f1       | 67.955 |
| de       | accuracy | 71.649 |
| de       | ap       | 82.119 |
| de       | f1       | 69.880 |
| en-ext   | accuracy | 75.810 |
| en-ext   | ap       | 24.469 |
| en-ext   | f1       | 63.001 |
| ja       | accuracy | 64.186 |