Optimized and quantized version of the original model.
Optimization format: ONNX
Quantization: int8
The original model is available at intfloat/multilingual-e5-small.
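To illustrate what int8 quantization does to the model's weights, here is a minimal sketch of symmetric per-tensor quantization, the general scheme ONNX Runtime's quantizer applies; the exact per-operator configuration of this model is not shown, and the arrays below are made-up example data.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Map float32 values to int8 with a single symmetric scale.

    Hypothetical illustration only; real quantizers choose scales
    per tensor or per channel from calibration statistics.
    """
    max_abs = float(np.max(np.abs(x)))
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 values from int8 codes."""
    return q.astype(np.float32) * scale

# Example weights (made up for demonstration).
weights = np.array([0.5, -1.2, 0.03, 0.9], dtype=np.float32)
q, scale = quantize_int8(weights)
restored = dequantize_int8(q, scale)

# Rounding error is bounded by half the quantization step.
print(q.dtype, scale, np.max(np.abs(weights - restored)))
```

The int8 storage quarters the weight footprint relative to float32, at the cost of the small rounding error shown above; this is the trade-off the quantized model makes for faster, lighter inference.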
Evaluation results (self-reported, MTEB AmazonCounterfactualClassification, test set):

| Language | Accuracy | AP | F1 |
|----------|----------|--------|--------|
| en | 73.791 | 37.000 | 67.955 |
| de | 71.649 | 82.119 | 69.880 |
| en-ext | 75.810 | 24.469 | 63.001 |
| ja | 64.186 | | |