add neural-compressor tag

README.md CHANGED

````diff
@@ -1,14 +1,16 @@
 ---
 license: apache-2.0
 tags:
--
--
--
--
+- neural-compressor
+- 8-bit
+- int8
+- Intel® Neural Compressor
+- PostTrainingStatic
+- onnx
 datasets:
-- squad
+- squad
 metrics:
-- f1
+- f1
 ---
 
 # INT8 DistilBERT base uncased finetuned on Squad
@@ -62,4 +64,4 @@ The calibration dataloader is the eval dataloader. The default calibration sampl
 ```python
 from optimum.onnxruntime import ORTModelForQuestionAnswering
 model = ORTModelForQuestionAnswering.from_pretrained('Intel/distilbert-base-uncased-distilled-squad-int8-static')
-```
+```
````
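The loaded `ORTModelForQuestionAnswering` produces start and end logits over the input tokens, from which the answer span is decoded. As a rough, library-free sketch of that decoding step (toy tokens and logit values invented for illustration; the real pipeline scores sub-word token indices and maps them back to character offsets), the answer is the span `(s, e)` maximizing the summed start and end logits:

```python
def extract_answer(start_logits, end_logits, tokens, max_answer_len=15):
    # Score every candidate span (s, e) with s <= e < s + max_answer_len
    # and keep the one whose start-logit + end-logit sum is highest.
    best_score = float("-inf")
    best_span = (0, 0)
    for s in range(len(tokens)):
        for e in range(s, min(s + max_answer_len, len(tokens))):
            score = start_logits[s] + end_logits[e]
            if score > best_score:
                best_score, best_span = score, (s, e)
    s, e = best_span
    return " ".join(tokens[s:e + 1])

# Toy example: both logit vectors peak on the token "Paris".
tokens = ["The", "capital", "of", "France", "is", "Paris", "."]
start_logits = [0.1, 0.2, 0.1, 0.3, 0.2, 5.0, 0.1]
end_logits = [0.1, 0.1, 0.2, 0.4, 0.3, 6.0, 0.2]
print(extract_answer(start_logits, end_logits, tokens))  # Paris
```

The `max_answer_len` cap mirrors the common practice of rejecting implausibly long spans rather than trusting a stray high end-logit far from the start position.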