Upload folder using huggingface_hub

Files changed:
- README.md +4 -4
- eval/Information-Retrieval_evaluation_results.csv +0 -0
- model.safetensors +1 -1
README.md CHANGED
@@ -47,9 +47,9 @@ The model was trained with the parameters:
 
 **DataLoader**:
 
-`torch.utils.data.dataloader.DataLoader` of length
+`torch.utils.data.dataloader.DataLoader` of length 374 with parameters:
 ```
-{'batch_size':
+{'batch_size': 16, 'sampler': 'torch.utils.data.sampler.SequentialSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
 ```
 
 **Loss**:
@@ -62,7 +62,7 @@ The model was trained with the parameters:
 Parameters of the fit()-Method:
 ```
 {
-    "epochs":
+    "epochs": 100,
     "evaluation_steps": 50,
     "evaluator": "sentence_transformers.evaluation.InformationRetrievalEvaluator.InformationRetrievalEvaluator",
     "max_grad_norm": 1,
@@ -72,7 +72,7 @@ Parameters of the fit()-Method:
     },
     "scheduler": "WarmupLinear",
     "steps_per_epoch": null,
-    "warmup_steps":
+    "warmup_steps": 3740,
     "weight_decay": 0.01
 }
 ```
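For context, here is a minimal sketch of how the values in this diff (batch_size 16 with a sequential sampler, 100 epochs, evaluation every 50 steps with an InformationRetrievalEvaluator, 3740 warmup steps, WarmupLinear scheduler, weight_decay 0.01, max_grad_norm 1) would typically be passed to the sentence-transformers fit() API. The base checkpoint, training pairs, retrieval evaluation data, and the loss are placeholders; none of them are stated in this diff.

```
# A minimal sketch, assuming a sentence-transformers training setup that matches
# the values shown in the README diff above.
from torch.utils.data import DataLoader
from sentence_transformers import InputExample, SentenceTransformer, losses
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder checkpoint

# Placeholder pairs; the real dataset yields 374 batches at batch_size=16.
train_examples = [
    InputExample(texts=["example query", "matching passage"]),
    InputExample(texts=["another query", "another passage"]),
]
# shuffle=False gives the SequentialSampler/BatchSampler combination shown in the diff.
train_dataloader = DataLoader(train_examples, batch_size=16, shuffle=False)

# Placeholder retrieval data for the InformationRetrievalEvaluator named in the diff.
queries = {"q1": "example query", "q2": "another query"}
corpus = {"d1": "matching passage", "d2": "another passage"}
relevant_docs = {"q1": {"d1"}, "q2": {"d2"}}
evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs)

train_loss = losses.MultipleNegativesRankingLoss(model)  # assumed loss; not shown in the diff

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    evaluator=evaluator,
    epochs=100,
    evaluation_steps=50,
    warmup_steps=3740,
    weight_decay=0.01,
    max_grad_norm=1,
    scheduler="WarmupLinear",
)
```

Note that 3740 warmup steps is consistent with 10% of the 37,400 total optimizer steps (374 batches × 100 epochs), the convention used in the sentence-transformers training examples.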
eval/Information-Retrieval_evaluation_results.csv CHANGED
The diff for this file is too large to render. See raw diff.
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:9402958f89c71c6be1274b44ae0356943725c52a67a6f068e0027cb42111e34f
 size 133462128
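As a quick sanity check, the sketch below (not part of the repository) verifies a locally downloaded model.safetensors against the sha256 and size recorded in the new LFS pointer; the local path is an assumption.

```
import hashlib
import os

path = "model.safetensors"  # assumed local path to the downloaded weights
expected_sha256 = "9402958f89c71c6be1274b44ae0356943725c52a67a6f068e0027cb42111e34f"
expected_size = 133462128  # bytes, from the LFS pointer above

sha = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha.update(chunk)

assert os.path.getsize(path) == expected_size, "size mismatch"
assert sha.hexdigest() == expected_sha256, "sha256 mismatch"
print("model.safetensors matches the LFS pointer")
```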