sharkMeow committed
Commit ca371b5 · verified · 1 Parent(s): 0e7809b

Model save

Files changed (1)
README.md +16 -6
README.md CHANGED
@@ -1,6 +1,4 @@
  ---
- license: gpl-3.0
- base_model: ckiplab/bert-base-chinese
  tags:
  - generated_from_trainer
  model-index:
@@ -13,9 +11,9 @@ should probably proofread and complete it, then remove this comment. -->
 
  # clip-roberta-finetuned
 
- This model is a fine-tuned version of [ckiplab/bert-base-chinese](https://huggingface.co/ckiplab/bert-base-chinese) on an unknown dataset.
+ This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 7.5963
+ - Loss: 7.7902
 
  ## Model description
 
@@ -36,14 +34,26 @@ More information needed
  The following hyperparameters were used during training:
  - learning_rate: 5e-05
  - train_batch_size: 80
- - eval_batch_size: 128
+ - eval_batch_size: 150
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 100.0
+ - num_epochs: 150.0
 
  ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:----:|:---------------:|
+ | 2.2125        | 15.0  | 240  | 7.3975          |
+ | 0.2662        | 30.0  | 480  | 7.6902          |
+ | 0.0878        | 45.0  | 720  | 7.7278          |
+ | 0.0478        | 60.0  | 960  | 7.7675          |
+ | 0.0271        | 75.0  | 1200 | 7.8001          |
+ | 0.0204        | 90.0  | 1440 | 7.7704          |
+ | 0.0153        | 105.0 | 1680 | 7.7562          |
+ | 0.0144        | 120.0 | 1920 | 7.7687          |
+ | 0.0118        | 135.0 | 2160 | 7.7854          |
+ | 0.0109        | 150.0 | 2400 | 7.7902          |
 
 
  ### Framework versions
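
For reference, the hyperparameters listed in the updated card map roughly onto `transformers.TrainingArguments` as sketched below. This is a minimal illustration, not the script that produced this commit: the `output_dir` value is a hypothetical placeholder, the batch sizes are treated as per-device values (i.e. single-GPU training is assumed), and the Adam settings are written out explicitly even though they match the Trainer defaults.

```python
# Minimal sketch, assuming the run used the Hugging Face Trainer API.
# Values are copied from the hyperparameter list in the diff above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="clip-roberta-finetuned",  # hypothetical output path
    learning_rate=5e-05,
    per_device_train_batch_size=80,       # card lists train_batch_size: 80
    per_device_eval_batch_size=150,       # card lists eval_batch_size: 150
    seed=42,
    adam_beta1=0.9,                       # Adam betas=(0.9, 0.999), epsilon=1e-08
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=150.0,
)
```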