zluvolyote committed
Commit 81d966a · 1 Parent(s): 2ff98a1

update model card README.md

Files changed (1): README.md +28 -2
README.md CHANGED

@@ -2,6 +2,8 @@
  license: apache-2.0
  tags:
  - generated_from_trainer
+ metrics:
+ - accuracy
  model-index:
  - name: DEREXP
    results: []
@@ -13,6 +15,12 @@ should probably proofread and complete it, then remove this comment. -->
  # DEREXP

  This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 3.5797
+ - Mse: 3.5797
+ - Mae: 1.4414
+ - R2: 0.3526
+ - Accuracy: 0.2268

  ## Model description

@@ -32,13 +40,31 @@ More information needed

  The following hyperparameters were used during training:
  - learning_rate: 2e-05
- - train_batch_size: 4
- - eval_batch_size: 4
+ - train_batch_size: 16
+ - eval_batch_size: 16
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - num_epochs: 1

+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Mse    | Mae    | R2     | Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:--------:|
+ | 14.19         | 0.08  | 500  | 4.6174          | 4.6174 | 1.6744 | 0.1649 | 0.198    |
+ | 4.527         | 0.16  | 1000 | 3.9019          | 3.9019 | 1.5164 | 0.2943 | 0.2192   |
+ | 4.3036        | 0.24  | 1500 | 5.3501          | 5.3501 | 1.8130 | 0.0324 | 0.1736   |
+ | 4.0923        | 0.32  | 2000 | 3.8948          | 3.8948 | 1.5150 | 0.2956 | 0.2142   |
+ | 4.0042        | 0.4   | 2500 | 3.7648          | 3.7648 | 1.4905 | 0.3191 | 0.2162   |
+ | 3.8685        | 0.48  | 3000 | 3.7741          | 3.7741 | 1.4908 | 0.3174 | 0.2152   |
+ | 3.8928        | 0.56  | 3500 | 3.7122          | 3.7122 | 1.4738 | 0.3286 | 0.214    |
+ | 3.8193        | 0.64  | 4000 | 3.7020          | 3.7020 | 1.4727 | 0.3304 | 0.2182   |
+ | 3.6929        | 0.72  | 4500 | 3.6419          | 3.6419 | 1.4575 | 0.3413 | 0.2266   |
+ | 3.7974        | 0.8   | 5000 | 3.6995          | 3.6995 | 1.4656 | 0.3309 | 0.2202   |
+ | 3.7752        | 0.88  | 5500 | 3.6344          | 3.6344 | 1.4559 | 0.3427 | 0.2276   |
+ | 3.6254        | 0.96  | 6000 | 3.5797          | 3.5797 | 1.4414 | 0.3526 | 0.2268   |
+
+
  ### Framework versions

  - Transformers 4.20.1
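
The hyperparameters listed in the updated card map onto Hugging Face `TrainingArguments` roughly as sketched below. The training script itself is not part of this commit, so `output_dir` and the steps-based evaluation schedule (read off the 500-step cadence in the results table) are assumptions.

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments setup mirroring the card's hyperparameters.
# output_dir and the steps-based evaluation schedule are assumptions, not
# values taken from this commit.
training_args = TrainingArguments(
    output_dir="DEREXP",              # assumed; named after the model
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=1,
    lr_scheduler_type="linear",
    adam_beta1=0.9,                   # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                # and epsilon=1e-08
    evaluation_strategy="steps",      # the results table logs an eval every 500 steps
    eval_steps=500,
    logging_steps=500,
)
```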
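A `compute_metrics` function along these lines, handed to the `Trainer` alongside the arguments above, would yield the Mse, Mae, R2 and Accuracy columns reported in the table (with Loss equal to Mse, consistent with a mean-squared-error regression objective). The card does not say how Accuracy is derived from a regression head, so rounding predictions to the nearest integer target is only an assumption here.

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    mean_absolute_error,
    mean_squared_error,
    r2_score,
)

def compute_metrics(eval_pred):
    """Sketch of a metric function producing the columns reported in the table above."""
    predictions, labels = eval_pred        # Trainer passes an EvalPrediction(predictions, label_ids)
    predictions = predictions.reshape(-1)  # flatten a regression head's (N, 1) output to (N,)
    return {
        "mse": mean_squared_error(labels, predictions),  # the card's Loss equals Mse
        "mae": mean_absolute_error(labels, predictions),
        "r2": r2_score(labels, predictions),
        # Assumption: "Accuracy" counts predictions that round to the integer-valued target score.
        "accuracy": accuracy_score(labels, np.round(predictions)),
    }
```

It would be wired in as `Trainer(model=model, args=training_args, compute_metrics=compute_metrics, ...)`.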