igerasimov committed
Commit: 5767d8e
Parent(s): 9f18674
igerasimov/nasa-impact-bert-e-base-mlm-finetuned

Files changed:
- README.md +173 -0
- config.json +78 -0
- model.safetensors +3 -0
- training_args.bin +3 -0
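To work against exactly this revision, the commit hash above can be passed as a `revision` when downloading from the Hub. A minimal sketch, assuming the repository id matches the commit message:

```python
# Minimal sketch: pin a download to this commit (repo id assumed from the commit message above).
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="igerasimov/nasa-impact-bert-e-base-mlm-finetuned",  # assumed repo id
    revision="5767d8e",  # branch, tag, or commit hash; the full 40-character SHA is the safest choice
)
print(local_dir)
```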
README.md
ADDED
@@ -0,0 +1,173 @@
---
license: apache-2.0
base_model: nasa-impact/nasa-smd-ibm-st
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: checkpoints
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# checkpoints

This model is a fine-tuned version of [nasa-impact/nasa-smd-ibm-st](https://huggingface.co/nasa-impact/nasa-smd-ibm-st) on an unspecified dataset.
It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):
- Accuracy: 0.7087
- Loss: 1.5329
- Overall/weighted Precision: 0.7311
- Overall/weighted Recall: 0.7087
- Overall/weighted F1: 0.7105
- Class Agriculture/precision: 1.0
- Class Agriculture/recall: 1.0
- Class Agriculture/f1: 1.0
- Class Air quality/precision: 0.7586
- Class Air quality/recall: 0.8462
- Class Air quality/f1: 0.8
- Class Cryospheric climate indicators/precision: 1.0
- Class Cryospheric climate indicators/recall: 0.5
- Class Cryospheric climate indicators/f1: 0.6667
- Class Droughts/precision: 1.0
- Class Droughts/recall: 1.0
- Class Droughts/f1: 1.0
- Class Earthquakes/precision: 1.0
- Class Earthquakes/recall: 1.0
- Class Earthquakes/f1: 1.0
- Class Ecosystem species/precision: 0.8571
- Class Ecosystem species/recall: 0.75
- Class Ecosystem species/f1: 0.8
- Class Ecosystems/precision: 0.0
- Class Ecosystems/recall: 1.0
- Class Ecosystems/f1: 0.0
- Class Energy production/use/precision: 0.0
- Class Energy production/use/recall: 0.0
- Class Energy production/use/f1: 0.0
- Class Extreme weather/precision: 0.3846
- Class Extreme weather/recall: 0.8333
- Class Extreme weather/f1: 0.5263
- Class Floods/precision: 0.0
- Class Floods/recall: 0.0
- Class Floods/f1: 0.0
- Class Greenhouse gases/precision: 0.6667
- Class Greenhouse gases/recall: 0.5
- Class Greenhouse gases/f1: 0.5714
- Class Heat/precision: 0.75
- Class Heat/recall: 0.75
- Class Heat/f1: 0.75
- Class Land use and cover change/precision: 1.0
- Class Land use and cover change/recall: 1.0
- Class Land use and cover change/f1: 1.0
- Class Landslides/precision: 0.8333
- Class Landslides/recall: 0.8333
- Class Landslides/f1: 0.8333
- Class Public health/precision: 1.0
- Class Public health/recall: 0.5
- Class Public health/f1: 0.6667
- Class Severe storms/precision: 0.5
- Class Severe storms/recall: 0.4
- Class Severe storms/f1: 0.4444
- Class Sun-earth interactions/precision: 0.7667
- Class Sun-earth interactions/recall: 0.6970
- Class Sun-earth interactions/f1: 0.7302
- Class Teleconnections/precision: 0.6667
- Class Teleconnections/recall: 0.5455
- Class Teleconnections/f1: 0.6
- Class Temperature indicators/precision: 0.5
- Class Temperature indicators/recall: 0.6667
- Class Temperature indicators/f1: 0.5714
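The overall/weighted and per-class numbers above are the kind of output a `compute_metrics` callback produces when it wraps scikit-learn. The callback used for this run is not part of the commit, so the sketch below is only an assumption about how comparable numbers could be reproduced; the class-name list and its ordering are placeholders.

```python
# Hypothetical compute_metrics sketch (not from this repository): reproduces the style of
# overall/weighted and per-class precision/recall/F1 reported above.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

CLASS_NAMES = [
    "Agriculture", "Air quality", "Cryospheric climate indicators",
    # ... remaining class names as listed above (config.json defines 23 labels in total)
]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    metrics = {"accuracy": accuracy_score(labels, preds)}
    p, r, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    metrics.update({"overall precision": p, "overall recall": r, "overall f1": f1})
    per_p, per_r, per_f1, _ = precision_recall_fscore_support(
        labels, preds, labels=list(range(len(CLASS_NAMES))), average=None, zero_division=0
    )
    for name, pi, ri, fi in zip(CLASS_NAMES, per_p, per_r, per_f1):
        metrics[f"class {name}/precision"] = pi
        metrics[f"class {name}/recall"] = ri
        metrics[f"class {name}/f1"] = fi
    return metrics
```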
## Model description

More information needed

## Intended uses & limitations

More information needed
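Since the card itself is still a stub, here is a minimal inference sketch, assuming the repository id from the commit message and the sequence-classification head described in config.json. Note that config.json ships generic `LABEL_0` through `LABEL_22` ids, so outputs are not mapped to the class names listed above.

```python
# Minimal inference sketch (assumed repo id; labels are generic because config.json
# does not map ids to the Earth-science class names listed in the metrics above).
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="igerasimov/nasa-impact-bert-e-base-mlm-finetuned",  # assumed repo id
)
print(classifier("Wildfire smoke degraded air quality across the region this week."))
# e.g. [{'label': 'LABEL_8', 'score': 0.97}]  -- illustrative output, not a real prediction
```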
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 50
- mixed_precision_training: Native AMP
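These values map one-to-one onto `transformers.TrainingArguments`. A minimal sketch, assuming defaults for everything the card does not mention (the Adam betas and epsilon listed above are already the library defaults):

```python
# Hedged sketch of TrainingArguments matching the listed hyperparameters; output_dir and
# anything not stated on the card (e.g. evaluation/save strategy) are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="checkpoints",           # assumed; matches the model name on this card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,      # 16 x 4 = 64 total train batch size
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=50,
    fp16=True,                          # "Native AMP" mixed-precision training
)
```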
### Training results

| Training Loss | Epoch | Step | Accuracy | Validation Loss | Overall/weighted Precision | Overall/weighted Recall | Overall/weighted F1 | Class Agriculture/precision | Class Agriculture/recall | Class Agriculture/f1 | Class Air quality/precision | Class Air quality/recall | Class Air quality/f1 | Class Cryospheric climate indicators/precision | Class Cryospheric climate indicators/recall | Class Cryospheric climate indicators/f1 | Class Droughts/precision | Class Droughts/recall | Class Droughts/f1 | Class Earthquakes/precision | Class Earthquakes/recall | Class Earthquakes/f1 | Class Ecosystem species/precision | Class Ecosystem species/recall | Class Ecosystem species/f1 | Class Ecosystems/precision | Class Ecosystems/recall | Class Ecosystems/f1 | Class Energy production/use/precision | Class Energy production/use/recall | Class Energy production/use/f1 | Class Extreme weather/precision | Class Extreme weather/recall | Class Extreme weather/f1 | Class Floods/precision | Class Floods/recall | Class Floods/f1 | Class Greenhouse gases/precision | Class Greenhouse gases/recall | Class Greenhouse gases/f1 | Class Heat/precision | Class Heat/recall | Class Heat/f1 | Class Land use and cover change/precision | Class Land use and cover change/recall | Class Land use and cover change/f1 | Class Landslides/precision | Class Landslides/recall | Class Landslides/f1 | Class Public health/precision | Class Public health/recall | Class Public health/f1 | Class Severe storms/precision | Class Severe storms/recall | Class Severe storms/f1 | Class Sun-earth interactions/precision | Class Sun-earth interactions/recall | Class Sun-earth interactions/f1 | Class Teleconnections/precision | Class Teleconnections/recall | Class Teleconnections/f1 | Class Temperature indicators/precision | Class Temperature indicators/recall | Class Temperature indicators/f1 | Class Validation/precision | Class Validation/recall | Class Validation/f1 |
|:-------------:|:-----:|:----:|:--------:|:---------------:|:--------------------------:|:-----------------------:|:-------------------:|:---------------------------:|:------------------------:|:--------------------:|:---------------------------:|:------------------------:|:--------------------:|:----------------------------------------------:|:-------------------------------------------:|:---------------------------------------:|:------------------------:|:---------------------:|:-----------------:|:---------------------------:|:------------------------:|:--------------------:|:---------------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:-----------------------:|:-------------------:|:-------------------------------------:|:----------------------------------:|:------------------------------:|:-------------------------------:|:----------------------------:|:------------------------:|:----------------------:|:-------------------:|:---------------:|:--------------------------------:|:-----------------------------:|:-------------------------:|:--------------------:|:-----------------:|:-------------:|:-----------------------------------------:|:--------------------------------------:|:----------------------------------:|:--------------------------:|:-----------------------:|:-------------------:|:-----------------------------:|:--------------------------:|:----------------------:|:-----------------------------:|:--------------------------:|:----------------------:|:--------------------------------------:|:-----------------------------------:|:-------------------------------:|:-------------------------------:|:----------------------------:|:------------------------:|:--------------------------------------:|:-----------------------------------:|:-------------------------------:|:--------------------------:|:-----------------------:|:-------------------:|
| No log | 1.0 | 8 | 0.0 | 3.1300 | 0.4173 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 3.1343 | 2.0 | 16 | 0.0945 | 3.0987 | 0.4665 | 0.0945 | 0.0646 | 1.0 | 0.0 | 0.0 | 0.24 | 0.4615 | 0.3158 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 3.1018 | 3.0 | 24 | 0.2126 | 3.0484 | 0.5508 | 0.2126 | 0.0978 | 1.0 | 0.0 | 0.0 | 0.2336 | 0.9615 | 0.3759 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.1176 | 0.0606 | 0.08 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 3.0499 | 4.0 | 32 | 0.2283 | 2.9848 | 0.6297 | 0.2283 | 0.1095 | 1.0 | 0.0 | 0.0 | 0.2364 | 1.0 | 0.3824 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.1765 | 0.0909 | 0.12 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2.9715 | 5.0 | 40 | 0.2283 | 2.9113 | 0.6321 | 0.2283 | 0.1095 | 1.0 | 0.0 | 0.0 | 0.2342 | 1.0 | 0.3796 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.1875 | 0.0909 | 0.1224 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2.9715 | 6.0 | 48 | 0.2520 | 2.8317 | 0.6668 | 0.2520 | 0.1394 | 1.0 | 0.0 | 0.0 | 0.2407 | 1.0 | 0.3881 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.3158 | 0.1818 | 0.2308 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2.8735 | 7.0 | 56 | 0.4016 | 2.7478 | 0.7263 | 0.4016 | 0.2582 | 1.0 | 0.0 | 0.0 | 0.52 | 1.0 | 0.6842 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.3247 | 0.7576 | 0.4545 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2.7766 | 8.0 | 64 | 0.3937 | 2.6696 | 0.7255 | 0.3937 | 0.2552 | 1.0 | 0.0 | 0.0 | 0.5319 | 0.9615 | 0.6849 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.3125 | 0.7576 | 0.4425 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2.6778 | 9.0 | 72 | 0.3307 | 2.6007 | 0.7783 | 0.3307 | 0.2201 | 1.0 | 0.0 | 0.0 | 0.8333 | 0.3846 | 0.5263 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.2783 | 0.9697 | 0.4324 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2.6132 | 10.0 | 80 | 0.3701 | 2.5347 | 0.7679 | 0.3701 | 0.2580 | 1.0 | 0.0 | 0.0 | 0.7727 | 0.6538 | 0.7083 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.2857 | 0.9091 | 0.4348 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2.6132 | 11.0 | 88 | 0.3937 | 2.4538 | 0.7650 | 0.3937 | 0.2717 | 1.0 | 0.0 | 0.0 | 0.7407 | 0.7692 | 0.7547 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.3 | 0.9091 | 0.4511 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2.502 | 12.0 | 96 | 0.4252 | 2.3546 | 0.7537 | 0.4252 | 0.3030 | 1.0 | 0.0 | 0.0 | 0.6667 | 0.9231 | 0.7742 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.25 | 0.4 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.3146 | 0.8485 | 0.4590 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2.4298 | 13.0 | 104 | 0.4646 | 2.2607 | 0.7308 | 0.4646 | 0.3283 | 1.0 | 0.0 | 0.0 | 0.6098 | 0.9615 | 0.7463 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6 | 0.75 | 0.6667 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.3684 | 0.8485 | 0.5138 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 |
| 2.263 | 14.0 | 112 | 0.4803 | 2.1656 | 0.6757 | 0.4803 | 0.3474 | 1.0 | 0.0 | 0.0 | 0.6098 | 0.9615 | 0.7463 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.4375 | 0.875 | 0.5833 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.4179 | 0.8485 | 0.56 | 0.3333 | 0.0909 | 0.1429 | 1.0 | 0.0 | 0.0 |
| 2.1649 | 15.0 | 120 | 0.4882 | 2.0712 | 0.6239 | 0.4882 | 0.3616 | 1.0 | 0.0 | 0.0 | 0.5814 | 0.9615 | 0.7246 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.5 | 0.75 | 0.6 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.4462 | 0.8788 | 0.5918 | 0.4 | 0.1818 | 0.25 | 1.0 | 0.0 | 0.0 |
| 2.1649 | 16.0 | 128 | 0.5118 | 1.9789 | 0.6575 | 0.5118 | 0.3960 | 1.0 | 0.0 | 0.0 | 0.6944 | 0.9615 | 0.8065 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.4615 | 0.75 | 0.5714 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.1538 | 0.25 | 0.1905 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.5082 | 0.9394 | 0.6596 | 0.25 | 0.0909 | 0.1333 | 1.0 | 0.0 | 0.0 |
| 2.0169 | 17.0 | 136 | 0.5433 | 1.9319 | 0.7150 | 0.5433 | 0.4470 | 1.0 | 0.0 | 0.0 | 0.6757 | 0.9615 | 0.7937 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.4286 | 0.75 | 0.5455 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.1818 | 0.5 | 0.2667 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6383 | 0.9091 | 0.75 | 0.5714 | 0.3636 | 0.4444 | 1.0 | 0.0 | 0.0 |
| 1.8937 | 18.0 | 144 | 0.5276 | 1.8886 | 0.7051 | 0.5276 | 0.4316 | 1.0 | 0.0 | 0.0 | 0.5814 | 0.9615 | 0.7246 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.4615 | 0.75 | 0.5714 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.25 | 0.75 | 0.375 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.65 | 0.7879 | 0.7123 | 0.5714 | 0.3636 | 0.4444 | 1.0 | 0.0 | 0.0 |
| 1.8209 | 19.0 | 152 | 0.5512 | 1.7950 | 0.7211 | 0.5512 | 0.4524 | 1.0 | 0.0 | 0.0 | 0.6579 | 0.9615 | 0.7812 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.5 | 0.75 | 0.6 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.24 | 0.75 | 0.3636 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6444 | 0.8788 | 0.7436 | 0.5714 | 0.3636 | 0.4444 | 1.0 | 0.0 | 0.0 |
| 1.6452 | 20.0 | 160 | 0.5827 | 1.7998 | 0.7483 | 0.5827 | 0.4947 | 1.0 | 0.0 | 0.0 | 0.625 | 0.9615 | 0.7576 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6667 | 0.75 | 0.7059 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.24 | 0.75 | 0.3636 | 1.0 | 0.25 | 0.4 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6829 | 0.8485 | 0.7568 | 0.7273 | 0.7273 | 0.7273 | 1.0 | 0.0 | 0.0 |
| 1.6452 | 21.0 | 168 | 0.5827 | 1.7016 | 0.7211 | 0.5827 | 0.5068 | 1.0 | 0.0 | 0.0 | 0.6757 | 0.9615 | 0.7937 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.5833 | 0.875 | 0.7 | 1.0 | 0.0 | 0.0 | 0.5 | 0.1667 | 0.25 | 1.0 | 0.0 | 0.0 | 0.25 | 0.75 | 0.375 | 1.0 | 0.25 | 0.4 | 1.0 | 0.25 | 0.4 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6512 | 0.8485 | 0.7368 | 0.7143 | 0.4545 | 0.5556 | 1.0 | 0.0 | 0.0 |
| 1.5597 | 22.0 | 176 | 0.6378 | 1.6544 | 0.7338 | 0.6378 | 0.5642 | 1.0 | 0.0 | 0.0 | 0.6579 | 0.9615 | 0.7812 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.5 | 0.6667 | 0.75 | 0.75 | 0.75 | 1.0 | 0.0 | 0.0 | 0.6 | 0.5 | 0.5455 | 1.0 | 0.0 | 0.0 | 0.4118 | 0.875 | 0.56 | 0.75 | 0.75 | 0.75 | 1.0 | 0.25 | 0.4 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6512 | 0.8485 | 0.7368 | 0.7 | 0.6364 | 0.6667 | 1.0 | 0.0 | 0.0 |
| 1.4316 | 23.0 | 184 | 0.6850 | 1.5807 | 0.7513 | 0.6850 | 0.6319 | 1.0 | 0.0 | 0.0 | 0.7742 | 0.9231 | 0.8421 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.5 | 0.6667 | 0.6667 | 0.75 | 0.7059 | 1.0 | 0.0 | 0.0 | 0.5 | 0.5 | 0.5 | 1.0 | 0.0 | 0.0 | 0.4 | 0.75 | 0.5217 | 0.6 | 0.75 | 0.6667 | 1.0 | 0.75 | 0.8571 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6977 | 0.9091 | 0.7895 | 0.6667 | 0.5455 | 0.6 | 1.0 | 0.0 | 0.0 |
| 1.2604 | 24.0 | 192 | 0.6850 | 1.5698 | 0.7486 | 0.6850 | 0.6436 | 1.0 | 0.0 | 0.0 | 0.7353 | 0.9615 | 0.8333 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 1.0 | 0.5 | 0.6667 | 0.75 | 0.75 | 0.75 | 1.0 | 0.0 | 0.0 | 0.4286 | 0.5 | 0.4615 | 1.0 | 0.0 | 0.0 | 0.4 | 0.75 | 0.5217 | 0.75 | 0.75 | 0.75 | 1.0 | 0.75 | 0.8571 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.7027 | 0.7879 | 0.7429 | 0.6364 | 0.6364 | 0.6364 | 1.0 | 0.6667 | 0.8 |
| 1.17 | 25.0 | 200 | 0.6772 | 1.5075 | 0.7379 | 0.6772 | 0.6377 | 1.0 | 0.0 | 0.0 | 0.7353 | 0.9615 | 0.8333 | 1.0 | 0.0 | 0.0 | 1.0 | 0.5 | 0.6667 | 1.0 | 0.5 | 0.6667 | 0.6667 | 0.75 | 0.7059 | 1.0 | 0.0 | 0.0 | 0.4444 | 0.6667 | 0.5333 | 1.0 | 0.0 | 0.0 | 0.4615 | 0.75 | 0.5714 | 0.75 | 0.75 | 0.75 | 1.0 | 0.75 | 0.8571 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6757 | 0.7576 | 0.7143 | 0.6 | 0.5455 | 0.5714 | 1.0 | 0.3333 | 0.5 |
| 1.17 | 26.0 | 208 | 0.7165 | 1.4437 | 0.7679 | 0.7165 | 0.6756 | 1.0 | 0.0 | 0.0 | 0.8065 | 0.9615 | 0.8772 | 1.0 | 0.0 | 0.0 | 1.0 | 0.5 | 0.6667 | 1.0 | 0.5 | 0.6667 | 0.75 | 0.75 | 0.75 | 1.0 | 0.0 | 0.0 | 0.4444 | 0.6667 | 0.5333 | 1.0 | 0.0 | 0.0 | 0.5455 | 0.75 | 0.6316 | 0.6 | 0.75 | 0.6667 | 1.0 | 0.75 | 0.8571 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6905 | 0.8788 | 0.7733 | 0.6667 | 0.5455 | 0.6 | 1.0 | 0.6667 | 0.8 |
| 1.0515 | 27.0 | 216 | 0.7402 | 1.3992 | 0.7587 | 0.7402 | 0.7103 | 1.0 | 0.0 | 0.0 | 0.8214 | 0.8846 | 0.8519 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.75 | 0.75 | 0.75 | 1.0 | 0.0 | 0.0 | 0.5 | 0.8333 | 0.625 | 1.0 | 0.0 | 0.0 | 0.8333 | 0.625 | 0.7143 | 1.0 | 0.75 | 0.8571 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6667 | 0.9091 | 0.7692 | 0.6667 | 0.5455 | 0.6 | 1.0 | 0.6667 | 0.8 |
| 0.9513 | 28.0 | 224 | 0.7165 | 1.3520 | 0.7659 | 0.7165 | 0.6754 | 1.0 | 0.0 | 0.0 | 0.7667 | 0.8846 | 0.8214 | 1.0 | 0.0 | 0.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.75 | 0.75 | 0.75 | 1.0 | 0.0 | 0.0 | 0.5 | 0.8333 | 0.625 | 1.0 | 0.0 | 0.0 | 0.6667 | 0.75 | 0.7059 | 1.0 | 0.75 | 0.8571 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6667 | 0.8485 | 0.7467 | 0.5455 | 0.5455 | 0.5455 | 1.0 | 0.3333 | 0.5 |
| 0.7942 | 29.0 | 232 | 0.7244 | 1.3103 | 0.7409 | 0.7244 | 0.7038 | 1.0 | 0.5 | 0.6667 | 0.8065 | 0.9615 | 0.8772 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.6667 | 0.75 | 0.7059 | 1.0 | 0.0 | 0.0 | 0.4167 | 0.8333 | 0.5556 | 1.0 | 0.0 | 0.0 | 0.75 | 0.75 | 0.75 | 1.0 | 0.75 | 0.8571 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7059 | 0.7273 | 0.7164 | 0.5455 | 0.5455 | 0.5455 | 1.0 | 0.6667 | 0.8 |
| 0.7386 | 30.0 | 240 | 0.7008 | 1.3628 | 0.7289 | 0.7008 | 0.6813 | 1.0 | 0.0 | 0.0 | 0.8065 | 0.9615 | 0.8772 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.75 | 0.75 | 0.75 | 1.0 | 0.0 | 0.0 | 0.3333 | 0.6667 | 0.4444 | 1.0 | 0.0 | 0.0 | 0.7143 | 0.625 | 0.6667 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7059 | 0.7273 | 0.7164 | 0.6 | 0.5455 | 0.5714 | 0.6667 | 0.6667 | 0.6667 |
| 0.7386 | 31.0 | 248 | 0.7165 | 1.3002 | 0.7389 | 0.7165 | 0.6980 | 1.0 | 0.5 | 0.6667 | 0.7353 | 0.9615 | 0.8333 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.75 | 0.75 | 0.75 | 1.0 | 0.0 | 0.0 | 0.4 | 0.6667 | 0.5 | 1.0 | 0.0 | 0.0 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7586 | 0.6667 | 0.7097 | 0.5714 | 0.7273 | 0.64 | 1.0 | 0.6667 | 0.8 |
| 0.64 | 32.0 | 256 | 0.7244 | 1.3095 | 0.7345 | 0.7244 | 0.7027 | 1.0 | 0.5 | 0.6667 | 0.8065 | 0.9615 | 0.8772 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.75 | 0.75 | 0.75 | 1.0 | 0.0 | 0.0 | 0.4 | 0.6667 | 0.5 | 1.0 | 0.0 | 0.0 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6944 | 0.7576 | 0.7246 | 0.5455 | 0.5455 | 0.5455 | 1.0 | 0.6667 | 0.8 |
| 0.5528 | 33.0 | 264 | 0.7087 | 1.2698 | 0.7282 | 0.7087 | 0.6903 | 1.0 | 0.5 | 0.6667 | 0.7576 | 0.9615 | 0.8475 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.75 | 0.75 | 0.75 | 1.0 | 0.0 | 0.0 | 0.3636 | 0.6667 | 0.4706 | 1.0 | 0.0 | 0.0 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6970 | 0.6970 | 0.6970 | 0.6 | 0.5455 | 0.5714 | 1.0 | 0.6667 | 0.8 |
| 0.4612 | 34.0 | 272 | 0.7402 | 1.2532 | 0.7488 | 0.7402 | 0.7174 | 1.0 | 1.0 | 1.0 | 0.7812 | 0.9615 | 0.8621 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.4 | 0.6667 | 0.5 | 1.0 | 0.0 | 0.0 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7027 | 0.7879 | 0.7429 | 0.6667 | 0.5455 | 0.6 | 1.0 | 0.6667 | 0.8 |
| 0.4051 | 35.0 | 280 | 0.7087 | 1.2745 | 0.7369 | 0.7087 | 0.6930 | 1.0 | 1.0 | 1.0 | 0.7812 | 0.9615 | 0.8621 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.3846 | 0.8333 | 0.5263 | 1.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.7143 | 0.625 | 0.6667 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7241 | 0.6364 | 0.6774 | 0.5 | 0.6364 | 0.56 | 1.0 | 0.6667 | 0.8 |
| 0.4051 | 36.0 | 288 | 0.7087 | 1.3092 | 0.7413 | 0.7087 | 0.6988 | 1.0 | 1.0 | 1.0 | 0.8276 | 0.9231 | 0.8727 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.3077 | 0.6667 | 0.4211 | 1.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6765 | 0.6970 | 0.6866 | 0.6 | 0.5455 | 0.5714 | 1.0 | 0.6667 | 0.8 |
| 0.3354 | 37.0 | 296 | 0.6929 | 1.2752 | 0.7348 | 0.6929 | 0.6799 | 1.0 | 0.5 | 0.6667 | 0.7143 | 0.9615 | 0.8197 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.75 | 0.75 | 0.75 | 1.0 | 0.0 | 0.0 | 0.3333 | 0.6667 | 0.4444 | 1.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.7143 | 0.625 | 0.6667 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.3333 | 0.2 | 0.25 | 0.76 | 0.5758 | 0.6552 | 0.5714 | 0.7273 | 0.64 | 0.6667 | 0.6667 | 0.6667 |
| 0.2905 | 38.0 | 304 | 0.7244 | 1.2621 | 0.7442 | 0.7244 | 0.7060 | 1.0 | 1.0 | 1.0 | 0.7812 | 0.9615 | 0.8621 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.3846 | 0.8333 | 0.5263 | 1.0 | 0.0 | 0.0 | 0.6667 | 0.5 | 0.5714 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7273 | 0.7273 | 0.7273 | 0.7 | 0.6364 | 0.6667 | 0.6667 | 0.6667 | 0.6667 |
| 0.249 | 39.0 | 312 | 0.7165 | 1.3443 | 0.7636 | 0.7165 | 0.7027 | 1.0 | 1.0 | 1.0 | 0.7143 | 0.9615 | 0.8197 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.3846 | 0.8333 | 0.5263 | 1.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.7143 | 0.625 | 0.6667 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.3333 | 0.2 | 0.25 | 0.8261 | 0.5758 | 0.6786 | 0.6 | 0.8182 | 0.6923 | 0.6667 | 0.6667 | 0.6667 |
| 0.201 | 40.0 | 320 | 0.7008 | 1.2940 | 0.7209 | 0.7008 | 0.6824 | 1.0 | 1.0 | 1.0 | 0.8276 | 0.9231 | 0.8727 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 0.0 | 1.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.3333 | 0.6667 | 0.4444 | 1.0 | 0.0 | 0.0 | 0.5 | 0.25 | 0.3333 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6842 | 0.7879 | 0.7324 | 0.6 | 0.5455 | 0.5714 | 0.6667 | 0.6667 | 0.6667 |
| 0.201 | 41.0 | 328 | 0.7638 | 1.3724 | 0.8040 | 0.7638 | 0.7341 | 1.0 | 1.0 | 1.0 | 0.8 | 0.9231 | 0.8571 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.5 | 0.8333 | 0.625 | 1.0 | 0.0 | 0.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.75 | 0.8571 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.6744 | 0.8788 | 0.7632 | 0.6667 | 0.5455 | 0.6 | 1.0 | 0.6667 | 0.8 |
| 0.1855 | 42.0 | 336 | 0.7323 | 1.3904 | 0.7521 | 0.7323 | 0.7117 | 1.0 | 1.0 | 1.0 | 0.8846 | 0.8846 | 0.8846 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.3636 | 0.6667 | 0.4706 | 1.0 | 0.0 | 0.0 | 0.8 | 0.5 | 0.6154 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6591 | 0.8788 | 0.7532 | 0.6667 | 0.5455 | 0.6 | 0.6667 | 0.6667 | 0.6667 |
| 0.1636 | 43.0 | 344 | 0.6929 | 1.4009 | 0.7417 | 0.6929 | 0.6851 | 1.0 | 1.0 | 1.0 | 0.6970 | 0.8846 | 0.7797 | 0.5 | 0.5 | 0.5 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.3571 | 0.8333 | 0.5 | 1.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.8 | 0.5 | 0.6154 | 1.0 | 0.75 | 0.8571 | 1.0 | 1.0 | 1.0 | 0.7143 | 0.8333 | 0.7692 | 1.0 | 0.0 | 0.0 | 0.2 | 0.2 | 0.2 | 0.7692 | 0.6061 | 0.6780 | 0.7273 | 0.7273 | 0.7273 | 0.6667 | 0.6667 | 0.6667 |
| 0.1472 | 44.0 | 352 | 0.7165 | 1.4073 | 0.7307 | 0.7165 | 0.7020 | 1.0 | 1.0 | 1.0 | 0.8462 | 0.8462 | 0.8462 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.3636 | 0.6667 | 0.4706 | 1.0 | 0.0 | 0.0 | 0.8333 | 0.625 | 0.7143 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.8333 | 0.8333 | 1.0 | 0.5 | 0.6667 | 0.0 | 0.0 | 0.0 | 0.6429 | 0.8182 | 0.72 | 0.625 | 0.4545 | 0.5263 | 0.6667 | 0.6667 | 0.6667 |
| 0.114 | 45.0 | 360 | 0.7244 | 1.4105 | 0.7553 | 0.7244 | 0.7219 | 1.0 | 1.0 | 1.0 | 0.7059 | 0.9231 | 0.8 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 0.0 | 0.0 | 0.0 | 0.3846 | 0.8333 | 0.5263 | 1.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.7143 | 0.625 | 0.6667 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.8333 | 0.8333 | 1.0 | 0.5 | 0.6667 | 0.6667 | 0.4 | 0.5 | 0.8077 | 0.6364 | 0.7119 | 0.7 | 0.6364 | 0.6667 | 0.6667 | 0.6667 | 0.6667 |
| 0.114 | 46.0 | 368 | 0.7402 | 1.3171 | 0.7515 | 0.7402 | 0.7248 | 1.0 | 1.0 | 1.0 | 0.8065 | 0.9615 | 0.8772 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.3846 | 0.8333 | 0.5263 | 1.0 | 0.0 | 0.0 | 0.7143 | 0.625 | 0.6667 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.5 | 0.6667 | 0.0 | 0.0 | 0.0 | 0.7353 | 0.7576 | 0.7463 | 0.6667 | 0.5455 | 0.6 | 0.6667 | 0.6667 | 0.6667 |
| 0.0875 | 47.0 | 376 | 0.7402 | 1.3592 | 0.7814 | 0.7402 | 0.7375 | 1.0 | 1.0 | 1.0 | 0.7742 | 0.9231 | 0.8421 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 1.0 | 0.0 | 0.0 | 0.3571 | 0.8333 | 0.5 | 1.0 | 0.0 | 0.0 | 0.7143 | 0.625 | 0.6667 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.5 | 0.6667 | 0.4 | 0.4 | 0.4 | 0.8148 | 0.6667 | 0.7333 | 0.7273 | 0.7273 | 0.7273 | 0.5 | 0.6667 | 0.5714 |
| 0.0698 | 48.0 | 384 | 0.7244 | 1.4037 | 0.7611 | 0.7244 | 0.7204 | 1.0 | 1.0 | 1.0 | 0.8 | 0.9231 | 0.8571 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 0.0 | 1.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.3846 | 0.8333 | 0.5263 | 1.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.6667 | 0.5 | 0.5714 | 0.75 | 0.75 | 0.75 | 0.8 | 1.0 | 0.8889 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.5 | 0.6667 | 0.3333 | 0.2 | 0.25 | 0.7667 | 0.6970 | 0.7302 | 0.7 | 0.6364 | 0.6667 | 0.5 | 0.6667 | 0.5714 |
| 0.0519 | 49.0 | 392 | 0.7244 | 1.5194 | 0.7606 | 0.7244 | 0.7188 | 1.0 | 1.0 | 1.0 | 0.7576 | 0.9615 | 0.8475 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 0.0 | 1.0 | 0.0 | 1.0 | 0.0 | 0.0 | 0.3846 | 0.8333 | 0.5263 | 1.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0 | 0.6667 | 0.5 | 0.5714 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.9091 | 1.0 | 0.5 | 0.6667 | 0.3333 | 0.2 | 0.25 | 0.7586 | 0.6667 | 0.7097 | 0.7 | 0.6364 | 0.6667 | 0.6667 | 0.6667 | 0.6667 |
| 0.0404 | 50.0 | 400 | 0.7087 | 1.5329 | 0.7311 | 0.7087 | 0.7105 | 1.0 | 1.0 | 1.0 | 0.7586 | 0.8462 | 0.8 | 1.0 | 0.5 | 0.6667 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 1.0 | 0.8571 | 0.75 | 0.8 | 0.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3846 | 0.8333 | 0.5263 | 0.0 | 0.0 | 0.0 | 0.6667 | 0.5 | 0.5714 | 0.75 | 0.75 | 0.75 | 1.0 | 1.0 | 1.0 | 0.8333 | 0.8333 | 0.8333 | 1.0 | 0.5 | 0.6667 | 0.5 | 0.4 | 0.4444 | 0.7667 | 0.6970 | 0.7302 | 0.6667 | 0.5455 | 0.6 | 0.5 | 0.6667 | 0.5714 |

### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
config.json
ADDED
@@ -0,0 +1,78 @@
{
  "_name_or_path": "nasa-impact/nasa-smd-ibm-st",
  "architectures": [
    "RobertaForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "classifier_dropout": null,
  "eos_token_id": 2,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0",
    "1": "LABEL_1",
    "2": "LABEL_2",
    "3": "LABEL_3",
    "4": "LABEL_4",
    "5": "LABEL_5",
    "6": "LABEL_6",
    "7": "LABEL_7",
    "8": "LABEL_8",
    "9": "LABEL_9",
    "10": "LABEL_10",
    "11": "LABEL_11",
    "12": "LABEL_12",
    "13": "LABEL_13",
    "14": "LABEL_14",
    "15": "LABEL_15",
    "16": "LABEL_16",
    "17": "LABEL_17",
    "18": "LABEL_18",
    "19": "LABEL_19",
    "20": "LABEL_20",
    "21": "LABEL_21",
    "22": "LABEL_22"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0,
    "LABEL_1": 1,
    "LABEL_10": 10,
    "LABEL_11": 11,
    "LABEL_12": 12,
    "LABEL_13": 13,
    "LABEL_14": 14,
    "LABEL_15": 15,
    "LABEL_16": 16,
    "LABEL_17": 17,
    "LABEL_18": 18,
    "LABEL_19": 19,
    "LABEL_2": 2,
    "LABEL_20": 20,
    "LABEL_21": 21,
    "LABEL_22": 22,
    "LABEL_3": 3,
    "LABEL_4": 4,
    "LABEL_5": 5,
    "LABEL_6": 6,
    "LABEL_7": 7,
    "LABEL_8": 8,
    "LABEL_9": 9
  },
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 514,
  "model_type": "roberta",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 1,
  "position_embedding_type": "absolute",
  "problem_type": "single_label_classification",
  "torch_dtype": "float32",
  "transformers_version": "4.38.2",
  "type_vocab_size": 1,
  "use_cache": true,
  "vocab_size": 50265
}
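The id2label/label2id tables above are auto-generated placeholders, so downstream code only sees `LABEL_N`. A small sketch, assuming the repo id from the commit message, of loading the config and optionally writing real class names back in once the training-time label order is known:

```python
# Hedged sketch: inspect the config and optionally attach readable class names.
# The repo id and the idea that the label order matches the metrics list are assumptions.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("igerasimov/nasa-impact-bert-e-base-mlm-finetuned")
print(config.model_type, config.num_labels)  # expected: roberta 23

# class_names = [...]  # the 23 names in training-time order, if known
# config.id2label = {i: name for i, name in enumerate(class_names)}
# config.label2id = {name: i for i, name in enumerate(class_names)}
```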
model.safetensors
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ca30f0077594be566569fa389ac7ed800940ae40a7b53d5ae99db8692dc979bb
size 498677420
training_args.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f331127a7e9dbbf4206c884e5c479ca2da19ec6757b0978d41d0e3f027115776
size 4920
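Both entries above are Git LFS pointer files; the actual binaries live on the Hub. A sketch, assuming the repo id from the commit message, of fetching and inspecting them:

```python
# Hedged sketch: download the LFS-backed artifacts and peek inside them.
import torch
from huggingface_hub import hf_hub_download
from safetensors.torch import load_file

repo_id = "igerasimov/nasa-impact-bert-e-base-mlm-finetuned"  # assumed repo id

weights_path = hf_hub_download(repo_id=repo_id, filename="model.safetensors")
state_dict = load_file(weights_path)      # plain dict of tensors (~499 MB file)
print(f"{len(state_dict)} tensors")

args_path = hf_hub_download(repo_id=repo_id, filename="training_args.bin")
training_args = torch.load(args_path)     # typically a pickled TrainingArguments; newer
print(training_args)                      # torch versions may need weights_only=False
```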