BenjiELCA committed 4a9a568 (parent: dcdef80): Update README.md

README.md (+91 -27)

The dataset contains 15 target labels:

* `dataAssociation`
* `messageFlow`

## Results per type

It achieves the following results on the evaluation set with objects:
- Labels precision: 0.97
- Precision: 0.97
- Recall: 0.95
- F1: 0.96

It achieves the following results on the evaluation set with arrows:
- Labels precision: 0.98
- Precision: 0.92
- Recall: 0.93
- F1: 0.92
- Keypoints accuracy: 0.71

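For reference, F1 is the harmonic mean of precision and recall, and the per-type figures above are consistent with that identity. A quick sanity check in plain Python (not part of the training code):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# The reported per-type F1 values follow from the reported precision/recall:
print(round(f1(0.97, 0.95), 2))  # objects -> 0.96
print(round(f1(0.92, 0.93), 2))  # arrows  -> 0.92
```
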
## Results per class

| Class             | Precision | Recall | F1     |
|:-----------------:|:---------:|:------:|:------:|
| background        | 0.0000    | 0.0000 | 0.0000 |
| sequenceFlow      | 0.9292    | 0.9605 | 0.9446 |
| dataAssociation   | 0.8472    | 0.8095 | 0.8279 |
| messageFlow       | 0.8589    | 0.7910 | 0.8235 |
| task              | 0.9518    | 0.9875 | 0.9693 |
| exclusiveGateway  | 0.9548    | 0.9427 | 0.9487 |
| event             | 0.9515    | 0.9235 | 0.9373 |
| parallelGateway   | 0.9333    | 0.9180 | 0.9256 |
| messageEvent      | 0.9291    | 0.9365 | 0.9328 |
| pool              | 0.8797    | 0.9360 | 0.9070 |
| lane              | 0.9178    | 0.6700 | 0.7746 |
| dataObject        | 0.9333    | 0.9565 | 0.9448 |
| dataStore         | 1.0000    | 0.6400 | 0.7805 |
| subProcess        | 1.0000    | 0.1429 | 0.2500 |
| eventBasedGateway | 0.7273    | 0.7273 | 0.7273 |
| timerEvent        | 0.8571    | 0.7500 | 0.8000 |
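Per-class figures of this kind are derived from matched prediction/ground-truth pairs. A generic sketch of the computation (illustrative Python with a hypothetical `per_class_scores` helper and toy labels, not the evaluation script used here):

```python
def per_class_scores(y_true, y_pred, cls):
    """Precision, recall and F1 for one class from paired label lists."""
    tp = sum(t == cls and p == cls for t, p in zip(y_true, y_pred))
    fp = sum(t != cls and p == cls for t, p in zip(y_true, y_pred))
    fn = sum(t == cls and p != cls for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy example: 3 true "task" instances, 2 detected correctly, 1 spurious detection.
y_true = ["task", "task", "event", "task", "lane"]
y_pred = ["task", "event", "event", "task", "task"]
p, r, f = per_class_scores(y_true, y_pred, "task")
print(round(p, 4), round(r, 4), round(f, 4))  # 0.6667 0.6667 0.6667
```
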
 
 
 
 
 
 
 
## Model description

 
### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0176
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
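A linear scheduler decays the learning rate from its initial value toward zero over the course of training. A minimal sketch of that schedule in plain Python, assuming decay over `num_epochs × steps_per_epoch` steps with no warmup (the per-epoch step count here is illustrative):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 0.0176) -> float:
    """Learning rate after `step` optimizer steps of linear decay to zero."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

total = 50 * 10  # num_epochs * steps_per_epoch (step count is an assumption)
print(linear_lr(0, total))           # full base rate at the start of training
print(linear_lr(total // 2, total))  # half the base rate midway
print(linear_lr(total, total))       # decayed to zero at the end
```
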
 
### Training results

| Epoch | Avg Loss | Classifier Loss | Box Reg Loss | Objectness Loss | RPN Box Reg Loss | Keypoints Loss | Precision | Recall | F1 Score | Accuracy |
|:-----:|:--------:|:---------------:|:------------:|:---------------:|:----------------:|:--------------:|:---------:|:------:|:--------:|:--------:|
| 1     | 3.9451   | 2.4416          | 0.5426       | 0.6502          | 0.3107           | 0.0            | 0.2763    | 0.0393 | 0.0689   | 2.0591   |
| 2     | 2.7259   | 1.6724          | 0.6697       | 0.1868          | 0.1969           | 0.0            | 0.5754    | 0.3358 | 0.4241   | 1.5387   |
| 3     | 2.2004   | 1.3860          | 0.5330       | 0.1216          | 0.1598           | 0.0            | 0.8657    | 0.6841 | 0.7643   | 1.1307   |
| 4     | 1.8611   | 1.1775          | 0.4172       | 0.1099          | 0.1565           | 0.0            | 0.7708    | 0.7790 | 0.7749   | 1.0110   |
| 5     | 1.7461   | 1.1202          | 0.3820       | 0.0971          | 0.1468           | 0.0            | 0.8542    | 0.8046 | 0.8287   | 0.9593   |
| 6     | 1.5859   | 0.9986          | 0.3590       | 0.0872          | 0.1412           | 0.0            | 0.8884    | 0.8002 | 0.8420   | 0.8956   |
| 7     | 1.5621   | 1.0214          | 0.3351       | 0.0776          | 0.1280           | 0.0            | 0.9435    | 0.8034 | 0.8678   | 0.9073   |
| 8     | 1.5194   | 0.9881          | 0.3261       | 0.0738          | 0.1314           | 0.0            | 0.9048    | 0.8246 | 0.8628   | 0.8695   |
| 9     | 1.5449   | 1.0105          | 0.3229       | 0.0769          | 0.1346           | 0.0            | 0.9478    | 0.8046 | 0.8704   | 0.9014   |
| 10    | 1.5805   | 1.0333          | 0.3338       | 0.0703          | 0.1431           | 0.0            | 0.8920    | 0.8920 | 0.8920   | 0.8134   |
| 11    | 1.3838   | 0.8743          | 0.3065       | 0.0653          | 0.1376           | 0.0            | 0.9634    | 0.8371 | 0.8958   | 0.8097   |
| 12    | 1.3582   | 0.8751          | 0.2909       | 0.0617          | 0.1306           | 0.0            | 0.9457    | 0.8596 | 0.9006   | 0.7362   |
| 13    | 1.3126   | 0.8347          | 0.2921       | 0.0593          | 0.1264           | 0.0            | 0.9152    | 0.9295 | 0.9223   | 0.7149   |
| 14    | 1.3532   | 0.9079          | 0.2783       | 0.0543          | 0.1128           | 0.0            | 0.9639    | 0.8508 | 0.9038   | 0.7775   |
| 15    | 1.3188   | 0.8986          | 0.2720       | 0.0434          | 0.1048           | 0.0            | 0.8856    | 0.9419 | 0.9129   | 0.6738   |
| 16    | 1.2512   | 0.7840          | 0.2784       | 0.0621          | 0.1268           | 0.0            | 0.9181    | 0.9101 | 0.9141   | 0.7478   |
| 17    | 1.2909   | 0.8425          | 0.2778       | 0.0547          | 0.1159           | 0.0            | 0.9012    | 0.9282 | 0.9145   | 0.6556   |
| 18    | 1.2526   | 0.8442          | 0.2607       | 0.0443          | 0.1034           | 0.0            | 0.9169    | 0.9020 | 0.9094   | 0.7003   |
| 19    | 1.1980   | 0.8062          | 0.2528       | 0.0361          | 0.1029           | 0.0            | 0.9520    | 0.9157 | 0.9335   | 0.7136   |
| 20    | 1.1821   | 0.7895          | 0.2517       | 0.0378          | 0.1030           | 0.0            | 0.9023    | 0.9513 | 0.9262   | 0.6308   |
| 21    | 1.0843   | 0.7168          | 0.2402       | 0.0316          | 0.0957           | 0.0            | 0.9348    | 0.9032 | 0.9187   | 0.6883   |
| 22    | 1.1058   | 0.7367          | 0.2336       | 0.0374          | 0.0981           | 0.0            | 0.9321    | 0.9513 | 0.9416   | 0.6192   |
| 23    | 1.0699   | 0.7119          | 0.2340       | 0.0306          | 0.0935           | 0.0            | 0.9353    | 0.9476 | 0.9414   | 0.5962   |
| 24    | 1.0616   | 0.7031          | 0.2367       | 0.0311          | 0.0908           | 0.0            | 0.9418    | 0.9301 | 0.9359   | 0.6674   |
| 25    | 1.0784   | 0.7275          | 0.2311       | 0.0295          | 0.0904           | 0.0            | 0.9176    | 0.9320 | 0.9247   | 0.6158   |
| 26    | 1.0618   | 0.7121          | 0.2283       | 0.0297          | 0.0916           | 0.0            | 0.9411    | 0.9182 | 0.9295   | 0.6483   |
| 27    | 1.0530   | 0.7139          | 0.2236       | 0.0279          | 0.0876           | 0.0            | 0.9477    | 0.9395 | 0.9436   | 0.5958   |
| 28    | 1.0452   | 0.7097          | 0.2223       | 0.0283          | 0.0849           | 0.0            | 0.9465    | 0.9494 | 0.9480   | 0.5964   |
| 29    | 1.0966   | 0.7795          | 0.2176       | 0.0203          | 0.0792           | 0.0            | 0.9558    | 0.9320 | 0.9437   | 0.6288   |
| 30    | 1.0506   | 0.7312          | 0.2142       | 0.0195          | 0.0856           | 0.0            | 0.9370    | 0.9370 | 0.9370   | 0.5956   |
| 31    | 1.0030   | 0.6777          | 0.2163       | 0.0204          | 0.0886           | 0.0            | 0.9506    | 0.9251 | 0.9377   | 0.6099   |
| 32    | 0.9748   | 0.6610          | 0.2098       | 0.0201          | 0.0839           | 0.0            | 0.9527    | 0.9313 | 0.9419   | 0.5976   |
| 33    | 0.9540   | 0.6402          | 0.2059       | 0.0216          | 0.0863           | 0.0            | 0.9536    | 0.9238 | 0.9385   | 0.5907   |
| 34    | 0.9730   | 0.6500          | 0.2076       | 0.0281          | 0.0873           | 0.0            | 0.9407    | 0.9413 | 0.9410   | 0.5809   |
| 35    | 0.9894   | 0.6831          | 0.2066       | 0.0202          | 0.0794           | 0.0            | 0.9451    | 0.9345 | 0.9397   | 0.5837   |
| 36    | 0.9042   | 0.5873          | 0.2096       | 0.0214          | 0.0860           | 0.0            | 0.9460    | 0.9519 | 0.9490   | 0.5534   |
| 37    | 0.9546   | 0.6400          | 0.2112       | 0.0216          | 0.0818           | 0.0            | 0.9260    | 0.9457 | 0.9358   | 0.5562   |
| 38    | 0.9806   | 0.6800          | 0.2031       | 0.0175          | 0.0800           | 0.0            | 0.9476    | 0.9363 | 0.9419   | 0.5792   |
| 39    | 0.9294   | 0.6247          | 0.2016       | 0.0204          | 0.0826           | 0.0            | 0.9401    | 0.9501 | 0.9450   | 0.5703   |
| 40    | 0.9786   | 0.6733          | 0.2010       | 0.0268          | 0.0775           | 0.0            | 0.9375    | 0.9170 | 0.9271   | 0.5880   |
| 41    | 1.0026   | 0.7073          | 0.2033       | 0.0179          | 0.0742           | 0.0            | 0.9476    | 0.9251 | 0.9362   | 0.5875   |
| 42    | 0.9567   | 0.6677          | 0.1992       | 0.0164          | 0.0734           | 0.0            | 0.9468    | 0.9332 | 0.9400   | 0.5724   |
| 43    | 0.8747   | 0.5794          | 0.1980       | 0.0159          | 0.0814           | 0.0            | 0.9557    | 0.9432 | 0.9494   | 0.5709   |
| 44    | 1.0310   | 0.7392          | 0.1956       | 0.0254          | 0.0709           | 0.0            | 0.9589    | 0.9313 | 0.9449   | 0.5497   |
| 45    | 0.9526   | 0.6598          | 0.1982       | 0.0185          | 0.0762           | 0.0            | 0.9401    | 0.9413 | 0.9407   | 0.5580   |
| 46    | 0.8753   | 0.5940          | 0.1939       | 0.0176          | 0.0698           | 0.0            | 0.9468    | 0.9438 | 0.9453   | 0.5548   |
| 47    | 0.9328   | 0.6493          | 0.1953       | 0.0163          | 0.0720           | 0.0            | 0.9534    | 0.9320 | 0.9426   | 0.5735   |
| 48    | 0.9019   | 0.6071          | 0.2002       | 0.0182          | 0.0765           | 0.0            | 0.9496    | 0.9413 | 0.9455   | 0.5605   |
| 49    | 0.8335   | 0.5459          | 0.1918       | 0.0175          | 0.0783           | 0.0            | 0.9588    | 0.9307 | 0.9446   | 0.5637   |
| 50    | 0.9043   | 0.6179          | 0.1933       | 0.0154          | 0.0776           | 0.0            | 0.9597    | 0.9370 | 0.9482   | 0.5617   |
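If a checkpoint were selected by validation F1, the table above favors epoch 43. A small sketch of that selection in plain Python (the list copies the F1 Score column verbatim):

```python
# Validation F1 per epoch, transcribed from the training results table above.
f1_by_epoch = [
    0.0689, 0.4241, 0.7643, 0.7749, 0.8287, 0.8420, 0.8678, 0.8628, 0.8704, 0.8920,
    0.8958, 0.9006, 0.9223, 0.9038, 0.9129, 0.9141, 0.9145, 0.9094, 0.9335, 0.9262,
    0.9187, 0.9416, 0.9414, 0.9359, 0.9247, 0.9295, 0.9436, 0.9480, 0.9437, 0.9370,
    0.9377, 0.9419, 0.9385, 0.9410, 0.9397, 0.9490, 0.9358, 0.9419, 0.9450, 0.9271,
    0.9362, 0.9400, 0.9494, 0.9449, 0.9407, 0.9453, 0.9426, 0.9455, 0.9446, 0.9482,
]

# Pick the epoch (1-indexed) with the highest validation F1.
best_epoch = max(range(1, len(f1_by_epoch) + 1), key=lambda e: f1_by_epoch[e - 1])
print(best_epoch, f1_by_epoch[best_epoch - 1])  # 43 0.9494
```
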