joheras committed
Commit fbb368d
1 Parent(s): 106a60f

update model card README.md

Files changed (1)
  1. README.md +35 -35
README.md CHANGED
@@ -17,11 +17,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/mbart-large-50](https://huggingface.co/facebook/mbart-large-50) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 3.2719
-- Rouge1: 47.3273
-- Rouge2: 29.559
-- Rougel: 42.4756
-- Rougelsum: 42.6072
+- Loss: 3.2121
+- Rouge1: 49.1001
+- Rouge2: 31.2516
+- Rougel: 44.0446
+- Rougelsum: 44.1075
 
 ## Model description
 
@@ -52,36 +52,36 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
 |:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
-| No log | 1.0 | 190 | 8.8777 | 0.0 | 0.0 | 0.0 | 0.0 |
-| No log | 2.0 | 380 | 1.8823 | 42.9631 | 25.6911 | 38.0391 | 38.1538 |
-| 4.9008 | 3.0 | 570 | 2.0013 | 35.1183 | 20.8995 | 31.4246 | 31.5244 |
-| 4.9008 | 4.0 | 760 | 1.8788 | 40.5309 | 24.083 | 36.0682 | 36.1536 |
-| 1.5674 | 5.0 | 950 | 1.9732 | 45.3455 | 26.8893 | 39.9257 | 40.0479 |
-| 1.5674 | 6.0 | 1140 | 2.1709 | 45.7451 | 27.9563 | 40.7943 | 40.9558 |
-| 1.5674 | 7.0 | 1330 | 2.3818 | 46.7204 | 28.2229 | 41.3356 | 41.4894 |
-| 0.4273 | 8.0 | 1520 | 2.4915 | 45.2646 | 27.0017 | 40.0391 | 40.0651 |
-| 0.4273 | 9.0 | 1710 | 2.6317 | 47.2481 | 29.1062 | 41.8523 | 42.0337 |
-| 0.2256 | 10.0 | 1900 | 2.7477 | 47.0734 | 28.6713 | 41.6488 | 41.7768 |
-| 0.2256 | 11.0 | 2090 | 2.7957 | 45.7955 | 27.9674 | 40.7893 | 40.9195 |
-| 0.2256 | 12.0 | 2280 | 2.8360 | 46.4354 | 28.6781 | 41.3965 | 41.5133 |
-| 0.1692 | 13.0 | 2470 | 2.8865 | 46.3897 | 28.1487 | 41.0427 | 41.1223 |
-| 0.1692 | 14.0 | 2660 | 2.8814 | 47.4505 | 29.5163 | 42.4362 | 42.5386 |
-| 0.1123 | 15.0 | 2850 | 2.9375 | 47.1197 | 29.315 | 42.1043 | 42.2586 |
-| 0.1123 | 16.0 | 3040 | 3.0171 | 47.6975 | 29.342 | 42.654 | 42.7698 |
-| 0.1123 | 17.0 | 3230 | 3.0271 | 47.1759 | 29.4071 | 42.337 | 42.4166 |
-| 0.0508 | 18.0 | 3420 | 3.1101 | 46.0849 | 28.1961 | 41.174 | 41.3203 |
-| 0.0508 | 19.0 | 3610 | 3.1070 | 46.6169 | 29.2838 | 41.8743 | 41.9666 |
-| 0.0268 | 20.0 | 3800 | 3.1305 | 46.8893 | 29.0878 | 41.9831 | 42.1053 |
-| 0.0268 | 21.0 | 3990 | 3.1861 | 47.1381 | 29.4249 | 42.0727 | 42.1941 |
-| 0.0268 | 22.0 | 4180 | 3.1936 | 47.2897 | 29.3799 | 42.1783 | 42.3624 |
-| 0.0146 | 23.0 | 4370 | 3.1948 | 47.1995 | 29.4603 | 42.0493 | 42.1786 |
-| 0.0146 | 24.0 | 4560 | 3.2059 | 46.8882 | 29.4045 | 42.1013 | 42.2324 |
-| 0.0098 | 25.0 | 4750 | 3.2399 | 47.4652 | 29.4997 | 42.4001 | 42.477 |
-| 0.0098 | 26.0 | 4940 | 3.2327 | 47.8052 | 30.0034 | 42.9265 | 43.0563 |
-| 0.0098 | 27.0 | 5130 | 3.2529 | 47.3934 | 29.7044 | 42.3686 | 42.4734 |
-| 0.0065 | 28.0 | 5320 | 3.2535 | 47.2493 | 29.2719 | 42.3357 | 42.4759 |
-| 0.0065 | 29.0 | 5510 | 3.2686 | 47.3168 | 29.4467 | 42.4364 | 42.5837 |
-| 0.0047 | 30.0 | 5700 | 3.2719 | 47.3273 | 29.559 | 42.4756 | 42.6072 |
+| No log | 1.0 | 190 | 1.8633 | 44.8593 | 28.0451 | 40.7724 | 40.8654 |
+| No log | 2.0 | 380 | 1.6667 | 46.8654 | 29.5857 | 42.6056 | 42.7844 |
+| 3.317 | 3.0 | 570 | 1.6847 | 48.1605 | 30.163 | 43.1965 | 43.3317 |
+| 3.317 | 4.0 | 760 | 1.7845 | 48.7615 | 30.8887 | 43.6946 | 43.8016 |
+| 0.7441 | 5.0 | 950 | 2.0090 | 48.4207 | 30.64 | 43.654 | 43.7979 |
+| 0.7441 | 6.0 | 1140 | 2.2425 | 49.1967 | 31.2644 | 44.0566 | 44.2112 |
+| 0.7441 | 7.0 | 1330 | 2.4520 | 47.0568 | 28.7501 | 41.8219 | 41.9605 |
+| 0.2396 | 8.0 | 1520 | 2.5336 | 47.969 | 30.0618 | 42.9924 | 43.1481 |
+| 0.2396 | 9.0 | 1710 | 2.6153 | 47.2037 | 28.9732 | 42.0939 | 42.2242 |
+| 0.1112 | 10.0 | 1900 | 2.7299 | 48.3657 | 30.3342 | 43.2025 | 43.3223 |
+| 0.1112 | 11.0 | 2090 | 2.7696 | 48.0929 | 30.0156 | 42.9385 | 43.026 |
+| 0.1112 | 12.0 | 2280 | 2.8627 | 48.1979 | 30.2714 | 43.0959 | 43.2027 |
+| 0.0938 | 13.0 | 2470 | 2.8788 | 47.7685 | 29.5733 | 42.7561 | 42.9112 |
+| 0.0938 | 14.0 | 2660 | 2.9128 | 47.5374 | 29.8217 | 42.7097 | 42.7803 |
+| 0.0394 | 15.0 | 2850 | 2.9470 | 48.6385 | 30.1425 | 43.3326 | 43.3963 |
+| 0.0394 | 16.0 | 3040 | 3.0039 | 48.6657 | 30.6642 | 43.471 | 43.592 |
+| 0.0394 | 17.0 | 3230 | 3.0380 | 48.2351 | 30.5653 | 43.257 | 43.3788 |
+| 0.023 | 18.0 | 3420 | 3.0289 | 48.6593 | 30.6916 | 43.7861 | 43.9098 |
+| 0.023 | 19.0 | 3610 | 3.0733 | 49.2114 | 31.2737 | 44.0852 | 44.1993 |
+| 0.0122 | 20.0 | 3800 | 3.1089 | 48.5431 | 30.5305 | 43.4128 | 43.5288 |
+| 0.0122 | 21.0 | 3990 | 3.0684 | 48.4197 | 30.4005 | 43.2305 | 43.3214 |
+| 0.0122 | 22.0 | 4180 | 3.1252 | 48.6007 | 30.5594 | 43.4008 | 43.5336 |
+| 0.0071 | 23.0 | 4370 | 3.1572 | 48.7297 | 30.7028 | 43.436 | 43.5106 |
+| 0.0071 | 24.0 | 4560 | 3.1716 | 48.9335 | 30.9918 | 43.7764 | 43.8044 |
+| 0.0041 | 25.0 | 4750 | 3.1687 | 48.8731 | 31.1055 | 43.8021 | 43.8987 |
+| 0.0041 | 26.0 | 4940 | 3.1845 | 48.9432 | 31.0766 | 43.8628 | 43.9726 |
+| 0.0041 | 27.0 | 5130 | 3.2133 | 49.2016 | 31.1265 | 44.052 | 44.1427 |
+| 0.0025 | 28.0 | 5320 | 3.2146 | 49.1473 | 31.3109 | 44.0372 | 44.1189 |
+| 0.0025 | 29.0 | 5510 | 3.2121 | 49.2815 | 31.4258 | 44.1661 | 44.2436 |
+| 0.0019 | 30.0 | 5700 | 3.2121 | 49.1001 | 31.2516 | 44.0446 | 44.1075 |
 
 
 ### Framework versions
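
For context, a minimal sketch of how a fine-tuned mbart-large-50 checkpoint like the one described in this card could be loaded for summarization with the transformers library. The repo id and the source-language code are placeholders, since the card does not state the model's repo name, dataset, or language.

```python
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

# Hypothetical repo id; the actual fine-tuned checkpoint name is not given in the diff.
checkpoint = "your-username/your-finetuned-mbart-large-50"

tokenizer = MBart50TokenizerFast.from_pretrained(checkpoint)
model = MBartForConditionalGeneration.from_pretrained(checkpoint)

# mBART-50 expects a source-language code; "es_XX" is an assumption, not from the card.
tokenizer.src_lang = "es_XX"

text = "Input document to summarize ..."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, num_beams=4, max_length=128)
print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0])
```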
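The Rouge1/Rouge2/Rougel/Rougelsum columns in the updated results table are plausibly the scores produced by the Hugging Face `evaluate` library, scaled to percentages; that scaling is an assumption, and `preds`/`refs` below are hypothetical lists of generated and reference summaries.

```python
import evaluate

rouge = evaluate.load("rouge")

preds = ["the model generated summary"]   # hypothetical model outputs
refs = ["the reference summary"]          # hypothetical reference summaries

scores = rouge.compute(predictions=preds, references=refs, use_stemmer=True)
# `evaluate` returns fractions in [0, 1]; multiply by 100 to match the card's scale.
print({k: round(v * 100, 4) for k, v in scores.items()})
```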