
t5-base-finetuned-stocknews_2000_150

This model is a fine-tuned version of google-t5/t5-base. The training data is not documented in this card, though the repository name suggests a stock-news summarization dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the list):

  • Loss: 0.5246
  • Rouge1: 41.1174
  • Rouge2: 36.4917
  • RougeL: 40.2739
  • RougeLsum: 40.5043
  • Gen Len (mean generated length, in tokens): 19.0
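
Since the preprocessing and prompt format are not documented here, the following is a minimal inference sketch under standard T5 assumptions: the "summarize:" task prefix and a generation cap near the observed Gen Len of 19 tokens. Neither is confirmed by this card; adjust as needed.

```python
# Minimal inference sketch (assumptions: the standard T5 "summarize:" prefix
# and a ~19-token summary length, neither of which this card confirms).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "sujayC66/t5-base-finetuned-stocknews_2000_150"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "summarize: " + "Your stock-news article text here."
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=20, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```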

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto the Trainer API follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
  • mixed_precision_training: Native AMP
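
These settings correspond to the Hugging Face Seq2SeqTrainer. The sketch below is a hypothetical reconstruction, since the actual training script is not included in this card; the per-epoch evaluation strategy is inferred from the results table that follows.

```python
# Hypothetical reconstruction of the training configuration above; the real
# script is not part of this card. Optimizer betas/epsilon match the Trainer
# defaults, so they are not set explicitly.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-finetuned-stocknews_2000_150",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,                     # "Native AMP" mixed precision
    evaluation_strategy="epoch",   # inferred from the per-epoch table below
    predict_with_generate=True,    # generate summaries so ROUGE can be scored
)
```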

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log | 1.0 | 211 | 0.4220 | 37.4081 | 29.7287 | 35.6792 | 36.0611 | 19.0 |
| No log | 2.0 | 422 | 0.4020 | 37.6979 | 30.5377 | 36.0747 | 36.4168 | 19.0 |
| 0.3832 | 3.0 | 633 | 0.3947 | 38.258 | 31.0862 | 36.5414 | 37.0213 | 19.0 |
| 0.3832 | 4.0 | 844 | 0.3850 | 38.4834 | 31.3747 | 36.8077 | 37.2317 | 19.0 |
| 0.2939 | 5.0 | 1055 | 0.3765 | 38.8131 | 32.3372 | 37.3919 | 37.7305 | 19.0 |
| 0.2939 | 6.0 | 1266 | 0.3762 | 39.1749 | 33.0152 | 37.6824 | 38.0201 | 19.0 |
| 0.2939 | 7.0 | 1477 | 0.3569 | 39.2336 | 32.9984 | 37.8439 | 38.1723 | 19.0 |
| 0.2511 | 8.0 | 1688 | 0.3551 | 39.452 | 33.6999 | 38.3731 | 38.5895 | 19.0 |
| 0.2511 | 9.0 | 1899 | 0.3523 | 39.8924 | 34.2746 | 38.6913 | 38.9944 | 19.0 |
| 0.2532 | 10.0 | 2110 | 0.3487 | 39.9155 | 34.2762 | 38.8052 | 39.077 | 19.0 |
| 0.2532 | 11.0 | 2321 | 0.3533 | 39.7805 | 34.2195 | 38.6591 | 38.9007 | 19.0 |
| 0.2158 | 12.0 | 2532 | 0.3529 | 39.6286 | 34.2772 | 38.5553 | 38.8225 | 19.0 |
| 0.2158 | 13.0 | 2743 | 0.3506 | 40.1899 | 35.0527 | 39.2227 | 39.4969 | 19.0 |
| 0.2158 | 14.0 | 2954 | 0.3474 | 40.666 | 35.5759 | 39.6311 | 39.9267 | 19.0 |
| 0.1882 | 15.0 | 3165 | 0.3488 | 40.4267 | 35.2551 | 39.2486 | 39.5608 | 19.0 |
| 0.1882 | 16.0 | 3376 | 0.3547 | 40.6478 | 35.5519 | 39.6034 | 39.8449 | 19.0 |
| 0.1612 | 17.0 | 3587 | 0.3616 | 40.7061 | 35.8348 | 39.8034 | 40.0508 | 19.0 |
| 0.1612 | 18.0 | 3798 | 0.3621 | 40.7052 | 35.8514 | 39.7689 | 40.0123 | 19.0 |
| 0.1434 | 19.0 | 4009 | 0.3632 | 40.5196 | 35.649 | 39.5977 | 39.8099 | 19.0 |
| 0.1434 | 20.0 | 4220 | 0.3667 | 40.8356 | 35.9832 | 39.9295 | 40.1647 | 19.0 |
| 0.1434 | 21.0 | 4431 | 0.3711 | 40.75 | 35.7893 | 39.7533 | 40.0671 | 19.0 |
| 0.1248 | 22.0 | 4642 | 0.3714 | 40.6404 | 35.8139 | 39.6508 | 39.9206 | 19.0 |
| 0.1248 | 23.0 | 4853 | 0.3720 | 40.596 | 35.7999 | 39.7515 | 39.9484 | 19.0 |
| 0.1097 | 24.0 | 5064 | 0.3766 | 40.6635 | 35.8029 | 39.8031 | 40.023 | 19.0 |
| 0.1097 | 25.0 | 5275 | 0.3841 | 40.6312 | 35.7811 | 39.7593 | 40.0159 | 19.0 |
| 0.1097 | 26.0 | 5486 | 0.3874 | 40.6912 | 35.85 | 39.7479 | 40.0379 | 19.0 |
| 0.0994 | 27.0 | 5697 | 0.3840 | 40.7263 | 35.9777 | 39.8711 | 40.1549 | 19.0 |
| 0.0994 | 28.0 | 5908 | 0.3935 | 40.7512 | 35.8443 | 39.7654 | 40.052 | 19.0 |
| 0.0877 | 29.0 | 6119 | 0.3942 | 40.801 | 35.9741 | 39.8594 | 40.0986 | 19.0 |
| 0.0877 | 30.0 | 6330 | 0.3977 | 40.9239 | 36.1363 | 40.0563 | 40.319 | 19.0 |
| 0.0786 | 31.0 | 6541 | 0.4009 | 40.8977 | 36.1534 | 40.0016 | 40.2385 | 19.0 |
| 0.0786 | 32.0 | 6752 | 0.3996 | 40.7816 | 36.1552 | 39.9214 | 40.1717 | 19.0 |
| 0.0786 | 33.0 | 6963 | 0.4023 | 40.9965 | 36.3464 | 40.1217 | 40.3481 | 19.0 |
| 0.0723 | 34.0 | 7174 | 0.4086 | 40.8352 | 36.1049 | 39.8852 | 40.142 | 19.0 |
| 0.0723 | 35.0 | 7385 | 0.4048 | 40.9399 | 36.2465 | 40.0545 | 40.3178 | 19.0 |
| 0.0654 | 36.0 | 7596 | 0.4097 | 40.9975 | 36.2784 | 40.0802 | 40.3726 | 19.0 |
| 0.0654 | 37.0 | 7807 | 0.4117 | 40.851 | 36.1677 | 40.0313 | 40.3027 | 19.0 |
| 0.0592 | 38.0 | 8018 | 0.4164 | 40.9427 | 36.2783 | 40.1323 | 40.4087 | 19.0 |
| 0.0592 | 39.0 | 8229 | 0.4187 | 40.6632 | 36.0088 | 39.8049 | 40.0361 | 19.0 |
| 0.0592 | 40.0 | 8440 | 0.4188 | 41.008 | 36.3243 | 40.1924 | 40.466 | 19.0 |
| 0.0557 | 41.0 | 8651 | 0.4244 | 40.887 | 36.2373 | 40.0544 | 40.3017 | 19.0 |
| 0.0557 | 42.0 | 8862 | 0.4219 | 40.8024 | 36.1323 | 39.9768 | 40.2685 | 19.0 |
| 0.0516 | 43.0 | 9073 | 0.4234 | 40.7758 | 36.1291 | 39.9284 | 40.1658 | 19.0 |
| 0.0516 | 44.0 | 9284 | 0.4268 | 40.8067 | 36.1192 | 39.9735 | 40.212 | 19.0 |
| 0.0516 | 45.0 | 9495 | 0.4229 | 40.8445 | 36.0577 | 39.9435 | 40.1851 | 19.0 |
| 0.0473 | 46.0 | 9706 | 0.4343 | 40.7118 | 36.1068 | 39.9453 | 40.1875 | 19.0 |
| 0.0473 | 47.0 | 9917 | 0.4311 | 40.7688 | 36.0953 | 39.9612 | 40.1921 | 19.0 |
| 0.0438 | 48.0 | 10128 | 0.4376 | 40.9327 | 36.2236 | 40.0164 | 40.2675 | 19.0 |
| 0.0438 | 49.0 | 10339 | 0.4360 | 41.0039 | 36.3548 | 40.0958 | 40.3716 | 19.0 |
| 0.0408 | 50.0 | 10550 | 0.4418 | 40.9386 | 36.3116 | 40.0052 | 40.2586 | 19.0 |
| 0.0408 | 51.0 | 10761 | 0.4436 | 41.0744 | 36.421 | 40.1518 | 40.4014 | 19.0 |
| 0.0408 | 52.0 | 10972 | 0.4427 | 41.1198 | 36.4495 | 40.2116 | 40.4505 | 19.0 |
| 0.0382 | 53.0 | 11183 | 0.4428 | 41.0544 | 36.4075 | 40.1852 | 40.4269 | 19.0 |
| 0.0382 | 54.0 | 11394 | 0.4468 | 41.0366 | 36.3513 | 40.1403 | 40.361 | 19.0 |
| 0.0354 | 55.0 | 11605 | 0.4463 | 40.9558 | 36.3748 | 40.1348 | 40.3447 | 19.0 |
| 0.0354 | 56.0 | 11816 | 0.4508 | 40.8857 | 36.3143 | 40.0455 | 40.2318 | 19.0 |
| 0.0338 | 57.0 | 12027 | 0.4544 | 40.8272 | 36.244 | 40.0023 | 40.2384 | 19.0 |
| 0.0338 | 58.0 | 12238 | 0.4555 | 40.9537 | 36.1908 | 40.0228 | 40.2483 | 19.0 |
| 0.0338 | 59.0 | 12449 | 0.4521 | 40.9258 | 36.1708 | 40.0611 | 40.3071 | 19.0 |
| 0.031 | 60.0 | 12660 | 0.4555 | 40.8837 | 36.147 | 40.0305 | 40.2382 | 19.0 |
| 0.031 | 61.0 | 12871 | 0.4566 | 40.9297 | 36.2576 | 40.09 | 40.2747 | 19.0 |
| 0.0307 | 62.0 | 13082 | 0.4562 | 40.8585 | 36.2582 | 40.0722 | 40.25 | 19.0 |
| 0.0307 | 63.0 | 13293 | 0.4592 | 40.9201 | 36.2751 | 40.0861 | 40.3269 | 19.0 |
| 0.0281 | 64.0 | 13504 | 0.4567 | 40.9232 | 36.2481 | 40.0753 | 40.3216 | 19.0 |
| 0.0281 | 65.0 | 13715 | 0.4606 | 41.0077 | 36.3489 | 40.1395 | 40.3744 | 19.0 |
| 0.0281 | 66.0 | 13926 | 0.4649 | 41.0042 | 36.5452 | 40.2019 | 40.4466 | 19.0 |
| 0.0263 | 67.0 | 14137 | 0.4674 | 40.9152 | 36.4575 | 40.2074 | 40.4128 | 19.0 |
| 0.0263 | 68.0 | 14348 | 0.4638 | 40.9942 | 36.4242 | 40.2192 | 40.4164 | 19.0 |
| 0.0258 | 69.0 | 14559 | 0.4652 | 41.0026 | 36.3871 | 40.1336 | 40.3569 | 19.0 |
| 0.0258 | 70.0 | 14770 | 0.4683 | 40.9275 | 36.4236 | 40.0798 | 40.3247 | 19.0 |
| 0.0258 | 71.0 | 14981 | 0.4729 | 40.9299 | 36.2989 | 40.1179 | 40.3533 | 19.0 |
| 0.0245 | 72.0 | 15192 | 0.4713 | 40.8745 | 36.2617 | 40.0829 | 40.3073 | 19.0 |
| 0.0245 | 73.0 | 15403 | 0.4720 | 40.9534 | 36.4602 | 40.1804 | 40.4279 | 19.0 |
| 0.0231 | 74.0 | 15614 | 0.4762 | 41.055 | 36.552 | 40.2672 | 40.5027 | 19.0 |
| 0.0231 | 75.0 | 15825 | 0.4776 | 40.939 | 36.492 | 40.1735 | 40.3718 | 19.0 |
| 0.0219 | 76.0 | 16036 | 0.4814 | 41.0543 | 36.6498 | 40.3146 | 40.5381 | 19.0 |
| 0.0219 | 77.0 | 16247 | 0.4826 | 41.0015 | 36.5925 | 40.2389 | 40.4813 | 19.0 |
| 0.0219 | 78.0 | 16458 | 0.4840 | 41.0486 | 36.6352 | 40.3106 | 40.5603 | 19.0 |
| 0.0213 | 79.0 | 16669 | 0.4848 | 40.9784 | 36.4886 | 40.1903 | 40.439 | 19.0 |
| 0.0213 | 80.0 | 16880 | 0.4910 | 41.175 | 36.6854 | 40.3474 | 40.5917 | 19.0 |
| 0.0204 | 81.0 | 17091 | 0.4843 | 41.0851 | 36.5354 | 40.3005 | 40.5392 | 19.0 |
| 0.0204 | 82.0 | 17302 | 0.4847 | 41.2714 | 36.6856 | 40.4516 | 40.672 | 19.0 |
| 0.0196 | 83.0 | 17513 | 0.4860 | 40.9692 | 36.3916 | 40.1273 | 40.3602 | 19.0 |
| 0.0196 | 84.0 | 17724 | 0.4870 | 40.9497 | 36.3933 | 40.1057 | 40.3926 | 19.0 |
| 0.0196 | 85.0 | 17935 | 0.4827 | 41.0823 | 36.5005 | 40.2376 | 40.4651 | 19.0 |
| 0.019 | 86.0 | 18146 | 0.4889 | 41.1902 | 36.6614 | 40.3848 | 40.6069 | 19.0 |
| 0.019 | 87.0 | 18357 | 0.4890 | 41.186 | 36.6136 | 40.4576 | 40.6462 | 19.0 |
| 0.0179 | 88.0 | 18568 | 0.4940 | 41.1593 | 36.5153 | 40.377 | 40.5727 | 19.0 |
| 0.0179 | 89.0 | 18779 | 0.4908 | 40.9712 | 36.43 | 40.1811 | 40.3797 | 19.0 |
| 0.0179 | 90.0 | 18990 | 0.4914 | 41.0358 | 36.4656 | 40.1936 | 40.4449 | 19.0 |
| 0.0176 | 91.0 | 19201 | 0.4924 | 40.8918 | 36.3329 | 40.0398 | 40.2895 | 19.0 |
| 0.0176 | 92.0 | 19412 | 0.4913 | 41.0889 | 36.3829 | 40.213 | 40.4163 | 19.0 |
| 0.0168 | 93.0 | 19623 | 0.4939 | 41.048 | 36.407 | 40.1863 | 40.4131 | 19.0 |
| 0.0168 | 94.0 | 19834 | 0.4996 | 41.0211 | 36.3687 | 40.1492 | 40.3375 | 19.0 |
| 0.016 | 95.0 | 20045 | 0.5000 | 40.8562 | 36.2496 | 39.9959 | 40.2259 | 19.0 |
| 0.016 | 96.0 | 20256 | 0.4989 | 41.0123 | 36.3468 | 40.1217 | 40.3407 | 19.0 |
| 0.016 | 97.0 | 20467 | 0.5004 | 41.0992 | 36.4577 | 40.1794 | 40.4175 | 19.0 |
| 0.0163 | 98.0 | 20678 | 0.5009 | 41.0319 | 36.3625 | 40.1331 | 40.3442 | 19.0 |
| 0.0163 | 99.0 | 20889 | 0.4978 | 40.8888 | 36.238 | 40.0311 | 40.2348 | 19.0 |
| 0.0154 | 100.0 | 21100 | 0.5059 | 40.9034 | 36.2802 | 40.033 | 40.2534 | 19.0 |
| 0.0154 | 101.0 | 21311 | 0.5026 | 41.0808 | 36.4192 | 40.211 | 40.4242 | 19.0 |
| 0.0148 | 102.0 | 21522 | 0.5043 | 41.1898 | 36.4732 | 40.3336 | 40.5495 | 19.0 |
| 0.0148 | 103.0 | 21733 | 0.5062 | 41.216 | 36.6109 | 40.408 | 40.6201 | 19.0 |
| 0.0148 | 104.0 | 21944 | 0.5076 | 40.9136 | 36.2326 | 40.043 | 40.274 | 19.0 |
| 0.0142 | 105.0 | 22155 | 0.5085 | 41.1476 | 36.5099 | 40.3444 | 40.5131 | 19.0 |
| 0.0142 | 106.0 | 22366 | 0.5087 | 41.1 | 36.4271 | 40.2888 | 40.4809 | 19.0 |
| 0.0137 | 107.0 | 22577 | 0.5083 | 40.8868 | 36.2128 | 40.0356 | 40.2519 | 19.0 |
| 0.0137 | 108.0 | 22788 | 0.5097 | 41.0436 | 36.4065 | 40.2004 | 40.4431 | 19.0 |
| 0.0137 | 109.0 | 22999 | 0.5113 | 41.1789 | 36.617 | 40.3938 | 40.5925 | 19.0 |
| 0.0137 | 110.0 | 23210 | 0.5127 | 40.989 | 36.3659 | 40.1097 | 40.3074 | 19.0 |
| 0.0137 | 111.0 | 23421 | 0.5144 | 41.0157 | 36.3607 | 40.1239 | 40.3237 | 19.0 |
| 0.0132 | 112.0 | 23632 | 0.5153 | 40.9412 | 36.3165 | 40.0601 | 40.283 | 19.0 |
| 0.0132 | 113.0 | 23843 | 0.5127 | 41.011 | 36.3343 | 40.1059 | 40.3317 | 19.0 |
| 0.0138 | 114.0 | 24054 | 0.5174 | 40.9507 | 36.3226 | 40.0426 | 40.2821 | 19.0 |
| 0.0138 | 115.0 | 24265 | 0.5172 | 40.9169 | 36.2471 | 40.0189 | 40.2581 | 19.0 |
| 0.0138 | 116.0 | 24476 | 0.5191 | 40.9621 | 36.2937 | 40.0859 | 40.2872 | 19.0 |
| 0.0129 | 117.0 | 24687 | 0.5164 | 40.9124 | 36.2428 | 40.0247 | 40.2636 | 19.0 |
| 0.0129 | 118.0 | 24898 | 0.5217 | 40.8482 | 36.2412 | 39.983 | 40.2084 | 19.0 |
| 0.0131 | 119.0 | 25109 | 0.5191 | 40.9377 | 36.3549 | 40.0702 | 40.303 | 19.0 |
| 0.0131 | 120.0 | 25320 | 0.5206 | 41.0878 | 36.5262 | 40.2577 | 40.4903 | 19.0 |
| 0.0123 | 121.0 | 25531 | 0.5223 | 40.9777 | 36.4348 | 40.1438 | 40.3255 | 19.0 |
| 0.0123 | 122.0 | 25742 | 0.5200 | 40.9512 | 36.2822 | 40.0795 | 40.2998 | 19.0 |
| 0.0123 | 123.0 | 25953 | 0.5244 | 40.9508 | 36.3301 | 40.0726 | 40.3256 | 19.0 |
| 0.0125 | 124.0 | 26164 | 0.5225 | 41.1733 | 36.4561 | 40.3336 | 40.5512 | 19.0 |
| 0.0125 | 125.0 | 26375 | 0.5240 | 41.0364 | 36.4154 | 40.189 | 40.4268 | 19.0 |
| 0.0118 | 126.0 | 26586 | 0.5246 | 41.1267 | 36.4904 | 40.3025 | 40.5672 | 19.0 |
| 0.0118 | 127.0 | 26797 | 0.5214 | 40.9609 | 36.417 | 40.1255 | 40.3472 | 19.0 |
| 0.0125 | 128.0 | 27008 | 0.5196 | 41.1335 | 36.4937 | 40.3248 | 40.5371 | 19.0 |
| 0.0125 | 129.0 | 27219 | 0.5214 | 41.1757 | 36.606 | 40.3908 | 40.6112 | 19.0 |
| 0.0125 | 130.0 | 27430 | 0.5190 | 41.1436 | 36.5116 | 40.344 | 40.5505 | 19.0 |
| 0.012 | 131.0 | 27641 | 0.5227 | 41.0854 | 36.5638 | 40.2975 | 40.5342 | 19.0 |
| 0.012 | 132.0 | 27852 | 0.5233 | 41.0652 | 36.5087 | 40.2447 | 40.4784 | 19.0 |
| 0.0117 | 133.0 | 28063 | 0.5251 | 41.1272 | 36.4621 | 40.2664 | 40.4917 | 19.0 |
| 0.0117 | 134.0 | 28274 | 0.5215 | 41.1819 | 36.5561 | 40.3583 | 40.5515 | 19.0 |
| 0.0117 | 135.0 | 28485 | 0.5219 | 41.1615 | 36.5308 | 40.323 | 40.5283 | 19.0 |
| 0.0116 | 136.0 | 28696 | 0.5228 | 41.0947 | 36.4701 | 40.2537 | 40.4725 | 19.0 |
| 0.0116 | 137.0 | 28907 | 0.5211 | 41.1187 | 36.4948 | 40.2711 | 40.4957 | 19.0 |
| 0.0114 | 138.0 | 29118 | 0.5219 | 41.0826 | 36.4684 | 40.2557 | 40.4678 | 19.0 |
| 0.0114 | 139.0 | 29329 | 0.5223 | 41.1453 | 36.5356 | 40.3132 | 40.5333 | 19.0 |
| 0.0111 | 140.0 | 29540 | 0.5237 | 41.1055 | 36.4938 | 40.2656 | 40.4907 | 19.0 |
| 0.0111 | 141.0 | 29751 | 0.5241 | 41.1391 | 36.4983 | 40.2896 | 40.5215 | 19.0 |
| 0.0111 | 142.0 | 29962 | 0.5243 | 41.1702 | 36.5621 | 40.3401 | 40.5579 | 19.0 |
| 0.0112 | 143.0 | 30173 | 0.5242 | 41.1499 | 36.5609 | 40.3355 | 40.5387 | 19.0 |
| 0.0112 | 144.0 | 30384 | 0.5236 | 41.1261 | 36.5274 | 40.3011 | 40.522 | 19.0 |
| 0.011 | 145.0 | 30595 | 0.5240 | 41.1174 | 36.4917 | 40.2739 | 40.5043 | 19.0 |
| 0.011 | 146.0 | 30806 | 0.5248 | 41.1174 | 36.4917 | 40.2739 | 40.5043 | 19.0 |
| 0.0106 | 147.0 | 31017 | 0.5241 | 41.1174 | 36.4917 | 40.2739 | 40.5043 | 19.0 |
| 0.0106 | 148.0 | 31228 | 0.5243 | 41.1174 | 36.4917 | 40.2739 | 40.5043 | 19.0 |
| 0.0106 | 149.0 | 31439 | 0.5245 | 41.1174 | 36.4917 | 40.2739 | 40.5043 | 19.0 |
| 0.0105 | 150.0 | 31650 | 0.5246 | 41.1174 | 36.4917 | 40.2739 | 40.5043 | 19.0 |
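
The ROUGE columns above are the usual summarization metrics. The exact scoring code is not documented in this card; the sketch below uses the `evaluate` library's ROUGE implementation with illustrative strings only, so treat it as an approximation rather than a reproduction.

```python
# Minimal ROUGE scoring sketch using the `evaluate` library; the original
# run's metric setup and postprocessing are not documented, so this is
# illustrative only.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["shares climbed after the quarterly earnings beat"]
references = ["shares rose after the company beat quarterly earnings"]
scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```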

Framework versions

  • Transformers 4.38.1
  • PyTorch 2.1.2
  • Datasets 2.1.0
  • Tokenizers 0.15.2

Model size

  • 223M parameters (F32, Safetensors)
