
t5-base-finetuned-stocknews_1900_100

This model is a fine-tuned version of google-t5/t5-base on an unspecified dataset (the model name suggests stock-news text). It achieves the following results on the evaluation set:

  • Loss: 0.4554
  • Rouge1: 40.9735
  • Rouge2: 36.4343
  • RougeL: 40.1125
  • RougeLsum: 40.3384
  • Gen Len: 19.0
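
For reference, a minimal inference sketch using this checkpoint (sujayC66/t5-base-finetuned-stocknews_1900_100) is shown below. Since the task and data are not documented, the "summarize: " prefix, the example input, and the generation settings are assumptions; the short generation budget mirrors the ~19-token Gen Len reported above.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Checkpoint name taken from this model card; everything else below is an assumption.
checkpoint = "sujayC66/t5-base-finetuned-stocknews_1900_100"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# T5 models are usually prompted with a task prefix; "summarize: " is assumed here.
article = "Shares of ExampleCorp rose 5% after the company reported stronger-than-expected earnings..."
inputs = tokenizer("summarize: " + article, return_tensors="pt", truncation=True, max_length=512)

# Gen Len in the evaluation results is ~19 tokens, so a short generation budget is assumed.
summary_ids = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```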

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reconstruction of this configuration as Seq2SeqTrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
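
The values above correspond to the standard Hugging Face Seq2SeqTrainingArguments. The sketch below is a hedged reconstruction of that configuration, not the author's actual training script; the output_dir and evaluation_strategy values are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Hedged reconstruction of the listed hyperparameters; output_dir is a placeholder.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-finetuned-stocknews_1900_100",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer setting.
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # assumed; consistent with the per-epoch results table below
    predict_with_generate=True,   # needed to compute ROUGE during evaluation
)
```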

Training results

Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len
No log 1.0 211 0.7350 31.7308 20.2914 28.657 29.3167 18.9596
No log 2.0 422 0.6345 33.1681 22.6637 30.5277 31.1213 19.0
0.9162 3.0 633 0.5706 34.6997 24.847 32.2288 32.8098 19.0
0.9162 4.0 844 0.5268 35.4092 26.2862 33.1822 33.6119 19.0
0.6423 5.0 1055 0.4858 36.1444 27.7265 34.1005 34.4616 19.0
0.6423 6.0 1266 0.4560 36.7437 28.449 34.6735 35.1349 19.0
0.6423 7.0 1477 0.4323 37.33 29.5265 35.4853 35.9323 19.0
0.5063 8.0 1688 0.4142 37.1593 29.6064 35.4064 35.8123 19.0
0.5063 9.0 1899 0.3991 38.1553 30.5752 36.2114 36.7167 19.0
0.4102 10.0 2110 0.3864 38.3045 31.2785 36.6248 36.9254 19.0
0.4102 11.0 2321 0.3789 38.2719 31.5007 36.7926 37.0642 19.0
0.3415 12.0 2532 0.3703 38.8466 32.1912 37.3333 37.6131 19.0
0.3415 13.0 2743 0.3618 38.6865 32.2025 37.2779 37.5144 19.0
0.3415 14.0 2954 0.3522 39.3257 33.1793 38.0203 38.2379 19.0
0.2912 15.0 3165 0.3508 39.4422 33.4813 38.2943 38.4649 19.0
0.2912 16.0 3376 0.3506 39.8056 34.1172 38.6625 38.8293 19.0
0.2453 17.0 3587 0.3519 39.9209 34.5123 38.9012 39.0863 19.0
0.2453 18.0 3798 0.3498 40.1987 34.8962 39.2082 39.3708 19.0
0.216 19.0 4009 0.3544 39.6724 34.2613 38.6566 38.7859 19.0
0.216 20.0 4220 0.3539 40.1049 34.8915 39.0681 39.2354 19.0
0.216 21.0 4431 0.3561 40.0241 34.6788 38.9621 39.112 19.0
0.186 22.0 4642 0.3548 40.144 34.8856 39.1343 39.3265 19.0
0.186 23.0 4853 0.3564 40.3022 35.2446 39.3555 39.5398 19.0
0.1626 24.0 5064 0.3575 40.2556 35.1322 39.2923 39.4501 19.0
0.1626 25.0 5275 0.3655 40.4588 35.4231 39.5008 39.6855 19.0
0.1626 26.0 5486 0.3687 40.3751 35.4048 39.4194 39.6334 19.0
0.1463 27.0 5697 0.3636 40.5556 35.6104 39.646 39.8315 19.0
0.1463 28.0 5908 0.3724 40.6704 35.7873 39.645 39.8934 19.0
0.1291 29.0 6119 0.3721 40.7764 35.9434 39.8896 40.0641 19.0
0.1291 30.0 6330 0.3767 40.6911 35.868 39.7979 40.0009 19.0
0.115 31.0 6541 0.3776 40.5145 35.7139 39.6426 39.814 19.0
0.115 32.0 6752 0.3752 40.6776 35.8839 39.7995 39.9986 19.0
0.115 33.0 6963 0.3793 40.5806 35.7407 39.6819 39.8721 19.0
0.1051 34.0 7174 0.3871 40.652 35.8792 39.7158 39.9167 19.0
0.1051 35.0 7385 0.3828 40.8275 36.0878 39.9195 40.1043 19.0
0.095 36.0 7596 0.3886 40.9392 36.2701 40.0753 40.2416 19.0
0.095 37.0 7807 0.3908 40.6987 35.9383 39.8522 40.0252 19.0
0.0864 38.0 8018 0.3937 40.9136 36.1533 40.0212 40.1877 19.0
0.0864 39.0 8229 0.3979 40.5823 35.9301 39.7841 39.9357 19.0
0.0864 40.0 8440 0.3971 40.9144 36.1874 40.036 40.2312 19.0
0.0812 41.0 8651 0.4008 40.8206 36.1899 40.0098 40.185 19.0
0.0812 42.0 8862 0.4007 40.6012 35.8957 39.7683 39.932 19.0
0.0747 43.0 9073 0.4001 40.8324 36.0613 39.9346 40.119 19.0
0.0747 44.0 9284 0.4057 40.8783 36.0747 39.9939 40.1931 19.0
0.0747 45.0 9495 0.4026 40.9583 36.2066 40.1362 40.3269 19.0
0.0689 46.0 9706 0.4132 40.6396 36.0119 39.8226 40.0266 19.0
0.0689 47.0 9917 0.4092 40.8679 36.2276 40.0419 40.2269 19.0
0.0643 48.0 10128 0.4131 41.0975 36.4785 40.2175 40.4088 19.0
0.0643 49.0 10339 0.4142 41.084 36.4548 40.1774 40.3793 19.0
0.0599 50.0 10550 0.4162 41.0003 36.4144 40.0912 40.3021 19.0
0.0599 51.0 10761 0.4201 41.123 36.4406 40.2193 40.4498 19.0
0.0599 52.0 10972 0.4185 41.1181 36.4871 40.2354 40.4111 19.0
0.0563 53.0 11183 0.4183 41.0662 36.471 40.2436 40.4196 19.0
0.0563 54.0 11394 0.4222 40.9644 36.3705 40.0994 40.2857 19.0
0.053 55.0 11605 0.4219 41.0366 36.4104 40.2024 40.3756 19.0
0.053 56.0 11816 0.4238 40.9543 36.2944 40.0546 40.2509 19.0
0.0502 57.0 12027 0.4260 40.8299 36.173 39.9556 40.1762 19.0
0.0502 58.0 12238 0.4281 40.7226 36.0612 39.8837 40.0788 19.0
0.0502 59.0 12449 0.4281 40.8293 36.1924 39.9873 40.1796 19.0
0.0466 60.0 12660 0.4276 40.8576 36.1387 40.0215 40.2374 19.0
0.0466 61.0 12871 0.4311 41.0218 36.4164 40.1375 40.3726 19.0
0.0462 62.0 13082 0.4310 41.006 36.333 40.1393 40.3476 19.0
0.0462 63.0 13293 0.4343 41.0375 36.2933 40.1381 40.3135 19.0
0.0423 64.0 13504 0.4315 41.004 36.2703 40.0982 40.31 19.0
0.0423 65.0 13715 0.4346 41.0361 36.3826 40.1206 40.3346 19.0
0.0423 66.0 13926 0.4381 40.8662 36.347 40.0537 40.2147 19.0
0.0405 67.0 14137 0.4383 41.0513 36.4805 40.1781 40.397 19.0
0.0405 68.0 14348 0.4373 40.9528 36.3512 40.0602 40.2812 19.0
0.0398 69.0 14559 0.4385 40.9879 36.3848 40.1668 40.3769 19.0
0.0398 70.0 14770 0.4414 40.9653 36.4555 40.1602 40.3589 19.0
0.0398 71.0 14981 0.4433 41.0236 36.5146 40.1889 40.4139 19.0
0.0378 72.0 15192 0.4423 40.9979 36.3904 40.1236 40.3669 19.0
0.0378 73.0 15403 0.4435 41.0081 36.4075 40.1324 40.3675 19.0
0.0361 74.0 15614 0.4423 41.0208 36.4193 40.1883 40.4144 19.0
0.0361 75.0 15825 0.4449 40.9626 36.3828 40.1797 40.3773 19.0
0.0354 76.0 16036 0.4479 40.9415 36.3803 40.1269 40.3357 19.0
0.0354 77.0 16247 0.4464 41.0229 36.5098 40.2163 40.4094 19.0
0.0354 78.0 16458 0.4464 40.9558 36.413 40.1258 40.3388 19.0
0.0345 79.0 16669 0.4465 40.9385 36.3516 40.0814 40.3247 19.0
0.0345 80.0 16880 0.4531 41.0034 36.4385 40.1536 40.3875 19.0
0.0332 81.0 17091 0.4492 41.0399 36.4823 40.1741 40.4126 19.0
0.0332 82.0 17302 0.4486 41.065 36.5245 40.2065 40.4218 19.0
0.0326 83.0 17513 0.4512 40.9513 36.3926 40.0856 40.3274 19.0
0.0326 84.0 17724 0.4515 40.9202 36.3954 40.0657 40.2837 19.0
0.0326 85.0 17935 0.4504 40.9972 36.518 40.1999 40.4031 19.0
0.0319 86.0 18146 0.4533 40.9467 36.391 40.1257 40.3422 19.0
0.0319 87.0 18357 0.4527 40.9682 36.4798 40.1442 40.3529 19.0
0.0306 88.0 18568 0.4544 40.9622 36.4381 40.149 40.3599 19.0
0.0306 89.0 18779 0.4549 40.9742 36.4306 40.15 40.3669 19.0
0.0306 90.0 18990 0.4531 40.9875 36.4958 40.1809 40.3876 19.0
0.031 91.0 19201 0.4551 40.9555 36.4406 40.144 40.3408 19.0
0.031 92.0 19412 0.4531 40.9665 36.4446 40.1594 40.3673 19.0
0.0299 93.0 19623 0.4544 40.9272 36.3767 40.0731 40.2899 19.0
0.0299 94.0 19834 0.4549 40.9021 36.3566 40.0557 40.2726 19.0
0.0291 95.0 20045 0.4544 40.9254 36.3759 40.0779 40.2962 19.0
0.0291 96.0 20256 0.4546 40.9254 36.3759 40.0779 40.2962 19.0
0.0291 97.0 20467 0.4551 40.9465 36.3891 40.0831 40.3071 19.0
0.0299 98.0 20678 0.4553 40.9465 36.3891 40.0831 40.3071 19.0
0.0299 99.0 20889 0.4554 40.9465 36.3891 40.0831 40.3071 19.0
0.0292 100.0 21100 0.4554 40.9735 36.4343 40.1125 40.3384 19.0
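
The Rouge1/Rouge2/RougeL/RougeLsum and Gen Len columns match what the evaluate library's ROUGE metric and a mean-generation-length count produce in the standard summarization fine-tuning loop. The compute_metrics sketch below illustrates how such values are typically computed; it is an assumption about the evaluation code, not the author's exact implementation, and it expects a tokenizer already loaded for this checkpoint (see the usage sketch above).

```python
import numpy as np
import evaluate

rouge = evaluate.load("rouge")

def compute_metrics(eval_pred):
    """Hedged sketch of how the ROUGE and Gen Len columns above are typically computed."""
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)
    # -100 marks ignored label positions; replace it before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = rouge.compute(predictions=decoded_preds, references=decoded_labels, use_stemmer=True)
    # Scores are reported as percentages in the table above.
    result = {key: round(value * 100, 4) for key, value in result.items()}

    # "Gen Len" is the mean generated length in tokens.
    gen_lens = [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in predictions]
    result["gen_len"] = float(np.mean(gen_lens))
    return result
```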

Framework versions

  • Transformers 4.38.1
  • Pytorch 2.1.2
  • Datasets 2.1.0
  • Tokenizers 0.15.2