
text_shortening_model_v25

This model is a fine-tuned version of t5-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 2.0466
  • ROUGE-1: 0.5016
  • ROUGE-2: 0.301
  • ROUGE-L: 0.4642
  • ROUGE-Lsum: 0.4623
  • BERTScore precision: 0.8789
  • BERTScore recall: 0.8772
  • Average word count: 9.6201
  • Max word count: 16
  • Min word count: 5
  • Average token count: 14.3319
  • % of shortened texts longer than 12 words: 11.3537
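
A minimal inference sketch using the transformers library. The prompt format used during fine-tuning is not documented, so the plain, prefix-free input below is an assumption, as are the generation settings:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "ldos/text_shortening_model_v25"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Assumption: plain text input with no task prefix.
text = "Example sentence that should be shortened into a more compact phrasing."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# Assumption: short outputs, in line with the ~14-token average reported above.
output_ids = model.generate(**inputs, max_new_tokens=20, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```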

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding Seq2SeqTrainingArguments follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
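
These settings map directly onto `Seq2SeqTrainingArguments`, as sketched below. Adam's betas=(0.9, 0.999) and epsilon=1e-08 are the transformers defaults and need no explicit arguments; the output directory and evaluation strategy are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="text_shortening_model_v25",  # assumption: placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumption: the table below logs one eval per epoch
    predict_with_generate=True,   # assumption: required for the generation metrics
)
# These arguments would then be passed to a Seq2SeqTrainer together with
# the tokenized train/eval datasets and a DataCollatorForSeq2Seq.
```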

Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | BERTScore precision | BERTScore recall | Avg word count | Max word count | Min word count | Avg token count | % texts > 12 words |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.9535 | 1.0 | 100 | 1.7458 | 0.5167 | 0.3187 | 0.4841 | 0.4855 | 0.8803 | 0.8796 | 9.8996 | 18 | 4 | 14.3799 | 20.524 |
| 0.8999 | 2.0 | 200 | 1.6253 | 0.5209 | 0.3071 | 0.474 | 0.4749 | 0.8796 | 0.8803 | 10.0611 | 17 | 3 | 14.5153 | 23.5808 |
| 0.8388 | 3.0 | 300 | 1.5631 | 0.5245 | 0.3124 | 0.4823 | 0.4823 | 0.8773 | 0.8828 | 10.5022 | 17 | 3 | 15.1223 | 27.9476 |
| 0.816 | 4.0 | 400 | 1.4961 | 0.5349 | 0.3108 | 0.4786 | 0.479 | 0.8733 | 0.8862 | 11.1659 | 17 | 4 | 16.0655 | 32.3144 |
| 0.8173 | 5.0 | 500 | 1.4751 | 0.5243 | 0.3064 | 0.4738 | 0.4733 | 0.879 | 0.8832 | 10.3406 | 17 | 4 | 15.1179 | 24.4541 |
| 0.7685 | 6.0 | 600 | 1.4447 | 0.5305 | 0.3111 | 0.4814 | 0.4808 | 0.88 | 0.8855 | 10.3712 | 17 | 5 | 15.2009 | 22.2707 |
| 0.7293 | 7.0 | 700 | 1.4249 | 0.5343 | 0.3199 | 0.4823 | 0.4833 | 0.879 | 0.8865 | 10.6681 | 17 | 5 | 15.5109 | 25.3275 |
| 0.7312 | 8.0 | 800 | 1.3878 | 0.5344 | 0.3216 | 0.487 | 0.4873 | 0.8842 | 0.8863 | 10.1659 | 17 | 5 | 14.9607 | 17.0306 |
| 0.7256 | 9.0 | 900 | 1.4002 | 0.532 | 0.32 | 0.483 | 0.4835 | 0.8785 | 0.8858 | 10.6681 | 17 | 5 | 15.5721 | 25.7642 |
| 0.7125 | 10.0 | 1000 | 1.4156 | 0.5406 | 0.3301 | 0.497 | 0.4969 | 0.8815 | 0.889 | 10.6332 | 17 | 5 | 15.3799 | 24.4541 |
| 0.6937 | 11.0 | 1100 | 1.4109 | 0.5346 | 0.3128 | 0.4893 | 0.4892 | 0.8826 | 0.886 | 10.2926 | 17 | 4 | 15.1092 | 20.524 |
| 0.6755 | 12.0 | 1200 | 1.3998 | 0.5388 | 0.327 | 0.4936 | 0.4937 | 0.8845 | 0.8882 | 10.3362 | 17 | 4 | 15.131 | 20.9607 |
| 0.6722 | 13.0 | 1300 | 1.4058 | 0.538 | 0.3192 | 0.4933 | 0.4925 | 0.8836 | 0.8869 | 10.214 | 17 | 4 | 14.9476 | 19.6507 |
| 0.6656 | 14.0 | 1400 | 1.4237 | 0.5367 | 0.3241 | 0.4917 | 0.4922 | 0.8821 | 0.8857 | 10.2183 | 17 | 4 | 14.9913 | 19.214 |
| 0.6274 | 15.0 | 1500 | 1.4059 | 0.5365 | 0.3315 | 0.4967 | 0.4959 | 0.8848 | 0.8853 | 9.9039 | 17 | 4 | 14.6681 | 14.8472 |
| 0.6309 | 16.0 | 1600 | 1.4130 | 0.5355 | 0.3311 | 0.4938 | 0.494 | 0.8842 | 0.8866 | 10.0568 | 17 | 4 | 14.8515 | 16.1572 |
| 0.6185 | 17.0 | 1700 | 1.4334 | 0.5357 | 0.3193 | 0.483 | 0.483 | 0.8826 | 0.8876 | 10.5066 | 17 | 4 | 15.345 | 22.2707 |
| 0.6095 | 18.0 | 1800 | 1.4426 | 0.5425 | 0.3329 | 0.498 | 0.4977 | 0.8866 | 0.8869 | 10.0087 | 17 | 4 | 14.6943 | 14.4105 |
| 0.5951 | 19.0 | 1900 | 1.4640 | 0.5437 | 0.3357 | 0.5013 | 0.501 | 0.8864 | 0.8878 | 10.0917 | 17 | 5 | 14.786 | 15.2838 |
| 0.5893 | 20.0 | 2000 | 1.4577 | 0.532 | 0.3289 | 0.4879 | 0.488 | 0.8842 | 0.8863 | 10.1659 | 17 | 4 | 14.9389 | 19.6507 |
| 0.6105 | 21.0 | 2100 | 1.4789 | 0.5399 | 0.3336 | 0.4986 | 0.498 | 0.8855 | 0.888 | 10.1921 | 17 | 5 | 15.048 | 17.0306 |
| 0.5712 | 22.0 | 2200 | 1.4992 | 0.533 | 0.3235 | 0.4825 | 0.4821 | 0.8825 | 0.8854 | 10.2664 | 18 | 4 | 15.0786 | 19.214 |
| 0.5961 | 23.0 | 2300 | 1.5211 | 0.5291 | 0.3168 | 0.4826 | 0.4821 | 0.8813 | 0.8849 | 10.2489 | 17 | 5 | 15.0437 | 20.0873 |
| 0.566 | 24.0 | 2400 | 1.5313 | 0.5355 | 0.318 | 0.4875 | 0.4872 | 0.8853 | 0.8868 | 9.9563 | 17 | 5 | 14.7424 | 14.8472 |
| 0.5747 | 25.0 | 2500 | 1.5177 | 0.545 | 0.3403 | 0.5029 | 0.5017 | 0.888 | 0.8886 | 9.8908 | 17 | 5 | 14.6245 | 13.1004 |
| 0.5576 | 26.0 | 2600 | 1.5409 | 0.5314 | 0.3258 | 0.4817 | 0.4813 | 0.8818 | 0.886 | 10.2795 | 16 | 5 | 15.0655 | 15.7205 |
| 0.5669 | 27.0 | 2700 | 1.5500 | 0.5293 | 0.3229 | 0.4831 | 0.4835 | 0.8835 | 0.8852 | 9.9301 | 17 | 5 | 14.6507 | 14.4105 |
| 0.5577 | 28.0 | 2800 | 1.5767 | 0.525 | 0.3185 | 0.4825 | 0.4827 | 0.8844 | 0.8837 | 9.7598 | 16 | 5 | 14.476 | 13.5371 |
| 0.5551 | 29.0 | 2900 | 1.5956 | 0.5344 | 0.3269 | 0.4904 | 0.4906 | 0.8864 | 0.8854 | 9.786 | 17 | 5 | 14.4192 | 13.5371 |
| 0.517 | 30.0 | 3000 | 1.6067 | 0.5239 | 0.3132 | 0.4798 | 0.4793 | 0.8826 | 0.8831 | 9.9607 | 17 | 5 | 14.7074 | 13.5371 |
| 0.5316 | 31.0 | 3100 | 1.6107 | 0.5277 | 0.3264 | 0.4835 | 0.483 | 0.8835 | 0.8853 | 9.9476 | 16 | 5 | 14.7031 | 14.8472 |
| 0.5263 | 32.0 | 3200 | 1.6188 | 0.527 | 0.316 | 0.4788 | 0.4786 | 0.8806 | 0.8844 | 10.1441 | 17 | 5 | 14.9913 | 17.4672 |
| 0.5397 | 33.0 | 3300 | 1.6249 | 0.5245 | 0.3124 | 0.477 | 0.4757 | 0.8813 | 0.8833 | 10.0 | 16 | 4 | 14.8428 | 15.2838 |
| 0.52 | 34.0 | 3400 | 1.6383 | 0.5232 | 0.3154 | 0.4782 | 0.4771 | 0.8828 | 0.8838 | 9.9563 | 17 | 4 | 14.7162 | 13.5371 |
| 0.5331 | 35.0 | 3500 | 1.6546 | 0.5205 | 0.3181 | 0.4755 | 0.4746 | 0.882 | 0.8821 | 9.869 | 16 | 5 | 14.5677 | 13.9738 |
| 0.5144 | 36.0 | 3600 | 1.6702 | 0.5295 | 0.3216 | 0.4874 | 0.4865 | 0.8831 | 0.885 | 9.9738 | 16 | 4 | 14.7773 | 13.5371 |
| 0.5076 | 37.0 | 3700 | 1.6865 | 0.5185 | 0.3101 | 0.4703 | 0.4696 | 0.8804 | 0.8817 | 10.0611 | 16 | 4 | 14.8384 | 17.9039 |
| 0.5222 | 38.0 | 3800 | 1.6799 | 0.5249 | 0.3186 | 0.4837 | 0.4831 | 0.8833 | 0.8838 | 9.8515 | 17 | 4 | 14.6594 | 13.1004 |
| 0.4992 | 39.0 | 3900 | 1.6934 | 0.5258 | 0.3207 | 0.4866 | 0.4853 | 0.8847 | 0.8829 | 9.6288 | 16 | 4 | 14.4105 | 11.7904 |
| 0.5135 | 40.0 | 4000 | 1.7291 | 0.5225 | 0.3151 | 0.4768 | 0.476 | 0.8833 | 0.8829 | 9.8079 | 17 | 4 | 14.5764 | 11.3537 |
| 0.4912 | 41.0 | 4100 | 1.7379 | 0.5137 | 0.3089 | 0.4696 | 0.4684 | 0.8818 | 0.8808 | 9.7336 | 16 | 4 | 14.5284 | 13.5371 |
| 0.51 | 42.0 | 4200 | 1.7384 | 0.5177 | 0.3147 | 0.4772 | 0.4765 | 0.8824 | 0.8819 | 9.6856 | 16 | 4 | 14.4934 | 11.7904 |
| 0.5171 | 43.0 | 4300 | 1.7543 | 0.526 | 0.3181 | 0.4779 | 0.4768 | 0.884 | 0.8848 | 9.9083 | 17 | 5 | 14.6594 | 13.5371 |
| 0.4925 | 44.0 | 4400 | 1.7793 | 0.5193 | 0.3162 | 0.4749 | 0.4736 | 0.8831 | 0.8824 | 9.6769 | 16 | 5 | 14.4803 | 13.9738 |
| 0.4986 | 45.0 | 4500 | 1.7716 | 0.5125 | 0.3124 | 0.469 | 0.4678 | 0.8831 | 0.8807 | 9.4585 | 16 | 4 | 14.2489 | 11.3537 |
| 0.4723 | 46.0 | 4600 | 1.7763 | 0.5146 | 0.3147 | 0.4726 | 0.4714 | 0.8827 | 0.8814 | 9.6463 | 17 | 5 | 14.5022 | 12.2271 |
| 0.4952 | 47.0 | 4700 | 1.8000 | 0.5184 | 0.3143 | 0.4758 | 0.4744 | 0.884 | 0.8814 | 9.4541 | 16 | 4 | 14.2926 | 7.8603 |
| 0.4882 | 48.0 | 4800 | 1.7944 | 0.5178 | 0.3192 | 0.4715 | 0.4703 | 0.8823 | 0.8814 | 9.6681 | 17 | 5 | 14.3712 | 10.4803 |
| 0.4815 | 49.0 | 4900 | 1.8060 | 0.5206 | 0.3187 | 0.4762 | 0.4754 | 0.8839 | 0.8813 | 9.4105 | 16 | 4 | 14.0655 | 9.1703 |
| 0.4607 | 50.0 | 5000 | 1.8159 | 0.5152 | 0.3139 | 0.4695 | 0.4692 | 0.8829 | 0.88 | 9.5546 | 16 | 4 | 14.2664 | 9.607 |
| 0.4616 | 51.0 | 5100 | 1.8268 | 0.5201 | 0.3165 | 0.4784 | 0.4776 | 0.8842 | 0.8809 | 9.4847 | 16 | 4 | 14.345 | 10.0437 |
| 0.4581 | 52.0 | 5200 | 1.8350 | 0.5171 | 0.3153 | 0.4745 | 0.4736 | 0.8844 | 0.8807 | 9.4498 | 17 | 4 | 14.2838 | 10.917 |
| 0.5018 | 53.0 | 5300 | 1.8249 | 0.5216 | 0.3233 | 0.4822 | 0.4813 | 0.886 | 0.8822 | 9.3712 | 16 | 4 | 14.1834 | 8.7336 |
| 0.4942 | 54.0 | 5400 | 1.8318 | 0.5164 | 0.3143 | 0.4735 | 0.4737 | 0.881 | 0.8816 | 9.7162 | 16 | 4 | 14.6157 | 13.5371 |
| 0.454 | 55.0 | 5500 | 1.8374 | 0.5132 | 0.3099 | 0.4737 | 0.4728 | 0.8828 | 0.88 | 9.4323 | 16 | 4 | 14.2882 | 10.0437 |
| 0.4627 | 56.0 | 5600 | 1.8656 | 0.5188 | 0.3148 | 0.4752 | 0.4747 | 0.8826 | 0.8804 | 9.4672 | 16 | 4 | 14.2358 | 8.2969 |
| 0.5064 | 57.0 | 5700 | 1.8658 | 0.5158 | 0.3116 | 0.4721 | 0.4712 | 0.8844 | 0.8806 | 9.4454 | 16 | 4 | 14.2096 | 9.607 |
| 0.4612 | 58.0 | 5800 | 1.8849 | 0.5117 | 0.3077 | 0.4667 | 0.4666 | 0.8809 | 0.8787 | 9.5328 | 17 | 4 | 14.3712 | 9.607 |
| 0.4787 | 59.0 | 5900 | 1.8980 | 0.5138 | 0.3073 | 0.4706 | 0.4701 | 0.8818 | 0.8805 | 9.5415 | 17 | 4 | 14.3144 | 10.4803 |
| 0.4738 | 60.0 | 6000 | 1.8939 | 0.5145 | 0.3117 | 0.4742 | 0.4738 | 0.8829 | 0.8808 | 9.4672 | 16 | 4 | 14.2402 | 9.1703 |
| 0.4506 | 61.0 | 6100 | 1.9135 | 0.5094 | 0.3029 | 0.4662 | 0.4656 | 0.8799 | 0.8796 | 9.7118 | 16 | 4 | 14.4891 | 11.3537 |
| 0.4714 | 62.0 | 6200 | 1.9088 | 0.5036 | 0.3044 | 0.4651 | 0.4645 | 0.8791 | 0.8781 | 9.7293 | 17 | 4 | 14.3537 | 15.7205 |
| 0.4715 | 63.0 | 6300 | 1.9201 | 0.5052 | 0.3015 | 0.47 | 0.4691 | 0.8805 | 0.878 | 9.5895 | 16 | 4 | 14.345 | 12.6638 |
| 0.4768 | 64.0 | 6400 | 1.9271 | 0.5028 | 0.3037 | 0.4631 | 0.4623 | 0.8781 | 0.8776 | 9.7555 | 17 | 4 | 14.4367 | 14.8472 |
| 0.4549 | 65.0 | 6500 | 1.9241 | 0.5091 | 0.3092 | 0.4687 | 0.4683 | 0.8811 | 0.8799 | 9.6376 | 17 | 4 | 14.3144 | 12.6638 |
| 0.4603 | 66.0 | 6600 | 1.9316 | 0.5026 | 0.3007 | 0.4635 | 0.4634 | 0.8798 | 0.8785 | 9.6943 | 17 | 4 | 14.4323 | 13.1004 |
| 0.4368 | 67.0 | 6700 | 1.9312 | 0.5085 | 0.3055 | 0.4686 | 0.468 | 0.881 | 0.879 | 9.5852 | 16 | 4 | 14.262 | 13.1004 |
| 0.4517 | 68.0 | 6800 | 1.9407 | 0.5079 | 0.3039 | 0.4681 | 0.4676 | 0.8796 | 0.879 | 9.6376 | 16 | 4 | 14.3581 | 11.3537 |
| 0.4509 | 69.0 | 6900 | 1.9491 | 0.5016 | 0.2956 | 0.4632 | 0.4617 | 0.8797 | 0.8779 | 9.6026 | 17 | 4 | 14.3188 | 11.3537 |
| 0.4792 | 70.0 | 7000 | 1.9537 | 0.5049 | 0.2979 | 0.4646 | 0.4641 | 0.8801 | 0.8793 | 9.7118 | 17 | 4 | 14.3886 | 12.2271 |
| 0.481 | 71.0 | 7100 | 1.9519 | 0.5092 | 0.3063 | 0.4729 | 0.4723 | 0.8812 | 0.8801 | 9.6288 | 17 | 4 | 14.4105 | 11.3537 |
| 0.4638 | 72.0 | 7200 | 1.9549 | 0.5009 | 0.2977 | 0.4649 | 0.4638 | 0.8792 | 0.8784 | 9.6943 | 17 | 4 | 14.4672 | 11.7904 |
| 0.4659 | 73.0 | 7300 | 1.9684 | 0.4997 | 0.2973 | 0.4627 | 0.4623 | 0.8768 | 0.8778 | 9.8384 | 17 | 4 | 14.6026 | 13.9738 |
| 0.4543 | 74.0 | 7400 | 1.9707 | 0.5003 | 0.2962 | 0.4649 | 0.4642 | 0.8778 | 0.8779 | 9.6856 | 16 | 4 | 14.4279 | 12.2271 |
| 0.4676 | 75.0 | 7500 | 1.9719 | 0.5003 | 0.2955 | 0.465 | 0.4649 | 0.8785 | 0.8775 | 9.6332 | 16 | 5 | 14.3493 | 11.3537 |
| 0.4689 | 76.0 | 7600 | 1.9824 | 0.501 | 0.3007 | 0.4687 | 0.4679 | 0.8798 | 0.8783 | 9.5459 | 17 | 4 | 14.3275 | 10.4803 |
| 0.448 | 77.0 | 7700 | 1.9763 | 0.5033 | 0.2996 | 0.4669 | 0.4661 | 0.8789 | 0.8777 | 9.6157 | 16 | 4 | 14.3886 | 12.6638 |
| 0.4778 | 78.0 | 7800 | 1.9798 | 0.5008 | 0.2944 | 0.4613 | 0.4615 | 0.878 | 0.8766 | 9.6638 | 16 | 4 | 14.3013 | 13.9738 |
| 0.4656 | 79.0 | 7900 | 1.9814 | 0.5014 | 0.2972 | 0.4649 | 0.4649 | 0.8792 | 0.8771 | 9.5459 | 16 | 4 | 14.2576 | 11.3537 |
| 0.4546 | 80.0 | 8000 | 1.9921 | 0.5024 | 0.302 | 0.4663 | 0.4652 | 0.8789 | 0.8772 | 9.6114 | 16 | 4 | 14.3275 | 13.1004 |
| 0.4781 | 81.0 | 8100 | 1.9996 | 0.5025 | 0.2988 | 0.465 | 0.4645 | 0.8788 | 0.8762 | 9.5328 | 16 | 4 | 14.214 | 11.3537 |
| 0.4642 | 82.0 | 8200 | 2.0029 | 0.4974 | 0.2947 | 0.4571 | 0.4565 | 0.8774 | 0.8755 | 9.6725 | 16 | 4 | 14.3231 | 13.9738 |
| 0.4343 | 83.0 | 8300 | 2.0066 | 0.4979 | 0.2961 | 0.4594 | 0.4584 | 0.8777 | 0.8759 | 9.5939 | 16 | 5 | 14.3275 | 10.917 |
| 0.4439 | 84.0 | 8400 | 2.0091 | 0.5018 | 0.2983 | 0.4624 | 0.4623 | 0.8788 | 0.877 | 9.5939 | 16 | 5 | 14.3188 | 10.917 |
| 0.4439 | 85.0 | 8500 | 2.0188 | 0.5057 | 0.3003 | 0.4668 | 0.4669 | 0.8795 | 0.8774 | 9.5502 | 16 | 5 | 14.3057 | 10.4803 |
| 0.4349 | 86.0 | 8600 | 2.0250 | 0.5129 | 0.3041 | 0.4708 | 0.4703 | 0.8807 | 0.8793 | 9.6943 | 16 | 5 | 14.4323 | 12.2271 |
| 0.4677 | 87.0 | 8700 | 2.0260 | 0.5057 | 0.3017 | 0.4668 | 0.4657 | 0.8796 | 0.8783 | 9.6376 | 16 | 5 | 14.3712 | 11.3537 |
| 0.4412 | 88.0 | 8800 | 2.0310 | 0.5057 | 0.3032 | 0.4658 | 0.4645 | 0.8799 | 0.8782 | 9.6681 | 16 | 5 | 14.4148 | 12.2271 |
| 0.4533 | 89.0 | 8900 | 2.0284 | 0.5061 | 0.3028 | 0.4669 | 0.4657 | 0.8796 | 0.8783 | 9.6594 | 16 | 5 | 14.3886 | 11.7904 |
| 0.423 | 90.0 | 9000 | 2.0317 | 0.5037 | 0.2994 | 0.4656 | 0.4642 | 0.879 | 0.8778 | 9.6638 | 16 | 5 | 14.4279 | 11.3537 |
| 0.4384 | 91.0 | 9100 | 2.0351 | 0.5058 | 0.3003 | 0.4667 | 0.4653 | 0.8792 | 0.8781 | 9.6332 | 16 | 5 | 14.3755 | 10.917 |
| 0.4662 | 92.0 | 9200 | 2.0362 | 0.5043 | 0.3014 | 0.4667 | 0.4655 | 0.8797 | 0.8779 | 9.5808 | 16 | 5 | 14.3188 | 10.0437 |
| 0.4479 | 93.0 | 9300 | 2.0393 | 0.5051 | 0.3032 | 0.4672 | 0.466 | 0.8795 | 0.8779 | 9.5895 | 16 | 5 | 14.3275 | 10.0437 |
| 0.4609 | 94.0 | 9400 | 2.0400 | 0.5035 | 0.2998 | 0.4667 | 0.4648 | 0.8792 | 0.8775 | 9.5895 | 16 | 5 | 14.3275 | 10.0437 |
| 0.4513 | 95.0 | 9500 | 2.0434 | 0.5045 | 0.3019 | 0.4671 | 0.4656 | 0.8793 | 0.8778 | 9.5983 | 16 | 5 | 14.3188 | 10.0437 |
| 0.4496 | 96.0 | 9600 | 2.0439 | 0.5031 | 0.3009 | 0.4657 | 0.4637 | 0.8792 | 0.8777 | 9.5983 | 16 | 5 | 14.3231 | 10.4803 |
| 0.4481 | 97.0 | 9700 | 2.0464 | 0.5027 | 0.3016 | 0.4645 | 0.4624 | 0.8791 | 0.8777 | 9.6245 | 16 | 5 | 14.3406 | 11.3537 |
| 0.4522 | 98.0 | 9800 | 2.0459 | 0.5011 | 0.3002 | 0.4642 | 0.4622 | 0.8788 | 0.8771 | 9.6026 | 16 | 5 | 14.3188 | 10.917 |
| 0.4338 | 99.0 | 9900 | 2.0466 | 0.5016 | 0.301 | 0.4642 | 0.4623 | 0.8789 | 0.8772 | 9.6201 | 16 | 5 | 14.3319 | 11.3537 |
| 0.4325 | 100.0 | 10000 | 2.0466 | 0.5016 | 0.301 | 0.4642 | 0.4623 | 0.8789 | 0.8772 | 9.6201 | 16 | 5 | 14.3319 | 11.3537 |
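
The evaluation script itself is not included here; the sketch below shows, under the assumption that the Hugging Face evaluate library was used, how the reported metric families (ROUGE, BERTScore, and the length statistics) can be computed. The predictions and references lists are hypothetical placeholders.

```python
import evaluate

# Hypothetical placeholders; the actual evaluation data is not published.
predictions = ["short version of the text", "another shortened text"]
references = ["a short version of the original text", "another shortened sentence"]

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

# ROUGE-1/2/L/Lsum, aggregated over the corpus.
rouge_scores = rouge.compute(predictions=predictions, references=references)

# BERTScore returns per-example precision/recall/f1 lists; average them.
bert = bertscore.compute(predictions=predictions, references=references, lang="en")
bert_precision = sum(bert["precision"]) / len(bert["precision"])
bert_recall = sum(bert["recall"]) / len(bert["recall"])

# Length statistics over the generated texts (word counts; the average
# token count would analogously use tokenized prediction lengths).
word_counts = [len(p.split()) for p in predictions]
length_stats = {
    "avg_word_count": sum(word_counts) / len(word_counts),
    "max_word_count": max(word_counts),
    "min_word_count": min(word_counts),
    "pct_longer_than_12_words": 100 * sum(c > 12 for c in word_counts) / len(word_counts),
}

print(rouge_scores, bert_precision, bert_recall, length_stats)
```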

Framework versions

  • Transformers 4.33.1
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3
