
text_shortening_model_v11

This model is a fine-tuned version of t5-small; the fine-tuning dataset is not documented in this card. It achieves the following results on the evaluation set (a sketch of how comparable metrics can be computed follows this list):

  • Loss: 1.9156
  • Rouge1: 0.594
  • Rouge2: 0.3771
  • RougeL: 0.551
  • RougeLsum: 0.5514
  • BERTScore precision: 0.8963
  • BERTScore recall: 0.9029
  • Average word count: 11.1857
  • Max word count: 16
  • Min word count: 5
  • Average token count: 16.3143
  • % shortened texts with length > 12 words: 22.1429
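The card does not state how these numbers were produced. As a minimal sketch, comparable metrics can be computed with the evaluate library; the prediction and reference strings below are placeholders, since the evaluation data is not documented:

```python
import evaluate

# Standard metric ids in the evaluate library.
rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

# Placeholders: a real run would use the model's decoded outputs and the
# reference shortenings from the (undocumented) evaluation set.
predictions = ["a short version of the sentence"]
references = ["a considerably longer original version of the sentence"]

rouge_scores = rouge.compute(predictions=predictions, references=references)
bert_scores = bertscore.compute(predictions=predictions, references=references, lang="en")

# Length statistics as reported above, taken over the predictions.
word_counts = [len(p.split()) for p in predictions]
avg_words = sum(word_counts) / len(word_counts)
pct_over_12 = 100 * sum(c > 12 for c in word_counts) / len(word_counts)
```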

Model description

More information needed

Intended uses & limitations

More information needed
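In the absence of documented usage guidance, here is a minimal inference sketch. It assumes the repository id ldos/text_shortening_model_v11 (shown in the model tree at the end of this card), applies no task prefix because none is documented, and uses an illustrative input sentence and generation cap:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "ldos/text_shortening_model_v11"  # repository id from the model tree below
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# No task prefix is applied; the card does not document one.
text = "An example sentence that is rather longer than it really needs to be."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# The evaluation set averages ~16 output tokens, so 32 new tokens is a generous cap.
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```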

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
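
These values map directly onto the transformers Trainer API. A minimal sketch, assuming Seq2SeqTrainingArguments with per-epoch evaluation; the output_dir and the flags marked as assumptions are not stated in the card, and Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's default optimizer settings, so no explicit optimizer arguments are needed:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="text_shortening_model_v11",  # assumption: named after the model
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumption: the results table has one row per epoch
    predict_with_generate=True,   # assumption: needed to compute ROUGE and length metrics
)
```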

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | BERTScore precision | BERTScore recall | Average word count | Max word count | Min word count | Average token count | % shortened texts > 12 words |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 2.0573 | 1.0 | 31 | 1.6178 | 0.5605 | 0.3244 | 0.5037 | 0.5044 | 0.8826 | 0.8913 | 11.8 | 18 | 5 | 16.7571 | 44.2857 |
| 1.7151 | 2.0 | 62 | 1.5328 | 0.5713 | 0.3452 | 0.5232 | 0.5216 | 0.8855 | 0.8987 | 11.85 | 17 | 4 | 16.9571 | 45.7143 |
| 1.5849 | 3.0 | 93 | 1.4792 | 0.5806 | 0.3545 | 0.53 | 0.5289 | 0.8893 | 0.9002 | 11.6071 | 17 | 4 | 16.7571 | 42.8571 |
| 1.4642 | 4.0 | 124 | 1.4520 | 0.578 | 0.3468 | 0.5275 | 0.5269 | 0.8846 | 0.8994 | 12.1357 | 17 | 6 | 17.3786 | 46.4286 |
| 1.4162 | 5.0 | 155 | 1.4107 | 0.5887 | 0.3596 | 0.5412 | 0.5406 | 0.8892 | 0.901 | 11.8071 | 17 | 6 | 17.1 | 40.0 |
| 1.3163 | 6.0 | 186 | 1.4050 | 0.5888 | 0.3526 | 0.5348 | 0.5341 | 0.8905 | 0.9007 | 11.5071 | 16 | 6 | 16.8929 | 36.4286 |
| 1.255 | 7.0 | 217 | 1.4007 | 0.5793 | 0.3523 | 0.5315 | 0.53 | 0.8882 | 0.898 | 11.4857 | 15 | 6 | 16.9357 | 32.1429 |
| 1.2007 | 8.0 | 248 | 1.4018 | 0.6041 | 0.3743 | 0.5561 | 0.5552 | 0.8936 | 0.9044 | 11.65 | 16 | 7 | 16.95 | 32.8571 |
| 1.1432 | 9.0 | 279 | 1.3913 | 0.5969 | 0.3688 | 0.5481 | 0.5474 | 0.8907 | 0.9036 | 11.8786 | 16 | 6 | 17.1929 | 41.4286 |
| 1.1011 | 10.0 | 310 | 1.3961 | 0.5895 | 0.3541 | 0.5379 | 0.5365 | 0.8887 | 0.9022 | 11.9571 | 17 | 6 | 17.2857 | 39.2857 |
| 1.0608 | 11.0 | 341 | 1.3965 | 0.601 | 0.3676 | 0.551 | 0.5493 | 0.8912 | 0.9039 | 11.9143 | 17 | 7 | 17.2643 | 37.1429 |
| 1.0194 | 12.0 | 372 | 1.4092 | 0.5968 | 0.3691 | 0.5485 | 0.5479 | 0.896 | 0.9016 | 11.0571 | 15 | 6 | 16.2857 | 24.2857 |
| 0.9875 | 13.0 | 403 | 1.4105 | 0.6002 | 0.3748 | 0.5525 | 0.5519 | 0.8929 | 0.9034 | 11.5357 | 17 | 7 | 16.8643 | 29.2857 |
| 0.9369 | 14.0 | 434 | 1.4121 | 0.593 | 0.3658 | 0.5478 | 0.5476 | 0.896 | 0.903 | 11.15 | 16 | 6 | 16.4286 | 25.0 |
| 0.9258 | 15.0 | 465 | 1.4079 | 0.5956 | 0.3658 | 0.5434 | 0.543 | 0.8912 | 0.9025 | 11.6643 | 16 | 6 | 16.9786 | 31.4286 |
| 0.8838 | 16.0 | 496 | 1.4202 | 0.597 | 0.3662 | 0.5468 | 0.5464 | 0.8925 | 0.9041 | 11.4786 | 16 | 7 | 16.9429 | 27.8571 |
| 0.8615 | 17.0 | 527 | 1.4250 | 0.5936 | 0.3618 | 0.544 | 0.5434 | 0.8917 | 0.9028 | 11.5 | 17 | 7 | 16.9286 | 27.8571 |
| 0.8359 | 18.0 | 558 | 1.4392 | 0.5921 | 0.3726 | 0.5459 | 0.5451 | 0.8911 | 0.9019 | 11.5 | 16 | 7 | 16.8571 | 27.8571 |
| 0.7951 | 19.0 | 589 | 1.4446 | 0.5875 | 0.3687 | 0.5431 | 0.542 | 0.8904 | 0.9008 | 11.5143 | 16 | 7 | 16.8643 | 27.8571 |
| 0.7773 | 20.0 | 620 | 1.4564 | 0.5917 | 0.3678 | 0.5472 | 0.5471 | 0.892 | 0.9028 | 11.5286 | 16 | 7 | 16.7286 | 30.7143 |
| 0.7597 | 21.0 | 651 | 1.4609 | 0.587 | 0.3659 | 0.5436 | 0.5423 | 0.8907 | 0.9025 | 11.7429 | 16 | 7 | 17.0214 | 31.4286 |
| 0.7276 | 22.0 | 682 | 1.4723 | 0.5992 | 0.3824 | 0.5573 | 0.5569 | 0.8957 | 0.9016 | 11.0857 | 16 | 7 | 16.3357 | 20.7143 |
| 0.6884 | 23.0 | 713 | 1.4759 | 0.5919 | 0.3749 | 0.5502 | 0.5494 | 0.8946 | 0.9037 | 11.4143 | 16 | 6 | 16.7357 | 25.7143 |
| 0.6689 | 24.0 | 744 | 1.4953 | 0.5872 | 0.3741 | 0.5479 | 0.5465 | 0.8964 | 0.9001 | 10.9714 | 16 | 6 | 16.0786 | 20.7143 |
| 0.6634 | 25.0 | 775 | 1.5111 | 0.5985 | 0.3802 | 0.5565 | 0.5563 | 0.8948 | 0.9037 | 11.4 | 16 | 7 | 16.7214 | 25.7143 |
| 0.6451 | 26.0 | 806 | 1.5194 | 0.5895 | 0.3676 | 0.545 | 0.5442 | 0.8945 | 0.9002 | 11.2571 | 16 | 6 | 16.4857 | 21.4286 |
| 0.6309 | 27.0 | 837 | 1.5287 | 0.5857 | 0.3642 | 0.5445 | 0.5435 | 0.8942 | 0.9007 | 11.3357 | 16 | 7 | 16.5714 | 22.1429 |
| 0.615 | 28.0 | 868 | 1.5374 | 0.5969 | 0.3817 | 0.5547 | 0.5546 | 0.8982 | 0.9028 | 11.0857 | 17 | 6 | 16.2 | 20.0 |
| 0.6094 | 29.0 | 899 | 1.5423 | 0.593 | 0.3746 | 0.5506 | 0.5501 | 0.8951 | 0.902 | 11.3429 | 16 | 7 | 16.5571 | 25.7143 |
| 0.5757 | 30.0 | 930 | 1.5376 | 0.5916 | 0.3769 | 0.5479 | 0.5473 | 0.8976 | 0.9013 | 11.0143 | 16 | 7 | 16.1143 | 20.7143 |
| 0.5633 | 31.0 | 961 | 1.5586 | 0.5976 | 0.3852 | 0.5577 | 0.5571 | 0.8987 | 0.9034 | 11.15 | 16 | 7 | 16.2214 | 21.4286 |
| 0.5437 | 32.0 | 992 | 1.5716 | 0.5978 | 0.3843 | 0.5566 | 0.5556 | 0.8982 | 0.9043 | 11.1929 | 16 | 7 | 16.3 | 24.2857 |
| 0.545 | 33.0 | 1023 | 1.5776 | 0.5915 | 0.38 | 0.5505 | 0.549 | 0.8977 | 0.9011 | 11.0143 | 16 | 6 | 16.1 | 18.5714 |
| 0.5254 | 34.0 | 1054 | 1.5979 | 0.5847 | 0.3731 | 0.5442 | 0.5436 | 0.8978 | 0.9 | 10.9857 | 16 | 6 | 15.9429 | 20.0 |
| 0.5243 | 35.0 | 1085 | 1.6012 | 0.5983 | 0.3829 | 0.5551 | 0.5542 | 0.8986 | 0.9047 | 11.1714 | 16 | 6 | 16.3786 | 21.4286 |
| 0.5075 | 36.0 | 1116 | 1.5938 | 0.5906 | 0.3857 | 0.5501 | 0.5494 | 0.898 | 0.9041 | 11.2214 | 16 | 6 | 16.3786 | 22.1429 |
| 0.484 | 37.0 | 1147 | 1.6196 | 0.5952 | 0.3858 | 0.5548 | 0.555 | 0.8991 | 0.9031 | 11.0357 | 16 | 6 | 16.1429 | 19.2857 |
| 0.4797 | 38.0 | 1178 | 1.6349 | 0.5988 | 0.3861 | 0.5586 | 0.5582 | 0.9005 | 0.9041 | 10.9929 | 16 | 7 | 16.1357 | 17.8571 |
| 0.4693 | 39.0 | 1209 | 1.6353 | 0.5953 | 0.3927 | 0.5567 | 0.5571 | 0.8988 | 0.9038 | 11.1 | 16 | 7 | 16.2429 | 23.5714 |
| 0.4575 | 40.0 | 1240 | 1.6395 | 0.5907 | 0.3825 | 0.5518 | 0.5517 | 0.8979 | 0.9024 | 11.0571 | 16 | 7 | 16.1143 | 20.0 |
| 0.4376 | 41.0 | 1271 | 1.6676 | 0.5891 | 0.3869 | 0.5508 | 0.5511 | 0.8967 | 0.902 | 11.2357 | 16 | 7 | 16.3643 | 24.2857 |
| 0.4302 | 42.0 | 1302 | 1.6788 | 0.5937 | 0.3827 | 0.5511 | 0.5512 | 0.8987 | 0.9022 | 11.0214 | 16 | 7 | 16.1357 | 20.7143 |
| 0.4279 | 43.0 | 1333 | 1.6796 | 0.601 | 0.3873 | 0.5583 | 0.558 | 0.899 | 0.9025 | 11.1071 | 16 | 7 | 16.2071 | 23.5714 |
| 0.4222 | 44.0 | 1364 | 1.6884 | 0.6077 | 0.3944 | 0.565 | 0.5652 | 0.9017 | 0.9051 | 10.8929 | 15 | 6 | 16.0071 | 17.1429 |
| 0.4203 | 45.0 | 1395 | 1.6932 | 0.5978 | 0.3837 | 0.5578 | 0.557 | 0.8977 | 0.9031 | 11.1357 | 16 | 7 | 16.25 | 18.5714 |
| 0.4145 | 46.0 | 1426 | 1.7017 | 0.6084 | 0.3855 | 0.5632 | 0.5633 | 0.9006 | 0.9057 | 11.1357 | 16 | 7 | 16.2857 | 18.5714 |
| 0.3957 | 47.0 | 1457 | 1.6958 | 0.5969 | 0.3857 | 0.5579 | 0.5575 | 0.8979 | 0.9039 | 11.2429 | 16 | 7 | 16.3643 | 20.7143 |
| 0.3943 | 48.0 | 1488 | 1.7099 | 0.5891 | 0.3802 | 0.5482 | 0.5472 | 0.8982 | 0.9018 | 11.0286 | 16 | 7 | 16.0929 | 17.1429 |
| 0.3808 | 49.0 | 1519 | 1.7259 | 0.6003 | 0.3818 | 0.558 | 0.5583 | 0.8988 | 0.9031 | 11.1214 | 16 | 7 | 16.2857 | 19.2857 |
| 0.3746 | 50.0 | 1550 | 1.7252 | 0.5904 | 0.3749 | 0.5481 | 0.5483 | 0.8975 | 0.9012 | 11.0571 | 16 | 7 | 16.2214 | 17.1429 |
| 0.3743 | 51.0 | 1581 | 1.7394 | 0.5948 | 0.3789 | 0.5537 | 0.5539 | 0.8995 | 0.9048 | 11.25 | 16 | 7 | 16.3786 | 22.1429 |
| 0.3652 | 52.0 | 1612 | 1.7568 | 0.5934 | 0.3777 | 0.5492 | 0.549 | 0.8986 | 0.9022 | 11.0714 | 16 | 6 | 16.1714 | 18.5714 |
| 0.3676 | 53.0 | 1643 | 1.7608 | 0.5941 | 0.378 | 0.5562 | 0.5562 | 0.8996 | 0.9034 | 11.0571 | 17 | 7 | 16.1214 | 18.5714 |
| 0.3505 | 54.0 | 1674 | 1.7593 | 0.5934 | 0.3759 | 0.5522 | 0.5527 | 0.8985 | 0.9027 | 11.1143 | 16 | 7 | 16.0857 | 18.5714 |
| 0.3343 | 55.0 | 1705 | 1.7625 | 0.587 | 0.3749 | 0.5451 | 0.5455 | 0.8976 | 0.9009 | 11.0429 | 17 | 6 | 16.0929 | 17.8571 |
| 0.3471 | 56.0 | 1736 | 1.7744 | 0.5866 | 0.3738 | 0.5473 | 0.5468 | 0.8959 | 0.9005 | 11.1429 | 17 | 6 | 16.2571 | 19.2857 |
| 0.3396 | 57.0 | 1767 | 1.7778 | 0.5884 | 0.3753 | 0.5459 | 0.5459 | 0.8963 | 0.9009 | 11.1071 | 16 | 6 | 16.1714 | 19.2857 |
| 0.3313 | 58.0 | 1798 | 1.7836 | 0.5915 | 0.3743 | 0.5494 | 0.5491 | 0.8963 | 0.9017 | 11.1071 | 16 | 7 | 16.1571 | 20.0 |
| 0.3211 | 59.0 | 1829 | 1.7980 | 0.5935 | 0.3772 | 0.5536 | 0.554 | 0.8962 | 0.9033 | 11.25 | 17 | 7 | 16.3357 | 21.4286 |
| 0.3126 | 60.0 | 1860 | 1.8001 | 0.5979 | 0.3809 | 0.5553 | 0.5556 | 0.8968 | 0.9021 | 11.1643 | 17 | 6 | 16.2929 | 20.7143 |
| 0.3078 | 61.0 | 1891 | 1.8163 | 0.5939 | 0.3795 | 0.552 | 0.5521 | 0.8972 | 0.9026 | 11.1429 | 17 | 6 | 16.2786 | 22.8571 |
| 0.3007 | 62.0 | 1922 | 1.8209 | 0.6037 | 0.3886 | 0.5609 | 0.5619 | 0.8976 | 0.9051 | 11.2786 | 17 | 6 | 16.4571 | 23.5714 |
| 0.2969 | 63.0 | 1953 | 1.8165 | 0.5829 | 0.3693 | 0.5406 | 0.5407 | 0.8956 | 0.8988 | 10.9714 | 16 | 6 | 16.0143 | 19.2857 |
| 0.2886 | 64.0 | 1984 | 1.8299 | 0.5921 | 0.3754 | 0.5482 | 0.5483 | 0.8968 | 0.8997 | 11.0143 | 16 | 6 | 16.1214 | 18.5714 |
| 0.2942 | 65.0 | 2015 | 1.8299 | 0.5965 | 0.3707 | 0.5491 | 0.5483 | 0.8967 | 0.9024 | 11.2071 | 16 | 6 | 16.3571 | 22.1429 |
| 0.2991 | 66.0 | 2046 | 1.8329 | 0.5911 | 0.3789 | 0.5519 | 0.5512 | 0.8968 | 0.902 | 11.0857 | 16 | 6 | 16.2786 | 21.4286 |
| 0.2926 | 67.0 | 2077 | 1.8361 | 0.5975 | 0.3845 | 0.5559 | 0.5552 | 0.8985 | 0.9032 | 11.05 | 17 | 6 | 16.3071 | 20.7143 |
| 0.2888 | 68.0 | 2108 | 1.8442 | 0.5993 | 0.3855 | 0.5581 | 0.5582 | 0.8984 | 0.9042 | 11.1143 | 16 | 6 | 16.2929 | 22.1429 |
| 0.2851 | 69.0 | 2139 | 1.8479 | 0.597 | 0.3805 | 0.5534 | 0.5535 | 0.8974 | 0.9036 | 11.1 | 16 | 6 | 16.35 | 20.0 |
| 0.2704 | 70.0 | 2170 | 1.8532 | 0.5918 | 0.3746 | 0.5475 | 0.5461 | 0.8969 | 0.9027 | 11.15 | 16 | 6 | 16.3571 | 20.7143 |
| 0.269 | 71.0 | 2201 | 1.8584 | 0.594 | 0.3789 | 0.5534 | 0.553 | 0.8981 | 0.9039 | 11.1143 | 17 | 6 | 16.3286 | 19.2857 |
| 0.2738 | 72.0 | 2232 | 1.8590 | 0.5967 | 0.3833 | 0.5555 | 0.5552 | 0.8985 | 0.9041 | 11.0714 | 16 | 6 | 16.3286 | 17.8571 |
| 0.2644 | 73.0 | 2263 | 1.8656 | 0.5952 | 0.3801 | 0.5506 | 0.5506 | 0.8981 | 0.9029 | 11.0857 | 16 | 6 | 16.2714 | 20.0 |
| 0.2647 | 74.0 | 2294 | 1.8744 | 0.5995 | 0.384 | 0.5573 | 0.5571 | 0.8989 | 0.9049 | 11.2214 | 17 | 6 | 16.4429 | 22.1429 |
| 0.2678 | 75.0 | 2325 | 1.8825 | 0.6055 | 0.3886 | 0.563 | 0.5635 | 0.8992 | 0.9056 | 11.2786 | 16 | 6 | 16.4857 | 22.8571 |
| 0.2647 | 76.0 | 2356 | 1.8805 | 0.6024 | 0.3849 | 0.5605 | 0.5609 | 0.8996 | 0.9055 | 11.1357 | 17 | 6 | 16.3286 | 20.0 |
| 0.2535 | 77.0 | 2387 | 1.8865 | 0.5981 | 0.3932 | 0.5605 | 0.5612 | 0.8994 | 0.9045 | 11.1143 | 17 | 5 | 16.3 | 20.7143 |
| 0.2561 | 78.0 | 2418 | 1.8878 | 0.5961 | 0.3852 | 0.5558 | 0.5567 | 0.8991 | 0.9035 | 11.0643 | 16 | 5 | 16.2786 | 20.0 |
| 0.2586 | 79.0 | 2449 | 1.8910 | 0.5972 | 0.3881 | 0.5615 | 0.5615 | 0.8974 | 0.9033 | 11.1643 | 17 | 5 | 16.35 | 22.1429 |
| 0.2501 | 80.0 | 2480 | 1.8921 | 0.5929 | 0.3819 | 0.5529 | 0.5536 | 0.8958 | 0.9026 | 11.2214 | 16 | 5 | 16.4357 | 24.2857 |
| 0.2557 | 81.0 | 2511 | 1.8949 | 0.5941 | 0.3833 | 0.5535 | 0.5537 | 0.8956 | 0.9028 | 11.2643 | 17 | 5 | 16.4214 | 24.2857 |
| 0.2436 | 82.0 | 2542 | 1.8973 | 0.5916 | 0.3838 | 0.5525 | 0.5533 | 0.8958 | 0.9033 | 11.2786 | 17 | 5 | 16.4571 | 22.8571 |
| 0.2463 | 83.0 | 2573 | 1.8962 | 0.5915 | 0.3806 | 0.5533 | 0.5536 | 0.8955 | 0.9028 | 11.2143 | 16 | 5 | 16.4643 | 22.8571 |
| 0.2388 | 84.0 | 2604 | 1.8987 | 0.5945 | 0.3845 | 0.5568 | 0.5565 | 0.8968 | 0.9035 | 11.1429 | 17 | 5 | 16.3714 | 22.1429 |
| 0.2389 | 85.0 | 2635 | 1.9019 | 0.5957 | 0.3819 | 0.5546 | 0.5542 | 0.8971 | 0.9038 | 11.1929 | 16 | 5 | 16.3786 | 22.1429 |
| 0.2392 | 86.0 | 2666 | 1.9026 | 0.5928 | 0.3801 | 0.5522 | 0.5518 | 0.8969 | 0.9035 | 11.2143 | 16 | 5 | 16.4143 | 21.4286 |
| 0.2387 | 87.0 | 2697 | 1.9062 | 0.5907 | 0.3751 | 0.5496 | 0.549 | 0.8962 | 0.9028 | 11.1714 | 16 | 5 | 16.3286 | 20.7143 |
| 0.2403 | 88.0 | 2728 | 1.9064 | 0.5952 | 0.3779 | 0.5512 | 0.5512 | 0.8966 | 0.904 | 11.2643 | 16 | 5 | 16.4643 | 22.8571 |
| 0.2368 | 89.0 | 2759 | 1.9098 | 0.5995 | 0.387 | 0.5586 | 0.5588 | 0.8977 | 0.9044 | 11.1714 | 16 | 5 | 16.3786 | 20.0 |
| 0.2416 | 90.0 | 2790 | 1.9115 | 0.6007 | 0.3872 | 0.5595 | 0.5606 | 0.8982 | 0.9047 | 11.1857 | 16 | 5 | 16.3429 | 21.4286 |
| 0.2328 | 91.0 | 2821 | 1.9128 | 0.5997 | 0.3865 | 0.5574 | 0.558 | 0.8978 | 0.9039 | 11.1357 | 16 | 5 | 16.3357 | 21.4286 |
| 0.2389 | 92.0 | 2852 | 1.9151 | 0.5973 | 0.3864 | 0.5563 | 0.5576 | 0.8978 | 0.9032 | 11.0571 | 16 | 5 | 16.2071 | 20.0 |
| 0.2358 | 93.0 | 2883 | 1.9152 | 0.5952 | 0.3827 | 0.5529 | 0.5535 | 0.8974 | 0.903 | 11.0786 | 16 | 5 | 16.2286 | 20.0 |
| 0.233 | 94.0 | 2914 | 1.9155 | 0.6001 | 0.3861 | 0.5581 | 0.5585 | 0.8984 | 0.904 | 11.0929 | 16 | 5 | 16.3 | 20.7143 |
| 0.2293 | 95.0 | 2945 | 1.9146 | 0.5995 | 0.3845 | 0.5561 | 0.5572 | 0.8981 | 0.9038 | 11.1 | 16 | 5 | 16.2857 | 22.1429 |
| 0.2334 | 96.0 | 2976 | 1.9149 | 0.5963 | 0.3779 | 0.5518 | 0.5521 | 0.8975 | 0.9032 | 11.1214 | 16 | 5 | 16.3 | 21.4286 |
| 0.2334 | 97.0 | 3007 | 1.9153 | 0.5969 | 0.3813 | 0.554 | 0.5541 | 0.8978 | 0.9036 | 11.1429 | 16 | 5 | 16.3071 | 22.1429 |
| 0.237 | 98.0 | 3038 | 1.9150 | 0.5948 | 0.3803 | 0.5524 | 0.5528 | 0.8973 | 0.9031 | 11.1429 | 16 | 5 | 16.2643 | 22.1429 |
| 0.227 | 99.0 | 3069 | 1.9154 | 0.5946 | 0.3776 | 0.5509 | 0.5513 | 0.8969 | 0.903 | 11.1571 | 16 | 5 | 16.2786 | 22.1429 |
| 0.2357 | 100.0 | 3100 | 1.9156 | 0.594 | 0.3771 | 0.551 | 0.5514 | 0.8963 | 0.9029 | 11.1857 | 16 | 5 | 16.3143 | 22.1429 |

Framework versions

  • Transformers 4.33.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.4
  • Tokenizers 0.13.3

Model tree for ldos/text_shortening_model_v11

  • Base model: google-t5/t5-small
  • Finetuned: this model