
text_shortening_model_v55

This model is a fine-tuned version of t5-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 3.4100
  • ROUGE-1: 0.3351
  • ROUGE-2: 0.1743
  • ROUGE-L: 0.3114
  • ROUGE-Lsum: 0.3116
  • BERT precision: 0.8348
  • BERT recall: 0.8330
  • Average word count: 6.4435
  • Max word count: 16
  • Min word count: 2
  • Average token count: 10.4215
  • % shortened texts with length > 12: 2.4261
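For reference, the ROUGE and BERTScore figures above can in principle be reproduced with the Hugging Face evaluate library. The snippet below is a minimal sketch with placeholder predictions and references; the exact evaluation settings used for this card are not documented.

```python
# Minimal sketch: computing ROUGE and BERTScore with the `evaluate` library.
# The predictions/references below are placeholders, not data from this card.
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

predictions = ["a short version of the text"]  # model outputs (placeholder)
references = ["the reference shortened text"]  # gold shortenings (placeholder)

# Returns a dict with rouge1, rouge2, rougeL, and rougeLsum scores.
print(rouge.compute(predictions=predictions, references=references))

# Returns per-example precision, recall, and f1 lists.
print(bertscore.compute(predictions=predictions, references=references, lang="en"))
```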

Model description

More information needed

Intended uses & limitations

More information needed
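Pending fuller documentation, the snippet below is a hypothetical usage sketch that loads the published checkpoint through the transformers summarization pipeline. Whether the model expects a task prefix (e.g. T5's "summarize: ") is not documented, so the raw-input call is an assumption.

```python
# Hypothetical usage sketch: loading ldos/text_shortening_model_v55 with the
# transformers pipeline API. Input formatting (e.g. a "summarize: " prefix)
# is an assumption, as the card does not document it.
from transformers import pipeline

shortener = pipeline("summarization", model="ldos/text_shortening_model_v55")

text = "This is an example sentence that is somewhat too long and should be shortened."
result = shortener(text)
print(result[0]["summary_text"])
```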

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
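
As a point of reference, these settings map onto the transformers Trainer API roughly as shown below; the output path and the per-epoch evaluation strategy are assumptions, since the card does not record them.

```python
# Sketch of the reported hyperparameters expressed as Seq2SeqTrainingArguments.
# output_dir and evaluation_strategy are assumptions; everything else mirrors
# the list above.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="text_shortening_model_v55",  # hypothetical output path
    learning_rate=5e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    evaluation_strategy="epoch",  # assumption: the table reports one eval per epoch
)
```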

Training results

Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | BERT precision | BERT recall | Average word count | Max word count | Min word count | Average token count | % shortened texts with length > 12
2.4913 1.0 288 2.0408 0.34 0.1753 0.313 0.313 0.8255 0.8369 7.6831 17 2 11.8939 5.6861
1.9921 2.0 576 1.8694 0.3521 0.1826 0.3237 0.324 0.8334 0.8375 7.0136 17 2 10.9447 3.1842
1.7007 3.0 864 1.7968 0.3565 0.1853 0.3292 0.3296 0.8383 0.837 6.4261 16 2 10.461 2.1986
1.4881 4.0 1152 1.7468 0.3552 0.1797 0.3287 0.3291 0.8382 0.8381 6.558 17 2 10.4321 2.5019
1.3177 5.0 1440 1.7568 0.3566 0.1837 0.3273 0.3275 0.837 0.8387 6.8287 15 2 10.7172 3.26
1.1661 6.0 1728 1.7524 0.3646 0.191 0.3372 0.3375 0.84 0.8412 6.7733 17 2 10.6839 3.4875
1.0419 7.0 2016 1.7797 0.3589 0.1832 0.3312 0.3314 0.8392 0.8384 6.6801 16 2 10.5792 3.4875
0.9391 8.0 2304 1.8526 0.3595 0.1879 0.3338 0.3344 0.8406 0.8401 6.5686 16 2 10.4299 2.4261
0.8459 9.0 2592 1.8470 0.3475 0.1805 0.3217 0.3218 0.8369 0.8375 6.7165 16 2 10.8408 3.5633
0.7653 10.0 2880 1.8914 0.3574 0.1849 0.3327 0.3332 0.8384 0.8404 6.7597 16 2 10.8863 3.4117
0.6909 11.0 3168 1.9524 0.3601 0.185 0.3322 0.3324 0.8393 0.8399 6.7824 16 2 10.7377 3.7908
0.6315 12.0 3456 1.9708 0.3557 0.1834 0.3285 0.3289 0.8388 0.8394 6.6861 16 2 10.674 3.4117
0.5775 13.0 3744 2.0307 0.3503 0.1858 0.3258 0.326 0.8392 0.8385 6.5898 17 2 10.5838 3.4117
0.5151 14.0 4032 2.1009 0.354 0.181 0.3284 0.3282 0.8399 0.8398 6.6543 17 2 10.6596 3.26
0.4732 15.0 4320 2.1415 0.3498 0.1823 0.3236 0.3238 0.8381 0.8386 6.6194 17 2 10.6164 2.9568
0.4384 16.0 4608 2.1816 0.3533 0.1874 0.3285 0.3287 0.8397 0.8374 6.4572 16 2 10.3472 2.3503
0.3991 17.0 4896 2.2288 0.3439 0.1796 0.3197 0.3201 0.8371 0.8366 6.5724 16 2 10.5383 2.9568
0.3704 18.0 5184 2.2619 0.3424 0.1805 0.3185 0.3182 0.8375 0.8346 6.3313 15 2 10.3374 2.1228
0.3401 19.0 5472 2.3624 0.3414 0.1794 0.3182 0.3184 0.8366 0.8352 6.5049 16 2 10.5057 3.0326
0.3177 20.0 5760 2.3962 0.3418 0.1763 0.3183 0.3187 0.8362 0.8358 6.5785 17 2 10.5436 3.1084
0.2901 21.0 6048 2.4554 0.3454 0.1812 0.3202 0.321 0.8387 0.8363 6.4276 16 2 10.3472 2.3503
0.2809 22.0 6336 2.4607 0.3291 0.1705 0.3067 0.3073 0.8367 0.8322 6.2199 17 2 10.163 2.5777
0.2569 23.0 6624 2.5187 0.334 0.1684 0.3086 0.309 0.835 0.833 6.4246 17 2 10.3776 2.4261
0.2407 24.0 6912 2.5013 0.3358 0.1718 0.3133 0.3135 0.8358 0.8339 6.4526 17 2 10.4246 1.8196
0.2298 25.0 7200 2.5614 0.3325 0.1721 0.3096 0.3097 0.8353 0.8342 6.4867 16 2 10.4784 2.7293
0.2196 26.0 7488 2.5920 0.3339 0.1734 0.308 0.3081 0.8357 0.8324 6.3177 15 1 10.2206 1.5921
0.2105 27.0 7776 2.5837 0.3411 0.1789 0.3183 0.3184 0.8363 0.8348 6.4261 17 2 10.3055 2.047
0.202 28.0 8064 2.6076 0.3392 0.1776 0.315 0.3153 0.8366 0.8339 6.3707 16 1 10.2487 1.8954
0.1926 29.0 8352 2.6015 0.3399 0.1769 0.3144 0.3143 0.836 0.8341 6.4458 17 2 10.4291 2.5777
0.1861 30.0 8640 2.6601 0.3379 0.1769 0.3131 0.3133 0.8362 0.8336 6.3836 17 2 10.3889 2.4261
0.1767 31.0 8928 2.6906 0.3397 0.1759 0.3147 0.3151 0.8367 0.8346 6.4511 17 2 10.3101 2.1228
0.1675 32.0 9216 2.7051 0.3338 0.1702 0.3094 0.3093 0.835 0.8328 6.4056 19 1 10.2866 1.9712
0.166 33.0 9504 2.7246 0.331 0.17 0.307 0.307 0.8346 0.8322 6.3366 17 2 10.2237 1.3647
0.1556 34.0 9792 2.7517 0.3325 0.1744 0.3096 0.3092 0.8349 0.8332 6.4564 17 1 10.4344 2.5019
0.1564 35.0 10080 2.7514 0.3416 0.1783 0.3173 0.317 0.8363 0.8365 6.6194 17 2 10.6171 2.6535
0.1532 36.0 10368 2.7337 0.3404 0.179 0.3163 0.3162 0.837 0.836 6.5625 17 2 10.5481 3.1842
0.1436 37.0 10656 2.7368 0.3405 0.1794 0.3166 0.317 0.8365 0.8354 6.5603 17 2 10.4845 2.881
0.1399 38.0 10944 2.7829 0.3373 0.1734 0.3111 0.3111 0.8352 0.8343 6.5497 17 2 10.5072 2.7293
0.1382 39.0 11232 2.8448 0.3329 0.1753 0.309 0.3087 0.8338 0.8333 6.5807 17 2 10.5011 3.0326
0.1338 40.0 11520 2.8211 0.3384 0.1742 0.3146 0.3148 0.8347 0.8342 6.5951 17 2 10.5307 2.4261
0.1306 41.0 11808 2.8201 0.341 0.1779 0.316 0.3164 0.8353 0.8352 6.6376 17 2 10.6353 2.881
0.128 42.0 12096 2.8552 0.346 0.1819 0.3205 0.3209 0.8368 0.8362 6.5299 17 2 10.5444 2.881
0.1227 43.0 12384 2.8669 0.3385 0.1753 0.3146 0.3152 0.8362 0.8347 6.4458 18 2 10.423 2.6535
0.1231 44.0 12672 2.8838 0.3371 0.1766 0.3145 0.3146 0.835 0.8343 6.5118 17 2 10.5186 2.1986
0.1185 45.0 12960 2.8436 0.3403 0.1772 0.3156 0.3158 0.8361 0.8358 6.5353 17 2 10.5671 2.6535
0.1142 46.0 13248 2.8723 0.3339 0.1727 0.3102 0.3104 0.8355 0.8334 6.4071 16 2 10.3669 1.8954
0.1147 47.0 13536 2.8299 0.3415 0.1798 0.3181 0.3182 0.8368 0.8357 6.4928 17 2 10.4981 2.3503
0.1098 48.0 13824 2.9572 0.3382 0.1766 0.3144 0.3148 0.8355 0.8342 6.5011 17 2 10.445 2.1986
0.1105 49.0 14112 2.8968 0.3364 0.1731 0.3123 0.3121 0.8363 0.8343 6.4276 17 2 10.3465 2.4261
0.1058 50.0 14400 2.9254 0.3405 0.1789 0.3168 0.3163 0.836 0.8342 6.4951 16 2 10.4587 2.5019
0.1043 51.0 14688 2.9681 0.3364 0.1765 0.3136 0.3134 0.8355 0.8336 6.4321 17 2 10.4496 1.9712
0.103 52.0 14976 2.9271 0.3327 0.1729 0.3096 0.3097 0.8354 0.8336 6.4428 16 2 10.3571 2.5777
0.1 53.0 15264 2.8983 0.3304 0.1706 0.3063 0.3066 0.8345 0.8328 6.4625 17 2 10.3973 2.3503
0.0985 54.0 15552 2.9333 0.3335 0.1741 0.3115 0.3115 0.8357 0.8332 6.3723 16 2 10.2737 2.047
0.0975 55.0 15840 2.9403 0.332 0.1695 0.3081 0.3081 0.8351 0.8329 6.4079 17 2 10.395 1.8196
0.0904 56.0 16128 2.9620 0.3359 0.1744 0.3112 0.3114 0.8355 0.8339 6.5019 16 2 10.4109 2.3503
0.0937 57.0 16416 2.9393 0.3328 0.173 0.3093 0.3091 0.8359 0.8329 6.3768 16 2 10.2873 2.047
0.0908 58.0 16704 2.9622 0.3334 0.1743 0.3107 0.3108 0.835 0.8327 6.4488 17 2 10.3503 2.047
0.0889 59.0 16992 2.9277 0.3357 0.1748 0.3121 0.3125 0.8358 0.8347 6.4632 17 2 10.4519 2.4261
0.0881 60.0 17280 2.9642 0.3369 0.1764 0.3133 0.3136 0.8358 0.8348 6.5572 16 2 10.5186 3.0326
0.0871 61.0 17568 2.9379 0.3357 0.1766 0.3136 0.3139 0.8367 0.8336 6.3427 17 2 10.2168 1.8954
0.0858 62.0 17856 2.9806 0.3345 0.1749 0.3119 0.3122 0.8359 0.8342 6.417 17 2 10.3586 1.8196
0.0849 63.0 18144 3.0425 0.3386 0.1774 0.3156 0.3165 0.8365 0.8343 6.4556 16 2 10.3707 2.3503
0.0829 64.0 18432 2.9927 0.3344 0.1719 0.3126 0.3126 0.8354 0.8332 6.442 17 2 10.4033 2.2745
0.0805 65.0 18720 3.0175 0.337 0.1752 0.3142 0.3148 0.8367 0.8335 6.3419 17 2 10.3055 1.6679
0.0796 66.0 19008 2.9937 0.3386 0.176 0.3153 0.3155 0.836 0.8356 6.6111 17 2 10.5724 3.0326
0.0829 67.0 19296 3.0281 0.3348 0.1747 0.3126 0.3125 0.8355 0.8344 6.4845 17 2 10.4299 2.7293
0.0781 68.0 19584 3.0455 0.3361 0.1761 0.3141 0.3142 0.8357 0.8352 6.5064 17 2 10.4799 2.6535
0.0769 69.0 19872 3.0361 0.3315 0.1723 0.3089 0.3092 0.8349 0.833 6.3829 17 2 10.3882 1.6679
0.0769 70.0 20160 3.1013 0.3356 0.1761 0.3131 0.313 0.8358 0.8345 6.4488 17 2 10.3662 2.047
0.0743 71.0 20448 3.0453 0.3332 0.1739 0.3123 0.3124 0.8357 0.8344 6.4579 17 2 10.4473 2.2745
0.0738 72.0 20736 3.0540 0.338 0.1744 0.3147 0.3145 0.836 0.8347 6.4958 17 2 10.5034 2.3503
0.0729 73.0 21024 3.1174 0.3338 0.1766 0.3108 0.3108 0.8355 0.8337 6.4488 17 2 10.4147 2.5019
0.0729 74.0 21312 3.0804 0.3333 0.173 0.3102 0.3102 0.8355 0.8335 6.4253 17 2 10.3616 1.9712
0.0736 75.0 21600 3.0825 0.3346 0.1724 0.3119 0.3118 0.8354 0.834 6.5133 17 2 10.4625 2.1228
0.0696 76.0 21888 3.0762 0.3279 0.169 0.305 0.3049 0.8336 0.8329 6.5686 17 2 10.558 2.5777
0.0692 77.0 22176 3.0773 0.333 0.1735 0.3113 0.3113 0.8353 0.8336 6.4731 17 2 10.442 2.047
0.0713 78.0 22464 3.1254 0.3343 0.174 0.3117 0.3117 0.8348 0.8338 6.5618 16 2 10.5095 2.5777
0.0677 79.0 22752 3.1311 0.3383 0.1758 0.3142 0.3138 0.8362 0.8339 6.4018 15 2 10.3222 1.5163
0.0673 80.0 23040 3.1401 0.3382 0.1772 0.3145 0.3146 0.8367 0.8353 6.467 16 2 10.4352 1.9712
0.0666 81.0 23328 3.1692 0.3409 0.177 0.3166 0.3168 0.8358 0.835 6.5262 16 2 10.4958 2.1986
0.0656 82.0 23616 3.1194 0.3387 0.1767 0.3149 0.3148 0.8363 0.8353 6.5019 16 2 10.4261 1.7437
0.0641 83.0 23904 3.1410 0.3374 0.1757 0.3135 0.3137 0.8366 0.8349 6.4845 16 2 10.3798 2.1986
0.0641 84.0 24192 3.1541 0.3331 0.177 0.3105 0.3102 0.8352 0.8334 6.3882 15 2 10.3503 1.7437
0.065 85.0 24480 3.0968 0.3371 0.1768 0.3128 0.3127 0.8365 0.8348 6.4268 15 2 10.3988 1.9712
0.0611 86.0 24768 3.1422 0.3354 0.1781 0.3114 0.3114 0.8367 0.8345 6.3836 16 2 10.3086 1.4405
0.0625 87.0 25056 3.0621 0.3364 0.1778 0.3123 0.3126 0.8367 0.8346 6.4617 17 2 10.4238 2.2745
0.0606 88.0 25344 3.1483 0.3347 0.1759 0.3105 0.3109 0.8355 0.8337 6.4519 17 2 10.3738 2.1228
0.0621 89.0 25632 3.1658 0.3369 0.1777 0.3128 0.3127 0.836 0.8343 6.4708 17 2 10.4314 2.047
0.0604 90.0 25920 3.1247 0.339 0.1776 0.3155 0.3155 0.8352 0.8352 6.6429 17 2 10.6823 3.1842
0.0605 91.0 26208 3.1539 0.3372 0.1763 0.3136 0.3135 0.8355 0.8348 6.5406 16 2 10.4905 2.047
0.0591 92.0 26496 3.1979 0.3376 0.1763 0.314 0.3142 0.8355 0.8348 6.5436 17 2 10.4579 2.5019
0.0609 93.0 26784 3.1765 0.3449 0.1817 0.32 0.3201 0.8369 0.8358 6.5709 17 2 10.5679 2.6535
0.0587 94.0 27072 3.1695 0.3365 0.1775 0.3122 0.3122 0.8352 0.8344 6.539 17 2 10.5565 2.2745
0.058 95.0 27360 3.2291 0.3359 0.1765 0.3134 0.3133 0.8354 0.8335 6.42 15 2 10.4344 2.4261
0.0576 96.0 27648 3.1870 0.3362 0.1785 0.3143 0.3148 0.8357 0.8344 6.4428 15 2 10.4541 2.4261
0.0556 97.0 27936 3.1846 0.3384 0.1785 0.3154 0.3155 0.8365 0.835 6.4685 17 2 10.3995 2.1228
0.0567 98.0 28224 3.2245 0.3384 0.175 0.3141 0.3145 0.8352 0.8344 6.5641 17 2 10.5497 2.9568
0.0555 99.0 28512 3.2033 0.3359 0.1758 0.3128 0.3131 0.8363 0.8346 6.4466 17 2 10.4049 1.9712
0.0542 100.0 28800 3.2297 0.3358 0.1776 0.3136 0.3139 0.8362 0.834 6.4109 17 2 10.3533 2.3503
0.0542 101.0 29088 3.2236 0.3363 0.1759 0.3138 0.3136 0.8351 0.834 6.4761 17 2 10.3685 2.2745
0.0536 102.0 29376 3.2448 0.3375 0.1776 0.3132 0.3133 0.8347 0.8346 6.5709 17 2 10.5368 2.8052
0.0519 103.0 29664 3.2170 0.3391 0.1772 0.3155 0.3154 0.8363 0.8347 6.4094 17 2 10.3343 2.1228
0.0512 104.0 29952 3.2527 0.3371 0.1767 0.3138 0.3135 0.8352 0.835 6.5754 17 2 10.5716 2.5777
0.0531 105.0 30240 3.2278 0.3348 0.1767 0.3111 0.311 0.8356 0.8337 6.3654 17 2 10.3351 1.4405
0.0508 106.0 30528 3.2414 0.3348 0.1757 0.3114 0.3113 0.8351 0.8341 6.4541 17 2 10.4208 2.5777
0.0507 107.0 30816 3.2424 0.3345 0.1718 0.3106 0.3108 0.8344 0.834 6.4951 17 2 10.4503 2.6535
0.0495 108.0 31104 3.2742 0.3377 0.177 0.3135 0.3137 0.8356 0.8345 6.4799 16 2 10.4086 2.3503
0.0506 109.0 31392 3.2576 0.34 0.1817 0.3168 0.3171 0.8362 0.8351 6.4905 16 2 10.4526 2.5777
0.05 110.0 31680 3.2930 0.3387 0.1792 0.3149 0.3155 0.8361 0.8351 6.5133 17 2 10.4488 2.3503
0.0482 111.0 31968 3.2736 0.3352 0.1783 0.3124 0.3127 0.8357 0.8341 6.4435 16 2 10.3753 2.1228
0.05 112.0 32256 3.2692 0.3401 0.1802 0.3159 0.3163 0.8354 0.8345 6.5133 15 2 10.4261 2.7293
0.0486 113.0 32544 3.2848 0.3358 0.1763 0.3119 0.3124 0.8352 0.8335 6.4238 17 2 10.345 1.7437
0.0466 114.0 32832 3.2980 0.3425 0.1813 0.3188 0.3189 0.8366 0.8351 6.4594 15 2 10.4382 2.3503
0.047 115.0 33120 3.2739 0.3377 0.1761 0.314 0.3144 0.8351 0.8342 6.4973 17 2 10.5019 2.2745
0.0473 116.0 33408 3.2383 0.3423 0.1812 0.3178 0.3182 0.8363 0.8362 6.5595 15 2 10.5641 2.5777
0.0462 117.0 33696 3.2486 0.3402 0.1806 0.3154 0.3158 0.8367 0.835 6.4306 15 2 10.395 2.1986
0.0469 118.0 33984 3.2500 0.3414 0.1806 0.3165 0.3168 0.8361 0.8351 6.4716 17 2 10.4549 2.2745
0.0455 119.0 34272 3.3102 0.3449 0.1826 0.3199 0.3198 0.8366 0.836 6.4936 16 2 10.4996 2.1228
0.0464 120.0 34560 3.3132 0.3386 0.1773 0.3146 0.3149 0.8357 0.8346 6.4511 16 2 10.4109 1.8954
0.0457 121.0 34848 3.2897 0.3379 0.1765 0.3133 0.3135 0.8358 0.8348 6.4852 16 2 10.4754 2.047
0.0458 122.0 35136 3.2925 0.3428 0.1802 0.3194 0.3193 0.8366 0.8362 6.5504 16 2 10.5201 2.881
0.0446 123.0 35424 3.3392 0.3414 0.1802 0.3167 0.3169 0.8365 0.8354 6.4936 17 2 10.4701 2.047
0.0435 124.0 35712 3.3514 0.3429 0.1797 0.3173 0.3174 0.8359 0.8352 6.5284 16 2 10.5201 2.1228
0.0438 125.0 36000 3.3272 0.3396 0.1791 0.3163 0.3161 0.8357 0.8345 6.4155 17 2 10.3753 1.5921
0.0435 126.0 36288 3.3365 0.34 0.1798 0.3167 0.3167 0.8358 0.8347 6.4829 16 2 10.442 2.5777
0.0437 127.0 36576 3.3296 0.3401 0.1791 0.3155 0.3157 0.8354 0.8349 6.5481 16 2 10.5603 2.5777
0.044 128.0 36864 3.3290 0.3382 0.1769 0.3137 0.314 0.8349 0.8342 6.5216 16 2 10.4936 2.1986
0.0442 129.0 37152 3.3040 0.3387 0.1754 0.3124 0.3125 0.8352 0.8346 6.5337 16 2 10.5102 2.4261
0.0428 130.0 37440 3.3722 0.3416 0.1798 0.3148 0.3153 0.8359 0.8352 6.5375 16 2 10.5315 2.9568
0.0419 131.0 37728 3.3569 0.3382 0.177 0.3145 0.3145 0.8353 0.8343 6.4981 16 2 10.4761 2.7293
0.0418 132.0 38016 3.3595 0.338 0.176 0.3124 0.3125 0.8351 0.8341 6.4913 16 2 10.4852 2.6535
0.0413 133.0 38304 3.3565 0.3392 0.1761 0.3136 0.3137 0.8353 0.8337 6.4746 16 2 10.4503 2.4261
0.0417 134.0 38592 3.3282 0.3369 0.1737 0.3113 0.3116 0.8347 0.8331 6.4466 16 2 10.4033 2.4261
0.0401 135.0 38880 3.3727 0.3349 0.1734 0.3115 0.3116 0.8351 0.8334 6.4488 16 2 10.442 2.881
0.039 136.0 39168 3.3777 0.3369 0.1753 0.3131 0.3131 0.8354 0.8332 6.4306 16 2 10.3624 2.4261
0.0409 137.0 39456 3.3585 0.339 0.1764 0.3152 0.3151 0.836 0.8341 6.4314 16 2 10.3791 2.1228
0.0406 138.0 39744 3.3704 0.3377 0.1764 0.3136 0.3139 0.8353 0.8335 6.4435 16 2 10.4011 2.1986
0.0399 139.0 40032 3.3839 0.3368 0.1759 0.3132 0.313 0.8355 0.8338 6.4541 16 2 10.4208 2.047
0.039 140.0 40320 3.3843 0.3352 0.174 0.3115 0.3117 0.8351 0.8335 6.4488 16 2 10.417 2.1986
0.0386 141.0 40608 3.3850 0.3355 0.1747 0.3117 0.312 0.8353 0.8337 6.4526 17 2 10.4306 2.2745
0.0373 142.0 40896 3.4000 0.336 0.1746 0.3115 0.312 0.8349 0.8334 6.4754 16 2 10.4526 2.1228
0.0385 143.0 41184 3.3840 0.3393 0.1763 0.3152 0.3152 0.8359 0.8343 6.4807 16 2 10.4511 2.2745
0.0384 144.0 41472 3.3823 0.3375 0.1754 0.3129 0.313 0.8355 0.8341 6.4731 16 2 10.4443 2.2745
0.0375 145.0 41760 3.3968 0.338 0.1754 0.313 0.3128 0.8351 0.8336 6.4685 16 2 10.4519 2.3503
0.0367 146.0 42048 3.3979 0.3366 0.174 0.3117 0.3119 0.835 0.8332 6.4496 16 2 10.4367 2.4261
0.0369 147.0 42336 3.4081 0.3357 0.1744 0.3117 0.3119 0.8349 0.8331 6.4496 16 2 10.4268 2.4261
0.0362 148.0 42624 3.4072 0.3348 0.1738 0.3106 0.311 0.8348 0.8329 6.4359 16 2 10.4185 2.3503
0.0364 149.0 42912 3.4105 0.3351 0.1741 0.3109 0.3115 0.8347 0.8328 6.4352 16 2 10.4139 2.4261
0.0369 150.0 43200 3.4100 0.3351 0.1743 0.3114 0.3116 0.8348 0.833 6.4435 16 2 10.4215 2.4261
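
Note that validation loss bottoms out at 1.7468 around epoch 4 and rises steadily thereafter while training loss keeps falling, a pattern consistent with overfitting; by validation loss, the final epoch-150 checkpoint is not the best one in the run.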

Framework versions

  • Transformers 4.33.1
  • PyTorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3