---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: test_trainer1
  results: []
---

# test_trainer1

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Rouge1: 0.8111
- Rouge2: 0.8008
- Rougel: 0.812
- Rougelsum: 0.8109
- Gen Len: 18.5

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0056
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.99) and epsilon=1e-06
- lr_scheduler_type: linear
- num_epochs: 40
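For reproducibility, below is a minimal sketch of `Seq2SeqTrainingArguments` matching the list above. The `output_dir`, `evaluation_strategy`, and `predict_with_generate` values are assumptions (inferred from the model name and the per-epoch ROUGE / Gen Len metrics), not confirmed from the original training script.

```python
from transformers import Seq2SeqTrainingArguments

# Reconstruction of the hyperparameters listed above; values not listed
# there (output_dir, evaluation_strategy, predict_with_generate) are guesses.
training_args = Seq2SeqTrainingArguments(
    output_dir="test_trainer1",    # assumed from the model name
    learning_rate=0.0056,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.99,
    adam_epsilon=1e-06,
    lr_scheduler_type="linear",
    num_train_epochs=40,
    evaluation_strategy="epoch",   # assumed: metrics above are reported once per epoch
    predict_with_generate=True,    # assumed: required to compute ROUGE and Gen Len
)
```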
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log | 1.0 | 13 | 0.3042 | 0.7518 | 0.7064 | 0.7515 | 0.7499 | 18.2 |
| No log | 2.0 | 26 | 0.0621 | 0.7853 | 0.7648 | 0.7778 | 0.778 | 18.4667 |
| No log | 3.0 | 39 | 0.0600 | 0.7809 | 0.7539 | 0.7793 | 0.7794 | 18.3333 |
| No log | 4.0 | 52 | 0.0293 | 0.8073 | 0.7961 | 0.8076 | 0.8069 | 18.4 |
| No log | 5.0 | 65 | 0.0304 | 0.8053 | 0.7881 | 0.803 | 0.8027 | 18.4667 |
| No log | 6.0 | 78 | 0.0167 | 0.7787 | 0.7634 | 0.7794 | 0.7792 | 18.7 |
| No log | 7.0 | 91 | 0.0203 | 0.8076 | 0.7952 | 0.8083 | 0.8072 | 18.5333 |
| No log | 8.0 | 104 | 0.0418 | 0.7722 | 0.7493 | 0.7711 | 0.7695 | 18.7667 |
| No log | 9.0 | 117 | 0.0153 | 0.799 | 0.7804 | 0.7969 | 0.7964 | 18.4 |
| No log | 10.0 | 130 | 0.0225 | 0.7963 | 0.7804 | 0.7968 | 0.7952 | 18.5 |
| No log | 11.0 | 143 | 0.0119 | 0.7832 | 0.7676 | 0.784 | 0.7837 | 18.5 |
| No log | 12.0 | 156 | 0.0118 | 0.8023 | 0.7863 | 0.8024 | 0.8011 | 18.5 |
| No log | 13.0 | 169 | 0.0411 | 0.8019 | 0.7916 | 0.8034 | 0.8025 | 18.2667 |
| No log | 14.0 | 182 | 0.0048 | 0.8017 | 0.791 | 0.8029 | 0.8022 | 18.5 |
| No log | 15.0 | 195 | 0.0038 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 16.0 | 208 | 0.0080 | 0.8091 | 0.7967 | 0.8093 | 0.8086 | 18.5 |
| No log | 17.0 | 221 | 0.0046 | 0.8092 | 0.7967 | 0.8103 | 0.8095 | 18.5 |
| No log | 18.0 | 234 | 0.0023 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 19.0 | 247 | 0.0097 | 0.8105 | 0.799 | 0.8116 | 0.8105 | 18.5 |
| No log | 20.0 | 260 | 0.0024 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 21.0 | 273 | 0.0018 | 0.8111 | 0.7995 | 0.812 | 0.8109 | 18.5 |
| No log | 22.0 | 286 | 0.0030 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 23.0 | 299 | 0.0042 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 24.0 | 312 | 0.0065 | 0.8102 | 0.8 | 0.8114 | 0.8099 | 18.5 |
| No log | 25.0 | 325 | 0.0004 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 26.0 | 338 | 0.0001 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 27.0 | 351 | 0.0001 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 28.0 | 364 | 0.0010 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 29.0 | 377 | 0.0002 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 30.0 | 390 | 0.0001 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 31.0 | 403 | 0.0020 | 0.8093 | 0.7975 | 0.8103 | 0.8089 | 18.5 |
| No log | 32.0 | 416 | 0.0014 | 0.8093 | 0.7975 | 0.8103 | 0.8089 | 18.5 |
| No log | 33.0 | 429 | 0.0001 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 34.0 | 442 | 0.0000 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 35.0 | 455 | 0.0000 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 36.0 | 468 | 0.0000 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 37.0 | 481 | 0.0000 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| No log | 38.0 | 494 | 0.0000 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| 0.068 | 39.0 | 507 | 0.0000 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |
| 0.068 | 40.0 | 520 | 0.0000 | 0.8111 | 0.8008 | 0.812 | 0.8109 | 18.5 |

### Framework versions

- Transformers 4.33.2
- Pytorch 2.0.0
- Datasets 2.14.5
- Tokenizers 0.13.3
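### Usage

A minimal inference sketch. The repo id `<user>/test_trainer1` is a placeholder (substitute the actual Hub repo id or a local checkpoint path), and the `summarize:` task prefix is an assumption based on t5-small's conventions; neither is documented above.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder repo id: replace with the real Hub id or a local path.
model_id = "<user>/test_trainer1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# "summarize:" prefix assumed from T5 conventions; the training task is undocumented.
inputs = tokenizer("summarize: <your input text>", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)  # eval Gen Len above is ~18.5
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```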