---
tags:
- summarization
- generated_from_trainer
model-index:
- name: led-risalah_data_v2
  results: []
---
# led-risalah_data_v2
This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.6785
- Rouge1 Precision: 0.6665
- Rouge1 Recall: 0.1816
- Rouge1 Fmeasure: 0.284
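For reference, ROUGE-1 F-measure is the harmonic mean of the corresponding precision and recall; scores are typically computed per example and then averaged, which is why the aggregate F-measure above is close to, but not exactly, the harmonic mean of the aggregate precision and recall:

$$F_1 = \frac{2 \cdot P \cdot R}{P + R}$$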
## Model description
More information needed
## Intended uses & limitations
More information needed
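That said, given the `summarization` tag and the LED-style model name, inference would presumably follow the standard Transformers seq2seq pattern. The sketch below is a hypothetical example: the hub id `username/led-risalah_data_v2`, the input text, and the generation settings are placeholders, not taken from this card.

```python
# Hypothetical inference sketch; the hub id and input text are placeholders,
# not taken from this card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "username/led-risalah_data_v2"  # replace with the actual repository path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

document = "Long input document to summarize..."  # placeholder text
inputs = tokenizer(document, return_tensors="pt", truncation=True, max_length=4096)
summary_ids = model.generate(**inputs, max_length=256, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```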
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
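As a rough guide, these values could be expressed as `Seq2SeqTrainingArguments`; this is a minimal sketch, and the output directory and any argument not listed above are assumptions rather than details from the original training script.

```python
# Hypothetical reconstruction of the hyperparameters above as Seq2SeqTrainingArguments;
# output_dir is assumed, not taken from the original script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="led-risalah_data_v2",   # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=8,      # gives the total train batch size of 8
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,                          # mixed_precision_training: Native AMP
)
```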
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 Fmeasure | Rouge1 Precision | Rouge1 Recall |
|:-------------:|:-----:|:----:|:---------------:|:---------------:|:----------------:|:-------------:|
| 2.4597 | 0.91 | 8 | 1.8034 | 0.1951 | 0.4699 | 0.1246 |
| 1.7706 | 1.94 | 17 | 1.6403 | 0.2451 | 0.6043 | 0.1554 |
| 1.5072 | 2.97 | 26 | 1.5947 | 0.2628 | 0.6236 | 0.1676 |
| 1.4018 | 4.0 | 35 | 1.5688 | 0.2789 | 0.656 | 0.1782 |
| 1.2761 | 4.91 | 43 | 1.5454 | 0.2723 | 0.6434 | 0.1736 |
| 1.1779 | 5.94 | 52 | 1.5636 | 0.2889 | 0.6794 | 0.1843 |
| 1.1235 | 6.97 | 61 | 1.5430 | 0.2965 | 0.6913 | 0.1902 |
| 1.0529 | 8.0 | 70 | 1.5639 | 0.2829 | 0.6705 | 0.1805 |
| 0.9883 | 8.91 | 78 | 1.5740 | 0.2817 | 0.6757 | 0.1798 |
| 0.9274 | 9.94 | 87 | 1.5793 | 0.2771 | 0.6623 | 0.1764 |
| 0.925 | 10.97 | 96 | 1.6072 | 0.2821 | 0.665 | 0.18 |
| 0.858 | 12.0 | 105 | 1.6129 | 0.284 | 0.6625 | 0.1817 |
| 0.8182 | 12.91 | 113 | 1.6396 | 0.2765 | 0.6567 | 0.1761 |
| 0.7974 | 13.94 | 122 | 1.6445 | 0.2759 | 0.659 | 0.1759 |
| 0.7524 | 14.97 | 131 | 1.6585 | 0.2763 | 0.6585 | 0.1759 |
| 0.7743 | 16.0 | 140 | 1.6779 | 0.2788 | 0.6594 | 0.1779 |
| 0.7486 | 16.91 | 148 | 1.6742 | 0.2851 | 0.6666 | 0.1819 |
| 0.676 | 17.94 | 157 | 1.6790 | 0.2859 | 0.6707 | 0.1827 |
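The Rouge1 precision, recall, and F-measure columns come from periodic evaluation during training. A minimal sketch of computing per-example ROUGE-1 scores and averaging them with the `rouge_score` package is shown below; the prediction and reference strings are placeholders, and this is an assumption about the metric implementation, not the original evaluation code.

```python
# Illustrative ROUGE-1 computation with the rouge_score package; the prediction and
# reference strings are placeholders, not drawn from the evaluation data.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1"], use_stemmer=True)

predictions = ["a generated summary"]
references = ["a reference summary"]

scores = [scorer.score(ref, pred)["rouge1"] for ref, pred in zip(references, predictions)]
print("precision:", sum(s.precision for s in scores) / len(scores))
print("recall:   ", sum(s.recall for s in scores) / len(scores))
print("fmeasure: ", sum(s.fmeasure for s in scores) / len(scores))
```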
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.15.1
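To check that a local environment matches this setup, a small sketch such as the following can be used; the card does not specify how the environment was installed, so only the package versions themselves are taken from the list above.

```python
# Quick check that the local environment matches the versions listed above.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)  # expected 4.35.2
print("PyTorch:     ", torch.__version__)         # expected 2.1.1+cu121
print("Datasets:    ", datasets.__version__)      # expected 2.14.5
print("Tokenizers:  ", tokenizers.__version__)    # expected 0.15.1
```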