distilbert_new2_0040

This model is a fine-tuned version of a local checkpoint, `/content/drive/MyDrive/Colab Notebooks/oscar/trybackup_distilbert/new_backup_0105105` (a Google Drive path rather than a Hugging Face Hub repository, so no Hub link is available), on an unknown dataset. It achieves the following results on the evaluation set:

  • Train Loss: 0.9702
  • Validation Loss: 0.9482
  • Epoch: 39
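
As a quick start, the sketch below loads the checkpoint for inference. It assumes the model carries a masked-language-modeling head (the training task is not stated in this card) and that `distilbert_new2_0040` names the local directory or Hub id the model was saved under; both are assumptions for illustration.

```python
# Minimal inference sketch. Assumptions: the checkpoint has a
# masked-language-modeling head, and "distilbert_new2_0040" is the
# local directory or Hub id it was saved under (neither is stated above).
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForMaskedLM

checkpoint = "distilbert_new2_0040"  # hypothetical location of this model

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = TFAutoModelForMaskedLM.from_pretrained(checkpoint)

inputs = tokenizer("The capital of France is [MASK].", return_tensors="tf")
logits = model(**inputs).logits  # shape: (batch, seq_len, vocab_size)

# Report the top prediction for the masked position.
mask_index = tf.where(inputs["input_ids"][0] == tokenizer.mask_token_id)[0, 0]
predicted_id = tf.argmax(logits[0, mask_index])
print(tokenizer.decode([int(predicted_id)]))
```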

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
  • training_precision: float32
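
For reference, an optimizer with these hyperparameters can be reconstructed with the TF `AdamWeightDecay` class shipped with Transformers; a minimal sketch follows (the weight-decay exclusion list is an assumption, as the card does not record one):

```python
# Sketch: rebuild the training optimizer from the hyperparameters above.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
    # Excluding bias/LayerNorm parameters from decay is the usual convention;
    # this list is an assumption, not recorded in the card.
    exclude_from_weight_decay=["LayerNorm", "layer_norm", "bias"],
)
# model.compile(optimizer=optimizer)  # `model` as in the inference sketch above
```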

Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 1.0180     | 0.9873          | 0     |
| 1.0163     | 0.9878          | 1     |
| 1.0145     | 0.9856          | 2     |
| 1.0139     | 0.9830          | 3     |
| 1.0122     | 0.9831          | 4     |
| 1.0118     | 0.9830          | 5     |
| 1.0094     | 0.9800          | 6     |
| 1.0075     | 0.9809          | 7     |
| 1.0066     | 0.9784          | 8     |
| 1.0062     | 0.9768          | 9     |
| 1.0032     | 0.9751          | 10    |
| 1.0023     | 0.9764          | 11    |
| 1.0008     | 0.9735          | 12    |
| 0.9994     | 0.9730          | 13    |
| 0.9986     | 0.9761          | 14    |
| 0.9975     | 0.9714          | 15    |
| 0.9953     | 0.9708          | 16    |
| 0.9941     | 0.9683          | 17    |
| 0.9933     | 0.9681          | 18    |
| 0.9920     | 0.9688          | 19    |
| 0.9907     | 0.9648          | 20    |
| 0.9897     | 0.9625          | 21    |
| 0.9890     | 0.9642          | 22    |
| 0.9873     | 0.9633          | 23    |
| 0.9867     | 0.9618          | 24    |
| 0.9857     | 0.9600          | 25    |
| 0.9839     | 0.9598          | 26    |
| 0.9827     | 0.9585          | 27    |
| 0.9821     | 0.9607          | 28    |
| 0.9809     | 0.9579          | 29    |
| 0.9803     | 0.9561          | 30    |
| 0.9786     | 0.9563          | 31    |
| 0.9774     | 0.9536          | 32    |
| 0.9766     | 0.9542          | 33    |
| 0.9756     | 0.9523          | 34    |
| 0.9743     | 0.9525          | 35    |
| 0.9730     | 0.9513          | 36    |
| 0.9721     | 0.9507          | 37    |
| 0.9715     | 0.9506          | 38    |
| 0.9702     | 0.9482          | 39    |
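
Training loss decreases monotonically and validation loss is still decreasing at the final epoch. A minimal sketch to visualize the curves, with both columns copied from the table above:

```python
# Sketch: plot the per-epoch losses, values copied from the table above.
import matplotlib.pyplot as plt

train_loss = [
    1.0180, 1.0163, 1.0145, 1.0139, 1.0122, 1.0118, 1.0094, 1.0075,
    1.0066, 1.0062, 1.0032, 1.0023, 1.0008, 0.9994, 0.9986, 0.9975,
    0.9953, 0.9941, 0.9933, 0.9920, 0.9907, 0.9897, 0.9890, 0.9873,
    0.9867, 0.9857, 0.9839, 0.9827, 0.9821, 0.9809, 0.9803, 0.9786,
    0.9774, 0.9766, 0.9756, 0.9743, 0.9730, 0.9721, 0.9715, 0.9702,
]
val_loss = [
    0.9873, 0.9878, 0.9856, 0.9830, 0.9831, 0.9830, 0.9800, 0.9809,
    0.9784, 0.9768, 0.9751, 0.9764, 0.9735, 0.9730, 0.9761, 0.9714,
    0.9708, 0.9683, 0.9681, 0.9688, 0.9648, 0.9625, 0.9642, 0.9633,
    0.9618, 0.9600, 0.9598, 0.9585, 0.9607, 0.9579, 0.9561, 0.9563,
    0.9536, 0.9542, 0.9523, 0.9525, 0.9513, 0.9507, 0.9506, 0.9482,
]

plt.plot(range(40), train_loss, label="train loss")
plt.plot(range(40), val_loss, label="validation loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```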

Framework versions

  • Transformers 4.20.1
  • TensorFlow 2.8.2
  • Datasets 2.3.2
  • Tokenizers 0.12.1