---
license: mit
base_model: microsoft/deberta-v3-base
tags:
  - generated_from_trainer
metrics:
  - f1
  - accuracy
  - precision
  - recall
model-index:
  - name: 010-microsoft-deberta-v3-base-finetuned-yahoo-800_200
    results: []
---

# 010-microsoft-deberta-v3-base-finetuned-yahoo-800_200

This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unspecified dataset (the model name suggests a Yahoo question-topic dataset with an 800/200 train/eval split). It achieves the following results on the evaluation set:

- Loss: 1.1327
- F1: 0.6339
- Accuracy: 0.64
- Precision: 0.6436
- Recall: 0.64
- System RAM used: 4.1191 GB
- System RAM total: 83.4807 GB
- GPU RAM allocated: 2.0916 GB
- GPU RAM cached: 24.6602 GB
- GPU RAM total: 39.5640 GB
- GPU utilization: 46%
- Disk space used: 42.7346 GB
- Disk space total: 78.1898 GB
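
For reference, a minimal usage sketch with the `transformers` library is shown below. The repository id `diogopaes10/010-microsoft-deberta-v3-base-finetuned-yahoo-800_200` is an assumption inferred from this card's author and model name, not confirmed by it.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Repo id assumed from the card's author and model name; not confirmed by the card.
model_id = "diogopaes10/010-microsoft-deberta-v3-base-finetuned-yahoo-800_200"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer(
    "What is the best way to learn a new language?",
    return_tensors="pt",
    truncation=True,
)
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its label name (or LABEL_i if unnamed).
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```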

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto `TrainingArguments` follows this list):

- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
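
A minimal sketch of how these settings map onto the `transformers` `Trainer` API (the 4.31-era signature). The dataset here is a stand-in, since the card does not name the real one; `num_labels=10` and the every-10-steps evaluation cadence are inferences from the results table below, not stated facts. The listed Adam settings are the `Trainer` defaults, so no explicit optimizer is needed.

```python
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "microsoft/deberta-v3-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# num_labels=10 is an assumption (early-epoch accuracy of 0.1 suggests ten classes).
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=10)

# Placeholder data: the card does not name the actual dataset.
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

train_dataset = Dataset.from_dict(
    {"text": ["example question"] * 32, "label": [0] * 32}
).map(tokenize, batched=True)
eval_dataset = train_dataset

args = TrainingArguments(
    output_dir="010-microsoft-deberta-v3-base-finetuned-yahoo-800_200",
    learning_rate=2e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",   # linear decay, as listed above
    evaluation_strategy="steps",  # evaluating every 10 steps matches the log below
    eval_steps=10,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```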

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 | Accuracy | Precision | Recall | System RAM Used (GB) | System RAM Total (GB) | GPU RAM Allocated (GB) | GPU RAM Cached (GB) | GPU RAM Total (GB) | GPU Utilization (%) | Disk Space Used (GB) | Disk Space Total (GB) |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 2.3122 | 0.4 | 10 | 2.3038 | 0.0182 | 0.1 | 0.01 | 0.1 | 3.9481 | 83.4807 | 2.0915 | 24.6484 | 39.5640 | 44 | 42.7345 | 78.1898 |
| 2.3122 | 0.8 | 20 | 2.3008 | 0.0182 | 0.1 | 0.01 | 0.1 | 3.9500 | 83.4807 | 2.0916 | 24.6602 | 39.5640 | 64 | 42.7345 | 78.1898 |
| 2.3122 | 1.2 | 30 | 2.2951 | 0.0182 | 0.1 | 0.01 | 0.1 | 3.9885 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 44 | 42.7345 | 78.1898 |
| 2.3122 | 1.6 | 40 | 2.2860 | 0.0830 | 0.15 | 0.0948 | 0.15 | 4.0161 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 43 | 42.7345 | 78.1898 |
| 2.3122 | 2.0 | 50 | 2.2335 | 0.0916 | 0.195 | 0.1010 | 0.195 | 4.0651 | 83.4807 | 2.0916 | 24.6602 | 39.5640 | 43 | 42.7345 | 78.1898 |
| 2.3122 | 2.4 | 60 | 2.1085 | 0.2197 | 0.295 | 0.2090 | 0.295 | 4.0829 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 42 | 42.7345 | 78.1898 |
| 2.3122 | 2.8 | 70 | 1.9703 | 0.2923 | 0.33 | 0.3951 | 0.33 | 4.1017 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 47 | 42.7345 | 78.1898 |
| 2.3122 | 3.2 | 80 | 1.8818 | 0.3441 | 0.395 | 0.4073 | 0.395 | 4.1170 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 49 | 42.7345 | 78.1898 |
| 2.3122 | 3.6 | 90 | 1.7649 | 0.4158 | 0.44 | 0.4853 | 0.44 | 4.1182 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 45 | 42.7345 | 78.1898 |
| 2.3122 | 4.0 | 100 | 1.6408 | 0.5143 | 0.53 | 0.5429 | 0.53 | 4.1156 | 83.4807 | 2.0916 | 24.6602 | 39.5640 | 48 | 42.7345 | 78.1898 |
| 2.3122 | 4.4 | 110 | 1.5896 | 0.5167 | 0.535 | 0.5320 | 0.535 | 4.1162 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 46 | 42.7345 | 78.1898 |
| 2.3122 | 4.8 | 120 | 1.4783 | 0.5627 | 0.575 | 0.5692 | 0.575 | 4.1160 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 51 | 42.7345 | 78.1898 |
| 2.3122 | 5.2 | 130 | 1.3900 | 0.5844 | 0.595 | 0.6033 | 0.595 | 4.1169 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 57 | 42.7345 | 78.1898 |
| 2.3122 | 5.6 | 140 | 1.3547 | 0.6052 | 0.625 | 0.6127 | 0.625 | 4.1181 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 46 | 42.7345 | 78.1898 |
| 2.3122 | 6.0 | 150 | 1.2983 | 0.6032 | 0.6 | 0.6455 | 0.6 | 4.0997 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 48 | 42.7345 | 78.1898 |
| 2.3122 | 6.4 | 160 | 1.2805 | 0.5972 | 0.615 | 0.6058 | 0.615 | 4.1017 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 55 | 42.7345 | 78.1898 |
| 2.3122 | 6.8 | 170 | 1.2105 | 0.6213 | 0.62 | 0.6325 | 0.62 | 4.1238 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 50 | 42.7345 | 78.1898 |
| 2.3122 | 7.2 | 180 | 1.2458 | 0.5944 | 0.615 | 0.5958 | 0.615 | 4.1257 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 45 | 42.7345 | 78.1898 |
| 2.3122 | 7.6 | 190 | 1.1695 | 0.6629 | 0.665 | 0.6736 | 0.665 | 4.1261 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 52 | 42.7345 | 78.1898 |
| 2.3122 | 8.0 | 200 | 1.1737 | 0.6383 | 0.645 | 0.6425 | 0.645 | 4.1259 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 54 | 42.7345 | 78.1898 |
| 2.3122 | 8.4 | 210 | 1.1540 | 0.6347 | 0.635 | 0.6418 | 0.635 | 4.1258 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 47 | 42.7345 | 78.1898 |
| 2.3122 | 8.8 | 220 | 1.1422 | 0.6322 | 0.64 | 0.6413 | 0.64 | 4.1251 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 50 | 42.7346 | 78.1898 |
| 2.3122 | 9.2 | 230 | 1.1422 | 0.6443 | 0.65 | 0.6575 | 0.65 | 4.1251 | 83.4807 | 2.0916 | 24.6602 | 39.5640 | 47 | 42.7346 | 78.1898 |
| 2.3122 | 9.6 | 240 | 1.1345 | 0.6345 | 0.64 | 0.6483 | 0.64 | 4.1032 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 44 | 42.7346 | 78.1898 |
| 2.3122 | 10.0 | 250 | 1.1327 | 0.6339 | 0.64 | 0.6436 | 0.64 | 4.1084 | 83.4807 | 2.0915 | 24.6602 | 39.5640 | 44 | 42.7346 | 78.1898 |
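
As an aside on how the metric columns were likely produced: accuracy and recall are identical in every row, which is exactly what weighted averaging yields (weighted recall reduces to overall accuracy). A plausible reconstruction of the metric function, inferred from the numbers rather than stated in the card:

```python
import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score)

def compute_metrics(eval_pred):
    """Plausible metric function; weighted averaging is an assumption."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "f1": f1_score(labels, preds, average="weighted"),
        "accuracy": accuracy_score(labels, preds),
        "precision": precision_score(labels, preds, average="weighted",
                                     zero_division=0),
        "recall": recall_score(labels, preds, average="weighted"),
    }
```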

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3
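
When reproducing results, the installed versions can be checked against the list above with a small sanity-check snippet (the expected values come straight from that list):

```python
import datasets
import tokenizers
import torch
import transformers

# Expected versions, taken from the "Framework versions" list above.
expected = {
    "transformers": ("4.31.0", transformers.__version__),
    "torch": ("2.0.1+cu118", torch.__version__),
    "datasets": ("2.13.1", datasets.__version__),
    "tokenizers": ("0.13.3", tokenizers.__version__),
}
for name, (want, have) in expected.items():
    status = "OK" if have == want else f"mismatch (have {have})"
    print(f"{name}: expected {want} -> {status}")
```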