---
license: apache-2.0
base_model: bert-base-multilingual-cased
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: bg
    results: []
---

# bg

This model is an adapter fine-tuned on top of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on the Bulgarian ConceptNet dataset. It achieves the following results on the evaluation set:

- Loss: 0.4640
- Accuracy: 0.8875
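As a rough usage sketch (assuming the adapter was trained with the AdapterHub `adapters` library, which this card does not state, and that the repository id below is a guess based on the card's name), the adapter could be loaded like this:

```python
def load_bg_adapter(adapter_id="DGurgurov/bg"):
    """Load the mBERT base model and activate the Bulgarian adapter.

    The library choice and the adapter_id are assumptions, not confirmed
    by this card; adjust both to match the actual release.
    """
    from adapters import AutoAdapterModel  # pip install adapters

    model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")
    name = model.load_adapter(adapter_id)   # fetches the adapter weights
    model.set_active_adapters(name)         # route the forward pass through it
    return model
```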

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 50000
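For reference, a linear scheduler decays the learning rate from its base value to zero over the run; a minimal sketch of the schedule implied above, assuming no warmup (the card does not specify any):

```python
def linear_lr(step, base_lr=5e-5, total_steps=50_000):
    """Learning rate at a given step under a warmup-free linear decay.

    Sketch only: the actual run may have used warmup steps, which would
    prepend a ramp-up phase before this decay.
    """
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Midway through training the rate has halved; at the end it reaches zero.
mid = linear_lr(25_000)   # 2.5e-05
end = linear_lr(50_000)   # 0.0
```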

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.5057        | 0.15  | 500   | 0.9846          | 0.8149   |
| 1.0172        | 0.31  | 1000  | 0.8395          | 0.8259   |
| 0.8814        | 0.46  | 1500  | 0.7823          | 0.8368   |
| 0.8405        | 0.61  | 2000  | 0.7437          | 0.8449   |
| 0.7773        | 0.77  | 2500  | 0.7247          | 0.8387   |
| 0.7762        | 0.92  | 3000  | 0.6521          | 0.8513   |
| 0.7186        | 1.07  | 3500  | 0.6834          | 0.8492   |
| 0.7033        | 1.22  | 4000  | 0.6715          | 0.8523   |
| 0.672         | 1.38  | 4500  | 0.6539          | 0.8560   |
| 0.6613        | 1.53  | 5000  | 0.6387          | 0.8567   |
| 0.6712        | 1.68  | 5500  | 0.6180          | 0.8624   |
| 0.6776        | 1.84  | 6000  | 0.6635          | 0.8537   |
| 0.6484        | 1.99  | 6500  | 0.5946          | 0.8661   |
| 0.6817        | 2.14  | 7000  | 0.6126          | 0.8655   |
| 0.6392        | 2.3   | 7500  | 0.6136          | 0.8613   |
| 0.6394        | 2.45  | 8000  | 0.6321          | 0.8621   |
| 0.6273        | 2.6   | 8500  | 0.5997          | 0.8629   |
| 0.5993        | 2.76  | 9000  | 0.6028          | 0.8646   |
| 0.6527        | 2.91  | 9500  | 0.6584          | 0.8510   |
| 0.5897        | 3.06  | 10000 | 0.5728          | 0.8676   |
| 0.574         | 3.21  | 10500 | 0.5870          | 0.8671   |
| 0.6026        | 3.37  | 11000 | 0.6067          | 0.8677   |
| 0.5896        | 3.52  | 11500 | 0.6000          | 0.8638   |
| 0.566         | 3.67  | 12000 | 0.5566          | 0.8712   |
| 0.5928        | 3.83  | 12500 | 0.5621          | 0.8675   |
| 0.597         | 3.98  | 13000 | 0.5162          | 0.8771   |
| 0.5836        | 4.13  | 13500 | 0.5498          | 0.8696   |
| 0.5864        | 4.29  | 14000 | 0.5728          | 0.8640   |
| 0.5562        | 4.44  | 14500 | 0.6000          | 0.8623   |
| 0.5999        | 4.59  | 15000 | 0.5589          | 0.8679   |
| 0.5767        | 4.75  | 15500 | 0.5713          | 0.8681   |
| 0.5574        | 4.9   | 16000 | 0.5338          | 0.8739   |
| 0.568         | 5.05  | 16500 | 0.5527          | 0.8725   |
| 0.5568        | 5.21  | 17000 | 0.5058          | 0.8777   |
| 0.5369        | 5.36  | 17500 | 0.5599          | 0.8720   |
| 0.518         | 5.51  | 18000 | 0.5610          | 0.8720   |
| 0.5637        | 5.66  | 18500 | 0.5467          | 0.8728   |
| 0.557         | 5.82  | 19000 | 0.5349          | 0.8714   |
| 0.5499        | 5.97  | 19500 | 0.5468          | 0.8724   |
| 0.5304        | 6.12  | 20000 | 0.5243          | 0.8741   |
| 0.5431        | 6.28  | 20500 | 0.4998          | 0.8784   |
| 0.5508        | 6.43  | 21000 | 0.5367          | 0.8764   |
| 0.5701        | 6.58  | 21500 | 0.5365          | 0.8734   |
| 0.521         | 6.74  | 22000 | 0.4879          | 0.8819   |
| 0.5514        | 6.89  | 22500 | 0.5106          | 0.8787   |
| 0.547         | 7.04  | 23000 | 0.5258          | 0.8747   |
| 0.5512        | 7.2   | 23500 | 0.4975          | 0.8778   |
| 0.5407        | 7.35  | 24000 | 0.4944          | 0.8786   |
| 0.5181        | 7.5   | 24500 | 0.4912          | 0.8795   |
| 0.5493        | 7.65  | 25000 | 0.5188          | 0.8730   |
| 0.5388        | 7.81  | 25500 | 0.5000          | 0.8831   |
| 0.5284        | 7.96  | 26000 | 0.5161          | 0.8737   |
| 0.5116        | 8.11  | 26500 | 0.5263          | 0.8760   |
| 0.5161        | 8.27  | 27000 | 0.5002          | 0.8787   |
| 0.5185        | 8.42  | 27500 | 0.5127          | 0.8745   |
| 0.5291        | 8.57  | 28000 | 0.5116          | 0.8782   |
| 0.5061        | 8.73  | 28500 | 0.4972          | 0.8774   |
| 0.479         | 8.88  | 29000 | 0.4978          | 0.8798   |
| 0.5154        | 9.03  | 29500 | 0.5088          | 0.8771   |
| 0.4989        | 9.19  | 30000 | 0.5119          | 0.8744   |
| 0.5098        | 9.34  | 30500 | 0.4916          | 0.8826   |
| 0.4777        | 9.49  | 31000 | 0.4957          | 0.8824   |
| 0.5462        | 9.64  | 31500 | 0.4846          | 0.8779   |
| 0.509         | 9.8   | 32000 | 0.4873          | 0.8810   |
| 0.5181        | 9.95  | 32500 | 0.5227          | 0.8710   |
| 0.5269        | 10.1  | 33000 | 0.4929          | 0.8803   |
| 0.5094        | 10.26 | 33500 | 0.4841          | 0.8877   |
| 0.5033        | 10.41 | 34000 | 0.5129          | 0.8805   |
| 0.4913        | 10.56 | 34500 | 0.4978          | 0.8789   |
| 0.4938        | 10.72 | 35000 | 0.4640          | 0.8838   |
| 0.4954        | 10.87 | 35500 | 0.4991          | 0.8794   |
| 0.458         | 11.02 | 36000 | 0.4453          | 0.8886   |
| 0.526         | 11.18 | 36500 | 0.4863          | 0.8832   |
| 0.4809        | 11.33 | 37000 | 0.4923          | 0.8784   |
| 0.466         | 11.48 | 37500 | 0.4824          | 0.8807   |
| 0.4903        | 11.64 | 38000 | 0.4552          | 0.8848   |
| 0.4875        | 11.79 | 38500 | 0.4850          | 0.8780   |
| 0.4858        | 11.94 | 39000 | 0.4728          | 0.8833   |
| 0.4868        | 12.09 | 39500 | 0.4868          | 0.8800   |
| 0.485         | 12.25 | 40000 | 0.4935          | 0.8802   |
| 0.4823        | 12.4  | 40500 | 0.4789          | 0.8828   |
| 0.4629        | 12.55 | 41000 | 0.4834          | 0.8835   |
| 0.4915        | 12.71 | 41500 | 0.4864          | 0.8812   |
| 0.473         | 12.86 | 42000 | 0.5136          | 0.8793   |
| 0.4849        | 13.01 | 42500 | 0.4823          | 0.8815   |
| 0.4582        | 13.17 | 43000 | 0.4637          | 0.8844   |
| 0.4938        | 13.32 | 43500 | 0.4829          | 0.8842   |
| 0.4682        | 13.47 | 44000 | 0.4799          | 0.8817   |
| 0.4885        | 13.63 | 44500 | 0.4754          | 0.8858   |
| 0.4641        | 13.78 | 45000 | 0.4738          | 0.8849   |
| 0.4664        | 13.93 | 45500 | 0.4512          | 0.8869   |
| 0.4722        | 14.08 | 46000 | 0.4821          | 0.8836   |
| 0.485         | 14.24 | 46500 | 0.4735          | 0.8842   |
| 0.4784        | 14.39 | 47000 | 0.4557          | 0.8823   |
| 0.4821        | 14.54 | 47500 | 0.4707          | 0.8856   |
| 0.478         | 14.7  | 48000 | 0.4682          | 0.8846   |
| 0.451         | 14.85 | 48500 | 0.4744          | 0.8781   |
| 0.4582        | 15.0  | 49000 | 0.4617          | 0.8835   |
| 0.4949        | 15.16 | 49500 | 0.4769          | 0.8835   |
| 0.4546        | 15.31 | 50000 | 0.4677          | 0.8835   |
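Note that the final checkpoint (step 50000) is not the best one by validation loss. Picking the best checkpoint from a log like the one above is a one-liner; a minimal sketch using three rows copied from the table:

```python
# (step, validation_loss, accuracy) rows taken verbatim from the table above.
rows = [
    (35000, 0.4640, 0.8838),
    (36000, 0.4453, 0.8886),
    (50000, 0.4677, 0.8835),
]

# Select the checkpoint with the lowest validation loss.
best = min(rows, key=lambda r: r[1])
# best -> (36000, 0.4453, 0.8886)
```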

## Framework versions

- Transformers 4.35.2
- PyTorch 2.0.0
- Datasets 2.15.0
- Tokenizers 0.15.0