---
language:
  - he
base_model: >-
  cantillation/Teamim-large-v2_WeightDecay-0.05_Augmented_Combined-Data_date-09-07-2024_16-25
tags:
  - hf-asr-leaderboard
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: he-cantillation
    results: []
---

# he-cantillation

This model is a fine-tuned version of [cantillation/Teamim-large-v2_WeightDecay-0.05_Augmented_Combined-Data_date-09-07-2024_16-25](https://huggingface.co/cantillation/Teamim-large-v2_WeightDecay-0.05_Augmented_Combined-Data_date-09-07-2024_16-25) on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how the WER figure is computed appears after this list):

- Loss: 0.6664
- WER: 34.2105
- Avg Precision Exact: 0.3970
- Avg Recall Exact: 0.4021
- Avg F1 Exact: 0.3994
- Avg Precision Letter Shift: 0.3970
- Avg Recall Letter Shift: 0.4021
- Avg F1 Letter Shift: 0.3994
- Avg Precision Word Level: 0.3905
- Avg Recall Word Level: 0.3956
- Avg F1 Word Level: 0.3929
- Avg Precision Word Shift: 0.5197
- Avg Recall Word Shift: 0.5402
- Avg F1 Word Shift: 0.5292
- Precision Median Exact: 0.3970
- Recall Median Exact: 0.4021
- F1 Median Exact: 0.3994
- Precision Max Exact: 0.7273
- Recall Max Exact: 0.7273
- F1 Max Exact: 0.7273
- Precision Min Exact: 0.0667
- Recall Min Exact: 0.0769
- F1 Min Exact: 0.0714
- Precision Min Letter Shift: 0.0667
- Recall Min Letter Shift: 0.0769
- F1 Min Letter Shift: 0.0714
- Precision Min Word Level: 0.0667
- Recall Min Word Level: 0.0769
- F1 Min Word Level: 0.0714
- Precision Min Word Shift: 0.2667
- Recall Min Word Shift: 0.3077
- F1 Min Word Shift: 0.2857
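
The card does not say which script produced the WER figure, so the following is only a minimal sketch of a corpus-level word error rate computation with the `evaluate` library; the hypothesis and reference strings are placeholders, not data from this model's evaluation set:

```python
# Minimal sketch of a corpus-level WER computation with the `evaluate`
# library. The strings below are illustrative placeholders, NOT taken
# from this model's evaluation set.
import evaluate

wer_metric = evaluate.load("wer")

references = ["example reference transcript"]   # ground-truth text
predictions = ["example predicted transcript"]  # model output

# `compute` returns a fraction; multiply by 100 to match the
# percentage-style figures reported above (e.g. 34.2105).
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```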

## Model description

More information needed

## Intended uses & limitations

More information needed
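
Pending details from the authors, the sketch below shows one plausible way to run the model for Hebrew speech transcription. It assumes the checkpoint is Whisper-compatible; the Hub repo id is inferred from the card name and is hypothetical, as is the audio file path:

```python
# Minimal sketch of inference with the transformers ASR pipeline.
# Assumptions: the checkpoint is Whisper-compatible, and the repo id
# below (inferred from the card name) is hypothetical, not confirmed.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="cantillation/he-cantillation",  # hypothetical repo id
)

# "sample.wav" is a placeholder path to a local audio file.
result = asr("sample.wav")
print(result["text"])
```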

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `Seq2SeqTrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 6
- training_steps: 40
- mixed_precision_training: Native AMP
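
For reference, here is a hedged reconstruction of these settings as `transformers.Seq2SeqTrainingArguments`; the output directory is an illustrative placeholder, and anything not listed above is left at library defaults (which already match the stated Adam betas and epsilon):

```python
# Hedged reconstruction of the hyperparameters listed above.
# output_dir is a placeholder; Adam betas=(0.9, 0.999) and
# epsilon=1e-8 are the transformers defaults, so they are not set here.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="he-cantillation",    # placeholder directory name
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=6,
    max_steps=40,                    # training_steps
    fp16=True,                       # Native AMP mixed precision
)
```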

### Training results

| Training Loss | Epoch | Step | Validation Loss | WER | Avg Precision Exact | Avg Recall Exact | Avg F1 Exact | Avg Precision Letter Shift | Avg Recall Letter Shift | Avg F1 Letter Shift | Avg Precision Word Level | Avg Recall Word Level | Avg F1 Word Level | Avg Precision Word Shift | Avg Recall Word Shift | Avg F1 Word Shift | Precision Median Exact | Recall Median Exact | F1 Median Exact | Precision Max Exact | Recall Max Exact | F1 Max Exact | Precision Min Exact | Recall Min Exact | F1 Min Exact | Precision Min Letter Shift | Recall Min Letter Shift | F1 Min Letter Shift | Precision Min Word Level | Recall Min Word Level | F1 Min Word Level | Precision Min Word Shift | Recall Min Word Shift | F1 Min Word Shift |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 2 | 1.0938 | 76.3158 | 0.25 | 0.2619 | 0.2558 | 0.25 | 0.2619 | 0.2558 | 0.2619 | 0.2619 | 0.2619 | 0.2727 | 0.2857 | 0.2791 | 0.25 | 0.2619 | 0.2558 | 0.5 | 0.5238 | 0.5116 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 2.0 | 4 | 0.7695 | 52.6316 | 0.25 | 0.2391 | 0.2444 | 0.25 | 0.2391 | 0.2444 | 0.2619 | 0.25 | 0.2558 | 0.3409 | 0.3261 | 0.3333 | 0.25 | 0.2391 | 0.2444 | 0.5 | 0.4783 | 0.4889 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 3.0 | 6 | 0.6176 | 52.6316 | 0.2955 | 0.2826 | 0.2889 | 0.2955 | 0.2826 | 0.2889 | 0.2857 | 0.2727 | 0.2791 | 0.3515 | 0.3401 | 0.3456 | 0.2955 | 0.2826 | 0.2889 | 0.5909 | 0.5652 | 0.5778 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0667 | 0.0714 | 0.0690 |
| No log | 4.0 | 8 | 0.6977 | 50.0 | 0.25 | 0.25 | 0.25 | 0.25 | 0.25 | 0.25 | 0.2381 | 0.2381 | 0.2381 | 0.3352 | 0.3497 | 0.3417 | 0.25 | 0.25 | 0.25 | 0.5 | 0.5 | 0.5 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.1538 | 0.1379 |
| No log | 5.0 | 10 | 0.6345 | 50.0 | 0.25 | 0.25 | 0.25 | 0.25 | 0.25 | 0.25 | 0.2381 | 0.2381 | 0.2381 | 0.3352 | 0.3497 | 0.3417 | 0.25 | 0.25 | 0.25 | 0.5 | 0.5 | 0.5 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.1538 | 0.1379 |
| No log | 6.0 | 12 | 0.6303 | 47.3684 | 0.2727 | 0.2857 | 0.2791 | 0.2727 | 0.2857 | 0.2791 | 0.2619 | 0.275 | 0.2683 | 0.3580 | 0.3864 | 0.3713 | 0.2727 | 0.2857 | 0.2791 | 0.5455 | 0.5714 | 0.5581 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.1538 | 0.1379 |
| No log | 7.0 | 14 | 0.6193 | 44.7368 | 0.2955 | 0.3095 | 0.3023 | 0.2955 | 0.3095 | 0.3023 | 0.2857 | 0.3 | 0.2927 | 0.4119 | 0.4487 | 0.4290 | 0.2955 | 0.3095 | 0.3023 | 0.5909 | 0.6190 | 0.6047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1875 | 0.2308 | 0.2069 |
| No log | 8.0 | 16 | 0.6194 | 42.1053 | 0.3182 | 0.3182 | 0.3182 | 0.3182 | 0.3182 | 0.3182 | 0.3095 | 0.3095 | 0.3095 | 0.4347 | 0.4563 | 0.4444 | 0.3182 | 0.3182 | 0.3182 | 0.6364 | 0.6364 | 0.6364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1875 | 0.2308 | 0.2069 |
| No log | 9.0 | 18 | 0.6302 | 39.4737 | 0.3409 | 0.3409 | 0.3409 | 0.3409 | 0.3409 | 0.3409 | 0.3333 | 0.3333 | 0.3333 | 0.4574 | 0.4790 | 0.4671 | 0.3409 | 0.3409 | 0.3409 | 0.6818 | 0.6818 | 0.6818 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1875 | 0.2308 | 0.2069 |
| No log | 10.0 | 20 | 0.6424 | 39.4737 | 0.3409 | 0.3409 | 0.3409 | 0.3409 | 0.3409 | 0.3409 | 0.3333 | 0.3333 | 0.3333 | 0.4574 | 0.4790 | 0.4671 | 0.3409 | 0.3409 | 0.3409 | 0.6818 | 0.6818 | 0.6818 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1875 | 0.2308 | 0.2069 |
| No log | 11.0 | 22 | 0.6514 | 36.8421 | 0.3742 | 0.3794 | 0.3766 | 0.3742 | 0.3794 | 0.3766 | 0.3667 | 0.3718 | 0.3690 | 0.4970 | 0.5175 | 0.5065 | 0.3742 | 0.3794 | 0.3766 | 0.6818 | 0.6818 | 0.6818 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.2667 | 0.3077 | 0.2857 |
| No log | 12.0 | 24 | 0.6571 | 36.8421 | 0.3742 | 0.3794 | 0.3766 | 0.3742 | 0.3794 | 0.3766 | 0.3667 | 0.3718 | 0.3690 | 0.4970 | 0.5175 | 0.5065 | 0.3742 | 0.3794 | 0.3766 | 0.6818 | 0.6818 | 0.6818 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.2667 | 0.3077 | 0.2857 |
| 0.1235 | 13.0 | 26 | 0.6607 | 36.8421 | 0.3742 | 0.3794 | 0.3766 | 0.3742 | 0.3794 | 0.3766 | 0.3667 | 0.3718 | 0.3690 | 0.4970 | 0.5175 | 0.5065 | 0.3742 | 0.3794 | 0.3766 | 0.6818 | 0.6818 | 0.6818 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.2667 | 0.3077 | 0.2857 |
| 0.1235 | 14.0 | 28 | 0.6629 | 36.8421 | 0.3742 | 0.3794 | 0.3766 | 0.3742 | 0.3794 | 0.3766 | 0.3667 | 0.3718 | 0.3690 | 0.4970 | 0.5175 | 0.5065 | 0.3742 | 0.3794 | 0.3766 | 0.6818 | 0.6818 | 0.6818 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.2667 | 0.3077 | 0.2857 |
| 0.1235 | 15.0 | 30 | 0.6644 | 36.8421 | 0.3742 | 0.3794 | 0.3766 | 0.3742 | 0.3794 | 0.3766 | 0.3667 | 0.3718 | 0.3690 | 0.4970 | 0.5175 | 0.5065 | 0.3742 | 0.3794 | 0.3766 | 0.6818 | 0.6818 | 0.6818 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.2667 | 0.3077 | 0.2857 |
| 0.1235 | 16.0 | 32 | 0.6652 | 36.8421 | 0.3742 | 0.3794 | 0.3766 | 0.3742 | 0.3794 | 0.3766 | 0.3667 | 0.3718 | 0.3690 | 0.4970 | 0.5175 | 0.5065 | 0.3742 | 0.3794 | 0.3766 | 0.6818 | 0.6818 | 0.6818 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.2667 | 0.3077 | 0.2857 |
| 0.1235 | 17.0 | 34 | 0.6662 | 36.8421 | 0.3742 | 0.3794 | 0.3766 | 0.3742 | 0.3794 | 0.3766 | 0.3667 | 0.3718 | 0.3690 | 0.4970 | 0.5175 | 0.5065 | 0.3742 | 0.3794 | 0.3766 | 0.6818 | 0.6818 | 0.6818 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.2667 | 0.3077 | 0.2857 |
| 0.1235 | 18.0 | 36 | 0.6663 | 34.2105 | 0.3970 | 0.4021 | 0.3994 | 0.3970 | 0.4021 | 0.3994 | 0.3905 | 0.3956 | 0.3929 | 0.5197 | 0.5402 | 0.5292 | 0.3970 | 0.4021 | 0.3994 | 0.7273 | 0.7273 | 0.7273 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.2667 | 0.3077 | 0.2857 |
| 0.1235 | 19.0 | 38 | 0.6665 | 34.2105 | 0.3970 | 0.4021 | 0.3994 | 0.3970 | 0.4021 | 0.3994 | 0.3905 | 0.3956 | 0.3929 | 0.5197 | 0.5402 | 0.5292 | 0.3970 | 0.4021 | 0.3994 | 0.7273 | 0.7273 | 0.7273 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.2667 | 0.3077 | 0.2857 |
| 0.1235 | 20.0 | 40 | 0.6664 | 34.2105 | 0.3970 | 0.4021 | 0.3994 | 0.3970 | 0.4021 | 0.3994 | 0.3905 | 0.3956 | 0.3929 | 0.5197 | 0.5402 | 0.5292 | 0.3970 | 0.4021 | 0.3994 | 0.7273 | 0.7273 | 0.7273 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.0667 | 0.0769 | 0.0714 | 0.2667 | 0.3077 | 0.2857 |

### Framework versions

- Transformers 4.41.2
- PyTorch 2.2.1
- Datasets 2.20.0
- Tokenizers 0.19.1