
wavlm_torgo_0H

This model is a fine-tuned version of microsoft/wavlm-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 4.2230
  • WER: 1.0
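
A WER of 1.0 means the model recovered none of the reference words on the evaluation set. Word error rate is the word-level edit distance between hypothesis and reference, divided by the number of reference words. A minimal self-contained sketch of the metric (for illustration only; not the evaluation code used to produce this card):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance divided by
    the number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

# An empty (or entirely wrong) hypothesis gives WER = 1.0:
print(wer("the quick brown fox", ""))  # 1.0
```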

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 1
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 8
  • mixed_precision_training: Native AMP
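
For reference, the hyperparameters above can be collected into a Trainer-style configuration. This is a sketch only: the argument names follow the Hugging Face `TrainingArguments` API by assumption, since the original training script is not included in the card.

```python
# Hyperparameters from the card, expressed as TrainingArguments-style
# keyword arguments (names are assumed from the Trainer API, not taken
# from the original training script).
hparams = {
    "learning_rate": 0.0001,
    "per_device_train_batch_size": 1,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "warmup_steps": 1000,
    "num_train_epochs": 8,
    "fp16": True,  # "Native AMP" mixed-precision training
}

# The Adam settings listed in the card match the Trainer defaults:
adam = {"betas": (0.9, 0.999), "eps": 1e-08}
```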

Training results

| Training Loss | Epoch  | Step  | Validation Loss | WER |
|---------------|--------|-------|-----------------|-----|
| 36.5759       | 0.1882 | 500   | 5.7798          | 1.0 |
| 4.1355        | 0.3764 | 1000  | 4.3661          | 1.0 |
| 3.9484        | 0.5645 | 1500  | 4.2577          | 1.0 |
| 3.6159        | 0.7527 | 2000  | 4.1272          | 1.0 |
| 3.6944        | 0.9409 | 2500  | 3.9745          | 1.0 |
| 3.8285        | 1.1291 | 3000  | 4.0134          | 1.0 |
| 3.6116        | 1.3173 | 3500  | 4.1692          | 1.0 |
| 3.5828        | 1.5055 | 4000  | 4.0013          | 1.0 |
| 3.5703        | 1.6936 | 4500  | 4.1055          | 1.0 |
| 3.5841        | 1.8818 | 5000  | 4.1041          | 1.0 |
| 3.8079        | 2.0700 | 5500  | 4.1574          | 1.0 |
| 3.5977        | 2.2582 | 6000  | 4.3217          | 1.0 |
| 3.5523        | 2.4464 | 6500  | 4.1800          | 1.0 |
| 3.5661        | 2.6346 | 7000  | 4.2053          | 1.0 |
| 3.5676        | 2.8227 | 7500  | 4.3885          | 1.0 |
| 3.794         | 3.0109 | 8000  | 4.2958          | 1.0 |
| 3.5647        | 3.1991 | 8500  | 4.2959          | 1.0 |
| 3.5805        | 3.3873 | 9000  | 4.3383          | 1.0 |
| 3.5475        | 3.5755 | 9500  | 4.1639          | 1.0 |
| 3.5523        | 3.7636 | 10000 | 4.2241          | 1.0 |
| 3.5982        | 3.9518 | 10500 | 4.3270          | 1.0 |
| 3.7088        | 4.1400 | 11000 | 4.2886          | 1.0 |
| 3.561         | 4.3282 | 11500 | 4.2801          | 1.0 |
| 3.5367        | 4.5164 | 12000 | 4.6914          | 1.0 |
| 3.5573        | 4.7046 | 12500 | 4.2071          | 1.0 |
| 3.5613        | 4.8927 | 13000 | 4.4513          | 1.0 |
| 3.719         | 5.0809 | 13500 | 4.3972          | 1.0 |
| 3.5376        | 5.2691 | 14000 | 4.3590          | 1.0 |
| 3.5313        | 5.4573 | 14500 | 4.3130          | 1.0 |
| 3.5384        | 5.6455 | 15000 | 4.4599          | 1.0 |
| 3.5755        | 5.8336 | 15500 | 4.3602          | 1.0 |
| 3.6912        | 6.0218 | 16000 | 4.2520          | 1.0 |
| 3.532         | 6.2100 | 16500 | 4.2731          | 1.0 |
| 3.565         | 6.3982 | 17000 | 4.2608          | 1.0 |
| 3.5328        | 6.5864 | 17500 | 4.2221          | 1.0 |
| 3.5361        | 6.7746 | 18000 | 4.2500          | 1.0 |
| 3.4975        | 6.9627 | 18500 | 4.2042          | 1.0 |
| 3.6749        | 7.1509 | 19000 | 4.2319          | 1.0 |
| 3.5316        | 7.3391 | 19500 | 4.2101          | 1.0 |
| 3.5262        | 7.5273 | 20000 | 4.2657          | 1.0 |
| 3.6605        | 7.7155 | 20500 | 4.2559          | 1.0 |
| 3.528         | 7.9037 | 21000 | 4.2230          | 1.0 |

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
Model size

  • 94.4M params (F32, Safetensors)

Model tree for Cantaosu/wavlm_torgo_0H

  • Fine-tuned from microsoft/wavlm-base