# Whisper Large v3 Turbo (Albanian Fine-Tuned) - CTranslate2
This is the CTranslate2 version of the fine-tuned Whisper model `Flutra/whisper-large-v3-turbo-sq-v2`, optimized for use with Faster Whisper and WhisperX.
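Because the weights are stored in CTranslate2 format, they can be loaded directly with the `faster-whisper` library (WhisperX uses the same backend). Below is a minimal usage sketch, assuming the model is pulled via this repo id and that a local file `audio.wav` exists; the device and compute type are illustrative choices, not requirements.

```python
from faster_whisper import WhisperModel

# Load the CTranslate2 checkpoint; repo id, device, and compute_type are illustrative.
model = WhisperModel(
    "StarryAir/whisper-large-v3-turbo-sq-v2-ct2",
    device="cuda",
    compute_type="float16",
)

# Transcribe an Albanian audio file (the path is a placeholder).
segments, info = model.transcribe("audio.wav", language="sq", beam_size=5)

for segment in segments:
    print(f"[{segment.start:.2f}s -> {segment.end:.2f}s] {segment.text}")
```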
## Original Model Details
The original model was fine-tuned by Flutra. All credit for the training and performance goes to the original author.
- Base Model: `openai/whisper-large-v3-turbo`
- Original Repo: `Flutra/whisper-large-v3-turbo-sq-v2`
- Language: Albanian (`sq`)
- Word Error Rate (WER): 6.98% on the Common Voice 19 evaluation set
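For reference, a CTranslate2 checkpoint like this one can be produced from the original Transformers weights with the `ctranslate2` converter. The following is a sketch only; the exact conversion options used for this repo are not documented here, and the output directory name and float16 quantization are assumptions.

```python
from ctranslate2.converters import TransformersConverter

# Convert the original fine-tuned checkpoint to CTranslate2 format.
# Output directory name and quantization level are assumptions.
converter = TransformersConverter(
    "Flutra/whisper-large-v3-turbo-sq-v2",
    copy_files=["tokenizer.json", "preprocessor_config.json"],
)
converter.convert("whisper-large-v3-turbo-sq-v2-ct2", quantization="float16")
```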
## Training Details of the Original Model
The original model was fine-tuned on the Mozilla Common Voice 19 Albanian dataset.
**Training Arguments:**

| Argument | Value |
|---|---|
| `per_device_train_batch_size` | 8 |
| `gradient_accumulation_steps` | 1 |
| `num_train_epochs` | 3 |
| `learning_rate` | 1e-5 |
| `fp16` | True |
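For readers who want to reproduce a similar setup, these values map directly onto the `transformers` Seq2Seq training arguments. A minimal sketch, assuming the standard Hugging Face Whisper fine-tuning recipe; the output directory and any argument not listed in the table above are placeholders, not the original author's settings.

```python
from transformers import Seq2SeqTrainingArguments

# Only the hyperparameters from the table above come from the model card;
# everything else (e.g. output_dir) is a placeholder.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v3-turbo-sq-v2",  # placeholder
    per_device_train_batch_size=8,
    gradient_accumulation_steps=1,
    num_train_epochs=3,
    learning_rate=1e-5,
    fp16=True,
)
```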
**Final Performance:**

- WER: 6.98% (at step 3500)
## Model Tree for StarryAir/whisper-large-v3-turbo-sq-v2-ct2

- Base model: `openai/whisper-large-v3`
- Fine-tuned as: `openai/whisper-large-v3-turbo`