# whisper-large-v3-turbo-bulgarian-bulgaria
This model is a fine-tuned version of [openai/whisper-large-v3-turbo](https://huggingface.co/openai/whisper-large-v3-turbo) on the FLEURS dataset. It achieves the following results on the evaluation set:
- Loss: 0.4592
- Model Preparation Time: 0.0067
- Wer Ortho: 28.9047
- Wer: 9.9711
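
A minimal usage sketch, assuming the checkpoint is public on the Hub under this repo id and that `torch` and `transformers` are installed; the audio path is a placeholder:

```python
import torch
from transformers import pipeline

device = "cuda:0" if torch.cuda.is_available() else "cpu"

# Load the fine-tuned checkpoint as an automatic-speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="SamuelPfisterer1/whisper-large-v3-turbo-bulgarian-bulgaria",
    device=device,
)

# Transcribe a local Bulgarian audio file (placeholder path).
result = asr("sample_bg.wav")
print(result["text"])
```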
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 1
- mixed_precision_training: Native AMP
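
As a sketch only, the values above map onto `Seq2SeqTrainingArguments` roughly as follows; the `output_dir` and anything not listed in the hyperparameters are assumptions, not the actual training script:

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the training configuration listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v3-turbo-bulgarian-bulgaria",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=2,  # total train batch size: 64 * 2 = 128
    optim="adamw_torch",            # AdamW, betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.06,
    num_train_epochs=1,
    seed=42,
    fp16=True,                      # "Native AMP" mixed precision
)
```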
### Training results
Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Wer Ortho | Wer |
---|---|---|---|---|---|---|
0.1485 | 0.0299 | 32 | 0.5022 | 0.0067 | 31.5662 | 13.2030 |
0.1236 | 0.0598 | 64 | 0.4940 | 0.0067 | 31.6233 | 13.3865 |
0.1136 | 0.0897 | 96 | 0.4479 | 0.0067 | 31.6233 | 13.3865 |
0.1071 | 0.1196 | 128 | 0.4630 | 0.0067 | 31.0596 | 13.1536 |
0.1091 | 0.1495 | 160 | 0.4768 | 0.0067 | 31.1880 | 13.2030 |
0.1033 | 0.1794 | 192 | 0.4677 | 0.0067 | 30.3603 | 12.4903 |
0.0942 | 0.2092 | 224 | 0.4513 | 0.0067 | 30.6315 | 12.4832 |
0.097 | 0.2391 | 256 | 0.4636 | 0.0067 | 30.0250 | 12.0740 |
0.0955 | 0.2690 | 288 | 0.4449 | 0.0067 | 30.5102 | 12.6314 |
0.0981 | 0.2989 | 320 | 0.4494 | 0.0067 | 30.2747 | 12.1234 |
0.089 | 0.3288 | 352 | 0.4999 | 0.0067 | 30.5173 | 11.9258 |
0.0929 | 0.3587 | 384 | 0.5081 | 0.0067 | 30.4174 | 12.0387 |
0.0912 | 0.3886 | 416 | 0.4836 | 0.0067 | 30.3175 | 11.5094 |
0.0856 | 0.4185 | 448 | 0.4557 | 0.0067 | 29.6183 | 11.4247 |
0.0829 | 0.4484 | 480 | 0.4551 | 0.0067 | 29.9394 | 11.7987 |
0.0815 | 0.4783 | 512 | 0.4763 | 0.0067 | 30.5316 | 12.8078 |
0.0874 | 0.5082 | 544 | 0.4603 | 0.0067 | 29.5326 | 11.3824 |
0.0925 | 0.5381 | 576 | 0.4710 | 0.0067 | 29.6682 | 10.9237 |
0.0825 | 0.5680 | 608 | 0.4560 | 0.0067 | 29.5398 | 10.7685 |
0.0819 | 0.5979 | 640 | 0.4704 | 0.0067 | 29.4328 | 10.7614 |
0.0787 | 0.6277 | 672 | 0.4803 | 0.0067 | 29.6539 | 10.8955 |
0.0799 | 0.6576 | 704 | 0.4460 | 0.0067 | 29.1616 | 10.7826 |
0.0812 | 0.6875 | 736 | 0.4476 | 0.0067 | 29.1545 | 10.7261 |
0.0767 | 0.7174 | 768 | 0.4588 | 0.0067 | 29.3471 | 10.9167 |
0.082 | 0.7473 | 800 | 0.4658 | 0.0067 | 29.2615 | 10.7332 |
0.0836 | 0.7772 | 832 | 0.4640 | 0.0067 | 29.1188 | 10.4650 |
0.0848 | 0.8071 | 864 | 0.4474 | 0.0067 | 28.9119 | 10.2816 |
0.0813 | 0.8370 | 896 | 0.4662 | 0.0067 | 29.1117 | 10.3451 |
0.0779 | 0.8669 | 928 | 0.4570 | 0.0067 | 28.9975 | 10.2039 |
0.0727 | 0.8968 | 960 | 0.4621 | 0.0067 | 29.0617 | 10.2110 |
0.0728 | 0.9267 | 992 | 0.4592 | 0.0067 | 28.9047 | 9.9711 |
0.0704 | 0.9566 | 1024 | 0.4536 | 0.0067 | 28.9404 | 10.0064 |
0.076 | 0.9865 | 1056 | 0.4546 | 0.0067 | 28.9404 | 10.1263 |
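
Wer Ortho and Wer presumably follow the common Whisper fine-tuning recipe: word error rate over the raw (orthographic) text versus text cleaned with Whisper's BasicTextNormalizer. The exact normalization used for this run is not documented; a minimal sketch under that assumption, using the `evaluate` library:

```python
import evaluate
from transformers.models.whisper.english_normalizer import BasicTextNormalizer

wer_metric = evaluate.load("wer")
normalizer = BasicTextNormalizer()

# Toy prediction/reference pair (placeholders, not the evaluation data).
predictions = ["Примерна транскрипция."]
references = ["примерна транскрипция"]

# Orthographic WER: computed on the text exactly as decoded.
wer_ortho = 100 * wer_metric.compute(predictions=predictions, references=references)

# Normalized WER: lowercased, punctuation stripped before scoring.
wer = 100 * wer_metric.compute(
    predictions=[normalizer(p) for p in predictions],
    references=[normalizer(r) for r in references],
)
print(f"WER (ortho): {wer_ortho:.2f}, WER (normalized): {wer:.2f}")
```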
### Framework versions

- Transformers 4.51.3
- PyTorch 2.5.1
- Datasets 3.6.0
- Tokenizers 0.21.1