# whisper-large-v3-turbo-greek-greece

This model is a fine-tuned version of openai/whisper-large-v3-turbo on the fleurs dataset. It achieves the following results on the evaluation set:

- Loss: 0.3006
- Model Preparation Time: 0.0067
- Wer Ortho: 26.8701
- Wer: 11.0591
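For quick use, here is a minimal transcription sketch with the transformers pipeline API. The audio filename is a placeholder, and `chunk_length_s=30` is an assumption for handling long-form audio, not a setting documented in this card:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub (repo id as shown on this card).
asr = pipeline(
    "automatic-speech-recognition",
    model="SamuelPfisterer1/whisper-large-v3-turbo-greek-greece",
    chunk_length_s=30,  # assumption: chunk audio longer than Whisper's 30 s window
)

# "greek_sample.wav" is a placeholder path to a Greek audio file.
result = asr("greek_sample.wav")
print(result["text"])
```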

## Model description

A version of openai/whisper-large-v3-turbo fine-tuned for Greek (Greece) automatic speech recognition on the fleurs dataset.

## Intended uses & limitations

Intended for transcribing spoken Greek to text. Limitations have not been documented beyond the evaluation numbers reported below.

## Training and evaluation data

The model was fine-tuned and evaluated on the fleurs dataset (Greek); a loading sketch follows.
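As a sketch, the data can be loaded with the datasets library. The Hub id `google/fleurs` and the Greek (Greece) config name `el_gr` are assumptions; the card only names the dataset as "fleurs":

```python
from datasets import load_dataset

# Assumption: "fleurs" refers to google/fleurs on the Hub, and the
# Greek (Greece) configuration is named "el_gr".
fleurs = load_dataset("google/fleurs", "el_gr")
print(fleurs)                               # train/validation/test splits
print(fleurs["train"][0]["transcription"])  # one Greek transcript
```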

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 1
- mixed_precision_training: Native AMP
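A sketch of how these values map onto `Seq2SeqTrainingArguments`; `output_dir` is a placeholder, and `fp16=True` is an assumed reading of "Native AMP":

```python
from transformers import Seq2SeqTrainingArguments

# Reconstruction sketch of the listed hyperparameters; not the author's script.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-large-v3-turbo-greek-greece",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=2,  # 64 x 2 = total train batch size of 128
    lr_scheduler_type="linear",
    warmup_ratio=0.06,
    num_train_epochs=1,
    seed=42,
    optim="adamw_torch",            # betas=(0.9, 0.999), eps=1e-8 are the defaults
    fp16=True,                      # assumption: Native AMP mixed precision
)
```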

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Model Preparation Time | Wer Ortho | Wer     |
|:-------------:|:------:|:----:|:---------------:|:----------------------:|:---------:|:-------:|
| 0.6129        | 0.0204 | 32   | 0.2996          | 0.0067                 | 29.5644   | 14.1172 |
| 0.5021        | 0.0408 | 64   | 0.3064          | 0.0067                 | 29.0414   | 14.0381 |
| 0.5068        | 0.0612 | 96   | 0.3116          | 0.0067                 | 29.3261   | 14.2226 |
| 0.5049        | 0.0816 | 128  | 0.3000          | 0.0067                 | 28.5317   | 13.8404 |
| 0.5289        | 0.1020 | 160  | 0.3261          | 0.0067                 | 29.5048   | 14.3413 |
| 0.4464        | 0.1224 | 192  | 0.3336          | 0.0067                 | 29.5115   | 14.2292 |
| 0.4324        | 0.1428 | 224  | 0.3267          | 0.0067                 | 28.9157   | 13.8865 |
| 0.4376        | 0.1632 | 256  | 0.3319          | 0.0067                 | 29.8292   | 14.7367 |
| 0.4623        | 0.1836 | 288  | 0.3493          | 0.0067                 | 29.5379   | 14.2424 |
| 0.4494        | 0.2040 | 320  | 0.3210          | 0.0067                 | 28.9620   | 13.7415 |
| 0.4541        | 0.2244 | 352  | 0.3284          | 0.0067                 | 29.0348   | 13.7349 |
| 0.4262        | 0.2448 | 384  | 0.3169          | 0.0067                 | 28.7369   | 13.2077 |
| 0.4365        | 0.2652 | 416  | 0.3131          | 0.0067                 | 28.7369   | 13.1484 |
| 0.4701        | 0.2856 | 448  | 0.3144          | 0.0067                 | 28.5118   | 13.0759 |
| 0.4256        | 0.3060 | 480  | 0.3017          | 0.0067                 | 28.4126   | 13.2868 |
| 0.4437        | 0.3264 | 512  | 0.3080          | 0.0067                 | 28.6111   | 13.3724 |
| 0.4147        | 0.3468 | 544  | 0.3141          | 0.0067                 | 28.1875   | 12.4629 |
| 0.4089        | 0.3672 | 576  | 0.3150          | 0.0067                 | 28.2669   | 12.9638 |
| 0.4448        | 0.3876 | 608  | 0.3221          | 0.0067                 | 28.5979   | 12.8913 |
| 0.4354        | 0.4080 | 640  | 0.3091          | 0.0067                 | 28.1875   | 12.7859 |
| 0.3877        | 0.4284 | 672  | 0.3174          | 0.0067                 | 28.0816   | 12.5816 |
| 0.4411        | 0.4488 | 704  | 0.3130          | 0.0067                 | 28.4788   | 13.0693 |
| 0.4056        | 0.4692 | 736  | 0.3168          | 0.0067                 | 28.3000   | 12.4366 |
| 0.4043        | 0.4896 | 768  | 0.3291          | 0.0067                 | 28.1213   | 12.4695 |
| 0.4265        | 0.5100 | 800  | 0.3230          | 0.0067                 | 27.8565   | 12.0082 |
| 0.4075        | 0.5304 | 832  | 0.3062          | 0.0067                 | 27.4990   | 12.1400 |
| 0.4075        | 0.5508 | 864  | 0.3137          | 0.0067                 | 27.6711   | 12.0675 |
| 0.4312        | 0.5712 | 896  | 0.3086          | 0.0067                 | 27.6777   | 11.8302 |
| 0.4294        | 0.5916 | 928  | 0.3034          | 0.0067                 | 27.4262   | 12.1202 |
| 0.4268        | 0.6120 | 960  | 0.3126          | 0.0067                 | 27.9558   | 12.3707 |
| 0.4385        | 0.6325 | 992  | 0.3156          | 0.0067                 | 28.0087   | 12.3113 |
| 0.3886        | 0.6529 | 1024 | 0.3043          | 0.0067                 | 27.6115   | 11.8566 |
| 0.3761        | 0.6733 | 1056 | 0.3076          | 0.0067                 | 27.2541   | 11.8434 |
| 0.4109        | 0.6937 | 1088 | 0.3125          | 0.0067                 | 27.4924   | 12.0214 |
| 0.4231        | 0.7141 | 1120 | 0.3025          | 0.0067                 | 27.3070   | 11.8368 |
| 0.4126        | 0.7345 | 1152 | 0.3058          | 0.0067                 | 27.3070   | 11.7577 |
| 0.3999        | 0.7549 | 1184 | 0.3056          | 0.0067                 | 26.9694   | 11.4677 |
| 0.4354        | 0.7753 | 1216 | 0.3031          | 0.0067                 | 27.4725   | 11.6589 |
| 0.4036        | 0.7957 | 1248 | 0.2979          | 0.0067                 | 27.2210   | 11.4611 |
| 0.3972        | 0.8161 | 1280 | 0.3043          | 0.0067                 | 27.3467   | 11.4348 |
| 0.3478        | 0.8365 | 1312 | 0.3055          | 0.0067                 | 27.4063   | 11.5139 |
| 0.3915        | 0.8569 | 1344 | 0.3015          | 0.0067                 | 27.1813   | 11.4743 |
| 0.4092        | 0.8773 | 1376 | 0.2999          | 0.0067                 | 27.0356   | 11.3952 |
| 0.4042        | 0.8977 | 1408 | 0.2974          | 0.0067                 | 27.0025   | 11.3227 |
| 0.4332        | 0.9181 | 1440 | 0.3026          | 0.0067                 | 27.2739   | 11.4282 |
| 0.4164        | 0.9385 | 1472 | 0.2996          | 0.0067                 | 27.0158   | 11.1777 |
| 0.3774        | 0.9589 | 1504 | 0.3009          | 0.0067                 | 27.0820   | 11.0987 |
| 0.3934        | 0.9793 | 1536 | 0.3006          | 0.0067                 | 26.8701   | 11.0591 |
| 0.3695        | 0.9997 | 1568 | 0.3020          | 0.0067                 | 26.9032   | 11.0921 |
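The two WER columns differ only in text normalization. A sketch of how such numbers are typically computed with the evaluate library; using Whisper's `BasicTextNormalizer` for the normalized score is an assumption consistent with common practice, not something stated in this card:

```python
import evaluate
from transformers.models.whisper.english_normalizer import BasicTextNormalizer

wer_metric = evaluate.load("wer")
normalizer = BasicTextNormalizer()  # language-agnostic: lowercases, strips punctuation

# Toy strings for illustration; real evaluation uses the fleurs eval split.
references = ["Καλημέρα, τι κάνεις;"]
predictions = ["καλημέρα τι κάνεις"]

# Orthographic WER: compare the raw texts exactly as written.
wer_ortho = 100 * wer_metric.compute(references=references, predictions=predictions)

# Normalized WER: compare after normalization, so casing and punctuation
# differences no longer count as errors.
wer = 100 * wer_metric.compute(
    references=[normalizer(r) for r in references],
    predictions=[normalizer(p) for p in predictions],
)
print(f"Wer Ortho: {wer_ortho:.2f}  Wer: {wer:.2f}")
```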

### Framework versions

- Transformers 4.51.3
- PyTorch 2.5.1
- Datasets 3.6.0
- Tokenizers 0.21.1