whisper-finetuned-v3_30e_augment_new

This model is a fine-tuned version of openai/whisper-large-v3-turbo on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1077
  • Wer: 52.3943
  • Cer: 27.6678
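WER and CER are word- and character-level edit distances normalized by the reference length, reported here as percentages. A minimal sketch of how these metrics are computed (a plain Levenshtein implementation for illustration; the training script most likely used a metrics library such as `evaluate`/`jiwer` rather than this code):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (strings or token lists)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (r != h)))   # substitution
        prev = cur
    return prev[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate (%): word-level edit distance / reference word count."""
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate (%): char-level edit distance / reference length."""
    return 100.0 * edit_distance(reference, hypothesis) / len(reference)
```

Note that WER can exceed 100% when the hypothesis contains many insertions, which is why values above 50, as here, still indicate partial transcription quality.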

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 4
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
  • mixed_precision_training: Native AMP
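With `lr_scheduler_type: linear` and 500 warmup steps, the learning rate ramps from 0 to the peak of 1e-05 over the first 500 optimizer steps, then decays linearly to 0 by the end of training. A small sketch of that schedule (mirroring the behavior of `get_linear_schedule_with_warmup` in `transformers`; the 58,500 total steps are taken from the final row of the results table below and are an assumption about how steps were counted):

```python
def linear_schedule_with_warmup(step, warmup_steps=500,
                                total_steps=58_500, peak_lr=1e-05):
    """Learning rate at a given optimizer step: linear warmup, linear decay."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    # After warmup, decay linearly so the rate reaches 0 at total_steps.
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

For example, the rate is 5e-06 halfway through warmup (step 250), peaks at 1e-05 at step 500, and returns to 0 at step 58,500.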

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|
| 0.0913 | 1.0 | 1950 | 0.0906 | 64.8321 | 31.4175 |
| 0.0616 | 2.0 | 3900 | 0.0758 | 59.5771 | 29.2888 |
| 0.0428 | 3.0 | 5850 | 0.0732 | 57.7425 | 29.0100 |
| 0.0251 | 4.0 | 7800 | 0.0747 | 56.9652 | 28.8937 |
| 0.0227 | 5.0 | 9750 | 0.0780 | 56.0634 | 28.7220 |
| 0.0162 | 6.0 | 11700 | 0.0777 | 54.6331 | 28.5171 |
| 0.012 | 7.0 | 13650 | 0.0786 | 56.3122 | 28.6925 |
| 0.011 | 8.0 | 15600 | 0.0838 | 55.6592 | 28.4728 |
| 0.0069 | 9.0 | 17550 | 0.0810 | 55.0995 | 28.6703 |
| 0.0076 | 10.0 | 19500 | 0.0918 | 56.0323 | 28.5171 |
| 0.0048 | 11.0 | 21450 | 0.0918 | 54.4776 | 28.5060 |
| 0.0033 | 12.0 | 23400 | 0.0947 | 53.5759 | 28.2679 |
| 0.0035 | 13.0 | 25350 | 0.0876 | 54.7575 | 28.3805 |
| 0.0041 | 14.0 | 27300 | 0.0936 | 53.9801 | 28.1995 |
| 0.0023 | 15.0 | 29250 | 0.0943 | 52.8607 | 28.1146 |
| 0.0023 | 16.0 | 31200 | 0.0942 | 53.3271 | 28.2365 |
| 0.0025 | 17.0 | 33150 | 0.0986 | 53.2649 | 28.1829 |
| 0.0014 | 18.0 | 35100 | 0.0973 | 52.4565 | 28.0371 |
| 0.0008 | 19.0 | 37050 | 0.0970 | 53.0162 | 27.9189 |
| 0.0014 | 20.0 | 39000 | 0.1054 | 53.0784 | 27.9448 |
| 0.0009 | 21.0 | 40950 | 0.1016 | 52.4565 | 27.8192 |
| 0.001 | 22.0 | 42900 | 0.0991 | 52.7674 | 27.9928 |
| 0.0003 | 23.0 | 44850 | 0.1039 | 51.9590 | 27.7398 |
| 0.0003 | 24.0 | 46800 | 0.1071 | 52.8918 | 27.8968 |
| 0.0003 | 25.0 | 48750 | 0.1044 | 52.5498 | 27.7287 |
| 0.0001 | 26.0 | 50700 | 0.1085 | 52.0833 | 27.7897 |
| 0.0001 | 27.0 | 52650 | 0.1060 | 52.3632 | 27.8211 |
| 0.0001 | 28.0 | 54600 | 0.1082 | 52.7052 | 27.7306 |
| 0.0001 | 29.0 | 56550 | 0.1071 | 52.5187 | 27.8008 |
| 0.0 | 30.0 | 58500 | 0.1077 | 52.3943 | 27.6678 |
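Validation loss bottoms out around epoch 3 and then climbs, while WER keeps improving until roughly epoch 23, a common divergence when fine-tuning on augmented data. If intermediate checkpoints were saved, the lowest-WER checkpoint is not the final one. A small sketch of selecting a checkpoint from logs like these (a few `(epoch, validation loss, WER)` rows abbreviated from the table above):

```python
# (epoch, validation loss, WER) — rows abbreviated from the table above.
rows = [
    (1, 0.0906, 64.8321),
    (3, 0.0732, 57.7425),
    (18, 0.0973, 52.4565),
    (23, 0.1039, 51.9590),
    (30, 0.1077, 52.3943),
]

best_by_loss = min(rows, key=lambda r: r[1])  # epoch with lowest val loss
best_by_wer = min(rows, key=lambda r: r[2])   # epoch with lowest WER
print(f"lowest loss at epoch {best_by_loss[0]}, lowest WER at epoch {best_by_wer[0]}")
```

Which criterion to trust depends on the downstream use: for transcription quality, selecting by WER is the usual choice.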

Framework versions

  • Transformers 4.52.3
  • Pytorch 2.7.0+cu126
  • Datasets 3.6.0
  • Tokenizers 0.21.1