---
library_name: transformers
license: apache-2.0
base_model: google/flan-t5-base
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: flanT5Base-riksIdentification-trained
  results: []
---
# flanT5Base-riksIdentification-trained
This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.5288
- Rouge1: 40.8854
- Rouge2: 22.6821
- Rougel: 36.0638
- Rougelsum: 36.8444
- Gen Len: 18.7778
## Model description
More information needed
## Intended uses & limitations
More information needed
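Since the card does not yet document usage, here is a minimal, hypothetical inference sketch. The checkpoint identifier is a placeholder (the actual Hub repo id or output directory is not given in this card); the rest uses the standard `transformers` seq2seq API.

```python
# Hypothetical usage sketch; the checkpoint path below is a placeholder,
# not the actual Hub repo id of this model.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "flanT5Base-riksIdentification-trained"  # placeholder: local dir or Hub repo id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

inputs = tokenizer("Your input text here", return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```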
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged configuration sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
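As a rough guide, these values map onto `Seq2SeqTrainingArguments` as sketched below. The `output_dir`, `eval_strategy`, and `predict_with_generate` settings are assumptions inferred from the per-epoch ROUGE results in the table, not values stated in the card.

```python
# Sketch only: reconstructs the reported hyperparameters as Seq2SeqTrainingArguments.
# output_dir, eval_strategy, and predict_with_generate are assumptions, not taken
# from the original card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="flanT5Base-riksIdentification-trained",  # placeholder output directory
    learning_rate=3e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,   # effective (total) train batch size of 8
    seed=42,
    optim="adamw_torch",             # AdamW, betas=(0.9, 0.999), epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=20,
    eval_strategy="epoch",           # assumption: the table reports one eval per epoch
    predict_with_generate=True,      # assumption: needed to compute ROUGE / Gen Len
)
```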
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 60   | 1.9165          | 39.1847 | 20.771  | 35.3702 | 35.7857   | 18.3148 |
| No log        | 2.0   | 120  | 1.8308          | 40.7152 | 21.7447 | 36.6752 | 37.1989   | 18.6296 |
| No log        | 3.0   | 180  | 1.7769          | 40.7843 | 22.2581 | 36.3515 | 36.9362   | 18.7407 |
| No log        | 4.0   | 240  | 1.7355          | 39.8099 | 21.8843 | 35.5563 | 36.0749   | 18.9444 |
| No log        | 5.0   | 300  | 1.6971          | 41.3752 | 23.8678 | 36.8792 | 37.3489   | 18.7037 |
| No log        | 6.0   | 360  | 1.6690          | 41.2441 | 23.4026 | 36.4348 | 37.067    | 18.9074 |
| No log        | 7.0   | 420  | 1.6327          | 40.9744 | 23.6697 | 36.3964 | 36.9935   | 18.9074 |
| No log        | 8.0   | 480  | 1.6214          | 41.3833 | 23.796  | 36.7591 | 37.4926   | 18.9815 |
| 1.7877        | 9.0   | 540  | 1.5955          | 40.9415 | 23.3711 | 36.2089 | 36.8672   | 18.7407 |
| 1.7877        | 10.0  | 600  | 1.5821          | 41.1759 | 23.362  | 36.5208 | 37.2207   | 18.7963 |
| 1.7877        | 11.0  | 660  | 1.5687          | 41.5582 | 23.575  | 36.4706 | 37.1927   | 18.9074 |
| 1.7877        | 12.0  | 720  | 1.5685          | 41.5293 | 23.2829 | 36.7273 | 37.468    | 18.8148 |
| 1.7877        | 13.0  | 780  | 1.5503          | 40.6781 | 21.9403 | 35.725  | 36.3384   | 18.9444 |
| 1.7877        | 14.0  | 840  | 1.5454          | 40.4918 | 22.3652 | 35.7063 | 36.3163   | 18.9815 |
| 1.7877        | 15.0  | 900  | 1.5364          | 41.6247 | 23.5637 | 36.9036 | 37.4561   | 18.8704 |
| 1.7877        | 16.0  | 960  | 1.5344          | 41.1763 | 23.2285 | 36.2463 | 36.6802   | 18.8704 |
| 1.3826        | 17.0  | 1020 | 1.5303          | 40.4807 | 21.8633 | 35.5338 | 36.2185   | 18.8519 |
| 1.3826        | 18.0  | 1080 | 1.5288          | 40.8854 | 22.6821 | 36.0638 | 36.8444   | 18.7778 |
| 1.3826        | 19.0  | 1140 | 1.5306          | 40.739  | 22.393  | 35.9189 | 36.4907   | 18.9815 |
| 1.3826        | 20.0  | 1200 | 1.5291          | 40.739  | 22.393  | 35.9189 | 36.4907   | 18.9815 |
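The ROUGE and Gen Len columns look like the output of a standard seq2seq `compute_metrics` callback built on the `evaluate` library. The sketch below shows how such numbers are typically produced (ROUGE scores scaled by 100, plus the mean generated length in tokens); it is an assumption about the setup, not code taken from this repository, and it loads the base flan-t5 tokenizer rather than the fine-tuned one.

```python
# Hedged sketch of a typical compute_metrics for seq2seq ROUGE evaluation.
# Not taken from this repository; the tokenizer here is the base flan-t5 tokenizer.
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
rouge = evaluate.load("rouge")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    # Labels use -100 for padding; swap it back before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    scores = rouge.compute(
        predictions=decoded_preds, references=decoded_labels, use_stemmer=True
    )
    scores = {k: round(v * 100, 4) for k, v in scores.items()}
    # "Gen Len" is the mean number of non-padding tokens in the generated sequences.
    gen_lens = [int(np.count_nonzero(p != tokenizer.pad_token_id)) for p in preds]
    scores["gen_len"] = float(np.mean(gen_lens))
    return scores
```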
### Framework versions
- Transformers 4.52.3
- Pytorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.21.1