# yolo_finetuned_fruits

This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on a fruit object-detection dataset with banana, orange, and apple classes. It achieves the following results on the evaluation set:
- Loss: 0.7791
- Map: 0.4714
- Map 50: 0.6956
- Map 75: 0.5409
- Map Small: -1.0
- Map Medium: 0.4873
- Map Large: 0.5087
- Mar 1: 0.4392
- Mar 10: 0.71
- Mar 100: 0.7539
- Mar Small: -1.0
- Mar Medium: 0.6438
- Mar Large: 0.7737
- Map Banana: 0.3709
- Mar 100 Banana: 0.7275
- Map Orange: 0.4969
- Mar 100 Orange: 0.7429
- Map Apple: 0.5463
- Mar 100 Apple: 0.7914
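Here "Map 50" and "Map 75" are mean average precision at IoU thresholds of 0.50 and 0.75: a predicted box only counts as a match if its intersection-over-union with a ground-truth box clears the threshold. A minimal sketch of that IoU computation (plain Python, assuming `[x1, y1, x2, y2]` corner-format boxes):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two boxes that overlap in half of their width:
print(iou([0, 0, 10, 10], [5, 0, 15, 10]))  # → 0.3333333333333333
```

A detection at IoU 0.33 misses even the looser 0.50 threshold, which is why the stricter Map 75 (0.5409) is lower than Map 50 (0.6956).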
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (PyTorch implementation) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
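The list above maps onto a `transformers.TrainingArguments` configuration roughly as follows. This is a reconstruction from the reported values, not the actual training script, and the output directory name is a placeholder:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="yolo_finetuned_fruits",  # placeholder path
    learning_rate=5e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```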
### Training results
Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Banana | Mar 100 Banana | Map Orange | Mar 100 Orange | Map Apple | Mar 100 Apple |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 60 | 1.9961 | 0.0048 | 0.0162 | 0.0025 | -1.0 | 0.0199 | 0.0051 | 0.0369 | 0.1339 | 0.2356 | -1.0 | 0.1223 | 0.2534 | 0.0042 | 0.245 | 0.0063 | 0.2619 | 0.0038 | 0.2 |
No log | 2.0 | 120 | 1.7598 | 0.0221 | 0.0672 | 0.0129 | -1.0 | 0.0577 | 0.0216 | 0.0639 | 0.1861 | 0.3615 | -1.0 | 0.1723 | 0.3772 | 0.0439 | 0.5525 | 0.0137 | 0.1548 | 0.0088 | 0.3771 |
No log | 3.0 | 180 | 1.5885 | 0.0344 | 0.0984 | 0.0195 | -1.0 | 0.04 | 0.0394 | 0.1276 | 0.2906 | 0.4912 | -1.0 | 0.2277 | 0.5277 | 0.0393 | 0.5725 | 0.0242 | 0.3381 | 0.0398 | 0.5629 |
No log | 4.0 | 240 | 1.7055 | 0.0432 | 0.1236 | 0.0231 | -1.0 | 0.086 | 0.0453 | 0.1399 | 0.2912 | 0.4533 | -1.0 | 0.1991 | 0.493 | 0.0514 | 0.48 | 0.0468 | 0.3143 | 0.0313 | 0.5657 |
No log | 5.0 | 300 | 1.5373 | 0.0441 | 0.0939 | 0.0325 | -1.0 | 0.0775 | 0.049 | 0.1611 | 0.3362 | 0.4907 | -1.0 | 0.117 | 0.5434 | 0.0504 | 0.595 | 0.0585 | 0.3429 | 0.0233 | 0.5343 |
No log | 6.0 | 360 | 1.3281 | 0.078 | 0.1589 | 0.0696 | -1.0 | 0.1315 | 0.0958 | 0.2488 | 0.4444 | 0.6168 | -1.0 | 0.4062 | 0.6491 | 0.0527 | 0.6375 | 0.1145 | 0.6214 | 0.0667 | 0.5914 |
No log | 7.0 | 420 | 1.1093 | 0.084 | 0.1721 | 0.0795 | -1.0 | 0.2091 | 0.0909 | 0.2741 | 0.5155 | 0.6735 | -1.0 | 0.5036 | 0.6996 | 0.0753 | 0.6875 | 0.0937 | 0.65 | 0.0829 | 0.6829 |
No log | 8.0 | 480 | 1.0723 | 0.1198 | 0.2091 | 0.126 | -1.0 | 0.3302 | 0.1316 | 0.3113 | 0.5349 | 0.6929 | -1.0 | 0.5598 | 0.7189 | 0.0885 | 0.635 | 0.1448 | 0.681 | 0.1259 | 0.7629 |
1.4412 | 9.0 | 540 | 0.9907 | 0.1246 | 0.2195 | 0.1319 | -1.0 | 0.1973 | 0.1443 | 0.3778 | 0.5932 | 0.7191 | -1.0 | 0.5759 | 0.7445 | 0.1133 | 0.6925 | 0.1306 | 0.6762 | 0.1298 | 0.7886 |
1.4412 | 10.0 | 600 | 0.9855 | 0.1517 | 0.2615 | 0.1465 | -1.0 | 0.2562 | 0.1671 | 0.35 | 0.5819 | 0.6753 | -1.0 | 0.4375 | 0.7129 | 0.1153 | 0.685 | 0.1784 | 0.6381 | 0.1613 | 0.7029 |
1.4412 | 11.0 | 660 | 0.9734 | 0.1793 | 0.2934 | 0.1965 | -1.0 | 0.2641 | 0.1978 | 0.3564 | 0.6029 | 0.6922 | -1.0 | 0.4839 | 0.7249 | 0.1385 | 0.6975 | 0.1879 | 0.6476 | 0.2114 | 0.7314 |
1.4412 | 12.0 | 720 | 1.0457 | 0.2177 | 0.3468 | 0.2265 | -1.0 | 0.2242 | 0.2489 | 0.3676 | 0.6224 | 0.6648 | -1.0 | 0.4259 | 0.704 | 0.158 | 0.66 | 0.255 | 0.6286 | 0.2399 | 0.7057 |
1.4412 | 13.0 | 780 | 0.8756 | 0.2393 | 0.3799 | 0.2619 | -1.0 | 0.3545 | 0.2646 | 0.4163 | 0.6819 | 0.7369 | -1.0 | 0.6054 | 0.7577 | 0.1926 | 0.745 | 0.2709 | 0.7143 | 0.2546 | 0.7514 |
1.4412 | 14.0 | 840 | 0.9067 | 0.2987 | 0.4602 | 0.3327 | -1.0 | 0.4857 | 0.305 | 0.3873 | 0.6723 | 0.7256 | -1.0 | 0.6661 | 0.7371 | 0.2039 | 0.7025 | 0.3302 | 0.7429 | 0.362 | 0.7314 |
1.4412 | 15.0 | 900 | 0.9761 | 0.2658 | 0.4491 | 0.2969 | -1.0 | 0.3493 | 0.2926 | 0.3656 | 0.6225 | 0.7037 | -1.0 | 0.5911 | 0.7232 | 0.2068 | 0.685 | 0.3045 | 0.6976 | 0.286 | 0.7286 |
1.4412 | 16.0 | 960 | 0.9318 | 0.2791 | 0.4399 | 0.3218 | -1.0 | 0.3934 | 0.3035 | 0.3802 | 0.6542 | 0.7144 | -1.0 | 0.6054 | 0.733 | 0.2017 | 0.7 | 0.306 | 0.6976 | 0.3295 | 0.7457 |
0.8426 | 17.0 | 1020 | 0.8593 | 0.3076 | 0.4558 | 0.3454 | -1.0 | 0.42 | 0.3289 | 0.4046 | 0.678 | 0.7545 | -1.0 | 0.6589 | 0.7731 | 0.2056 | 0.7125 | 0.3782 | 0.7452 | 0.3389 | 0.8057 |
0.8426 | 18.0 | 1080 | 0.8634 | 0.3121 | 0.5056 | 0.3313 | -1.0 | 0.4074 | 0.347 | 0.4025 | 0.6684 | 0.7438 | -1.0 | 0.6393 | 0.7646 | 0.2542 | 0.6925 | 0.3362 | 0.7476 | 0.3459 | 0.7914 |
0.8426 | 19.0 | 1140 | 0.8064 | 0.3787 | 0.5725 | 0.3908 | -1.0 | 0.422 | 0.4135 | 0.4251 | 0.7022 | 0.7504 | -1.0 | 0.6313 | 0.774 | 0.2907 | 0.695 | 0.4079 | 0.7476 | 0.4376 | 0.8086 |
0.8426 | 20.0 | 1200 | 0.7830 | 0.4137 | 0.6134 | 0.4445 | -1.0 | 0.4597 | 0.446 | 0.4294 | 0.7059 | 0.7565 | -1.0 | 0.6259 | 0.7801 | 0.3242 | 0.7225 | 0.4305 | 0.75 | 0.4864 | 0.7971 |
0.8426 | 21.0 | 1260 | 0.7738 | 0.4224 | 0.6321 | 0.468 | -1.0 | 0.4326 | 0.4516 | 0.4314 | 0.7087 | 0.7619 | -1.0 | 0.6232 | 0.7866 | 0.3299 | 0.7325 | 0.4396 | 0.7619 | 0.4977 | 0.7914 |
0.8426 | 22.0 | 1320 | 0.8010 | 0.429 | 0.6543 | 0.4765 | -1.0 | 0.4505 | 0.4649 | 0.4238 | 0.7088 | 0.7403 | -1.0 | 0.6027 | 0.7643 | 0.3371 | 0.7175 | 0.4546 | 0.7262 | 0.4953 | 0.7771 |
0.8426 | 23.0 | 1380 | 0.7663 | 0.4368 | 0.6546 | 0.473 | -1.0 | 0.4744 | 0.4744 | 0.4363 | 0.7014 | 0.7561 | -1.0 | 0.6571 | 0.7754 | 0.3623 | 0.7125 | 0.4574 | 0.7643 | 0.4906 | 0.7914 |
0.8426 | 24.0 | 1440 | 0.7869 | 0.4652 | 0.704 | 0.5348 | -1.0 | 0.4769 | 0.4996 | 0.4319 | 0.7171 | 0.75 | -1.0 | 0.6295 | 0.7716 | 0.3588 | 0.7225 | 0.5102 | 0.7476 | 0.5265 | 0.78 |
0.6721 | 25.0 | 1500 | 0.7694 | 0.466 | 0.6751 | 0.5348 | -1.0 | 0.4933 | 0.5055 | 0.4348 | 0.7088 | 0.752 | -1.0 | 0.6509 | 0.7704 | 0.3672 | 0.725 | 0.4856 | 0.731 | 0.5452 | 0.8 |
0.6721 | 26.0 | 1560 | 0.7674 | 0.4673 | 0.6805 | 0.5337 | -1.0 | 0.4829 | 0.5079 | 0.4362 | 0.7093 | 0.7527 | -1.0 | 0.6375 | 0.7737 | 0.3688 | 0.7225 | 0.4927 | 0.7357 | 0.5405 | 0.8 |
0.6721 | 27.0 | 1620 | 0.7790 | 0.4742 | 0.7018 | 0.5431 | -1.0 | 0.486 | 0.5139 | 0.4414 | 0.7103 | 0.7557 | -1.0 | 0.6438 | 0.7759 | 0.3763 | 0.7275 | 0.5016 | 0.7452 | 0.5448 | 0.7943 |
0.6721 | 28.0 | 1680 | 0.7775 | 0.4726 | 0.6975 | 0.5412 | -1.0 | 0.4855 | 0.5109 | 0.4383 | 0.7101 | 0.7522 | -1.0 | 0.6366 | 0.7728 | 0.3748 | 0.7275 | 0.4959 | 0.7405 | 0.5472 | 0.7886 |
0.6721 | 29.0 | 1740 | 0.7791 | 0.4715 | 0.6957 | 0.5409 | -1.0 | 0.4873 | 0.5089 | 0.4392 | 0.7109 | 0.7539 | -1.0 | 0.6438 | 0.7737 | 0.3715 | 0.7275 | 0.4969 | 0.7429 | 0.5462 | 0.7914 |
0.6721 | 30.0 | 1800 | 0.7791 | 0.4714 | 0.6956 | 0.5409 | -1.0 | 0.4873 | 0.5087 | 0.4392 | 0.71 | 0.7539 | -1.0 | 0.6438 | 0.7737 | 0.3709 | 0.7275 | 0.4969 | 0.7429 | 0.5463 | 0.7914 |
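The validation metrics flatten out over the last few epochs, consistent with the cosine schedule driving the learning rate toward zero by epoch 30. A minimal sketch of that decay (warmup omitted; the actual Transformers scheduler steps per optimizer step, not per epoch):

```python
import math

def cosine_lr(epoch, base_lr=5e-05, num_epochs=30):
    """Cosine decay from base_lr at epoch 0 down to ~0 at num_epochs."""
    progress = epoch / num_epochs
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

for epoch in (0, 15, 27, 30):
    print(f"epoch {epoch:2d}: lr = {cosine_lr(epoch):.2e}")
```

By epoch 27 the learning rate is already two orders of magnitude below its initial value, so the last few checkpoints differ only marginally.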
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.1
- Tokenizers 0.21.1
## Model tree for diribes/yolo_finetuned_fruits

Base model: [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny)