# segformer-b0-random-init-morphpadver1-hgo-coord-v9_mix_resample_40epochs
This model was trained from a random initialization (random_init) on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset. It achieves the following results on the evaluation set:
- Loss: 1.2528
- Mean Iou: 0.2977
- Mean Accuracy: 0.4609
- Overall Accuracy: 0.4591
- Accuracy 0-0: 0.3603
- Accuracy 0-90: 0.4502
- Accuracy 90-0: 0.4232
- Accuracy 90-90: 0.6098
- Iou 0-0: 0.2715
- Iou 0-90: 0.3233
- Iou 90-0: 0.3141
- Iou 90-90: 0.2821
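As a sanity check, the aggregate metrics above are consistent with the per-class values: Mean IoU and Mean Accuracy are (to within rounding) the unweighted averages over the four classes. A minimal sketch:

```python
# Per-class evaluation values copied from the results list above
# (classes: 0-0, 0-90, 90-0, 90-90).
per_class_iou = [0.2715, 0.3233, 0.3141, 0.2821]
per_class_acc = [0.3603, 0.4502, 0.4232, 0.6098]

mean_iou = sum(per_class_iou) / len(per_class_iou)
mean_acc = sum(per_class_acc) / len(per_class_acc)

# Agrees with the reported Mean Iou 0.2977 and Mean Accuracy 0.4609
# to within rounding of the printed values.
assert abs(mean_iou - 0.2977) < 1e-4
assert abs(mean_acc - 0.4609) < 1e-4
```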
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40
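With `lr_scheduler_type: linear` and no warmup listed, the learning rate decays linearly from its peak to zero over all training steps. A minimal sketch of that schedule (the total-step count is a rough estimate inferred from the training log, not a value stated in the card):

```python
PEAK_LR = 6e-05        # learning_rate from the hyperparameters above
TOTAL_STEPS = 117_320  # assumption: ~2933 steps/epoch x 40 epochs, inferred from the log

def linear_lr(step: int, peak_lr: float = PEAK_LR, total_steps: int = TOTAL_STEPS) -> float:
    """Linear decay from peak_lr at step 0 down to 0 at total_steps (warmup assumed 0)."""
    remaining = max(0, total_steps - step)
    return peak_lr * remaining / total_steps

# Starts at the configured peak and reaches half the peak at the midpoint.
assert linear_lr(0) == PEAK_LR
assert abs(linear_lr(TOTAL_STEPS // 2) - PEAK_LR / 2) < 1e-12
```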
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.3877 | 1.3638 | 4000 | 1.3825 | 0.1120 | 0.2581 | 0.2834 | 0.0074 | 0.6338 | 0.3902 | 0.0011 | 0.0073 | 0.2487 | 0.1909 | 0.0011 |
1.2102 | 2.7276 | 8000 | 1.3756 | 0.1484 | 0.2805 | 0.2694 | 0.2393 | 0.1558 | 0.1636 | 0.5633 | 0.1609 | 0.1086 | 0.1147 | 0.2092 |
1.3585 | 4.0914 | 12000 | 1.3637 | 0.1622 | 0.2963 | 0.2942 | 0.2999 | 0.0845 | 0.4896 | 0.3115 | 0.1795 | 0.0708 | 0.2194 | 0.1792 |
1.3062 | 5.4552 | 16000 | 1.3470 | 0.1825 | 0.3121 | 0.3212 | 0.2055 | 0.5129 | 0.2845 | 0.2453 | 0.1549 | 0.2359 | 0.1840 | 0.1554 |
1.3491 | 6.8190 | 20000 | 1.3312 | 0.1933 | 0.3303 | 0.3424 | 0.1899 | 0.5864 | 0.3057 | 0.2391 | 0.1462 | 0.2584 | 0.2009 | 0.1676 |
1.1479 | 8.1827 | 24000 | 1.3200 | 0.2086 | 0.3471 | 0.3523 | 0.2267 | 0.3377 | 0.4672 | 0.3570 | 0.1655 | 0.2167 | 0.2496 | 0.2026 |
1.4035 | 9.5465 | 28000 | 1.3137 | 0.2157 | 0.3594 | 0.3602 | 0.2554 | 0.4350 | 0.2812 | 0.4661 | 0.1857 | 0.2445 | 0.2053 | 0.2271 |
1.3416 | 10.9103 | 32000 | 1.3023 | 0.2201 | 0.3639 | 0.3660 | 0.2353 | 0.3488 | 0.4199 | 0.4515 | 0.1768 | 0.2346 | 0.2462 | 0.2228 |
1.095 | 12.2741 | 36000 | 1.2883 | 0.2310 | 0.3757 | 0.3795 | 0.3025 | 0.4730 | 0.3459 | 0.3813 | 0.2028 | 0.2700 | 0.2346 | 0.2165 |
1.2933 | 13.6379 | 40000 | 1.2834 | 0.2327 | 0.3785 | 0.3882 | 0.2849 | 0.4351 | 0.5287 | 0.2655 | 0.1996 | 0.2682 | 0.2804 | 0.1825 |
1.2888 | 15.0017 | 44000 | 1.2892 | 0.2325 | 0.3807 | 0.3921 | 0.2443 | 0.6177 | 0.3612 | 0.2995 | 0.1860 | 0.3028 | 0.2493 | 0.1918 |
1.2542 | 16.3655 | 48000 | 1.2823 | 0.2372 | 0.3865 | 0.3972 | 0.2325 | 0.6028 | 0.3713 | 0.3395 | 0.1804 | 0.3078 | 0.2554 | 0.2050 |
1.2767 | 17.7293 | 52000 | 1.2623 | 0.2481 | 0.3976 | 0.4070 | 0.2735 | 0.4473 | 0.5460 | 0.3236 | 0.2071 | 0.2865 | 0.3009 | 0.1978 |
1.2314 | 19.0931 | 56000 | 1.2697 | 0.2541 | 0.4045 | 0.4111 | 0.2902 | 0.5142 | 0.4228 | 0.3906 | 0.2086 | 0.3068 | 0.2825 | 0.2186 |
1.273 | 20.4569 | 60000 | 1.2814 | 0.2509 | 0.4011 | 0.4111 | 0.2550 | 0.5118 | 0.4926 | 0.3452 | 0.2010 | 0.3025 | 0.2933 | 0.2066 |
1.5948 | 21.8207 | 64000 | 1.2636 | 0.2561 | 0.4070 | 0.4150 | 0.2698 | 0.5138 | 0.4581 | 0.3862 | 0.2084 | 0.3075 | 0.2925 | 0.2160 |
0.8904 | 23.1845 | 68000 | 1.2795 | 0.2490 | 0.4008 | 0.4138 | 0.2460 | 0.5977 | 0.4621 | 0.2974 | 0.1932 | 0.3167 | 0.2907 | 0.1953 |
1.2066 | 24.5482 | 72000 | 1.2601 | 0.2555 | 0.4119 | 0.4259 | 0.2483 | 0.4454 | 0.6835 | 0.2703 | 0.2036 | 0.3017 | 0.3208 | 0.1959 |
1.284 | 25.9120 | 76000 | 1.2672 | 0.2615 | 0.4179 | 0.4220 | 0.2264 | 0.4455 | 0.4647 | 0.5349 | 0.1919 | 0.2995 | 0.3056 | 0.2490 |
1.3295 | 27.2758 | 80000 | 1.2713 | 0.2703 | 0.4265 | 0.4333 | 0.2465 | 0.5185 | 0.4634 | 0.4776 | 0.2047 | 0.3240 | 0.3087 | 0.2437 |
0.857 | 28.6396 | 84000 | 1.2813 | 0.2655 | 0.4286 | 0.4262 | 0.2374 | 0.4049 | 0.3834 | 0.6887 | 0.2048 | 0.2945 | 0.2891 | 0.2734 |
1.1713 | 30.0034 | 88000 | 1.2530 | 0.2789 | 0.4374 | 0.4473 | 0.3726 | 0.4646 | 0.6296 | 0.2827 | 0.2610 | 0.3141 | 0.3358 | 0.2046 |
0.7255 | 31.3672 | 92000 | 1.2424 | 0.2842 | 0.4424 | 0.4444 | 0.3358 | 0.4430 | 0.4794 | 0.5114 | 0.2469 | 0.3141 | 0.3177 | 0.2582 |
1.0961 | 32.7310 | 96000 | 1.2584 | 0.2760 | 0.4366 | 0.4447 | 0.2329 | 0.4157 | 0.6283 | 0.4696 | 0.2077 | 0.3070 | 0.3346 | 0.2548 |
1.1369 | 34.0948 | 100000 | 1.2632 | 0.2863 | 0.4469 | 0.4483 | 0.2992 | 0.4085 | 0.5100 | 0.5698 | 0.2409 | 0.3092 | 0.3272 | 0.2677 |
1.0328 | 35.4586 | 104000 | 1.2445 | 0.2921 | 0.4512 | 0.4568 | 0.3166 | 0.5220 | 0.4865 | 0.4798 | 0.2508 | 0.3353 | 0.3201 | 0.2621 |
1.2026 | 36.8224 | 108000 | 1.2417 | 0.2928 | 0.4522 | 0.4602 | 0.3099 | 0.4714 | 0.6024 | 0.4251 | 0.2529 | 0.3287 | 0.3358 | 0.2537 |
1.2102 | 38.1862 | 112000 | 1.3176 | 0.2773 | 0.4430 | 0.4419 | 0.2541 | 0.3309 | 0.5290 | 0.6580 | 0.2253 | 0.2724 | 0.3334 | 0.2782 |
1.1765 | 39.5499 | 116000 | 1.2528 | 0.2977 | 0.4609 | 0.4591 | 0.3603 | 0.4502 | 0.4232 | 0.6098 | 0.2715 | 0.3233 | 0.3141 | 0.2821 |
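The evaluation cadence in the table also lets one back out an approximate epoch length (a rough inference, not stated in the card):

```python
# Rough sketch: step 4000 falls at epoch 1.3638, so one epoch is about
# 4000 / 1.3638 optimizer steps. With train_batch_size=1 and no gradient
# accumulation listed, that is also roughly the training-set size.
steps_per_epoch = round(4000 / 1.3638)  # ~2933
total_steps = steps_per_epoch * 40      # ~117,320 steps over 40 epochs
print(steps_per_epoch, total_steps)
```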
### Framework versions

- Transformers 4.48.3
- PyTorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0