| model_id (string, length 9–102) | model_card (string, length 4–343k) | model_labels (list, length 2–50.8k) |
|---|---|---|
dltpdn/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
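The `linear` scheduler above decays the learning rate from 1e-05 down to zero over the course of training (no warmup steps are listed in the card). A minimal sketch of that decay, assuming zero warmup and a hypothetical total step count of 1020 (the card does not state the actual number of optimizer steps):

```python
def linear_lr(step: int, base_lr: float = 1e-05, total_steps: int = 1020) -> float:
    """Linearly decay base_lr to 0 over total_steps (zero warmup assumed)."""
    remaining = max(0, total_steps - step)
    return base_lr * (remaining / total_steps)

print(linear_lr(0))     # 1e-05  (full rate at the start)
print(linear_lr(510))   # 5e-06  (half the rate at the midpoint)
print(linear_lr(1020))  # 0.0    (fully decayed at the end)
```

The rate falls in a straight line, so the effective learning rate averaged over the run is half the configured `learning_rate`.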
### Training results
### Framework versions
- Transformers 4.47.1
- PyTorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
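These five CPPE-5 labels are typically wired into the checkpoint's config as `id2label`/`label2id`, so that predicted class indices decode back to names. A plain-Python sketch of that mapping, not tied to any particular checkpoint:

```python
labels = ["coverall", "face_shield", "gloves", "goggles", "mask"]

# DETR-style configs index classes from 0 in label-list order.
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in id2label.items()}

print(id2label[3])       # goggles
print(label2id["mask"])  # 4
```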
DFAIvenger/detr-resnet-50-dc5-fashionpedia-finetuned |
# detr-resnet-50-dc5-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
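Unlike the other cards, this run is bounded by `training_steps` (10000) rather than epochs. How many passes over the data that amounts to depends on the dataset and batch sizes; a sketch of the conversion, where the dataset size of 40,000 images is a placeholder assumption (the card does not state the actual split size):

```python
import math

def epochs_covered(training_steps: int, dataset_size: int, batch_size: int) -> float:
    """Approximate number of passes over the data in a step-bounded run."""
    steps_per_epoch = math.ceil(dataset_size / batch_size)
    return training_steps / steps_per_epoch

# Hypothetical 40,000-image dataset with the card's batch size of 4:
print(epochs_covered(10000, 40_000, 4))  # 1.0
```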
### Framework versions
- Transformers 4.47.1
- PyTorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
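Several Fashionpedia labels contain commas ("shirt, blouse", "bag, wallet"), which is why the label column is stored as a list: a naive comma-joined round-trip would corrupt the label set. A small illustration:

```python
import json

labels = ["shirt, blouse", "top, t-shirt, sweatshirt", "sweater"]

# A naive comma join/split tears the multi-name labels apart:
corrupted = ",".join(labels).split(",")
print(len(corrupted))  # 6 entries instead of 3

# A structure-preserving format such as JSON round-trips them intact:
restored = json.loads(json.dumps(labels))
print(restored == labels)  # True
```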
initial01/detr-finetuned-cppe-5-10k-steps |
# detr-finetuned-cppe-5-10k-steps
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the danelcsb/cppe-5-v2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9579
- Map: 0.3769
- Map 50: 0.6804
- Map 75: 0.3483
- Map Small: 0.1331
- Map Medium: 0.3109
- Map Large: 0.5141
- Mar 1: 0.3946
- Mar 10: 0.5584
- Mar 100: 0.5691
- Mar Small: 0.3543
- Mar Medium: 0.5333
- Mar Large: 0.6781
- Map Coverall: 0.646
- Mar 100 Coverall: 0.8118
- Map Face Shield: 0.2809
- Mar 100 Face Shield: 0.5564
- Map Gloves: 0.2972
- Mar 100 Gloves: 0.4827
- Map Goggles: 0.223
- Mar 100 Goggles: 0.4476
- Map Mask: 0.4374
- Mar 100 Mask: 0.5469
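The per-class numbers above show a wide spread, from coverall (0.646) down to goggles (0.223). A quick sketch ranking the classes by the card's final mAP values:

```python
map_per_class = {
    "coverall": 0.646,
    "face_shield": 0.2809,
    "gloves": 0.2972,
    "goggles": 0.223,
    "mask": 0.4374,
}

best = max(map_per_class, key=map_per_class.get)
worst = min(map_per_class, key=map_per_class.get)
print(best, worst)  # coverall goggles
```

Large, consistently framed objects (coveralls) are detected far more reliably than small accessories (goggles), which also matches the Map Small vs. Map Large gap above.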
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| 2.7184 | 1.0 | 102 | 2.6066 | 0.0053 | 0.0133 | 0.0034 | 0.0015 | 0.0054 | 0.0095 | 0.0358 | 0.0838 | 0.1556 | 0.0565 | 0.1579 | 0.1476 | 0.0178 | 0.5049 | 0.0 | 0.0 | 0.0058 | 0.152 | 0.0 | 0.0 | 0.0027 | 0.121 |
| 2.2231 | 2.0 | 204 | 1.9040 | 0.0421 | 0.0848 | 0.0341 | 0.0115 | 0.057 | 0.0441 | 0.0967 | 0.2064 | 0.2527 | 0.1085 | 0.2546 | 0.2474 | 0.1541 | 0.7 | 0.0 | 0.0 | 0.028 | 0.2642 | 0.0 | 0.0 | 0.0284 | 0.2994 |
| 2.1254 | 3.0 | 306 | 1.8283 | 0.0596 | 0.1146 | 0.0533 | 0.0194 | 0.0679 | 0.0601 | 0.1115 | 0.2279 | 0.2483 | 0.1033 | 0.3053 | 0.247 | 0.2259 | 0.684 | 0.0 | 0.0 | 0.0296 | 0.2168 | 0.0 | 0.0 | 0.0424 | 0.3407 |
| 1.9328 | 4.0 | 408 | 1.8491 | 0.0699 | 0.1462 | 0.0568 | 0.0301 | 0.0714 | 0.0741 | 0.1229 | 0.225 | 0.2587 | 0.1373 | 0.2285 | 0.248 | 0.2484 | 0.7333 | 0.0021 | 0.0164 | 0.0357 | 0.2397 | 0.0 | 0.0 | 0.0631 | 0.3043 |
| 1.7998 | 5.0 | 510 | 1.6615 | 0.1104 | 0.2435 | 0.0839 | 0.0439 | 0.1081 | 0.1176 | 0.1492 | 0.253 | 0.265 | 0.1203 | 0.2916 | 0.2703 | 0.3398 | 0.6458 | 0.033 | 0.0982 | 0.0411 | 0.2391 | 0.0002 | 0.0032 | 0.1378 | 0.3389 |
| 1.7995 | 6.0 | 612 | 1.6423 | 0.136 | 0.2587 | 0.1193 | 0.0429 | 0.111 | 0.154 | 0.1655 | 0.2745 | 0.3097 | 0.1494 | 0.3333 | 0.3207 | 0.4282 | 0.7174 | 0.0326 | 0.0964 | 0.0409 | 0.3101 | 0.0023 | 0.0222 | 0.176 | 0.4025 |
| 1.6677 | 7.0 | 714 | 1.5354 | 0.1412 | 0.281 | 0.123 | 0.0596 | 0.1478 | 0.1447 | 0.1806 | 0.3321 | 0.3586 | 0.2333 | 0.3621 | 0.3616 | 0.4209 | 0.7319 | 0.0337 | 0.2764 | 0.0605 | 0.3497 | 0.0008 | 0.0238 | 0.1903 | 0.4111 |
| 1.6329 | 8.0 | 816 | 1.5835 | 0.1591 | 0.3248 | 0.1322 | 0.0616 | 0.2121 | 0.1744 | 0.1919 | 0.3677 | 0.3951 | 0.3209 | 0.42 | 0.4255 | 0.4396 | 0.7111 | 0.0595 | 0.3655 | 0.0618 | 0.3358 | 0.0182 | 0.1587 | 0.2162 | 0.4043 |
| 1.6031 | 9.0 | 918 | 1.4644 | 0.165 | 0.3424 | 0.1443 | 0.0469 | 0.1803 | 0.1983 | 0.2043 | 0.3692 | 0.3898 | 0.2638 | 0.3951 | 0.4381 | 0.5105 | 0.7521 | 0.0703 | 0.3218 | 0.0619 | 0.3162 | 0.017 | 0.1937 | 0.1656 | 0.3654 |
| 1.5464 | 10.0 | 1020 | 1.3868 | 0.2084 | 0.3927 | 0.1792 | 0.0797 | 0.1933 | 0.2438 | 0.2454 | 0.405 | 0.4273 | 0.2948 | 0.424 | 0.4738 | 0.5649 | 0.7556 | 0.0925 | 0.4455 | 0.0756 | 0.3391 | 0.0261 | 0.1397 | 0.2827 | 0.4568 |
| 1.5454 | 11.0 | 1122 | 1.4023 | 0.192 | 0.3869 | 0.1682 | 0.0792 | 0.1991 | 0.2221 | 0.2421 | 0.4079 | 0.4247 | 0.2727 | 0.3988 | 0.4983 | 0.5026 | 0.7431 | 0.0667 | 0.38 | 0.0785 | 0.3229 | 0.0195 | 0.2063 | 0.2927 | 0.471 |
| 1.5008 | 12.0 | 1224 | 1.4417 | 0.1545 | 0.3341 | 0.1307 | 0.0629 | 0.1426 | 0.1898 | 0.2025 | 0.3779 | 0.4036 | 0.1873 | 0.3478 | 0.4817 | 0.4062 | 0.7465 | 0.0457 | 0.2836 | 0.066 | 0.3425 | 0.0237 | 0.2159 | 0.231 | 0.4296 |
| 1.4689 | 13.0 | 1326 | 1.4068 | 0.1938 | 0.4018 | 0.173 | 0.0734 | 0.2144 | 0.2255 | 0.2475 | 0.417 | 0.4383 | 0.258 | 0.4313 | 0.5185 | 0.475 | 0.7528 | 0.0753 | 0.3655 | 0.0911 | 0.3542 | 0.0468 | 0.2921 | 0.2807 | 0.4272 |
| 1.4263 | 14.0 | 1428 | 1.3763 | 0.1811 | 0.3658 | 0.1486 | 0.0969 | 0.1508 | 0.2104 | 0.2251 | 0.392 | 0.4163 | 0.2785 | 0.3984 | 0.4498 | 0.5154 | 0.7375 | 0.0332 | 0.3091 | 0.0912 | 0.3676 | 0.0303 | 0.2778 | 0.2356 | 0.3895 |
| 1.4193 | 15.0 | 1530 | 1.3191 | 0.2018 | 0.4037 | 0.1755 | 0.0892 | 0.195 | 0.2507 | 0.2617 | 0.4412 | 0.4602 | 0.2284 | 0.4069 | 0.5573 | 0.4697 | 0.7556 | 0.064 | 0.4473 | 0.1109 | 0.3453 | 0.0396 | 0.2794 | 0.325 | 0.4735 |
| 1.3707 | 16.0 | 1632 | 1.2574 | 0.2307 | 0.446 | 0.203 | 0.0786 | 0.2154 | 0.3001 | 0.2646 | 0.4684 | 0.4894 | 0.2933 | 0.4306 | 0.5928 | 0.5449 | 0.7583 | 0.1085 | 0.4582 | 0.1139 | 0.3989 | 0.0573 | 0.3444 | 0.3288 | 0.487 |
| 1.3594 | 17.0 | 1734 | 1.3527 | 0.2115 | 0.4043 | 0.185 | 0.0693 | 0.2171 | 0.2571 | 0.2471 | 0.4242 | 0.4464 | 0.1675 | 0.424 | 0.5451 | 0.4841 | 0.7451 | 0.0776 | 0.3327 | 0.1426 | 0.4061 | 0.041 | 0.2921 | 0.312 | 0.4562 |
| 1.353 | 18.0 | 1836 | 1.2458 | 0.2239 | 0.4409 | 0.206 | 0.1241 | 0.1888 | 0.2979 | 0.2722 | 0.4603 | 0.4849 | 0.3326 | 0.463 | 0.5781 | 0.5126 | 0.7667 | 0.0968 | 0.4636 | 0.1468 | 0.3994 | 0.0432 | 0.3175 | 0.3199 | 0.4772 |
| 1.2895 | 19.0 | 1938 | 1.2359 | 0.2359 | 0.4527 | 0.2218 | 0.1001 | 0.2109 | 0.3097 | 0.2898 | 0.4599 | 0.4741 | 0.2588 | 0.4135 | 0.5761 | 0.5503 | 0.7708 | 0.1012 | 0.4527 | 0.1608 | 0.4207 | 0.0345 | 0.2619 | 0.333 | 0.4642 |
| 1.281 | 20.0 | 2040 | 1.2395 | 0.2336 | 0.4626 | 0.2112 | 0.0803 | 0.1993 | 0.2902 | 0.284 | 0.4584 | 0.4744 | 0.2248 | 0.4237 | 0.5854 | 0.576 | 0.7799 | 0.1113 | 0.4127 | 0.1211 | 0.3872 | 0.0465 | 0.3302 | 0.3129 | 0.4623 |
| 1.2998 | 21.0 | 2142 | 1.2720 | 0.2331 | 0.4603 | 0.2056 | 0.09 | 0.1863 | 0.2905 | 0.2827 | 0.4446 | 0.4632 | 0.208 | 0.4373 | 0.5567 | 0.559 | 0.7667 | 0.1126 | 0.4509 | 0.1362 | 0.3698 | 0.0634 | 0.2952 | 0.2941 | 0.4333 |
| 1.2792 | 22.0 | 2244 | 1.2337 | 0.2484 | 0.4841 | 0.2234 | 0.0792 | 0.2208 | 0.3134 | 0.2857 | 0.4458 | 0.4648 | 0.1966 | 0.4095 | 0.578 | 0.5778 | 0.7882 | 0.1232 | 0.4164 | 0.1722 | 0.3827 | 0.0402 | 0.3079 | 0.3284 | 0.429 |
| 1.2585 | 23.0 | 2346 | 1.2072 | 0.27 | 0.4938 | 0.2544 | 0.0898 | 0.2435 | 0.338 | 0.2941 | 0.4769 | 0.4885 | 0.2701 | 0.4476 | 0.5933 | 0.5733 | 0.7576 | 0.1614 | 0.44 | 0.1648 | 0.4017 | 0.089 | 0.3444 | 0.3614 | 0.4988 |
| 1.2448 | 24.0 | 2448 | 1.2070 | 0.256 | 0.4825 | 0.2235 | 0.084 | 0.2321 | 0.3487 | 0.3079 | 0.459 | 0.4656 | 0.2084 | 0.4237 | 0.5849 | 0.5846 | 0.7736 | 0.1178 | 0.4473 | 0.1628 | 0.3765 | 0.073 | 0.254 | 0.3416 | 0.4765 |
| 1.2294 | 25.0 | 2550 | 1.2309 | 0.2452 | 0.4716 | 0.2184 | 0.0936 | 0.1978 | 0.3224 | 0.2844 | 0.4588 | 0.4675 | 0.2615 | 0.4478 | 0.579 | 0.5602 | 0.7493 | 0.1464 | 0.4309 | 0.1602 | 0.3927 | 0.06 | 0.3286 | 0.2992 | 0.4358 |
| 1.2595 | 26.0 | 2652 | 1.2038 | 0.2487 | 0.4766 | 0.2387 | 0.1149 | 0.1975 | 0.3144 | 0.2949 | 0.4697 | 0.4822 | 0.3066 | 0.4111 | 0.5843 | 0.5839 | 0.7688 | 0.1494 | 0.4745 | 0.1552 | 0.4223 | 0.0462 | 0.3048 | 0.3086 | 0.4407 |
| 1.1934 | 27.0 | 2754 | 1.1991 | 0.2489 | 0.4654 | 0.2356 | 0.111 | 0.2088 | 0.3068 | 0.2877 | 0.469 | 0.4768 | 0.3152 | 0.4497 | 0.5617 | 0.5761 | 0.7611 | 0.1759 | 0.4582 | 0.1172 | 0.3676 | 0.0339 | 0.3286 | 0.3413 | 0.4685 |
| 1.1864 | 28.0 | 2856 | 1.1400 | 0.2631 | 0.5124 | 0.2318 | 0.1017 | 0.2317 | 0.3561 | 0.3264 | 0.5114 | 0.5248 | 0.3453 | 0.4718 | 0.6482 | 0.5566 | 0.7806 | 0.1568 | 0.5091 | 0.2059 | 0.433 | 0.0707 | 0.4317 | 0.3257 | 0.4698 |
| 1.146 | 29.0 | 2958 | 1.1343 | 0.2803 | 0.5264 | 0.2612 | 0.0919 | 0.2465 | 0.3757 | 0.3346 | 0.5101 | 0.5227 | 0.3628 | 0.4858 | 0.6342 | 0.5722 | 0.7743 | 0.1673 | 0.5236 | 0.1962 | 0.4291 | 0.0854 | 0.3857 | 0.3803 | 0.5006 |
| 1.1749 | 30.0 | 3060 | 1.1617 | 0.2458 | 0.4916 | 0.2194 | 0.0753 | 0.194 | 0.3481 | 0.2977 | 0.4894 | 0.5073 | 0.3132 | 0.427 | 0.6398 | 0.583 | 0.7917 | 0.1541 | 0.5273 | 0.1731 | 0.4067 | 0.0472 | 0.3937 | 0.2713 | 0.4173 |
| 1.1695 | 31.0 | 3162 | 1.1849 | 0.254 | 0.5068 | 0.2275 | 0.089 | 0.2121 | 0.3555 | 0.3071 | 0.4853 | 0.498 | 0.2604 | 0.4693 | 0.6007 | 0.5234 | 0.7479 | 0.1237 | 0.4818 | 0.1924 | 0.4268 | 0.0908 | 0.3571 | 0.3395 | 0.4765 |
| 1.1613 | 32.0 | 3264 | 1.1447 | 0.2701 | 0.5138 | 0.2449 | 0.1041 | 0.2158 | 0.3704 | 0.3234 | 0.4972 | 0.508 | 0.2777 | 0.4314 | 0.635 | 0.5824 | 0.741 | 0.0965 | 0.4891 | 0.2241 | 0.4475 | 0.1017 | 0.3762 | 0.3458 | 0.4864 |
| 1.098 | 33.0 | 3366 | 1.1320 | 0.2686 | 0.5117 | 0.2382 | 0.1397 | 0.234 | 0.3737 | 0.3184 | 0.5021 | 0.5167 | 0.3209 | 0.4494 | 0.6358 | 0.5733 | 0.7847 | 0.1279 | 0.5255 | 0.2049 | 0.4302 | 0.0925 | 0.3683 | 0.3442 | 0.4747 |
| 1.119 | 34.0 | 3468 | 1.1157 | 0.2826 | 0.5481 | 0.24 | 0.0995 | 0.2062 | 0.3898 | 0.3266 | 0.5164 | 0.5243 | 0.3035 | 0.4758 | 0.6339 | 0.5914 | 0.791 | 0.149 | 0.5145 | 0.2382 | 0.433 | 0.0871 | 0.3968 | 0.3474 | 0.4864 |
| 1.1193 | 35.0 | 3570 | 1.1290 | 0.2701 | 0.5107 | 0.2566 | 0.1223 | 0.2484 | 0.3594 | 0.3345 | 0.5143 | 0.5339 | 0.3225 | 0.4833 | 0.643 | 0.5701 | 0.7972 | 0.1372 | 0.5436 | 0.2003 | 0.424 | 0.0671 | 0.3984 | 0.376 | 0.5062 |
| 1.1078 | 36.0 | 3672 | 1.1141 | 0.2919 | 0.5577 | 0.2664 | 0.1137 | 0.2548 | 0.3932 | 0.3459 | 0.5121 | 0.5276 | 0.3414 | 0.5069 | 0.6317 | 0.6082 | 0.8028 | 0.1585 | 0.5182 | 0.2206 | 0.4307 | 0.0893 | 0.3873 | 0.3828 | 0.4988 |
| 1.1058 | 37.0 | 3774 | 1.1030 | 0.2998 | 0.5571 | 0.2839 | 0.0986 | 0.2493 | 0.4103 | 0.3382 | 0.5038 | 0.5226 | 0.3159 | 0.4878 | 0.6339 | 0.6137 | 0.7979 | 0.1434 | 0.5109 | 0.2409 | 0.4525 | 0.1156 | 0.3556 | 0.3854 | 0.4963 |
| 1.0858 | 38.0 | 3876 | 1.1045 | 0.2903 | 0.5478 | 0.2586 | 0.1247 | 0.2634 | 0.3745 | 0.3242 | 0.4986 | 0.5144 | 0.3195 | 0.4866 | 0.6067 | 0.6152 | 0.791 | 0.1736 | 0.4982 | 0.2114 | 0.4453 | 0.0625 | 0.3286 | 0.389 | 0.5093 |
| 1.119 | 39.0 | 3978 | 1.0792 | 0.3029 | 0.5638 | 0.2777 | 0.0902 | 0.2537 | 0.4204 | 0.336 | 0.5012 | 0.5142 | 0.2878 | 0.4634 | 0.6236 | 0.6232 | 0.7854 | 0.1472 | 0.4709 | 0.2203 | 0.4503 | 0.1361 | 0.354 | 0.3876 | 0.5105 |
| 1.0875 | 40.0 | 4080 | 1.0980 | 0.2971 | 0.5562 | 0.2757 | 0.0952 | 0.2547 | 0.4105 | 0.3452 | 0.514 | 0.5306 | 0.3262 | 0.5031 | 0.6313 | 0.6002 | 0.7958 | 0.1773 | 0.5527 | 0.2115 | 0.4419 | 0.1141 | 0.3603 | 0.3825 | 0.5025 |
| 1.0678 | 41.0 | 4182 | 1.0920 | 0.3101 | 0.5739 | 0.2992 | 0.1013 | 0.2653 | 0.4163 | 0.3414 | 0.5251 | 0.5339 | 0.3002 | 0.4982 | 0.6393 | 0.6043 | 0.7931 | 0.1829 | 0.4855 | 0.2159 | 0.4553 | 0.1387 | 0.4143 | 0.4085 | 0.5216 |
| 1.063 | 42.0 | 4284 | 1.0699 | 0.298 | 0.5384 | 0.2756 | 0.0953 | 0.2476 | 0.4072 | 0.3549 | 0.5186 | 0.5331 | 0.2756 | 0.4947 | 0.6508 | 0.6078 | 0.791 | 0.1876 | 0.5073 | 0.1886 | 0.4408 | 0.1123 | 0.4111 | 0.3935 | 0.5154 |
| 1.0203 | 43.0 | 4386 | 1.0743 | 0.2996 | 0.5563 | 0.2799 | 0.1052 | 0.2584 | 0.4007 | 0.3431 | 0.5049 | 0.5176 | 0.2654 | 0.4677 | 0.6334 | 0.6125 | 0.7875 | 0.1773 | 0.4855 | 0.2153 | 0.4447 | 0.1303 | 0.3857 | 0.3625 | 0.4846 |
| 1.0451 | 44.0 | 4488 | 1.0523 | 0.3092 | 0.5677 | 0.298 | 0.1081 | 0.2643 | 0.4169 | 0.3462 | 0.5155 | 0.5306 | 0.3082 | 0.4822 | 0.638 | 0.6089 | 0.8014 | 0.1823 | 0.4727 | 0.2165 | 0.4497 | 0.1286 | 0.4 | 0.4099 | 0.529 |
| 1.0466 | 45.0 | 4590 | 1.0640 | 0.3084 | 0.5758 | 0.2981 | 0.1334 | 0.2597 | 0.4242 | 0.3463 | 0.5165 | 0.5313 | 0.2973 | 0.4779 | 0.6353 | 0.6042 | 0.7778 | 0.1959 | 0.5291 | 0.2111 | 0.4391 | 0.1519 | 0.4 | 0.3787 | 0.5105 |
| 1.0193 | 46.0 | 4692 | 1.0589 | 0.3088 | 0.5772 | 0.2879 | 0.097 | 0.254 | 0.4288 | 0.3425 | 0.5165 | 0.5297 | 0.3159 | 0.4744 | 0.6496 | 0.61 | 0.8069 | 0.185 | 0.4891 | 0.2327 | 0.4441 | 0.1067 | 0.3698 | 0.4097 | 0.5383 |
| 0.9959 | 47.0 | 4794 | 1.0625 | 0.3128 | 0.5804 | 0.2986 | 0.1159 | 0.2762 | 0.414 | 0.3518 | 0.5142 | 0.5268 | 0.2813 | 0.5191 | 0.6351 | 0.5796 | 0.8049 | 0.2084 | 0.4836 | 0.2305 | 0.4525 | 0.1401 | 0.3825 | 0.4053 | 0.5105 |
| 1.0161 | 48.0 | 4896 | 1.1113 | 0.2994 | 0.5631 | 0.2718 | 0.1116 | 0.2375 | 0.4107 | 0.3383 | 0.5089 | 0.5205 | 0.3101 | 0.4832 | 0.6215 | 0.6176 | 0.7889 | 0.1863 | 0.5291 | 0.2011 | 0.4358 | 0.1131 | 0.3556 | 0.3788 | 0.4932 |
| 1.0175 | 49.0 | 4998 | 1.0668 | 0.3173 | 0.5917 | 0.2847 | 0.1087 | 0.2504 | 0.4395 | 0.3498 | 0.5015 | 0.5104 | 0.2699 | 0.4529 | 0.6335 | 0.6292 | 0.7778 | 0.2088 | 0.5109 | 0.2267 | 0.4223 | 0.1358 | 0.3556 | 0.3858 | 0.4852 |
| 0.9979 | 50.0 | 5100 | 1.0493 | 0.3214 | 0.5978 | 0.3056 | 0.1125 | 0.2544 | 0.4391 | 0.3528 | 0.522 | 0.5394 | 0.2902 | 0.4955 | 0.6627 | 0.6278 | 0.8 | 0.2297 | 0.56 | 0.2442 | 0.4637 | 0.1171 | 0.3635 | 0.3881 | 0.5099 |
| 0.9904 | 51.0 | 5202 | 1.0453 | 0.3401 | 0.607 | 0.321 | 0.1441 | 0.2577 | 0.454 | 0.3673 | 0.5223 | 0.5363 | 0.3374 | 0.4618 | 0.6465 | 0.6503 | 0.7771 | 0.2359 | 0.5436 | 0.2344 | 0.4318 | 0.1581 | 0.4159 | 0.4217 | 0.513 |
| 0.9846 | 52.0 | 5304 | 1.0403 | 0.326 | 0.5822 | 0.327 | 0.1025 | 0.2834 | 0.4381 | 0.3643 | 0.5234 | 0.531 | 0.2839 | 0.4865 | 0.6413 | 0.6281 | 0.7986 | 0.1991 | 0.5182 | 0.2502 | 0.4514 | 0.1201 | 0.3524 | 0.4324 | 0.5346 |
| 0.9633 | 53.0 | 5406 | 1.0478 | 0.3181 | 0.5837 | 0.3182 | 0.0995 | 0.2797 | 0.4298 | 0.36 | 0.5191 | 0.5322 | 0.2812 | 0.4853 | 0.6551 | 0.631 | 0.8069 | 0.1657 | 0.4836 | 0.2318 | 0.4469 | 0.157 | 0.4 | 0.4049 | 0.5235 |
| 0.9673 | 54.0 | 5508 | 1.0160 | 0.3436 | 0.6169 | 0.3341 | 0.1415 | 0.2963 | 0.4787 | 0.3677 | 0.5326 | 0.5409 | 0.2983 | 0.4858 | 0.6692 | 0.6295 | 0.8069 | 0.2235 | 0.5182 | 0.2507 | 0.4385 | 0.1908 | 0.4095 | 0.4235 | 0.5315 |
| 0.932 | 55.0 | 5610 | 1.0526 | 0.3349 | 0.6158 | 0.3244 | 0.1117 | 0.2969 | 0.4433 | 0.3581 | 0.528 | 0.5416 | 0.3187 | 0.4857 | 0.6567 | 0.6113 | 0.8076 | 0.2424 | 0.5218 | 0.2464 | 0.4531 | 0.1768 | 0.419 | 0.3976 | 0.5062 |
| 0.952 | 56.0 | 5712 | 1.0479 | 0.3293 | 0.6097 | 0.2897 | 0.1166 | 0.2852 | 0.4376 | 0.3632 | 0.5245 | 0.5352 | 0.2998 | 0.5127 | 0.6473 | 0.6394 | 0.7965 | 0.2297 | 0.5309 | 0.2437 | 0.4464 | 0.1251 | 0.3857 | 0.4087 | 0.5167 |
| 0.9407 | 57.0 | 5814 | 1.0272 | 0.3283 | 0.6069 | 0.3116 | 0.1085 | 0.268 | 0.465 | 0.3657 | 0.5302 | 0.5436 | 0.3034 | 0.4854 | 0.6705 | 0.6336 | 0.8021 | 0.2297 | 0.5327 | 0.2335 | 0.4609 | 0.1385 | 0.4 | 0.4064 | 0.5222 |
| 0.9397 | 58.0 | 5916 | 1.0231 | 0.349 | 0.6351 | 0.3303 | 0.1188 | 0.2873 | 0.4816 | 0.3715 | 0.5423 | 0.5536 | 0.3065 | 0.5154 | 0.6701 | 0.6395 | 0.809 | 0.2533 | 0.5473 | 0.2523 | 0.4603 | 0.1838 | 0.4333 | 0.416 | 0.5179 |
| 0.9314 | 59.0 | 6018 | 1.0115 | 0.3483 | 0.6266 | 0.3305 | 0.1316 | 0.3084 | 0.4695 | 0.371 | 0.5311 | 0.5432 | 0.322 | 0.4979 | 0.6664 | 0.6387 | 0.8111 | 0.2686 | 0.5564 | 0.2601 | 0.4626 | 0.1603 | 0.3778 | 0.414 | 0.508 |
| 0.9166 | 60.0 | 6120 | 1.0243 | 0.3391 | 0.6249 | 0.3221 | 0.1232 | 0.2851 | 0.4776 | 0.3666 | 0.5371 | 0.5505 | 0.3372 | 0.5103 | 0.6696 | 0.6324 | 0.7979 | 0.2062 | 0.54 | 0.2379 | 0.4615 | 0.2038 | 0.4317 | 0.4152 | 0.5216 |
| 0.924 | 61.0 | 6222 | 1.0122 | 0.3354 | 0.6173 | 0.3128 | 0.1347 | 0.2828 | 0.4528 | 0.3614 | 0.5374 | 0.5528 | 0.3338 | 0.5168 | 0.6655 | 0.6429 | 0.8014 | 0.2192 | 0.5527 | 0.2427 | 0.4698 | 0.1652 | 0.427 | 0.4071 | 0.513 |
| 0.9165 | 62.0 | 6324 | 1.0434 | 0.3345 | 0.6152 | 0.3282 | 0.1142 | 0.2817 | 0.4585 | 0.3644 | 0.531 | 0.5431 | 0.2893 | 0.5075 | 0.6588 | 0.6436 | 0.7965 | 0.2145 | 0.5327 | 0.2701 | 0.4637 | 0.1629 | 0.427 | 0.3815 | 0.4957 |
| 0.9005 | 63.0 | 6426 | 0.9866 | 0.3572 | 0.6479 | 0.3335 | 0.128 | 0.2956 | 0.4984 | 0.376 | 0.5559 | 0.5668 | 0.3083 | 0.5304 | 0.6817 | 0.6512 | 0.8125 | 0.2538 | 0.5618 | 0.2758 | 0.4693 | 0.1885 | 0.4492 | 0.4167 | 0.5414 |
| 0.8996 | 64.0 | 6528 | 0.9901 | 0.3477 | 0.6335 | 0.3346 | 0.1162 | 0.2967 | 0.4793 | 0.3704 | 0.5415 | 0.5534 | 0.2957 | 0.5201 | 0.675 | 0.643 | 0.8056 | 0.2311 | 0.5418 | 0.2598 | 0.4838 | 0.1697 | 0.3921 | 0.4348 | 0.5438 |
| 0.8975 | 65.0 | 6630 | 1.0033 | 0.3397 | 0.634 | 0.3112 | 0.1167 | 0.2969 | 0.4509 | 0.3652 | 0.5342 | 0.5486 | 0.362 | 0.5161 | 0.6506 | 0.6341 | 0.7951 | 0.249 | 0.5255 | 0.2452 | 0.4726 | 0.1589 | 0.4206 | 0.4116 | 0.529 |
| 0.8908 | 66.0 | 6732 | 0.9923 | 0.3449 | 0.6215 | 0.3384 | 0.123 | 0.2871 | 0.474 | 0.3701 | 0.5418 | 0.5559 | 0.3346 | 0.5038 | 0.6677 | 0.6338 | 0.8007 | 0.2394 | 0.5309 | 0.2635 | 0.486 | 0.1822 | 0.4349 | 0.4057 | 0.5272 |
| 0.8959 | 67.0 | 6834 | 1.0181 | 0.3435 | 0.6396 | 0.3147 | 0.1244 | 0.3009 | 0.4555 | 0.3701 | 0.5421 | 0.5582 | 0.3111 | 0.5186 | 0.6676 | 0.6351 | 0.8069 | 0.2261 | 0.56 | 0.246 | 0.4626 | 0.1869 | 0.4127 | 0.4235 | 0.5488 |
| 0.8859 | 68.0 | 6936 | 1.0185 | 0.3487 | 0.6383 | 0.3248 | 0.1315 | 0.3021 | 0.4684 | 0.3732 | 0.5443 | 0.5579 | 0.313 | 0.5189 | 0.6687 | 0.6356 | 0.8014 | 0.262 | 0.5545 | 0.2352 | 0.4603 | 0.1923 | 0.4333 | 0.4187 | 0.5401 |
| 0.8789 | 69.0 | 7038 | 0.9968 | 0.3569 | 0.6402 | 0.3474 | 0.1113 | 0.3215 | 0.4761 | 0.376 | 0.5435 | 0.5574 | 0.3 | 0.5361 | 0.6643 | 0.6316 | 0.8076 | 0.2636 | 0.5418 | 0.2702 | 0.4654 | 0.19 | 0.4349 | 0.4289 | 0.537 |
| 0.8675 | 70.0 | 7140 | 0.9992 | 0.3535 | 0.6402 | 0.3588 | 0.1253 | 0.301 | 0.478 | 0.3704 | 0.5412 | 0.5562 | 0.3174 | 0.5313 | 0.6622 | 0.6274 | 0.8104 | 0.2616 | 0.5491 | 0.2785 | 0.4693 | 0.1847 | 0.4302 | 0.4153 | 0.5222 |
| 0.8513 | 71.0 | 7242 | 1.0028 | 0.3543 | 0.6439 | 0.3148 | 0.12 | 0.3177 | 0.4679 | 0.3838 | 0.5404 | 0.5554 | 0.3157 | 0.5152 | 0.6597 | 0.646 | 0.8174 | 0.2473 | 0.5564 | 0.2541 | 0.462 | 0.2172 | 0.419 | 0.407 | 0.5222 |
| 0.8621 | 72.0 | 7344 | 1.0029 | 0.3485 | 0.6401 | 0.3418 | 0.1305 | 0.3149 | 0.4686 | 0.3754 | 0.5321 | 0.5483 | 0.2893 | 0.5143 | 0.6649 | 0.6281 | 0.8083 | 0.2476 | 0.5436 | 0.2695 | 0.4626 | 0.1768 | 0.3952 | 0.4206 | 0.5315 |
| 0.8598 | 73.0 | 7446 | 1.0067 | 0.3558 | 0.6601 | 0.3215 | 0.1212 | 0.3124 | 0.4768 | 0.3692 | 0.539 | 0.5533 | 0.3012 | 0.5177 | 0.6634 | 0.6468 | 0.8028 | 0.2425 | 0.5236 | 0.264 | 0.457 | 0.2102 | 0.4492 | 0.4154 | 0.534 |
| 0.858 | 74.0 | 7548 | 0.9737 | 0.3611 | 0.6578 | 0.3395 | 0.1112 | 0.318 | 0.4917 | 0.3792 | 0.5544 | 0.5668 | 0.3003 | 0.5348 | 0.6774 | 0.6436 | 0.8083 | 0.2561 | 0.5855 | 0.279 | 0.4665 | 0.2096 | 0.4381 | 0.4171 | 0.5358 |
| 0.8404 | 75.0 | 7650 | 0.9834 | 0.3638 | 0.6521 | 0.3412 | 0.1358 | 0.3115 | 0.4937 | 0.3866 | 0.546 | 0.5596 | 0.3181 | 0.5281 | 0.6844 | 0.6387 | 0.7986 | 0.2642 | 0.5582 | 0.2725 | 0.4726 | 0.2278 | 0.4429 | 0.4158 | 0.5259 |
| 0.8558 | 76.0 | 7752 | 0.9765 | 0.3656 | 0.6597 | 0.3563 | 0.1068 | 0.3392 | 0.4915 | 0.3845 | 0.5463 | 0.5607 | 0.3079 | 0.5338 | 0.6779 | 0.6511 | 0.8042 | 0.2675 | 0.56 | 0.2923 | 0.4799 | 0.2092 | 0.4365 | 0.4079 | 0.5228 |
| 0.8338 | 77.0 | 7854 | 0.9860 | 0.3682 | 0.6592 | 0.331 | 0.1094 | 0.3315 | 0.4936 | 0.3778 | 0.5518 | 0.5637 | 0.3483 | 0.5224 | 0.6737 | 0.6595 | 0.8049 | 0.2605 | 0.5673 | 0.2689 | 0.4709 | 0.2163 | 0.4444 | 0.4361 | 0.5309 |
| 0.8454 | 78.0 | 7956 | 0.9876 | 0.366 | 0.6662 | 0.3421 | 0.1131 | 0.3236 | 0.489 | 0.3807 | 0.5459 | 0.5562 | 0.3028 | 0.5072 | 0.6758 | 0.6509 | 0.8014 | 0.278 | 0.5418 | 0.2593 | 0.4648 | 0.21 | 0.4317 | 0.432 | 0.5414 |
| 0.8319 | 79.0 | 8058 | 0.9771 | 0.3651 | 0.6651 | 0.3351 | 0.1234 | 0.3099 | 0.4936 | 0.3839 | 0.5498 | 0.5642 | 0.317 | 0.5096 | 0.6813 | 0.6521 | 0.816 | 0.2753 | 0.5509 | 0.2707 | 0.4804 | 0.1979 | 0.4317 | 0.4294 | 0.542 |
| 0.8368 | 80.0 | 8160 | 0.9886 | 0.3654 | 0.6689 | 0.3357 | 0.1371 | 0.3275 | 0.4843 | 0.3767 | 0.5479 | 0.5584 | 0.3301 | 0.5112 | 0.6671 | 0.6367 | 0.7972 | 0.2799 | 0.56 | 0.2777 | 0.4827 | 0.2147 | 0.427 | 0.4178 | 0.5253 |
| 0.8346 | 81.0 | 8262 | 0.9593 | 0.3663 | 0.6595 | 0.3305 | 0.1097 | 0.3251 | 0.4945 | 0.3839 | 0.5541 | 0.5656 | 0.3244 | 0.5284 | 0.6835 | 0.6385 | 0.8062 | 0.2747 | 0.5527 | 0.2886 | 0.4944 | 0.203 | 0.4333 | 0.4268 | 0.5414 |
| 0.8247 | 82.0 | 8364 | 0.9591 | 0.3674 | 0.6658 | 0.3293 | 0.1102 | 0.319 | 0.4972 | 0.3788 | 0.5496 | 0.5616 | 0.3093 | 0.5183 | 0.6776 | 0.6414 | 0.8097 | 0.2621 | 0.5545 | 0.2876 | 0.4788 | 0.2113 | 0.4222 | 0.4344 | 0.5426 |
| 0.8021 | 83.0 | 8466 | 0.9635 | 0.373 | 0.6714 | 0.3418 | 0.1179 | 0.3261 | 0.5015 | 0.3839 | 0.5507 | 0.562 | 0.323 | 0.5185 | 0.6753 | 0.6439 | 0.8111 | 0.2769 | 0.5545 | 0.2796 | 0.4732 | 0.2274 | 0.4317 | 0.4369 | 0.5395 |
| 0.8113 | 84.0 | 8568 | 0.9459 | 0.3785 | 0.6794 | 0.3632 | 0.1305 | 0.3286 | 0.514 | 0.388 | 0.5518 | 0.5654 | 0.3149 | 0.5341 | 0.679 | 0.6467 | 0.8132 | 0.2899 | 0.5564 | 0.2881 | 0.481 | 0.2335 | 0.4333 | 0.4343 | 0.5432 |
| 0.8146 | 85.0 | 8670 | 0.9516 | 0.3798 | 0.69 | 0.3588 | 0.1255 | 0.3253 | 0.5141 | 0.3884 | 0.553 | 0.5664 | 0.3204 | 0.5261 | 0.6825 | 0.6431 | 0.8049 | 0.2913 | 0.5582 | 0.3048 | 0.4844 | 0.2255 | 0.4444 | 0.4341 | 0.5401 |
| 0.8129 | 86.0 | 8772 | 0.9782 | 0.3791 | 0.6869 | 0.3499 | 0.1286 | 0.3203 | 0.5144 | 0.3914 | 0.5523 | 0.566 | 0.3086 | 0.5321 | 0.6841 | 0.642 | 0.8062 | 0.2851 | 0.5782 | 0.2948 | 0.4682 | 0.2418 | 0.4365 | 0.4316 | 0.5407 |
| 0.7919 | 87.0 | 8874 | 0.9658 | 0.382 | 0.6836 | 0.3658 | 0.1333 | 0.3282 | 0.5098 | 0.386 | 0.5553 | 0.5696 | 0.3325 | 0.5265 | 0.6809 | 0.6517 | 0.8097 | 0.2841 | 0.5727 | 0.3035 | 0.4872 | 0.2244 | 0.4286 | 0.4462 | 0.55 |
| 0.7972 | 88.0 | 8976 | 0.9767 | 0.376 | 0.6721 | 0.3546 | 0.1168 | 0.3185 | 0.507 | 0.3862 | 0.5568 | 0.5713 | 0.3286 | 0.55 | 0.6792 | 0.6441 | 0.8111 | 0.2799 | 0.5691 | 0.2954 | 0.4832 | 0.2197 | 0.4492 | 0.4411 | 0.5438 |
| 0.7994 | 89.0 | 9078 | 0.9656 | 0.382 | 0.669 | 0.3615 | 0.1262 | 0.3273 | 0.5104 | 0.3907 | 0.5606 | 0.5717 | 0.3356 | 0.5459 | 0.6844 | 0.6519 | 0.8153 | 0.2958 | 0.5636 | 0.2968 | 0.4872 | 0.2164 | 0.4381 | 0.4489 | 0.5543 |
| 0.789 | 90.0 | 9180 | 0.9649 | 0.3786 | 0.6728 | 0.3539 | 0.1144 | 0.3136 | 0.5142 | 0.3918 | 0.5544 | 0.5652 | 0.3281 | 0.5252 | 0.6838 | 0.651 | 0.8097 | 0.2838 | 0.5509 | 0.3014 | 0.4804 | 0.2118 | 0.4349 | 0.4451 | 0.55 |
| 0.799 | 91.0 | 9282 | 0.9637 | 0.3826 | 0.6882 | 0.3597 | 0.1366 | 0.33 | 0.5186 | 0.3972 | 0.5646 | 0.5763 | 0.366 | 0.5349 | 0.6859 | 0.6376 | 0.8069 | 0.2985 | 0.5855 | 0.2998 | 0.481 | 0.2427 | 0.4603 | 0.4343 | 0.5475 |
| 0.7742 | 92.0 | 9384 | 0.9634 | 0.3732 | 0.6658 | 0.3523 | 0.1325 | 0.3044 | 0.5138 | 0.391 | 0.5533 | 0.5636 | 0.3418 | 0.5087 | 0.6828 | 0.6481 | 0.8097 | 0.2727 | 0.54 | 0.2907 | 0.4732 | 0.2244 | 0.454 | 0.4301 | 0.5414 |
| 0.789 | 93.0 | 9486 | 0.9582 | 0.3734 | 0.6717 | 0.3496 | 0.1235 | 0.3045 | 0.5036 | 0.3934 | 0.5599 | 0.5709 | 0.3576 | 0.5285 | 0.6807 | 0.6445 | 0.8062 | 0.2755 | 0.5509 | 0.2971 | 0.4855 | 0.2188 | 0.4651 | 0.431 | 0.5469 |
| 0.7788 | 94.0 | 9588 | 0.9508 | 0.3794 | 0.6814 | 0.3451 | 0.1139 | 0.3162 | 0.5163 | 0.3954 | 0.5585 | 0.5704 | 0.3361 | 0.5368 | 0.6867 | 0.6516 | 0.8146 | 0.2837 | 0.5582 | 0.298 | 0.4888 | 0.2231 | 0.446 | 0.4404 | 0.5444 |
| 0.7757 | 95.0 | 9690 | 0.9593 | 0.3785 | 0.6749 | 0.3541 | 0.1197 | 0.322 | 0.5183 | 0.3983 | 0.558 | 0.5715 | 0.3425 | 0.5356 | 0.6857 | 0.6436 | 0.8132 | 0.2853 | 0.5618 | 0.2965 | 0.4832 | 0.226 | 0.4492 | 0.4412 | 0.55 |
| 0.7672 | 96.0 | 9792 | 0.9619 | 0.3781 | 0.6715 | 0.3528 | 0.1251 | 0.3249 | 0.5164 | 0.3944 | 0.5569 | 0.571 | 0.3525 | 0.53 | 0.685 | 0.645 | 0.8132 | 0.2836 | 0.5582 | 0.2902 | 0.4894 | 0.2391 | 0.4508 | 0.4328 | 0.5432 |
| 0.7646 | 97.0 | 9894 | 0.9548 | 0.3788 | 0.6778 | 0.3553 | 0.1221 | 0.3209 | 0.5197 | 0.3965 | 0.5568 | 0.5692 | 0.353 | 0.5344 | 0.6819 | 0.6463 | 0.8167 | 0.2846 | 0.5582 | 0.2956 | 0.4838 | 0.2313 | 0.4413 | 0.4362 | 0.5463 |
| 0.7742 | 98.0 | 9996 | 0.9617 | 0.3802 | 0.6793 | 0.3575 | 0.1269 | 0.3209 | 0.52 | 0.3957 | 0.5583 | 0.5702 | 0.3563 | 0.5287 | 0.6829 | 0.6474 | 0.8146 | 0.2867 | 0.5564 | 0.2991 | 0.4872 | 0.2307 | 0.4429 | 0.4373 | 0.55 |
| 0.7704 | 99.0 | 10098 | 0.9599 | 0.3766 | 0.6798 | 0.3493 | 0.1305 | 0.3206 | 0.5156 | 0.3957 | 0.5592 | 0.5707 | 0.3523 | 0.5352 | 0.6809 | 0.6434 | 0.8125 | 0.2833 | 0.5582 | 0.2967 | 0.4872 | 0.2228 | 0.4476 | 0.437 | 0.5481 |
| 0.7546 | 100.0 | 10200 | 0.9579 | 0.3769 | 0.6804 | 0.3483 | 0.1331 | 0.3109 | 0.5141 | 0.3946 | 0.5584 | 0.5691 | 0.3543 | 0.5333 | 0.6781 | 0.646 | 0.8118 | 0.2809 | 0.5564 | 0.2972 | 0.4827 | 0.223 | 0.4476 | 0.4374 | 0.5469 |
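The final checkpoint (epoch 100, loss 0.9579) is not the best in the log: epoch 84 posts the lowest validation loss in the table (0.9459). A sketch of checkpoint selection over a few (epoch, validation loss, mAP) rows copied from the table above:

```python
rows = [
    (84, 0.9459, 0.3785),
    (85, 0.9516, 0.3798),
    (94, 0.9508, 0.3794),
    (100, 0.9579, 0.3769),
]

best_by_loss = min(rows, key=lambda r: r[1])
best_by_map = max(rows, key=lambda r: r[2])
print(best_by_loss[0])  # 84
print(best_by_map[0])   # 85
```

Note that the two criteria disagree even on this small subset, which is one reason to pick the selection metric (loss vs. mAP) explicitly rather than defaulting to the last checkpoint.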
### Framework versions
- Transformers 4.47.1
- PyTorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
sars973/detr_finetuned_cppe5 |
# detr_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-101](https://huggingface.co/facebook/detr-resnet-101) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3489
- Map: 0.2197
- Map 50: 0.4408
- Map 75: 0.1896
- Map Small: 0.0734
- Map Medium: 0.1903
- Map Large: 0.3159
- Mar 1: 0.2516
- Mar 10: 0.4459
- Mar 100: 0.4733
- Mar Small: 0.191
- Mar Medium: 0.4214
- Mar Large: 0.6208
- Map Coverall: 0.4922
- Mar 100 Coverall: 0.6752
- Map Face Shield: 0.1239
- Mar 100 Face Shield: 0.4392
- Map Gloves: 0.1397
- Mar 100 Gloves: 0.4259
- Map Goggles: 0.0782
- Mar 100 Goggles: 0.42
- Map Mask: 0.2645
- Mar 100 Mask: 0.4062
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
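Unlike the linear schedules in the other cards, this run uses a `cosine` scheduler. A minimal sketch of cosine decay from the 5e-05 base rate, assuming zero warmup and a total of 3210 steps (107 steps per epoch × 30 epochs, matching the step counts in the card's results log):

```python
import math

def cosine_lr(step: int, base_lr: float = 5e-05, total_steps: int = 3210) -> float:
    """Cosine-decay base_lr toward 0 over total_steps (zero warmup assumed)."""
    progress = min(step, total_steps) / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0))     # 5e-05 (full rate at the start)
print(cosine_lr(1605))  # ~2.5e-05 (half the rate at the midpoint)
print(cosine_lr(3210))  # ~0.0 (fully decayed at the end)
```

Compared with linear decay, the cosine curve keeps the rate near the base value for longer early on and decays more steeply toward the end.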
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 2.2906 | 0.0222 | 0.0441 | 0.02 | 0.0042 | 0.0229 | 0.0243 | 0.0624 | 0.1415 | 0.1926 | 0.052 | 0.1457 | 0.2191 | 0.0902 | 0.5595 | 0.0 | 0.0 | 0.0106 | 0.1839 | 0.0 | 0.0 | 0.0103 | 0.2196 |
| No log | 2.0 | 214 | 2.2644 | 0.0368 | 0.0827 | 0.0263 | 0.0071 | 0.0254 | 0.0396 | 0.059 | 0.1417 | 0.1885 | 0.0671 | 0.1532 | 0.197 | 0.1528 | 0.5108 | 0.0 | 0.0 | 0.0102 | 0.2013 | 0.0 | 0.0 | 0.0209 | 0.2302 |
| No log | 3.0 | 321 | 2.1753 | 0.0446 | 0.108 | 0.0328 | 0.0103 | 0.0392 | 0.0454 | 0.0729 | 0.1561 | 0.1818 | 0.0786 | 0.1465 | 0.19 | 0.1766 | 0.505 | 0.0 | 0.0 | 0.0152 | 0.1817 | 0.0 | 0.0 | 0.0313 | 0.2222 |
| No log | 4.0 | 428 | 2.0441 | 0.0708 | 0.1548 | 0.0557 | 0.0094 | 0.0532 | 0.0746 | 0.0863 | 0.1814 | 0.1966 | 0.0738 | 0.1496 | 0.2101 | 0.2703 | 0.5491 | 0.0 | 0.0 | 0.032 | 0.2138 | 0.0 | 0.0 | 0.0515 | 0.22 |
| 2.0459 | 5.0 | 535 | 1.8798 | 0.0896 | 0.1805 | 0.0813 | 0.012 | 0.0648 | 0.0977 | 0.0895 | 0.2014 | 0.2246 | 0.081 | 0.1749 | 0.2519 | 0.3432 | 0.6018 | 0.0088 | 0.019 | 0.0326 | 0.2402 | 0.0 | 0.0 | 0.0635 | 0.2622 |
| 2.0459 | 6.0 | 642 | 1.8768 | 0.0888 | 0.1898 | 0.0727 | 0.0176 | 0.0618 | 0.1025 | 0.0974 | 0.2118 | 0.2366 | 0.0936 | 0.1639 | 0.2877 | 0.3342 | 0.6113 | 0.0102 | 0.081 | 0.0409 | 0.2536 | 0.0001 | 0.0046 | 0.0585 | 0.2324 |
| 2.0459 | 7.0 | 749 | 1.8465 | 0.0917 | 0.2075 | 0.0735 | 0.0216 | 0.0758 | 0.1166 | 0.0993 | 0.2008 | 0.2114 | 0.0748 | 0.1657 | 0.2472 | 0.3279 | 0.5905 | 0.0187 | 0.0443 | 0.0297 | 0.1862 | 0.0 | 0.0 | 0.0824 | 0.236 |
| 2.0459 | 8.0 | 856 | 1.7855 | 0.1165 | 0.2639 | 0.0908 | 0.0309 | 0.1054 | 0.1401 | 0.1341 | 0.2522 | 0.2767 | 0.0914 | 0.2412 | 0.3087 | 0.3517 | 0.6122 | 0.035 | 0.1684 | 0.0605 | 0.2902 | 0.0076 | 0.0215 | 0.1279 | 0.2911 |
| 2.0459 | 9.0 | 963 | 1.7916 | 0.1089 | 0.2494 | 0.0794 | 0.0287 | 0.087 | 0.1542 | 0.1341 | 0.282 | 0.3076 | 0.1564 | 0.2707 | 0.3343 | 0.3475 | 0.5685 | 0.0525 | 0.2797 | 0.0412 | 0.3063 | 0.0044 | 0.0754 | 0.099 | 0.308 |
| 1.6481 | 10.0 | 1070 | 1.6954 | 0.13 | 0.288 | 0.0955 | 0.0547 | 0.1057 | 0.1728 | 0.1643 | 0.3253 | 0.3494 | 0.1481 | 0.3027 | 0.4404 | 0.4015 | 0.6144 | 0.0393 | 0.2823 | 0.0652 | 0.3362 | 0.018 | 0.2185 | 0.1261 | 0.2956 |
| 1.6481 | 11.0 | 1177 | 1.7003 | 0.1424 | 0.3156 | 0.1053 | 0.0356 | 0.1306 | 0.1797 | 0.1699 | 0.3168 | 0.3385 | 0.1057 | 0.2948 | 0.4284 | 0.3823 | 0.6203 | 0.0518 | 0.2684 | 0.0809 | 0.308 | 0.0254 | 0.1569 | 0.1718 | 0.3391 |
| 1.6481 | 12.0 | 1284 | 1.6607 | 0.1468 | 0.3287 | 0.1131 | 0.0482 | 0.1304 | 0.1974 | 0.185 | 0.3226 | 0.3403 | 0.1284 | 0.2845 | 0.43 | 0.3794 | 0.5991 | 0.0699 | 0.2734 | 0.0715 | 0.3112 | 0.0238 | 0.1908 | 0.1892 | 0.3271 |
| 1.6481 | 13.0 | 1391 | 1.5862 | 0.1479 | 0.3321 | 0.1154 | 0.0532 | 0.1332 | 0.1939 | 0.1686 | 0.3415 | 0.3611 | 0.1666 | 0.3319 | 0.4261 | 0.4009 | 0.6275 | 0.0778 | 0.2835 | 0.0665 | 0.3263 | 0.0194 | 0.2354 | 0.1751 | 0.3329 |
| 1.6481 | 14.0 | 1498 | 1.6090 | 0.156 | 0.3327 | 0.1277 | 0.0447 | 0.1329 | 0.2319 | 0.1894 | 0.3776 | 0.4062 | 0.1785 | 0.357 | 0.5088 | 0.4333 | 0.6437 | 0.0546 | 0.3848 | 0.0923 | 0.3759 | 0.0302 | 0.2677 | 0.1694 | 0.3591 |
| 1.4268 | 15.0 | 1605 | 1.4913 | 0.1795 | 0.3855 | 0.147 | 0.0629 | 0.1506 | 0.2349 | 0.2076 | 0.3961 | 0.4235 | 0.2045 | 0.3634 | 0.5405 | 0.4682 | 0.6523 | 0.0713 | 0.3532 | 0.0941 | 0.3638 | 0.0461 | 0.3662 | 0.2179 | 0.3822 |
| 1.4268 | 16.0 | 1712 | 1.5350 | 0.1775 | 0.404 | 0.1384 | 0.0649 | 0.1484 | 0.2563 | 0.2119 | 0.3822 | 0.4094 | 0.1624 | 0.3624 | 0.5271 | 0.4532 | 0.6473 | 0.0644 | 0.3392 | 0.0806 | 0.3406 | 0.074 | 0.3538 | 0.2154 | 0.3662 |
| 1.4268 | 17.0 | 1819 | 1.4915 | 0.1842 | 0.3823 | 0.1568 | 0.0591 | 0.1542 | 0.2608 | 0.2168 | 0.3984 | 0.4208 | 0.1613 | 0.3758 | 0.5317 | 0.4509 | 0.6586 | 0.0853 | 0.3595 | 0.0951 | 0.3728 | 0.0471 | 0.3231 | 0.2428 | 0.3902 |
| 1.4268 | 18.0 | 1926 | 1.4537 | 0.1943 | 0.4042 | 0.1637 | 0.0724 | 0.1683 | 0.2693 | 0.2178 | 0.414 | 0.4384 | 0.1963 | 0.386 | 0.5681 | 0.4548 | 0.6577 | 0.0928 | 0.3772 | 0.1061 | 0.371 | 0.0684 | 0.3723 | 0.2494 | 0.4138 |
| 1.2822 | 19.0 | 2033 | 1.4585 | 0.1947 | 0.4081 | 0.1622 | 0.056 | 0.1612 | 0.2822 | 0.2239 | 0.4019 | 0.4303 | 0.1415 | 0.3811 | 0.5779 | 0.4742 | 0.6568 | 0.0916 | 0.3911 | 0.0974 | 0.3719 | 0.0701 | 0.3523 | 0.2404 | 0.3796 |
| 1.2822 | 20.0 | 2140 | 1.4307 | 0.2048 | 0.4048 | 0.1845 | 0.0582 | 0.179 | 0.2957 | 0.2286 | 0.4145 | 0.437 | 0.1719 | 0.3827 | 0.5799 | 0.4801 | 0.6541 | 0.102 | 0.381 | 0.1017 | 0.3701 | 0.091 | 0.3723 | 0.2491 | 0.4076 |
| 1.2822 | 21.0 | 2247 | 1.3939 | 0.1981 | 0.421 | 0.1641 | 0.0716 | 0.1697 | 0.2914 | 0.2249 | 0.4264 | 0.45 | 0.2087 | 0.3874 | 0.5846 | 0.4618 | 0.6685 | 0.1003 | 0.3886 | 0.1179 | 0.4228 | 0.0725 | 0.38 | 0.2378 | 0.3902 |
| 1.2822 | 22.0 | 2354 | 1.3966 | 0.2084 | 0.4238 | 0.1814 | 0.0638 | 0.1836 | 0.2981 | 0.2372 | 0.4322 | 0.455 | 0.1738 | 0.4012 | 0.6097 | 0.4764 | 0.6626 | 0.0987 | 0.4051 | 0.135 | 0.4098 | 0.0797 | 0.4015 | 0.252 | 0.396 |
| 1.2822 | 23.0 | 2461 | 1.3867 | 0.2125 | 0.4346 | 0.182 | 0.0781 | 0.1874 | 0.3082 | 0.2534 | 0.432 | 0.4603 | 0.1989 | 0.3963 | 0.6137 | 0.4686 | 0.6626 | 0.115 | 0.4241 | 0.1394 | 0.4147 | 0.0814 | 0.3938 | 0.2581 | 0.4062 |
| 1.1474 | 24.0 | 2568 | 1.3617 | 0.2177 | 0.4413 | 0.1898 | 0.0759 | 0.1939 | 0.3081 | 0.2521 | 0.4427 | 0.4719 | 0.2093 | 0.4277 | 0.6013 | 0.4865 | 0.6761 | 0.1186 | 0.4278 | 0.1469 | 0.4259 | 0.0852 | 0.4308 | 0.2512 | 0.3987 |
| 1.1474 | 25.0 | 2675 | 1.3695 | 0.215 | 0.4353 | 0.1889 | 0.0712 | 0.1881 | 0.3099 | 0.2553 | 0.4463 | 0.4727 | 0.1949 | 0.4258 | 0.6194 | 0.4879 | 0.6748 | 0.1128 | 0.4215 | 0.1426 | 0.4286 | 0.0771 | 0.44 | 0.2545 | 0.3987 |
| 1.1474 | 26.0 | 2782 | 1.3575 | 0.2177 | 0.433 | 0.1898 | 0.073 | 0.1879 | 0.316 | 0.2586 | 0.4454 | 0.4703 | 0.1913 | 0.4133 | 0.6285 | 0.493 | 0.682 | 0.1262 | 0.4228 | 0.1371 | 0.4241 | 0.0757 | 0.4231 | 0.2564 | 0.3996 |
| 1.1474 | 27.0 | 2889 | 1.3664 | 0.2161 | 0.4328 | 0.1848 | 0.0711 | 0.1846 | 0.314 | 0.2507 | 0.4425 | 0.4686 | 0.1885 | 0.4181 | 0.6185 | 0.4908 | 0.6775 | 0.1186 | 0.4165 | 0.1402 | 0.4246 | 0.0759 | 0.4277 | 0.2547 | 0.3969 |
| 1.1474 | 28.0 | 2996 | 1.3482 | 0.2188 | 0.4406 | 0.1864 | 0.0731 | 0.1891 | 0.317 | 0.2517 | 0.4463 | 0.4704 | 0.1906 | 0.4181 | 0.6175 | 0.491 | 0.6752 | 0.1219 | 0.4266 | 0.1388 | 0.4196 | 0.0776 | 0.4231 | 0.2647 | 0.4076 |
| 1.0794 | 29.0 | 3103 | 1.3481 | 0.2189 | 0.4402 | 0.1884 | 0.0732 | 0.1885 | 0.3173 | 0.2519 | 0.4459 | 0.4728 | 0.1923 | 0.4175 | 0.6232 | 0.491 | 0.673 | 0.1237 | 0.4354 | 0.1392 | 0.425 | 0.0768 | 0.4215 | 0.264 | 0.4089 |
| 1.0794 | 30.0 | 3210 | 1.3489 | 0.2197 | 0.4408 | 0.1896 | 0.0734 | 0.1903 | 0.3159 | 0.2516 | 0.4459 | 0.4733 | 0.191 | 0.4214 | 0.6208 | 0.4922 | 0.6752 | 0.1239 | 0.4392 | 0.1397 | 0.4259 | 0.0782 | 0.42 | 0.2645 | 0.4062 |
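The AP/AR columns above follow COCO conventions: mAP averaged over IoU thresholds 0.50–0.95, AP50 at a single 0.5 IoU threshold, AR at various detection budgets, plus per-class breakdowns. As a conceptual sketch of the matching step behind these numbers (this is not the evaluator used here — the reported metrics come from a COCO-style tool), single-threshold average precision can be computed like this:

```python
def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def average_precision(preds, gts, iou_thr=0.5):
    """AP at one IoU threshold.

    preds: list of (score, box); gts: list of boxes.
    Predictions are matched greedily in descending score order.
    """
    if not gts:
        return 0.0
    preds = sorted(preds, key=lambda p: -p[0])
    matched = [False] * len(gts)
    tp = fp = 0
    precisions, recalls = [], []
    for _score, box in preds:
        best, best_j = 0.0, -1
        for j, g in enumerate(gts):
            if not matched[j]:
                v = iou(box, g)
                if v > best:
                    best, best_j = v, j
        if best >= iou_thr:
            matched[best_j] = True
            tp += 1
        else:
            fp += 1
        precisions.append(tp / (tp + fp))
        recalls.append(tp / len(gts))
    # step-wise area under the precision-recall curve
    ap, prev_r = 0.0, 0.0
    for p, r in zip(precisions, recalls):
        ap += p * (r - prev_r)
        prev_r = r
    return ap
```

Real evaluators additionally interpolate precision and average across IoU thresholds, classes, and object-size buckets, which this sketch omits.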
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.1
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
sha000/detr-finetuned-quadrant-v1 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the model's risks, biases, and limitations. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
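Until an official snippet is provided, the post-processing step that DETR-style detectors apply to raw predictions can be sketched as follows. Everything here is an illustrative assumption (the function names, the 0.5 score threshold, and the normalized (cx, cy, w, h) box format used by this model family), not code confirmed by the model authors:

```python
def cxcywh_to_xyxy(box, img_w, img_h):
    """Convert a normalized (cx, cy, w, h) box to absolute (x1, y1, x2, y2) pixels."""
    cx, cy, w, h = box
    return [(cx - w / 2) * img_w, (cy - h / 2) * img_h,
            (cx + w / 2) * img_w, (cy + h / 2) * img_h]

def filter_detections(scores, labels, boxes, img_w, img_h, threshold=0.5):
    """Keep detections above a confidence threshold, with pixel-space boxes."""
    out = []
    for score, label, box in zip(scores, labels, boxes):
        if score >= threshold:
            out.append({"score": score, "label": label,
                        "box": cxcywh_to_xyxy(box, img_w, img_h)})
    return out
```

In practice the 🤗 Transformers image processor performs this conversion and thresholding for you; the sketch only shows the logic.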
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
sha000/detr-finetuned-quadrant-v2 |
| [
"label_0",
"label_1",
"label_2",
"label_3"
] |
sha000/detr-finetuned-quadrant-v3 |
| [
"label_0",
"label_1",
"label_2",
"label_3"
] |
sha000/detr-finetuned-quadrant-v4 |
| [
"label_0",
"label_1",
"label_2",
"label_3"
] |
sha000/detr-finetuned-quadrant-v5 |
| [
"label_0",
"label_1",
"label_2",
"label_3"
] |
sha000/detr-finetuned-quadrant-v6 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
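The quick-start code is not filled in above; the following is a minimal sketch assuming the standard 🤗 Transformers object-detection API. The checkpoint id is this repository and the image path is a placeholder — adjust both as needed.

```python
def detect(image_path, checkpoint="sha000/detr-finetuned-quadrant-v6", threshold=0.5):
    """Run the fine-tuned detector on one image and return (label, score, box) tuples."""
    import torch
    from PIL import Image
    from transformers import AutoImageProcessor, AutoModelForObjectDetection

    # Load the processor and model weights from the Hub.
    processor = AutoImageProcessor.from_pretrained(checkpoint)
    model = AutoModelForObjectDetection.from_pretrained(checkpoint)

    image = Image.open(image_path)
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Convert raw logits/boxes into thresholded detections in (x0, y0, x1, y1) pixels.
    target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
    results = processor.post_process_object_detection(
        outputs, threshold=threshold, target_sizes=target_sizes
    )[0]
    return [
        (model.config.id2label[label.item()], score.item(), box.tolist())
        for score, label, box in zip(
            results["scores"], results["labels"], results["boxes"]
        )
    ]
```

Calling `detect("example.jpg")` downloads the checkpoint on first use and returns detections for the four quadrant labels listed below.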
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
sha000/detr-finetuned-quadrant-v7 |
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
sha000/detr-finetuned-quadrant-v8 |
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
sha000/detr-finetuned-quadrant-v9 |
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
sha000/deta-finetuned-quadrant-v1 |
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
Ilhamfaisal/detr-resnet-50-dc5-grasshopper-finetuned-maxsteps-10000-batchsize-2-ilham |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-grasshopper-finetuned-maxsteps-10000-batchsize-2-ilham
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.7495
- Map: 0.0001
- Map 50: 0.0002
- Map 75: 0.0001
- Map Small: 0.0001
- Map Medium: -1.0
- Map Large: -1.0
- Mar 1: 0.0
- Mar 10: 0.0019
- Mar 100: 0.0047
- Mar Small: 0.0047
- Mar Medium: -1.0
- Mar Large: -1.0
- Map Recilia dorsalis: 0.0
- Mar 100 Recilia dorsalis: 0.0042
- Map Nephotettix malayanus: 0.0003
- Mar 100 Nephotettix malayanus: 0.0074
- Map Sogatella furcifera: 0.0
- Mar 100 Sogatella furcifera: 0.0
- Map Nilaparvata lugens: 0.0
- Mar 100 Nilaparvata lugens: 0.0071
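The Map/Mar values above are COCO-style detection metrics, which score a predicted box against a ground-truth box by intersection-over-union (IoU). As a point of reference, a minimal IoU sketch (boxes in `(x0, y0, x1, y1)` format; not the evaluation code used for this card):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes in (x0, y0, x1, y1) format."""
    # Corners of the intersection rectangle.
    x0 = max(box_a[0], box_b[0])
    y0 = max(box_a[1], box_b[1])
    x1 = min(box_a[2], box_b[2])
    y1 = min(box_a[3], box_b[3])
    inter = max(0.0, x1 - x0) * max(0.0, y1 - y0)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # ≈ 0.1429 (overlap 1, union 7)
```

A `-1.0` in the size-binned entries (Map Medium, Mar Large, …) is the evaluator's sentinel for "no ground-truth objects in that size range", not a real score.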
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
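With `lr_scheduler_type: linear` and no warmup, the learning rate decays linearly from its initial value to zero over the 10,000 training steps. A small illustrative sketch of that schedule (an assumption based on the hyperparameters listed above, not the Trainer's actual scheduler code):

```python
def linear_lr(step, initial_lr=1e-5, total_steps=10_000, warmup_steps=0):
    """Linear warmup (none by default) then linear decay to zero at total_steps."""
    if step < warmup_steps:
        return initial_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return initial_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))       # 1e-05 at the start
print(linear_lr(5_000))   # 5e-06 halfway through
print(linear_lr(10_000))  # 0.0 at the final step
```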
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Recilia dorsalis | Mar 100 Recilia dorsalis | Map Nephotettix malayanus | Mar 100 Nephotettix malayanus | Map Sogatella furcifera | Mar 100 Sogatella furcifera | Map Nilaparvata lugens | Mar 100 Nilaparvata lugens |
|:-------------:|:--------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------:|:------------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:---------------------------:|:----------------------:|:--------------------------:|
| 7.8557 | 0.5952 | 50 | 6.5271 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.1482 | 1.1905 | 100 | 4.5178 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0017 | 0.0017 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0068 | 0.0 | 0.0 |
| 4.73 | 1.7857 | 150 | 4.3700 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0002 | 0.0036 | 0.0036 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0136 | 0.0 | 0.0 |
| 4.2549 | 2.3810 | 200 | 4.3965 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.5438 | 2.9762 | 250 | 4.2896 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.19 | 3.5714 | 300 | 4.3528 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2118 | 4.1667 | 350 | 4.2757 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1294 | 4.7619 | 400 | 4.2314 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9266 | 5.3571 | 450 | 4.2360 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.102 | 5.9524 | 500 | 4.1888 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.862 | 6.5476 | 550 | 4.1402 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0277 | 7.1429 | 600 | 4.1432 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1993 | 7.7381 | 650 | 4.1737 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0002 | 0.0002 | 0.0002 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.873 | 8.3333 | 700 | 4.1959 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0001 | 0.0003 | 0.0003 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 4.5011 | 8.9286 | 750 | 4.1311 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0208 | 9.5238 | 800 | 4.1827 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0261 | 10.1190 | 850 | 4.0846 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8466 | 10.7143 | 900 | 4.1543 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1287 | 11.3095 | 950 | 4.1680 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9538 | 11.9048 | 1000 | 4.1849 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6927 | 12.5 | 1050 | 4.0885 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9334 | 13.0952 | 1100 | 4.1670 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9389 | 13.6905 | 1150 | 4.2387 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6675 | 14.2857 | 1200 | 4.1635 | 0.0 | 0.0004 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0002 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.3397 | 14.8810 | 1250 | 4.1893 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1642 | 15.4762 | 1300 | 4.1441 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.6283 | 16.0714 | 1350 | 4.1287 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2969 | 16.6667 | 1400 | 4.0819 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9804 | 17.2619 | 1450 | 4.1292 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.971 | 17.8571 | 1500 | 4.0567 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4007 | 18.4524 | 1550 | 4.0544 | 0.0002 | 0.0005 | 0.0 | 0.0002 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0005 | 0.0005 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0008 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2788 | 19.0476 | 1600 | 4.0556 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.001 | 0.001 | 0.001 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0002 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2513 | 19.6429 | 1650 | 4.0553 | 0.0002 | 0.0004 | 0.0 | 0.0002 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0006 | 0.0006 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0007 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6183 | 20.2381 | 1700 | 4.1075 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0006 | 0.0006 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2522 | 20.8333 | 1750 | 4.1207 | 0.0001 | 0.0004 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0002 | 0.0002 | 0.0006 | 0.0006 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0003 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2156 | 21.4286 | 1800 | 4.0288 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0014 | 0.0014 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4063 | 22.0238 | 1850 | 4.0143 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0019 | 0.0019 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0 | 0.006 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.5577 | 22.6190 | 1900 | 4.0514 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0006 | 0.0006 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0004 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9326 | 23.2143 | 1950 | 4.0323 | 0.0 | 0.0003 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9504 | 23.8095 | 2000 | 3.9393 | 0.0001 | 0.0003 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0002 | 0.0016 | 0.0016 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0003 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0012 |
| 4.6677 | 24.4048 | 2050 | 3.9693 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0005 | 0.0005 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.7945 | 25.0 | 2100 | 3.9099 | 0.0002 | 0.0004 | 0.0004 | 0.0002 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.0012 | 0.0012 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0008 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6887 | 25.5952 | 2150 | 3.7940 | 0.0 | 0.0004 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0001 | 0.0024 | 0.0024 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0002 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0012 |
| 3.5835 | 26.1905 | 2200 | 3.8883 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0013 | 0.0013 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 4.4304 | 26.7857 | 2250 | 3.8795 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.001 | 0.001 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0001 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 4.4133 | 27.3810 | 2300 | 3.8778 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0012 | 0.0012 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0002 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0047 | 27.9762 | 2350 | 4.0399 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9441 | 28.5714 | 2400 | 3.9364 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0008 | 0.0008 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7486 | 29.1667 | 2450 | 3.8319 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0007 | 0.0007 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.8208 | 29.7619 | 2500 | 3.8796 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0012 | 0.0012 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.6209 | 30.3571 | 2550 | 3.9295 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.001 | 0.001 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8594 | 30.9524 | 2600 | 3.8844 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0002 | 0.0013 | 0.0013 | -1.0 | -1.0 | 0.0001 | 0.0042 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2017 | 31.5476 | 2650 | 3.8853 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0017 | 0.0017 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.5715 | 32.1429 | 2700 | 3.8114 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0007 | 0.0007 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.1726 | 32.7381 | 2750 | 3.8134 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0003 | 0.0003 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.531 | 33.3333 | 2800 | 3.8624 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0001 | 0.001 | 0.001 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.3098 | 33.9286 | 2850 | 3.9348 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.002 | 0.002 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0001 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 3.6244 | 34.5238 | 2900 | 3.9067 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0001 | 0.0009 | 0.0009 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0737 | 35.1190 | 2950 | 3.8542 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0001 | 0.0001 | 0.0012 | 0.0012 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 4.2459 | 35.7143 | 3000 | 3.8562 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0003 | 0.0003 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.635 | 36.3095 | 3050 | 3.7866 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0013 | 0.0013 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0001 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.434 | 36.9048 | 3100 | 3.7946 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0009 | 0.0009 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1133 | 37.5 | 3150 | 3.8690 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0003 | 0.0003 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8537 | 38.0952 | 3200 | 3.7781 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.0018 | 0.0018 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0001 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0012 |
| 3.5134 | 38.6905 | 3250 | 3.8650 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0003 | 0.0007 | 0.0007 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0984 | 39.2857 | 3300 | 3.7999 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0003 | 0.0005 | 0.0005 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7236 | 39.8810 | 3350 | 3.8309 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.001 | 0.001 | -1.0 | -1.0 | 0.0001 | 0.0028 | 0.0 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.433 | 40.4762 | 3400 | 3.8276 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0008 | 0.0008 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1477 | 41.0714 | 3450 | 3.8255 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0001 | 0.001 | 0.001 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 3.8148 | 41.6667 | 3500 | 3.8010 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0008 | 0.0008 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0001 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.3838 | 42.2619 | 3550 | 3.8383 | 0.0 | 0.0001 | 0.0001 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.001 | 0.001 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0002 | 0.0033 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7048 | 42.8571 | 3600 | 3.8204 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0013 | 0.0013 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0001 | 0.0033 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6646 | 43.4524 | 3650 | 3.7597 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0002 | 0.0012 | 0.0012 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0001 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.883 | 44.0476 | 3700 | 3.7708 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0004 | 0.0024 | 0.0024 | -1.0 | -1.0 | 0.0 | 0.0069 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0024 |
| 3.7521 | 44.6429 | 3750 | 3.8078 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.0014 | 0.0014 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0002 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9671 | 45.2381 | 3800 | 3.8627 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0016 | 0.0016 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0002 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7391 | 45.8333 | 3850 | 3.8454 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0003 | 0.0003 | 0.0007 | 0.0007 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0001 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2402 | 46.4286 | 3900 | 3.7974 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0007 | 0.0007 | 0.0012 | 0.0012 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0001 | 0.0033 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4314 | 47.0238 | 3950 | 3.7405 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0024 | 0.0024 | -1.0 | -1.0 | 0.0 | 0.0056 | 0.0001 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9437 | 47.6190 | 4000 | 3.8957 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0003 | 0.0003 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0132 | 48.2143 | 4050 | 3.8682 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0003 | 0.0003 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1971 | 48.8095 | 4100 | 3.6982 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.002 | 0.002 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0024 |
| 3.79 | 49.4048 | 4150 | 3.8140 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0016 | 0.0016 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0001 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.361 | 50.0 | 4200 | 3.9469 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0016 | 0.0016 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7437 | 50.5952 | 4250 | 3.9639 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0001 | 0.0002 | 0.0002 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0685 | 51.1905 | 4300 | 3.7958 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0015 | 0.0015 | -1.0 | -1.0 | 0.0 | 0.006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3051 | 51.7857 | 4350 | 3.7975 | 0.0 | 0.0001 | 0.0001 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0012 | 0.0016 | 0.0016 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0001 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8429 | 52.3810 | 4400 | 3.7979 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0016 | 0.0016 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0001 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3521 | 52.9762 | 4450 | 3.8025 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0013 | 0.0013 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0001 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8709 | 53.5714 | 4500 | 3.8048 | 0.0 | 0.0001 | 0.0001 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.0014 | 0.0014 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0002 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9027 | 54.1667 | 4550 | 3.7956 | 0.0 | 0.0001 | 0.0001 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.002 | 0.002 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0002 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.0122 | 54.7619 | 4600 | 3.7756 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.0017 | 0.0017 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0002 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.5297 | 55.3571 | 4650 | 3.8070 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.0027 | 0.0027 | -1.0 | -1.0 | 0.0 | 0.006 | 0.0003 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.632 | 55.9524 | 4700 | 3.8292 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0003 | 0.002 | 0.002 | -1.0 | -1.0 | 0.0 | 0.0051 | 0.0002 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1858 | 56.5476 | 4750 | 3.8356 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.002 | 0.002 | -1.0 | -1.0 | 0.0 | 0.0051 | 0.0003 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1364 | 57.1429 | 4800 | 3.8043 | 0.0001 | 0.0001 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0014 | 0.0014 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0002 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.754 | 57.7381 | 4850 | 3.7635 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0005 | 0.002 | 0.002 | -1.0 | -1.0 | 0.0 | 0.006 | 0.0002 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5476 | 58.3333 | 4900 | 3.8144 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.001 | 0.002 | 0.002 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0003 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7201 | 58.9286 | 4950 | 3.8137 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0002 | 0.0008 | 0.0008 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0001 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0767 | 59.5238 | 5000 | 3.7964 | 0.0001 | 0.0003 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0001 | 0.0006 | 0.0019 | 0.0019 | -1.0 | -1.0 | 0.0 | 0.0051 | 0.0002 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6719 | 60.1190 | 5050 | 3.8036 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0014 | 0.0014 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0002 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9692 | 60.7143 | 5100 | 3.8246 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0015 | 0.0015 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0002 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7247 | 61.3095 | 5150 | 3.7791 | 0.0001 | 0.0003 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0001 | 0.0008 | 0.0035 | 0.0035 | -1.0 | -1.0 | 0.0 | 0.0106 | 0.0003 | 0.0033 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.6245 | 61.9048 | 5200 | 3.6678 | 0.0001 | 0.0004 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.002 | 0.002 | -1.0 | -1.0 | 0.0 | 0.0056 | 0.0003 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7635 | 62.5 | 5250 | 3.7871 | 0.0001 | 0.0004 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0027 | 0.0027 | -1.0 | -1.0 | 0.0 | 0.0069 | 0.0004 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.554 | 63.0952 | 5300 | 3.8363 | 0.0001 | 0.0002 | 0.0002 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.0013 | 0.0013 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0004 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4084 | 63.6905 | 5350 | 3.7173 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0013 | 0.0027 | 0.0027 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0002 | 0.006 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7329 | 64.2857 | 5400 | 3.7424 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0005 | 0.0007 | 0.0029 | 0.0029 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0001 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 3.5339 | 64.8810 | 5450 | 3.8419 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0001 | 0.0008 | 0.0008 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0001 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3459 | 65.4762 | 5500 | 3.6949 | 0.0002 | 0.0011 | 0.0 | 0.0002 | -1.0 | -1.0 | 0.0001 | 0.0008 | 0.0029 | 0.0029 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0007 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0024 |
| 3.5871 | 66.0714 | 5550 | 3.7609 | 0.0001 | 0.0003 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0017 | 0.0017 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0005 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4437 | 66.6667 | 5600 | 3.6608 | 0.0003 | 0.0014 | 0.0002 | 0.0003 | -1.0 | -1.0 | 0.0001 | 0.0013 | 0.0029 | 0.0029 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0011 | 0.0084 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.7097 | 67.2619 | 5650 | 3.7673 | 0.0001 | 0.0006 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0001 | 0.0003 | 0.0027 | 0.0027 | -1.0 | -1.0 | 0.0 | 0.0083 | 0.0003 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1379 | 67.8571 | 5700 | 3.8311 | 0.0001 | 0.0003 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0001 | 0.0005 | 0.0008 | 0.0008 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0003 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.8736 | 68.4524 | 5750 | 3.8420 | 0.0001 | 0.0003 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.002 | 0.002 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0003 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6911 | 69.0476 | 5800 | 3.8638 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0001 | 0.0007 | 0.0007 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8242 | 69.6429 | 5850 | 3.7993 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.0023 | 0.0023 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0002 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.6435 | 70.2381 | 5900 | 3.7962 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0013 | 0.0013 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0002 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1471 | 70.8333 | 5950 | 3.6997 | 0.0001 | 0.0001 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0028 | 0.0028 | -1.0 | -1.0 | 0.0 | 0.0056 | 0.0002 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 2.8904 | 71.4286 | 6000 | 3.7533 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0013 | 0.0013 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6199 | 72.0238 | 6050 | 3.7900 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0001 | 0.0005 | 0.0005 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6421 | 72.6190 | 6100 | 3.7686 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0002 | 0.0011 | 0.0011 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0001 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0024 |
| 3.3558 | 73.2143 | 6150 | 3.7517 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.0037 | 0.0037 | -1.0 | -1.0 | 0.0 | 0.0069 | 0.0001 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 3.4124 | 73.8095 | 6200 | 3.8264 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0007 | 0.0016 | 0.0016 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0001 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2499 | 74.4048 | 6250 | 3.7574 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0048 | 0.0048 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0002 | 0.0088 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 3.2602 | 75.0 | 6300 | 3.7183 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0021 | 0.0021 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0002 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.517 | 75.5952 | 6350 | 3.7538 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.001 | 0.0032 | 0.0032 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0001 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 3.8136 | 76.1905 | 6400 | 3.8021 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0006 | 0.0008 | 0.003 | 0.003 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0001 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 3.9782 | 76.7857 | 6450 | 3.8456 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0021 | 0.0021 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0002 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.3156 | 77.3810 | 6500 | 3.8129 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0016 | 0.0034 | 0.0034 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0002 | 0.0084 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 3.3581 | 77.9762 | 6550 | 3.7948 | 0.0001 | 0.0003 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0013 | 0.0032 | 0.0032 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0003 | 0.0088 | 0.0 | 0.0 | 0.0 | 0.0012 |
| 4.1628 | 78.5714 | 6600 | 3.8843 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0026 | 0.0026 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0001 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 4.5568 | 79.1667 | 6650 | 3.8372 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0025 | 0.0025 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0001 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 3.6556 | 79.7619 | 6700 | 3.8767 | 0.0 | 0.0001 | 0.0001 | 0.0 | -1.0 | -1.0 | 0.0 | 0.002 | 0.0029 | 0.0029 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0002 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0029 |
| 4.05 | 80.3571 | 6750 | 3.8427 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.001 | 0.0029 | 0.0029 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0001 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 4.3435 | 80.9524 | 6800 | 3.8122 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0012 | 0.0037 | 0.0037 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0001 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 4.0973 | 81.5476 | 6850 | 3.8318 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0006 | 0.003 | 0.003 | -1.0 | -1.0 | 0.0 | 0.0074 | 0.0001 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.8509 | 82.1429 | 6900 | 3.8137 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0045 | 0.0045 | -1.0 | -1.0 | 0.0 | 0.0088 | 0.0001 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0024 |
| 3.4915 | 82.7381 | 6950 | 3.8163 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0035 | 0.0035 | -1.0 | -1.0 | 0.0 | 0.006 | 0.0001 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 3.1978 | 83.3333 | 7000 | 3.8011 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0022 | 0.0022 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0001 | 0.0033 | 0.0 | 0.0 | 0.0 | 0.0029 |
| 3.2499 | 83.9286 | 7050 | 3.7287 | 0.0001 | 0.0001 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.002 | 0.0058 | 0.0058 | -1.0 | -1.0 | 0.0 | 0.0069 | 0.0002 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 4.2883 | 84.5238 | 7100 | 3.7991 | 0.0001 | 0.0001 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0037 | 0.0037 | -1.0 | -1.0 | 0.0 | 0.0051 | 0.0002 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0024 |
| 3.6939 | 85.1190 | 7150 | 3.7686 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0016 | 0.0055 | 0.0055 | -1.0 | -1.0 | 0.0 | 0.0079 | 0.0001 | 0.0065 | 0.0 | 0.0 | 0.0 | 0.0076 |
| 3.9535 | 85.7143 | 7200 | 3.6905 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0048 | 0.0048 | -1.0 | -1.0 | 0.0 | 0.0065 | 0.0001 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 3.6218 | 86.3095 | 7250 | 3.7205 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0012 | 0.0045 | 0.0045 | -1.0 | -1.0 | 0.0 | 0.0051 | 0.0001 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 3.5398 | 86.9048 | 7300 | 3.7463 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.001 | 0.0037 | 0.0037 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0001 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 3.1931 | 87.5 | 7350 | 3.7300 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.001 | 0.0026 | 0.0026 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0001 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.5578 | 88.0952 | 7400 | 3.7558 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0016 | 0.0042 | 0.0042 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0002 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 3.8336 | 88.6905 | 7450 | 3.7182 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0015 | 0.0045 | 0.0045 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0001 | 0.006 | 0.0 | 0.0 | 0.0 | 0.0088 |
| 3.8849 | 89.2857 | 7500 | 3.7476 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0031 | 0.0031 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0002 | 0.006 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 3.4902 | 89.8810 | 7550 | 3.7985 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0026 | 0.0026 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0001 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 3.9584 | 90.4762 | 7600 | 3.7602 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.001 | 0.003 | 0.003 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0003 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 4.6706 | 91.0714 | 7650 | 3.7235 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0046 | 0.0046 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0004 | 0.0084 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 2.8092 | 91.6667 | 7700 | 3.7510 | 0.0001 | 0.0002 | 0.0002 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0032 | 0.0032 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0005 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 4.5045 | 92.2619 | 7750 | 3.7197 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0012 | 0.0042 | 0.0042 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0002 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0088 |
| 3.7061 | 92.8571 | 7800 | 3.7624 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.001 | 0.0038 | 0.0038 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 2.7675 | 93.4524 | 7850 | 3.7657 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0013 | 0.003 | 0.003 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0004 | 0.0065 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 3.7357 | 94.0476 | 7900 | 3.7447 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.004 | 0.004 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0003 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 3.7985 | 94.6429 | 7950 | 3.7296 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0013 | 0.0033 | 0.0033 | -1.0 | -1.0 | 0.0 | 0.0056 | 0.0003 | 0.006 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 4.289 | 95.2381 | 8000 | 3.7628 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0015 | 0.0045 | 0.0045 | -1.0 | -1.0 | 0.0 | 0.0065 | 0.0003 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 3.3646 | 95.8333 | 8050 | 3.7704 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0012 | 0.0041 | 0.0041 | -1.0 | -1.0 | 0.0 | 0.006 | 0.0003 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.6207 | 96.4286 | 8100 | 3.7398 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0042 | 0.0042 | -1.0 | -1.0 | 0.0 | 0.0056 | 0.0003 | 0.0065 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 4.2325 | 97.0238 | 8150 | 3.7298 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0016 | 0.0048 | 0.0048 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0001 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 3.0169 | 97.6190 | 8200 | 3.7921 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0055 | 0.0055 | -1.0 | -1.0 | 0.0 | 0.0051 | 0.0003 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0088 |
| 3.5777 | 98.2143 | 8250 | 3.8096 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0036 | 0.0036 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0002 | 0.0065 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 3.611 | 98.8095 | 8300 | 3.7768 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0036 | 0.0036 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0004 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 3.0438 | 99.4048 | 8350 | 3.7838 | 0.0001 | 0.0001 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.001 | 0.0038 | 0.0038 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0003 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 3.4507 | 100.0 | 8400 | 3.7966 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0012 | 0.0029 | 0.0029 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0002 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 3.7107 | 100.5952 | 8450 | 3.7070 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0043 | 0.0043 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0002 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 3.3861 | 101.1905 | 8500 | 3.7708 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0013 | 0.0045 | 0.0045 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0003 | 0.0093 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 3.4074 | 101.7857 | 8550 | 3.7599 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0048 | 0.0048 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 3.8869 | 102.3810 | 8600 | 3.7371 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0047 | 0.0047 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0003 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0076 |
| 2.9239 | 102.9762 | 8650 | 3.7551 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0016 | 0.0039 | 0.0039 | -1.0 | -1.0 | 0.0 | 0.0032 | 0.0001 | 0.0065 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 3.5625 | 103.5714 | 8700 | 3.7443 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0045 | 0.0045 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0002 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 4.6142 | 104.1667 | 8750 | 3.7256 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.005 | 0.005 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0002 | 0.0084 | 0.0 | 0.0 | 0.0 | 0.0094 |
| 4.0728 | 104.7619 | 8800 | 3.7221 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0015 | 0.0056 | 0.0056 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0002 | 0.0088 | 0.0 | 0.0 | 0.0 | 0.0106 |
| 3.8747 | 105.3571 | 8850 | 3.7195 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0016 | 0.0045 | 0.0045 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0002 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0076 |
| 3.9537 | 105.9524 | 8900 | 3.7267 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0015 | 0.0049 | 0.0049 | -1.0 | -1.0 | 0.0 | 0.0056 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0065 |
| 3.8057 | 106.5476 | 8950 | 3.7282 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0016 | 0.0046 | 0.0046 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0002 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0088 |
| 3.3724 | 107.1429 | 9000 | 3.7304 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.002 | 0.0047 | 0.0047 | -1.0 | -1.0 | 0.0 | 0.0056 | 0.0003 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 3.7657 | 107.7381 | 9050 | 3.7322 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0037 | 0.0037 | -1.0 | -1.0 | 0.0 | 0.006 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0012 |
| 2.2652 | 108.3333 | 9100 | 3.7145 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0016 | 0.004 | 0.004 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0002 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 4.1087 | 108.9286 | 9150 | 3.7280 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0012 | 0.0039 | 0.0039 | -1.0 | -1.0 | 0.0 | 0.0056 | 0.0003 | 0.0088 | 0.0 | 0.0 | 0.0 | 0.0012 |
| 3.0613 | 109.5238 | 9200 | 3.7164 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0038 | 0.0038 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0003 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0065 |
| 3.5957 | 110.1190 | 9250 | 3.7170 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0041 | 0.0041 | -1.0 | -1.0 | 0.0 | 0.0023 | 0.0003 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 3.4566 | 110.7143 | 9300 | 3.7069 | 0.0001 | 0.0001 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0043 | 0.0043 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 3.5257 | 111.3095 | 9350 | 3.7247 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0038 | 0.0038 | -1.0 | -1.0 | 0.0 | 0.0056 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0024 |
| 4.7539 | 111.9048 | 9400 | 3.7316 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0013 | 0.004 | 0.004 | -1.0 | -1.0 | 0.0 | 0.0056 | 0.0003 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.5568 | 112.5 | 9450 | 3.7488 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.001 | 0.0039 | 0.0039 | -1.0 | -1.0 | 0.0 | 0.0051 | 0.0003 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.5134 | 113.0952 | 9500 | 3.7609 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0012 | 0.0043 | 0.0043 | -1.0 | -1.0 | 0.0 | 0.0051 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 3.714 | 113.6905 | 9550 | 3.7513 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0053 | 0.0053 | -1.0 | -1.0 | 0.0 | 0.0051 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0088 |
| 3.4494 | 114.2857 | 9600 | 3.7369 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0054 | 0.0054 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.01 |
| 2.9166 | 114.8810 | 9650 | 3.7420 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0051 | 0.0051 | -1.0 | -1.0 | 0.0 | 0.0046 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 2.8269 | 115.4762 | 9700 | 3.7485 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.005 | 0.005 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0004 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 2.9184 | 116.0714 | 9750 | 3.7430 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0047 | 0.0047 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0076 |
| 3.6305 | 116.6667 | 9800 | 3.7569 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0046 | 0.0046 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 3.5946 | 117.2619 | 9850 | 3.7549 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0048 | 0.0048 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0076 |
| 2.3713 | 117.8571 | 9900 | 3.7542 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0044 | 0.0044 | -1.0 | -1.0 | 0.0 | 0.0037 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0065 |
| 3.7416 | 118.4524 | 9950 | 3.7532 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0017 | 0.0046 | 0.0046 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0003 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 3.8033 | 119.0476 | 10000 | 3.7495 | 0.0001 | 0.0002 | 0.0001 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0047 | 0.0047 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0003 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0071 |
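For reference, the AP columns in the table above are COCO-style metrics built on box IoU. A minimal, hypothetical IoU computation over `(x1, y1, x2, y2)` boxes (an illustrative helper, not code from this repository):

```python
def box_iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    # Intersection rectangle (empty if the boxes do not overlap).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(box_iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7 ≈ 0.1429
```

A prediction counts as a true positive at, e.g., the AP50 threshold when its IoU with a ground-truth box is at least 0.5.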
### Framework versions
- Transformers 4.48.0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
[
  "recilia dorsalis",
  "nephotettix malayanus",
  "sogatella furcifera",
  "nilaparvata lugens"
]
sha000/deta-finetuned-teeth-v1
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19",
"label_20",
"label_21",
"label_22",
"label_23",
"label_24",
"label_25",
"label_26",
"label_27",
"label_28",
"label_29",
"label_30",
"label_31"
] |
sha000/deta-finetuned-conditions-v1 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
sha000/deta-finetuned-conditions-v2 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
ellabettison/Logo-Detection-finetune |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/ella-bettison/logo_extraction/runs/4k33h7px)
# Logo-Detection-finetune
This model is a fine-tuned version of [hustvl/yolos-small](https://huggingface.co/hustvl/yolos-small) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 20
### Framework versions
- Transformers 4.49.0.dev0
- Pytorch 2.5.1+cu121
- Tokenizers 0.21.0
| [
"logos",
"logos"
] |
firdhokk/apple-detection-with-rtdetr-rd50vd-coco-o365 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# apple-detection-with-rtdetr-rd50vd-coco-o365
This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 5.7891
- Map: 0.8588
- Map 50: 0.9441
- Map 75: 0.9232
- Map Small: -1.0
- Map Medium: -1.0
- Map Large: 0.8593
- Mar 1: 0.2596
- Mar 10: 0.7771
- Mar 100: 0.94
- Mar Small: -1.0
- Mar Medium: -1.0
- Mar Large: 0.94
- Map Apple: 0.8588
- Mar 100 Apple: 0.94
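The Map/Mar figures above are IoU-based COCO detection metrics. As a refresher, the intersection-over-union underlying them can be computed for a pair of `(xmin, ymin, xmax, ymax)` boxes as follows (a self-contained sketch, not taken from this model's evaluation code):

```python
def box_iou(a, b):
    """Intersection-over-union of two (xmin, ymin, xmax, ymax) boxes."""
    # Intersection rectangle (may be empty).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A perfectly matched box scores 1.0; disjoint boxes score 0.0.
print(box_iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.1429
```

A predicted box counts as a true positive at, e.g., "Map 50" when its IoU with a ground-truth box is at least 0.5.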
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 100
- mixed_precision_training: Native AMP
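As a rough reconstruction, the hyperparameters listed above map onto the keyword arguments one might pass to `transformers.TrainingArguments`; the argument names below are assumed from the 🤗 Trainer API, not copied from the original training script, so they are shown as a plain dict:

```python
# Hypothetical mapping of the listed hyperparameters to Trainer kwargs.
training_kwargs = {
    "learning_rate": 5e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "warmup_steps": 300,
    "num_train_epochs": 100,
    "fp16": True,  # "Native AMP" mixed precision
}
print(training_kwargs["learning_rate"])  # 5e-05
```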
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Apple | Mar 100 Apple |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:---------:|:-------------:|
| No log | 1.0 | 169 | 14.5718 | 0.1228 | 0.2421 | 0.1084 | -1.0 | 0.0157 | 0.1399 | 0.1375 | 0.3146 | 0.6275 | -1.0 | 0.5333 | 0.6277 | 0.1228 | 0.6275 |
| No log | 2.0 | 338 | 9.2170 | 0.5022 | 0.692 | 0.5452 | -1.0 | 0.3272 | 0.5035 | 0.225 | 0.5618 | 0.8138 | -1.0 | 0.9333 | 0.8136 | 0.5022 | 0.8138 |
| 30.1672 | 3.0 | 507 | 8.8104 | 0.5503 | 0.6829 | 0.6163 | -1.0 | 0.4455 | 0.5516 | 0.2309 | 0.6075 | 0.8643 | -1.0 | 0.9 | 0.8642 | 0.5503 | 0.8643 |
| 30.1672 | 4.0 | 676 | 8.0371 | 0.6275 | 0.7697 | 0.7078 | -1.0 | 0.4374 | 0.6279 | 0.2427 | 0.6525 | 0.8602 | -1.0 | 0.9333 | 0.86 | 0.6275 | 0.8602 |
| 30.1672 | 5.0 | 845 | 7.0432 | 0.6851 | 0.807 | 0.778 | -1.0 | 0.521 | 0.6858 | 0.2448 | 0.6939 | 0.8754 | -1.0 | 0.7667 | 0.8757 | 0.6851 | 0.8754 |
| 12.3777 | 6.0 | 1014 | 7.4024 | 0.7262 | 0.8607 | 0.8231 | -1.0 | 0.4729 | 0.729 | 0.2437 | 0.6998 | 0.8842 | -1.0 | 0.9 | 0.8842 | 0.7262 | 0.8842 |
| 12.3777 | 7.0 | 1183 | 7.6969 | 0.7006 | 0.8097 | 0.7746 | -1.0 | 0.417 | 0.7031 | 0.2443 | 0.7104 | 0.8897 | -1.0 | 0.7667 | 0.89 | 0.7006 | 0.8897 |
| 12.3777 | 8.0 | 1352 | 7.0771 | 0.7536 | 0.8664 | 0.8397 | -1.0 | 0.7023 | 0.7541 | 0.249 | 0.7151 | 0.891 | -1.0 | 0.9333 | 0.8909 | 0.7536 | 0.891 |
| 11.4378 | 9.0 | 1521 | 7.6821 | 0.7493 | 0.8728 | 0.8397 | -1.0 | 0.5096 | 0.7509 | 0.2484 | 0.7257 | 0.9009 | -1.0 | 0.9 | 0.9009 | 0.7493 | 0.9009 |
| 11.4378 | 10.0 | 1690 | 7.0789 | 0.7901 | 0.9168 | 0.8872 | -1.0 | 0.5995 | 0.7907 | 0.2459 | 0.7272 | 0.8985 | -1.0 | 0.8667 | 0.8985 | 0.7901 | 0.8985 |
| 11.4378 | 11.0 | 1859 | 7.1990 | 0.7908 | 0.894 | 0.8677 | -1.0 | 0.6183 | 0.7918 | 0.2503 | 0.7353 | 0.9113 | -1.0 | 0.9 | 0.9114 | 0.7908 | 0.9113 |
| 10.5157 | 12.0 | 2028 | 6.6454 | 0.7664 | 0.8872 | 0.8547 | -1.0 | 0.5614 | 0.7677 | 0.248 | 0.7315 | 0.8994 | -1.0 | 0.8667 | 0.8995 | 0.7664 | 0.8994 |
| 10.5157 | 13.0 | 2197 | 7.4329 | 0.7401 | 0.8425 | 0.812 | -1.0 | 0.7072 | 0.7409 | 0.2485 | 0.7313 | 0.902 | -1.0 | 0.9 | 0.902 | 0.7401 | 0.902 |
| 10.5157 | 14.0 | 2366 | 6.7330 | 0.8094 | 0.9061 | 0.8781 | -1.0 | 0.7158 | 0.8098 | 0.2523 | 0.7525 | 0.9168 | -1.0 | 0.9 | 0.9169 | 0.8094 | 0.9168 |
| 9.8125 | 15.0 | 2535 | 6.5761 | 0.8023 | 0.9071 | 0.8815 | -1.0 | 0.7502 | 0.8029 | 0.2528 | 0.7456 | 0.9108 | -1.0 | 0.8667 | 0.9109 | 0.8023 | 0.9108 |
| 9.8125 | 16.0 | 2704 | 6.5281 | 0.818 | 0.919 | 0.8961 | -1.0 | 0.6455 | 0.8185 | 0.252 | 0.7501 | 0.9141 | -1.0 | 0.9 | 0.9141 | 0.818 | 0.9141 |
| 9.8125 | 17.0 | 2873 | 6.6390 | 0.8178 | 0.9158 | 0.8959 | -1.0 | 0.7281 | 0.8189 | 0.251 | 0.7423 | 0.9179 | -1.0 | 0.9 | 0.918 | 0.8178 | 0.9179 |
| 9.4182 | 18.0 | 3042 | 6.4843 | 0.8341 | 0.9298 | 0.9035 | -1.0 | 0.6102 | 0.8347 | 0.2507 | 0.7526 | 0.9203 | -1.0 | 0.8667 | 0.9205 | 0.8341 | 0.9203 |
| 9.4182 | 19.0 | 3211 | 7.2777 | 0.7999 | 0.9018 | 0.8787 | -1.0 | 0.6338 | 0.8015 | 0.249 | 0.7413 | 0.9173 | -1.0 | 0.9 | 0.9173 | 0.7999 | 0.9173 |
| 9.4182 | 20.0 | 3380 | 6.6993 | 0.8291 | 0.9234 | 0.8972 | -1.0 | 0.7493 | 0.8301 | 0.2516 | 0.7537 | 0.9245 | -1.0 | 0.9333 | 0.9245 | 0.8291 | 0.9245 |
| 9.0821 | 21.0 | 3549 | 6.8164 | 0.8154 | 0.9222 | 0.9018 | -1.0 | 0.7889 | 0.8158 | 0.2533 | 0.744 | 0.9167 | -1.0 | 0.9333 | 0.9166 | 0.8154 | 0.9167 |
| 9.0821 | 22.0 | 3718 | 6.7705 | 0.8158 | 0.9272 | 0.9034 | -1.0 | 0.6072 | 0.8169 | 0.2473 | 0.7421 | 0.9116 | -1.0 | 0.9 | 0.9117 | 0.8158 | 0.9116 |
| 9.0821 | 23.0 | 3887 | 5.8021 | 0.838 | 0.9356 | 0.9121 | -1.0 | 0.6007 | 0.8391 | 0.2505 | 0.7538 | 0.9247 | -1.0 | 0.9 | 0.9248 | 0.838 | 0.9247 |
| 8.7442 | 24.0 | 4056 | 6.3070 | 0.8475 | 0.9312 | 0.9098 | -1.0 | 0.7521 | 0.8481 | 0.2531 | 0.7582 | 0.9282 | -1.0 | 0.9333 | 0.9282 | 0.8475 | 0.9282 |
| 8.7442 | 25.0 | 4225 | 5.7625 | 0.8491 | 0.9336 | 0.9136 | -1.0 | 0.7109 | 0.8498 | 0.2511 | 0.7639 | 0.9285 | -1.0 | 0.9333 | 0.9285 | 0.8491 | 0.9285 |
| 8.7442 | 26.0 | 4394 | 5.9863 | 0.8284 | 0.9285 | 0.9072 | -1.0 | 0.5891 | 0.8295 | 0.2525 | 0.7493 | 0.9185 | -1.0 | 0.7667 | 0.9188 | 0.8284 | 0.9185 |
| 8.3969 | 27.0 | 4563 | 6.0623 | 0.8271 | 0.9343 | 0.9078 | -1.0 | 0.5669 | 0.8277 | 0.2521 | 0.7518 | 0.9097 | -1.0 | 0.8 | 0.91 | 0.8271 | 0.9097 |
| 8.3969 | 28.0 | 4732 | 6.2344 | 0.8329 | 0.9302 | 0.9033 | -1.0 | 0.5902 | 0.8336 | 0.2519 | 0.7548 | 0.9247 | -1.0 | 0.9 | 0.9248 | 0.8329 | 0.9247 |
| 8.3969 | 29.0 | 4901 | 6.1610 | 0.8294 | 0.9177 | 0.8965 | -1.0 | 0.6745 | 0.8304 | 0.2503 | 0.7544 | 0.9315 | -1.0 | 0.9 | 0.9315 | 0.8294 | 0.9315 |
| 8.2127 | 30.0 | 5070 | 6.5000 | 0.8297 | 0.9219 | 0.9001 | -1.0 | 0.6378 | 0.8305 | 0.2508 | 0.7541 | 0.9225 | -1.0 | 0.8667 | 0.9227 | 0.8297 | 0.9225 |
### Framework versions
- Transformers 4.49.0.dev0
- Pytorch 2.5.1+cu121
- Tokenizers 0.21.0
| [
"apple",
"apple"
] |
Aalzen/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
ARG-NCTU/detr-resnet-50-finetuned-20-epochs-boat-dataset |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-finetuned-20-epochs-boat-dataset
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
### Training results
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"ballonboat",
"bigboat",
"boat",
"jetski",
"katamaran",
"sailboat",
"smallboat",
"speedboat",
"wam_v",
"container_ship",
"tugship",
"yacht",
"blueboat"
] |
KaushiGihan/detr-kids-and-adults-detection-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-kids-and-adults-detection-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7496
- Map: 0.0043
- Map 50: 0.0096
- Map 75: 0.0039
- Map Small: -1.0
- Map Medium: 0.0051
- Map Large: 0.0073
- Mar 1: 0.0175
- Mar 10: 0.0983
- Mar 100: 0.3165
- Mar Small: -1.0
- Mar Medium: 0.5364
- Mar Large: 0.3035
- Map Kid: 0.0029
- Mar 100 Kid: 0.2694
- Map Adult: 0.0056
- Mar 100 Adult: 0.3636
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 100
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Kid | Mar 100 Kid | Map Adult | Mar 100 Adult |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------:|:-----------:|:---------:|:-------------:|
| 1.9155 | 0.1852 | 20 | 1.8038 | 0.0047 | 0.0101 | 0.005 | -1.0 | 0.0035 | 0.0071 | 0.0332 | 0.1231 | 0.3206 | -1.0 | 0.4545 | 0.3067 | 0.0013 | 0.1685 | 0.0081 | 0.4727 |
| 1.8395 | 0.3704 | 40 | 1.7979 | 0.0045 | 0.0107 | 0.0035 | -1.0 | 0.0042 | 0.0072 | 0.0223 | 0.1147 | 0.3081 | -1.0 | 0.4909 | 0.2926 | 0.0014 | 0.1726 | 0.0076 | 0.4436 |
| 1.8477 | 0.5556 | 60 | 1.7656 | 0.0046 | 0.0107 | 0.0041 | -1.0 | 0.0044 | 0.0077 | 0.0329 | 0.1148 | 0.3276 | -1.0 | 0.5091 | 0.3137 | 0.002 | 0.2242 | 0.0072 | 0.4309 |
| 1.66 | 0.7407 | 80 | 1.7440 | 0.0044 | 0.0098 | 0.0039 | -1.0 | 0.0049 | 0.0076 | 0.0256 | 0.1029 | 0.3266 | -1.0 | 0.5182 | 0.3137 | 0.0026 | 0.2532 | 0.0062 | 0.4 |
| 2.0088 | 0.9259 | 100 | 1.7496 | 0.0043 | 0.0096 | 0.0039 | -1.0 | 0.0051 | 0.0073 | 0.0175 | 0.0983 | 0.3165 | -1.0 | 0.5364 | 0.3035 | 0.0029 | 0.2694 | 0.0056 | 0.3636 |
### Framework versions
- Transformers 4.48.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0 | [
"kid",
"adult"
] |
yainage90/fashion-object-detection-yolos-tiny | This model is a fine-tuned version of hustvl/yolos-tiny.
You can find details of the model in this GitHub repo -> [fashion-visual-search](https://github.com/yainage90/fashion-visual-search)
And the companion fashion image feature extractor model -> [yainage90/fashion-image-feature-extractor](https://huggingface.co/yainage90/fashion-image-feature-extractor)
This model was trained using a combination of two datasets: [modanet](https://github.com/eBay/modanet) and [fashionpedia](https://fashionpedia.github.io/home/)
The labels are ['bag', 'bottom', 'dress', 'hat', 'shoes', 'outer', 'top']
The best score, mAP 0.6974, was achieved at epoch 96 of the 100 training epochs.
```python
from PIL import Image
import torch
from transformers import YolosImageProcessor, YolosForObjectDetection

# Pick the best available device (CUDA GPU, Apple Silicon, or CPU).
device = 'cpu'
if torch.cuda.is_available():
    device = torch.device('cuda')
elif torch.backends.mps.is_available():
    device = torch.device('mps')

ckpt = 'yainage90/fashion-object-detection-yolos-tiny'
image_processor = YolosImageProcessor.from_pretrained(ckpt)
model = YolosForObjectDetection.from_pretrained(ckpt).to(device)

image = Image.open('<path/to/image>').convert('RGB')

with torch.no_grad():
    inputs = image_processor(images=[image], return_tensors="pt")
    outputs = model(**inputs.to(device))

# Rescale predicted boxes to the original (height, width) and drop
# detections below the confidence threshold.
target_sizes = torch.tensor([[image.size[1], image.size[0]]])
results = image_processor.post_process_object_detection(outputs, threshold=0.85, target_sizes=target_sizes)[0]

items = []
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    score = score.item()
    label = label.item()
    box = [i.item() for i in box]
    print(f"{model.config.id2label[label]}: {round(score, 3)} at {box}")
    items.append((score, label, box))
```
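Since the loop above collects `(score, label, box)` triples, a common next step is cropping each detected item out of the image, e.g. to feed the feature-extractor model linked above. A minimal sketch (the helper name is ours, not part of the repo):

```python
from PIL import Image

def crop_detections(image, boxes):
    """Crop each (xmin, ymin, xmax, ymax) box out of a PIL image."""
    return [image.crop(tuple(int(round(v)) for v in box)) for box in boxes]

# Works on any PIL image, e.g. a blank placeholder:
img = Image.new("RGB", (200, 300))
crops = crop_detections(img, [(10.0, 20.0, 110.0, 220.0)])
print(crops[0].size)  # (100, 200)
```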
 | [
"bag",
"bottom",
"dress",
"hat",
"outer",
"shoes",
"top"
] |
bbreddy30/yolos-fashionpedia |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
A fine-tuned object detection model for fashion, based on valentinafeve/yolos-fashionpedia.
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19",
"label_20",
"label_21",
"label_22",
"label_23",
"label_24",
"label_25",
"label_26",
"label_27",
"label_28",
"label_29",
"label_30",
"label_31",
"label_32",
"label_33",
"label_34",
"label_35",
"label_36",
"label_37",
"label_38",
"label_39",
"label_40",
"label_41",
"label_42",
"label_43",
"label_44",
"label_45"
] |
huangjian900116/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
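With `lr_scheduler_type: linear` and no warmup recorded, the learning rate decays from 1e-05 to zero over the course of training. A minimal sketch of that schedule (assuming zero warmup steps, which the auto-generated card does not state explicitly):

```python
def linear_lr(step, total_steps, base_lr=1e-5):
    """Linearly decay the learning rate from base_lr to 0 over total_steps,
    mirroring lr_scheduler_type: linear with no warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0, 1000))    # base_lr at the start
print(linear_lr(500, 1000))  # half of base_lr at the midpoint
print(linear_lr(1000, 1000)) # 0.0 at the end
```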
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
bismanwz/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
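For reference, the betas and epsilon listed above plug into the AdamW update rule as follows. This is a scalar sketch of one optimizer step, not the `torch.optim.AdamW` implementation itself:

```python
def adamw_step(param, grad, m, v, t, lr=1e-5,
               beta1=0.9, beta2=0.999, eps=1e-8, weight_decay=0.0):
    """One scalar AdamW update using the betas/epsilon from the card.
    m and v are the first- and second-moment running averages; t is the
    1-based step count used for bias correction."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * (m_hat / (v_hat ** 0.5 + eps) + weight_decay * param)
    return param, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adamw_step(p, grad=0.5, m=m, v=v, t=1)
print(p)  # slightly below 1.0 after one step with a positive gradient
```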
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
bismanwz/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- num_epochs: 30
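Unlike the sibling cards, this run uses `lr_scheduler_type: cosine`, so the learning rate follows a half-cosine from 5e-05 down to zero rather than a straight line. A minimal sketch, again assuming no warmup since none is recorded:

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-5):
    """Cosine-decay the learning rate from base_lr to 0,
    matching lr_scheduler_type: cosine (no warmup assumed)."""
    return base_lr * 0.5 * (1 + math.cos(math.pi * step / total_steps))

print(cosine_lr(0, 100))    # base_lr at the start
print(cosine_lr(50, 100))   # half of base_lr at the midpoint
print(cosine_lr(100, 100))  # ~0 at the end
```

Compared with the linear schedule, cosine decay keeps the rate near `base_lr` for longer early on and tapers more gently at the end.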
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
Ilhamfaisal/detr-resnet-50-dc5-grasshopper-testdata-finetuned2.0-maxsteps-10000-batchsize-2-ilham |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-grasshopper-testdata-finetuned2.0-maxsteps-10000-batchsize-2-ilham
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.0007
- Map: 0.0047
- Map 50: 0.0173
- Map 75: 0.0009
- Map Small: 0.0047
- Map Medium: -1.0
- Map Large: -1.0
- Mar 1: 0.0017
- Mar 10: 0.0117
- Mar 100: 0.0426
- Mar Small: 0.0426
- Mar Medium: -1.0
- Mar Large: -1.0
- Map Recilia dorsalis: 0.0113
- Mar 100 Recilia dorsalis: 0.1227
- Map Nephotettix malayanus: 0.0075
- Mar 100 Nephotettix malayanus: 0.0307
- Map Sogatella furcifera: 0.0
- Mar 100 Sogatella furcifera: 0.0
- Map Nilaparvata lugens: 0.0
- Mar 100 Nilaparvata lugens: 0.0171
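The Map 50 and Map 75 rows above are average precision at IoU thresholds of 0.5 and 0.75 respectively (and -1.0 marks size buckets with no ground-truth boxes). As a reminder of what those thresholds test, here is a minimal IoU computation for axis-aligned `(x0, y0, x1, y1)` boxes:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes. A prediction
    counts as a true positive for Map 50 when IoU >= 0.5, and for the
    stricter Map 75 when IoU >= 0.75."""
    x0 = max(box_a[0], box_b[0])
    y0 = max(box_a[1], box_b[1])
    x1 = min(box_a[2], box_b[2])
    y1 = min(box_a[3], box_b[3])
    inter = max(0.0, x1 - x0) * max(0.0, y1 - y0)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 2, 2), (1, 0, 3, 2)))
# -> ~0.333: below the 0.5 threshold, so not a match for Map 50
```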
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
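This run is capped by `training_steps: 10000` rather than an epoch count. The epoch column in the log below implies about 84 optimizer steps per epoch (50 steps ≈ 0.595 epochs), so the step budget works out to roughly 119 epochs; with `train_batch_size: 2` that would suggest on the order of 168 training images. These figures are inferred from the log, not recorded facts:

```python
# 50 steps correspond to ~0.5952 epochs in the training log,
# so steps_per_epoch and the effective epoch count follow directly.
steps_per_epoch = 50 / 0.5952
total_epochs = 10000 / steps_per_epoch
print(round(steps_per_epoch))      # ~84 batches per epoch
print(round(total_epochs, 1))      # ~119 epochs for training_steps=10000
```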
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Recilia dorsalis | Mar 100 Recilia dorsalis | Map Nephotettix malayanus | Mar 100 Nephotettix malayanus | Map Sogatella furcifera | Mar 100 Sogatella furcifera | Map Nilaparvata lugens | Mar 100 Nilaparvata lugens |
|:-------------:|:--------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------:|:------------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:---------------------------:|:----------------------:|:--------------------------:|
| 4.922 | 0.5952 | 50 | 4.2729 | 0.0001 | 0.0003 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0015 | 0.0085 | 0.0085 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0177 | 0.0 | 0.0 | 0.0002 | 0.0165 |
| 4.1668 | 1.1905 | 100 | 3.9789 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0051 | 0.0051 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0163 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 4.5854 | 1.7857 | 150 | 3.7985 | 0.0001 | 0.0002 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0015 | 0.0055 | 0.0055 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0163 | 0.0 | 0.0 | 0.0001 | 0.0059 |
| 4.3943 | 2.3810 | 200 | 3.7751 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.001 | 0.004 | 0.004 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0118 |
| 2.9732 | 2.9762 | 250 | 3.7922 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0013 | 0.0023 | 0.0023 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0 | 0.0 | 0.0001 | 0.0088 |
| 3.571 | 3.5714 | 300 | 3.8144 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.001 | 0.0018 | 0.0018 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0071 |
| 2.4278 | 4.1667 | 350 | 3.7715 | 0.0 | 0.0001 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0002 | 0.0005 | 0.0028 | 0.0028 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0094 |
| 4.0768 | 4.7619 | 400 | 3.6700 | 0.0001 | 0.0004 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0001 | 0.0011 | 0.0053 | 0.0053 | -1.0 | -1.0 | 0.0001 | 0.0125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0088 |
| 4.4523 | 5.3571 | 450 | 3.6946 | 0.0001 | 0.0004 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0009 | 0.0074 | 0.0074 | -1.0 | -1.0 | 0.0002 | 0.0213 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0082 |
| 3.5476 | 5.9524 | 500 | 3.7100 | 0.0 | 0.0002 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0016 | 0.0049 | 0.0049 | -1.0 | -1.0 | 0.0 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0153 |
| 3.5861 | 6.5476 | 550 | 3.6604 | 0.0001 | 0.0005 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0089 | 0.0089 | -1.0 | -1.0 | 0.0003 | 0.0343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 |
| 3.32 | 7.1429 | 600 | 3.6814 | 0.0003 | 0.0013 | 0.0 | 0.0003 | -1.0 | -1.0 | 0.0 | 0.003 | 0.0167 | 0.0167 | -1.0 | -1.0 | 0.001 | 0.0667 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4471 | 7.7381 | 650 | 3.7570 | 0.0001 | 0.0005 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0003 | 0.0003 | 0.0073 | 0.0073 | -1.0 | -1.0 | 0.0003 | 0.0292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.2238 | 8.3333 | 700 | 3.6898 | 0.0002 | 0.0015 | 0.0 | 0.0002 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0086 | 0.0086 | -1.0 | -1.0 | 0.0009 | 0.0338 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.8461 | 8.9286 | 750 | 3.7262 | 0.0002 | 0.001 | 0.0 | 0.0002 | -1.0 | -1.0 | 0.0 | 0.0014 | 0.0133 | 0.0133 | -1.0 | -1.0 | 0.0007 | 0.0532 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2069 | 9.5238 | 800 | 3.7627 | 0.0001 | 0.0007 | 0.0 | 0.0001 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0082 | 0.0082 | -1.0 | -1.0 | 0.0005 | 0.0329 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0347 | 10.1190 | 850 | 3.7339 | 0.0006 | 0.0028 | 0.0 | 0.0006 | -1.0 | -1.0 | 0.0 | 0.0027 | 0.0172 | 0.0172 | -1.0 | -1.0 | 0.0023 | 0.069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0595 | 10.7143 | 900 | 3.6200 | 0.0005 | 0.0025 | 0.0 | 0.0005 | -1.0 | -1.0 | 0.0 | 0.0019 | 0.0167 | 0.0167 | -1.0 | -1.0 | 0.0021 | 0.0667 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2273 | 11.3095 | 950 | 3.6058 | 0.001 | 0.0059 | 0.0 | 0.001 | -1.0 | -1.0 | 0.0003 | 0.0027 | 0.0157 | 0.0157 | -1.0 | -1.0 | 0.0039 | 0.063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6219 | 11.9048 | 1000 | 3.5723 | 0.0011 | 0.0045 | 0.0 | 0.0012 | -1.0 | -1.0 | 0.0003 | 0.0031 | 0.0115 | 0.0115 | -1.0 | -1.0 | 0.0045 | 0.0458 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5385 | 12.5 | 1050 | 3.5279 | 0.0018 | 0.0084 | 0.0002 | 0.0018 | -1.0 | -1.0 | 0.0002 | 0.0041 | 0.0233 | 0.0233 | -1.0 | -1.0 | 0.0072 | 0.0931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6905 | 13.0952 | 1100 | 3.5101 | 0.0008 | 0.0035 | 0.0001 | 0.0008 | -1.0 | -1.0 | 0.0 | 0.0022 | 0.0163 | 0.0163 | -1.0 | -1.0 | 0.0031 | 0.0653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0729 | 13.6905 | 1150 | 3.5698 | 0.0007 | 0.0032 | 0.0 | 0.0007 | -1.0 | -1.0 | 0.0 | 0.0024 | 0.0112 | 0.0112 | -1.0 | -1.0 | 0.0027 | 0.0449 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6636 | 14.2857 | 1200 | 3.4950 | 0.0011 | 0.0052 | 0.0001 | 0.0011 | -1.0 | -1.0 | 0.0 | 0.0028 | 0.0199 | 0.0199 | -1.0 | -1.0 | 0.0043 | 0.0796 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4026 | 14.8810 | 1250 | 3.4642 | 0.0019 | 0.0088 | 0.0001 | 0.0019 | -1.0 | -1.0 | 0.0003 | 0.0061 | 0.0266 | 0.0266 | -1.0 | -1.0 | 0.0074 | 0.1065 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.514 | 15.4762 | 1300 | 3.4471 | 0.0021 | 0.0074 | 0.0005 | 0.0021 | -1.0 | -1.0 | 0.0006 | 0.0047 | 0.0293 | 0.0293 | -1.0 | -1.0 | 0.0085 | 0.1171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4938 | 16.0714 | 1350 | 3.5414 | 0.0028 | 0.0095 | 0.0001 | 0.0028 | -1.0 | -1.0 | 0.0007 | 0.0043 | 0.023 | 0.023 | -1.0 | -1.0 | 0.0061 | 0.0898 | 0.005 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9956 | 16.6667 | 1400 | 3.4033 | 0.0031 | 0.012 | 0.0006 | 0.0031 | -1.0 | -1.0 | 0.0001 | 0.0084 | 0.0362 | 0.0362 | -1.0 | -1.0 | 0.0123 | 0.1449 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1115 | 17.2619 | 1450 | 3.4297 | 0.0022 | 0.0107 | 0.0001 | 0.0022 | -1.0 | -1.0 | 0.0007 | 0.0046 | 0.0264 | 0.0264 | -1.0 | -1.0 | 0.0086 | 0.1056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0744 | 17.8571 | 1500 | 3.4763 | 0.0018 | 0.0071 | 0.0001 | 0.0018 | -1.0 | -1.0 | 0.0003 | 0.0063 | 0.028 | 0.028 | -1.0 | -1.0 | 0.0073 | 0.112 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3757 | 18.4524 | 1550 | 3.4410 | 0.002 | 0.008 | 0.0004 | 0.002 | -1.0 | -1.0 | 0.0008 | 0.0066 | 0.0317 | 0.0317 | -1.0 | -1.0 | 0.008 | 0.1269 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.5711 | 19.0476 | 1600 | 3.4374 | 0.0024 | 0.0101 | 0.0003 | 0.0024 | -1.0 | -1.0 | 0.0001 | 0.0069 | 0.0322 | 0.0322 | -1.0 | -1.0 | 0.0096 | 0.1287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8704 | 19.6429 | 1650 | 3.4496 | 0.0018 | 0.0074 | 0.0004 | 0.0018 | -1.0 | -1.0 | 0.0007 | 0.0057 | 0.0252 | 0.0252 | -1.0 | -1.0 | 0.0073 | 0.1009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9042 | 20.2381 | 1700 | 3.3804 | 0.0029 | 0.0119 | 0.0004 | 0.0029 | -1.0 | -1.0 | 0.0002 | 0.0065 | 0.0306 | 0.0306 | -1.0 | -1.0 | 0.0117 | 0.1222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3289 | 20.8333 | 1750 | 3.4044 | 0.0021 | 0.0083 | 0.0 | 0.0021 | -1.0 | -1.0 | 0.001 | 0.0053 | 0.0238 | 0.0238 | -1.0 | -1.0 | 0.0086 | 0.0954 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4952 | 21.4286 | 1800 | 3.4224 | 0.0014 | 0.0069 | 0.0001 | 0.0014 | -1.0 | -1.0 | 0.0009 | 0.003 | 0.0231 | 0.0231 | -1.0 | -1.0 | 0.0058 | 0.0926 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7537 | 22.0238 | 1850 | 3.3736 | 0.0012 | 0.0055 | 0.0001 | 0.0012 | -1.0 | -1.0 | 0.0 | 0.0039 | 0.0223 | 0.0223 | -1.0 | -1.0 | 0.0047 | 0.0894 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.7113 | 22.6190 | 1900 | 3.5088 | 0.0011 | 0.0045 | 0.0002 | 0.0011 | -1.0 | -1.0 | 0.0 | 0.0038 | 0.0214 | 0.0214 | -1.0 | -1.0 | 0.0045 | 0.0856 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6112 | 23.2143 | 1950 | 3.3937 | 0.0017 | 0.0097 | 0.0 | 0.0017 | -1.0 | -1.0 | 0.0008 | 0.0043 | 0.0227 | 0.0227 | -1.0 | -1.0 | 0.0066 | 0.0903 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 4.0428 | 23.8095 | 2000 | 3.3784 | 0.0013 | 0.0054 | 0.0001 | 0.0013 | -1.0 | -1.0 | 0.0003 | 0.0045 | 0.0233 | 0.0233 | -1.0 | -1.0 | 0.0052 | 0.0931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.7735 | 24.4048 | 2050 | 3.3649 | 0.0025 | 0.0101 | 0.0009 | 0.0025 | -1.0 | -1.0 | 0.0017 | 0.0082 | 0.03 | 0.03 | -1.0 | -1.0 | 0.0085 | 0.1116 | 0.0015 | 0.006 | 0.0 | 0.0 | 0.0 | 0.0024 |
| 4.2583 | 25.0 | 2100 | 3.3489 | 0.0009 | 0.0043 | 0.0001 | 0.0009 | -1.0 | -1.0 | 0.0005 | 0.0032 | 0.0168 | 0.0168 | -1.0 | -1.0 | 0.0034 | 0.0671 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.383 | 25.5952 | 2150 | 3.3785 | 0.0014 | 0.0066 | 0.0001 | 0.0014 | -1.0 | -1.0 | 0.0015 | 0.0037 | 0.0221 | 0.0221 | -1.0 | -1.0 | 0.0054 | 0.0884 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9419 | 26.1905 | 2200 | 3.3480 | 0.0012 | 0.006 | 0.0001 | 0.0012 | -1.0 | -1.0 | 0.0008 | 0.0044 | 0.0208 | 0.0208 | -1.0 | -1.0 | 0.005 | 0.0796 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.6637 | 26.7857 | 2250 | 3.4193 | 0.0014 | 0.0066 | 0.0 | 0.0014 | -1.0 | -1.0 | 0.0005 | 0.0032 | 0.0243 | 0.0243 | -1.0 | -1.0 | 0.0055 | 0.0954 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 4.8536 | 27.3810 | 2300 | 3.3361 | 0.001 | 0.0053 | 0.0 | 0.001 | -1.0 | -1.0 | 0.0008 | 0.0036 | 0.0216 | 0.0216 | -1.0 | -1.0 | 0.0041 | 0.0852 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 |
| 2.4094 | 27.9762 | 2350 | 3.3181 | 0.0016 | 0.0066 | 0.0001 | 0.0016 | -1.0 | -1.0 | 0.0001 | 0.0037 | 0.0262 | 0.0262 | -1.0 | -1.0 | 0.0062 | 0.0995 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 4.2904 | 28.5714 | 2400 | 3.3715 | 0.0009 | 0.0047 | 0.0001 | 0.0009 | -1.0 | -1.0 | 0.0006 | 0.0027 | 0.0194 | 0.0194 | -1.0 | -1.0 | 0.0036 | 0.0722 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 3.5257 | 29.1667 | 2450 | 3.3269 | 0.0014 | 0.0062 | 0.0005 | 0.0014 | -1.0 | -1.0 | 0.0001 | 0.0035 | 0.0228 | 0.0228 | -1.0 | -1.0 | 0.0045 | 0.0829 | 0.0012 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.3029 | 29.7619 | 2500 | 3.2705 | 0.0012 | 0.0056 | 0.0001 | 0.0012 | -1.0 | -1.0 | 0.0008 | 0.005 | 0.0221 | 0.0221 | -1.0 | -1.0 | 0.0048 | 0.0847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.7728 | 30.3571 | 2550 | 3.3696 | 0.0014 | 0.0073 | 0.0 | 0.0014 | -1.0 | -1.0 | 0.0016 | 0.0035 | 0.02 | 0.02 | -1.0 | -1.0 | 0.0057 | 0.0745 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 4.2904 | 30.9524 | 2600 | 3.3354 | 0.0023 | 0.0119 | 0.0001 | 0.0023 | -1.0 | -1.0 | 0.001 | 0.0053 | 0.0258 | 0.0258 | -1.0 | -1.0 | 0.0074 | 0.1009 | 0.002 | 0.0019 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 3.9415 | 31.5476 | 2650 | 3.2970 | 0.0024 | 0.0105 | 0.0005 | 0.0024 | -1.0 | -1.0 | 0.0006 | 0.0066 | 0.0289 | 0.0289 | -1.0 | -1.0 | 0.0071 | 0.0995 | 0.0027 | 0.0107 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 3.7503 | 32.1429 | 2700 | 3.3180 | 0.0016 | 0.0063 | 0.0004 | 0.0016 | -1.0 | -1.0 | 0.0012 | 0.0056 | 0.0242 | 0.0242 | -1.0 | -1.0 | 0.0053 | 0.0847 | 0.001 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0094 |
| 3.876 | 32.7381 | 2750 | 3.2774 | 0.0016 | 0.0073 | 0.0 | 0.0016 | -1.0 | -1.0 | 0.0013 | 0.0041 | 0.0276 | 0.0276 | -1.0 | -1.0 | 0.0063 | 0.0917 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0188 |
| 4.616 | 33.3333 | 2800 | 3.2346 | 0.003 | 0.0104 | 0.0014 | 0.003 | -1.0 | -1.0 | 0.0019 | 0.0066 | 0.0321 | 0.0321 | -1.0 | -1.0 | 0.0068 | 0.0991 | 0.0052 | 0.0098 | 0.0 | 0.0 | 0.0 | 0.0194 |
| 3.6788 | 33.9286 | 2850 | 3.2381 | 0.002 | 0.0086 | 0.0002 | 0.002 | -1.0 | -1.0 | 0.0009 | 0.0065 | 0.0318 | 0.0318 | -1.0 | -1.0 | 0.0055 | 0.0949 | 0.0025 | 0.0121 | 0.0 | 0.0 | 0.0 | 0.02 |
| 2.4553 | 34.5238 | 2900 | 3.2698 | 0.0014 | 0.0067 | 0.0001 | 0.0014 | -1.0 | -1.0 | 0.0015 | 0.0049 | 0.0221 | 0.0221 | -1.0 | -1.0 | 0.0029 | 0.0625 | 0.0028 | 0.0093 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 4.0858 | 35.1190 | 2950 | 3.2620 | 0.0018 | 0.0072 | 0.001 | 0.0018 | -1.0 | -1.0 | 0.0019 | 0.0057 | 0.0269 | 0.0269 | -1.0 | -1.0 | 0.0033 | 0.0699 | 0.0038 | 0.0158 | 0.0 | 0.0 | 0.0001 | 0.0218 |
| 2.8577 | 35.7143 | 3000 | 3.2360 | 0.0017 | 0.007 | 0.0004 | 0.0017 | -1.0 | -1.0 | 0.0012 | 0.0053 | 0.0252 | 0.0252 | -1.0 | -1.0 | 0.004 | 0.0782 | 0.0029 | 0.0126 | 0.0 | 0.0 | 0.0 | 0.01 |
| 3.2801 | 36.3095 | 3050 | 3.1968 | 0.0021 | 0.0093 | 0.0002 | 0.0021 | -1.0 | -1.0 | 0.0016 | 0.0055 | 0.0278 | 0.0278 | -1.0 | -1.0 | 0.0039 | 0.0801 | 0.0043 | 0.0177 | 0.0 | 0.0 | 0.0 | 0.0135 |
| 4.5916 | 36.9048 | 3100 | 3.2167 | 0.0018 | 0.007 | 0.0004 | 0.0018 | -1.0 | -1.0 | 0.0021 | 0.0052 | 0.0315 | 0.0315 | -1.0 | -1.0 | 0.0045 | 0.0954 | 0.0027 | 0.0126 | 0.0 | 0.0 | 0.0 | 0.0182 |
| 3.7165 | 37.5 | 3150 | 3.2289 | 0.0023 | 0.0105 | 0.0004 | 0.0023 | -1.0 | -1.0 | 0.0015 | 0.0057 | 0.0314 | 0.0314 | -1.0 | -1.0 | 0.0055 | 0.0958 | 0.0036 | 0.0144 | 0.0 | 0.0 | 0.0 | 0.0153 |
| 2.5332 | 38.0952 | 3200 | 3.1863 | 0.0037 | 0.0124 | 0.0013 | 0.0037 | -1.0 | -1.0 | 0.002 | 0.0063 | 0.0367 | 0.0367 | -1.0 | -1.0 | 0.0084 | 0.1199 | 0.0066 | 0.0186 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 3.8024 | 38.6905 | 3250 | 3.2116 | 0.0024 | 0.0085 | 0.0003 | 0.0024 | -1.0 | -1.0 | 0.0019 | 0.0041 | 0.0289 | 0.0289 | -1.0 | -1.0 | 0.0053 | 0.0931 | 0.0044 | 0.0149 | 0.0 | 0.0 | 0.0 | 0.0076 |
| 2.5606 | 39.2857 | 3300 | 3.1705 | 0.003 | 0.0102 | 0.0011 | 0.0031 | -1.0 | -1.0 | 0.0022 | 0.0048 | 0.0393 | 0.0393 | -1.0 | -1.0 | 0.0084 | 0.1343 | 0.0038 | 0.013 | 0.0 | 0.0 | 0.0 | 0.01 |
| 3.5795 | 39.8810 | 3350 | 3.2093 | 0.0026 | 0.0097 | 0.0002 | 0.0026 | -1.0 | -1.0 | 0.0009 | 0.0038 | 0.0285 | 0.0285 | -1.0 | -1.0 | 0.0053 | 0.0894 | 0.0049 | 0.0098 | 0.0 | 0.0 | 0.0 | 0.0147 |
| 4.2509 | 40.4762 | 3400 | 3.2076 | 0.005 | 0.0138 | 0.003 | 0.005 | -1.0 | -1.0 | 0.0024 | 0.0059 | 0.0449 | 0.0449 | -1.0 | -1.0 | 0.0109 | 0.1486 | 0.0091 | 0.0144 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 3.1034 | 41.0714 | 3450 | 3.2442 | 0.003 | 0.0095 | 0.0003 | 0.003 | -1.0 | -1.0 | 0.0014 | 0.0051 | 0.0321 | 0.0321 | -1.0 | -1.0 | 0.0058 | 0.1051 | 0.0063 | 0.0163 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 2.4811 | 41.6667 | 3500 | 3.2295 | 0.0032 | 0.011 | 0.0012 | 0.0032 | -1.0 | -1.0 | 0.0019 | 0.0056 | 0.0346 | 0.0346 | -1.0 | -1.0 | 0.0068 | 0.1134 | 0.0061 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0135 |
| 2.4999 | 42.2619 | 3550 | 3.2601 | 0.0033 | 0.0099 | 0.0026 | 0.0033 | -1.0 | -1.0 | 0.0016 | 0.0049 | 0.0286 | 0.0286 | -1.0 | -1.0 | 0.0054 | 0.0949 | 0.0076 | 0.0158 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.5424 | 42.8571 | 3600 | 3.1707 | 0.0029 | 0.0093 | 0.0011 | 0.0029 | -1.0 | -1.0 | 0.0017 | 0.0044 | 0.0328 | 0.0328 | -1.0 | -1.0 | 0.0055 | 0.1102 | 0.0063 | 0.0191 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 3.6171 | 43.4524 | 3650 | 3.2108 | 0.0041 | 0.0112 | 0.0031 | 0.0041 | -1.0 | -1.0 | 0.0021 | 0.0062 | 0.0402 | 0.0402 | -1.0 | -1.0 | 0.0075 | 0.1315 | 0.009 | 0.0209 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 3.361 | 44.0476 | 3700 | 3.1418 | 0.0026 | 0.0099 | 0.0003 | 0.0026 | -1.0 | -1.0 | 0.0013 | 0.0028 | 0.0299 | 0.0299 | -1.0 | -1.0 | 0.0055 | 0.1037 | 0.0048 | 0.0088 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 4.3083 | 44.6429 | 3750 | 3.1361 | 0.0054 | 0.0162 | 0.0031 | 0.0054 | -1.0 | -1.0 | 0.0023 | 0.0085 | 0.0419 | 0.0419 | -1.0 | -1.0 | 0.0121 | 0.1407 | 0.0095 | 0.0163 | 0.0 | 0.0 | 0.0 | 0.0106 |
| 4.2463 | 45.2381 | 3800 | 3.1431 | 0.0046 | 0.0152 | 0.0032 | 0.0046 | -1.0 | -1.0 | 0.0015 | 0.0066 | 0.0411 | 0.0411 | -1.0 | -1.0 | 0.0098 | 0.1329 | 0.0085 | 0.0214 | 0.0 | 0.0 | 0.0 | 0.01 |
| 2.2291 | 45.8333 | 3850 | 3.1680 | 0.0037 | 0.0121 | 0.0004 | 0.0037 | -1.0 | -1.0 | 0.0022 | 0.0055 | 0.0359 | 0.0359 | -1.0 | -1.0 | 0.0074 | 0.1176 | 0.0073 | 0.0195 | 0.0 | 0.0 | 0.0 | 0.0065 |
| 3.5551 | 46.4286 | 3900 | 3.1039 | 0.0033 | 0.0109 | 0.0027 | 0.0033 | -1.0 | -1.0 | 0.0013 | 0.0049 | 0.0312 | 0.0312 | -1.0 | -1.0 | 0.0058 | 0.1005 | 0.0073 | 0.0163 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 3.353 | 47.0238 | 3950 | 3.0903 | 0.0031 | 0.0107 | 0.0005 | 0.0031 | -1.0 | -1.0 | 0.0021 | 0.0053 | 0.0346 | 0.0346 | -1.0 | -1.0 | 0.0063 | 0.1097 | 0.006 | 0.02 | 0.0 | 0.0 | 0.0 | 0.0088 |
| 4.4283 | 47.6190 | 4000 | 3.0969 | 0.0042 | 0.0151 | 0.0028 | 0.0043 | -1.0 | -1.0 | 0.0012 | 0.0046 | 0.037 | 0.037 | -1.0 | -1.0 | 0.0083 | 0.1185 | 0.0086 | 0.0219 | 0.0 | 0.0 | 0.0 | 0.0076 |
| 3.5267 | 48.2143 | 4050 | 3.1400 | 0.0034 | 0.0125 | 0.0002 | 0.0034 | -1.0 | -1.0 | 0.0014 | 0.0057 | 0.0337 | 0.0337 | -1.0 | -1.0 | 0.0063 | 0.1079 | 0.0073 | 0.0186 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 2.2114 | 48.8095 | 4100 | 3.1356 | 0.0028 | 0.0098 | 0.0002 | 0.0028 | -1.0 | -1.0 | 0.0015 | 0.0056 | 0.0277 | 0.0277 | -1.0 | -1.0 | 0.0056 | 0.0921 | 0.0057 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 4.0059 | 49.4048 | 4150 | 3.1413 | 0.0036 | 0.0111 | 0.0028 | 0.0036 | -1.0 | -1.0 | 0.0019 | 0.0037 | 0.0344 | 0.0344 | -1.0 | -1.0 | 0.0076 | 0.1199 | 0.0067 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 4.6763 | 50.0 | 4200 | 3.1362 | 0.0039 | 0.0118 | 0.0027 | 0.0039 | -1.0 | -1.0 | 0.0019 | 0.0067 | 0.0334 | 0.0334 | -1.0 | -1.0 | 0.0074 | 0.1144 | 0.0083 | 0.0158 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.6185 | 50.5952 | 4250 | 3.1500 | 0.003 | 0.0094 | 0.0009 | 0.003 | -1.0 | -1.0 | 0.0012 | 0.006 | 0.0338 | 0.0338 | -1.0 | -1.0 | 0.0069 | 0.1153 | 0.0051 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 4.0936 | 51.1905 | 4300 | 3.1810 | 0.0039 | 0.0099 | 0.0028 | 0.0039 | -1.0 | -1.0 | 0.0016 | 0.0068 | 0.0346 | 0.0346 | -1.0 | -1.0 | 0.0072 | 0.1222 | 0.0085 | 0.0098 | 0.0 | 0.0 | 0.0 | 0.0065 |
| 3.4563 | 51.7857 | 4350 | 3.2194 | 0.0035 | 0.0108 | 0.0016 | 0.0035 | -1.0 | -1.0 | 0.0028 | 0.0067 | 0.0337 | 0.0337 | -1.0 | -1.0 | 0.0087 | 0.1148 | 0.0052 | 0.0084 | 0.0 | 0.0 | 0.0 | 0.0118 |
| 1.8965 | 52.3810 | 4400 | 3.1726 | 0.0037 | 0.0137 | 0.0006 | 0.0037 | -1.0 | -1.0 | 0.0016 | 0.0061 | 0.0385 | 0.0385 | -1.0 | -1.0 | 0.01 | 0.1278 | 0.0048 | 0.0116 | 0.0 | 0.0 | 0.0 | 0.0147 |
| 3.6665 | 52.9762 | 4450 | 3.2276 | 0.0028 | 0.0107 | 0.0004 | 0.0028 | -1.0 | -1.0 | 0.0015 | 0.0058 | 0.0292 | 0.0292 | -1.0 | -1.0 | 0.0071 | 0.0931 | 0.0041 | 0.0144 | 0.0 | 0.0 | 0.0 | 0.0094 |
| 3.354 | 53.5714 | 4500 | 3.1528 | 0.0033 | 0.0113 | 0.0006 | 0.0033 | -1.0 | -1.0 | 0.0015 | 0.0067 | 0.0357 | 0.0357 | -1.0 | -1.0 | 0.009 | 0.1245 | 0.0041 | 0.0088 | 0.0 | 0.0 | 0.0 | 0.0094 |
| 4.3119 | 54.1667 | 4550 | 3.1685 | 0.0034 | 0.0111 | 0.0011 | 0.0034 | -1.0 | -1.0 | 0.0007 | 0.0039 | 0.0336 | 0.0336 | -1.0 | -1.0 | 0.0076 | 0.1148 | 0.006 | 0.0102 | 0.0 | 0.0 | 0.0 | 0.0094 |
| 3.0092 | 54.7619 | 4600 | 3.1044 | 0.0047 | 0.0164 | 0.0012 | 0.0047 | -1.0 | -1.0 | 0.0009 | 0.0074 | 0.0403 | 0.0403 | -1.0 | -1.0 | 0.0117 | 0.1329 | 0.0071 | 0.0181 | 0.0 | 0.0 | 0.0 | 0.01 |
| 2.5933 | 55.3571 | 4650 | 3.1413 | 0.0048 | 0.0155 | 0.0015 | 0.0048 | -1.0 | -1.0 | 0.0013 | 0.0066 | 0.0399 | 0.0399 | -1.0 | -1.0 | 0.0111 | 0.1296 | 0.0082 | 0.0181 | 0.0 | 0.0 | 0.0 | 0.0118 |
| 4.3193 | 55.9524 | 4700 | 3.0734 | 0.0042 | 0.0134 | 0.001 | 0.0042 | -1.0 | -1.0 | 0.0015 | 0.0066 | 0.04 | 0.04 | -1.0 | -1.0 | 0.0094 | 0.1324 | 0.0072 | 0.0153 | 0.0 | 0.0 | 0.0 | 0.0124 |
| 2.8189 | 56.5476 | 4750 | 3.1543 | 0.0037 | 0.0145 | 0.0005 | 0.0037 | -1.0 | -1.0 | 0.0014 | 0.0055 | 0.0337 | 0.0337 | -1.0 | -1.0 | 0.0073 | 0.1083 | 0.0076 | 0.02 | 0.0 | 0.0 | 0.0 | 0.0065 |
| 4.3455 | 57.1429 | 4800 | 3.1613 | 0.0035 | 0.0136 | 0.0005 | 0.0035 | -1.0 | -1.0 | 0.0005 | 0.0042 | 0.0347 | 0.0347 | -1.0 | -1.0 | 0.0081 | 0.1167 | 0.0059 | 0.0163 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 2.3567 | 57.7381 | 4850 | 3.0877 | 0.0041 | 0.0161 | 0.0009 | 0.0041 | -1.0 | -1.0 | 0.001 | 0.0058 | 0.0396 | 0.0396 | -1.0 | -1.0 | 0.0084 | 0.131 | 0.0079 | 0.0195 | 0.0 | 0.0 | 0.0 | 0.0076 |
| 2.0553 | 58.3333 | 4900 | 3.1233 | 0.0048 | 0.0143 | 0.0034 | 0.0048 | -1.0 | -1.0 | 0.0013 | 0.0051 | 0.038 | 0.038 | -1.0 | -1.0 | 0.008 | 0.1208 | 0.0111 | 0.0251 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 3.5256 | 58.9286 | 4950 | 3.1314 | 0.0044 | 0.0147 | 0.0028 | 0.0044 | -1.0 | -1.0 | 0.0008 | 0.0037 | 0.0345 | 0.0345 | -1.0 | -1.0 | 0.0066 | 0.1088 | 0.0112 | 0.0223 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 4.0251 | 59.5238 | 5000 | 3.1076 | 0.0039 | 0.0129 | 0.0006 | 0.0039 | -1.0 | -1.0 | 0.0009 | 0.0052 | 0.0363 | 0.0363 | -1.0 | -1.0 | 0.0074 | 0.1181 | 0.0083 | 0.0195 | 0.0 | 0.0 | 0.0 | 0.0076 |
| 3.0831 | 60.1190 | 5050 | 3.1320 | 0.0044 | 0.012 | 0.0027 | 0.0044 | -1.0 | -1.0 | 0.0029 | 0.0066 | 0.0341 | 0.0341 | -1.0 | -1.0 | 0.0069 | 0.1074 | 0.0107 | 0.0195 | 0.0 | 0.0 | 0.0 | 0.0094 |
| 3.6648 | 60.7143 | 5100 | 3.1147 | 0.0052 | 0.0173 | 0.003 | 0.0052 | -1.0 | -1.0 | 0.0034 | 0.0087 | 0.0408 | 0.0408 | -1.0 | -1.0 | 0.0101 | 0.1278 | 0.0105 | 0.0265 | 0.0 | 0.0 | 0.0 | 0.0088 |
| 3.9512 | 61.3095 | 5150 | 3.1377 | 0.0032 | 0.0111 | 0.0002 | 0.0032 | -1.0 | -1.0 | 0.001 | 0.0051 | 0.0267 | 0.0267 | -1.0 | -1.0 | 0.005 | 0.0861 | 0.0077 | 0.0149 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 4.6871 | 61.9048 | 5200 | 3.1088 | 0.0036 | 0.0121 | 0.0002 | 0.0036 | -1.0 | -1.0 | 0.001 | 0.0065 | 0.0313 | 0.0313 | -1.0 | -1.0 | 0.0069 | 0.1019 | 0.0077 | 0.0181 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 3.4733 | 62.5 | 5250 | 3.1430 | 0.0041 | 0.0144 | 0.0028 | 0.0041 | -1.0 | -1.0 | 0.0019 | 0.0074 | 0.034 | 0.034 | -1.0 | -1.0 | 0.0074 | 0.1056 | 0.0091 | 0.02 | 0.0 | 0.0 | 0.0 | 0.0106 |
| 1.7946 | 63.0952 | 5300 | 3.1228 | 0.0038 | 0.0123 | 0.0003 | 0.0038 | -1.0 | -1.0 | 0.0016 | 0.0072 | 0.0346 | 0.0346 | -1.0 | -1.0 | 0.007 | 0.106 | 0.0083 | 0.0214 | 0.0 | 0.0 | 0.0 | 0.0112 |
| 2.0241 | 63.6905 | 5350 | 3.1019 | 0.0038 | 0.0118 | 0.0005 | 0.0038 | -1.0 | -1.0 | 0.0029 | 0.0059 | 0.0346 | 0.0346 | -1.0 | -1.0 | 0.0075 | 0.1046 | 0.0079 | 0.0228 | 0.0 | 0.0 | 0.0 | 0.0112 |
| 4.195 | 64.2857 | 5400 | 3.1583 | 0.003 | 0.0091 | 0.0026 | 0.003 | -1.0 | -1.0 | 0.0016 | 0.0055 | 0.0236 | 0.0236 | -1.0 | -1.0 | 0.0044 | 0.069 | 0.0075 | 0.0149 | 0.0 | 0.0 | 0.0 | 0.0106 |
| 4.0282 | 64.8810 | 5450 | 3.1138 | 0.0048 | 0.0146 | 0.0012 | 0.0048 | -1.0 | -1.0 | 0.0026 | 0.0079 | 0.0388 | 0.0388 | -1.0 | -1.0 | 0.0094 | 0.1111 | 0.0097 | 0.0307 | 0.0 | 0.0 | 0.0 | 0.0135 |
| 3.6162 | 65.4762 | 5500 | 3.0953 | 0.005 | 0.0159 | 0.0029 | 0.005 | -1.0 | -1.0 | 0.0027 | 0.0075 | 0.0413 | 0.0413 | -1.0 | -1.0 | 0.0106 | 0.1259 | 0.0092 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.0135 |
| 3.4249 | 66.0714 | 5550 | 3.0953 | 0.0055 | 0.0169 | 0.002 | 0.0055 | -1.0 | -1.0 | 0.003 | 0.0095 | 0.0473 | 0.0473 | -1.0 | -1.0 | 0.0127 | 0.1449 | 0.0091 | 0.0288 | 0.0 | 0.0 | 0.0 | 0.0153 |
| 3.7631 | 66.6667 | 5600 | 3.1707 | 0.0047 | 0.0136 | 0.0013 | 0.0047 | -1.0 | -1.0 | 0.0031 | 0.0071 | 0.0383 | 0.0383 | -1.0 | -1.0 | 0.0099 | 0.1241 | 0.009 | 0.0209 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 2.3827 | 67.2619 | 5650 | 3.1308 | 0.0046 | 0.0144 | 0.0028 | 0.0046 | -1.0 | -1.0 | 0.001 | 0.0061 | 0.0374 | 0.0374 | -1.0 | -1.0 | 0.0101 | 0.1231 | 0.0082 | 0.02 | 0.0 | 0.0 | 0.0 | 0.0065 |
| 2.4558 | 67.8571 | 5700 | 3.0937 | 0.0048 | 0.0154 | 0.0028 | 0.0048 | -1.0 | -1.0 | 0.0015 | 0.0056 | 0.0347 | 0.0347 | -1.0 | -1.0 | 0.0094 | 0.1079 | 0.01 | 0.0205 | 0.0 | 0.0 | 0.0 | 0.0106 |
| 1.9914 | 68.4524 | 5750 | 3.1093 | 0.0048 | 0.0152 | 0.0029 | 0.0048 | -1.0 | -1.0 | 0.002 | 0.0056 | 0.0393 | 0.0393 | -1.0 | -1.0 | 0.011 | 0.131 | 0.0084 | 0.0172 | 0.0 | 0.0 | 0.0 | 0.0088 |
| 2.4137 | 69.0476 | 5800 | 3.1115 | 0.0051 | 0.0181 | 0.0011 | 0.0051 | -1.0 | -1.0 | 0.0021 | 0.0071 | 0.0429 | 0.0429 | -1.0 | -1.0 | 0.012 | 0.1347 | 0.0082 | 0.0223 | 0.0 | 0.0 | 0.0 | 0.0147 |
| 4.5029 | 69.6429 | 5850 | 3.1292 | 0.0055 | 0.0193 | 0.0027 | 0.0055 | -1.0 | -1.0 | 0.003 | 0.0073 | 0.0389 | 0.0389 | -1.0 | -1.0 | 0.013 | 0.1241 | 0.009 | 0.0228 | 0.0 | 0.0 | 0.0 | 0.0088 |
| 4.7825 | 70.2381 | 5900 | 3.1378 | 0.0054 | 0.0181 | 0.003 | 0.0054 | -1.0 | -1.0 | 0.0019 | 0.0068 | 0.041 | 0.041 | -1.0 | -1.0 | 0.0116 | 0.1264 | 0.0098 | 0.0251 | 0.0 | 0.0 | 0.0 | 0.0124 |
| 3.5144 | 70.8333 | 5950 | 3.1005 | 0.0055 | 0.0175 | 0.0032 | 0.0056 | -1.0 | -1.0 | 0.0021 | 0.0078 | 0.0431 | 0.0431 | -1.0 | -1.0 | 0.0122 | 0.131 | 0.01 | 0.0302 | 0.0 | 0.0 | 0.0 | 0.0112 |
| 3.7245 | 71.4286 | 6000 | 3.0791 | 0.0049 | 0.0162 | 0.0005 | 0.0049 | -1.0 | -1.0 | 0.0019 | 0.0081 | 0.0443 | 0.0443 | -1.0 | -1.0 | 0.0122 | 0.1398 | 0.0073 | 0.0237 | 0.0 | 0.0 | 0.0 | 0.0135 |
| 1.0643 | 72.0238 | 6050 | 3.1118 | 0.0045 | 0.0145 | 0.0027 | 0.0045 | -1.0 | -1.0 | 0.0014 | 0.0059 | 0.0394 | 0.0394 | -1.0 | -1.0 | 0.0102 | 0.1245 | 0.0078 | 0.0214 | 0.0 | 0.0 | 0.0 | 0.0118 |
| 4.0432 | 72.6190 | 6100 | 3.1661 | 0.004 | 0.0132 | 0.0004 | 0.004 | -1.0 | -1.0 | 0.0014 | 0.0073 | 0.0359 | 0.0359 | -1.0 | -1.0 | 0.0078 | 0.1083 | 0.0083 | 0.0307 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 3.1888 | 73.2143 | 6150 | 3.0644 | 0.0047 | 0.0173 | 0.0009 | 0.0047 | -1.0 | -1.0 | 0.001 | 0.006 | 0.0413 | 0.0413 | -1.0 | -1.0 | 0.0108 | 0.1301 | 0.0081 | 0.0312 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 2.8505 | 73.8095 | 6200 | 3.0855 | 0.0054 | 0.0202 | 0.001 | 0.0054 | -1.0 | -1.0 | 0.0015 | 0.0067 | 0.0485 | 0.0485 | -1.0 | -1.0 | 0.015 | 0.1574 | 0.0065 | 0.0265 | 0.0 | 0.0 | 0.0 | 0.01 |
| 3.5999 | 74.4048 | 6250 | 3.1414 | 0.0043 | 0.0146 | 0.003 | 0.0043 | -1.0 | -1.0 | 0.0016 | 0.0063 | 0.0344 | 0.0344 | -1.0 | -1.0 | 0.0089 | 0.112 | 0.0082 | 0.0209 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 2.2493 | 75.0 | 6300 | 3.0587 | 0.0046 | 0.0185 | 0.0008 | 0.0046 | -1.0 | -1.0 | 0.0017 | 0.0063 | 0.042 | 0.042 | -1.0 | -1.0 | 0.0118 | 0.1366 | 0.0067 | 0.0279 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 2.6329 | 75.5952 | 6350 | 3.0739 | 0.0049 | 0.0176 | 0.0006 | 0.0049 | -1.0 | -1.0 | 0.001 | 0.0059 | 0.0433 | 0.0433 | -1.0 | -1.0 | 0.0111 | 0.1301 | 0.0085 | 0.0307 | 0.0 | 0.0 | 0.0 | 0.0124 |
| 2.7892 | 76.1905 | 6400 | 3.0635 | 0.0047 | 0.016 | 0.0006 | 0.0047 | -1.0 | -1.0 | 0.0009 | 0.0065 | 0.0432 | 0.0432 | -1.0 | -1.0 | 0.0115 | 0.1361 | 0.0072 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 2.8733 | 76.7857 | 6450 | 3.1128 | 0.0042 | 0.0129 | 0.0029 | 0.0042 | -1.0 | -1.0 | 0.0017 | 0.0062 | 0.0358 | 0.0358 | -1.0 | -1.0 | 0.0077 | 0.1116 | 0.0092 | 0.0274 | 0.0 | 0.0 | 0.0 | 0.0041 |
| 3.3976 | 77.3810 | 6500 | 3.0844 | 0.0038 | 0.0139 | 0.0009 | 0.0038 | -1.0 | -1.0 | 0.001 | 0.0084 | 0.0408 | 0.0408 | -1.0 | -1.0 | 0.01 | 0.1333 | 0.0052 | 0.0233 | 0.0 | 0.0 | 0.0 | 0.0065 |
| 3.1254 | 77.9762 | 6550 | 3.1135 | 0.0031 | 0.0109 | 0.0008 | 0.0031 | -1.0 | -1.0 | 0.0007 | 0.0053 | 0.0347 | 0.0347 | -1.0 | -1.0 | 0.007 | 0.1144 | 0.0052 | 0.0214 | 0.0 | 0.0 | 0.0 | 0.0029 |
| 3.7946 | 78.5714 | 6600 | 3.0972 | 0.004 | 0.0144 | 0.0008 | 0.004 | -1.0 | -1.0 | 0.0009 | 0.0067 | 0.0395 | 0.0395 | -1.0 | -1.0 | 0.009 | 0.1315 | 0.0069 | 0.0219 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 6.2164 | 79.1667 | 6650 | 3.1311 | 0.0036 | 0.0114 | 0.0004 | 0.0036 | -1.0 | -1.0 | 0.0013 | 0.0074 | 0.0344 | 0.0344 | -1.0 | -1.0 | 0.0069 | 0.1125 | 0.0074 | 0.0214 | 0.0 | 0.0 | 0.0 | 0.0035 |
| 3.631 | 79.7619 | 6700 | 3.1717 | 0.0038 | 0.0122 | 0.0027 | 0.0038 | -1.0 | -1.0 | 0.0008 | 0.0077 | 0.032 | 0.032 | -1.0 | -1.0 | 0.0067 | 0.1019 | 0.0083 | 0.0195 | 0.0 | 0.0 | 0.0 | 0.0065 |
| 4.0791 | 80.3571 | 6750 | 3.1171 | 0.004 | 0.0129 | 0.0002 | 0.004 | -1.0 | -1.0 | 0.0009 | 0.0084 | 0.0344 | 0.0344 | -1.0 | -1.0 | 0.0084 | 0.1171 | 0.0074 | 0.02 | 0.0 | 0.0 | 0.0 | 0.0006 |
| 4.3475 | 80.9524 | 6800 | 3.0581 | 0.0038 | 0.0141 | 0.0003 | 0.0038 | -1.0 | -1.0 | 0.0016 | 0.0086 | 0.0335 | 0.0335 | -1.0 | -1.0 | 0.0076 | 0.0991 | 0.0078 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.0094 |
| 3.9135 | 81.5476 | 6850 | 3.0715 | 0.0039 | 0.0141 | 0.0001 | 0.0039 | -1.0 | -1.0 | 0.0026 | 0.0071 | 0.03 | 0.03 | -1.0 | -1.0 | 0.0062 | 0.0852 | 0.0093 | 0.0242 | 0.0 | 0.0 | 0.0 | 0.0106 |
| 3.6389 | 82.1429 | 6900 | 3.0798 | 0.0042 | 0.0144 | 0.0007 | 0.0042 | -1.0 | -1.0 | 0.0026 | 0.0086 | 0.0339 | 0.0339 | -1.0 | -1.0 | 0.0074 | 0.1028 | 0.0093 | 0.027 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 1.648 | 82.7381 | 6950 | 3.0625 | 0.0048 | 0.0155 | 0.001 | 0.0048 | -1.0 | -1.0 | 0.0024 | 0.0124 | 0.0414 | 0.0414 | -1.0 | -1.0 | 0.0101 | 0.1269 | 0.0093 | 0.0288 | 0.0 | 0.0 | 0.0 | 0.01 |
| 4.0621 | 83.3333 | 7000 | 3.0674 | 0.0038 | 0.0136 | 0.0002 | 0.0038 | -1.0 | -1.0 | 0.0021 | 0.0103 | 0.0341 | 0.0341 | -1.0 | -1.0 | 0.0071 | 0.1056 | 0.0082 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 3.6219 | 83.9286 | 7050 | 3.1208 | 0.004 | 0.0138 | 0.0002 | 0.004 | -1.0 | -1.0 | 0.0014 | 0.0096 | 0.0354 | 0.0354 | -1.0 | -1.0 | 0.0081 | 0.1106 | 0.008 | 0.0251 | 0.0 | 0.0 | 0.0 | 0.0059 |
| 3.3229 | 84.5238 | 7100 | 3.1118 | 0.0043 | 0.0139 | 0.0005 | 0.0043 | -1.0 | -1.0 | 0.0013 | 0.0085 | 0.037 | 0.037 | -1.0 | -1.0 | 0.009 | 0.119 | 0.008 | 0.0237 | 0.0 | 0.0 | 0.0 | 0.0053 |
| 2.964 | 85.1190 | 7150 | 3.1132 | 0.0044 | 0.0138 | 0.0029 | 0.0044 | -1.0 | -1.0 | 0.0015 | 0.0085 | 0.037 | 0.037 | -1.0 | -1.0 | 0.0094 | 0.1208 | 0.0082 | 0.02 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 3.5183 | 85.7143 | 7200 | 3.0452 | 0.0044 | 0.0155 | 0.0007 | 0.0044 | -1.0 | -1.0 | 0.0013 | 0.0099 | 0.0402 | 0.0402 | -1.0 | -1.0 | 0.0091 | 0.1213 | 0.0083 | 0.0242 | 0.0 | 0.0 | 0.0 | 0.0153 |
| 3.2454 | 86.3095 | 7250 | 3.0898 | 0.0036 | 0.0132 | 0.001 | 0.0036 | -1.0 | -1.0 | 0.0013 | 0.0066 | 0.0342 | 0.0342 | -1.0 | -1.0 | 0.008 | 0.1023 | 0.0065 | 0.0209 | 0.0 | 0.0 | 0.0 | 0.0135 |
| 2.2413 | 86.9048 | 7300 | 3.1073 | 0.0033 | 0.0131 | 0.001 | 0.0033 | -1.0 | -1.0 | 0.0005 | 0.0055 | 0.0348 | 0.0348 | -1.0 | -1.0 | 0.0084 | 0.1046 | 0.0046 | 0.0167 | 0.0 | 0.0 | 0.0 | 0.0176 |
| 3.4964 | 87.5 | 7350 | 3.0578 | 0.0046 | 0.0154 | 0.0013 | 0.0046 | -1.0 | -1.0 | 0.002 | 0.0062 | 0.0371 | 0.0371 | -1.0 | -1.0 | 0.0101 | 0.113 | 0.0081 | 0.0214 | 0.0 | 0.0 | 0.0 | 0.0141 |
| 3.8867 | 88.0952 | 7400 | 3.0704 | 0.0047 | 0.0162 | 0.0004 | 0.0047 | -1.0 | -1.0 | 0.002 | 0.0088 | 0.038 | 0.038 | -1.0 | -1.0 | 0.0106 | 0.112 | 0.0081 | 0.027 | 0.0 | 0.0 | 0.0 | 0.0129 |
| 2.9379 | 88.6905 | 7450 | 3.0753 | 0.0043 | 0.0138 | 0.0029 | 0.0043 | -1.0 | -1.0 | 0.0016 | 0.0068 | 0.035 | 0.035 | -1.0 | -1.0 | 0.0089 | 0.1037 | 0.0083 | 0.0247 | 0.0 | 0.0 | 0.0 | 0.0118 |
| 1.3889 | 89.2857 | 7500 | 3.0395 | 0.0048 | 0.0167 | 0.0003 | 0.0048 | -1.0 | -1.0 | 0.0022 | 0.0093 | 0.0411 | 0.0411 | -1.0 | -1.0 | 0.0111 | 0.1245 | 0.0082 | 0.027 | 0.0 | 0.0 | 0.0 | 0.0129 |
| 2.8528 | 89.8810 | 7550 | 3.0231 | 0.0043 | 0.0146 | 0.0005 | 0.0043 | -1.0 | -1.0 | 0.0023 | 0.0089 | 0.038 | 0.038 | -1.0 | -1.0 | 0.0097 | 0.1144 | 0.0073 | 0.0237 | 0.0 | 0.0 | 0.0 | 0.0141 |
| 2.7471 | 90.4762 | 7600 | 3.0100 | 0.0046 | 0.0162 | 0.0018 | 0.0046 | -1.0 | -1.0 | 0.0024 | 0.0101 | 0.0406 | 0.0406 | -1.0 | -1.0 | 0.0105 | 0.1236 | 0.0078 | 0.027 | 0.0 | 0.0 | 0.0 | 0.0118 |
| 4.8898 | 91.0714 | 7650 | 2.9963 | 0.0045 | 0.0153 | 0.002 | 0.0045 | -1.0 | -1.0 | 0.0023 | 0.0082 | 0.0386 | 0.0386 | -1.0 | -1.0 | 0.0098 | 0.1171 | 0.008 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0088 |
| 2.6671 | 91.6667 | 7700 | 3.0438 | 0.0043 | 0.0124 | 0.0027 | 0.0043 | -1.0 | -1.0 | 0.0026 | 0.008 | 0.0315 | 0.0315 | -1.0 | -1.0 | 0.0079 | 0.0958 | 0.0094 | 0.0233 | 0.0 | 0.0 | 0.0 | 0.0071 |
| 4.8524 | 92.2619 | 7750 | 3.0317 | 0.0045 | 0.0147 | 0.003 | 0.0045 | -1.0 | -1.0 | 0.0035 | 0.0099 | 0.0352 | 0.0352 | -1.0 | -1.0 | 0.0092 | 0.1037 | 0.0087 | 0.0265 | 0.0 | 0.0 | 0.0 | 0.0106 |
| 2.5835 | 92.8571 | 7800 | 3.0333 | 0.005 | 0.0178 | 0.003 | 0.005 | -1.0 | -1.0 | 0.0035 | 0.0095 | 0.0392 | 0.0392 | -1.0 | -1.0 | 0.0099 | 0.1134 | 0.0102 | 0.0293 | 0.0 | 0.0 | 0.0 | 0.0141 |
| 4.065 | 93.4524 | 7850 | 3.0547 | 0.005 | 0.0149 | 0.003 | 0.005 | -1.0 | -1.0 | 0.0017 | 0.008 | 0.0368 | 0.0368 | -1.0 | -1.0 | 0.0105 | 0.1162 | 0.0094 | 0.02 | 0.0 | 0.0 | 0.0 | 0.0112 |
| 3.9096 | 94.0476 | 7900 | 3.0684 | 0.005 | 0.0158 | 0.0029 | 0.005 | -1.0 | -1.0 | 0.0037 | 0.0087 | 0.038 | 0.038 | -1.0 | -1.0 | 0.01 | 0.1144 | 0.0101 | 0.0223 | 0.0 | 0.0 | 0.0 | 0.0153 |
| 3.097 | 94.6429 | 7950 | 3.0694 | 0.0051 | 0.0171 | 0.0032 | 0.0051 | -1.0 | -1.0 | 0.0027 | 0.0089 | 0.0388 | 0.0388 | -1.0 | -1.0 | 0.011 | 0.1176 | 0.0094 | 0.0219 | 0.0 | 0.0 | 0.0 | 0.0159 |
| 3.4465 | 95.2381 | 8000 | 3.0668 | 0.0056 | 0.0179 | 0.0032 | 0.0056 | -1.0 | -1.0 | 0.0028 | 0.0106 | 0.0421 | 0.0421 | -1.0 | -1.0 | 0.0118 | 0.1264 | 0.0107 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 2.9165 | 95.8333 | 8050 | 3.0545 | 0.0057 | 0.0177 | 0.0028 | 0.0057 | -1.0 | -1.0 | 0.0023 | 0.0117 | 0.043 | 0.043 | -1.0 | -1.0 | 0.0126 | 0.1292 | 0.01 | 0.0265 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 3.7888 | 96.4286 | 8100 | 3.0652 | 0.0053 | 0.0166 | 0.0029 | 0.0053 | -1.0 | -1.0 | 0.002 | 0.0097 | 0.0398 | 0.0398 | -1.0 | -1.0 | 0.0114 | 0.1194 | 0.0096 | 0.026 | 0.0 | 0.0 | 0.0 | 0.0135 |
| 3.4622 | 97.0238 | 8150 | 3.0395 | 0.0051 | 0.0169 | 0.0007 | 0.0051 | -1.0 | -1.0 | 0.0016 | 0.0129 | 0.04 | 0.04 | -1.0 | -1.0 | 0.0116 | 0.1194 | 0.0089 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0124 |
| 2.4208 | 97.6190 | 8200 | 3.0382 | 0.0056 | 0.0173 | 0.0034 | 0.0056 | -1.0 | -1.0 | 0.0022 | 0.0108 | 0.0428 | 0.0428 | -1.0 | -1.0 | 0.0131 | 0.1269 | 0.0095 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0159 |
| 3.4318 | 98.2143 | 8250 | 3.0362 | 0.0054 | 0.0175 | 0.0007 | 0.0054 | -1.0 | -1.0 | 0.0015 | 0.0114 | 0.0439 | 0.0439 | -1.0 | -1.0 | 0.0123 | 0.1273 | 0.0092 | 0.0302 | 0.0 | 0.0 | 0.0 | 0.0182 |
| 2.8207 | 98.8095 | 8300 | 3.0339 | 0.0054 | 0.0173 | 0.001 | 0.0054 | -1.0 | -1.0 | 0.0021 | 0.0111 | 0.0442 | 0.0442 | -1.0 | -1.0 | 0.0127 | 0.1319 | 0.0087 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 1.7095 | 99.4048 | 8350 | 3.0436 | 0.0056 | 0.0198 | 0.0011 | 0.0056 | -1.0 | -1.0 | 0.0028 | 0.011 | 0.0449 | 0.0449 | -1.0 | -1.0 | 0.0149 | 0.1389 | 0.0073 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.0153 |
| 3.8167 | 100.0 | 8400 | 3.0498 | 0.0049 | 0.0179 | 0.0008 | 0.0049 | -1.0 | -1.0 | 0.0028 | 0.0114 | 0.0438 | 0.0438 | -1.0 | -1.0 | 0.0129 | 0.1301 | 0.0067 | 0.0279 | 0.0 | 0.0 | 0.0 | 0.0171 |
| 3.4136 | 100.5952 | 8450 | 3.0501 | 0.0048 | 0.017 | 0.001 | 0.0048 | -1.0 | -1.0 | 0.0015 | 0.0103 | 0.0429 | 0.0429 | -1.0 | -1.0 | 0.0128 | 0.1287 | 0.0065 | 0.0265 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 2.8302 | 101.1905 | 8500 | 3.0754 | 0.0052 | 0.019 | 0.0005 | 0.0052 | -1.0 | -1.0 | 0.0016 | 0.0104 | 0.0417 | 0.0417 | -1.0 | -1.0 | 0.0126 | 0.1213 | 0.0081 | 0.0242 | 0.0 | 0.0 | 0.0001 | 0.0212 |
| 3.1284 | 101.7857 | 8550 | 3.0672 | 0.0051 | 0.0191 | 0.0006 | 0.0051 | -1.0 | -1.0 | 0.0019 | 0.0101 | 0.0417 | 0.0417 | -1.0 | -1.0 | 0.0119 | 0.1199 | 0.0082 | 0.0247 | 0.0 | 0.0 | 0.0 | 0.0224 |
| 2.2794 | 102.3810 | 8600 | 3.0259 | 0.0052 | 0.018 | 0.0009 | 0.0052 | -1.0 | -1.0 | 0.0015 | 0.0102 | 0.0439 | 0.0439 | -1.0 | -1.0 | 0.0131 | 0.1301 | 0.0078 | 0.026 | 0.0 | 0.0 | 0.0 | 0.0194 |
| 2.5676 | 102.9762 | 8650 | 3.0616 | 0.0049 | 0.0172 | 0.0009 | 0.0049 | -1.0 | -1.0 | 0.0014 | 0.011 | 0.0419 | 0.0419 | -1.0 | -1.0 | 0.0118 | 0.1227 | 0.0079 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.0194 |
| 1.8186 | 103.5714 | 8700 | 3.0570 | 0.0046 | 0.017 | 0.0009 | 0.0046 | -1.0 | -1.0 | 0.0016 | 0.0107 | 0.0401 | 0.0401 | -1.0 | -1.0 | 0.0108 | 0.1199 | 0.0075 | 0.0219 | 0.0 | 0.0 | 0.0 | 0.0188 |
| 4.6028 | 104.1667 | 8750 | 3.0423 | 0.0048 | 0.0163 | 0.0008 | 0.0048 | -1.0 | -1.0 | 0.0016 | 0.0108 | 0.0419 | 0.0419 | -1.0 | -1.0 | 0.0111 | 0.125 | 0.0081 | 0.026 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 2.3711 | 104.7619 | 8800 | 3.0418 | 0.0047 | 0.017 | 0.0008 | 0.0047 | -1.0 | -1.0 | 0.0017 | 0.0114 | 0.0416 | 0.0416 | -1.0 | -1.0 | 0.0111 | 0.1218 | 0.0078 | 0.027 | 0.0 | 0.0 | 0.0 | 0.0176 |
| 3.3833 | 105.3571 | 8850 | 3.0609 | 0.0047 | 0.0152 | 0.0007 | 0.0047 | -1.0 | -1.0 | 0.0019 | 0.0108 | 0.0398 | 0.0398 | -1.0 | -1.0 | 0.0102 | 0.1157 | 0.0085 | 0.0251 | 0.0 | 0.0 | 0.0 | 0.0182 |
| 4.1088 | 105.9524 | 8900 | 3.0393 | 0.005 | 0.017 | 0.0009 | 0.005 | -1.0 | -1.0 | 0.0017 | 0.0117 | 0.0419 | 0.0419 | -1.0 | -1.0 | 0.0113 | 0.1213 | 0.0088 | 0.027 | 0.0 | 0.0 | 0.0 | 0.0194 |
| 3.0827 | 106.5476 | 8950 | 3.0428 | 0.0043 | 0.0172 | 0.0006 | 0.0043 | -1.0 | -1.0 | 0.0014 | 0.0099 | 0.0401 | 0.0401 | -1.0 | -1.0 | 0.0108 | 0.1185 | 0.0063 | 0.026 | 0.0 | 0.0 | 0.0 | 0.0159 |
| 4.1679 | 107.1429 | 9000 | 3.0473 | 0.0045 | 0.0176 | 0.0009 | 0.0045 | -1.0 | -1.0 | 0.0024 | 0.0117 | 0.0425 | 0.0425 | -1.0 | -1.0 | 0.0111 | 0.1222 | 0.0069 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0194 |
| 3.6622 | 107.7381 | 9050 | 3.0389 | 0.0046 | 0.0175 | 0.0008 | 0.0046 | -1.0 | -1.0 | 0.0022 | 0.0122 | 0.0433 | 0.0433 | -1.0 | -1.0 | 0.0113 | 0.1255 | 0.007 | 0.0288 | 0.0 | 0.0 | 0.0 | 0.0188 |
| 1.6864 | 108.3333 | 9100 | 3.0567 | 0.0045 | 0.0161 | 0.0009 | 0.0045 | -1.0 | -1.0 | 0.0017 | 0.0109 | 0.0417 | 0.0417 | -1.0 | -1.0 | 0.0103 | 0.1185 | 0.0075 | 0.0288 | 0.0 | 0.0 | 0.0001 | 0.0194 |
| 2.7387 | 108.9286 | 9150 | 3.0351 | 0.0048 | 0.0159 | 0.0009 | 0.0048 | -1.0 | -1.0 | 0.0021 | 0.0107 | 0.0414 | 0.0414 | -1.0 | -1.0 | 0.0105 | 0.1185 | 0.0084 | 0.0307 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 2.5301 | 109.5238 | 9200 | 3.0230 | 0.0046 | 0.0179 | 0.001 | 0.0046 | -1.0 | -1.0 | 0.0022 | 0.0109 | 0.0428 | 0.0428 | -1.0 | -1.0 | 0.0108 | 0.1241 | 0.0077 | 0.0312 | 0.0 | 0.0 | 0.0 | 0.0159 |
| 3.3649 | 110.1190 | 9250 | 3.0249 | 0.0046 | 0.0156 | 0.0006 | 0.0046 | -1.0 | -1.0 | 0.002 | 0.0113 | 0.0404 | 0.0404 | -1.0 | -1.0 | 0.0105 | 0.1181 | 0.0077 | 0.0288 | 0.0 | 0.0 | 0.0 | 0.0147 |
| 3.6737 | 110.7143 | 9300 | 3.0224 | 0.0048 | 0.0162 | 0.0008 | 0.0048 | -1.0 | -1.0 | 0.0022 | 0.0123 | 0.0418 | 0.0418 | -1.0 | -1.0 | 0.0117 | 0.1245 | 0.0076 | 0.0279 | 0.0 | 0.0 | 0.0 | 0.0147 |
| 3.7058 | 111.3095 | 9350 | 3.0062 | 0.0047 | 0.0155 | 0.0009 | 0.0047 | -1.0 | -1.0 | 0.0022 | 0.0124 | 0.0419 | 0.0419 | -1.0 | -1.0 | 0.0113 | 0.1222 | 0.0073 | 0.0288 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 4.4911 | 111.9048 | 9400 | 3.0124 | 0.0047 | 0.0149 | 0.0006 | 0.0047 | -1.0 | -1.0 | 0.0015 | 0.0113 | 0.0411 | 0.0411 | -1.0 | -1.0 | 0.011 | 0.1194 | 0.0078 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 3.5137 | 112.5 | 9450 | 3.0127 | 0.0047 | 0.0153 | 0.0008 | 0.0047 | -1.0 | -1.0 | 0.0019 | 0.0121 | 0.0422 | 0.0422 | -1.0 | -1.0 | 0.0117 | 0.1245 | 0.0071 | 0.0279 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 3.6468 | 113.0952 | 9500 | 3.0071 | 0.0047 | 0.015 | 0.0006 | 0.0047 | -1.0 | -1.0 | 0.0015 | 0.0116 | 0.0416 | 0.0416 | -1.0 | -1.0 | 0.0115 | 0.1227 | 0.0073 | 0.0274 | 0.0 | 0.0 | 0.0 | 0.0165 |
| 2.9878 | 113.6905 | 9550 | 3.0092 | 0.0045 | 0.0158 | 0.0006 | 0.0045 | -1.0 | -1.0 | 0.0019 | 0.0118 | 0.0412 | 0.0412 | -1.0 | -1.0 | 0.0113 | 0.1199 | 0.0069 | 0.0279 | 0.0 | 0.0 | 0.0 | 0.0171 |
| 4.1334 | 114.2857 | 9600 | 3.0088 | 0.0047 | 0.0169 | 0.001 | 0.0047 | -1.0 | -1.0 | 0.0019 | 0.0121 | 0.0424 | 0.0424 | -1.0 | -1.0 | 0.0116 | 0.1236 | 0.0071 | 0.0284 | 0.0 | 0.0 | 0.0 | 0.0176 |
| 3.0444 | 114.8810 | 9650 | 3.0066 | 0.0048 | 0.0178 | 0.0009 | 0.0048 | -1.0 | -1.0 | 0.0019 | 0.0118 | 0.0433 | 0.0433 | -1.0 | -1.0 | 0.0116 | 0.1231 | 0.0078 | 0.0288 | 0.0 | 0.0 | 0.0001 | 0.0212 |
| 1.6588 | 115.4762 | 9700 | 3.0116 | 0.0046 | 0.0154 | 0.0009 | 0.0046 | -1.0 | -1.0 | 0.002 | 0.0118 | 0.0418 | 0.0418 | -1.0 | -1.0 | 0.0109 | 0.1199 | 0.0075 | 0.0284 | 0.0 | 0.0 | 0.0001 | 0.0188 |
| 3.1918 | 116.0714 | 9750 | 3.0103 | 0.0048 | 0.0171 | 0.0009 | 0.0048 | -1.0 | -1.0 | 0.0019 | 0.0119 | 0.0429 | 0.0429 | -1.0 | -1.0 | 0.0117 | 0.1227 | 0.0077 | 0.0288 | 0.0 | 0.0 | 0.0001 | 0.02 |
| 1.7329 | 116.6667 | 9800 | 3.0005 | 0.0047 | 0.0164 | 0.0009 | 0.0047 | -1.0 | -1.0 | 0.002 | 0.0121 | 0.0416 | 0.0416 | -1.0 | -1.0 | 0.011 | 0.1199 | 0.0076 | 0.0288 | 0.0 | 0.0 | 0.0 | 0.0176 |
| 3.5186 | 117.2619 | 9850 | 2.9999 | 0.0049 | 0.0168 | 0.0012 | 0.0049 | -1.0 | -1.0 | 0.002 | 0.0122 | 0.0427 | 0.0427 | -1.0 | -1.0 | 0.0112 | 0.1218 | 0.0082 | 0.0321 | 0.0 | 0.0 | 0.0 | 0.0171 |
| 1.7971 | 117.8571 | 9900 | 3.0009 | 0.0048 | 0.0172 | 0.001 | 0.0048 | -1.0 | -1.0 | 0.0017 | 0.0121 | 0.0437 | 0.0437 | -1.0 | -1.0 | 0.0116 | 0.1259 | 0.0077 | 0.0316 | 0.0 | 0.0 | 0.0 | 0.0171 |
| 3.9121 | 118.4524 | 9950 | 3.0010 | 0.0048 | 0.0164 | 0.001 | 0.0048 | -1.0 | -1.0 | 0.0019 | 0.0117 | 0.0436 | 0.0436 | -1.0 | -1.0 | 0.0117 | 0.1264 | 0.0076 | 0.0302 | 0.0 | 0.0 | 0.0 | 0.0176 |
| 2.9464 | 119.0476 | 10000 | 3.0007 | 0.0047 | 0.0173 | 0.0009 | 0.0047 | -1.0 | -1.0 | 0.0017 | 0.0117 | 0.0426 | 0.0426 | -1.0 | -1.0 | 0.0113 | 0.1227 | 0.0075 | 0.0307 | 0.0 | 0.0 | 0.0 | 0.0171 |
### Framework versions
- Transformers 4.48.0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"recilia dorsalis",
"nephotettix malayanus",
"sogatella furcifera",
"nilaparvata lugens"
] |
pylu5229/conditional-detr-resnet-50-uLED-obj-detect-test |
# conditional-detr-resnet-50-uLED-obj-detect-test
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0912
- Map: 0.9334
- Map 50: 0.9684
- Map 75: 0.9684
- Map Small: -1.0
- Map Medium: 0.9334
- Map Large: -1.0
- Mar 1: 0.0125
- Mar 10: 0.1259
- Mar 100: 0.9777
- Mar Small: -1.0
- Mar Medium: 0.9777
- Mar Large: -1.0
- Map Uled: 0.9334
- Mar 100 Uled: 0.9777
- Map Trash: -1.0
- Mar 100 Trash: -1.0
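The per-class metrics above come from scoring the model's predicted boxes against the ground truth. As a rough, hedged sketch of the detection post-processing step (assuming DETR-style outputs of normalized `(cx, cy, w, h)` boxes plus per-query scores; this is illustrative, not the card's actual code):

```python
# Illustrative sketch only: converts DETR-style normalized (cx, cy, w, h)
# boxes to absolute (x1, y1, x2, y2) and keeps confident detections.
# The threshold and image size here are assumptions, not values from this card.

def cxcywh_to_xyxy(box, img_w, img_h):
    """Convert a normalized (cx, cy, w, h) box to absolute (x1, y1, x2, y2)."""
    cx, cy, w, h = box
    return (
        (cx - w / 2) * img_w,
        (cy - h / 2) * img_h,
        (cx + w / 2) * img_w,
        (cy + h / 2) * img_h,
    )

def filter_detections(boxes, scores, labels, threshold=0.5, img_w=640, img_h=480):
    """Keep (label, score, xyxy_box) triples whose score meets the threshold."""
    kept = []
    for box, score, label in zip(boxes, scores, labels):
        if score >= threshold:
            kept.append((label, score, cxcywh_to_xyxy(box, img_w, img_h)))
    return kept
```

In the Transformers library this conversion and thresholding is normally handled by the image processor's post-processing utilities rather than written by hand.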
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
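With `lr_scheduler_type: cosine` and no restarts, the learning rate decays from 5e-05 toward 0 over the 1230 training steps shown in the results table. A minimal pure-Python sketch of that schedule shape (warmup_steps=0 is an assumption, since the card lists no warmup):

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-5, warmup_steps=0):
    """Cosine-annealed learning rate without restarts, matching the shape of
    the Transformers cosine scheduler. warmup_steps=0 is an assumption."""
    if step < warmup_steps:
        # Linear warmup from 0 to base_lr.
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

At the halfway point (step 615 of 1230) this gives exactly half the base learning rate, and it reaches 0 at the final step.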
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Uled | Mar 100 Uled | Map Trash | Mar 100 Trash |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------:|:------------:|:---------:|:-------------:|
| No log | 1.0 | 41 | 0.2460 | 0.7925 | 0.9619 | 0.9382 | -1.0 | 0.7925 | -1.0 | 0.0115 | 0.1133 | 0.8652 | -1.0 | 0.8652 | -1.0 | 0.7925 | 0.8652 | -1.0 | -1.0 |
| No log | 2.0 | 82 | 0.2123 | 0.8121 | 0.9671 | 0.9527 | -1.0 | 0.8121 | -1.0 | 0.0111 | 0.1125 | 0.8797 | -1.0 | 0.8797 | -1.0 | 0.8121 | 0.8797 | -1.0 | -1.0 |
| No log | 3.0 | 123 | 0.1597 | 0.8576 | 0.9645 | 0.963 | -1.0 | 0.8576 | -1.0 | 0.0118 | 0.1181 | 0.9217 | -1.0 | 0.9217 | -1.0 | 0.8576 | 0.9217 | -1.0 | -1.0 |
| No log | 4.0 | 164 | 0.1645 | 0.8532 | 0.9644 | 0.9606 | -1.0 | 0.8532 | -1.0 | 0.0118 | 0.1184 | 0.9174 | -1.0 | 0.9174 | -1.0 | 0.8532 | 0.9174 | -1.0 | -1.0 |
| No log | 5.0 | 205 | 0.2037 | 0.824 | 0.9632 | 0.9614 | -1.0 | 0.824 | -1.0 | 0.0115 | 0.1142 | 0.8826 | -1.0 | 0.8826 | -1.0 | 0.824 | 0.8826 | -1.0 | -1.0 |
| No log | 6.0 | 246 | 0.1342 | 0.8864 | 0.9672 | 0.9665 | -1.0 | 0.8864 | -1.0 | 0.0119 | 0.1213 | 0.9429 | -1.0 | 0.9429 | -1.0 | 0.8864 | 0.9429 | -1.0 | -1.0 |
| No log | 7.0 | 287 | 0.1365 | 0.8821 | 0.9677 | 0.9672 | -1.0 | 0.8821 | -1.0 | 0.0121 | 0.1218 | 0.9362 | -1.0 | 0.9362 | -1.0 | 0.8821 | 0.9362 | -1.0 | -1.0 |
| No log | 8.0 | 328 | 0.1470 | 0.872 | 0.9666 | 0.9662 | -1.0 | 0.872 | -1.0 | 0.0119 | 0.12 | 0.9326 | -1.0 | 0.9326 | -1.0 | 0.872 | 0.9326 | -1.0 | -1.0 |
| No log | 9.0 | 369 | 0.1783 | 0.8495 | 0.9678 | 0.9673 | -1.0 | 0.8495 | -1.0 | 0.0118 | 0.118 | 0.9017 | -1.0 | 0.9017 | -1.0 | 0.8495 | 0.9017 | -1.0 | -1.0 |
| No log | 10.0 | 410 | 0.1563 | 0.8676 | 0.9662 | 0.9643 | -1.0 | 0.8676 | -1.0 | 0.012 | 0.1203 | 0.9225 | -1.0 | 0.9225 | -1.0 | 0.8676 | 0.9225 | -1.0 | -1.0 |
| No log | 11.0 | 451 | 0.1458 | 0.8783 | 0.966 | 0.9658 | -1.0 | 0.8783 | -1.0 | 0.012 | 0.121 | 0.9321 | -1.0 | 0.9321 | -1.0 | 0.8783 | 0.9321 | -1.0 | -1.0 |
| No log | 12.0 | 492 | 0.1273 | 0.8939 | 0.9669 | 0.9667 | -1.0 | 0.8939 | -1.0 | 0.0123 | 0.1234 | 0.9462 | -1.0 | 0.9462 | -1.0 | 0.8939 | 0.9462 | -1.0 | -1.0 |
| 0.2348 | 13.0 | 533 | 0.1376 | 0.8862 | 0.9683 | 0.968 | -1.0 | 0.8862 | -1.0 | 0.0121 | 0.1217 | 0.9404 | -1.0 | 0.9404 | -1.0 | 0.8862 | 0.9404 | -1.0 | -1.0 |
| 0.2348 | 14.0 | 574 | 0.1338 | 0.8865 | 0.9669 | 0.9668 | -1.0 | 0.8865 | -1.0 | 0.0122 | 0.1222 | 0.9422 | -1.0 | 0.9422 | -1.0 | 0.8865 | 0.9422 | -1.0 | -1.0 |
| 0.2348 | 15.0 | 615 | 0.1258 | 0.8917 | 0.9685 | 0.9685 | -1.0 | 0.8917 | -1.0 | 0.012 | 0.1221 | 0.9454 | -1.0 | 0.9454 | -1.0 | 0.8917 | 0.9454 | -1.0 | -1.0 |
| 0.2348 | 16.0 | 656 | 0.1206 | 0.8998 | 0.9689 | 0.9689 | -1.0 | 0.8998 | -1.0 | 0.0123 | 0.1233 | 0.9524 | -1.0 | 0.9524 | -1.0 | 0.8998 | 0.9524 | -1.0 | -1.0 |
| 0.2348 | 17.0 | 697 | 0.1075 | 0.911 | 0.969 | 0.969 | -1.0 | 0.911 | -1.0 | 0.0123 | 0.1238 | 0.9612 | -1.0 | 0.9612 | -1.0 | 0.911 | 0.9612 | -1.0 | -1.0 |
| 0.2348 | 18.0 | 738 | 0.1084 | 0.9113 | 0.9692 | 0.9691 | -1.0 | 0.9113 | -1.0 | 0.0123 | 0.1237 | 0.9628 | -1.0 | 0.9628 | -1.0 | 0.9113 | 0.9628 | -1.0 | -1.0 |
| 0.2348 | 19.0 | 779 | 0.1104 | 0.91 | 0.9688 | 0.9688 | -1.0 | 0.91 | -1.0 | 0.0123 | 0.1236 | 0.9602 | -1.0 | 0.9602 | -1.0 | 0.91 | 0.9602 | -1.0 | -1.0 |
| 0.2348 | 20.0 | 820 | 0.1097 | 0.9103 | 0.9693 | 0.9693 | -1.0 | 0.9103 | -1.0 | 0.0123 | 0.1241 | 0.9616 | -1.0 | 0.9616 | -1.0 | 0.9103 | 0.9616 | -1.0 | -1.0 |
| 0.2348 | 21.0 | 861 | 0.1111 | 0.9106 | 0.9666 | 0.9665 | -1.0 | 0.9106 | -1.0 | 0.0123 | 0.1242 | 0.9624 | -1.0 | 0.9624 | -1.0 | 0.9106 | 0.9624 | -1.0 | -1.0 |
| 0.2348 | 22.0 | 902 | 0.1007 | 0.923 | 0.9667 | 0.9666 | -1.0 | 0.923 | -1.0 | 0.0125 | 0.1251 | 0.972 | -1.0 | 0.972 | -1.0 | 0.923 | 0.972 | -1.0 | -1.0 |
| 0.2348 | 23.0 | 943 | 0.1080 | 0.9103 | 0.9671 | 0.9671 | -1.0 | 0.9103 | -1.0 | 0.0123 | 0.1242 | 0.9612 | -1.0 | 0.9612 | -1.0 | 0.9103 | 0.9612 | -1.0 | -1.0 |
| 0.2348 | 24.0 | 984 | 0.0987 | 0.9197 | 0.967 | 0.967 | -1.0 | 0.9197 | -1.0 | 0.0124 | 0.1253 | 0.9697 | -1.0 | 0.9697 | -1.0 | 0.9197 | 0.9697 | -1.0 | -1.0 |
| 0.1648 | 25.0 | 1025 | 0.0979 | 0.9226 | 0.9675 | 0.9675 | -1.0 | 0.9226 | -1.0 | 0.0125 | 0.1253 | 0.9715 | -1.0 | 0.9715 | -1.0 | 0.9226 | 0.9715 | -1.0 | -1.0 |
| 0.1648 | 26.0 | 1066 | 0.0912 | 0.9334 | 0.9684 | 0.9684 | -1.0 | 0.9334 | -1.0 | 0.0125 | 0.1259 | 0.9777 | -1.0 | 0.9777 | -1.0 | 0.9334 | 0.9777 | -1.0 | -1.0 |
| 0.1648 | 27.0 | 1107 | 0.0926 | 0.9311 | 0.9682 | 0.9682 | -1.0 | 0.9311 | -1.0 | 0.0125 | 0.1258 | 0.9763 | -1.0 | 0.9763 | -1.0 | 0.9311 | 0.9763 | -1.0 | -1.0 |
| 0.1648 | 28.0 | 1148 | 0.0933 | 0.9301 | 0.9682 | 0.9681 | -1.0 | 0.9301 | -1.0 | 0.0125 | 0.1258 | 0.9756 | -1.0 | 0.9756 | -1.0 | 0.9301 | 0.9756 | -1.0 | -1.0 |
| 0.1648 | 29.0 | 1189 | 0.0937 | 0.9301 | 0.9682 | 0.9681 | -1.0 | 0.9301 | -1.0 | 0.0125 | 0.1259 | 0.9758 | -1.0 | 0.9758 | -1.0 | 0.9301 | 0.9758 | -1.0 | -1.0 |
| 0.1648 | 30.0 | 1230 | 0.0932 | 0.9311 | 0.9682 | 0.9681 | -1.0 | 0.9311 | -1.0 | 0.0125 | 0.126 | 0.9763 | -1.0 | 0.9763 | -1.0 | 0.9311 | 0.9763 | -1.0 | -1.0 |
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"uled",
"trash"
] |
Ilhamfaisal/detr-resnet-50-dc5-grasshopper-testdata-noaug-maxsteps-10000-batchsize-2-ilham |
# detr-resnet-50-dc5-grasshopper-testdata-noaug-maxsteps-10000-batchsize-2-ilham
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unnamed local dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4932
- Map: 0.0333
- Map 50: 0.0821
- Map 75: 0.02
- Map Small: 0.0053
- Map Medium: 0.0338
- Map Large: 0.0505
- Mar 1: 0.0161
- Mar 10: 0.0587
- Mar 100: 0.1495
- Mar Small: 0.0143
- Mar Medium: 0.1522
- Mar Large: 0.05
- Map Recilia dorsalis: 0.0958
- Mar 100 Recilia dorsalis: 0.3407
- Map Nephotettix malayanus: 0.0345
- Mar 100 Nephotettix malayanus: 0.1335
- Map Sogatella furcifera: 0.0018
- Mar 100 Sogatella furcifera: 0.0409
- Map Nilaparvata lugens: 0.0011
- Mar 100 Nilaparvata lugens: 0.0829
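The Map 50 and Map 75 figures above are average precision evaluated at IoU thresholds of 0.50 and 0.75: a prediction counts as correct only if its box overlaps a ground-truth box by at least that intersection-over-union. A minimal sketch of the IoU computation underlying those thresholds (illustrative only; evaluation is normally done with a COCO-style toolkit):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned (x1, y1, x2, y2) boxes."""
    # Intersection rectangle (empty if the boxes do not overlap).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

Under this definition a perfect match scores 1.0 and disjoint boxes score 0.0, which is why Map 75 (the stricter threshold) is always at or below Map 50 in the tables above.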
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
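With `lr_scheduler_type: linear` and `training_steps: 10000`, the learning rate falls linearly from 1e-05 to 0 over the run. A minimal pure-Python sketch of that schedule shape (warmup_steps=0 is an assumption, since the card lists no warmup):

```python
def linear_lr(step, total_steps=10000, base_lr=1e-5, warmup_steps=0):
    """Linearly decaying learning rate, the shape of the Transformers linear
    scheduler. warmup_steps=0 is an assumption (the card lists no warmup)."""
    if step < warmup_steps:
        # Linear warmup from 0 to base_lr.
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * remaining
```

So at step 5000 the rate is half the base value, and by the final step it has decayed to 0.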
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Recilia dorsalis | Mar 100 Recilia dorsalis | Map Nephotettix malayanus | Mar 100 Nephotettix malayanus | Map Sogatella furcifera | Mar 100 Sogatella furcifera | Map Nilaparvata lugens | Mar 100 Nilaparvata lugens |
|:-------------:|:--------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------:|:------------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:---------------------------:|:----------------------:|:--------------------------:|
| 7.4546 | 0.5952 | 50 | 6.1204 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0 | 0.0006 | 0.0037 | 0.0067 | 0.0 | 0.0074 | 0.0 | 0.0 | 0.0023 | 0.0001 | 0.0042 | 0.0001 | 0.0205 | 0.0 | 0.0 |
| 3.0891 | 1.1905 | 100 | 5.5380 | 0.0004 | 0.0018 | 0.0001 | 0.0 | 0.0021 | 0.0 | 0.0022 | 0.0081 | 0.0191 | 0.0 | 0.0201 | 0.0 | 0.0009 | 0.0301 | 0.0003 | 0.0167 | 0.0006 | 0.0295 | 0.0 | 0.0 |
| 7.0419 | 1.7857 | 150 | 4.9099 | 0.0036 | 0.0137 | 0.0011 | 0.0 | 0.0041 | 0.0 | 0.0066 | 0.0234 | 0.0445 | 0.0 | 0.0465 | 0.0 | 0.0111 | 0.0949 | 0.0007 | 0.0228 | 0.0024 | 0.0545 | 0.0 | 0.0059 |
| 4.7901 | 2.3810 | 200 | 4.0027 | 0.0039 | 0.0118 | 0.0005 | 0.0 | 0.0059 | 0.0 | 0.0037 | 0.0128 | 0.0421 | 0.0 | 0.0426 | 0.0 | 0.0083 | 0.1477 | 0.0001 | 0.0005 | 0.0074 | 0.0114 | 0.0 | 0.0088 |
| 2.9157 | 2.9762 | 250 | 3.8153 | 0.0009 | 0.004 | 0.0002 | 0.0 | 0.001 | 0.0 | 0.0015 | 0.0064 | 0.0325 | 0.0 | 0.0325 | 0.0 | 0.0037 | 0.1301 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.625 | 3.5714 | 300 | 3.6508 | 0.0043 | 0.0124 | 0.0004 | 0.0 | 0.0044 | 0.0 | 0.0023 | 0.0092 | 0.0442 | 0.0 | 0.0442 | 0.0 | 0.0174 | 0.175 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 |
| 2.3755 | 4.1667 | 350 | 3.6447 | 0.0079 | 0.0234 | 0.0018 | 0.0 | 0.0081 | 0.0 | 0.0028 | 0.0136 | 0.0401 | 0.0 | 0.0402 | 0.0 | 0.0316 | 0.1523 | 0.0001 | 0.0033 | 0.0 | 0.0 | 0.0 | 0.0047 |
| 4.4918 | 4.7619 | 400 | 3.5190 | 0.007 | 0.0241 | 0.0028 | 0.0 | 0.0072 | 0.0 | 0.002 | 0.0108 | 0.0529 | 0.0 | 0.0531 | 0.0 | 0.028 | 0.2019 | 0.0001 | 0.0014 | 0.0 | 0.0 | 0.0 | 0.0082 |
| 4.3961 | 5.3571 | 450 | 3.4433 | 0.0072 | 0.0245 | 0.0017 | 0.0 | 0.0075 | 0.0 | 0.0021 | 0.0119 | 0.0509 | 0.0 | 0.0515 | 0.0 | 0.027 | 0.1736 | 0.0018 | 0.0065 | 0.0 | 0.0 | 0.0001 | 0.0235 |
| 2.0585 | 5.9524 | 500 | 3.3878 | 0.009 | 0.0291 | 0.0015 | 0.0 | 0.0091 | 0.0 | 0.0015 | 0.0117 | 0.0581 | 0.0 | 0.0587 | 0.0 | 0.0353 | 0.2032 | 0.0005 | 0.0028 | 0.0 | 0.0 | 0.0001 | 0.0265 |
| 3.295 | 6.5476 | 550 | 3.2713 | 0.0104 | 0.0348 | 0.0027 | 0.0 | 0.0105 | 0.0 | 0.0037 | 0.0168 | 0.0617 | 0.0 | 0.0627 | 0.0 | 0.0359 | 0.2005 | 0.0054 | 0.0065 | 0.0 | 0.0 | 0.0002 | 0.04 |
| 3.108 | 7.1429 | 600 | 3.1867 | 0.016 | 0.0475 | 0.0059 | 0.0 | 0.0162 | 0.0 | 0.0045 | 0.019 | 0.0735 | 0.0 | 0.0745 | 0.0 | 0.0569 | 0.2468 | 0.0069 | 0.0033 | 0.0 | 0.0 | 0.0002 | 0.0441 |
| 4.2217 | 7.7381 | 650 | 3.1803 | 0.0162 | 0.0508 | 0.0049 | 0.0 | 0.0163 | 0.0 | 0.0032 | 0.0161 | 0.0682 | 0.0 | 0.0685 | 0.0 | 0.0649 | 0.2556 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0171 |
| 3.058 | 8.3333 | 700 | 3.2050 | 0.0119 | 0.0376 | 0.0022 | 0.0 | 0.012 | 0.0 | 0.0032 | 0.0134 | 0.0618 | 0.0 | 0.0623 | 0.0 | 0.043 | 0.2213 | 0.0045 | 0.0023 | 0.0 | 0.0 | 0.0001 | 0.0235 |
| 3.2302 | 8.9286 | 750 | 3.1112 | 0.0134 | 0.0437 | 0.0047 | 0.0 | 0.0135 | 0.0 | 0.0037 | 0.0142 | 0.0712 | 0.0 | 0.0719 | 0.0 | 0.052 | 0.2509 | 0.0015 | 0.0014 | 0.0 | 0.0 | 0.0001 | 0.0324 |
| 2.3508 | 9.5238 | 800 | 3.0767 | 0.0131 | 0.0482 | 0.0036 | 0.0 | 0.0131 | 0.0 | 0.0039 | 0.0163 | 0.0622 | 0.0 | 0.0627 | 0.0 | 0.0443 | 0.2245 | 0.0079 | 0.0037 | 0.0 | 0.0 | 0.0001 | 0.0206 |
| 3.237 | 10.1190 | 850 | 3.0410 | 0.0115 | 0.0432 | 0.0018 | 0.0 | 0.0115 | 0.0 | 0.0032 | 0.011 | 0.0651 | 0.0 | 0.0655 | 0.0 | 0.0458 | 0.2403 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.02 |
| 3.3086 | 10.7143 | 900 | 3.0368 | 0.0143 | 0.0432 | 0.004 | 0.0 | 0.0144 | 0.0 | 0.0042 | 0.0124 | 0.0722 | 0.0 | 0.073 | 0.0 | 0.0522 | 0.2537 | 0.005 | 0.0023 | 0.0 | 0.0 | 0.0002 | 0.0329 |
| 3.5539 | 11.3095 | 950 | 3.1159 | 0.0171 | 0.0479 | 0.0077 | 0.0 | 0.0172 | 0.0 | 0.0047 | 0.0164 | 0.072 | 0.0 | 0.0724 | 0.0 | 0.0613 | 0.263 | 0.0073 | 0.0037 | 0.0 | 0.0 | 0.0001 | 0.0212 |
| 1.9724 | 11.9048 | 1000 | 3.0500 | 0.0134 | 0.039 | 0.0057 | 0.0 | 0.0135 | 0.0 | 0.0045 | 0.0176 | 0.0734 | 0.0 | 0.074 | 0.0 | 0.0449 | 0.2574 | 0.0085 | 0.0079 | 0.0 | 0.0 | 0.0001 | 0.0282 |
| 2.4243 | 12.5 | 1050 | 3.0655 | 0.0112 | 0.0373 | 0.0045 | 0.0 | 0.0113 | 0.0 | 0.0043 | 0.0169 | 0.068 | 0.0 | 0.0685 | 0.0 | 0.036 | 0.238 | 0.0089 | 0.0098 | 0.0 | 0.0 | 0.0001 | 0.0241 |
| 3.4438 | 13.0952 | 1100 | 3.1563 | 0.0133 | 0.0417 | 0.002 | 0.0 | 0.0134 | 0.0 | 0.0028 | 0.0148 | 0.0733 | 0.0 | 0.074 | 0.0 | 0.0504 | 0.2574 | 0.0026 | 0.0033 | 0.0 | 0.0 | 0.0001 | 0.0324 |
| 1.7927 | 13.6905 | 1150 | 3.0797 | 0.0185 | 0.0525 | 0.0081 | 0.0 | 0.0186 | 0.0 | 0.0046 | 0.0186 | 0.0814 | 0.0 | 0.0821 | 0.0 | 0.0669 | 0.2931 | 0.0069 | 0.0033 | 0.0 | 0.0 | 0.0001 | 0.0294 |
| 2.5378 | 14.2857 | 1200 | 3.0169 | 0.015 | 0.0434 | 0.0068 | 0.0 | 0.0152 | 0.0 | 0.0048 | 0.0178 | 0.0777 | 0.0 | 0.0784 | 0.0 | 0.0502 | 0.269 | 0.0099 | 0.0088 | 0.0 | 0.0 | 0.0001 | 0.0329 |
| 1.8713 | 14.8810 | 1250 | 2.9527 | 0.0132 | 0.0365 | 0.0052 | 0.0 | 0.0133 | 0.0 | 0.0038 | 0.0169 | 0.076 | 0.0 | 0.0768 | 0.0 | 0.0432 | 0.2593 | 0.0094 | 0.013 | 0.0 | 0.0 | 0.0001 | 0.0318 |
| 2.651 | 15.4762 | 1300 | 2.9760 | 0.0141 | 0.0387 | 0.0066 | 0.0 | 0.0142 | 0.0 | 0.0038 | 0.0147 | 0.0797 | 0.0 | 0.0803 | 0.0 | 0.049 | 0.2856 | 0.0072 | 0.0079 | 0.0 | 0.0 | 0.0001 | 0.0253 |
| 2.1526 | 16.0714 | 1350 | 3.1095 | 0.0128 | 0.0371 | 0.0058 | 0.0 | 0.0129 | 0.0 | 0.0029 | 0.014 | 0.0783 | 0.0 | 0.079 | 0.0 | 0.0433 | 0.2759 | 0.0079 | 0.0079 | 0.0 | 0.0 | 0.0001 | 0.0294 |
| 3.8567 | 16.6667 | 1400 | 3.0147 | 0.013 | 0.0376 | 0.0057 | 0.0 | 0.0131 | 0.0 | 0.005 | 0.0161 | 0.077 | 0.0 | 0.0775 | 0.0 | 0.0461 | 0.2764 | 0.0058 | 0.0116 | 0.0 | 0.0 | 0.0001 | 0.02 |
| 1.9783 | 17.2619 | 1450 | 3.0208 | 0.0142 | 0.0381 | 0.0076 | 0.0 | 0.0143 | 0.0 | 0.0052 | 0.0189 | 0.079 | 0.0 | 0.0796 | 0.0 | 0.0483 | 0.2824 | 0.0084 | 0.0107 | 0.0 | 0.0 | 0.0001 | 0.0229 |
| 1.7907 | 17.8571 | 1500 | 2.9505 | 0.0168 | 0.0502 | 0.0073 | 0.0 | 0.0169 | 0.0 | 0.0046 | 0.0202 | 0.0861 | 0.0 | 0.0868 | 0.0 | 0.0568 | 0.3037 | 0.0102 | 0.0112 | 0.0 | 0.0 | 0.0001 | 0.0294 |
| 3.0093 | 18.4524 | 1550 | 2.9710 | 0.0155 | 0.0443 | 0.0062 | 0.0 | 0.0156 | 0.0 | 0.0049 | 0.0173 | 0.0831 | 0.0 | 0.0838 | 0.0 | 0.0564 | 0.2944 | 0.0055 | 0.0074 | 0.0 | 0.0 | 0.0001 | 0.0306 |
| 2.1716 | 19.0476 | 1600 | 2.9435 | 0.0133 | 0.0365 | 0.0055 | 0.0 | 0.0133 | 0.0 | 0.0043 | 0.0185 | 0.0816 | 0.0 | 0.0825 | 0.0 | 0.0464 | 0.2778 | 0.0064 | 0.0112 | 0.0 | 0.0 | 0.0002 | 0.0376 |
| 2.0769 | 19.6429 | 1650 | 2.9077 | 0.0145 | 0.0415 | 0.0086 | 0.0 | 0.0145 | 0.0 | 0.0064 | 0.0191 | 0.0842 | 0.0 | 0.0851 | 0.0 | 0.0493 | 0.2852 | 0.0084 | 0.0121 | 0.0 | 0.0 | 0.0002 | 0.0394 |
| 2.0751 | 20.2381 | 1700 | 2.9094 | 0.0146 | 0.0425 | 0.0075 | 0.0 | 0.0146 | 0.0 | 0.0028 | 0.0183 | 0.0811 | 0.0 | 0.0819 | 0.0 | 0.0521 | 0.2843 | 0.0059 | 0.0042 | 0.0 | 0.0 | 0.0002 | 0.0359 |
| 2.2801 | 20.8333 | 1750 | 2.8951 | 0.0141 | 0.0413 | 0.0074 | 0.0 | 0.0141 | 0.0 | 0.0037 | 0.0151 | 0.0851 | 0.0 | 0.0861 | 0.0 | 0.0492 | 0.287 | 0.0068 | 0.0074 | 0.0 | 0.0 | 0.0002 | 0.0459 |
| 2.4332 | 21.4286 | 1800 | 2.9038 | 0.0142 | 0.039 | 0.0059 | 0.0 | 0.0142 | 0.0 | 0.0037 | 0.015 | 0.0804 | 0.0 | 0.0814 | 0.0 | 0.0485 | 0.2745 | 0.0079 | 0.0042 | 0.0 | 0.0 | 0.0002 | 0.0429 |
| 3.6742 | 22.0238 | 1850 | 2.7914 | 0.017 | 0.0465 | 0.0071 | 0.0 | 0.0171 | 0.0 | 0.0046 | 0.0224 | 0.0892 | 0.0 | 0.0904 | 0.0 | 0.0591 | 0.2944 | 0.0086 | 0.007 | 0.0 | 0.0 | 0.0003 | 0.0553 |
| 2.6172 | 22.6190 | 1900 | 2.8617 | 0.0152 | 0.0403 | 0.0106 | 0.0 | 0.0154 | 0.0 | 0.0066 | 0.0219 | 0.0873 | 0.0 | 0.0884 | 0.0 | 0.0521 | 0.2898 | 0.0082 | 0.0112 | 0.0 | 0.0 | 0.0003 | 0.0482 |
| 1.4784 | 23.2143 | 1950 | 2.8535 | 0.0176 | 0.0447 | 0.0106 | 0.0 | 0.0177 | 0.0 | 0.006 | 0.0197 | 0.091 | 0.0 | 0.0918 | 0.0 | 0.0613 | 0.3185 | 0.009 | 0.0126 | 0.0 | 0.0 | 0.0002 | 0.0329 |
| 1.6666 | 23.8095 | 2000 | 2.8936 | 0.0201 | 0.0557 | 0.0093 | 0.0 | 0.0202 | 0.0 | 0.0045 | 0.0197 | 0.0925 | 0.0 | 0.0933 | 0.0 | 0.0748 | 0.3231 | 0.0055 | 0.0098 | 0.0 | 0.0 | 0.0002 | 0.0371 |
| 2.492 | 24.4048 | 2050 | 2.8839 | 0.0199 | 0.0541 | 0.0103 | 0.0 | 0.02 | 0.0 | 0.006 | 0.0198 | 0.0887 | 0.0 | 0.0893 | 0.0 | 0.0721 | 0.3181 | 0.0076 | 0.0107 | 0.0 | 0.0 | 0.0001 | 0.0259 |
| 3.2219 | 25.0 | 2100 | 2.8863 | 0.0139 | 0.0411 | 0.0079 | 0.0 | 0.014 | 0.0 | 0.0045 | 0.0205 | 0.0796 | 0.0 | 0.0801 | 0.0 | 0.0477 | 0.2847 | 0.008 | 0.0107 | 0.0 | 0.0 | 0.0001 | 0.0229 |
| 2.9668 | 25.5952 | 2150 | 2.8588 | 0.013 | 0.0376 | 0.0056 | 0.0 | 0.013 | 0.0 | 0.0044 | 0.0154 | 0.0809 | 0.0 | 0.0817 | 0.0 | 0.0492 | 0.2815 | 0.0025 | 0.0074 | 0.0 | 0.0 | 0.0002 | 0.0347 |
| 3.1214 | 26.1905 | 2200 | 2.9280 | 0.013 | 0.0339 | 0.0076 | 0.0 | 0.0131 | 0.0 | 0.0065 | 0.0183 | 0.09 | 0.0 | 0.0911 | 0.0 | 0.0453 | 0.288 | 0.0063 | 0.0256 | 0.0 | 0.0 | 0.0003 | 0.0465 |
| 2.6639 | 26.7857 | 2250 | 2.8761 | 0.0135 | 0.0367 | 0.0062 | 0.0 | 0.0137 | 0.0 | 0.0073 | 0.0207 | 0.0897 | 0.0 | 0.0908 | 0.0 | 0.0463 | 0.2861 | 0.0075 | 0.0237 | 0.0 | 0.0 | 0.0003 | 0.0488 |
| 2.2444 | 27.3810 | 2300 | 2.9808 | 0.0145 | 0.0422 | 0.0062 | 0.0 | 0.0147 | 0.0 | 0.0071 | 0.0214 | 0.0829 | 0.0 | 0.0838 | 0.0 | 0.048 | 0.2796 | 0.0098 | 0.0163 | 0.0 | 0.0 | 0.0002 | 0.0359 |
| 2.8195 | 27.9762 | 2350 | 2.9021 | 0.0158 | 0.0425 | 0.0099 | 0.0 | 0.0159 | 0.0 | 0.0063 | 0.0236 | 0.0909 | 0.0 | 0.0919 | 0.0 | 0.0506 | 0.3005 | 0.0124 | 0.0209 | 0.0 | 0.0 | 0.0002 | 0.0424 |
| 1.9824 | 28.5714 | 2400 | 2.8553 | 0.0121 | 0.0339 | 0.0054 | 0.0 | 0.0122 | 0.0 | 0.0075 | 0.0209 | 0.0847 | 0.0 | 0.0859 | 0.0 | 0.0426 | 0.2639 | 0.0054 | 0.0237 | 0.0 | 0.0 | 0.0003 | 0.0512 |
| 2.4574 | 29.1667 | 2450 | 2.8393 | 0.0113 | 0.0335 | 0.0059 | 0.0 | 0.0113 | 0.0 | 0.0037 | 0.0166 | 0.0716 | 0.0 | 0.0723 | 0.0 | 0.038 | 0.2412 | 0.0069 | 0.0121 | 0.0 | 0.0 | 0.0002 | 0.0329 |
| 1.9585 | 29.7619 | 2500 | 2.7672 | 0.0133 | 0.0365 | 0.0081 | 0.0 | 0.0134 | 0.0 | 0.0071 | 0.0227 | 0.0878 | 0.0 | 0.0889 | 0.0 | 0.0459 | 0.2898 | 0.0069 | 0.0163 | 0.0 | 0.0 | 0.0003 | 0.0453 |
| 3.2506 | 30.3571 | 2550 | 2.8153 | 0.0137 | 0.0384 | 0.0069 | 0.0 | 0.0139 | 0.0 | 0.0064 | 0.022 | 0.084 | 0.0 | 0.085 | 0.0 | 0.0426 | 0.2731 | 0.012 | 0.02 | 0.0 | 0.0 | 0.0003 | 0.0429 |
| 3.21 | 30.9524 | 2600 | 2.7157 | 0.0153 | 0.0411 | 0.0073 | 0.0 | 0.0154 | 0.0 | 0.0054 | 0.0215 | 0.0885 | 0.0 | 0.0896 | 0.0 | 0.051 | 0.2958 | 0.0099 | 0.013 | 0.0 | 0.0 | 0.0003 | 0.0453 |
| 2.9558 | 31.5476 | 2650 | 2.7337 | 0.0155 | 0.0392 | 0.0094 | 0.0 | 0.0157 | 0.0 | 0.0067 | 0.0229 | 0.0891 | 0.0 | 0.0902 | 0.0 | 0.0527 | 0.2894 | 0.0092 | 0.0195 | 0.0 | 0.0 | 0.0003 | 0.0476 |
| 2.5154 | 32.1429 | 2700 | 2.7609 | 0.016 | 0.0411 | 0.0083 | 0.0 | 0.016 | 0.0 | 0.0063 | 0.0262 | 0.0933 | 0.0 | 0.0944 | 0.0 | 0.0562 | 0.2921 | 0.0074 | 0.0377 | 0.0 | 0.0 | 0.0003 | 0.0435 |
| 2.7466 | 32.7381 | 2750 | 2.8100 | 0.015 | 0.0407 | 0.0073 | 0.0 | 0.0151 | 0.0 | 0.0057 | 0.0247 | 0.0864 | 0.0 | 0.0874 | 0.0 | 0.0517 | 0.2694 | 0.0082 | 0.0326 | 0.0 | 0.0 | 0.0002 | 0.0435 |
| 4.0197 | 33.3333 | 2800 | 2.7672 | 0.0155 | 0.0412 | 0.0088 | 0.0 | 0.0157 | 0.0 | 0.0072 | 0.0277 | 0.0894 | 0.0 | 0.0905 | 0.0 | 0.0511 | 0.2796 | 0.0106 | 0.0302 | 0.0 | 0.0 | 0.0003 | 0.0476 |
| 3.1527 | 33.9286 | 2850 | 2.7591 | 0.0172 | 0.0453 | 0.01 | 0.0 | 0.0172 | 0.0 | 0.0081 | 0.0243 | 0.0875 | 0.0 | 0.0884 | 0.0 | 0.0572 | 0.2889 | 0.0113 | 0.0242 | 0.0 | 0.0 | 0.0002 | 0.0371 |
| 1.9554 | 34.5238 | 2900 | 2.7563 | 0.0154 | 0.0403 | 0.0081 | 0.0 | 0.0155 | 0.0 | 0.0087 | 0.0264 | 0.093 | 0.0 | 0.0942 | 0.0 | 0.0525 | 0.2986 | 0.0086 | 0.0247 | 0.0 | 0.0 | 0.0003 | 0.0488 |
| 1.6361 | 35.1190 | 2950 | 2.7492 | 0.0173 | 0.045 | 0.009 | 0.0 | 0.0174 | 0.0 | 0.0089 | 0.0249 | 0.0967 | 0.0 | 0.0979 | 0.0 | 0.0564 | 0.3037 | 0.0126 | 0.0326 | 0.0 | 0.0 | 0.0003 | 0.0506 |
| 2.2596 | 35.7143 | 3000 | 2.7795 | 0.0134 | 0.0327 | 0.0098 | 0.0 | 0.0135 | 0.0 | 0.0078 | 0.0218 | 0.0873 | 0.0 | 0.0885 | 0.0 | 0.0417 | 0.2819 | 0.0114 | 0.0167 | 0.0 | 0.0 | 0.0003 | 0.0506 |
| 2.1195 | 36.3095 | 3050 | 2.8448 | 0.0139 | 0.0393 | 0.0069 | 0.0 | 0.014 | 0.0 | 0.0074 | 0.021 | 0.0873 | 0.0 | 0.0884 | 0.0 | 0.0451 | 0.2829 | 0.0103 | 0.0205 | 0.0 | 0.0 | 0.0003 | 0.0459 |
| 2.3732 | 36.9048 | 3100 | 2.7696 | 0.0155 | 0.0425 | 0.0089 | 0.0 | 0.0156 | 0.0 | 0.0074 | 0.0226 | 0.0912 | 0.0 | 0.0923 | 0.0 | 0.0509 | 0.2986 | 0.0109 | 0.0167 | 0.0 | 0.0 | 0.0003 | 0.0494 |
| 1.8122 | 37.5 | 3150 | 2.8021 | 0.0148 | 0.0395 | 0.0075 | 0.0 | 0.0149 | 0.0 | 0.0064 | 0.0221 | 0.0917 | 0.0 | 0.0928 | 0.0 | 0.0503 | 0.2991 | 0.0085 | 0.0195 | 0.0 | 0.0 | 0.0003 | 0.0482 |
| 1.8091 | 38.0952 | 3200 | 2.7925 | 0.0143 | 0.039 | 0.0065 | 0.0 | 0.0144 | 0.0 | 0.007 | 0.0228 | 0.0954 | 0.0 | 0.0965 | 0.0 | 0.0532 | 0.3097 | 0.0037 | 0.0223 | 0.0 | 0.0 | 0.0003 | 0.0494 |
| 2.8387 | 38.6905 | 3250 | 2.8216 | 0.0146 | 0.0402 | 0.0071 | 0.0 | 0.0148 | 0.0 | 0.0064 | 0.0238 | 0.0926 | 0.0 | 0.0938 | 0.0 | 0.0518 | 0.2968 | 0.0064 | 0.0223 | 0.0 | 0.0 | 0.0003 | 0.0512 |
| 1.1967 | 39.2857 | 3300 | 2.7958 | 0.0125 | 0.0379 | 0.0058 | 0.0 | 0.0126 | 0.0 | 0.0056 | 0.0205 | 0.083 | 0.0 | 0.0839 | 0.0 | 0.0479 | 0.2759 | 0.002 | 0.0209 | 0.0 | 0.0 | 0.0002 | 0.0353 |
| 2.4042 | 39.8810 | 3350 | 2.7605 | 0.0164 | 0.0427 | 0.0092 | 0.0 | 0.0166 | 0.0 | 0.0101 | 0.0259 | 0.1007 | 0.0 | 0.1018 | 0.0 | 0.0591 | 0.3278 | 0.0064 | 0.0279 | 0.0 | 0.0 | 0.0003 | 0.0471 |
| 2.1347 | 40.4762 | 3400 | 2.7584 | 0.0153 | 0.0384 | 0.0098 | 0.0 | 0.0154 | 0.0 | 0.006 | 0.0221 | 0.0969 | 0.0 | 0.0982 | 0.0 | 0.0561 | 0.3167 | 0.0045 | 0.0181 | 0.0 | 0.0 | 0.0004 | 0.0529 |
| 2.3655 | 41.0714 | 3450 | 2.7901 | 0.0143 | 0.0378 | 0.0079 | 0.0 | 0.0145 | 0.0 | 0.0059 | 0.0231 | 0.0951 | 0.0 | 0.0962 | 0.0 | 0.0503 | 0.3083 | 0.0065 | 0.0237 | 0.0 | 0.0 | 0.0003 | 0.0482 |
| 1.9613 | 41.6667 | 3500 | 2.8064 | 0.0148 | 0.0396 | 0.0069 | 0.0 | 0.0149 | 0.0 | 0.0068 | 0.0229 | 0.0939 | 0.0 | 0.0949 | 0.0 | 0.0571 | 0.3125 | 0.002 | 0.02 | 0.0 | 0.0 | 0.0003 | 0.0429 |
| 1.7809 | 42.2619 | 3550 | 2.7503 | 0.017 | 0.0469 | 0.0079 | 0.0 | 0.017 | 0.0 | 0.0061 | 0.0302 | 0.1025 | 0.0 | 0.1037 | 0.0 | 0.0629 | 0.3282 | 0.0046 | 0.0307 | 0.0 | 0.0 | 0.0004 | 0.0512 |
| 2.4661 | 42.8571 | 3600 | 2.7294 | 0.0209 | 0.0571 | 0.0081 | 0.0 | 0.0211 | 0.0 | 0.0073 | 0.0293 | 0.1023 | 0.0 | 0.1038 | 0.0 | 0.0695 | 0.313 | 0.0139 | 0.0358 | 0.0 | 0.0 | 0.0004 | 0.0606 |
| 1.8027 | 43.4524 | 3650 | 2.7443 | 0.0185 | 0.0521 | 0.0082 | 0.0 | 0.0186 | 0.0 | 0.0073 | 0.0283 | 0.0951 | 0.0 | 0.0963 | 0.0 | 0.0541 | 0.2852 | 0.0198 | 0.047 | 0.0 | 0.0 | 0.0003 | 0.0482 |
| 1.9605 | 44.0476 | 3700 | 2.7724 | 0.015 | 0.0399 | 0.0073 | 0.0 | 0.0152 | 0.0 | 0.0089 | 0.0296 | 0.0919 | 0.0 | 0.0933 | 0.0 | 0.0488 | 0.2639 | 0.0109 | 0.0442 | 0.0 | 0.0 | 0.0004 | 0.0594 |
| 2.3219 | 44.6429 | 3750 | 2.6873 | 0.0198 | 0.054 | 0.0086 | 0.0 | 0.02 | 0.0 | 0.0089 | 0.0309 | 0.0989 | 0.0 | 0.1001 | 0.0 | 0.0683 | 0.3032 | 0.0108 | 0.0442 | 0.0 | 0.0 | 0.0003 | 0.0482 |
| 3.1291 | 45.2381 | 3800 | 2.6679 | 0.0216 | 0.0558 | 0.0111 | 0.0 | 0.0217 | 0.0 | 0.0106 | 0.0319 | 0.1078 | 0.0 | 0.1093 | 0.0 | 0.0743 | 0.3273 | 0.0118 | 0.0447 | 0.0 | 0.0 | 0.0004 | 0.0594 |
| 1.8964 | 45.8333 | 3850 | 2.6811 | 0.017 | 0.0516 | 0.0092 | 0.0 | 0.0171 | 0.0 | 0.008 | 0.0278 | 0.0988 | 0.0 | 0.1004 | 0.0 | 0.0549 | 0.2889 | 0.0124 | 0.0381 | 0.0 | 0.0 | 0.0005 | 0.0682 |
| 2.5316 | 46.4286 | 3900 | 2.6912 | 0.0179 | 0.0479 | 0.0092 | 0.0 | 0.018 | 0.0 | 0.0075 | 0.027 | 0.1017 | 0.0 | 0.1031 | 0.0 | 0.0641 | 0.3102 | 0.0069 | 0.0391 | 0.0 | 0.0 | 0.0004 | 0.0576 |
| 2.8135 | 47.0238 | 3950 | 2.6549 | 0.0192 | 0.0507 | 0.0108 | 0.002 | 0.0193 | 0.0 | 0.0074 | 0.0281 | 0.1007 | 0.0036 | 0.1018 | 0.0 | 0.0656 | 0.3125 | 0.0108 | 0.0349 | 0.0 | 0.0 | 0.0004 | 0.0553 |
| 3.1813 | 47.6190 | 4000 | 2.6733 | 0.0213 | 0.0506 | 0.0139 | 0.0 | 0.0214 | 0.0 | 0.0077 | 0.028 | 0.1071 | 0.0 | 0.1082 | 0.0 | 0.0727 | 0.3481 | 0.0123 | 0.0349 | 0.0 | 0.0 | 0.0003 | 0.0453 |
| 2.6851 | 48.2143 | 4050 | 2.7095 | 0.0173 | 0.0446 | 0.0086 | 0.0 | 0.0174 | 0.0 | 0.008 | 0.0253 | 0.1074 | 0.0 | 0.1087 | 0.0 | 0.0616 | 0.337 | 0.0073 | 0.0377 | 0.0 | 0.0 | 0.0003 | 0.0547 |
| 1.7636 | 48.8095 | 4100 | 2.6430 | 0.017 | 0.043 | 0.0102 | 0.0 | 0.0171 | 0.0 | 0.008 | 0.0266 | 0.1029 | 0.0 | 0.1042 | 0.0 | 0.0614 | 0.3194 | 0.006 | 0.0363 | 0.0 | 0.0 | 0.0004 | 0.0559 |
| 3.0702 | 49.4048 | 4150 | 2.6970 | 0.016 | 0.042 | 0.0057 | 0.0 | 0.0162 | 0.0 | 0.0078 | 0.0268 | 0.1003 | 0.0 | 0.1017 | 0.0 | 0.0545 | 0.2995 | 0.009 | 0.0451 | 0.0 | 0.0 | 0.0004 | 0.0565 |
| 2.4806 | 50.0 | 4200 | 2.6789 | 0.0191 | 0.0475 | 0.0121 | 0.0 | 0.0192 | 0.0 | 0.0088 | 0.0311 | 0.1085 | 0.0 | 0.11 | 0.0 | 0.0627 | 0.3116 | 0.0131 | 0.0605 | 0.0 | 0.0 | 0.0004 | 0.0618 |
| 1.8828 | 50.5952 | 4250 | 2.6843 | 0.0169 | 0.0428 | 0.0085 | 0.0 | 0.017 | 0.0 | 0.0081 | 0.0309 | 0.1028 | 0.0 | 0.1041 | 0.0 | 0.0603 | 0.3069 | 0.007 | 0.053 | 0.0 | 0.0 | 0.0004 | 0.0512 |
| 2.6281 | 51.1905 | 4300 | 2.6656 | 0.0168 | 0.0444 | 0.007 | 0.0 | 0.0169 | 0.0 | 0.008 | 0.029 | 0.1035 | 0.0 | 0.1049 | 0.0 | 0.0614 | 0.3102 | 0.0056 | 0.0437 | 0.0 | 0.0 | 0.0004 | 0.06 |
| 2.7043 | 51.7857 | 4350 | 2.6403 | 0.0187 | 0.0461 | 0.0112 | 0.0013 | 0.0188 | 0.0 | 0.0081 | 0.0306 | 0.1094 | 0.0036 | 0.1109 | 0.0 | 0.0646 | 0.3144 | 0.0096 | 0.0549 | 0.0 | 0.0 | 0.0005 | 0.0682 |
| 1.2471 | 52.3810 | 4400 | 2.6702 | 0.0186 | 0.0472 | 0.0126 | 0.004 | 0.0188 | 0.0 | 0.0082 | 0.0312 | 0.1096 | 0.0071 | 0.1109 | 0.0 | 0.0652 | 0.3218 | 0.0087 | 0.0507 | 0.0 | 0.0 | 0.0006 | 0.0659 |
| 2.9681 | 52.9762 | 4450 | 2.6858 | 0.0196 | 0.0488 | 0.0108 | 0.0 | 0.0197 | 0.0 | 0.0099 | 0.0316 | 0.1143 | 0.0 | 0.1158 | 0.0 | 0.0691 | 0.337 | 0.0086 | 0.0609 | 0.0 | 0.0 | 0.0005 | 0.0594 |
| 1.7295 | 53.5714 | 4500 | 2.6434 | 0.0213 | 0.0543 | 0.0107 | 0.002 | 0.0215 | 0.0 | 0.0086 | 0.0335 | 0.1161 | 0.0071 | 0.1175 | 0.0 | 0.0737 | 0.3412 | 0.011 | 0.0526 | 0.0 | 0.0 | 0.0006 | 0.0706 |
| 2.3558 | 54.1667 | 4550 | 2.6355 | 0.0186 | 0.0506 | 0.008 | 0.004 | 0.0187 | 0.0 | 0.0072 | 0.0337 | 0.109 | 0.0036 | 0.1105 | 0.0 | 0.0626 | 0.3083 | 0.0111 | 0.0619 | 0.0 | 0.0 | 0.0005 | 0.0659 |
| 1.8154 | 54.7619 | 4600 | 2.6580 | 0.0184 | 0.0512 | 0.0079 | 0.0 | 0.0185 | 0.0 | 0.0067 | 0.0313 | 0.1074 | 0.0 | 0.1093 | 0.0 | 0.0613 | 0.2958 | 0.0117 | 0.0586 | 0.0 | 0.0 | 0.0006 | 0.0753 |
| 1.0892 | 55.3571 | 4650 | 2.6029 | 0.0164 | 0.0492 | 0.0063 | 0.0 | 0.0166 | 0.0 | 0.0075 | 0.0295 | 0.1051 | 0.0 | 0.107 | 0.0 | 0.0563 | 0.288 | 0.0086 | 0.0535 | 0.0 | 0.0 | 0.0007 | 0.0788 |
| 2.7959 | 55.9524 | 4700 | 2.6312 | 0.0197 | 0.0541 | 0.0119 | 0.0 | 0.0198 | 0.0 | 0.0095 | 0.0317 | 0.1127 | 0.0 | 0.1142 | 0.0 | 0.0706 | 0.3389 | 0.0075 | 0.0465 | 0.0 | 0.0 | 0.0007 | 0.0653 |
| 1.86 | 56.5476 | 4750 | 2.7088 | 0.0168 | 0.0439 | 0.0091 | 0.0 | 0.017 | 0.0 | 0.0084 | 0.0258 | 0.1063 | 0.0 | 0.1077 | 0.0 | 0.0603 | 0.3227 | 0.0066 | 0.0442 | 0.0 | 0.0 | 0.0005 | 0.0582 |
| 1.5733 | 57.1429 | 4800 | 2.6762 | 0.0182 | 0.0451 | 0.0101 | 0.004 | 0.0184 | 0.0 | 0.0095 | 0.0329 | 0.1103 | 0.0071 | 0.1114 | 0.0 | 0.0636 | 0.3245 | 0.0088 | 0.0572 | 0.0 | 0.0 | 0.0005 | 0.0594 |
| 1.7953 | 57.7381 | 4850 | 2.5965 | 0.021 | 0.0519 | 0.0118 | 0.004 | 0.0212 | 0.0 | 0.0078 | 0.0342 | 0.1097 | 0.0036 | 0.1109 | 0.0 | 0.0719 | 0.3352 | 0.0118 | 0.0465 | 0.0 | 0.0 | 0.0004 | 0.0571 |
| 1.9123 | 58.3333 | 4900 | 2.6606 | 0.0175 | 0.0505 | 0.0081 | 0.0 | 0.0176 | 0.0 | 0.0088 | 0.0305 | 0.1009 | 0.0 | 0.1023 | 0.0 | 0.0549 | 0.2935 | 0.0147 | 0.0549 | 0.0 | 0.0 | 0.0004 | 0.0553 |
| 1.342 | 58.9286 | 4950 | 2.7189 | 0.0195 | 0.0528 | 0.0082 | 0.004 | 0.0196 | 0.0 | 0.0087 | 0.0291 | 0.1031 | 0.0107 | 0.1038 | 0.0 | 0.0664 | 0.3208 | 0.0113 | 0.0428 | 0.0 | 0.0 | 0.0004 | 0.0488 |
| 3.0454 | 59.5238 | 5000 | 2.5984 | 0.0248 | 0.0612 | 0.0145 | 0.004 | 0.025 | 0.0 | 0.0089 | 0.0337 | 0.1156 | 0.0107 | 0.1167 | 0.0 | 0.0738 | 0.3384 | 0.0251 | 0.0623 | 0.0 | 0.0 | 0.0005 | 0.0618 |
| 1.4143 | 60.1190 | 5050 | 2.6701 | 0.0195 | 0.0538 | 0.009 | 0.0 | 0.0196 | 0.0 | 0.0071 | 0.0279 | 0.1075 | 0.0 | 0.1091 | 0.0 | 0.062 | 0.3148 | 0.0152 | 0.0493 | 0.0 | 0.0 | 0.0005 | 0.0659 |
| 1.6468 | 60.7143 | 5100 | 2.6621 | 0.0202 | 0.0489 | 0.0123 | 0.0059 | 0.0203 | 0.0 | 0.0086 | 0.0308 | 0.109 | 0.0107 | 0.1098 | 0.0 | 0.0709 | 0.338 | 0.0095 | 0.047 | 0.0 | 0.0 | 0.0004 | 0.0512 |
| 3.1546 | 61.3095 | 5150 | 2.6501 | 0.0219 | 0.0555 | 0.0096 | 0.0059 | 0.022 | 0.0 | 0.0075 | 0.0309 | 0.113 | 0.0107 | 0.114 | 0.0 | 0.0778 | 0.3403 | 0.0092 | 0.0474 | 0.0 | 0.0 | 0.0006 | 0.0641 |
| 3.617 | 61.9048 | 5200 | 2.6077 | 0.0254 | 0.0639 | 0.0158 | 0.004 | 0.0257 | 0.0 | 0.0081 | 0.0344 | 0.1223 | 0.0071 | 0.124 | 0.0 | 0.0773 | 0.3375 | 0.0235 | 0.067 | 0.0 | 0.0 | 0.0009 | 0.0847 |
| 1.4078 | 62.5 | 5250 | 2.6264 | 0.0216 | 0.0564 | 0.0123 | 0.004 | 0.0217 | 0.0 | 0.009 | 0.0313 | 0.1142 | 0.0071 | 0.1156 | 0.0 | 0.0724 | 0.3278 | 0.0132 | 0.0553 | 0.0 | 0.0 | 0.0007 | 0.0735 |
| 1.3171 | 63.0952 | 5300 | 2.5980 | 0.023 | 0.0557 | 0.0137 | 0.004 | 0.0231 | 0.0 | 0.0071 | 0.034 | 0.1158 | 0.0036 | 0.1175 | 0.0 | 0.0718 | 0.3259 | 0.0195 | 0.0651 | 0.0 | 0.0 | 0.0006 | 0.0724 |
| 1.5221 | 63.6905 | 5350 | 2.6430 | 0.0207 | 0.0534 | 0.0121 | 0.0 | 0.0208 | 0.0 | 0.0085 | 0.0321 | 0.1116 | 0.0 | 0.1132 | 0.0 | 0.0669 | 0.3255 | 0.0153 | 0.0563 | 0.0 | 0.0 | 0.0005 | 0.0647 |
| 3.0371 | 64.2857 | 5400 | 2.6606 | 0.0199 | 0.0505 | 0.0113 | 0.0079 | 0.0201 | 0.0 | 0.0077 | 0.0335 | 0.1127 | 0.0071 | 0.1139 | 0.0 | 0.067 | 0.3319 | 0.0122 | 0.0553 | 0.0 | 0.0 | 0.0006 | 0.0635 |
| 1.3461 | 64.8810 | 5450 | 2.6517 | 0.0218 | 0.0539 | 0.0141 | 0.004 | 0.0221 | 0.0 | 0.0088 | 0.035 | 0.1164 | 0.0036 | 0.1179 | 0.0 | 0.069 | 0.3264 | 0.0179 | 0.0735 | 0.0 | 0.0 | 0.0005 | 0.0659 |
| 3.0254 | 65.4762 | 5500 | 2.6664 | 0.0252 | 0.0611 | 0.0149 | 0.0079 | 0.0256 | 0.0 | 0.0064 | 0.0336 | 0.1196 | 0.0071 | 0.1209 | 0.0 | 0.0764 | 0.3505 | 0.0239 | 0.0619 | 0.0 | 0.0 | 0.0006 | 0.0659 |
| 2.4977 | 66.0714 | 5550 | 2.6556 | 0.0222 | 0.0601 | 0.0116 | 0.0 | 0.0224 | 0.0 | 0.0077 | 0.0312 | 0.1056 | 0.0 | 0.1069 | 0.0 | 0.0706 | 0.3194 | 0.0178 | 0.0507 | 0.0 | 0.0 | 0.0005 | 0.0524 |
| 2.7955 | 66.6667 | 5600 | 2.6139 | 0.0202 | 0.0506 | 0.0128 | 0.004 | 0.0204 | 0.0 | 0.0095 | 0.0367 | 0.1177 | 0.0071 | 0.1196 | 0.0 | 0.0636 | 0.3222 | 0.0153 | 0.0633 | 0.0012 | 0.0114 | 0.0007 | 0.0741 |
| 1.377 | 67.2619 | 5650 | 2.6742 | 0.0211 | 0.0567 | 0.0112 | 0.0 | 0.0212 | 0.0 | 0.008 | 0.0334 | 0.1112 | 0.0 | 0.1128 | 0.0 | 0.072 | 0.3241 | 0.0118 | 0.0591 | 0.0 | 0.0 | 0.0006 | 0.0618 |
| 1.6206 | 67.8571 | 5700 | 2.6946 | 0.0232 | 0.0545 | 0.0149 | 0.0158 | 0.0233 | 0.0252 | 0.0081 | 0.0412 | 0.1216 | 0.0143 | 0.1226 | 0.05 | 0.0748 | 0.3426 | 0.0173 | 0.0721 | 0.0 | 0.0 | 0.0007 | 0.0718 |
| 1.5313 | 68.4524 | 5750 | 2.6710 | 0.0217 | 0.0585 | 0.0124 | 0.004 | 0.0218 | 0.0252 | 0.0063 | 0.0357 | 0.1141 | 0.0071 | 0.1155 | 0.05 | 0.0636 | 0.3083 | 0.0224 | 0.0744 | 0.0 | 0.0 | 0.0007 | 0.0735 |
| 1.4385 | 69.0476 | 5800 | 2.6120 | 0.0245 | 0.0593 | 0.0169 | 0.004 | 0.0248 | 0.0 | 0.0065 | 0.0385 | 0.1171 | 0.0071 | 0.1185 | 0.0 | 0.0736 | 0.3259 | 0.0236 | 0.0712 | 0.0 | 0.0 | 0.0006 | 0.0712 |
| 1.8468 | 69.6429 | 5850 | 2.6097 | 0.0246 | 0.0593 | 0.0121 | 0.002 | 0.0248 | 0.0 | 0.0073 | 0.0397 | 0.121 | 0.0036 | 0.1227 | 0.0 | 0.0745 | 0.3324 | 0.0233 | 0.0763 | 0.0 | 0.0 | 0.0007 | 0.0753 |
| 3.5324 | 70.2381 | 5900 | 2.6230 | 0.025 | 0.0608 | 0.0151 | 0.0 | 0.0251 | 0.0 | 0.0071 | 0.0391 | 0.1219 | 0.0 | 0.1238 | 0.0 | 0.0731 | 0.3343 | 0.0261 | 0.08 | 0.0 | 0.0 | 0.0007 | 0.0735 |
| 1.9273 | 70.8333 | 5950 | 2.5812 | 0.0219 | 0.0559 | 0.0101 | 0.004 | 0.0221 | 0.0 | 0.0071 | 0.0353 | 0.1195 | 0.0036 | 0.1214 | 0.0 | 0.0664 | 0.3134 | 0.0203 | 0.0809 | 0.0 | 0.0 | 0.0009 | 0.0835 |
| 1.2119 | 71.4286 | 6000 | 2.6256 | 0.0249 | 0.0636 | 0.0155 | 0.0119 | 0.0261 | 0.0505 | 0.0115 | 0.0404 | 0.1247 | 0.0107 | 0.1265 | 0.05 | 0.0763 | 0.3273 | 0.018 | 0.0786 | 0.0045 | 0.0136 | 0.0008 | 0.0794 |
| 0.6637 | 72.0238 | 6050 | 2.5968 | 0.022 | 0.0566 | 0.0093 | 0.0059 | 0.0222 | 0.0 | 0.0107 | 0.0361 | 0.1162 | 0.0107 | 0.1178 | 0.0 | 0.075 | 0.3269 | 0.0097 | 0.054 | 0.0024 | 0.0091 | 0.0008 | 0.0747 |
| 1.5147 | 72.6190 | 6100 | 2.5598 | 0.0221 | 0.0563 | 0.0112 | 0.004 | 0.0223 | 0.0 | 0.0074 | 0.0356 | 0.1167 | 0.0107 | 0.1182 | 0.0 | 0.072 | 0.325 | 0.0156 | 0.0619 | 0.0 | 0.0 | 0.0008 | 0.08 |
| 2.1471 | 73.2143 | 6150 | 2.5925 | 0.0207 | 0.054 | 0.012 | 0.0013 | 0.0208 | 0.0 | 0.007 | 0.0367 | 0.1149 | 0.0071 | 0.1166 | 0.0 | 0.068 | 0.3037 | 0.0139 | 0.0749 | 0.0 | 0.0 | 0.0008 | 0.0812 |
| 2.1066 | 73.8095 | 6200 | 2.5630 | 0.0217 | 0.0566 | 0.0124 | 0.0 | 0.0218 | 0.0 | 0.009 | 0.0363 | 0.1134 | 0.0 | 0.1151 | 0.0 | 0.0733 | 0.3208 | 0.0128 | 0.0647 | 0.0 | 0.0 | 0.0006 | 0.0682 |
| 1.6752 | 74.4048 | 6250 | 2.5494 | 0.0232 | 0.057 | 0.0115 | 0.0119 | 0.0233 | 0.0505 | 0.0097 | 0.0407 | 0.1205 | 0.0214 | 0.1213 | 0.05 | 0.0773 | 0.3333 | 0.0149 | 0.074 | 0.0 | 0.0 | 0.0007 | 0.0747 |
| 1.5349 | 75.0 | 6300 | 2.6036 | 0.0233 | 0.0578 | 0.0141 | 0.0079 | 0.0235 | 0.0 | 0.0088 | 0.0356 | 0.116 | 0.0143 | 0.117 | 0.0 | 0.0819 | 0.3338 | 0.0109 | 0.0619 | 0.0 | 0.0 | 0.0006 | 0.0682 |
| 1.4838 | 75.5952 | 6350 | 2.6241 | 0.023 | 0.0576 | 0.0127 | 0.0079 | 0.0231 | 0.0 | 0.0077 | 0.0364 | 0.1155 | 0.0071 | 0.1168 | 0.0 | 0.0741 | 0.3231 | 0.0173 | 0.0772 | 0.0 | 0.0 | 0.0005 | 0.0618 |
| 1.9086 | 76.1905 | 6400 | 2.5999 | 0.0228 | 0.0583 | 0.0131 | 0.0 | 0.023 | 0.0 | 0.0078 | 0.0395 | 0.1139 | 0.0 | 0.1155 | 0.0 | 0.0737 | 0.3167 | 0.017 | 0.0753 | 0.0 | 0.0 | 0.0005 | 0.0635 |
| 1.795 | 76.7857 | 6450 | 2.5530 | 0.0233 | 0.0597 | 0.0155 | 0.0 | 0.0235 | 0.0 | 0.0089 | 0.0405 | 0.1196 | 0.0 | 0.1214 | 0.0 | 0.0769 | 0.3287 | 0.0157 | 0.0786 | 0.0 | 0.0 | 0.0007 | 0.0712 |
| 1.6272 | 77.3810 | 6500 | 2.5781 | 0.0256 | 0.0613 | 0.0168 | 0.004 | 0.0258 | 0.0505 | 0.0093 | 0.0412 | 0.1235 | 0.0071 | 0.1248 | 0.05 | 0.0785 | 0.3338 | 0.0235 | 0.093 | 0.0 | 0.0 | 0.0006 | 0.0671 |
| 2.5867 | 77.9762 | 6550 | 2.5581 | 0.0246 | 0.0614 | 0.0159 | 0.002 | 0.0248 | 0.0 | 0.0085 | 0.038 | 0.1201 | 0.0071 | 0.1216 | 0.0 | 0.0789 | 0.3315 | 0.019 | 0.0735 | 0.0 | 0.0 | 0.0007 | 0.0753 |
| 2.0696 | 78.5714 | 6600 | 2.5246 | 0.0221 | 0.0601 | 0.0116 | 0.0033 | 0.0222 | 0.0505 | 0.0088 | 0.037 | 0.1169 | 0.0179 | 0.1178 | 0.05 | 0.0743 | 0.3227 | 0.0132 | 0.0707 | 0.0 | 0.0 | 0.0007 | 0.0741 |
| 2.6178 | 79.1667 | 6650 | 2.5714 | 0.0234 | 0.0612 | 0.0137 | 0.002 | 0.0235 | 0.0505 | 0.0092 | 0.0407 | 0.1187 | 0.0036 | 0.1202 | 0.05 | 0.0749 | 0.3218 | 0.0181 | 0.0837 | 0.0 | 0.0 | 0.0007 | 0.0694 |
| 2.7948 | 79.7619 | 6700 | 2.5763 | 0.0216 | 0.0561 | 0.0119 | 0.0079 | 0.0218 | 0.0 | 0.009 | 0.0404 | 0.1155 | 0.0071 | 0.117 | 0.0 | 0.0729 | 0.3162 | 0.013 | 0.0735 | 0.0 | 0.0 | 0.0007 | 0.0724 |
| 2.9522 | 80.3571 | 6750 | 2.6389 | 0.0218 | 0.054 | 0.0122 | 0.0079 | 0.0218 | 0.0252 | 0.009 | 0.0417 | 0.1164 | 0.0071 | 0.1176 | 0.05 | 0.0704 | 0.3194 | 0.016 | 0.0795 | 0.0 | 0.0 | 0.0006 | 0.0665 |
| 1.7235 | 80.9524 | 6800 | 2.6528 | 0.0237 | 0.0674 | 0.0116 | 0.0059 | 0.0238 | 0.0505 | 0.008 | 0.0393 | 0.1159 | 0.0107 | 0.117 | 0.05 | 0.0782 | 0.3236 | 0.0161 | 0.0735 | 0.0 | 0.0 | 0.0006 | 0.0665 |
| 3.1389 | 81.5476 | 6850 | 2.6011 | 0.0243 | 0.0653 | 0.0124 | 0.0059 | 0.0244 | 0.101 | 0.008 | 0.0393 | 0.121 | 0.0107 | 0.122 | 0.1 | 0.0818 | 0.3389 | 0.0146 | 0.0763 | 0.0 | 0.0 | 0.0007 | 0.0688 |
| 2.4076 | 82.1429 | 6900 | 2.5874 | 0.0265 | 0.0698 | 0.0151 | 0.0 | 0.0267 | 0.0505 | 0.0099 | 0.0404 | 0.1217 | 0.0 | 0.1235 | 0.05 | 0.0797 | 0.3324 | 0.0256 | 0.0772 | 0.0 | 0.0 | 0.0008 | 0.0771 |
| 1.3061 | 82.7381 | 6950 | 2.5568 | 0.0253 | 0.0714 | 0.0094 | 0.0013 | 0.0254 | 0.0505 | 0.0087 | 0.0399 | 0.118 | 0.0036 | 0.1197 | 0.05 | 0.0778 | 0.3194 | 0.0225 | 0.0744 | 0.0 | 0.0 | 0.0009 | 0.0782 |
| 2.2139 | 83.3333 | 7000 | 2.5416 | 0.029 | 0.0748 | 0.0138 | 0.0102 | 0.0291 | 0.0505 | 0.009 | 0.0412 | 0.127 | 0.0143 | 0.1282 | 0.05 | 0.09 | 0.3528 | 0.0252 | 0.0763 | 0.0 | 0.0 | 0.0009 | 0.0788 |
| 2.3156 | 83.9286 | 7050 | 2.6074 | 0.0289 | 0.0762 | 0.0158 | 0.0119 | 0.0291 | 0.0 | 0.0088 | 0.039 | 0.1259 | 0.0107 | 0.1272 | 0.0 | 0.0852 | 0.3477 | 0.0295 | 0.0823 | 0.0 | 0.0 | 0.0009 | 0.0735 |
| 2.0129 | 84.5238 | 7100 | 2.5810 | 0.0274 | 0.0742 | 0.0142 | 0.004 | 0.0277 | 0.0 | 0.0077 | 0.0371 | 0.1232 | 0.0036 | 0.125 | 0.0 | 0.0782 | 0.3296 | 0.0307 | 0.0851 | 0.0 | 0.0 | 0.0009 | 0.0782 |
| 1.6521 | 85.1190 | 7150 | 2.5850 | 0.0303 | 0.0716 | 0.0208 | 0.0059 | 0.0306 | 0.0 | 0.0141 | 0.0481 | 0.1317 | 0.0107 | 0.1337 | 0.0 | 0.0858 | 0.35 | 0.0333 | 0.0809 | 0.001 | 0.0159 | 0.0009 | 0.08 |
| 2.2442 | 85.7143 | 7200 | 2.6064 | 0.0291 | 0.0728 | 0.013 | 0.004 | 0.0292 | 0.0505 | 0.0077 | 0.0389 | 0.1235 | 0.0143 | 0.1247 | 0.05 | 0.0849 | 0.3403 | 0.0306 | 0.0744 | 0.0 | 0.0 | 0.0009 | 0.0794 |
| 1.3171 | 86.3095 | 7250 | 2.5703 | 0.027 | 0.0717 | 0.0132 | 0.004 | 0.0272 | 0.0505 | 0.0104 | 0.04 | 0.1233 | 0.0071 | 0.1247 | 0.05 | 0.0828 | 0.3361 | 0.0243 | 0.0805 | 0.0 | 0.0 | 0.001 | 0.0765 |
| 1.4797 | 86.9048 | 7300 | 2.6329 | 0.0259 | 0.066 | 0.0137 | 0.0059 | 0.026 | 0.0505 | 0.0103 | 0.0402 | 0.1202 | 0.0107 | 0.1213 | 0.05 | 0.0814 | 0.338 | 0.0216 | 0.0735 | 0.0 | 0.0 | 0.0008 | 0.0694 |
| 2.0362 | 87.5 | 7350 | 2.6194 | 0.0272 | 0.0675 | 0.0153 | 0.0026 | 0.0274 | 0.0 | 0.0096 | 0.0372 | 0.1237 | 0.0071 | 0.1252 | 0.0 | 0.0896 | 0.3472 | 0.0183 | 0.0753 | 0.0 | 0.0 | 0.0008 | 0.0724 |
| 2.3021 | 88.0952 | 7400 | 2.6001 | 0.0255 | 0.0652 | 0.013 | 0.0 | 0.0257 | 0.0 | 0.0125 | 0.0393 | 0.1246 | 0.0 | 0.1268 | 0.0 | 0.0791 | 0.3315 | 0.0212 | 0.0828 | 0.0007 | 0.0114 | 0.0008 | 0.0729 |
| 1.5977 | 88.6905 | 7450 | 2.5935 | 0.0279 | 0.0689 | 0.0135 | 0.0158 | 0.0285 | 0.0 | 0.0145 | 0.0442 | 0.1314 | 0.0143 | 0.1333 | 0.0 | 0.0862 | 0.3519 | 0.0219 | 0.0805 | 0.0025 | 0.0182 | 0.0009 | 0.0753 |
| 0.6887 | 89.2857 | 7500 | 2.5216 | 0.027 | 0.0668 | 0.0137 | 0.0 | 0.0272 | 0.0 | 0.0097 | 0.0377 | 0.1275 | 0.0 | 0.1293 | 0.0 | 0.0886 | 0.3579 | 0.0185 | 0.0805 | 0.0 | 0.0 | 0.0008 | 0.0718 |
| 1.515 | 89.8810 | 7550 | 2.5073 | 0.0302 | 0.0728 | 0.0169 | 0.0053 | 0.0304 | 0.0 | 0.0133 | 0.0445 | 0.1354 | 0.0143 | 0.1371 | 0.0 | 0.0926 | 0.3556 | 0.026 | 0.0958 | 0.0012 | 0.0136 | 0.0009 | 0.0765 |
| 2.2476 | 90.4762 | 7600 | 2.5313 | 0.0278 | 0.0692 | 0.0169 | 0.0066 | 0.0279 | 0.0 | 0.0084 | 0.0379 | 0.1319 | 0.0179 | 0.1331 | 0.0 | 0.0844 | 0.344 | 0.0257 | 0.1023 | 0.0 | 0.0 | 0.0009 | 0.0812 |
| 3.4808 | 91.0714 | 7650 | 2.5386 | 0.0287 | 0.0716 | 0.0167 | 0.0059 | 0.0288 | 0.0 | 0.0093 | 0.041 | 0.1334 | 0.0214 | 0.1346 | 0.0 | 0.0848 | 0.3417 | 0.0291 | 0.1098 | 0.0 | 0.0 | 0.0009 | 0.0824 |
| 1.3062 | 91.6667 | 7700 | 2.5310 | 0.0289 | 0.072 | 0.0148 | 0.001 | 0.0291 | 0.0 | 0.009 | 0.0401 | 0.1332 | 0.0036 | 0.1352 | 0.0 | 0.0848 | 0.3519 | 0.0297 | 0.0958 | 0.0 | 0.0 | 0.0012 | 0.0853 |
| 3.2105 | 92.2619 | 7750 | 2.5405 | 0.0301 | 0.076 | 0.0164 | 0.0158 | 0.0302 | 0.0505 | 0.009 | 0.0429 | 0.1337 | 0.0143 | 0.135 | 0.05 | 0.0891 | 0.3472 | 0.0303 | 0.1088 | 0.0 | 0.0 | 0.0009 | 0.0788 |
| 1.807 | 92.8571 | 7800 | 2.5132 | 0.0295 | 0.0747 | 0.0147 | 0.0086 | 0.0296 | 0.0 | 0.0092 | 0.0415 | 0.1331 | 0.0179 | 0.1343 | 0.0 | 0.0908 | 0.3486 | 0.0263 | 0.1074 | 0.0 | 0.0 | 0.0009 | 0.0765 |
| 1.8476 | 93.4524 | 7850 | 2.4973 | 0.0265 | 0.0705 | 0.0127 | 0.004 | 0.0266 | 0.0505 | 0.0085 | 0.0384 | 0.1278 | 0.0179 | 0.129 | 0.05 | 0.0845 | 0.3389 | 0.0207 | 0.0907 | 0.0 | 0.0 | 0.0009 | 0.0818 |
| 2.0871 | 94.0476 | 7900 | 2.5060 | 0.0301 | 0.076 | 0.0173 | 0.0024 | 0.0305 | 0.0 | 0.0132 | 0.0467 | 0.142 | 0.0107 | 0.1442 | 0.0 | 0.091 | 0.3542 | 0.0273 | 0.1102 | 0.0008 | 0.0182 | 0.0012 | 0.0853 |
| 1.7232 | 94.6429 | 7950 | 2.5097 | 0.0277 | 0.073 | 0.0132 | 0.0013 | 0.0278 | 0.0505 | 0.009 | 0.044 | 0.1315 | 0.0071 | 0.1331 | 0.05 | 0.0828 | 0.3352 | 0.0268 | 0.114 | 0.0 | 0.0 | 0.0011 | 0.0771 |
| 2.1078 | 95.2381 | 8000 | 2.4939 | 0.0297 | 0.0791 | 0.0163 | 0.0023 | 0.03 | 0.0 | 0.012 | 0.0485 | 0.136 | 0.0143 | 0.1376 | 0.0 | 0.0936 | 0.3481 | 0.0241 | 0.1102 | 0.0003 | 0.0091 | 0.001 | 0.0765 |
| 2.3128 | 95.8333 | 8050 | 2.5257 | 0.0297 | 0.0772 | 0.0182 | 0.0 | 0.0298 | 0.0505 | 0.0103 | 0.0438 | 0.1328 | 0.0 | 0.1345 | 0.05 | 0.0912 | 0.3481 | 0.0266 | 0.113 | 0.0 | 0.0 | 0.0008 | 0.07 |
| 2.5208 | 96.4286 | 8100 | 2.5316 | 0.0305 | 0.0765 | 0.018 | 0.0 | 0.0308 | 0.0 | 0.0125 | 0.0465 | 0.1367 | 0.0 | 0.1389 | 0.0 | 0.0954 | 0.3537 | 0.0253 | 0.1074 | 0.0004 | 0.0114 | 0.001 | 0.0741 |
| 1.6958 | 97.0238 | 8150 | 2.5154 | 0.0304 | 0.0788 | 0.0188 | 0.0 | 0.0306 | 0.0505 | 0.0122 | 0.0462 | 0.1382 | 0.0 | 0.1404 | 0.05 | 0.0939 | 0.3546 | 0.0264 | 0.1121 | 0.0003 | 0.0091 | 0.0009 | 0.0771 |
| 1.5931 | 97.6190 | 8200 | 2.5143 | 0.0328 | 0.0825 | 0.0201 | 0.0011 | 0.0331 | 0.0 | 0.0112 | 0.0434 | 0.1424 | 0.0107 | 0.1443 | 0.0 | 0.1033 | 0.3718 | 0.0265 | 0.1088 | 0.0002 | 0.0068 | 0.001 | 0.0824 |
| 1.6552 | 98.2143 | 8250 | 2.5198 | 0.0321 | 0.0812 | 0.0206 | 0.0005 | 0.0323 | 0.0505 | 0.0096 | 0.0451 | 0.1379 | 0.0036 | 0.1396 | 0.05 | 0.0994 | 0.3653 | 0.028 | 0.1098 | 0.0 | 0.0 | 0.001 | 0.0765 |
| 1.8368 | 98.8095 | 8300 | 2.5458 | 0.0287 | 0.0748 | 0.0172 | 0.0 | 0.0289 | 0.0505 | 0.0093 | 0.0419 | 0.1319 | 0.0 | 0.1338 | 0.05 | 0.0896 | 0.3444 | 0.0244 | 0.1051 | 0.0 | 0.0 | 0.001 | 0.0782 |
| 1.1703 | 99.4048 | 8350 | 2.5517 | 0.0305 | 0.0784 | 0.0188 | 0.0 | 0.0307 | 0.0505 | 0.0096 | 0.0434 | 0.1375 | 0.0 | 0.1395 | 0.05 | 0.0967 | 0.3593 | 0.0244 | 0.1102 | 0.0 | 0.0 | 0.0011 | 0.0806 |
| 2.1934 | 100.0 | 8400 | 2.5361 | 0.03 | 0.0751 | 0.0179 | 0.0 | 0.0301 | 0.0 | 0.0106 | 0.0452 | 0.132 | 0.0 | 0.1339 | 0.0 | 0.0933 | 0.3509 | 0.0258 | 0.1037 | 0.0 | 0.0 | 0.0009 | 0.0735 |
| 2.5554 | 100.5952 | 8450 | 2.5152 | 0.0311 | 0.0779 | 0.0205 | 0.0 | 0.0312 | 0.0 | 0.0103 | 0.0441 | 0.1368 | 0.0 | 0.1389 | 0.0 | 0.0962 | 0.3574 | 0.0271 | 0.1079 | 0.0 | 0.0 | 0.0011 | 0.0818 |
| 1.4375 | 101.1905 | 8500 | 2.5004 | 0.0289 | 0.0737 | 0.0165 | 0.0 | 0.0291 | 0.0 | 0.0096 | 0.0446 | 0.1342 | 0.0 | 0.1363 | 0.0 | 0.0879 | 0.3403 | 0.0266 | 0.113 | 0.0 | 0.0 | 0.0011 | 0.0835 |
| 1.5642 | 101.7857 | 8550 | 2.5002 | 0.0306 | 0.0761 | 0.018 | 0.0013 | 0.031 | 0.0 | 0.015 | 0.0521 | 0.1433 | 0.0036 | 0.146 | 0.0 | 0.0924 | 0.3542 | 0.0281 | 0.1126 | 0.0007 | 0.0205 | 0.0011 | 0.0859 |
| 1.8362 | 102.3810 | 8600 | 2.5106 | 0.0297 | 0.0754 | 0.0176 | 0.0026 | 0.0299 | 0.0 | 0.0097 | 0.0468 | 0.1356 | 0.0071 | 0.1374 | 0.0 | 0.0886 | 0.3347 | 0.0291 | 0.126 | 0.0 | 0.0 | 0.0011 | 0.0818 |
| 1.1796 | 102.9762 | 8650 | 2.5102 | 0.0298 | 0.0749 | 0.0177 | 0.004 | 0.0299 | 0.0 | 0.0099 | 0.0478 | 0.1347 | 0.0214 | 0.1358 | 0.0 | 0.089 | 0.3315 | 0.029 | 0.1242 | 0.0 | 0.0 | 0.0011 | 0.0829 |
| 1.4024 | 103.5714 | 8700 | 2.4970 | 0.0302 | 0.0762 | 0.0182 | 0.0026 | 0.0306 | 0.0 | 0.0149 | 0.052 | 0.1432 | 0.0071 | 0.1458 | 0.0 | 0.0871 | 0.3319 | 0.032 | 0.134 | 0.0007 | 0.0205 | 0.0012 | 0.0865 |
| 3.0784 | 104.1667 | 8750 | 2.5118 | 0.0315 | 0.0786 | 0.0186 | 0.0032 | 0.0318 | 0.0 | 0.0154 | 0.0531 | 0.1402 | 0.0143 | 0.1423 | 0.0 | 0.0919 | 0.3315 | 0.0325 | 0.1302 | 0.0007 | 0.0205 | 0.001 | 0.0788 |
| 1.6551 | 104.7619 | 8800 | 2.5130 | 0.0307 | 0.0757 | 0.0172 | 0.002 | 0.0309 | 0.0 | 0.0102 | 0.0481 | 0.1376 | 0.0036 | 0.1396 | 0.0 | 0.0911 | 0.3412 | 0.0306 | 0.1256 | 0.0 | 0.0 | 0.0011 | 0.0835 |
| 1.3568 | 105.3571 | 8850 | 2.4991 | 0.0314 | 0.0777 | 0.0195 | 0.0119 | 0.0316 | 0.0 | 0.0099 | 0.0477 | 0.1422 | 0.0214 | 0.1436 | 0.0 | 0.0933 | 0.3495 | 0.031 | 0.1288 | 0.0 | 0.0 | 0.0012 | 0.0906 |
| 1.6515 | 105.9524 | 8900 | 2.5070 | 0.0321 | 0.0822 | 0.0184 | 0.0119 | 0.0323 | 0.0 | 0.0152 | 0.0523 | 0.1455 | 0.0214 | 0.1475 | 0.0 | 0.0943 | 0.3435 | 0.032 | 0.1279 | 0.0008 | 0.0205 | 0.0013 | 0.09 |
| 0.9578 | 106.5476 | 8950 | 2.4925 | 0.0311 | 0.0781 | 0.0202 | 0.0 | 0.0313 | 0.0 | 0.0142 | 0.0508 | 0.1395 | 0.0 | 0.1421 | 0.0 | 0.0957 | 0.3463 | 0.0269 | 0.1135 | 0.0006 | 0.0205 | 0.001 | 0.0776 |
| 1.4401 | 107.1429 | 9000 | 2.4871 | 0.0316 | 0.0818 | 0.0184 | 0.004 | 0.032 | 0.0 | 0.015 | 0.0528 | 0.1443 | 0.0071 | 0.1471 | 0.0 | 0.0975 | 0.3486 | 0.0266 | 0.113 | 0.001 | 0.025 | 0.0013 | 0.0906 |
| 2.4228 | 107.7381 | 9050 | 2.4839 | 0.0318 | 0.0795 | 0.0195 | 0.0099 | 0.0321 | 0.0 | 0.0144 | 0.0513 | 0.1453 | 0.0179 | 0.1475 | 0.0 | 0.0949 | 0.3463 | 0.0301 | 0.1251 | 0.0008 | 0.0205 | 0.0013 | 0.0894 |
| 0.8218 | 108.3333 | 9100 | 2.5016 | 0.0323 | 0.0805 | 0.0198 | 0.0079 | 0.0325 | 0.0 | 0.0102 | 0.0483 | 0.1418 | 0.0143 | 0.1435 | 0.0 | 0.1009 | 0.3616 | 0.0269 | 0.1144 | 0.0 | 0.0 | 0.0013 | 0.0912 |
| 2.0528 | 108.9286 | 9150 | 2.4979 | 0.0321 | 0.0777 | 0.0205 | 0.0 | 0.0324 | 0.0 | 0.0134 | 0.051 | 0.1431 | 0.0 | 0.1458 | 0.0 | 0.1008 | 0.3574 | 0.0259 | 0.1144 | 0.0005 | 0.0159 | 0.0012 | 0.0847 |
| 1.5715 | 109.5238 | 9200 | 2.5221 | 0.0318 | 0.0794 | 0.0195 | 0.0 | 0.0321 | 0.0505 | 0.0133 | 0.0512 | 0.1421 | 0.0 | 0.1446 | 0.05 | 0.0976 | 0.3574 | 0.028 | 0.1121 | 0.0006 | 0.0159 | 0.0011 | 0.0829 |
| 1.0875 | 110.1190 | 9250 | 2.5094 | 0.0329 | 0.0818 | 0.0181 | 0.004 | 0.0334 | 0.0505 | 0.0144 | 0.0536 | 0.1488 | 0.0071 | 0.1515 | 0.05 | 0.0968 | 0.3546 | 0.0323 | 0.1233 | 0.0013 | 0.0295 | 0.0011 | 0.0876 |
| 1.9644 | 110.7143 | 9300 | 2.4954 | 0.0315 | 0.0808 | 0.0187 | 0.004 | 0.0319 | 0.0505 | 0.0139 | 0.0528 | 0.1473 | 0.0071 | 0.15 | 0.05 | 0.0924 | 0.3449 | 0.0314 | 0.127 | 0.0011 | 0.0273 | 0.0012 | 0.09 |
| 1.3941 | 111.3095 | 9350 | 2.5030 | 0.033 | 0.0844 | 0.0211 | 0.004 | 0.0335 | 0.0505 | 0.0136 | 0.0521 | 0.1492 | 0.0071 | 0.1519 | 0.05 | 0.0975 | 0.3532 | 0.0324 | 0.1312 | 0.0011 | 0.0273 | 0.0011 | 0.0853 |
| 1.9707 | 111.9048 | 9400 | 2.5016 | 0.0342 | 0.0876 | 0.0191 | 0.0079 | 0.035 | 0.0505 | 0.0143 | 0.0581 | 0.1539 | 0.0143 | 0.1568 | 0.05 | 0.1021 | 0.3551 | 0.0314 | 0.1279 | 0.0022 | 0.0432 | 0.0012 | 0.0894 |
| 1.3412 | 112.5 | 9450 | 2.5044 | 0.0345 | 0.0859 | 0.021 | 0.0079 | 0.0354 | 0.0 | 0.0138 | 0.0575 | 0.1536 | 0.0143 | 0.1565 | 0.0 | 0.1022 | 0.3588 | 0.0324 | 0.127 | 0.0023 | 0.0432 | 0.0011 | 0.0853 |
| 2.5679 | 113.0952 | 9500 | 2.5079 | 0.0345 | 0.0883 | 0.0202 | 0.004 | 0.0354 | 0.0505 | 0.014 | 0.0584 | 0.1531 | 0.0107 | 0.1561 | 0.05 | 0.1007 | 0.3546 | 0.0336 | 0.1293 | 0.0026 | 0.0432 | 0.0011 | 0.0853 |
| 1.3141 | 113.6905 | 9550 | 2.5066 | 0.034 | 0.0871 | 0.0196 | 0.004 | 0.0347 | 0.0505 | 0.0145 | 0.0585 | 0.1526 | 0.0071 | 0.1559 | 0.05 | 0.0994 | 0.3514 | 0.0332 | 0.1284 | 0.0025 | 0.0432 | 0.0011 | 0.0876 |
| 2.2506 | 114.2857 | 9600 | 2.4980 | 0.0341 | 0.085 | 0.0204 | 0.0053 | 0.0347 | 0.0505 | 0.015 | 0.059 | 0.1514 | 0.0143 | 0.154 | 0.05 | 0.1007 | 0.3542 | 0.0325 | 0.1298 | 0.0022 | 0.0386 | 0.0011 | 0.0829 |
| 1.3625 | 114.8810 | 9650 | 2.5007 | 0.0342 | 0.0821 | 0.0187 | 0.0053 | 0.0348 | 0.0 | 0.0162 | 0.0595 | 0.1511 | 0.0143 | 0.1538 | 0.0 | 0.0995 | 0.3481 | 0.0345 | 0.1358 | 0.0017 | 0.0386 | 0.0011 | 0.0818 |
| 1.2527 | 115.4762 | 9700 | 2.4987 | 0.0342 | 0.0831 | 0.0208 | 0.0053 | 0.0348 | 0.0 | 0.0136 | 0.0557 | 0.1496 | 0.0143 | 0.1521 | 0.0 | 0.0996 | 0.3491 | 0.0345 | 0.1344 | 0.0014 | 0.0295 | 0.0012 | 0.0853 |
| 1.5965 | 116.0714 | 9750 | 2.4983 | 0.033 | 0.0838 | 0.0193 | 0.004 | 0.0335 | 0.0505 | 0.0137 | 0.0528 | 0.1478 | 0.0107 | 0.1505 | 0.05 | 0.0952 | 0.3403 | 0.0343 | 0.134 | 0.0013 | 0.0318 | 0.0012 | 0.0853 |
| 1.1548 | 116.6667 | 9800 | 2.4955 | 0.0334 | 0.0823 | 0.0188 | 0.0053 | 0.0339 | 0.0505 | 0.0156 | 0.0542 | 0.1486 | 0.0143 | 0.1511 | 0.05 | 0.0969 | 0.3444 | 0.0342 | 0.133 | 0.0014 | 0.0364 | 0.001 | 0.0806 |
| 1.7837 | 117.2619 | 9850 | 2.4938 | 0.033 | 0.0803 | 0.0207 | 0.0053 | 0.0335 | 0.0505 | 0.0156 | 0.0539 | 0.1478 | 0.0143 | 0.1502 | 0.05 | 0.0966 | 0.3435 | 0.0333 | 0.1316 | 0.0012 | 0.0341 | 0.001 | 0.0818 |
| 0.9888 | 117.8571 | 9900 | 2.4916 | 0.0335 | 0.0804 | 0.0212 | 0.0053 | 0.034 | 0.0 | 0.0154 | 0.0547 | 0.1488 | 0.0143 | 0.1514 | 0.0 | 0.0978 | 0.3454 | 0.0337 | 0.1353 | 0.0013 | 0.0341 | 0.001 | 0.0806 |
| 2.682 | 118.4524 | 9950 | 2.4895 | 0.0337 | 0.0827 | 0.0208 | 0.0053 | 0.0344 | 0.0 | 0.0166 | 0.0563 | 0.1513 | 0.0143 | 0.1542 | 0.0 | 0.098 | 0.3463 | 0.0341 | 0.1335 | 0.0018 | 0.0432 | 0.0011 | 0.0824 |
| 1.6787 | 119.0476 | 10000 | 2.4932 | 0.0333 | 0.0821 | 0.02 | 0.0053 | 0.0338 | 0.0505 | 0.0161 | 0.0587 | 0.1495 | 0.0143 | 0.1522 | 0.05 | 0.0958 | 0.3407 | 0.0345 | 0.1335 | 0.0018 | 0.0409 | 0.0011 | 0.0829 |
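The Map, Map 50, and Map 75 columns above are COCO-style mean average precision; Map 50 and Map 75 count a detection as correct only when its intersection-over-union (IoU) with a ground-truth box reaches 0.5 or 0.75 respectively. A minimal IoU sketch for axis-aligned `(x0, y0, x1, y1)` boxes, for readers interpreting these columns:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x0, y0, x1, y1)."""
    ax0, ay0, ax1, ay1 = box_a
    bx0, by0, bx1, by1 = box_b
    # Overlap rectangle; width/height clamp to 0 when the boxes are disjoint.
    ix0, iy0 = max(ax0, bx0), max(ay0, by0)
    ix1, iy1 = min(ax1, bx1), min(ay1, by1)
    inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    area_a = (ax1 - ax0) * (ay1 - ay0)
    area_b = (bx1 - bx0) * (by1 - by0)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```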
### Framework versions
- Transformers 4.48.0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"recilia dorsalis",
"nephotettix malayanus",
"sogatella furcifera",
"nilaparvata lugens"
] |
Rodr16020/rt-detr-handwriten_text_latex |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
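Until an official snippet is added here, note that a common post-processing step for DETR-family detectors is converting the model's normalized `(center-x, center-y, width, height)` boxes into absolute pixel corners. A minimal sketch — the box format is an assumption; the checkpoint's image processor (e.g. its `post_process_object_detection` method) is the authoritative conversion:

```python
def cxcywh_to_xyxy(box, image_width, image_height):
    """Convert a normalized (cx, cy, w, h) box to absolute (x0, y0, x1, y1) pixels."""
    cx, cy, w, h = box
    x0 = (cx - w / 2) * image_width
    y0 = (cy - h / 2) * image_height
    x1 = (cx + w / 2) * image_width
    y1 = (cy + h / 2) * image_height
    return (x0, y0, x1, y1)
```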
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
FaridaElhusseiny/TATR_V2_26 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
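Until an official snippet is added here: detection heads of this family emit per-query class logits, and keeping only predictions whose top softmax probability clears a threshold is the usual first filtering step. A minimal, framework-free sketch (the 0.5 threshold is illustrative only):

```python
import math

def top_score(logits):
    """Softmax over one query's class logits; returns (best_index, best_prob)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # subtract max for numerical stability
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return best, probs[best]

def filter_queries(all_logits, threshold=0.5):
    """Keep (query_index, class_index, score) triples whose score clears the threshold."""
    kept = []
    for i, logits in enumerate(all_logits):
        cls, score = top_score(logits)
        if score >= threshold:
            kept.append((i, cls, score))
    return kept
```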
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6"
] |
dyland222/detr-coco-baseball_v2 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
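Until an official snippet is added here: with two classes that often overlap in frame (pitcher, hitter), duplicate boxes for the same player are common, and greedy non-maximum suppression is the standard cleanup. A minimal sketch (the IoU threshold is illustrative only):

```python
def iou(a, b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy NMS: keep the highest-scoring box, drop lower-scored overlaps."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) <= iou_threshold for j in keep):
            keep.append(i)
    return keep
```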
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"pitcher",
"hitter"
] |
cems-official/panels_detection_rtdetr_augmented |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# panels_detection_rtdetr_augmented
This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 13.1458
- Map: 0.2716
- Map 50: 0.3676
- Map 75: 0.2999
- Map Small: -1.0
- Map Medium: 0.339
- Map Large: 0.2963
- Mar 1: 0.4229
- Mar 10: 0.5653
- Mar 100: 0.5922
- Mar Small: -1.0
- Mar Medium: 0.4535
- Mar Large: 0.6263
- Map Radar (small): 0.1167
- Mar 100 Radar (small): 0.6875
- Map Ship management system (small): 0.615
- Mar 100 Ship management system (small): 0.8077
- Map Radar (large): 0.2059
- Mar 100 Radar (large): 0.5217
- Map Ship management system (large): 0.0695
- Mar 100 Ship management system (large): 0.3702
- Map Ship management system (top): 0.5528
- Mar 100 Ship management system (top): 0.8087
- Map Ecdis (large): 0.5496
- Mar 100 Ecdis (large): 0.9193
- Map Visual observation (small): 0.0041
- Mar 100 Visual observation (small): 0.1021
- Map Ecdis (small): 0.2262
- Mar 100 Ecdis (small): 0.9154
- Map Ship management system (table top): 0.3138
- Mar 100 Ship management system (table top): 0.5657
- Map Thruster control: 0.6685
- Mar 100 Thruster control: 0.8256
- Map Visual observation (left): 0.141
- Mar 100 Visual observation (left): 0.7186
- Map Visual observation (mid): 0.1513
- Mar 100 Visual observation (mid): 0.5087
- Map Visual observation (right): 0.0322
- Mar 100 Visual observation (right): 0.2434
- Map Bow thruster: 0.2794
- Mar 100 Bow thruster: 0.5724
- Map Me telegraph: 0.1478
- Mar 100 Me telegraph: 0.3154
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 10
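With `lr_scheduler_type: cosine`, the learning rate decays from its initial value toward zero along half a cosine over training. A minimal sketch of that shape (no warmup term, which the actual Trainer schedule may add):

```python
import math

def cosine_lr(step, total_steps, base_lr=1e-4):
    """Cosine-annealed learning rate: base_lr at step 0, ~0 at the final step."""
    progress = step / max(1, total_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```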
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Radar (small) | Mar 100 Radar (small) | Map Ship management system (small) | Mar 100 Ship management system (small) | Map Radar (large) | Mar 100 Radar (large) | Map Ship management system (large) | Mar 100 Ship management system (large) | Map Ship management system (top) | Mar 100 Ship management system (top) | Map Ecdis (large) | Mar 100 Ecdis (large) | Map Visual observation (small) | Mar 100 Visual observation (small) | Map Ecdis (small) | Mar 100 Ecdis (small) | Map Ship management system (table top) | Mar 100 Ship management system (table top) | Map Thruster control | Mar 100 Thruster control | Map Visual observation (left) | Mar 100 Visual observation (left) | Map Visual observation (mid) | Mar 100 Visual observation (mid) | Map Visual observation (right) | Mar 100 Visual observation (right) | Map Bow thruster | Mar 100 Bow thruster | Map Me telegraph | Mar 100 Me telegraph |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:----------------------------------:|:--------------------------------------:|:-----------------:|:---------------------:|:----------------------------------:|:--------------------------------------:|:--------------------------------:|:------------------------------------:|:-----------------:|:---------------------:|:------------------------------:|:----------------------------------:|:-----------------:|:---------------------:|:--------------------------------------:|:------------------------------------------:|:--------------------:|:------------------------:|:-----------------------------:|:---------------------------------:|:----------------------------:|:--------------------------------:|:------------------------------:|:----------------------------------:|:----------------:|:--------------------:|:----------------:|:--------------------:|
| 8.4513 | 1.0 | 397 | 10.3922 | 0.4094 | 0.514 | 0.4547 | -1.0 | 0.295 | 0.416 | 0.5205 | 0.671 | 0.6975 | -1.0 | 0.5291 | 0.7247 | 0.8486 | 0.9393 | 0.7263 | 0.8862 | 0.7814 | 0.914 | 0.7258 | 0.9289 | 0.7534 | 0.8481 | 0.5119 | 0.9035 | 0.0874 | 0.4833 | 0.0535 | 0.8423 | 0.1808 | 0.3943 | 0.303 | 0.7333 | 0.0478 | 0.7629 | 0.7613 | 0.8965 | 0.0059 | 0.2189 | 0.2985 | 0.4655 | 0.0559 | 0.2462 |
| 8.1253 | 2.0 | 794 | 10.2868 | 0.493 | 0.6032 | 0.5285 | -1.0 | 0.2459 | 0.5437 | 0.6029 | 0.7667 | 0.7839 | -1.0 | 0.5496 | 0.8401 | 0.8156 | 0.9446 | 0.7535 | 0.9169 | 0.6787 | 0.9171 | 0.7445 | 0.9653 | 0.6818 | 0.8558 | 0.7943 | 0.9561 | 0.0721 | 0.6708 | 0.4244 | 0.9192 | 0.413 | 0.5286 | 0.2675 | 0.5487 | 0.2213 | 0.93 | 0.7938 | 0.9617 | 0.1889 | 0.7434 | 0.3784 | 0.5379 | 0.1678 | 0.3615 |
| 8.0068 | 3.0 | 1191 | 11.7780 | 0.3272 | 0.4283 | 0.3611 | -1.0 | 0.2647 | 0.3723 | 0.4608 | 0.5782 | 0.5984 | -1.0 | 0.4623 | 0.6375 | 0.1775 | 0.5857 | 0.4329 | 0.6215 | 0.4254 | 0.7295 | 0.3831 | 0.6066 | 0.7085 | 0.8644 | 0.6623 | 0.9061 | 0.0002 | 0.0167 | 0.3838 | 0.8 | 0.3677 | 0.7657 | 0.6698 | 0.7949 | 0.1571 | 0.7557 | 0.1336 | 0.5904 | 0.0578 | 0.2528 | 0.2839 | 0.4862 | 0.064 | 0.2 |
| 7.6432 | 4.0 | 1588 | 12.1826 | 0.2727 | 0.3822 | 0.3009 | -1.0 | 0.2386 | 0.2872 | 0.4224 | 0.5885 | 0.6213 | -1.0 | 0.3722 | 0.6607 | 0.1547 | 0.7696 | 0.5703 | 0.8585 | 0.2211 | 0.6302 | 0.068 | 0.5826 | 0.6042 | 0.85 | 0.4937 | 0.9105 | 0.0039 | 0.1208 | 0.1132 | 0.9 | 0.5012 | 0.6629 | 0.5743 | 0.6385 | 0.1246 | 0.8343 | 0.0925 | 0.6009 | 0.0052 | 0.1321 | 0.3535 | 0.4862 | 0.21 | 0.3423 |
| 7.2118 | 5.0 | 1985 | 10.7370 | 0.422 | 0.523 | 0.4602 | -1.0 | 0.4067 | 0.4716 | 0.5546 | 0.7167 | 0.7475 | -1.0 | 0.5951 | 0.8004 | 0.2887 | 0.8054 | 0.6827 | 0.9092 | 0.4278 | 0.8651 | 0.6697 | 0.9248 | 0.729 | 0.8788 | 0.6685 | 0.9649 | 0.0397 | 0.3521 | 0.6509 | 0.9731 | 0.6702 | 0.7971 | 0.5996 | 0.8103 | 0.0968 | 0.8471 | 0.4106 | 0.7904 | 0.0739 | 0.4698 | 0.2219 | 0.5207 | 0.0993 | 0.3038 |
| 6.8503 | 6.0 | 2382 | 12.6236 | 0.3114 | 0.4103 | 0.3386 | -1.0 | 0.3392 | 0.3656 | 0.4614 | 0.6187 | 0.6432 | -1.0 | 0.5065 | 0.6996 | 0.2113 | 0.7643 | 0.6179 | 0.8831 | 0.1871 | 0.676 | 0.3889 | 0.8248 | 0.5547 | 0.7596 | 0.5439 | 0.9439 | 0.0034 | 0.0812 | 0.3491 | 0.9654 | 0.4448 | 0.6686 | 0.6626 | 0.7692 | 0.1653 | 0.7714 | 0.145 | 0.5113 | 0.0273 | 0.2792 | 0.1843 | 0.4621 | 0.1852 | 0.2885 |
| 6.5273 | 7.0 | 2779 | 12.6545 | 0.3121 | 0.4213 | 0.3466 | -1.0 | 0.3182 | 0.3481 | 0.4635 | 0.626 | 0.649 | -1.0 | 0.5283 | 0.6854 | 0.176 | 0.8125 | 0.6625 | 0.8738 | 0.2449 | 0.6605 | 0.1453 | 0.562 | 0.5223 | 0.8067 | 0.6556 | 0.9509 | 0.0134 | 0.1937 | 0.3682 | 0.9462 | 0.4038 | 0.6657 | 0.7023 | 0.8333 | 0.1228 | 0.6743 | 0.1926 | 0.6252 | 0.0452 | 0.2925 | 0.2814 | 0.5759 | 0.1451 | 0.2615 |
| 6.2721 | 8.0 | 3176 | 13.0793 | 0.2565 | 0.3545 | 0.2826 | -1.0 | 0.284 | 0.286 | 0.4092 | 0.552 | 0.5818 | -1.0 | 0.4221 | 0.6166 | 0.1159 | 0.6643 | 0.5472 | 0.7862 | 0.2094 | 0.5047 | 0.0775 | 0.395 | 0.5442 | 0.8135 | 0.5517 | 0.9009 | 0.0029 | 0.0708 | 0.2641 | 0.9269 | 0.2066 | 0.4657 | 0.6357 | 0.8077 | 0.0902 | 0.7071 | 0.1688 | 0.5617 | 0.0261 | 0.2208 | 0.2688 | 0.5483 | 0.1376 | 0.3538 |
| 6.1623 | 9.0 | 3573 | 13.0892 | 0.2651 | 0.3555 | 0.3017 | -1.0 | 0.3047 | 0.2866 | 0.4149 | 0.5729 | 0.5997 | -1.0 | 0.4693 | 0.6365 | 0.1156 | 0.7214 | 0.5878 | 0.8062 | 0.1642 | 0.5109 | 0.055 | 0.3678 | 0.53 | 0.8269 | 0.5833 | 0.9175 | 0.0049 | 0.1083 | 0.1511 | 0.9269 | 0.34 | 0.5771 | 0.7039 | 0.8744 | 0.11 | 0.7414 | 0.1774 | 0.5478 | 0.0305 | 0.2358 | 0.2654 | 0.5517 | 0.157 | 0.2808 |
| 6.1458 | 10.0 | 3970 | 13.1458 | 0.2716 | 0.3676 | 0.2999 | -1.0 | 0.339 | 0.2963 | 0.4229 | 0.5653 | 0.5922 | -1.0 | 0.4535 | 0.6263 | 0.1167 | 0.6875 | 0.615 | 0.8077 | 0.2059 | 0.5217 | 0.0695 | 0.3702 | 0.5528 | 0.8087 | 0.5496 | 0.9193 | 0.0041 | 0.1021 | 0.2262 | 0.9154 | 0.3138 | 0.5657 | 0.6685 | 0.8256 | 0.141 | 0.7186 | 0.1513 | 0.5087 | 0.0322 | 0.2434 | 0.2794 | 0.5724 | 0.1478 | 0.3154 |
### Framework versions
- Transformers 4.46.0
- Pytorch 2.5.0+cu121
- Datasets 3.0.2
- Tokenizers 0.20.1
| [
"radar (small)",
"ship management system (small)",
"radar (large)",
"ship management system (large)",
"ship management system (top)",
"ecdis (large)",
"visual observation (small)",
"ecdis (small)",
"ship management system (table top)",
"thruster control",
"visual observation (left)",
"visual observation (mid)",
"visual observation (right)",
"bow thruster",
"me telegraph"
] |
cems-official/panels_detection_rtdetr_augmented_consolidated_labels |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# panels_detection_rtdetr_augmented_consolidated_labels
This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 11.9166
- Map: 0.3323
- Map 50: 0.469
- Map 75: 0.3645
- Map Small: -1.0
- Map Medium: 0.3095
- Map Large: 0.4094
- Mar 1: 0.459
- Mar 10: 0.6541
- Mar 100: 0.7167
- Mar Small: -1.0
- Mar Medium: 0.5411
- Mar Large: 0.8481
- Map Radar: 0.3744
- Mar 100 Radar: 0.9049
- Map Ship management system: 0.4699
- Mar 100 Ship management system: 0.9591
- Map Ship management system (top): 0.4953
- Mar 100 Ship management system (top): 0.8538
- Map Ecdis: 0.3507
- Mar 100 Ecdis: 0.8893
- Map Visual observation: 0.2942
- Mar 100 Visual observation: 0.8507
- Map Ship management system (table top): 0.5411
- Mar 100 Ship management system (table top): 0.72
- Map Thruster control: 0.1788
- Mar 100 Thruster control: 0.4077
- Map Bow thruster: 0.156
- Mar 100 Bow thruster: 0.4034
- Map Me telegraph: 0.1302
- Mar 100 Me telegraph: 0.4615
- Classification Accuracy: 0.2282
- Classification Accuracy Ship management system: 0.2957
- Classification Accuracy Radar: 0.3297
- Classification Accuracy Visual observation: 0.2203
- Classification Accuracy Ship management system (table top): 0.0
- Classification Accuracy Thruster control: 0.0256
- Classification Accuracy Ship management system (top): 0.3173
- Classification Accuracy Ecdis: 0.1071
- Classification Accuracy Me telegraph: 0.1923
- Classification Accuracy Bow thruster: 0.069
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Radar | Mar 100 Radar | Map Ship management system | Mar 100 Ship management system | Map Ship management system (top) | Mar 100 Ship management system (top) | Map Ecdis | Mar 100 Ecdis | Map Visual observation | Mar 100 Visual observation | Map Ship management system (table top) | Mar 100 Ship management system (table top) | Map Thruster control | Mar 100 Thruster control | Map Bow thruster | Mar 100 Bow thruster | Map Me telegraph | Mar 100 Me telegraph | Classification Accuracy | Classification Accuracy Ship management system | Classification Accuracy Radar | Classification Accuracy Visual observation | Classification Accuracy Ship management system (table top) | Classification Accuracy Thruster control | Classification Accuracy Ship management system (top) | Classification Accuracy Ecdis | Classification Accuracy Me telegraph | Classification Accuracy Bow thruster |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:---------:|:-------------:|:--------------------------:|:------------------------------:|:--------------------------------:|:------------------------------------:|:---------:|:-------------:|:----------------------:|:--------------------------:|:--------------------------------------:|:------------------------------------------:|:--------------------:|:------------------------:|:----------------:|:--------------------:|:----------------:|:--------------------:|:-----------------------:|:----------------------------------------------:|:-----------------------------:|:------------------------------------------:|:----------------------------------------------------------:|:----------------------------------------:|:----------------------------------------------------:|:-----------------------------:|:------------------------------------:|:------------------------------------:|
| 12.1253 | 1.0 | 596 | 9.0098 | 0.4593 | 0.5254 | 0.4993 | -1.0 | 0.3057 | 0.5007 | 0.482 | 0.6769 | 0.6956 | -1.0 | 0.5412 | 0.799 | 0.7974 | 0.9308 | 0.8298 | 0.9366 | 0.5269 | 0.8221 | 0.8698 | 0.9736 | 0.581 | 0.9381 | 0.1451 | 0.5571 | 0.3128 | 0.6923 | 0.0391 | 0.2172 | 0.0319 | 0.1923 | 0.2592 | 0.3226 | 0.3027 | 0.3252 | 0.0 | 0.1282 | 0.125 | 0.2857 | 0.0 | 0.0 |
| 8.3909 | 2.0 | 1192 | 9.0147 | 0.4982 | 0.62 | 0.5363 | -1.0 | 0.3274 | 0.5545 | 0.5187 | 0.677 | 0.6981 | -1.0 | 0.5581 | 0.8106 | 0.7253 | 0.9259 | 0.8928 | 0.957 | 0.7477 | 0.8904 | 0.759 | 0.9264 | 0.5413 | 0.8951 | 0.3766 | 0.6171 | 0.2168 | 0.5308 | 0.0941 | 0.2483 | 0.1301 | 0.2923 | 0.2485 | 0.2796 | 0.2378 | 0.2937 | 0.0 | 0.0256 | 0.4423 | 0.2 | 0.0 | 0.0345 |
| 7.7341 | 3.0 | 1788 | 9.4848 | 0.5194 | 0.646 | 0.5676 | -1.0 | 0.4074 | 0.6135 | 0.5582 | 0.7028 | 0.7338 | -1.0 | 0.581 | 0.8242 | 0.7274 | 0.933 | 0.8448 | 0.9559 | 0.6896 | 0.8981 | 0.7131 | 0.9379 | 0.3997 | 0.8909 | 0.4943 | 0.6257 | 0.4133 | 0.6385 | 0.2247 | 0.4207 | 0.168 | 0.3038 | 0.2408 | 0.2419 | 0.2054 | 0.2727 | 0.0286 | 0.1795 | 0.2981 | 0.2571 | 0.3077 | 0.1379 |
| 7.6104 | 4.0 | 2384 | 9.4468 | 0.5195 | 0.6623 | 0.5763 | -1.0 | 0.4071 | 0.5821 | 0.5825 | 0.7279 | 0.7439 | -1.0 | 0.5779 | 0.8443 | 0.6493 | 0.9232 | 0.7005 | 0.9532 | 0.7667 | 0.875 | 0.6255 | 0.9329 | 0.5905 | 0.9101 | 0.4918 | 0.7057 | 0.4777 | 0.6949 | 0.2204 | 0.3655 | 0.1532 | 0.3346 | 0.2447 | 0.1882 | 0.2378 | 0.3811 | 0.0 | 0.1026 | 0.2981 | 0.1643 | 0.1923 | 0.0345 |
| 7.0844 | 5.0 | 2980 | 9.9452 | 0.5275 | 0.6615 | 0.5679 | -1.0 | 0.4377 | 0.632 | 0.5795 | 0.7384 | 0.7581 | -1.0 | 0.6037 | 0.8954 | 0.7249 | 0.9405 | 0.7164 | 0.9457 | 0.5673 | 0.8663 | 0.7485 | 0.9436 | 0.6044 | 0.9392 | 0.604 | 0.7086 | 0.3434 | 0.6256 | 0.2514 | 0.4379 | 0.1874 | 0.4154 | 0.2097 | 0.0968 | 0.2703 | 0.3252 | 0.0 | 0.0513 | 0.2019 | 0.2071 | 0.0769 | 0.0345 |
| 6.6803 | 6.0 | 3576 | 10.6003 | 0.4635 | 0.5961 | 0.5036 | -1.0 | 0.4156 | 0.5591 | 0.5808 | 0.7419 | 0.7748 | -1.0 | 0.6358 | 0.8924 | 0.5136 | 0.947 | 0.6338 | 0.9629 | 0.69 | 0.8904 | 0.6112 | 0.9407 | 0.3836 | 0.9066 | 0.6059 | 0.8029 | 0.3949 | 0.6513 | 0.1625 | 0.4483 | 0.1762 | 0.4231 | 0.2388 | 0.1774 | 0.427 | 0.2133 | 0.0286 | 0.0256 | 0.4038 | 0.1786 | 0.1538 | 0.0 |
| 6.4981 | 7.0 | 4172 | 11.3455 | 0.3103 | 0.4335 | 0.3555 | -1.0 | 0.333 | 0.3612 | 0.4548 | 0.6611 | 0.7139 | -1.0 | 0.567 | 0.8517 | 0.3993 | 0.8973 | 0.4322 | 0.9489 | 0.4434 | 0.8394 | 0.3526 | 0.9057 | 0.2785 | 0.8479 | 0.4733 | 0.7143 | 0.1746 | 0.5385 | 0.0939 | 0.3138 | 0.1448 | 0.4192 | 0.2311 | 0.2581 | 0.3622 | 0.2028 | 0.0 | 0.0256 | 0.2788 | 0.1857 | 0.1923 | 0.1379 |
| 6.1177 | 8.0 | 4768 | 11.4110 | 0.3684 | 0.5032 | 0.4021 | -1.0 | 0.3369 | 0.4837 | 0.4834 | 0.6829 | 0.7346 | -1.0 | 0.5818 | 0.8732 | 0.4138 | 0.9162 | 0.5081 | 0.9651 | 0.5771 | 0.8721 | 0.4373 | 0.9307 | 0.3317 | 0.8734 | 0.5112 | 0.6886 | 0.2239 | 0.5128 | 0.1511 | 0.3793 | 0.1609 | 0.4731 | 0.2437 | 0.3172 | 0.3568 | 0.2378 | 0.0 | 0.0256 | 0.2212 | 0.2071 | 0.1538 | 0.0345 |
| 5.9352 | 9.0 | 5364 | 11.8779 | 0.3299 | 0.4592 | 0.3628 | -1.0 | 0.3115 | 0.4131 | 0.4611 | 0.652 | 0.7139 | -1.0 | 0.5542 | 0.8584 | 0.351 | 0.9086 | 0.4956 | 0.9613 | 0.4915 | 0.8462 | 0.3509 | 0.8936 | 0.326 | 0.8598 | 0.4875 | 0.6714 | 0.1978 | 0.4564 | 0.1452 | 0.3621 | 0.1234 | 0.4654 | 0.2262 | 0.328 | 0.3243 | 0.2168 | 0.0286 | 0.0256 | 0.2308 | 0.15 | 0.0769 | 0.0345 |
| 5.8941 | 10.0 | 5960 | 11.9166 | 0.3323 | 0.469 | 0.3645 | -1.0 | 0.3095 | 0.4094 | 0.459 | 0.6541 | 0.7167 | -1.0 | 0.5411 | 0.8481 | 0.3744 | 0.9049 | 0.4699 | 0.9591 | 0.4953 | 0.8538 | 0.3507 | 0.8893 | 0.2942 | 0.8507 | 0.5411 | 0.72 | 0.1788 | 0.4077 | 0.156 | 0.4034 | 0.1302 | 0.4615 | 0.2282 | 0.2957 | 0.3297 | 0.2203 | 0.0 | 0.0256 | 0.3173 | 0.1071 | 0.1923 | 0.069 |
### Framework versions
- Transformers 4.46.0
- Pytorch 2.5.0+cu121
- Datasets 3.0.2
- Tokenizers 0.20.1
| [
"radar",
"ship management system",
"ship management system (top)",
"ecdis",
"visual observation",
"ship management system (table top)",
"thruster control",
"bow thruster",
"me telegraph"
] |
pneupane/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
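The AdamW settings listed above (betas=(0.9, 0.999), epsilon=1e-08, learning rate 1e-05) can be illustrated with a minimal, framework-free sketch of a single update step. This shows only the optimizer's math, not the Trainer's actual implementation; the function name and scalar-parameter form are hypothetical.

```python
def adamw_step(param, grad, m, v, t, lr=1e-5, betas=(0.9, 0.999),
               eps=1e-8, weight_decay=0.0):
    """One AdamW update for a single scalar parameter (sketch)."""
    # Exponential moving averages of the gradient and its square
    m = betas[0] * m + (1 - betas[0]) * grad
    v = betas[1] * v + (1 - betas[1]) * grad * grad
    # Bias correction for the zero-initialized moment estimates
    m_hat = m / (1 - betas[0] ** t)
    v_hat = v / (1 - betas[1] ** t)
    # Decoupled weight decay is what distinguishes AdamW from Adam
    param = param - lr * (m_hat / (v_hat ** 0.5 + eps) + weight_decay * param)
    return param, m, v
```

On the first step (t=1) with a unit gradient, the bias-corrected moments are both 1, so the step size is approximately lr (1e-05 here).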
### Training results
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
cems-official/panels_detection_rtdetr_r100_augmented |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# panels_detection_rtdetr_r100_augmented
This model is a fine-tuned version of [PekingU/rtdetr_r101vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r101vd_coco_o365) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 14.2566
- Map: 0.2175
- Map 50: 0.2801
- Map 75: 0.2339
- Map Small: -1.0
- Map Medium: 0.1861
- Map Large: 0.2334
- Mar 1: 0.3663
- Mar 10: 0.5553
- Mar 100: 0.5854
- Mar Small: -1.0
- Mar Medium: 0.329
- Mar Large: 0.6411
- Map Radar (small): 0.1125
- Mar 100 Radar (small): 0.4786
- Map Ship management system (small): 0.5197
- Mar 100 Ship management system (small): 0.8908
- Map Radar (large): 0.0338
- Mar 100 Radar (large): 0.2248
- Map Ship management system (large): 0.5831
- Mar 100 Ship management system (large): 0.8843
- Map Ship management system (top): 0.6613
- Mar 100 Ship management system (top): 0.8712
- Map Ecdis (large): 0.2557
- Mar 100 Ecdis (large): 0.7561
- Map Visual observation (small): 0.1058
- Mar 100 Visual observation (small): 0.1521
- Map Ecdis (small): 0.047
- Mar 100 Ecdis (small): 0.7654
- Map Ship management system (table top): 0.2346
- Mar 100 Ship management system (table top): 0.6286
- Map Thruster control: 0.1786
- Mar 100 Thruster control: 0.5436
- Map Visual observation (left): 0.0349
- Mar 100 Visual observation (left): 0.75
- Map Visual observation (mid): 0.2246
- Mar 100 Visual observation (mid): 0.6687
- Map Visual observation (right): 0.0052
- Mar 100 Visual observation (right): 0.4774
- Map Bow thruster: 0.2361
- Mar 100 Bow thruster: 0.4897
- Map Me telegraph: 0.03
- Mar 100 Me telegraph: 0.2
- Classification Accuracy: 0.0903
- Classification Accuracy Ship management system (small): 0.4154
- Classification Accuracy Radar (small): 0.0179
- Classification Accuracy Radar (large): 0.0
- Classification Accuracy Visual observation (left): 0.0429
- Classification Accuracy Ship management system (table top): 0.0
- Classification Accuracy Thruster control: 0.0256
- Classification Accuracy Visual observation (mid): 0.0522
- Classification Accuracy Ship management system (top): 0.3173
- Classification Accuracy Ship management system (large): 0.0413
- Classification Accuracy Ecdis (large): 0.0877
- Classification Accuracy Visual observation (right): 0.0
- Classification Accuracy Visual observation (small): 0.0208
- Classification Accuracy Me telegraph: 0.1923
- Classification Accuracy Bow thruster: 0.0345
- Classification Accuracy Ecdis (small): 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 10
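The cosine lr_scheduler_type above anneals the learning rate from its initial value (0.0001) toward zero over the course of training. A minimal sketch of one common formulation, assuming no warmup (the card does not specify one):

```python
import math

def cosine_lr(step, total_steps, base_lr=1e-4, min_lr=0.0):
    # Cosine annealing from base_lr at step 0 down to min_lr at total_steps
    progress = step / total_steps
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))
```

Halfway through training this yields exactly half the base learning rate.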
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Radar (small) | Mar 100 Radar (small) | Map Ship management system (small) | Mar 100 Ship management system (small) | Map Radar (large) | Mar 100 Radar (large) | Map Ship management system (large) | Mar 100 Ship management system (large) | Map Ship management system (top) | Mar 100 Ship management system (top) | Map Ecdis (large) | Mar 100 Ecdis (large) | Map Visual observation (small) | Mar 100 Visual observation (small) | Map Ecdis (small) | Mar 100 Ecdis (small) | Map Ship management system (table top) | Mar 100 Ship management system (table top) | Map Thruster control | Mar 100 Thruster control | Map Visual observation (left) | Mar 100 Visual observation (left) | Map Visual observation (mid) | Mar 100 Visual observation (mid) | Map Visual observation (right) | Mar 100 Visual observation (right) | Map Bow thruster | Mar 100 Bow thruster | Map Me telegraph | Mar 100 Me telegraph | Classification Accuracy | Classification Accuracy Ship management system (small) | Classification Accuracy Radar (small) | Classification Accuracy Radar (large) | Classification Accuracy Visual observation (left) | Classification Accuracy Ship management system (table top) | Classification Accuracy Thruster control | Classification Accuracy Visual observation (mid) | Classification Accuracy Ship management system (top) | Classification Accuracy Ship management system (large) | Classification Accuracy Ecdis (large) | Classification Accuracy Visual observation (right) | Classification Accuracy Visual observation (small) | Classification Accuracy Me telegraph | Classification Accuracy Bow thruster | Classification Accuracy Ecdis (small) |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:----------------------------------:|:--------------------------------------:|:-----------------:|:---------------------:|:----------------------------------:|:--------------------------------------:|:--------------------------------:|:------------------------------------:|:-----------------:|:---------------------:|:------------------------------:|:----------------------------------:|:-----------------:|:---------------------:|:--------------------------------------:|:------------------------------------------:|:--------------------:|:------------------------:|:-----------------------------:|:---------------------------------:|:----------------------------:|:--------------------------------:|:------------------------------:|:----------------------------------:|:----------------:|:--------------------:|:----------------:|:--------------------:|:-----------------------:|:------------------------------------------------------:|:-------------------------------------:|:-------------------------------------:|:-------------------------------------------------:|:----------------------------------------------------------:|:----------------------------------------:|:------------------------------------------------:|:----------------------------------------------------:|:------------------------------------------------------:|:-------------------------------------:|:--------------------------------------------------:|:--------------------------------------------------:|:------------------------------------:|:------------------------------------:|:-------------------------------------:|
| 16.0745 | 1.0 | 397 | 13.3384 | 0.2492 | 0.2799 | 0.2692 | -1.0 | 0.0781 | 0.2752 | 0.3739 | 0.5035 | 0.5179 | -1.0 | 0.1764 | 0.5709 | 0.3138 | 0.6625 | 0.2582 | 0.7415 | 0.6231 | 0.8806 | 0.7174 | 0.8901 | 0.1403 | 0.249 | 0.4495 | 0.8561 | 0.2673 | 0.5542 | 0.1286 | 0.7385 | 0.0287 | 0.04 | 0.0136 | 0.0769 | 0.1199 | 0.8529 | 0.6243 | 0.7522 | 0.0525 | 0.3849 | 0.0001 | 0.0276 | 0.0002 | 0.0615 | 0.1184 | 0.2462 | 0.1786 | 0.155 | 0.2714 | 0.0 | 0.0 | 0.2087 | 0.0096 | 0.124 | 0.1053 | 0.0 | 0.0417 | 0.0769 | 0.0345 | 0.0 |
| 10.6194 | 2.0 | 794 | 12.8017 | 0.2353 | 0.2753 | 0.2607 | -1.0 | 0.1056 | 0.2687 | 0.3513 | 0.5248 | 0.5727 | -1.0 | 0.2915 | 0.6338 | 0.2765 | 0.6393 | 0.2812 | 0.84 | 0.1872 | 0.5465 | 0.7308 | 0.9529 | 0.5067 | 0.7327 | 0.1323 | 0.7991 | 0.5556 | 0.7979 | 0.0885 | 0.6385 | 0.05 | 0.2286 | 0.2309 | 0.4897 | 0.0087 | 0.4971 | 0.4676 | 0.727 | 0.0033 | 0.5604 | 0.0105 | 0.1138 | 0.0 | 0.0269 | 0.1155 | 0.2154 | 0.125 | 0.0155 | 0.0571 | 0.0 | 0.0 | 0.1565 | 0.1538 | 0.2397 | 0.1228 | 0.0 | 0.1458 | 0.2308 | 0.0 | 0.0769 |
| 9.3747 | 3.0 | 1191 | 12.9274 | 0.2908 | 0.3358 | 0.3161 | -1.0 | 0.1738 | 0.3328 | 0.4501 | 0.6574 | 0.6929 | -1.0 | 0.4422 | 0.7565 | 0.4408 | 0.9125 | 0.3984 | 0.9077 | 0.2434 | 0.6674 | 0.8443 | 0.9603 | 0.5607 | 0.8712 | 0.0519 | 0.4114 | 0.558 | 0.7771 | 0.0484 | 0.7231 | 0.2593 | 0.6629 | 0.2785 | 0.641 | 0.0305 | 0.79 | 0.5335 | 0.887 | 0.0211 | 0.5679 | 0.0923 | 0.4138 | 0.0015 | 0.2 | 0.1097 | 0.2769 | 0.3036 | 0.0775 | 0.0286 | 0.1143 | 0.0513 | 0.1652 | 0.1827 | 0.0496 | 0.0175 | 0.0 | 0.1042 | 0.0385 | 0.2414 | 0.0385 |
| 8.5584 | 4.0 | 1588 | 13.1481 | 0.2582 | 0.3237 | 0.2803 | -1.0 | 0.1875 | 0.2811 | 0.3768 | 0.5958 | 0.6332 | -1.0 | 0.4125 | 0.6854 | 0.4776 | 0.8018 | 0.2294 | 0.8246 | 0.1273 | 0.4101 | 0.6917 | 0.8826 | 0.6564 | 0.8154 | 0.0791 | 0.6711 | 0.4123 | 0.7333 | 0.0095 | 0.6385 | 0.3179 | 0.5829 | 0.2624 | 0.5872 | 0.0081 | 0.6571 | 0.37 | 0.8322 | 0.0056 | 0.4981 | 0.211 | 0.4138 | 0.0149 | 0.15 | 0.0883 | 0.2154 | 0.3393 | 0.0233 | 0.0857 | 0.0857 | 0.0513 | 0.0522 | 0.2308 | 0.0165 | 0.0614 | 0.0 | 0.0 | 0.0 | 0.1379 | 0.0385 |
| 8.2229 | 5.0 | 1985 | 12.5362 | 0.3526 | 0.4073 | 0.3817 | -1.0 | 0.2086 | 0.3868 | 0.4674 | 0.6741 | 0.692 | -1.0 | 0.4149 | 0.7421 | 0.4901 | 0.9089 | 0.6574 | 0.9185 | 0.4427 | 0.6527 | 0.7716 | 0.9612 | 0.7027 | 0.9 | 0.1775 | 0.6921 | 0.4493 | 0.7292 | 0.0325 | 0.7308 | 0.3467 | 0.6486 | 0.2225 | 0.6 | 0.0557 | 0.8529 | 0.668 | 0.9348 | 0.0062 | 0.3491 | 0.2654 | 0.4172 | 0.0002 | 0.0846 | 0.1165 | 0.2769 | 0.3214 | 0.031 | 0.0286 | 0.1143 | 0.0256 | 0.1913 | 0.2115 | 0.1157 | 0.0614 | 0.0 | 0.0 | 0.0769 | 0.1724 | 0.0385 |
| 7.8006 | 6.0 | 2382 | 13.8322 | 0.2236 | 0.2836 | 0.2395 | -1.0 | 0.1455 | 0.246 | 0.3671 | 0.5454 | 0.5783 | -1.0 | 0.347 | 0.6441 | 0.2232 | 0.6786 | 0.4449 | 0.8677 | 0.0323 | 0.1093 | 0.5672 | 0.8942 | 0.6818 | 0.8529 | 0.1653 | 0.693 | 0.1408 | 0.3146 | 0.016 | 0.6923 | 0.1955 | 0.5657 | 0.209 | 0.5615 | 0.0235 | 0.7143 | 0.3935 | 0.7974 | 0.003 | 0.2811 | 0.216 | 0.4207 | 0.0417 | 0.2308 | 0.1078 | 0.4 | 0.1071 | 0.0078 | 0.1286 | 0.0571 | 0.0513 | 0.1391 | 0.2885 | 0.0744 | 0.0439 | 0.0 | 0.0 | 0.0385 | 0.1379 | 0.0 |
| 7.3399 | 7.0 | 2779 | 14.1298 | 0.2021 | 0.2472 | 0.2186 | -1.0 | 0.1629 | 0.2198 | 0.3594 | 0.5796 | 0.6109 | -1.0 | 0.4096 | 0.6685 | 0.1335 | 0.7286 | 0.5139 | 0.9077 | 0.046 | 0.2574 | 0.5499 | 0.9 | 0.6972 | 0.8846 | 0.2344 | 0.7825 | 0.006 | 0.1125 | 0.0324 | 0.6731 | 0.1763 | 0.5771 | 0.1288 | 0.5487 | 0.0376 | 0.8157 | 0.3048 | 0.7809 | 0.0048 | 0.4472 | 0.1457 | 0.4586 | 0.0202 | 0.2885 | 0.1068 | 0.4923 | 0.0357 | 0.0078 | 0.0857 | 0.0286 | 0.0769 | 0.0696 | 0.3462 | 0.0496 | 0.0877 | 0.0189 | 0.0 | 0.0769 | 0.069 | 0.0 |
| 6.9559 | 8.0 | 3176 | 13.9814 | 0.2519 | 0.3108 | 0.2773 | -1.0 | 0.2114 | 0.2722 | 0.3989 | 0.6308 | 0.6707 | -1.0 | 0.4594 | 0.719 | 0.1894 | 0.7804 | 0.5348 | 0.9077 | 0.042 | 0.2116 | 0.5718 | 0.8983 | 0.7154 | 0.8779 | 0.2398 | 0.8579 | 0.2397 | 0.5312 | 0.0347 | 0.7923 | 0.3936 | 0.7343 | 0.2106 | 0.6538 | 0.0319 | 0.7786 | 0.3276 | 0.7826 | 0.0047 | 0.4981 | 0.1981 | 0.5172 | 0.0441 | 0.2385 | 0.099 | 0.4308 | 0.0893 | 0.0078 | 0.0571 | 0.0857 | 0.0256 | 0.0783 | 0.2885 | 0.0661 | 0.0526 | 0.0 | 0.0 | 0.1923 | 0.069 | 0.0 |
| 6.7701 | 9.0 | 3573 | 14.2493 | 0.2282 | 0.2939 | 0.2495 | -1.0 | 0.1766 | 0.2523 | 0.367 | 0.5624 | 0.5931 | -1.0 | 0.4182 | 0.6438 | 0.151 | 0.5304 | 0.5124 | 0.8831 | 0.027 | 0.2225 | 0.5631 | 0.8835 | 0.6554 | 0.8673 | 0.2715 | 0.7456 | 0.1096 | 0.1667 | 0.0472 | 0.7423 | 0.2954 | 0.64 | 0.2526 | 0.6051 | 0.0401 | 0.7971 | 0.2058 | 0.6939 | 0.0047 | 0.4472 | 0.2381 | 0.4724 | 0.0483 | 0.2 | 0.0971 | 0.4462 | 0.0179 | 0.0 | 0.0286 | 0.0857 | 0.0513 | 0.0609 | 0.2885 | 0.0413 | 0.114 | 0.0189 | 0.0208 | 0.1923 | 0.0345 | 0.0 |
| 6.7179 | 10.0 | 3970 | 14.2566 | 0.2175 | 0.2801 | 0.2339 | -1.0 | 0.1861 | 0.2334 | 0.3663 | 0.5553 | 0.5854 | -1.0 | 0.329 | 0.6411 | 0.1125 | 0.4786 | 0.5197 | 0.8908 | 0.0338 | 0.2248 | 0.5831 | 0.8843 | 0.6613 | 0.8712 | 0.2557 | 0.7561 | 0.1058 | 0.1521 | 0.047 | 0.7654 | 0.2346 | 0.6286 | 0.1786 | 0.5436 | 0.0349 | 0.75 | 0.2246 | 0.6687 | 0.0052 | 0.4774 | 0.2361 | 0.4897 | 0.03 | 0.2 | 0.0903 | 0.4154 | 0.0179 | 0.0 | 0.0429 | 0.0 | 0.0256 | 0.0522 | 0.3173 | 0.0413 | 0.0877 | 0.0 | 0.0208 | 0.1923 | 0.0345 | 0.0 |
### Framework versions
- Transformers 4.46.0
- Pytorch 2.5.0+cu121
- Datasets 3.0.2
- Tokenizers 0.20.1
| [
"radar (small)",
"ship management system (small)",
"radar (large)",
"ship management system (large)",
"ship management system (top)",
"ecdis (large)",
"visual observation (small)",
"ecdis (small)",
"ship management system (table top)",
"thruster control",
"visual observation (left)",
"visual observation (mid)",
"visual observation (right)",
"bow thruster",
"me telegraph"
] |
braaibander/outputs |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# outputs
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 6
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.48.0
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"wheel_front_left",
"wheel_front_right",
"wheel_rear_left",
"wheel_rear_right",
"head_light_left",
"head_light_right",
"rear_light_left",
"rear_light_right",
"mirror_left",
"mirror_right",
"filler_cap",
"antenna",
"license_plate",
"license_plate_holder",
"brand_badge",
"roof_rack"
] |
magarcd/prueba |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# prueba
This model is a fine-tuned version of [magarcd/prueba](https://huggingface.co/magarcd/prueba) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"n/a",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"n/a",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"n/a",
"backpack",
"umbrella",
"n/a",
"n/a",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"n/a",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"n/a",
"dining table",
"n/a",
"n/a",
"toilet",
"n/a",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"n/a",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
scfive/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.49.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
magarcd/practica_2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# practica_2
This model is a fine-tuned version of [hustvl/yolos-tiny](https://huggingface.co/hustvl/yolos-tiny) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"banana",
"orange",
"apple"
] |
PekingU/rtdetr_v2_r18vd | ## RT-DETRv2
### **Overview**
The RT-DETRv2 model was proposed in [RT-DETRv2: Improved Baseline with Bag-of-Freebies for Real-Time Detection Transformer](https://arxiv.org/abs/2407.17140) by Wenyu Lv, Yian Zhao, Qinyao Chang, Kui Huang, Guanzhong Wang, Yi Liu. RT-DETRv2 refines RT-DETR by introducing selective multi-scale feature extraction, a discrete sampling operator for broader deployment compatibility, and improved training strategies like dynamic data augmentation and scale-adaptive hyperparameters.
These changes enhance flexibility and practicality while maintaining real-time performance.
This model was contributed by [@jadechoghari](https://x.com/jadechoghari) with the help of [@cyrilvallez](https://huggingface.co/cyrilvallez) and [@qubvel-hf](https://huggingface.co/qubvel-hf).
### **Performance**
RT-DETRv2 consistently outperforms its predecessor across all model sizes while maintaining the same real-time speeds.

### **How to use**
```python
import torch
import requests
from PIL import Image
from transformers import RTDetrV2ForObjectDetection, RTDetrImageProcessor
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)
image_processor = RTDetrImageProcessor.from_pretrained("PekingU/rtdetr_v2_r18vd")
model = RTDetrV2ForObjectDetection.from_pretrained("PekingU/rtdetr_v2_r18vd")
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([(image.height, image.width)]), threshold=0.5)
for result in results:
for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
score, label = score.item(), label_id.item()
box = [round(i, 2) for i in box.tolist()]
print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
```
cat: 0.97 [341.14, 25.11, 639.98, 372.89]
cat: 0.96 [12.78, 56.35, 317.67, 471.34]
remote: 0.95 [39.96, 73.12, 175.65, 117.44]
sofa: 0.86 [-0.11, 2.97, 639.89, 473.62]
sofa: 0.82 [-0.12, 1.78, 639.87, 473.52]
remote: 0.79 [333.65, 76.38, 370.69, 187.48]
```
### **Training**
RT-DETRv2 is trained on the COCO (Lin et al. [2014]) train2017 split and validated on the COCO val2017 dataset. We report the standard AP metric (averaged over uniformly sampled IoU thresholds ranging from 0.50 to 0.95 with a step size of 0.05), as well as AP50 on val2017, which is commonly used in real scenarios.
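The AP metric reported above matches predictions to ground truth by IoU at each threshold. As a reference, here is a minimal sketch of IoU for two axis-aligned (x_min, y_min, x_max, y_max) boxes — the same corner format returned by `post_process_object_detection` — assuming valid, non-degenerate boxes:

```python
def box_iou(a, b):
    # a, b: boxes as (x_min, y_min, x_max, y_max)
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    # Clamp to zero when the boxes do not overlap
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)
```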
### **Applications**
RT-DETRv2 is ideal for real-time object detection in diverse applications such as **autonomous driving**, **surveillance systems**, **robotics**, and **retail analytics**. Its enhanced flexibility and deployment-friendly design make it suitable for both edge devices and large-scale systems, ensuring high accuracy and speed in dynamic, real-world environments. | [
"person",
"bicycle",
"car",
"motorbike",
"aeroplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"sofa",
"pottedplant",
"bed",
"diningtable",
"toilet",
"tvmonitor",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
PekingU/rtdetr_v2_r34vd | ## RT-DETRv2
### **Overview**
The RT-DETRv2 model was proposed in [RT-DETRv2: Improved Baseline with Bag-of-Freebies for Real-Time Detection Transformer](https://arxiv.org/abs/2407.17140) by Wenyu Lv, Yian Zhao, Qinyao Chang, Kui Huang, Guanzhong Wang, Yi Liu. RT-DETRv2 refines RT-DETR by introducing selective multi-scale feature extraction, a discrete sampling operator for broader deployment compatibility, and improved training strategies like dynamic data augmentation and scale-adaptive hyperparameters.
These changes enhance flexibility and practicality while maintaining real-time performance.
This model was contributed by [@jadechoghari](https://x.com/jadechoghari) with the help of [@cyrilvallez](https://huggingface.co/cyrilvallez) and [@qubvel-hf](https://huggingface.co/qubvel-hf).
### **Performance**
RT-DETRv2 consistently outperforms its predecessor across all model sizes while maintaining the same real-time speeds.

### **How to use**
```python
import torch
import requests
from PIL import Image
from transformers import RTDetrV2ForObjectDetection, RTDetrImageProcessor
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)
image_processor = RTDetrImageProcessor.from_pretrained("PekingU/rtdetr_v2_r34vd")
model = RTDetrV2ForObjectDetection.from_pretrained("PekingU/rtdetr_v2_r34vd")
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([(image.height, image.width)]), threshold=0.5)
for result in results:
for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
score, label = score.item(), label_id.item()
box = [round(i, 2) for i in box.tolist()]
print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
```
cat: 0.97 [341.14, 25.11, 639.98, 372.89]
cat: 0.96 [12.78, 56.35, 317.67, 471.34]
remote: 0.95 [39.96, 73.12, 175.65, 117.44]
sofa: 0.86 [-0.11, 2.97, 639.89, 473.62]
sofa: 0.82 [-0.12, 1.78, 639.87, 473.52]
remote: 0.79 [333.65, 76.38, 370.69, 187.48]
```
### **Training**
RT-DETRv2 is trained on the COCO (Lin et al. [2014]) train2017 split and validated on the COCO val2017 dataset. We report the standard AP metric (averaged over uniformly sampled IoU thresholds ranging from 0.50 to 0.95 with a step size of 0.05), as well as AP50 on val2017, which is commonly used in real scenarios.
### **Applications**
RT-DETRv2 is ideal for real-time object detection in diverse applications such as **autonomous driving**, **surveillance systems**, **robotics**, and **retail analytics**. Its enhanced flexibility and deployment-friendly design make it suitable for both edge devices and large-scale systems, ensuring high accuracy and speed in dynamic, real-world environments. | [
"person",
"bicycle",
"car",
"motorbike",
"aeroplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"sofa",
"pottedplant",
"bed",
"diningtable",
"toilet",
"tvmonitor",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
PekingU/rtdetr_v2_r50vd | ## RT-DETRv2
### **Overview**
The RT-DETRv2 model was proposed in [RT-DETRv2: Improved Baseline with Bag-of-Freebies for Real-Time Detection Transformer](https://arxiv.org/abs/2407.17140) by Wenyu Lv, Yian Zhao, Qinyao Chang, Kui Huang, Guanzhong Wang, Yi Liu. RT-DETRv2 refines RT-DETR by introducing selective multi-scale feature extraction, a discrete sampling operator for broader deployment compatibility, and improved training strategies like dynamic data augmentation and scale-adaptive hyperparameters.
These changes enhance flexibility and practicality while maintaining real-time performance.
This model was contributed by [@jadechoghari](https://x.com/jadechoghari) with the help of [@cyrilvallez](https://huggingface.co/cyrilvallez) and [@qubvel-hf](https://huggingface.co/qubvel-hf).
### **Performance**
RT-DETRv2 consistently outperforms its predecessor across all model sizes while maintaining the same real-time speeds.

### **How to use**
```python
import torch
import requests
from PIL import Image
from transformers import RTDetrV2ForObjectDetection, RTDetrImageProcessor
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)
image_processor = RTDetrImageProcessor.from_pretrained("PekingU/rtdetr_v2_r50vd")
model = RTDetrV2ForObjectDetection.from_pretrained("PekingU/rtdetr_v2_r50vd")
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([(image.height, image.width)]), threshold=0.5)
for result in results:
for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
score, label = score.item(), label_id.item()
box = [round(i, 2) for i in box.tolist()]
print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
```
cat: 0.97 [341.14, 25.11, 639.98, 372.89]
cat: 0.96 [12.78, 56.35, 317.67, 471.34]
remote: 0.95 [39.96, 73.12, 175.65, 117.44]
sofa: 0.86 [-0.11, 2.97, 639.89, 473.62]
sofa: 0.82 [-0.12, 1.78, 639.87, 473.52]
remote: 0.79 [333.65, 76.38, 370.69, 187.48]
```
### **Training**
RT-DETRv2 is trained on the COCO (Lin et al. [2014]) train2017 split and validated on the COCO val2017 dataset. We report the standard AP metric (averaged over uniformly sampled IoU thresholds ranging from 0.50 to 0.95 with a step size of 0.05), as well as AP50 on val2017, which is commonly used in real scenarios.
### **Applications**
RT-DETRv2 is ideal for real-time object detection in diverse applications such as **autonomous driving**, **surveillance systems**, **robotics**, and **retail analytics**. Its enhanced flexibility and deployment-friendly design make it suitable for both edge devices and large-scale systems, ensuring high accuracy and speed in dynamic, real-world environments. | [
"person",
"bicycle",
"car",
"motorbike",
"aeroplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"sofa",
"pottedplant",
"bed",
"diningtable",
"toilet",
"tvmonitor",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
PekingU/rtdetr_v2_r101vd | ## RT-DETRv2
### **Overview**
The RT-DETRv2 model was proposed in [RT-DETRv2: Improved Baseline with Bag-of-Freebies for Real-Time Detection Transformer](https://arxiv.org/abs/2407.17140) by Wenyu Lv, Yian Zhao, Qinyao Chang, Kui Huang, Guanzhong Wang, Yi Liu. RT-DETRv2 refines RT-DETR by introducing selective multi-scale feature extraction, a discrete sampling operator for broader deployment compatibility, and improved training strategies like dynamic data augmentation and scale-adaptive hyperparameters.
These changes enhance flexibility and practicality while maintaining real-time performance.
This model was contributed by [@jadechoghari](https://x.com/jadechoghari) with the help of [@cyrilvallez](https://huggingface.co/cyrilvallez) and [@qubvel-hf](https://huggingface.co/qubvel-hf)
This is the HF transformers implementation for RT-DETRv2.
### **Performance**
RT-DETRv2 consistently outperforms its predecessor across all model sizes while maintaining the same real-time speeds.

### **How to use**
```python
import torch
import requests
from PIL import Image
from transformers import RTDetrV2ForObjectDetection, RTDetrImageProcessor
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)
image_processor = RTDetrImageProcessor.from_pretrained("PekingU/rtdetr_v2_r101vd")
model = RTDetrV2ForObjectDetection.from_pretrained("PekingU/rtdetr_v2_r101vd")
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([(image.height, image.width)]), threshold=0.5)

for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
```
cat: 0.97 [341.14, 25.11, 639.98, 372.89]
cat: 0.96 [12.78, 56.35, 317.67, 471.34]
remote: 0.95 [39.96, 73.12, 175.65, 117.44]
sofa: 0.86 [-0.11, 2.97, 639.89, 473.62]
sofa: 0.82 [-0.12, 1.78, 639.87, 473.52]
remote: 0.79 [333.65, 76.38, 370.69, 187.48]
```
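DETR-style detectors predict boxes as normalized `(cx, cy, w, h)`; `post_process_object_detection` converts them to absolute `(x1, y1, x2, y2)` corner coordinates like those printed above. A sketch of that conversion, as a hypothetical standalone helper (not a function from the library):

```python
def cxcywh_to_xyxy(box, height, width):
    """Convert one normalized (cx, cy, w, h) box to absolute (x1, y1, x2, y2).

    cx, cy, w, h are fractions of image size; height/width are in pixels.
    """
    cx, cy, w, h = box
    return [
        (cx - w / 2) * width,   # x1
        (cy - h / 2) * height,  # y1
        (cx + w / 2) * width,   # x2
        (cy + h / 2) * height,  # y2
    ]
```

This is why `target_sizes` must be passed to post-processing: without the original image height and width, the normalized boxes cannot be mapped back to pixel coordinates.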
### **Training**
RT-DETRv2 is trained on the COCO (Lin et al. [2014]) train2017 split and validated on the COCO val2017 dataset. We report the standard AP metrics (averaged over uniformly sampled IoU thresholds ranging from 0.50 to 0.95 with a step size of 0.05) and AP<sup>val</sup><sub>50</sub>, which is commonly used in real-world scenarios.
### **Applications**
RT-DETRv2 is ideal for real-time object detection in diverse applications such as **autonomous driving**, **surveillance systems**, **robotics**, and **retail analytics**. Its enhanced flexibility and deployment-friendly design make it suitable for both edge devices and large-scale systems, while ensuring high accuracy and speed in dynamic, real-world environments. | [
"person",
"bicycle",
"car",
"motorbike",
"aeroplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"sofa",
"pottedplant",
"bed",
"diningtable",
"toilet",
"tvmonitor",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
Pravallika6/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
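The linear scheduler listed above decays the learning rate from 1e-05 toward zero over training. As an illustration only (not the Trainer's exact implementation, which also supports warmup and per-step state), the schedule can be sketched as:

```python
def linear_lr(step, total_steps, base_lr=1e-5, warmup_steps=0):
    """Linear warmup (optional) then linear decay to zero, per optimizer step."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

With `num_epochs: 10` and `train_batch_size: 8`, `total_steps` is ten times the number of per-epoch batches, so the learning rate reaches half its base value at the midpoint of training.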
### Training results
### Framework versions
- Transformers 4.49.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
Sharadruthi-cg/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [Sharadruthi-cg/detr-resnet-50_finetuned_cppe5](https://huggingface.co/Sharadruthi-cg/detr-resnet-50_finetuned_cppe5) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
Dirrto/detaGrapePhenology2 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7"
] |
sumitD/detr-resnet-50-dc5-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4165
- Map: 0.0214
- Map 50: 0.0712
- Map 75: 0.0069
- Map Small: 0.0
- Map Medium: 0.0239
- Map Large: 0.0282
- Mar 1: 0.0082
- Mar 10: 0.0476
- Mar 100: 0.1041
- Mar Small: 0.0
- Mar Medium: 0.1227
- Mar Large: 0.1664
- Map Table: 0.0
- Mar 100 Table: 0.0
- Map Table column: 0.0622
- Mar 100 Table column: 0.2797
- Map Table column header: 0.0
- Mar 100 Table column header: 0.0
- Map Table projected row header: 0.0
- Mar 100 Table projected row header: 0.0
- Map Table row: 0.0661
- Mar 100 Table row: 0.3448
- Map Table spanning cell: 0.0
- Mar 100 Table spanning cell: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 1000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Table | Mar 100 Table | Map Table column | Mar 100 Table column | Map Table column header | Mar 100 Table column header | Map Table projected row header | Mar 100 Table projected row header | Map Table row | Mar 100 Table row | Map Table spanning cell | Mar 100 Table spanning cell |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:---------:|:-------------:|:----------------:|:--------------------:|:-----------------------:|:---------------------------:|:------------------------------:|:----------------------------------:|:-------------:|:-----------------:|:-----------------------:|:---------------------------:|
| 5.4882 | 0.9804 | 50 | 4.8033 | 0.0001 | 0.0003 | 0.0 | 0.0 | 0.0001 | 0.0005 | 0.0001 | 0.0008 | 0.0034 | 0.0 | 0.0015 | 0.0369 | 0.0 | 0.0 | 0.0 | 0.0063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0142 | 0.0 | 0.0 |
| 3.8236 | 1.9608 | 100 | 3.9505 | 0.0011 | 0.0052 | 0.0002 | 0.0 | 0.003 | 0.0008 | 0.0005 | 0.0046 | 0.0229 | 0.0 | 0.026 | 0.0669 | 0.0 | 0.0 | 0.0001 | 0.0004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0066 | 0.1369 | 0.0 | 0.0 |
| 4.1025 | 2.9412 | 150 | 3.3041 | 0.0021 | 0.0092 | 0.0004 | 0.0 | 0.0047 | 0.0013 | 0.0009 | 0.0081 | 0.0236 | 0.0 | 0.027 | 0.0647 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0123 | 0.1416 | 0.0 | 0.0 |
| 2.7239 | 3.9216 | 200 | 3.0607 | 0.0035 | 0.0131 | 0.001 | 0.0 | 0.0098 | 0.0007 | 0.0017 | 0.0098 | 0.0324 | 0.0 | 0.0388 | 0.05 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0212 | 0.1943 | 0.0 | 0.0 |
| 3.0469 | 4.9020 | 250 | 2.9285 | 0.0056 | 0.0211 | 0.0017 | 0.0 | 0.0115 | 0.0028 | 0.003 | 0.0137 | 0.0427 | 0.0 | 0.0493 | 0.1066 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0335 | 0.2562 | 0.0 | 0.0 |
| 3.3639 | 5.8824 | 300 | 2.8313 | 0.0094 | 0.0294 | 0.0035 | 0.0 | 0.015 | 0.0023 | 0.004 | 0.019 | 0.0553 | 0.0 | 0.0695 | 0.0414 | 0.0 | 0.0 | 0.0018 | 0.01 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0544 | 0.3219 | 0.0 | 0.0 |
| 3.8375 | 6.8627 | 350 | 2.7181 | 0.0119 | 0.0399 | 0.0038 | 0.0 | 0.0181 | 0.0061 | 0.0032 | 0.029 | 0.0654 | 0.0 | 0.0875 | 0.0491 | 0.0 | 0.0 | 0.0115 | 0.0701 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0597 | 0.3224 | 0.0 | 0.0 |
| 2.8298 | 7.8431 | 400 | 2.7012 | 0.0124 | 0.046 | 0.003 | 0.0 | 0.0208 | 0.0101 | 0.0043 | 0.033 | 0.0609 | 0.0 | 0.0784 | 0.0915 | 0.0 | 0.0 | 0.0264 | 0.1207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0478 | 0.2445 | 0.0 | 0.0 |
| 2.843 | 8.8235 | 450 | 2.6117 | 0.0182 | 0.0606 | 0.005 | 0.0 | 0.0235 | 0.0177 | 0.0088 | 0.0431 | 0.0866 | 0.0 | 0.1023 | 0.1309 | 0.0 | 0.0 | 0.0465 | 0.1889 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0627 | 0.3304 | 0.0 | 0.0 |
| 2.3581 | 9.8039 | 500 | 2.5716 | 0.0196 | 0.0663 | 0.0068 | 0.0 | 0.0255 | 0.0172 | 0.0072 | 0.046 | 0.0881 | 0.0 | 0.1027 | 0.1261 | 0.0 | 0.0 | 0.0451 | 0.2162 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0722 | 0.3125 | 0.0 | 0.0 |
| 2.9131 | 10.7843 | 550 | 2.5624 | 0.0176 | 0.0616 | 0.0048 | 0.0 | 0.0229 | 0.0252 | 0.0085 | 0.0455 | 0.0956 | 0.0 | 0.1121 | 0.1939 | 0.0 | 0.0 | 0.0571 | 0.3122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0487 | 0.2617 | 0.0 | 0.0 |
| 2.7047 | 11.7647 | 600 | 2.4800 | 0.0212 | 0.0711 | 0.0058 | 0.0 | 0.0271 | 0.0248 | 0.0067 | 0.0479 | 0.1031 | 0.0 | 0.1242 | 0.175 | 0.0 | 0.0 | 0.059 | 0.2982 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0682 | 0.3207 | 0.0 | 0.0 |
| 2.1587 | 12.7451 | 650 | 2.5019 | 0.0214 | 0.0702 | 0.0068 | 0.0 | 0.0261 | 0.0256 | 0.0082 | 0.051 | 0.1038 | 0.0 | 0.1233 | 0.1539 | 0.0 | 0.0 | 0.0643 | 0.2908 | 0.0 | 0.0 | 0.0 | 0.0 | 0.064 | 0.3319 | 0.0 | 0.0 |
| 2.6573 | 13.7255 | 700 | 2.4376 | 0.0228 | 0.0773 | 0.0071 | 0.0 | 0.0274 | 0.0301 | 0.0074 | 0.0508 | 0.1088 | 0.0 | 0.13 | 0.1836 | 0.0 | 0.0 | 0.0686 | 0.3155 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0679 | 0.3375 | 0.0 | 0.0 |
| 2.9342 | 14.7059 | 750 | 2.4634 | 0.0214 | 0.0696 | 0.0087 | 0.0 | 0.0248 | 0.0255 | 0.0078 | 0.0468 | 0.1049 | 0.0 | 0.1236 | 0.1631 | 0.0 | 0.0 | 0.0616 | 0.2771 | 0.0 | 0.0 | 0.0 | 0.0 | 0.067 | 0.3522 | 0.0 | 0.0 |
| 2.2428 | 15.6863 | 800 | 2.4568 | 0.0234 | 0.073 | 0.0091 | 0.0 | 0.0266 | 0.0268 | 0.007 | 0.0495 | 0.1062 | 0.0 | 0.1262 | 0.1349 | 0.0 | 0.0 | 0.0571 | 0.2686 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0834 | 0.3685 | 0.0 | 0.0 |
| 2.2051 | 16.6667 | 850 | 2.4144 | 0.0228 | 0.0711 | 0.0092 | 0.0 | 0.0254 | 0.031 | 0.009 | 0.0506 | 0.1082 | 0.0 | 0.127 | 0.1682 | 0.0 | 0.0 | 0.0648 | 0.3004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0721 | 0.3491 | 0.0 | 0.0 |
| 2.5091 | 17.6471 | 900 | 2.4344 | 0.0223 | 0.0725 | 0.0079 | 0.0 | 0.0252 | 0.0267 | 0.0076 | 0.0488 | 0.1057 | 0.0 | 0.1255 | 0.146 | 0.0 | 0.0 | 0.0607 | 0.2793 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0729 | 0.355 | 0.0 | 0.0 |
| 2.5155 | 18.6275 | 950 | 2.4167 | 0.0216 | 0.0715 | 0.0072 | 0.0 | 0.0237 | 0.0289 | 0.0082 | 0.0491 | 0.104 | 0.0 | 0.1217 | 0.1688 | 0.0 | 0.0 | 0.062 | 0.2823 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0673 | 0.342 | 0.0 | 0.0 |
| 2.1455 | 19.6078 | 1000 | 2.4165 | 0.0214 | 0.0712 | 0.0069 | 0.0 | 0.0239 | 0.0282 | 0.0082 | 0.0476 | 0.1041 | 0.0 | 0.1227 | 0.1664 | 0.0 | 0.0 | 0.0622 | 0.2797 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0661 | 0.3448 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"table",
"table column",
"table column header",
"table projected row header",
"table row",
"table spanning cell"
] |
sumitD/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [microsoft/table-transformer-structure-recognition-v1.1-all](https://huggingface.co/microsoft/table-transformer-structure-recognition-v1.1-all) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 3.3083
- eval_map: 0.0584
- eval_map_50: 0.1515
- eval_map_75: 0.0479
- eval_map_small: -1.0
- eval_map_medium: 0.007
- eval_map_large: 0.0646
- eval_mar_1: 0.0746
- eval_mar_10: 0.115
- eval_mar_100: 0.1545
- eval_mar_small: -1.0
- eval_mar_medium: 0.0439
- eval_mar_large: 0.1653
- eval_map_table: 0.2451
- eval_mar_100_table: 0.2882
- eval_map_table column: 0.0237
- eval_mar_100_table column: 0.1297
- eval_map_table column header: 0.0245
- eval_mar_100_table column header: 0.1224
- eval_map_table projected row header: 0.0003
- eval_mar_100_table projected row header: 0.0125
- eval_map_table row: 0.0254
- eval_mar_100_table row: 0.235
- eval_map_table spanning cell: 0.0311
- eval_mar_100_table spanning cell: 0.1393
- eval_runtime: 80.5383
- eval_samples_per_second: 0.633
- eval_steps_per_second: 0.087
- epoch: 1.0
- step: 22
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 10
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"table",
"table column",
"table column header",
"table projected row header",
"table row",
"table spanning cell"
] |
sumitD/table-transformer-structure-recognition-v1.1-all-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# table-transformer-structure-recognition-v1.1-all-finetuned
This model is a fine-tuned version of [sumitD/table-transformer-structure-recognition-v1.1-all-finetuned](https://huggingface.co/sumitD/table-transformer-structure-recognition-v1.1-all-finetuned) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1419
- Map: 0.9219
- Map 50: 0.966
- Map 75: 0.9496
- Map Small: -1.0
- Map Medium: 0.8782
- Map Large: 0.921
- Mar 1: 0.5537
- Mar 10: 0.9407
- Mar 100: 0.9694
- Mar Small: -1.0
- Mar Medium: 0.9079
- Mar Large: 0.9693
- Map Table: 0.9882
- Mar 100 Table: 0.9964
- Map Table column: 0.9732
- Mar 100 Table column: 0.9892
- Map Table column header: 0.9543
- Mar 100 Table column header: 0.9847
- Map Table projected row header: 0.8673
- Mar 100 Table projected row header: 0.964
- Map Table row: 0.9584
- Mar 100 Table row: 0.9838
- Map Table spanning cell: 0.7903
- Mar 100 Table spanning cell: 0.8983
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Table | Mar 100 Table | Map Table column | Mar 100 Table column | Map Table column header | Mar 100 Table column header | Map Table projected row header | Mar 100 Table projected row header | Map Table row | Mar 100 Table row | Map Table spanning cell | Mar 100 Table spanning cell |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:---------:|:-------------:|:----------------:|:--------------------:|:-----------------------:|:---------------------------:|:------------------------------:|:----------------------------------:|:-------------:|:-----------------:|:-----------------------:|:---------------------------:|
| 0.2338 | 1.0 | 23715 | 0.1991 | 0.8756 | 0.9505 | 0.9307 | -1.0 | 0.7912 | 0.8748 | 0.5395 | 0.9175 | 0.9471 | -1.0 | 0.8518 | 0.947 | 0.9844 | 0.9935 | 0.9582 | 0.981 | 0.9111 | 0.9647 | 0.7701 | 0.9364 | 0.9147 | 0.9577 | 0.7149 | 0.8496 |
| 0.2048 | 2.0 | 47430 | 0.1915 | 0.8827 | 0.9567 | 0.9384 | -1.0 | 0.8103 | 0.8823 | 0.54 | 0.9197 | 0.9498 | -1.0 | 0.8538 | 0.9499 | 0.9855 | 0.9944 | 0.9564 | 0.9819 | 0.9047 | 0.9527 | 0.7905 | 0.9437 | 0.9222 | 0.9651 | 0.7371 | 0.8607 |
| 0.1841 | 3.0 | 71145 | 0.1605 | 0.9087 | 0.9636 | 0.9467 | -1.0 | 0.8373 | 0.9077 | 0.548 | 0.933 | 0.9616 | -1.0 | 0.8868 | 0.9613 | 0.9836 | 0.9935 | 0.9703 | 0.9888 | 0.94 | 0.9771 | 0.8468 | 0.9545 | 0.9466 | 0.9781 | 0.765 | 0.8778 |
| 0.1914 | 4.0 | 94860 | 0.1496 | 0.9181 | 0.9652 | 0.9496 | -1.0 | 0.8741 | 0.917 | 0.552 | 0.9387 | 0.9678 | -1.0 | 0.9024 | 0.9676 | 0.9886 | 0.9968 | 0.9724 | 0.9886 | 0.9508 | 0.9824 | 0.8561 | 0.9628 | 0.9574 | 0.9829 | 0.7836 | 0.8934 |
| 0.1739 | 5.0 | 118575 | 0.1419 | 0.9219 | 0.966 | 0.9496 | -1.0 | 0.8782 | 0.921 | 0.5537 | 0.9407 | 0.9694 | -1.0 | 0.9079 | 0.9693 | 0.9882 | 0.9964 | 0.9732 | 0.9892 | 0.9543 | 0.9847 | 0.8673 | 0.964 | 0.9584 | 0.9838 | 0.7903 | 0.8983 |
### Framework versions
- Transformers 4.48.2
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"table",
"table column",
"table column header",
"table projected row header",
"table row",
"table spanning cell"
] |
ustc-community/dfine-xlarge-coco | ## D-FINE
### **Overview**
The D-FINE model was proposed in [D-FINE: Redefine Regression Task in DETRs as Fine-grained Distribution Refinement](https://arxiv.org/abs/2410.13842) by
Yansong Peng, Hebei Li, Peixi Wu, Yueyi Zhang, Xiaoyan Sun, Feng Wu
This model was contributed by [VladOS95-cyber](https://github.com/VladOS95-cyber) with the help of [@qubvel-hf](https://huggingface.co/qubvel-hf)
This is the HF transformers implementation for D-FINE
- `_coco` -> model trained on COCO
- `_obj365` -> model trained on Object365
- `_obj2coco` -> model trained on Object365 and then finetuned on COCO
### **Performance**
D-FINE is a powerful real-time object detector that achieves outstanding localization precision by redefining the bounding box regression task in DETR models. It comprises two key components: Fine-grained Distribution Refinement (FDR) and Global Optimal Localization Self-Distillation (GO-LSD).

### **How to use**
```python
import torch
import requests
from PIL import Image
from transformers import DFineForObjectDetection, AutoImageProcessor
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)
image_processor = AutoImageProcessor.from_pretrained("ustc-community/dfine-xlarge-coco")
model = DFineForObjectDetection.from_pretrained("ustc-community/dfine-xlarge-coco")
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.3)

for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
### **Training**
D-FINE is trained on the COCO (Lin et al. [2014]) train2017 split and validated on the COCO val2017 dataset. We report the standard AP metrics (averaged over uniformly sampled IoU thresholds ranging from 0.50 to 0.95 with a step size of 0.05) and AP<sup>val</sup><sub>5000</sub>, which is commonly used in real-world scenarios.
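The AP metrics above are averaged over IoU thresholds, where IoU measures the overlap between a predicted and a ground-truth box. As an illustrative sketch (not code from this repository), IoU for two `(x1, y1, x2, y2)` boxes can be computed as:

```python
def box_iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    # Intersection rectangle (clamped to zero when boxes do not overlap).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0
```

A prediction counts as a true positive at threshold t only when its IoU with a ground-truth box is at least t, so AP at 0.95 is far stricter than AP at 0.50.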
### **Applications**
D-FINE is ideal for real-time object detection in diverse applications such as **autonomous driving**, **surveillance systems**, **robotics**, and **retail analytics**. Its enhanced flexibility and deployment-friendly design make it suitable for both edge devices and large-scale systems, while ensuring high accuracy and speed in dynamic, real-world environments. | [
"person",
"bicycle",
"car",
"motorbike",
"aeroplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"sofa",
"pottedplant",
"bed",
"diningtable",
"toilet",
"tvmonitor",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
mcity-data-engine/fisheye8k_microsoft_conditional-detr-resnet-50 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fisheye8k_microsoft_conditional-detr-resnet-50
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4466
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 36
- mixed_precision_training: Native AMP
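The cosine scheduler listed above anneals the learning rate from 5e-05 to zero following a half-cosine curve. An illustrative sketch of that schedule (not the Trainer's exact implementation, which also supports warmup):

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-5):
    """Cosine annealing from base_lr at step 0 down to 0 at total_steps."""
    progress = min(1.0, step / max(1, total_steps))
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))
```

Unlike the linear schedule, cosine annealing decays slowly at the start and end of training and fastest in the middle, which often pairs well with long runs like the 36-epoch budget here.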
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.0211 | 1.0 | 5288 | 1.5012 |
| 0.9117 | 2.0 | 10576 | 1.4713 |
| 0.8595 | 3.0 | 15864 | 1.4364 |
| 0.7922 | 4.0 | 21152 | 1.5227 |
| 0.7764 | 5.0 | 26440 | 1.6631 |
| 0.7419 | 6.0 | 31728 | 1.4320 |
| 0.7132 | 7.0 | 37016 | 1.4661 |
| 0.6991 | 8.0 | 42304 | 1.4318 |
| 0.6585 | 9.0 | 47592 | 1.4069 |
| 0.6527 | 10.0 | 52880 | 1.4213 |
| 0.6191 | 11.0 | 58168 | 1.4144 |
| 0.6248 | 12.0 | 63456 | 1.3887 |
| 0.6085 | 13.0 | 68744 | 1.4053 |
| 0.582 | 14.0 | 74032 | 1.4418 |
| 0.5592 | 15.0 | 79320 | 1.5815 |
| 0.552 | 16.0 | 84608 | 1.4832 |
| 0.5233 | 17.0 | 89896 | 1.4466 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
Mcity Data Engine: https://arxiv.org/abs/2504.21614 | [
"bus",
"bike",
"car",
"pedestrian",
"truck"
] |
mcity-data-engine/fisheye8k_Omnifact_conditional-detr-resnet-101-dc5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fisheye8k_Omnifact_conditional-detr-resnet-101-dc5
This model is a fine-tuned version of [Omnifact/conditional-detr-resnet-101-dc5](https://huggingface.co/Omnifact/conditional-detr-resnet-101-dc5) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6175
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 36
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.0147 | 1.0 | 5288 | 1.5035 |
| 0.9144 | 2.0 | 10576 | 1.4618 |
| 0.8685 | 3.0 | 15864 | 1.3823 |
| 0.8375 | 4.0 | 21152 | 1.5128 |
| 0.7715 | 5.0 | 26440 | 1.5045 |
| 0.7664 | 6.0 | 31728 | 1.6914 |
| 0.7073 | 7.0 | 37016 | 1.6101 |
| 0.6966 | 8.0 | 42304 | 1.6175 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
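This run uses `lr_scheduler_type: cosine`. A minimal sketch of the shape that scheduler follows with no warmup (half-cosine decay from the base rate to zero over the configured run; the function name is ours):

```python
import math

def cosine_lr(step: int, total_steps: int, base_lr: float = 5e-5) -> float:
    """Half-cosine decay: base_lr at step 0, ~0 at total_steps."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 36 * 5288  # configured epochs x steps per epoch
for frac in (0.0, 0.25, 0.5, 1.0):
    step = int(frac * total)
    print(f"{frac:.2f} of training: lr = {cosine_lr(step, total):.2e}")
```

Note that runs stopped early (this one ends at epoch 8 of 36) never reach the tail of the curve.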
Mcity Data Engine: https://arxiv.org/abs/2504.21614 | [
"bus",
"bike",
"car",
"pedestrian",
"truck"
] |
jimmyRex/deta_AT_sliced_800_800 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
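The card leaves this section empty; below is a hedged sketch of how a Transformers object-detection checkpoint like this one is typically loaded and its raw detections filtered. The threshold, the helper names, and the pipeline call are illustrative assumptions, not taken from this model's actual usage instructions, and `run_detection` downloads weights on first use.

```python
def filter_detections(results, threshold=0.5):
    """Keep pipeline detections whose score clears the threshold."""
    return [r for r in results if r["score"] >= threshold]

def run_detection(image_path, repo_id="jimmyRex/deta_AT_sliced_800_800"):
    # Requires `pip install transformers torch pillow`; not run here.
    from transformers import pipeline
    detector = pipeline("object-detection", model=repo_id)
    return filter_detections(detector(image_path))

# Pure-helper demo (no model download needed):
fake = [{"label": "label_0", "score": 0.9}, {"label": "label_1", "score": 0.2}]
print(filter_detections(fake))  # keeps only the 0.9 detection
```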
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19",
"label_20",
"label_21",
"label_22",
"label_23",
"label_24",
"label_25",
"label_26",
"label_27",
"label_28"
] |
Gelmi/rtdetr-r50-ball-handler-finetune |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr-r50-ball-handler-finetune
This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 12.9794
- Map: 0.0003
- Map 50: 0.0015
- Map 75: 0.0
- Map Small: 0.0
- Map Medium: 0.0003
- Map Large: 0.003
- Mar 1: 0.0005
- Mar 10: 0.0067
- Mar 100: 0.0168
- Mar Small: 0.0
- Mar Medium: 0.0174
- Mar Large: 0.0194
- Map Player: 0.0006
- Mar 100 Player: 0.0245
- Map Ball-handler: 0.0
- Mar 100 Ball-handler: 0.009
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Player | Mar 100 Player | Map Ball-handler | Mar 100 Ball-handler |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:----------------:|:--------------------:|
| No log | 1.0 | 232 | 13.8912 | 0.0003 | 0.0013 | 0.0 | -1.0 | 0.0005 | 0.0001 | 0.0004 | 0.0091 | 0.0195 | -1.0 | 0.0304 | 0.0053 | 0.0004 | 0.0167 | 0.0002 | 0.0223 |
| No log | 2.0 | 464 | 11.6078 | 0.0002 | 0.001 | 0.0 | -1.0 | 0.0004 | 0.0 | 0.0011 | 0.0105 | 0.0186 | -1.0 | 0.029 | 0.0057 | 0.0003 | 0.0146 | 0.0001 | 0.0226 |
| 30.7716 | 3.0 | 696 | 11.0199 | 0.0003 | 0.0015 | 0.0 | -1.0 | 0.0007 | 0.0 | 0.0029 | 0.0103 | 0.0204 | -1.0 | 0.0332 | 0.0048 | 0.0003 | 0.0169 | 0.0003 | 0.0239 |
| 30.7716 | 4.0 | 928 | 10.8415 | 0.0002 | 0.0012 | 0.0 | -1.0 | 0.0005 | 0.0 | 0.0012 | 0.0122 | 0.0188 | -1.0 | 0.0304 | 0.005 | 0.0003 | 0.0146 | 0.0002 | 0.023 |
| 12.4494 | 5.0 | 1160 | 10.9467 | 0.0002 | 0.001 | 0.0 | -1.0 | 0.0004 | 0.0 | 0.0011 | 0.0106 | 0.0168 | -1.0 | 0.0282 | 0.0025 | 0.0003 | 0.0158 | 0.0001 | 0.0179 |
| 12.4494 | 6.0 | 1392 | 11.2840 | 0.0003 | 0.0013 | 0.0 | -1.0 | 0.0006 | 0.0 | 0.002 | 0.0114 | 0.0207 | -1.0 | 0.0348 | 0.0041 | 0.0003 | 0.0159 | 0.0002 | 0.0255 |
| 10.9839 | 7.0 | 1624 | 11.3669 | 0.0002 | 0.0011 | 0.0 | -1.0 | 0.0005 | 0.0 | 0.0017 | 0.012 | 0.02 | -1.0 | 0.0333 | 0.0035 | 0.0003 | 0.0171 | 0.0002 | 0.023 |
| 10.9839 | 8.0 | 1856 | 11.4150 | 0.0003 | 0.0012 | 0.0 | -1.0 | 0.0006 | 0.0 | 0.0007 | 0.0111 | 0.0194 | -1.0 | 0.0325 | 0.0031 | 0.0003 | 0.0164 | 0.0002 | 0.0223 |
| 10.0392 | 9.0 | 2088 | 11.2706 | 0.0002 | 0.0011 | 0.0 | -1.0 | 0.0005 | 0.0 | 0.0019 | 0.0118 | 0.0198 | -1.0 | 0.0337 | 0.0033 | 0.0003 | 0.0163 | 0.0002 | 0.0233 |
| 10.0392 | 10.0 | 2320 | 11.3800 | 0.0002 | 0.001 | 0.0 | -1.0 | 0.0005 | 0.0001 | 0.0019 | 0.0112 | 0.0189 | -1.0 | 0.0322 | 0.0029 | 0.0003 | 0.0155 | 0.0002 | 0.0223 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
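This run pairs a linear scheduler with 300 warmup steps. A small sketch of that shape (linear ramp to the base rate over the warmup, then linear decay to zero; the function name is ours, mirroring the behavior of Transformers' linear schedule with warmup):

```python
def linear_warmup_lr(step, total_steps, warmup_steps=300, base_lr=5e-5):
    """Linear ramp over warmup_steps, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

total = 2320  # 10 epochs x 232 steps, as in the table above
print(linear_warmup_lr(0, total))    # 0.0 at the first step
print(linear_warmup_lr(300, total))  # base rate at the end of warmup
print(linear_warmup_lr(total, total))  # 0.0 at the final step
```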
| [
"player",
  "ball-handler"
] |
mcity-data-engine/fisheye8k_facebook_detr-resnet-101-dc5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fisheye8k_facebook_detr-resnet-101-dc5
This model is a fine-tuned version of [facebook/detr-resnet-101-dc5](https://huggingface.co/facebook/detr-resnet-101-dc5) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 2.6740
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 36
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 2.1508 | 1.0 | 5288 | 2.4721 |
| 1.7423 | 2.0 | 10576 | 2.3029 |
| 1.5881 | 3.0 | 15864 | 2.2454 |
| 1.5641 | 4.0 | 21152 | 2.2912 |
| 1.4438 | 5.0 | 26440 | 2.2912 |
| 1.4503 | 6.0 | 31728 | 2.5056 |
| 1.3487 | 7.0 | 37016 | 2.5812 |
| 1.2777 | 8.0 | 42304 | 2.6740 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
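DETR-family models like this one predict boxes as normalized (cx, cy, w, h); converting them to pixel (x0, y0, x1, y1) corners is the usual post-processing step. A standalone sketch of that conversion (the image-processor classes in Transformers do this, among other things, internally):

```python
def cxcywh_to_xyxy(box, img_w, img_h):
    """Convert one normalized (cx, cy, w, h) box to pixel (x0, y0, x1, y1)."""
    cx, cy, w, h = box
    x0 = (cx - w / 2) * img_w
    y0 = (cy - h / 2) * img_h
    x1 = (cx + w / 2) * img_w
    y1 = (cy + h / 2) * img_h
    return (x0, y0, x1, y1)

# A centered box covering half the image in each dimension:
print(cxcywh_to_xyxy((0.5, 0.5, 0.5, 0.5), 1280, 960))  # (320.0, 240.0, 960.0, 720.0)
```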
Mcity Data Engine: https://arxiv.org/abs/2504.21614 | [
"bus",
"bike",
"car",
"pedestrian",
"truck"
] |
mcity-data-engine/fisheye8k_facebook_deformable-detr-detic |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fisheye8k_facebook_deformable-detr-detic
This model is a fine-tuned version of [facebook/deformable-detr-detic](https://huggingface.co/facebook/deformable-detr-detic) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1348
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 36
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 2.435 | 1.0 | 5288 | 2.4832 |
| 2.2626 | 2.0 | 10576 | 2.6324 |
| 1.8443 | 3.0 | 15864 | 2.1361 |
| 2.4834 | 4.0 | 21152 | 2.5269 |
| 2.3417 | 5.0 | 26440 | 2.5997 |
| 1.939 | 6.0 | 31728 | 2.1948 |
| 1.8384 | 7.0 | 37016 | 2.0057 |
| 1.7235 | 8.0 | 42304 | 2.0182 |
| 1.728 | 9.0 | 47592 | 1.9454 |
| 1.621 | 10.0 | 52880 | 1.9876 |
| 1.539 | 11.0 | 58168 | 1.8862 |
| 1.7229 | 12.0 | 63456 | 2.2071 |
| 1.9613 | 13.0 | 68744 | 2.5147 |
| 1.5238 | 14.0 | 74032 | 1.9836 |
| 1.5777 | 15.0 | 79320 | 2.0812 |
| 1.5963 | 16.0 | 84608 | 2.1348 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
Mcity Data Engine: https://arxiv.org/abs/2504.21614 | [
"bus",
"bike",
"car",
"pedestrian",
"truck"
] |
mcity-data-engine/fisheye8k_facebook_deformable-detr-box-supervised |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fisheye8k_facebook_deformable-detr-box-supervised
This model is a fine-tuned version of [facebook/deformable-detr-box-supervised](https://huggingface.co/facebook/deformable-detr-box-supervised) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 3.5085
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 36
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 2.551 | 1.0 | 5288 | 2.9515 |
| 2.4989 | 2.0 | 10576 | 2.9100 |
| 2.2642 | 3.0 | 15864 | 2.9280 |
| 5.2218 | 4.0 | 21152 | 7.3972 |
| 3.69 | 5.0 | 26440 | 2.8083 |
| 3.3462 | 6.0 | 31728 | 5.0976 |
| 2.5944 | 7.0 | 37016 | 4.1669 |
| 2.5709 | 8.0 | 42304 | 3.6812 |
| 2.6956 | 9.0 | 47592 | 4.0466 |
| 2.5195 | 10.0 | 52880 | 3.5085 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
Mcity Data Engine: https://arxiv.org/abs/2504.21614 | [
"bus",
"bike",
"car",
"pedestrian",
"truck"
] |
mcity-data-engine/fisheye8k_SenseTime_deformable-detr |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fisheye8k_SenseTime_deformable-detr
This model is a fine-tuned version of [SenseTime/deformable-detr](https://huggingface.co/SenseTime/deformable-detr) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2335
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 36
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 0.8943 | 1.0 | 5288 | 1.5330 |
| 0.7865 | 2.0 | 10576 | 1.4108 |
| 0.7238 | 3.0 | 15864 | 1.2660 |
| 0.6657 | 4.0 | 21152 | 1.2084 |
| 0.646 | 5.0 | 26440 | 1.2666 |
| 0.6269 | 6.0 | 31728 | 1.2555 |
| 0.6049 | 7.0 | 37016 | 1.2350 |
| 0.5894 | 8.0 | 42304 | 1.2940 |
| 0.5484 | 9.0 | 47592 | 1.2335 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
Mcity Data Engine: https://arxiv.org/abs/2504.21614 | [
"bus",
"bike",
"car",
"pedestrian",
"truck"
] |
mcity-data-engine/fisheye8k_jozhang97_deta-swin-large |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fisheye8k_jozhang97_deta-swin-large
This model is a fine-tuned version of [jozhang97/deta-swin-large](https://huggingface.co/jozhang97/deta-swin-large) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 17.9701
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 36
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 13.7551 | 1.0 | 5288 | 17.5573 |
| 12.6537 | 2.0 | 10576 | 17.4879 |
| 12.023 | 3.0 | 15864 | 17.6520 |
| 11.4167 | 4.0 | 21152 | 18.5138 |
| 10.8161 | 5.0 | 26440 | 17.7264 |
| 10.5346 | 6.0 | 31728 | 17.9145 |
| 10.1203 | 7.0 | 37016 | 17.9701 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
Mcity Data Engine: https://arxiv.org/abs/2504.21614 | [
"bus",
"bike",
"car",
"pedestrian",
"truck"
] |
mcity-data-engine/fisheye8k_jozhang97_deta-swin-large-o365 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fisheye8k_jozhang97_deta-swin-large-o365
This model is a fine-tuned version of [jozhang97/deta-swin-large-o365](https://huggingface.co/jozhang97/deta-swin-large-o365) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0247
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 36
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.3933 | 1.0 | 5288 | 1.6177 |
| 1.098 | 2.0 | 10576 | 1.2979 |
| 0.9565 | 3.0 | 15864 | 1.2650 |
| 0.8734 | 4.0 | 21152 | 1.2495 |
| 0.8196 | 5.0 | 26440 | 1.1328 |
| 0.7977 | 6.0 | 31728 | 1.3190 |
| 0.8448 | 7.0 | 37016 | 1.3999 |
| 0.7399 | 8.0 | 42304 | 1.3117 |
| 0.6325 | 9.0 | 47592 | 1.1202 |
| 0.621 | 10.0 | 52880 | 1.1707 |
| 0.7134 | 11.0 | 58168 | 1.2353 |
| 0.6425 | 12.0 | 63456 | 1.0416 |
| 0.5935 | 13.0 | 68744 | 0.9215 |
| 0.5798 | 14.0 | 74032 | 1.0827 |
| 0.5924 | 15.0 | 79320 | 1.0398 |
| 0.5559 | 16.0 | 84608 | 1.0112 |
| 0.5783 | 17.0 | 89896 | 1.0434 |
| 0.5536 | 18.0 | 95184 | 1.0247 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
Mcity Data Engine: https://arxiv.org/abs/2504.21614 | [
"bus",
"bike",
"car",
"pedestrian",
"truck"
] |
mcity-data-engine/fisheye8k_hustvl_yolos-base |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fisheye8k_hustvl_yolos-base
This model is a fine-tuned version of [hustvl/yolos-base](https://huggingface.co/hustvl/yolos-base) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 2.6653
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 0
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 36
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.9357 | 1.0 | 5288 | 2.7182 |
| 1.8095 | 2.0 | 10576 | 2.6559 |
| 1.6565 | 3.0 | 15864 | 2.5114 |
| 1.5912 | 4.0 | 21152 | 2.6875 |
| 1.6169 | 5.0 | 26440 | 2.7796 |
| 1.5075 | 6.0 | 31728 | 2.6514 |
| 1.4073 | 7.0 | 37016 | 2.7649 |
| 1.3617 | 8.0 | 42304 | 2.6653 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
Mcity Data Engine: https://arxiv.org/abs/2504.21614 | [
"bus",
"bike",
"car",
"pedestrian",
"truck"
] |
shawnmichael/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
ekagrag99/detr_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1822
- Map: 0.3434
- Map 50: 0.6475
- Map 75: 0.2912
- Map Small: 0.3895
- Map Medium: 0.2794
- Map Large: 0.5928
- Mar 1: 0.3097
- Mar 10: 0.5504
- Mar 100: 0.5751
- Mar Small: 0.4457
- Mar Medium: 0.4618
- Mar Large: 0.7823
- Map Coverall: 0.4775
- Mar 100 Coverall: 0.7385
- Map Face Shield: 0.3664
- Mar 100 Face Shield: 0.7118
- Map Gloves: 0.3001
- Mar 100 Gloves: 0.4492
- Map Goggles: 0.1632
- Mar 100 Goggles: 0.4897
- Map Mask: 0.4096
- Mar 100 Mask: 0.4863
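In the metrics above, `Map 50` and `Map 75` are average precision at IoU thresholds of 0.5 and 0.75, while `Map` averages over thresholds from 0.50 to 0.95. A minimal IoU sketch for axis-aligned (x0, y0, x1, y1) boxes, which is the quantity those thresholds are applied to:

```python
def iou(a, b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix1 - ix0) * max(0.0, iy1 - iy0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(iou((0, 0, 10, 10), (0, 0, 10, 10)))  # 1.0: a perfect match
print(iou((0, 0, 10, 10), (2, 0, 12, 10)))  # ~0.67: counts at IoU 0.5 but not 0.75
```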
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 10
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 85 | 3.3808 | 0.0017 | 0.0035 | 0.0013 | 0.0 | 0.0011 | 0.0048 | 0.0074 | 0.0197 | 0.1157 | 0.0 | 0.0252 | 0.1349 | 0.0056 | 0.4828 | 0.0001 | 0.0355 | 0.0006 | 0.0045 | 0.0001 | 0.05 | 0.0023 | 0.0058 |
| No log | 2.0 | 170 | 2.3354 | 0.0059 | 0.0177 | 0.0029 | 0.002 | 0.0047 | 0.0112 | 0.0171 | 0.0932 | 0.2142 | 0.0713 | 0.1013 | 0.2484 | 0.0126 | 0.665 | 0.0 | 0.0 | 0.0061 | 0.1691 | 0.0001 | 0.0083 | 0.0106 | 0.2286 |
| No log | 3.0 | 255 | 2.0756 | 0.0382 | 0.0794 | 0.0318 | 0.0095 | 0.0184 | 0.0287 | 0.0883 | 0.2023 | 0.2537 | 0.0842 | 0.1806 | 0.2639 | 0.1301 | 0.7039 | 0.0 | 0.0 | 0.0103 | 0.2399 | 0.0 | 0.0 | 0.0504 | 0.3249 |
| No log | 4.0 | 340 | 2.0584 | 0.0474 | 0.1014 | 0.0387 | 0.0049 | 0.0114 | 0.0472 | 0.0854 | 0.1807 | 0.2201 | 0.0773 | 0.1209 | 0.2114 | 0.2022 | 0.6511 | 0.0 | 0.0 | 0.0103 | 0.2253 | 0.0 | 0.0 | 0.0243 | 0.2243 |
| No log | 5.0 | 425 | 1.9921 | 0.0482 | 0.1147 | 0.0369 | 0.01 | 0.0359 | 0.0392 | 0.0768 | 0.175 | 0.2277 | 0.0786 | 0.1806 | 0.1977 | 0.168 | 0.6989 | 0.0 | 0.0 | 0.0063 | 0.1949 | 0.0 | 0.0 | 0.0665 | 0.2444 |
| 2.6018 | 6.0 | 510 | 1.9736 | 0.0565 | 0.1186 | 0.0472 | 0.0173 | 0.0172 | 0.0497 | 0.1035 | 0.1885 | 0.2321 | 0.1065 | 0.1353 | 0.207 | 0.2205 | 0.6289 | 0.0 | 0.0 | 0.0194 | 0.273 | 0.0 | 0.0 | 0.0427 | 0.2587 |
| 2.6018 | 7.0 | 595 | 1.8589 | 0.0719 | 0.1624 | 0.0566 | 0.0183 | 0.0469 | 0.08 | 0.1186 | 0.2145 | 0.2532 | 0.064 | 0.179 | 0.2641 | 0.2502 | 0.6817 | 0.0063 | 0.029 | 0.0161 | 0.2517 | 0.0 | 0.0 | 0.0871 | 0.3037 |
| 2.6018 | 8.0 | 680 | 1.9134 | 0.0691 | 0.1429 | 0.0613 | 0.0302 | 0.0488 | 0.0664 | 0.1055 | 0.2109 | 0.235 | 0.0719 | 0.1561 | 0.2691 | 0.251 | 0.6528 | 0.001 | 0.021 | 0.0215 | 0.2225 | 0.0 | 0.0 | 0.0718 | 0.2788 |
| 2.6018 | 9.0 | 765 | 1.7886 | 0.1028 | 0.2122 | 0.0858 | 0.0201 | 0.054 | 0.1169 | 0.1442 | 0.2536 | 0.2822 | 0.0883 | 0.1797 | 0.3063 | 0.3617 | 0.6939 | 0.027 | 0.1613 | 0.0222 | 0.2551 | 0.0037 | 0.0104 | 0.0994 | 0.2905 |
| 2.6018 | 10.0 | 850 | 1.6817 | 0.1135 | 0.2311 | 0.1002 | 0.0256 | 0.0846 | 0.1063 | 0.1576 | 0.2748 | 0.3051 | 0.0412 | 0.2279 | 0.3475 | 0.3796 | 0.7344 | 0.0452 | 0.2161 | 0.0418 | 0.2809 | 0.0002 | 0.0063 | 0.1006 | 0.2878 |
| 2.6018 | 11.0 | 935 | 1.6006 | 0.1391 | 0.287 | 0.1155 | 0.0413 | 0.0785 | 0.1895 | 0.1755 | 0.3133 | 0.3397 | 0.1245 | 0.2614 | 0.4026 | 0.4634 | 0.7378 | 0.0706 | 0.2774 | 0.0425 | 0.3112 | 0.0025 | 0.025 | 0.1166 | 0.3471 |
| 1.8243 | 12.0 | 1020 | 1.6841 | 0.1332 | 0.2725 | 0.1113 | 0.0276 | 0.0543 | 0.1786 | 0.1706 | 0.3011 | 0.3159 | 0.0898 | 0.1761 | 0.4149 | 0.4214 | 0.7039 | 0.0525 | 0.229 | 0.0573 | 0.2742 | 0.0007 | 0.0417 | 0.1339 | 0.3307 |
| 1.8243 | 13.0 | 1105 | 1.5423 | 0.1664 | 0.3193 | 0.14 | 0.0784 | 0.1282 | 0.2017 | 0.2082 | 0.3567 | 0.3799 | 0.143 | 0.3117 | 0.4624 | 0.4832 | 0.7517 | 0.0636 | 0.3129 | 0.0754 | 0.3455 | 0.0215 | 0.1042 | 0.1886 | 0.3852 |
| 1.8243 | 14.0 | 1190 | 1.5372 | 0.1588 | 0.3226 | 0.1323 | 0.0513 | 0.0851 | 0.2026 | 0.1903 | 0.3321 | 0.3581 | 0.1392 | 0.2621 | 0.4049 | 0.4929 | 0.7467 | 0.0755 | 0.3032 | 0.0533 | 0.3309 | 0.0052 | 0.0833 | 0.1671 | 0.3265 |
| 1.8243 | 15.0 | 1275 | 1.5211 | 0.1724 | 0.3442 | 0.1384 | 0.0743 | 0.0986 | 0.2413 | 0.2149 | 0.3603 | 0.3804 | 0.15 | 0.2532 | 0.4776 | 0.481 | 0.7239 | 0.104 | 0.3145 | 0.0663 | 0.3528 | 0.0221 | 0.1688 | 0.1885 | 0.3418 |
| 1.8243 | 16.0 | 1360 | 1.4867 | 0.18 | 0.3647 | 0.1508 | 0.0638 | 0.1425 | 0.2643 | 0.2216 | 0.3898 | 0.4102 | 0.1252 | 0.3264 | 0.5482 | 0.5158 | 0.7661 | 0.0737 | 0.2855 | 0.0655 | 0.3275 | 0.0252 | 0.3021 | 0.2198 | 0.3698 |
| 1.8243 | 17.0 | 1445 | 1.4620 | 0.1878 | 0.3758 | 0.1547 | 0.0912 | 0.112 | 0.2571 | 0.2305 | 0.3948 | 0.4164 | 0.1873 | 0.2896 | 0.5613 | 0.4989 | 0.7539 | 0.074 | 0.329 | 0.0766 | 0.3225 | 0.042 | 0.3021 | 0.2476 | 0.3746 |
| 1.5166 | 18.0 | 1530 | 1.4209 | 0.1951 | 0.393 | 0.1661 | 0.0762 | 0.1327 | 0.2942 | 0.2479 | 0.4108 | 0.4321 | 0.1495 | 0.3271 | 0.5797 | 0.5089 | 0.7556 | 0.0801 | 0.3161 | 0.0869 | 0.3517 | 0.0602 | 0.3625 | 0.2395 | 0.3746 |
| 1.5166 | 19.0 | 1615 | 1.4394 | 0.1778 | 0.3697 | 0.1503 | 0.0828 | 0.0939 | 0.2597 | 0.2284 | 0.3856 | 0.4056 | 0.1251 | 0.2911 | 0.5356 | 0.5115 | 0.7333 | 0.0681 | 0.3323 | 0.0819 | 0.3292 | 0.0281 | 0.3 | 0.1993 | 0.3333 |
| 1.5166 | 20.0 | 1700 | 1.4189 | 0.1843 | 0.3738 | 0.1451 | 0.0553 | 0.1156 | 0.2765 | 0.2448 | 0.3994 | 0.4273 | 0.1161 | 0.338 | 0.5356 | 0.5098 | 0.7567 | 0.0647 | 0.3177 | 0.0832 | 0.364 | 0.0373 | 0.3458 | 0.2266 | 0.3524 |
| 1.5166 | 21.0 | 1785 | 1.4192 | 0.209 | 0.4249 | 0.1637 | 0.0943 | 0.1349 | 0.2795 | 0.2483 | 0.4171 | 0.443 | 0.1505 | 0.3501 | 0.5744 | 0.5389 | 0.7511 | 0.1055 | 0.3629 | 0.1052 | 0.3545 | 0.049 | 0.3854 | 0.2462 | 0.3608 |
| 1.5166 | 22.0 | 1870 | 1.3759 | 0.2164 | 0.424 | 0.1731 | 0.1037 | 0.133 | 0.3088 | 0.2593 | 0.4166 | 0.4373 | 0.2054 | 0.3142 | 0.5445 | 0.5466 | 0.7411 | 0.1236 | 0.3532 | 0.1138 | 0.3618 | 0.039 | 0.3521 | 0.2589 | 0.3783 |
| 1.5166 | 23.0 | 1955 | 1.3995 | 0.2144 | 0.4247 | 0.1782 | 0.1065 | 0.1411 | 0.3036 | 0.2741 | 0.4258 | 0.4447 | 0.1657 | 0.3348 | 0.588 | 0.5209 | 0.7478 | 0.0766 | 0.3532 | 0.1387 | 0.35 | 0.0599 | 0.4 | 0.2762 | 0.3725 |
| 1.3335 | 24.0 | 2040 | 1.3985 | 0.2096 | 0.4236 | 0.1725 | 0.066 | 0.1721 | 0.3041 | 0.2663 | 0.412 | 0.4366 | 0.127 | 0.3592 | 0.5603 | 0.5068 | 0.7589 | 0.0967 | 0.3274 | 0.1321 | 0.3287 | 0.0602 | 0.4021 | 0.2519 | 0.3661 |
| 1.3335 | 25.0 | 2125 | 1.3437 | 0.2262 | 0.4365 | 0.1996 | 0.116 | 0.195 | 0.3267 | 0.2729 | 0.4326 | 0.4501 | 0.2003 | 0.3913 | 0.5684 | 0.5386 | 0.7611 | 0.098 | 0.3677 | 0.1355 | 0.3562 | 0.0823 | 0.3812 | 0.2764 | 0.3841 |
| 1.3335 | 26.0 | 2210 | 1.3394 | 0.2273 | 0.4402 | 0.1994 | 0.1313 | 0.1695 | 0.3223 | 0.2697 | 0.4286 | 0.4491 | 0.1838 | 0.3522 | 0.603 | 0.552 | 0.7683 | 0.0979 | 0.3403 | 0.1437 | 0.373 | 0.0706 | 0.4021 | 0.2724 | 0.3619 |
| 1.3335 | 27.0 | 2295 | 1.3610 | 0.2457 | 0.4792 | 0.209 | 0.1341 | 0.1636 | 0.3294 | 0.2733 | 0.4292 | 0.443 | 0.1716 | 0.352 | 0.5693 | 0.5724 | 0.7478 | 0.106 | 0.3597 | 0.163 | 0.3691 | 0.1107 | 0.3604 | 0.2763 | 0.3778 |
| 1.3335 | 28.0 | 2380 | 1.3306 | 0.2382 | 0.4613 | 0.204 | 0.094 | 0.1493 | 0.355 | 0.2771 | 0.4543 | 0.4772 | 0.1796 | 0.375 | 0.6233 | 0.5641 | 0.7572 | 0.1047 | 0.3597 | 0.1508 | 0.3933 | 0.0708 | 0.4688 | 0.3004 | 0.4069 |
| 1.3335 | 29.0 | 2465 | 1.3467 | 0.2329 | 0.4588 | 0.1867 | 0.0908 | 0.1585 | 0.3364 | 0.2731 | 0.4368 | 0.4614 | 0.2115 | 0.3807 | 0.5879 | 0.5528 | 0.7506 | 0.0869 | 0.3855 | 0.1569 | 0.364 | 0.0976 | 0.4187 | 0.2701 | 0.3884 |
| 1.2098 | 30.0 | 2550 | 1.3040 | 0.2405 | 0.4684 | 0.1988 | 0.0967 | 0.1648 | 0.3661 | 0.2916 | 0.4535 | 0.4745 | 0.1839 | 0.3898 | 0.6246 | 0.5601 | 0.7489 | 0.1124 | 0.3823 | 0.165 | 0.377 | 0.0865 | 0.4646 | 0.2782 | 0.4 |
| 1.2098 | 31.0 | 2635 | 1.3243 | 0.2303 | 0.4668 | 0.1886 | 0.1071 | 0.1712 | 0.3252 | 0.2839 | 0.4507 | 0.4746 | 0.1955 | 0.3807 | 0.639 | 0.5402 | 0.7428 | 0.1085 | 0.3694 | 0.15 | 0.391 | 0.0889 | 0.4771 | 0.2639 | 0.3926 |
| 1.2098 | 32.0 | 2720 | 1.3032 | 0.2439 | 0.4722 | 0.1977 | 0.1023 | 0.1608 | 0.3506 | 0.2867 | 0.4453 | 0.47 | 0.2125 | 0.3839 | 0.597 | 0.5685 | 0.7556 | 0.1089 | 0.3774 | 0.1535 | 0.3747 | 0.0978 | 0.4354 | 0.2906 | 0.4069 |
| 1.2098 | 33.0 | 2805 | 1.2945 | 0.2491 | 0.4852 | 0.2148 | 0.1169 | 0.1539 | 0.3552 | 0.2932 | 0.4555 | 0.4754 | 0.1981 | 0.3771 | 0.6219 | 0.5846 | 0.755 | 0.1052 | 0.3984 | 0.1647 | 0.3747 | 0.0977 | 0.4396 | 0.2935 | 0.4095 |
| 1.2098 | 34.0 | 2890 | 1.3078 | 0.256 | 0.4961 | 0.2202 | 0.1505 | 0.1884 | 0.3566 | 0.2939 | 0.4495 | 0.4667 | 0.1925 | 0.3871 | 0.6014 | 0.563 | 0.7472 | 0.1068 | 0.3613 | 0.1649 | 0.3663 | 0.1464 | 0.4479 | 0.2989 | 0.4106 |
| 1.2098 | 35.0 | 2975 | 1.2884 | 0.2613 | 0.5125 | 0.2244 | 0.096 | 0.1776 | 0.3651 | 0.293 | 0.448 | 0.4621 | 0.1787 | 0.4024 | 0.6022 | 0.5816 | 0.7583 | 0.1365 | 0.3742 | 0.1676 | 0.3674 | 0.1175 | 0.3979 | 0.3034 | 0.4127 |
| 1.0602 | 36.0 | 3060 | 1.2815 | 0.2613 | 0.4981 | 0.2183 | 0.1174 | 0.18 | 0.3633 | 0.2981 | 0.4457 | 0.4618 | 0.1747 | 0.3881 | 0.5815 | 0.5799 | 0.7667 | 0.1394 | 0.371 | 0.1584 | 0.3702 | 0.1318 | 0.3979 | 0.2969 | 0.4032 |
| 1.0602 | 37.0 | 3145 | 1.2907 | 0.2499 | 0.4974 | 0.2101 | 0.1331 | 0.1679 | 0.3634 | 0.2973 | 0.4557 | 0.4728 | 0.1949 | 0.3897 | 0.6137 | 0.5517 | 0.7444 | 0.1308 | 0.3984 | 0.1616 | 0.3854 | 0.1044 | 0.4229 | 0.3009 | 0.4127 |
| 1.0602 | 38.0 | 3230 | 1.2805 | 0.2602 | 0.52 | 0.2108 | 0.1393 | 0.1888 | 0.3606 | 0.2966 | 0.4592 | 0.4762 | 0.1966 | 0.402 | 0.6078 | 0.5761 | 0.7556 | 0.1332 | 0.3742 | 0.1802 | 0.3809 | 0.1077 | 0.4542 | 0.3037 | 0.4164 |
| 1.0602 | 39.0 | 3315 | 1.2698 | 0.261 | 0.494 | 0.2242 | 0.14 | 0.1957 | 0.3475 | 0.2992 | 0.4674 | 0.4858 | 0.2292 | 0.4155 | 0.5978 | 0.5924 | 0.7567 | 0.1052 | 0.3903 | 0.1789 | 0.4056 | 0.1183 | 0.4479 | 0.31 | 0.4286 |
| 1.0602 | 40.0 | 3400 | 1.2609 | 0.2775 | 0.5241 | 0.242 | 0.1488 | 0.2006 | 0.3761 | 0.3033 | 0.4667 | 0.4845 | 0.216 | 0.4314 | 0.6122 | 0.5969 | 0.7522 | 0.1477 | 0.3984 | 0.2019 | 0.4096 | 0.1377 | 0.45 | 0.3033 | 0.4122 |
| 1.0602 | 41.0 | 3485 | 1.2572 | 0.2727 | 0.515 | 0.2386 | 0.1282 | 0.1983 | 0.3711 | 0.3018 | 0.459 | 0.4749 | 0.202 | 0.4185 | 0.6023 | 0.5941 | 0.7528 | 0.1339 | 0.3823 | 0.1862 | 0.3899 | 0.1427 | 0.4396 | 0.3066 | 0.4101 |
| 0.9738 | 42.0 | 3570 | 1.2510 | 0.2736 | 0.5131 | 0.2353 | 0.1534 | 0.1923 | 0.3905 | 0.3077 | 0.4753 | 0.4862 | 0.2183 | 0.412 | 0.6298 | 0.6055 | 0.7628 | 0.1376 | 0.3871 | 0.1884 | 0.3876 | 0.1201 | 0.4729 | 0.3161 | 0.4206 |
| 0.9738 | 43.0 | 3655 | 1.2695 | 0.2762 | 0.5243 | 0.2381 | 0.1287 | 0.1941 | 0.3775 | 0.3089 | 0.4632 | 0.4778 | 0.1805 | 0.4073 | 0.6117 | 0.6051 | 0.7678 | 0.1495 | 0.4048 | 0.1927 | 0.3781 | 0.1232 | 0.4208 | 0.3105 | 0.4175 |
| 0.9738 | 44.0 | 3740 | 1.2572 | 0.2734 | 0.5133 | 0.2436 | 0.1276 | 0.1833 | 0.3786 | 0.3056 | 0.4713 | 0.488 | 0.1853 | 0.4108 | 0.6138 | 0.6044 | 0.7628 | 0.1442 | 0.4113 | 0.1882 | 0.3831 | 0.1252 | 0.4688 | 0.3051 | 0.4143 |
| 0.9738 | 45.0 | 3825 | 1.2702 | 0.2776 | 0.5164 | 0.249 | 0.1282 | 0.2003 | 0.3858 | 0.3099 | 0.4648 | 0.4802 | 0.1957 | 0.4204 | 0.5994 | 0.6041 | 0.7656 | 0.1466 | 0.3806 | 0.1912 | 0.386 | 0.1381 | 0.4542 | 0.3081 | 0.4148 |
| 0.9738 | 46.0 | 3910 | 1.2658 | 0.2784 | 0.5212 | 0.2503 | 0.1457 | 0.1963 | 0.3952 | 0.3164 | 0.4727 | 0.4861 | 0.2016 | 0.4174 | 0.6137 | 0.6031 | 0.7722 | 0.1508 | 0.3968 | 0.1914 | 0.3792 | 0.1414 | 0.4708 | 0.3053 | 0.4116 |
| 0.9738 | 47.0 | 3995 | 1.2724 | 0.2805 | 0.5217 | 0.2477 | 0.1289 | 0.2009 | 0.3915 | 0.3092 | 0.4748 | 0.487 | 0.2106 | 0.4097 | 0.6215 | 0.5991 | 0.7672 | 0.1565 | 0.4048 | 0.1881 | 0.373 | 0.1551 | 0.4771 | 0.3038 | 0.4127 |
| 0.9058 | 48.0 | 4080 | 1.2603 | 0.2761 | 0.5235 | 0.2386 | 0.1343 | 0.196 | 0.376 | 0.304 | 0.4708 | 0.4827 | 0.2198 | 0.4073 | 0.5984 | 0.5976 | 0.7622 | 0.1582 | 0.421 | 0.1824 | 0.3826 | 0.14 | 0.4354 | 0.3024 | 0.4122 |
| 0.9058 | 49.0 | 4165 | 1.2626 | 0.2787 | 0.5264 | 0.2444 | 0.1316 | 0.1989 | 0.3758 | 0.3088 | 0.4785 | 0.4893 | 0.199 | 0.419 | 0.6087 | 0.5961 | 0.765 | 0.1573 | 0.4242 | 0.189 | 0.3921 | 0.1455 | 0.45 | 0.3058 | 0.4153 |
| 0.9058 | 50.0 | 4250 | 1.2636 | 0.2812 | 0.5274 | 0.242 | 0.1369 | 0.202 | 0.3818 | 0.3077 | 0.4751 | 0.4872 | 0.2071 | 0.4203 | 0.6132 | 0.6019 | 0.7678 | 0.1587 | 0.4161 | 0.1883 | 0.3792 | 0.1509 | 0.4604 | 0.3062 | 0.4127 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
ustc-community/dfine-small-coco | ## D-FINE
### **Overview**
The D-FINE model was proposed in [D-FINE: Redefine Regression Task in DETRs as Fine-grained Distribution Refinement](https://arxiv.org/abs/2410.13842) by
Yansong Peng, Hebei Li, Peixi Wu, Yueyi Zhang, Xiaoyan Sun, and Feng Wu.
This model was contributed by [VladOS95-cyber](https://github.com/VladOS95-cyber) with the help of [@qubvel-hf](https://huggingface.co/qubvel-hf).
This is the HF Transformers implementation of D-FINE. Checkpoint suffixes indicate the training data:
- `_coco` -> model trained on COCO
- `_obj365` -> model trained on Objects365
- `_obj2coco` -> model trained on Objects365 and then fine-tuned on COCO
### **Performance**
D-FINE is a powerful real-time object detector that achieves outstanding localization precision by redefining the bounding-box regression task in DETR models. It comprises two key components: Fine-grained Distribution Refinement (FDR) and Global Optimal Localization Self-Distillation (GO-LSD).

### **How to use**
```python
import torch
import requests
from PIL import Image
from transformers import DFineForObjectDetection, AutoImageProcessor

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

image_processor = AutoImageProcessor.from_pretrained("ustc-community/dfine-small-coco")
model = DFineForObjectDetection.from_pretrained("ustc-community/dfine-small-coco")

inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Rescale boxes back to the original image size and keep detections above the threshold
results = image_processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.3
)

for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
### **Training**
D-FINE is trained on the COCO (Lin et al. [2014]) train2017 split and validated on the COCO val2017 split. We report the standard AP metrics (averaged over uniformly sampled IoU thresholds from 0.50 to 0.95 with a step size of 0.05), as well as APval5000 (AP over the 5,000-image val2017 split), which is commonly used in real-world scenarios.
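As a plain-Python illustration of the averaging described above (not the official COCO evaluation code from `pycocotools`), the mAP metric is just the mean of per-threshold AP values over ten uniformly sampled IoU thresholds; `ap_at` below is a hypothetical stand-in for a real per-threshold AP computation:

```python
# Sketch of COCO-style mAP: average AP over IoU thresholds 0.50:0.05:0.95.

def iou_thresholds(start=0.50, stop=0.95, step=0.05):
    """Uniformly sampled IoU thresholds, inclusive of both endpoints."""
    n = round((stop - start) / step) + 1
    return [round(start + i * step, 2) for i in range(n)]

def coco_map(ap_at):
    """Mean of AP over the standard 10 IoU thresholds."""
    ts = iou_thresholds()
    return sum(ap_at(t) for t in ts) / len(ts)

print(iou_thresholds())  # 10 thresholds: [0.5, 0.55, ..., 0.95]
# Toy AP curve that decays as the IoU threshold gets stricter:
print(round(coco_map(lambda t: 1.0 - t), 3))  # → 0.275
```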
### **Applications**
D-FINE is ideal for real-time object detection in diverse applications such as **autonomous driving**, **surveillance systems**, **robotics**, and **retail analytics**. Its enhanced flexibility and deployment-friendly design make it suitable for both edge devices and large-scale systems, and ensure high accuracy and speed in dynamic, real-world environments. | [
"person",
"bicycle",
"car",
"motorbike",
"aeroplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"sofa",
"pottedplant",
"bed",
"diningtable",
"toilet",
"tvmonitor",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
TheRomanFour/rtdetr-v2-r50-cppe5-finetune-2 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr-v2-r50-cppe5-finetune-2
This model is a fine-tuned version of [PekingU/rtdetr_v2_r50vd](https://huggingface.co/PekingU/rtdetr_v2_r50vd) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 10.6611
- Map: 0.4652
- Map 50: 0.7392
- Map 75: 0.4913
- Map Small: 0.4144
- Map Medium: 0.408
- Map Large: 0.6877
- Mar 1: 0.3462
- Mar 10: 0.5905
- Mar 100: 0.6213
- Mar Small: 0.5031
- Mar Medium: 0.5941
- Mar Large: 0.797
- Map Coverall: 0.5169
- Mar 100 Coverall: 0.7308
- Map Face Shield: 0.6256
- Mar 100 Face Shield: 0.7647
- Map Gloves: 0.3657
- Mar 100 Gloves: 0.4695
- Map Goggles: 0.3236
- Mar 100 Goggles: 0.569
- Map Mask: 0.4943
- Mar 100 Mask: 0.5725
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 40
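As a standalone sketch (not the Trainer's actual implementation), the `linear` scheduler with 300 warmup steps implied by the hyperparameters above ramps the learning rate from 0 to its peak over the warmup, then decays it linearly to 0 over the remaining steps; the 4280 total steps come from 40 epochs at 107 steps each, as shown in the results table:

```python
# Linear warmup + linear decay schedule, mirroring transformers'
# lr_scheduler_type="linear" with lr_scheduler_warmup_steps=300.

BASE_LR = 5e-05
WARMUP_STEPS = 300
TOTAL_STEPS = 4280  # 40 epochs x 107 steps/epoch

def lr_at(step):
    """Learning rate at a given optimizer step (0-indexed)."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS
    remaining = TOTAL_STEPS - step
    return BASE_LR * max(0.0, remaining / (TOTAL_STEPS - WARMUP_STEPS))

print(lr_at(0))     # 0.0 — warmup starts from zero
print(lr_at(300))   # 5e-05 — peak LR right after warmup
print(lr_at(4280))  # 0.0 — fully decayed at the final step
```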
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 30.1839 | 0.016 | 0.0345 | 0.0134 | 0.0007 | 0.0053 | 0.0226 | 0.0455 | 0.1408 | 0.211 | 0.0084 | 0.1212 | 0.4057 | 0.0711 | 0.5662 | 0.0016 | 0.1152 | 0.0014 | 0.0906 | 0.0008 | 0.1738 | 0.005 | 0.1093 |
| No log | 2.0 | 214 | 16.7728 | 0.1009 | 0.1942 | 0.0938 | 0.0547 | 0.0805 | 0.1953 | 0.1867 | 0.3758 | 0.4332 | 0.1908 | 0.3709 | 0.6932 | 0.1882 | 0.677 | 0.032 | 0.3532 | 0.0513 | 0.3313 | 0.0052 | 0.3646 | 0.228 | 0.44 |
| No log | 3.0 | 321 | 12.7316 | 0.1804 | 0.3533 | 0.1597 | 0.0691 | 0.1641 | 0.3172 | 0.231 | 0.4267 | 0.4555 | 0.1823 | 0.4147 | 0.6742 | 0.3724 | 0.7005 | 0.0404 | 0.357 | 0.1209 | 0.3795 | 0.044 | 0.3615 | 0.3242 | 0.4791 |
| No log | 4.0 | 428 | 12.1055 | 0.2658 | 0.501 | 0.2489 | 0.1121 | 0.2263 | 0.4968 | 0.2788 | 0.4595 | 0.5005 | 0.2786 | 0.4725 | 0.6846 | 0.4333 | 0.6626 | 0.1537 | 0.5139 | 0.2083 | 0.3862 | 0.1806 | 0.4508 | 0.353 | 0.4889 |
| 42.0484 | 5.0 | 535 | 11.3959 | 0.2867 | 0.5474 | 0.2422 | 0.147 | 0.2355 | 0.4664 | 0.2805 | 0.4701 | 0.5143 | 0.3533 | 0.454 | 0.6959 | 0.4869 | 0.6775 | 0.1774 | 0.4975 | 0.2341 | 0.4397 | 0.1732 | 0.4554 | 0.3616 | 0.5013 |
| 42.0484 | 6.0 | 642 | 11.4755 | 0.2882 | 0.5541 | 0.2562 | 0.154 | 0.2311 | 0.475 | 0.2813 | 0.4717 | 0.5049 | 0.3248 | 0.4711 | 0.636 | 0.54 | 0.6905 | 0.1699 | 0.4975 | 0.2088 | 0.4107 | 0.1819 | 0.4431 | 0.3401 | 0.4827 |
| 42.0484 | 7.0 | 749 | 11.7154 | 0.2828 | 0.5508 | 0.2406 | 0.127 | 0.2091 | 0.4946 | 0.2694 | 0.4424 | 0.4697 | 0.2653 | 0.419 | 0.6831 | 0.5516 | 0.6874 | 0.2275 | 0.4304 | 0.1822 | 0.3938 | 0.1334 | 0.3862 | 0.3193 | 0.4507 |
| 42.0484 | 8.0 | 856 | 11.4855 | 0.3113 | 0.5832 | 0.2919 | 0.158 | 0.2411 | 0.5272 | 0.2789 | 0.4588 | 0.4855 | 0.2879 | 0.4188 | 0.6826 | 0.5473 | 0.6622 | 0.246 | 0.4772 | 0.2594 | 0.4076 | 0.1626 | 0.4138 | 0.3412 | 0.4667 |
| 42.0484 | 9.0 | 963 | 11.3829 | 0.3098 | 0.5677 | 0.2939 | 0.1821 | 0.2679 | 0.4987 | 0.2822 | 0.4751 | 0.5089 | 0.3598 | 0.4528 | 0.6827 | 0.5522 | 0.6842 | 0.2402 | 0.4722 | 0.2474 | 0.4214 | 0.1614 | 0.4877 | 0.3477 | 0.4791 |
| 14.9538 | 10.0 | 1070 | 11.8261 | 0.2954 | 0.556 | 0.2816 | 0.1369 | 0.2564 | 0.5044 | 0.2734 | 0.4488 | 0.4751 | 0.2991 | 0.4187 | 0.6705 | 0.5175 | 0.6631 | 0.2191 | 0.4418 | 0.2339 | 0.408 | 0.1751 | 0.4185 | 0.3314 | 0.444 |
| 14.9538 | 11.0 | 1177 | 11.9160 | 0.3104 | 0.5849 | 0.2901 | 0.1087 | 0.257 | 0.546 | 0.2978 | 0.4631 | 0.4914 | 0.2853 | 0.443 | 0.6932 | 0.532 | 0.6874 | 0.2522 | 0.4646 | 0.208 | 0.3862 | 0.2249 | 0.4569 | 0.335 | 0.4618 |
| 14.9538 | 12.0 | 1284 | 11.8827 | 0.2944 | 0.548 | 0.273 | 0.1352 | 0.2355 | 0.5081 | 0.2679 | 0.4445 | 0.4792 | 0.3048 | 0.4302 | 0.6445 | 0.5355 | 0.6779 | 0.2265 | 0.4241 | 0.215 | 0.417 | 0.1727 | 0.42 | 0.3221 | 0.4569 |
| 14.9538 | 13.0 | 1391 | 12.0347 | 0.2731 | 0.5145 | 0.2554 | 0.1074 | 0.2208 | 0.4958 | 0.2606 | 0.4344 | 0.4794 | 0.2714 | 0.4362 | 0.6668 | 0.4701 | 0.6676 | 0.2072 | 0.4405 | 0.2062 | 0.4103 | 0.1806 | 0.4308 | 0.3016 | 0.448 |
| 14.9538 | 14.0 | 1498 | 12.5216 | 0.2705 | 0.5218 | 0.2486 | 0.1253 | 0.2179 | 0.4779 | 0.2567 | 0.4199 | 0.4534 | 0.2618 | 0.4116 | 0.6342 | 0.4892 | 0.6545 | 0.1799 | 0.3848 | 0.1921 | 0.3795 | 0.1646 | 0.3938 | 0.3265 | 0.4542 |
| 13.1755 | 15.0 | 1605 | 12.3301 | 0.2762 | 0.5159 | 0.2589 | 0.0995 | 0.218 | 0.4821 | 0.2613 | 0.4207 | 0.4526 | 0.2929 | 0.401 | 0.6153 | 0.5048 | 0.664 | 0.1867 | 0.3709 | 0.1733 | 0.3795 | 0.1903 | 0.4092 | 0.3257 | 0.4396 |
| 13.1755 | 16.0 | 1712 | 12.6809 | 0.2743 | 0.5232 | 0.2615 | 0.1284 | 0.2239 | 0.4824 | 0.2645 | 0.4343 | 0.4755 | 0.2878 | 0.4315 | 0.653 | 0.4422 | 0.6351 | 0.2324 | 0.4443 | 0.1814 | 0.3688 | 0.2005 | 0.4723 | 0.3153 | 0.4569 |
| 13.1755 | 17.0 | 1819 | 12.4264 | 0.2731 | 0.5247 | 0.2445 | 0.1043 | 0.2211 | 0.4829 | 0.2642 | 0.4265 | 0.4571 | 0.2817 | 0.4008 | 0.6422 | 0.4872 | 0.6568 | 0.2021 | 0.4076 | 0.1636 | 0.3536 | 0.2064 | 0.4385 | 0.3063 | 0.4293 |
| 13.1755 | 18.0 | 1926 | 12.1690 | 0.295 | 0.5675 | 0.272 | 0.1195 | 0.2455 | 0.5172 | 0.2824 | 0.4366 | 0.4683 | 0.2559 | 0.4254 | 0.6484 | 0.5016 | 0.6595 | 0.2956 | 0.4443 | 0.1619 | 0.3603 | 0.1906 | 0.4185 | 0.3252 | 0.4591 |
| 12.1885 | 19.0 | 2033 | 12.2757 | 0.2889 | 0.5508 | 0.2619 | 0.1145 | 0.2375 | 0.5014 | 0.269 | 0.433 | 0.4638 | 0.2492 | 0.4169 | 0.6589 | 0.4878 | 0.6595 | 0.2877 | 0.4494 | 0.1704 | 0.3549 | 0.1983 | 0.4262 | 0.3002 | 0.4289 |
| 12.1885 | 20.0 | 2140 | 12.2426 | 0.3032 | 0.572 | 0.2731 | 0.103 | 0.243 | 0.5477 | 0.2773 | 0.445 | 0.4857 | 0.2911 | 0.4432 | 0.6613 | 0.5195 | 0.6581 | 0.3066 | 0.4405 | 0.1925 | 0.4134 | 0.1803 | 0.4462 | 0.3171 | 0.4702 |
| 12.1885 | 21.0 | 2247 | 12.3989 | 0.2916 | 0.5502 | 0.272 | 0.1122 | 0.2386 | 0.511 | 0.2747 | 0.4453 | 0.4804 | 0.2866 | 0.4475 | 0.641 | 0.5384 | 0.6739 | 0.2633 | 0.4481 | 0.1679 | 0.3933 | 0.1698 | 0.4169 | 0.3187 | 0.4698 |
| 12.1885 | 22.0 | 2354 | 12.0240 | 0.3148 | 0.5911 | 0.2942 | 0.1464 | 0.2741 | 0.5199 | 0.2898 | 0.4491 | 0.4895 | 0.2978 | 0.4471 | 0.6738 | 0.5314 | 0.6757 | 0.3199 | 0.4519 | 0.1978 | 0.4174 | 0.1916 | 0.4354 | 0.3335 | 0.4671 |
| 12.1885 | 23.0 | 2461 | 12.3447 | 0.3001 | 0.5627 | 0.2693 | 0.0889 | 0.2524 | 0.5265 | 0.2751 | 0.4439 | 0.4783 | 0.2785 | 0.428 | 0.655 | 0.5289 | 0.673 | 0.2589 | 0.4354 | 0.1884 | 0.3763 | 0.2093 | 0.4415 | 0.315 | 0.4653 |
| 11.4223 | 24.0 | 2568 | 12.2785 | 0.2949 | 0.5492 | 0.2699 | 0.1189 | 0.2449 | 0.5269 | 0.2723 | 0.4411 | 0.4714 | 0.2738 | 0.4217 | 0.6581 | 0.5369 | 0.6833 | 0.2443 | 0.419 | 0.1813 | 0.3741 | 0.2034 | 0.4385 | 0.3083 | 0.4422 |
| 11.4223 | 25.0 | 2675 | 12.6573 | 0.2723 | 0.5239 | 0.2439 | 0.0732 | 0.2321 | 0.4827 | 0.2624 | 0.4233 | 0.4575 | 0.2445 | 0.4103 | 0.641 | 0.4973 | 0.6541 | 0.2145 | 0.4278 | 0.1901 | 0.3661 | 0.1814 | 0.4046 | 0.2783 | 0.4351 |
| 11.4223 | 26.0 | 2782 | 12.6655 | 0.2847 | 0.5322 | 0.2672 | 0.1111 | 0.2407 | 0.5069 | 0.2691 | 0.426 | 0.4592 | 0.2404 | 0.413 | 0.6528 | 0.5108 | 0.6626 | 0.2417 | 0.4139 | 0.1877 | 0.3652 | 0.1853 | 0.4138 | 0.2979 | 0.4404 |
| 11.4223 | 27.0 | 2889 | 12.5127 | 0.2952 | 0.547 | 0.2728 | 0.1306 | 0.2385 | 0.5071 | 0.2706 | 0.4342 | 0.4672 | 0.2397 | 0.4179 | 0.6469 | 0.5328 | 0.6721 | 0.2528 | 0.4215 | 0.2026 | 0.3879 | 0.1899 | 0.4031 | 0.2981 | 0.4516 |
| 11.4223 | 28.0 | 2996 | 12.7679 | 0.271 | 0.512 | 0.2567 | 0.0931 | 0.2248 | 0.4729 | 0.2537 | 0.4154 | 0.4493 | 0.2626 | 0.3981 | 0.6353 | 0.5163 | 0.6586 | 0.2136 | 0.4089 | 0.1561 | 0.3496 | 0.1939 | 0.4123 | 0.2751 | 0.4173 |
| 10.803 | 29.0 | 3103 | 12.8845 | 0.2699 | 0.5064 | 0.2455 | 0.0718 | 0.2298 | 0.4619 | 0.2551 | 0.4191 | 0.4536 | 0.2406 | 0.4114 | 0.6345 | 0.5199 | 0.6649 | 0.2169 | 0.4316 | 0.1408 | 0.3473 | 0.1905 | 0.4031 | 0.2813 | 0.4209 |
| 10.803 | 30.0 | 3210 | 12.6244 | 0.2858 | 0.5387 | 0.2646 | 0.1235 | 0.2382 | 0.4958 | 0.2665 | 0.4265 | 0.4599 | 0.2799 | 0.3994 | 0.6424 | 0.5172 | 0.6671 | 0.2742 | 0.4114 | 0.1548 | 0.3594 | 0.1984 | 0.4385 | 0.2842 | 0.4231 |
| 10.803 | 31.0 | 3317 | 12.8007 | 0.2737 | 0.5037 | 0.2615 | 0.0809 | 0.2272 | 0.5058 | 0.2534 | 0.4145 | 0.4564 | 0.2525 | 0.4004 | 0.6573 | 0.5116 | 0.6676 | 0.2147 | 0.3924 | 0.1694 | 0.3589 | 0.1913 | 0.4338 | 0.2813 | 0.4293 |
| 10.803 | 32.0 | 3424 | 12.4891 | 0.292 | 0.5425 | 0.2747 | 0.1244 | 0.2507 | 0.5151 | 0.2728 | 0.4326 | 0.4635 | 0.2858 | 0.417 | 0.6562 | 0.519 | 0.6703 | 0.2243 | 0.3886 | 0.194 | 0.3786 | 0.2176 | 0.4446 | 0.3053 | 0.4356 |
| 10.2825 | 33.0 | 3531 | 12.5182 | 0.301 | 0.5548 | 0.2809 | 0.0843 | 0.2558 | 0.5149 | 0.2766 | 0.4427 | 0.4699 | 0.2576 | 0.425 | 0.6483 | 0.5355 | 0.6775 | 0.2817 | 0.4392 | 0.1732 | 0.3571 | 0.202 | 0.4308 | 0.3126 | 0.4449 |
| 10.2825 | 34.0 | 3638 | 12.5050 | 0.2903 | 0.5353 | 0.2866 | 0.0792 | 0.2516 | 0.4996 | 0.2687 | 0.4363 | 0.4663 | 0.2601 | 0.4267 | 0.6397 | 0.5312 | 0.6721 | 0.2486 | 0.4342 | 0.1757 | 0.358 | 0.2044 | 0.4385 | 0.2914 | 0.4289 |
| 10.2825 | 35.0 | 3745 | 12.7033 | 0.2744 | 0.5173 | 0.2569 | 0.0852 | 0.2345 | 0.4946 | 0.2661 | 0.4231 | 0.4634 | 0.2465 | 0.4147 | 0.6623 | 0.4855 | 0.6599 | 0.2283 | 0.4203 | 0.1814 | 0.367 | 0.1796 | 0.4354 | 0.2974 | 0.4347 |
| 10.2825 | 36.0 | 3852 | 12.6134 | 0.2838 | 0.5419 | 0.2591 | 0.0896 | 0.243 | 0.5131 | 0.2687 | 0.4303 | 0.4567 | 0.2489 | 0.4076 | 0.666 | 0.5217 | 0.6649 | 0.2063 | 0.3899 | 0.1811 | 0.3562 | 0.2107 | 0.4492 | 0.2992 | 0.4231 |
| 10.2825 | 37.0 | 3959 | 12.5976 | 0.2887 | 0.5406 | 0.2785 | 0.089 | 0.2484 | 0.5045 | 0.2749 | 0.4308 | 0.463 | 0.2536 | 0.4189 | 0.6548 | 0.516 | 0.6653 | 0.2379 | 0.4038 | 0.1944 | 0.367 | 0.1887 | 0.4385 | 0.3066 | 0.4404 |
| 9.7729 | 38.0 | 4066 | 12.6047 | 0.2885 | 0.5353 | 0.2776 | 0.0763 | 0.2494 | 0.5142 | 0.2698 | 0.4287 | 0.4623 | 0.2433 | 0.4153 | 0.6744 | 0.5271 | 0.6703 | 0.2344 | 0.3962 | 0.1851 | 0.3554 | 0.1887 | 0.4477 | 0.3074 | 0.4418 |
| 9.7729 | 39.0 | 4173 | 12.6272 | 0.2882 | 0.5323 | 0.2759 | 0.0807 | 0.2524 | 0.5078 | 0.272 | 0.4284 | 0.461 | 0.2471 | 0.4135 | 0.6656 | 0.5106 | 0.6631 | 0.2459 | 0.4025 | 0.1879 | 0.3612 | 0.1931 | 0.4431 | 0.3037 | 0.4351 |
| 9.7729 | 40.0 | 4280 | 12.5834 | 0.2938 | 0.5445 | 0.2786 | 0.0904 | 0.2572 | 0.5134 | 0.2707 | 0.4314 | 0.4641 | 0.2515 | 0.4156 | 0.6709 | 0.5142 | 0.6685 | 0.2442 | 0.4025 | 0.1994 | 0.3598 | 0.2002 | 0.4431 | 0.311 | 0.4467 |
### Framework versions
- Transformers 4.49.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
tumul31/rtdetr-v2-r50-cardamage-40ep |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rtdetr-v2-r50-cardamage-40ep
This model is a fine-tuned version of [PekingU/rtdetr_r50vd](https://huggingface.co/PekingU/rtdetr_r50vd) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 12.7731
- eval_model_preparation_time: 0.0097
- eval_map: 0.2093
- eval_map_50: 0.404
- eval_map_75: 0.1925
- eval_map_small: 0.0168
- eval_map_medium: 0.0853
- eval_map_large: 0.2916
- eval_mar_1: 0.3526
- eval_mar_10: 0.4898
- eval_mar_100: 0.5314
- eval_mar_small: 0.0167
- eval_mar_medium: 0.2585
- eval_mar_large: 0.6342
- eval_map_car-parts: -1.0
- eval_mar_100_car-parts: -1.0
- eval_map_Bonet: 0.1129
- eval_mar_100_Bonet: 0.3364
- eval_map_Bumper: 0.128
- eval_mar_100_Bumper: 0.4396
- eval_map_Door: 0.1729
- eval_mar_100_Door: 0.6609
- eval_map_Headlight: 0.1512
- eval_mar_100_Headlight: 0.4947
- eval_map_Mirror: 0.2436
- eval_mar_100_Mirror: 0.5529
- eval_map_Tailight: 0.345
- eval_mar_100_Tailight: 0.575
- eval_map_Windshield: 0.3114
- eval_mar_100_Windshield: 0.66
- eval_runtime: 10.9627
- eval_samples_per_second: 14.048
- eval_steps_per_second: 1.824
- step: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 40
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"car-parts",
"bonet",
"bumper",
"door",
"headlight",
"mirror",
"tailight",
"windshield"
] |
s-4-m-a-n/detr-resnet-50-dc5-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 2.8759
- eval_map: 0.004
- eval_map_50: 0.0097
- eval_map_75: 0.0027
- eval_map_small: 0.0029
- eval_map_medium: 0.0049
- eval_map_large: 0.0018
- eval_mar_1: 0.0097
- eval_mar_10: 0.0225
- eval_mar_100: 0.0289
- eval_mar_small: 0.0163
- eval_mar_medium: 0.0303
- eval_mar_large: 0.0376
- eval_map_shirt, blouse: 0.0
- eval_mar_100_shirt, blouse: 0.0
- eval_map_top, t-shirt, sweatshirt: 0.0009
- eval_mar_100_top, t-shirt, sweatshirt: 0.0029
- eval_map_sweater: 0.0
- eval_mar_100_sweater: 0.0
- eval_map_cardigan: 0.0
- eval_mar_100_cardigan: 0.0
- eval_map_jacket: 0.0
- eval_mar_100_jacket: 0.0
- eval_map_vest: 0.0
- eval_mar_100_vest: 0.0
- eval_map_pants: 0.0053
- eval_mar_100_pants: 0.0325
- eval_map_shorts: 0.0
- eval_mar_100_shorts: 0.0
- eval_map_skirt: 0.0
- eval_mar_100_skirt: 0.0
- eval_map_coat: 0.0
- eval_mar_100_coat: 0.0
- eval_map_dress: 0.0311
- eval_mar_100_dress: 0.2667
- eval_map_jumpsuit: 0.0
- eval_mar_100_jumpsuit: 0.0
- eval_map_cape: 0.0
- eval_mar_100_cape: 0.0
- eval_map_glasses: 0.0
- eval_mar_100_glasses: 0.0
- eval_map_hat: 0.0
- eval_mar_100_hat: 0.0
- eval_map_headband, head covering, hair accessory: 0.0
- eval_mar_100_headband, head covering, hair accessory: 0.0
- eval_map_tie: 0.0
- eval_mar_100_tie: 0.0
- eval_map_glove: 0.0
- eval_mar_100_glove: 0.0
- eval_map_watch: 0.0
- eval_mar_100_watch: 0.0
- eval_map_belt: 0.0
- eval_mar_100_belt: 0.0
- eval_map_leg warmer: 0.0
- eval_mar_100_leg warmer: 0.0
- eval_map_tights, stockings: 0.0
- eval_mar_100_tights, stockings: 0.0
- eval_map_sock: 0.0
- eval_mar_100_sock: 0.0
- eval_map_shoe: 0.1184
- eval_mar_100_shoe: 0.4574
- eval_map_bag, wallet: 0.0
- eval_mar_100_bag, wallet: 0.0
- eval_map_scarf: 0.0
- eval_mar_100_scarf: 0.0
- eval_map_umbrella: 0.0
- eval_mar_100_umbrella: 0.0
- eval_map_hood: 0.0
- eval_mar_100_hood: 0.0
- eval_map_collar: 0.0
- eval_mar_100_collar: 0.0
- eval_map_lapel: 0.0
- eval_mar_100_lapel: 0.0
- eval_map_epaulette: 0.0
- eval_mar_100_epaulette: 0.0
- eval_map_sleeve: 0.0091
- eval_mar_100_sleeve: 0.3887
- eval_map_pocket: 0.0
- eval_mar_100_pocket: 0.0
- eval_map_neckline: 0.018
- eval_mar_100_neckline: 0.1796
- eval_map_buckle: 0.0
- eval_mar_100_buckle: 0.0
- eval_map_zipper: 0.0
- eval_mar_100_zipper: 0.0
- eval_map_applique: 0.0
- eval_mar_100_applique: 0.0
- eval_map_bead: 0.0
- eval_mar_100_bead: 0.0
- eval_map_bow: 0.0
- eval_mar_100_bow: 0.0
- eval_map_flower: 0.0
- eval_mar_100_flower: 0.0
- eval_map_fringe: 0.0
- eval_mar_100_fringe: 0.0
- eval_map_ribbon: 0.0
- eval_mar_100_ribbon: 0.0
- eval_map_rivet: 0.0
- eval_mar_100_rivet: 0.0
- eval_map_ruffle: 0.0
- eval_mar_100_ruffle: 0.0
- eval_map_sequin: 0.0
- eval_mar_100_sequin: 0.0
- eval_map_tassel: 0.0
- eval_mar_100_tassel: 0.0
- eval_runtime: 235.9187
- eval_samples_per_second: 4.908
- eval_steps_per_second: 1.229
- epoch: 0.1622
- step: 1850
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
joortif/practica_2_detr |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# practica_2_detr
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"n/a",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"street sign",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"hat",
"backpack",
"umbrella",
"shoe",
"eye glasses",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"plate",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"mirror",
"dining table",
"window",
"desk",
"toilet",
"door",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"blender",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
NihalGurjar/mtsr3 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"table",
"table column",
"table row",
"table column header",
"table projected row header",
"table spanning cell"
] |
theButcher22/deta-swin-large |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deta-swin-large
This model is a fine-tuned version of [jozhang97/deta-swin-large](https://huggingface.co/jozhang97/deta-swin-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 7.7323
- Map: 0.448
- Map 50: 0.6146
- Map 75: 0.4994
- Map Small: 0.4453
- Map Medium: 0.5526
- Map Large: -1.0
- Mar 1: 0.286
- Mar 10: 0.4045
- Mar 100: 0.5073
- Mar Small: 0.5039
- Mar Medium: 0.571
- Mar Large: -1.0
- Map Ball: 0.2858
- Mar 100 Ball: 0.4068
- Map Goalkeeper: 0.7008
- Mar 100 Goalkeeper: 0.7649
- Map Player: 0.8053
- Mar 100 Player: 0.8576
- Map Referee: 0.0
- Mar 100 Referee: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
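One concrete limitation is visible in the evaluation summary above: the referee class never converges (AP 0.0 throughout training), so the model should not be relied on for referee detection. As a quick sanity check on the numbers, the reported overall Map of 0.448 is the macro average of the four per-class APs:

```python
# Per-class AP values copied from the evaluation summary above.
per_class_ap = {
    "ball": 0.2858,
    "goalkeeper": 0.7008,
    "player": 0.8053,
    "referee": 0.0,  # the model never learned this class
}

# Macro average over classes reproduces the reported overall Map.
macro_map = sum(per_class_ap.values()) / len(per_class_ap)
print(f"macro mAP = {macro_map:.3f}")  # -> macro mAP = 0.448
```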
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Ball | Mar 100 Ball | Map Goalkeeper | Mar 100 Goalkeeper | Map Player | Mar 100 Player | Map Referee | Mar 100 Referee |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------:|:------------:|:--------------:|:------------------:|:----------:|:--------------:|:-----------:|:---------------:|
| 296.278 | 1.0 | 149 | 62.3890 | 0.0999 | 0.1535 | 0.1216 | 0.0976 | 0.1516 | -1.0 | 0.0078 | 0.0616 | 0.1808 | 0.1808 | 0.2414 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3997 | 0.7232 | 0.0 | 0.0 |
| 74.5246 | 2.0 | 298 | 25.7949 | 0.1643 | 0.236 | 0.1843 | 0.1638 | 0.2281 | -1.0 | 0.0162 | 0.1175 | 0.3643 | 0.3753 | 0.2564 | -1.0 | 0.0 | 0.0 | 0.0189 | 0.6969 | 0.6384 | 0.7605 | 0.0 | 0.0 |
| 23.5211 | 3.0 | 447 | 16.6520 | 0.1984 | 0.25 | 0.2246 | 0.1971 | 0.2759 | -1.0 | 0.0111 | 0.2288 | 0.4151 | 0.4131 | 0.5609 | -1.0 | 0.0 | 0.0 | 0.0371 | 0.8114 | 0.7566 | 0.849 | 0.0 | 0.0 |
| 18.7573 | 4.0 | 596 | 13.6105 | 0.2271 | 0.3101 | 0.2518 | 0.2264 | 0.2742 | -1.0 | 0.0608 | 0.2627 | 0.4917 | 0.4901 | 0.5426 | -1.0 | 0.1064 | 0.3159 | 0.0287 | 0.8139 | 0.7732 | 0.8368 | 0.0 | 0.0 |
| 14.4593 | 5.0 | 745 | 11.9880 | 0.2282 | 0.309 | 0.2486 | 0.2272 | 0.2805 | -1.0 | 0.0774 | 0.2571 | 0.4835 | 0.4812 | 0.531 | -1.0 | 0.1022 | 0.3023 | 0.028 | 0.7914 | 0.7824 | 0.8405 | 0.0 | 0.0 |
| 12.878 | 6.0 | 894 | 10.9109 | 0.3867 | 0.5529 | 0.418 | 0.3955 | 0.3944 | -1.0 | 0.2517 | 0.3723 | 0.4836 | 0.4837 | 0.526 | -1.0 | 0.1759 | 0.3349 | 0.6007 | 0.7629 | 0.7704 | 0.8367 | 0.0 | 0.0 |
| 11.4964 | 7.0 | 1043 | 10.0801 | 0.4098 | 0.5788 | 0.4432 | 0.4076 | 0.5423 | -1.0 | 0.2558 | 0.3796 | 0.4805 | 0.4774 | 0.5626 | -1.0 | 0.2016 | 0.3227 | 0.6587 | 0.7622 | 0.7792 | 0.8372 | 0.0 | 0.0 |
| 10.938 | 8.0 | 1192 | 9.6962 | 0.4168 | 0.5909 | 0.4511 | 0.4136 | 0.5736 | -1.0 | 0.2622 | 0.384 | 0.4829 | 0.479 | 0.5953 | -1.0 | 0.2148 | 0.3045 | 0.6771 | 0.7944 | 0.7753 | 0.8327 | 0.0 | 0.0 |
| 10.2185 | 9.0 | 1341 | 9.2201 | 0.4414 | 0.6237 | 0.4939 | 0.4461 | 0.4126 | -1.0 | 0.2975 | 0.4088 | 0.5078 | 0.5043 | 0.5659 | -1.0 | 0.3159 | 0.4023 | 0.6628 | 0.7886 | 0.7866 | 0.8403 | 0.0 | 0.0 |
| 10.1417 | 10.0 | 1490 | 8.8414 | 0.4258 | 0.5998 | 0.4616 | 0.432 | 0.4459 | -1.0 | 0.2829 | 0.4079 | 0.5097 | 0.5112 | 0.5104 | -1.0 | 0.2659 | 0.4256 | 0.6382 | 0.76 | 0.799 | 0.8533 | 0.0 | 0.0 |
| 9.6417 | 11.0 | 1639 | 8.9382 | 0.4158 | 0.5798 | 0.4758 | 0.4134 | 0.5169 | -1.0 | 0.2758 | 0.401 | 0.5016 | 0.499 | 0.5324 | -1.0 | 0.2131 | 0.3791 | 0.6644 | 0.7857 | 0.7857 | 0.8414 | 0.0 | 0.0 |
| 9.3618 | 12.0 | 1788 | 8.5653 | 0.4349 | 0.6074 | 0.5026 | 0.4337 | 0.5436 | -1.0 | 0.2857 | 0.4042 | 0.5052 | 0.5022 | 0.5646 | -1.0 | 0.2359 | 0.375 | 0.7111 | 0.8 | 0.7927 | 0.8458 | 0.0 | 0.0 |
| 9.1601 | 13.0 | 1937 | 8.5162 | 0.4346 | 0.6159 | 0.4669 | 0.4332 | 0.5417 | -1.0 | 0.2803 | 0.3997 | 0.4998 | 0.497 | 0.5623 | -1.0 | 0.2628 | 0.3795 | 0.6862 | 0.7778 | 0.7892 | 0.842 | 0.0 | 0.0 |
| 8.9938 | 14.0 | 2086 | 8.2598 | 0.4403 | 0.6165 | 0.4962 | 0.4379 | 0.5494 | -1.0 | 0.2803 | 0.4002 | 0.5017 | 0.4986 | 0.5674 | -1.0 | 0.2661 | 0.3773 | 0.7019 | 0.7806 | 0.7931 | 0.8489 | 0.0 | 0.0 |
| 8.7258 | 15.0 | 2235 | 8.2129 | 0.4507 | 0.6142 | 0.5192 | 0.4505 | 0.5363 | -1.0 | 0.2931 | 0.4137 | 0.5149 | 0.5123 | 0.5611 | -1.0 | 0.3076 | 0.4114 | 0.7154 | 0.8028 | 0.7796 | 0.8453 | 0.0 | 0.0 |
| 8.1655 | 16.0 | 2384 | 8.1459 | 0.4436 | 0.6071 | 0.4985 | 0.4455 | 0.4346 | -1.0 | 0.2891 | 0.4125 | 0.5136 | 0.5134 | 0.5115 | -1.0 | 0.3046 | 0.4091 | 0.6794 | 0.7946 | 0.7905 | 0.8507 | 0.0 | 0.0 |
| 8.1808 | 17.0 | 2533 | 8.1069 | 0.4426 | 0.6231 | 0.4971 | 0.4436 | 0.5167 | -1.0 | 0.2868 | 0.4078 | 0.5074 | 0.5057 | 0.5667 | -1.0 | 0.2985 | 0.4023 | 0.6942 | 0.7886 | 0.7776 | 0.8388 | 0.0 | 0.0 |
| 8.45 | 18.0 | 2682 | 7.9322 | 0.4443 | 0.6157 | 0.4739 | 0.4427 | 0.5757 | -1.0 | 0.2839 | 0.4082 | 0.5104 | 0.5066 | 0.5963 | -1.0 | 0.2715 | 0.4023 | 0.704 | 0.7833 | 0.8018 | 0.856 | 0.0 | 0.0 |
| 8.0344 | 19.0 | 2831 | 7.9556 | 0.4464 | 0.6133 | 0.4963 | 0.4456 | 0.5388 | -1.0 | 0.2866 | 0.4113 | 0.5129 | 0.5111 | 0.5584 | -1.0 | 0.2678 | 0.4093 | 0.722 | 0.7914 | 0.7956 | 0.851 | 0.0 | 0.0 |
| 7.9178 | 20.0 | 2980 | 7.7323 | 0.448 | 0.6146 | 0.4994 | 0.4453 | 0.5526 | -1.0 | 0.286 | 0.4045 | 0.5073 | 0.5039 | 0.571 | -1.0 | 0.2858 | 0.4068 | 0.7008 | 0.7649 | 0.8053 | 0.8576 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.5.1+cu124
- Datasets 3.3.1
- Tokenizers 0.21.0
| [
"football-players-detection",
"ball",
"goalkeeper",
"player",
"referee"
] |
ryfkn/DETR |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# DETR
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6175
- Map: 0.0235
- Map 50: 0.0454
- Map 75: 0.0236
- Map Small: 0.0
- Map Medium: 0.0065
- Map Large: 0.0651
- Mar 1: 0.0428
- Mar 10: 0.1393
- Mar 100: 0.3049
- Mar Small: 0.0
- Mar Medium: 0.2244
- Mar Large: 0.4274
- Map D10: 0.0131
- Mar 100 D10: 0.3508
- Map D20: 0.0574
- Mar 100 D20: 0.564
- Map D40: 0.0
- Mar 100 D40: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
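DETR-family heads emit boxes in normalized (cx, cy, w, h) format; in practice `image_processor.post_process_object_detection` handles the conversion for you, but as an illustration of what the raw predictions mean (a stdlib-only sketch, not this model's API), the mapping to pixel corner coordinates is:

```python
def cxcywh_to_xyxy(box, image_width, image_height):
    """Convert a normalized (cx, cy, w, h) box, as emitted by DETR-style
    models, to absolute (xmin, ymin, xmax, ymax) pixel coordinates."""
    cx, cy, w, h = box
    return (
        (cx - w / 2) * image_width,
        (cy - h / 2) * image_height,
        (cx + w / 2) * image_width,
        (cy + h / 2) * image_height,
    )


# A full-image box maps to the full pixel extent:
print(cxcywh_to_xyxy((0.5, 0.5, 1.0, 1.0), 200, 100))  # -> (0.0, 0.0, 200.0, 100.0)
```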
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map D0 | Mar 100 D0 | Map D10 | Mar 100 D10 | Map D20 | Mar 100 D20 | Map D40 | Mar 100 D40 |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------:|:----------:|:-------:|:-----------:|:-------:|:-----------:|:-------:|:-----------:|
| 5.176 | 0.1238 | 200 | 4.6596 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0 | 0.0 | 0.0015 | -1.0 | -1.0 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0396 | 0.2475 | 400 | 3.5515 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0001 | 0.0 | 0.0067 | 0.0307 | 0.0 | 0.0333 | 0.045 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.092 | 0.0 | 0.0 |
| 2.95 | 0.3713 | 600 | 3.2961 | 0.0001 | 0.0003 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0 | 0.0027 | 0.0613 | 0.0 | 0.0 | 0.115 | 0.0 | 0.0 | 0.0002 | 0.184 | 0.0 | 0.0 |
| 2.6609 | 0.4950 | 800 | 3.3303 | 0.0001 | 0.0006 | 0.0 | 0.0 | 0.0001 | 0.0003 | 0.0 | 0.008 | 0.072 | 0.0 | 0.0467 | 0.1175 | 0.0 | 0.0 | 0.0004 | 0.216 | 0.0 | 0.0 |
| 3.2776 | 0.6188 | 1000 | 3.0383 | 0.0002 | 0.0007 | 0.0 | 0.0 | 0.0002 | 0.0003 | 0.0 | 0.008 | 0.0733 | 0.0 | 0.0467 | 0.12 | 0.0 | 0.0 | 0.0005 | 0.22 | 0.0 | 0.0 |
| 2.7712 | 0.7426 | 1200 | 2.9071 | 0.0004 | 0.0015 | 0.0 | 0.0 | 0.0022 | 0.0006 | 0.0 | 0.0253 | 0.0627 | 0.0 | 0.0333 | 0.105 | 0.0 | 0.0 | 0.0011 | 0.188 | 0.0 | 0.0 |
| 2.4005 | 0.8663 | 1400 | 2.7352 | 0.001 | 0.0069 | 0.0001 | 0.0 | 0.0023 | 0.0018 | 0.0013 | 0.0227 | 0.0907 | 0.0 | 0.06 | 0.1475 | 0.0 | 0.0 | 0.003 | 0.272 | 0.0 | 0.0 |
| 2.6812 | 0.9901 | 1600 | 2.5448 | 0.0009 | 0.0037 | 0.0001 | 0.0 | 0.0005 | 0.0016 | 0.004 | 0.0267 | 0.0987 | 0.0 | 0.06 | 0.1625 | 0.0 | 0.0 | 0.0026 | 0.296 | 0.0 | 0.0 |
| 1.8313 | 1.1139 | 1800 | 2.3839 | 0.0009 | 0.003 | 0.0 | 0.0 | 0.0002 | 0.0018 | 0.008 | 0.02 | 0.1173 | 0.0 | 0.1067 | 0.18 | 0.0 | 0.0 | 0.0028 | 0.352 | 0.0 | 0.0 |
| 2.6377 | 1.2376 | 2000 | 2.3837 | 0.0013 | 0.0033 | 0.0004 | 0.0 | 0.0001 | 0.0023 | 0.0 | 0.0347 | 0.1253 | 0.0 | 0.0667 | 0.21 | 0.0 | 0.0 | 0.0038 | 0.376 | 0.0 | 0.0 |
| 1.919 | 1.3614 | 2200 | 2.4226 | 0.0021 | 0.0078 | 0.0001 | 0.0 | 0.0001 | 0.0039 | 0.004 | 0.028 | 0.1227 | 0.0 | 0.0467 | 0.2125 | 0.0 | 0.0 | 0.0064 | 0.368 | 0.0 | 0.0 |
| 2.1093 | 1.4851 | 2400 | 2.3144 | 0.006 | 0.0232 | 0.0011 | 0.0 | 0.0 | 0.0109 | 0.0093 | 0.0307 | 0.12 | 0.0 | 0.0533 | 0.205 | 0.0 | 0.0 | 0.0179 | 0.36 | 0.0 | 0.0 |
| 2.4712 | 1.6089 | 2600 | 2.1712 | 0.0045 | 0.0271 | 0.0002 | 0.0 | 0.0001 | 0.0082 | 0.0061 | 0.0381 | 0.1355 | 0.0 | 0.0811 | 0.2225 | 0.0001 | 0.0024 | 0.0135 | 0.404 | 0.0 | 0.0 |
| 1.6899 | 1.7327 | 2800 | 2.1685 | 0.0053 | 0.0234 | 0.0004 | 0.0 | 0.0002 | 0.0096 | 0.0123 | 0.0456 | 0.1269 | 0.0 | 0.0689 | 0.21 | 0.0001 | 0.0048 | 0.0158 | 0.376 | 0.0 | 0.0 |
| 2.2178 | 1.8564 | 3000 | 2.0968 | 0.0049 | 0.0198 | 0.0004 | 0.0 | 0.0002 | 0.0087 | 0.0104 | 0.0291 | 0.1344 | 0.0 | 0.0748 | 0.2225 | 0.0001 | 0.0032 | 0.0145 | 0.4 | 0.0 | 0.0 |
| 1.8933 | 1.9802 | 3200 | 2.0313 | 0.0083 | 0.025 | 0.0006 | 0.0 | 0.0007 | 0.0145 | 0.0144 | 0.0429 | 0.1535 | 0.0 | 0.0847 | 0.245 | 0.0011 | 0.0246 | 0.024 | 0.436 | 0.0 | 0.0 |
| 1.853 | 2.1040 | 3400 | 2.0302 | 0.0068 | 0.0188 | 0.0045 | 0.0 | 0.0008 | 0.0119 | 0.0088 | 0.0455 | 0.161 | 0.0 | 0.1331 | 0.2325 | 0.0007 | 0.0429 | 0.0196 | 0.44 | 0.0 | 0.0 |
| 2.0421 | 2.2277 | 3600 | 1.9522 | 0.0207 | 0.0411 | 0.0193 | 0.0 | 0.0011 | 0.0369 | 0.0243 | 0.0638 | 0.1649 | 0.0 | 0.0889 | 0.2475 | 0.0018 | 0.0627 | 0.0603 | 0.432 | 0.0 | 0.0 |
| 1.8444 | 2.3515 | 3800 | 2.0036 | 0.0147 | 0.0308 | 0.02 | 0.0 | 0.0013 | 0.0268 | 0.0227 | 0.0788 | 0.1709 | 0.0 | 0.1521 | 0.2219 | 0.0016 | 0.1008 | 0.0425 | 0.412 | 0.0 | 0.0 |
| 1.6694 | 2.4752 | 4000 | 1.9610 | 0.0219 | 0.0511 | 0.0122 | 0.0 | 0.0019 | 0.0398 | 0.0232 | 0.0776 | 0.1888 | 0.0 | 0.1421 | 0.2569 | 0.0034 | 0.1063 | 0.0624 | 0.46 | 0.0 | 0.0 |
| 2.3946 | 2.5990 | 4200 | 2.0770 | 0.0108 | 0.0275 | 0.0083 | 0.0 | 0.0018 | 0.0197 | 0.0197 | 0.0792 | 0.1907 | 0.0 | 0.1258 | 0.2709 | 0.0033 | 0.1 | 0.029 | 0.472 | 0.0 | 0.0 |
| 2.4217 | 2.7228 | 4400 | 1.9638 | 0.021 | 0.0442 | 0.0115 | 0.0 | 0.004 | 0.0464 | 0.041 | 0.1054 | 0.2092 | 0.0 | 0.1873 | 0.2597 | 0.0047 | 0.1675 | 0.0583 | 0.46 | 0.0 | 0.0 |
| 1.6397 | 2.8465 | 4600 | 1.9357 | 0.0216 | 0.0519 | 0.0047 | 0.0 | 0.0031 | 0.048 | 0.0341 | 0.0923 | 0.2212 | 0.0 | 0.1441 | 0.3116 | 0.0055 | 0.1476 | 0.0592 | 0.516 | 0.0 | 0.0 |
| 1.9243 | 2.9703 | 4800 | 1.8502 | 0.016 | 0.0432 | 0.0039 | 0.0 | 0.0034 | 0.0347 | 0.0226 | 0.0791 | 0.231 | 0.0 | 0.1428 | 0.3194 | 0.0072 | 0.1929 | 0.041 | 0.5 | 0.0 | 0.0 |
| 1.6861 | 3.0941 | 5000 | 1.9368 | 0.0189 | 0.0463 | 0.0114 | 0.0 | 0.0031 | 0.0368 | 0.0296 | 0.0848 | 0.2233 | 0.0 | 0.1311 | 0.3112 | 0.0063 | 0.1619 | 0.0505 | 0.508 | 0.0 | 0.0 |
| 1.9067 | 3.2178 | 5200 | 1.8978 | 0.0169 | 0.0433 | 0.0074 | 0.0 | 0.0041 | 0.0449 | 0.0309 | 0.081 | 0.2381 | 0.0 | 0.1179 | 0.3378 | 0.0084 | 0.2143 | 0.0422 | 0.5 | 0.0 | 0.0 |
| 2.3952 | 3.3416 | 5400 | 1.8205 | 0.0156 | 0.0393 | 0.0099 | 0.0 | 0.0037 | 0.0402 | 0.0234 | 0.0938 | 0.2529 | 0.0 | 0.214 | 0.3157 | 0.0077 | 0.2508 | 0.0391 | 0.508 | 0.0 | 0.0 |
| 1.7741 | 3.4653 | 5600 | 1.7616 | 0.0256 | 0.057 | 0.0269 | 0.0 | 0.0045 | 0.0726 | 0.0301 | 0.0879 | 0.2614 | 0.0 | 0.1933 | 0.3504 | 0.0097 | 0.2563 | 0.067 | 0.528 | 0.0 | 0.0 |
| 1.5789 | 3.5891 | 5800 | 1.8325 | 0.0216 | 0.0437 | 0.0189 | 0.0064 | 0.0032 | 0.0411 | 0.0706 | 0.137 | 0.2674 | 0.0625 | 0.1486 | 0.314 | 0.0064 | 0.2063 | 0.0542 | 0.496 | 0.0043 | 0.1 |
| 1.631 | 3.7129 | 6000 | 1.8235 | 0.0209 | 0.0442 | 0.0203 | 0.0035 | 0.0036 | 0.0411 | 0.0583 | 0.1205 | 0.2719 | 0.0375 | 0.2023 | 0.3185 | 0.0065 | 0.2198 | 0.0543 | 0.536 | 0.0019 | 0.06 |
| 1.9591 | 3.8366 | 6200 | 1.6983 | 0.0195 | 0.0438 | 0.012 | 0.0 | 0.0054 | 0.0464 | 0.0322 | 0.0967 | 0.2888 | 0.0 | 0.2202 | 0.4049 | 0.0126 | 0.2865 | 0.046 | 0.58 | 0.0 | 0.0 |
| 1.7331 | 3.9604 | 6400 | 1.7687 | 0.0193 | 0.0422 | 0.016 | 0.0 | 0.0046 | 0.0485 | 0.0314 | 0.1196 | 0.2662 | 0.0 | 0.1969 | 0.3584 | 0.0095 | 0.2627 | 0.0484 | 0.536 | 0.0 | 0.0 |
| 1.5073 | 4.0842 | 6600 | 1.7121 | 0.0211 | 0.0444 | 0.022 | 0.0 | 0.005 | 0.0472 | 0.0309 | 0.1175 | 0.2625 | 0.0 | 0.1658 | 0.3643 | 0.0107 | 0.2714 | 0.0525 | 0.516 | 0.0 | 0.0 |
| 1.9417 | 4.2079 | 6800 | 1.7394 | 0.0264 | 0.049 | 0.0244 | 0.0012 | 0.0055 | 0.0582 | 0.0461 | 0.1449 | 0.2889 | 0.0125 | 0.2006 | 0.3824 | 0.0113 | 0.2746 | 0.0671 | 0.572 | 0.0007 | 0.02 |
| 1.8069 | 4.3317 | 7000 | 1.7050 | 0.0244 | 0.0497 | 0.0223 | 0.0 | 0.0051 | 0.0641 | 0.0484 | 0.1316 | 0.2643 | 0.0 | 0.1977 | 0.3856 | 0.0118 | 0.273 | 0.0614 | 0.52 | 0.0 | 0.0 |
| 1.2717 | 4.4554 | 7200 | 1.6229 | 0.0268 | 0.0508 | 0.0255 | 0.0 | 0.0078 | 0.0583 | 0.0468 | 0.1427 | 0.293 | 0.0 | 0.2127 | 0.4259 | 0.0182 | 0.3111 | 0.0621 | 0.568 | 0.0 | 0.0 |
| 1.8009 | 4.5792 | 7400 | 1.6753 | 0.0236 | 0.0533 | 0.0115 | 0.0 | 0.0053 | 0.0566 | 0.042 | 0.1215 | 0.284 | 0.0 | 0.222 | 0.3743 | 0.011 | 0.2921 | 0.0598 | 0.56 | 0.0 | 0.0 |
| 1.8661 | 4.7030 | 7600 | 1.6594 | 0.0234 | 0.0429 | 0.0232 | 0.0 | 0.0055 | 0.06 | 0.045 | 0.1372 | 0.2938 | 0.0 | 0.2174 | 0.4115 | 0.0116 | 0.3333 | 0.0586 | 0.548 | 0.0 | 0.0 |
| 1.5983 | 4.8267 | 7800 | 1.6727 | 0.0206 | 0.0436 | 0.0123 | 0.0 | 0.0056 | 0.0529 | 0.0447 | 0.1407 | 0.2986 | 0.0 | 0.2537 | 0.3909 | 0.0108 | 0.3278 | 0.051 | 0.568 | 0.0 | 0.0 |
| 2.0424 | 4.9505 | 8000 | 1.7269 | 0.0302 | 0.0571 | 0.0391 | 0.0 | 0.0062 | 0.0693 | 0.0431 | 0.1223 | 0.2887 | 0.0 | 0.2152 | 0.3659 | 0.0112 | 0.3222 | 0.0793 | 0.544 | 0.0 | 0.0 |
| 1.3068 | 5.0743 | 8200 | 1.6624 | 0.0233 | 0.0437 | 0.0233 | 0.0 | 0.0059 | 0.0579 | 0.0426 | 0.1324 | 0.293 | 0.0 | 0.2378 | 0.385 | 0.0109 | 0.319 | 0.0589 | 0.56 | 0.0 | 0.0 |
| 1.7284 | 5.1980 | 8400 | 1.6596 | 0.0286 | 0.0553 | 0.0327 | 0.0 | 0.0067 | 0.0699 | 0.0497 | 0.1385 | 0.2959 | 0.0 | 0.2515 | 0.3932 | 0.013 | 0.3278 | 0.0728 | 0.56 | 0.0 | 0.0 |
| 2.2819 | 5.3218 | 8600 | 1.6320 | 0.0221 | 0.0446 | 0.0151 | 0.0 | 0.0061 | 0.058 | 0.045 | 0.142 | 0.3095 | 0.0 | 0.2607 | 0.4043 | 0.013 | 0.3484 | 0.0533 | 0.58 | 0.0 | 0.0 |
| 1.2565 | 5.4455 | 8800 | 1.6319 | 0.0241 | 0.0507 | 0.0224 | 0.0 | 0.0057 | 0.0567 | 0.0399 | 0.1444 | 0.2943 | 0.0 | 0.2259 | 0.3918 | 0.0113 | 0.3429 | 0.061 | 0.54 | 0.0 | 0.0 |
| 2.0009 | 5.5693 | 9000 | 1.6232 | 0.0228 | 0.0461 | 0.0226 | 0.0 | 0.006 | 0.056 | 0.0405 | 0.1401 | 0.3026 | 0.0 | 0.2203 | 0.4063 | 0.0122 | 0.3437 | 0.0563 | 0.564 | 0.0 | 0.0 |
| 1.1549 | 5.6931 | 9200 | 1.6247 | 0.0241 | 0.0472 | 0.0221 | 0.0 | 0.0065 | 0.0703 | 0.0455 | 0.1297 | 0.3044 | 0.0 | 0.2236 | 0.4165 | 0.0144 | 0.3611 | 0.0579 | 0.552 | 0.0 | 0.0 |
| 1.3283 | 5.8168 | 9400 | 1.6025 | 0.0243 | 0.0497 | 0.023 | 0.0 | 0.0064 | 0.0683 | 0.0383 | 0.1299 | 0.3044 | 0.0 | 0.214 | 0.421 | 0.0137 | 0.3532 | 0.0592 | 0.56 | 0.0 | 0.0 |
| 1.9137 | 5.9406 | 9600 | 1.6489 | 0.0234 | 0.0454 | 0.024 | 0.0 | 0.0065 | 0.0713 | 0.0418 | 0.1353 | 0.3055 | 0.0 | 0.2311 | 0.4166 | 0.0127 | 0.3444 | 0.0576 | 0.572 | 0.0 | 0.0 |
| 1.8353 | 6.0644 | 9800 | 1.6261 | 0.0234 | 0.0451 | 0.0237 | 0.0 | 0.0064 | 0.0638 | 0.0426 | 0.1372 | 0.306 | 0.0 | 0.2318 | 0.4229 | 0.0127 | 0.35 | 0.0576 | 0.568 | 0.0 | 0.0 |
| 1.5491 | 6.1881 | 10000 | 1.6175 | 0.0235 | 0.0454 | 0.0236 | 0.0 | 0.0065 | 0.0651 | 0.0428 | 0.1393 | 0.3049 | 0.0 | 0.2244 | 0.4274 | 0.0131 | 0.3508 | 0.0574 | 0.564 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"d0",
"d10",
"d20",
"d40"
] |
edm-research/detr-resnet-50-dc5-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 4.8347
- eval_map: 0.0002
- eval_map_50: 0.0006
- eval_map_75: 0.0001
- eval_map_small: 0.0003
- eval_map_medium: 0.0004
- eval_map_large: 0.0002
- eval_mar_1: 0.0021
- eval_mar_10: 0.0063
- eval_mar_100: 0.0072
- eval_mar_small: 0.004
- eval_mar_medium: 0.0079
- eval_mar_large: 0.0062
- eval_map_shirt, blouse: 0.0
- eval_mar_100_shirt, blouse: 0.0
- eval_map_top, t-shirt, sweatshirt: 0.0002
- eval_mar_100_top, t-shirt, sweatshirt: 0.0053
- eval_map_sweater: 0.0
- eval_mar_100_sweater: 0.0
- eval_map_cardigan: 0.0
- eval_mar_100_cardigan: 0.0
- eval_map_jacket: 0.0
- eval_mar_100_jacket: 0.012
- eval_map_vest: 0.0
- eval_mar_100_vest: 0.0
- eval_map_pants: 0.0
- eval_mar_100_pants: 0.0
- eval_map_shorts: 0.0
- eval_mar_100_shorts: 0.0048
- eval_map_skirt: 0.0003
- eval_mar_100_skirt: 0.0255
- eval_map_coat: 0.0
- eval_mar_100_coat: 0.0087
- eval_map_dress: 0.0
- eval_mar_100_dress: 0.0
- eval_map_jumpsuit: 0.0
- eval_mar_100_jumpsuit: 0.0
- eval_map_cape: 0.0
- eval_mar_100_cape: 0.0
- eval_map_glasses: 0.0
- eval_mar_100_glasses: 0.0038
- eval_map_hat: 0.0
- eval_mar_100_hat: 0.0
- eval_map_headband, head covering, hair accessory: 0.0
- eval_mar_100_headband, head covering, hair accessory: 0.0
- eval_map_tie: 0.0
- eval_mar_100_tie: 0.0
- eval_map_glove: 0.0
- eval_mar_100_glove: 0.0
- eval_map_watch: 0.0
- eval_mar_100_watch: 0.0
- eval_map_belt: 0.0
- eval_mar_100_belt: 0.0
- eval_map_leg warmer: 0.0
- eval_mar_100_leg warmer: 0.0
- eval_map_tights, stockings: 0.0
- eval_mar_100_tights, stockings: 0.0
- eval_map_sock: 0.0
- eval_mar_100_sock: 0.0
- eval_map_shoe: 0.0091
- eval_mar_100_shoe: 0.1905
- eval_map_bag, wallet: 0.0
- eval_mar_100_bag, wallet: 0.0
- eval_map_scarf: 0.0
- eval_mar_100_scarf: 0.0
- eval_map_umbrella: 0.0
- eval_mar_100_umbrella: 0.0
- eval_map_hood: 0.0
- eval_mar_100_hood: 0.0
- eval_map_collar: 0.0
- eval_mar_100_collar: 0.0
- eval_map_lapel: 0.0
- eval_mar_100_lapel: 0.0
- eval_map_epaulette: 0.0
- eval_mar_100_epaulette: 0.0
- eval_map_sleeve: 0.0003
- eval_mar_100_sleeve: 0.0075
- eval_map_pocket: 0.0
- eval_mar_100_pocket: 0.0
- eval_map_neckline: 0.0001
- eval_mar_100_neckline: 0.0066
- eval_map_buckle: 0.0
- eval_mar_100_buckle: 0.0
- eval_map_zipper: 0.0
- eval_mar_100_zipper: 0.0
- eval_map_applique: 0.0
- eval_mar_100_applique: 0.0
- eval_map_bead: 0.0
- eval_mar_100_bead: 0.0
- eval_map_bow: 0.0
- eval_mar_100_bow: 0.0
- eval_map_flower: 0.0
- eval_mar_100_flower: 0.0
- eval_map_fringe: 0.0
- eval_mar_100_fringe: 0.0
- eval_map_ribbon: 0.0
- eval_mar_100_ribbon: 0.0
- eval_map_rivet: 0.0
- eval_mar_100_rivet: 0.0
- eval_map_ruffle: 0.0008
- eval_mar_100_ruffle: 0.0684
- eval_map_sequin: 0.0
- eval_mar_100_sequin: 0.0
- eval_map_tassel: 0.0
- eval_mar_100_tassel: 0.0
- eval_runtime: 217.1201
- eval_samples_per_second: 5.333
- eval_steps_per_second: 1.336
- epoch: 0.0132
- step: 150
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
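The dataset is not documented, but its approximate size can be inferred from the checkpoint metadata above (epoch 0.0132 at step 150, batch size 4). The result, roughly 45k training images, is consistent with the Fashionpedia train split that the model name suggests — this is an inference, not a documented fact:

```python
# Values from the checkpoint summary above.
epoch_fraction = 0.0132   # epoch counter at the evaluated checkpoint
step = 150                # optimizer steps taken so far
train_batch_size = 4

steps_per_epoch = step / epoch_fraction                      # ~11364 steps/epoch
approx_train_examples = steps_per_epoch * train_batch_size   # ~45455 images

print(f"~{steps_per_epoch:.0f} steps/epoch, ~{approx_train_examples:.0f} training images")
```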
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.49.0
- Pytorch 2.5.1+cu124
- Datasets 3.3.1
- Tokenizers 0.21.0
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
joortif/practica_2_detr_resnet50 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# practica_2_detr_resnet50
This model is a fine-tuned version of [joortif/practica_2_detr](https://huggingface.co/joortif/practica_2_detr) on an unspecified dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"n/a",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"street sign",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"hat",
"backpack",
"umbrella",
"shoe",
"eye glasses",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"plate",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"mirror",
"dining table",
"window",
"desk",
"toilet",
"door",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"blender",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
Vorlve/detr-finetuned-pothole |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
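Pending the official snippet, a hedged sketch: it assumes this checkpoint loads with the standard transformers object-detection Auto classes (requires `transformers`, `torch`, and `Pillow`). The `ID2LABEL` mapping comes from this card's label metadata, and the `detect` helper is defined but not executed here.

```python
ID2LABEL = {0: "background", 1: "class_1"}  # from this card's metadata


def detect(image_path, checkpoint="Vorlve/detr-finetuned-pothole", threshold=0.5):
    """Hedged sketch: run the fine-tuned detector on a single image and
    return the post-processed detections. Downloads the checkpoint on
    first use; requires `transformers`, `torch`, and `Pillow`."""
    import torch
    from PIL import Image
    from transformers import AutoImageProcessor, AutoModelForObjectDetection

    processor = AutoImageProcessor.from_pretrained(checkpoint)
    model = AutoModelForObjectDetection.from_pretrained(checkpoint)

    image = Image.open(image_path)
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
    return processor.post_process_object_detection(
        outputs, target_sizes=target_sizes, threshold=threshold
    )[0]
```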
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"background",
"class_1"
] |
Dasith77/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
Bravo94/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
ustc-community/dfine-large-coco | ## D-FINE
### **Overview**
The D-FINE model was proposed in [D-FINE: Redefine Regression Task in DETRs as Fine-grained Distribution Refinement](https://arxiv.org/abs/2410.13842) by
Yansong Peng, Hebei Li, Peixi Wu, Yueyi Zhang, Xiaoyan Sun, Feng Wu
This model was contributed by [VladOS95-cyber](https://github.com/VladOS95-cyber) with the help of [@qubvel-hf](https://huggingface.co/qubvel-hf)
This is the HF Transformers implementation of D-FINE.
- `_coco` -> model trained on COCO
- `_obj365` -> model trained on Object365
- `_obj2coco` -> model trained on Object365 and then fine-tuned on COCO
### **Performance**
D-FINE is a powerful real-time object detector that achieves outstanding localization precision by redefining the bounding-box regression task in DETR models. It comprises two key components: Fine-grained Distribution Refinement (FDR) and Global Optimal Localization Self-Distillation (GO-LSD).

### **How to use**
```python
import torch
import requests
from PIL import Image
from transformers import DFineForObjectDetection, AutoImageProcessor

# Load a sample image from the COCO validation set.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

image_processor = AutoImageProcessor.from_pretrained("ustc-community/dfine-large-coco")
model = DFineForObjectDetection.from_pretrained("ustc-community/dfine-large-coco")

inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to per-image detections, keeping scores above 0.3.
results = image_processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.3
)

for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
### **Training**
D-FINE is trained on the COCO (Lin et al. [2014]) train2017 split and validated on COCO val2017. We report the standard AP metric (averaged over uniformly sampled IoU thresholds from 0.50 to 0.95 with a step size of 0.05), as well as AP on the 5,000-image validation subset (AP<sup>val</sup><sub>5000</sub>) commonly used in real-world scenarios.
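The COCO-style AP above is simply an average of per-threshold APs over the ten IoU thresholds 0.50, 0.55, ..., 0.95. A small sketch of that averaging (the per-threshold AP function below is made up for illustration):

```python
def coco_map(ap_at_iou):
    """Average AP over the 10 COCO IoU thresholds 0.50:0.05:0.95."""
    thresholds = [0.50 + 0.05 * i for i in range(10)]
    return sum(ap_at_iou(t) for t in thresholds) / len(thresholds)

# Toy detector whose AP decays linearly with the IoU threshold.
print(round(coco_map(lambda t: 1.0 - t), 3))  # 0.275
```

Real evaluation computes each per-threshold AP from precision-recall curves over matched detections; the sketch only shows the final averaging step.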
### **Applications**
D-FINE is ideal for real-time object detection in diverse applications such as **autonomous driving**, **surveillance systems**, **robotics**, and **retail analytics**. Its enhanced flexibility and deployment-friendly design suit both edge devices and large-scale systems, while ensuring high accuracy and speed in dynamic, real-world environments. | [
"person",
"bicycle",
"car",
"motorbike",
"aeroplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"sofa",
"pottedplant",
"bed",
"diningtable",
"toilet",
"tvmonitor",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
ustc-community/dfine-medium-coco | ## D-FINE
### **Overview**
The D-FINE model was proposed in [D-FINE: Redefine Regression Task in DETRs as Fine-grained Distribution Refinement](https://arxiv.org/abs/2410.13842) by
Yansong Peng, Hebei Li, Peixi Wu, Yueyi Zhang, Xiaoyan Sun, Feng Wu
This model was contributed by [VladOS95-cyber](https://github.com/VladOS95-cyber) with the help of [@qubvel-hf](https://huggingface.co/qubvel-hf)
This is the HF Transformers implementation of D-FINE.
- `_coco` -> model trained on COCO
- `_obj365` -> model trained on Object365
- `_obj2coco` -> model trained on Object365 and then fine-tuned on COCO
### **Performance**
D-FINE is a powerful real-time object detector that achieves outstanding localization precision by redefining the bounding-box regression task in DETR models. It comprises two key components: Fine-grained Distribution Refinement (FDR) and Global Optimal Localization Self-Distillation (GO-LSD).

### **How to use**
```python
import torch
import requests
from PIL import Image
from transformers import DFineForObjectDetection, AutoImageProcessor

# Load a sample image from the COCO validation set.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

image_processor = AutoImageProcessor.from_pretrained("ustc-community/dfine-medium-coco")
model = DFineForObjectDetection.from_pretrained("ustc-community/dfine-medium-coco")

inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to per-image detections, keeping scores above 0.3.
results = image_processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.3
)

for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
### **Training**
D-FINE is trained on the COCO (Lin et al. [2014]) train2017 split and validated on COCO val2017. We report the standard AP metric (averaged over uniformly sampled IoU thresholds from 0.50 to 0.95 with a step size of 0.05), as well as AP on the 5,000-image validation subset (AP<sup>val</sup><sub>5000</sub>) commonly used in real-world scenarios.
### **Applications**
D-FINE is ideal for real-time object detection in diverse applications such as **autonomous driving**, **surveillance systems**, **robotics**, and **retail analytics**. Its enhanced flexibility and deployment-friendly design suit both edge devices and large-scale systems, while ensuring high accuracy and speed in dynamic, real-world environments. | [
"person",
"bicycle",
"car",
"motorbike",
"aeroplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"sofa",
"pottedplant",
"bed",
"diningtable",
"toilet",
"tvmonitor",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
JeanCGuerrero/Practica2 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"banana",
"orange",
"apple"
] |
ARG-NCTU/detr-resnet-50-finetuned-300-epochs-Kaohsiung-Port-dataset |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-finetuned-300-epochs-Kaohsiung-Port-dataset
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
### Training results
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"ballonboat",
"bigboat",
"boat",
"jetski",
"katamaran",
"sailboat",
"smallboat",
"speedboat",
"wam_v",
"container_ship",
"tugship",
"yacht",
"blueboat"
] |
ARG-NCTU/detr-resnet-50-finetuned-1000-epochs-Kaohsiung-Port-dataset |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-finetuned-1000-epochs-Kaohsiung-Port-dataset
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
### Training results
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"ballonboat",
"bigboat",
"boat",
"jetski",
"katamaran",
"sailboat",
"smallboat",
"speedboat",
"wam_v",
"container_ship",
"tugship",
"yacht",
"blueboat"
] |
vladislavbro/dfine_l_coco-cppe5-finetune |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dfine_l_coco-cppe5-finetune
This model is a fine-tuned version of [vladislavbro/dfine_l_coco](https://huggingface.co/vladislavbro/dfine_l_coco) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.7496
- Map: 0.4281
- Map 50: 0.6128
- Map 75: 0.4751
- Map Small: 0.5217
- Map Medium: 0.3947
- Map Large: 0.5981
- Mar 1: 0.3348
- Mar 10: 0.6726
- Mar 100: 0.7561
- Mar Small: 0.6875
- Mar Medium: 0.6638
- Mar Large: 0.8701
- Map Coverall: 0.2303
- Mar 100 Coverall: 0.7923
- Map Face Shield: 0.559
- Mar 100 Face Shield: 0.7588
- Map Gloves: 0.5018
- Mar 100 Gloves: 0.7492
- Map Goggles: 0.3248
- Mar 100 Goggles: 0.7448
- Map Mask: 0.5244
- Mar 100 Mask: 0.7353
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 40
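With a linear scheduler and 300 warmup steps, the learning rate ramps up linearly to the base value and then decays linearly to zero over the remaining steps (the results table below ends at step 4280, i.e. 40 epochs x 107 steps). A rough pure-Python sketch of that schedule:

```python
def linear_schedule_with_warmup(step, warmup_steps, total_steps, base_lr):
    """Linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / (total_steps - warmup_steps))

base_lr = 5e-5
print(linear_schedule_with_warmup(150, 300, 4280, base_lr))   # halfway through warmup
print(linear_schedule_with_warmup(300, 300, 4280, base_lr))   # peak: 5e-05
print(linear_schedule_with_warmup(4280, 300, 4280, base_lr))  # end of training: 0.0
```

This mirrors the shape of the Transformers linear scheduler; the exact per-step values in training come from the library's implementation, not this sketch.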
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 12.2583 | 0.0076 | 0.017 | 0.0066 | 0.0004 | 0.002 | 0.0171 | 0.0283 | 0.0909 | 0.1254 | 0.0308 | 0.0555 | 0.2559 | 0.0322 | 0.295 | 0.0001 | 0.0506 | 0.0012 | 0.1147 | 0.0006 | 0.0846 | 0.004 | 0.0822 |
| No log | 2.0 | 214 | 6.2711 | 0.0583 | 0.1115 | 0.0522 | 0.0223 | 0.0361 | 0.09 | 0.092 | 0.2671 | 0.3486 | 0.1993 | 0.2602 | 0.568 | 0.1623 | 0.4869 | 0.0048 | 0.3139 | 0.0281 | 0.2795 | 0.0119 | 0.28 | 0.0844 | 0.3827 |
| No log | 3.0 | 321 | 4.7100 | 0.087 | 0.1571 | 0.083 | 0.041 | 0.0909 | 0.1375 | 0.1294 | 0.3482 | 0.4407 | 0.29 | 0.351 | 0.6387 | 0.1513 | 0.6532 | 0.0134 | 0.4076 | 0.0659 | 0.333 | 0.0127 | 0.3908 | 0.1916 | 0.4191 |
| No log | 4.0 | 428 | 4.2788 | 0.1351 | 0.2495 | 0.1295 | 0.0714 | 0.122 | 0.2464 | 0.1838 | 0.3833 | 0.4827 | 0.3506 | 0.3924 | 0.6756 | 0.2092 | 0.6293 | 0.0668 | 0.4696 | 0.1164 | 0.4433 | 0.0349 | 0.4046 | 0.2481 | 0.4667 |
| 11.3943 | 5.0 | 535 | 4.0960 | 0.1555 | 0.2785 | 0.1545 | 0.128 | 0.1607 | 0.252 | 0.1898 | 0.416 | 0.5219 | 0.355 | 0.4384 | 0.7168 | 0.1857 | 0.6329 | 0.1135 | 0.5392 | 0.1512 | 0.475 | 0.0836 | 0.4738 | 0.2435 | 0.4884 |
| 11.3943 | 6.0 | 642 | 4.0103 | 0.1497 | 0.2754 | 0.1512 | 0.0884 | 0.1426 | 0.2863 | 0.2146 | 0.4285 | 0.5291 | 0.3729 | 0.4504 | 0.6933 | 0.2234 | 0.6653 | 0.0682 | 0.557 | 0.1849 | 0.4509 | 0.065 | 0.5046 | 0.2071 | 0.4676 |
| 11.3943 | 7.0 | 749 | 3.7960 | 0.1528 | 0.2793 | 0.1543 | 0.0742 | 0.1214 | 0.3125 | 0.2204 | 0.4128 | 0.5311 | 0.3611 | 0.4468 | 0.7385 | 0.2321 | 0.686 | 0.078 | 0.5139 | 0.1677 | 0.4862 | 0.059 | 0.4662 | 0.227 | 0.5031 |
| 11.3943 | 8.0 | 856 | 3.7461 | 0.1474 | 0.2686 | 0.1466 | 0.0955 | 0.1477 | 0.2527 | 0.2114 | 0.4098 | 0.5052 | 0.322 | 0.4119 | 0.6795 | 0.2048 | 0.6473 | 0.0694 | 0.4418 | 0.1723 | 0.4705 | 0.0286 | 0.4785 | 0.2622 | 0.488 |
| 11.3943 | 9.0 | 963 | 3.6575 | 0.1386 | 0.2621 | 0.1295 | 0.1228 | 0.1357 | 0.23 | 0.1948 | 0.4493 | 0.5514 | 0.3762 | 0.5058 | 0.7323 | 0.1968 | 0.7077 | 0.0786 | 0.5468 | 0.1729 | 0.5022 | 0.0351 | 0.4923 | 0.2099 | 0.508 |
| 4.7509 | 10.0 | 1070 | 3.6422 | 0.1395 | 0.27 | 0.1287 | 0.1266 | 0.1418 | 0.249 | 0.201 | 0.4301 | 0.5342 | 0.3716 | 0.4432 | 0.7413 | 0.1311 | 0.6797 | 0.1029 | 0.5089 | 0.1867 | 0.4884 | 0.0799 | 0.4892 | 0.1971 | 0.5049 |
| 4.7509 | 11.0 | 1177 | 3.5627 | 0.1822 | 0.3267 | 0.1794 | 0.1295 | 0.1738 | 0.309 | 0.2261 | 0.4396 | 0.5464 | 0.3643 | 0.4763 | 0.7369 | 0.2486 | 0.7045 | 0.1213 | 0.5253 | 0.2195 | 0.5085 | 0.0906 | 0.5 | 0.231 | 0.4938 |
| 4.7509 | 12.0 | 1284 | 3.5589 | 0.1776 | 0.322 | 0.1704 | 0.1469 | 0.1596 | 0.2852 | 0.2269 | 0.4563 | 0.5585 | 0.4086 | 0.4839 | 0.7538 | 0.2413 | 0.6946 | 0.1428 | 0.5405 | 0.2158 | 0.5254 | 0.0761 | 0.5308 | 0.212 | 0.5013 |
| 4.7509 | 13.0 | 1391 | 3.5184 | 0.1667 | 0.3119 | 0.1545 | 0.1451 | 0.1796 | 0.2856 | 0.2249 | 0.4335 | 0.5406 | 0.3279 | 0.4512 | 0.7274 | 0.1494 | 0.6608 | 0.1201 | 0.5038 | 0.226 | 0.5129 | 0.0864 | 0.5231 | 0.2516 | 0.5022 |
| 4.7509 | 14.0 | 1498 | 3.4570 | 0.2238 | 0.3985 | 0.2242 | 0.1642 | 0.1924 | 0.3459 | 0.2632 | 0.4688 | 0.5615 | 0.4281 | 0.5003 | 0.7426 | 0.2959 | 0.6986 | 0.2161 | 0.5835 | 0.2167 | 0.5 | 0.0976 | 0.5231 | 0.2925 | 0.5022 |
| 4.2981 | 15.0 | 1605 | 3.4975 | 0.1898 | 0.3529 | 0.1835 | 0.1699 | 0.1706 | 0.3331 | 0.2261 | 0.4625 | 0.5597 | 0.4369 | 0.477 | 0.7396 | 0.2577 | 0.7027 | 0.1725 | 0.5456 | 0.2175 | 0.5317 | 0.059 | 0.5185 | 0.2421 | 0.5 |
| 4.2981 | 16.0 | 1712 | 3.4823 | 0.2029 | 0.3588 | 0.1989 | 0.185 | 0.1916 | 0.3601 | 0.237 | 0.4557 | 0.5619 | 0.4026 | 0.4934 | 0.7379 | 0.1876 | 0.6865 | 0.1993 | 0.5506 | 0.2333 | 0.5344 | 0.1213 | 0.5446 | 0.2729 | 0.4933 |
| 4.2981 | 17.0 | 1819 | 3.4680 | 0.1966 | 0.3431 | 0.1914 | 0.1928 | 0.2102 | 0.3505 | 0.2367 | 0.4482 | 0.5614 | 0.3739 | 0.4811 | 0.7557 | 0.1899 | 0.6878 | 0.1736 | 0.5253 | 0.229 | 0.5348 | 0.1088 | 0.5569 | 0.2818 | 0.5022 |
| 4.2981 | 18.0 | 1926 | 3.5169 | 0.1921 | 0.3388 | 0.1911 | 0.1347 | 0.1813 | 0.3905 | 0.2223 | 0.4443 | 0.5567 | 0.3799 | 0.4839 | 0.7512 | 0.1424 | 0.6617 | 0.168 | 0.5253 | 0.2432 | 0.5571 | 0.1306 | 0.52 | 0.2761 | 0.5196 |
| 4.0614 | 19.0 | 2033 | 3.4963 | 0.2067 | 0.365 | 0.1956 | 0.139 | 0.1892 | 0.3829 | 0.2504 | 0.4607 | 0.5676 | 0.4202 | 0.4924 | 0.7511 | 0.1461 | 0.682 | 0.2216 | 0.5722 | 0.2319 | 0.5482 | 0.1713 | 0.5323 | 0.2628 | 0.5031 |
| 4.0614 | 20.0 | 2140 | 3.5090 | 0.1924 | 0.3403 | 0.1858 | 0.1192 | 0.1871 | 0.3734 | 0.2191 | 0.4486 | 0.5493 | 0.3453 | 0.4729 | 0.748 | 0.0925 | 0.6815 | 0.1988 | 0.5013 | 0.2504 | 0.5585 | 0.1529 | 0.5108 | 0.2672 | 0.4942 |
| 4.0614 | 21.0 | 2247 | 3.4683 | 0.1993 | 0.3635 | 0.1858 | 0.145 | 0.1918 | 0.371 | 0.2372 | 0.4411 | 0.5551 | 0.3925 | 0.484 | 0.7343 | 0.13 | 0.6883 | 0.1896 | 0.5304 | 0.2478 | 0.542 | 0.1657 | 0.5031 | 0.2633 | 0.5116 |
| 4.0614 | 22.0 | 2354 | 3.4556 | 0.2066 | 0.3627 | 0.2075 | 0.1562 | 0.1901 | 0.3959 | 0.2452 | 0.4486 | 0.553 | 0.3986 | 0.477 | 0.7445 | 0.1576 | 0.6874 | 0.2169 | 0.5595 | 0.252 | 0.5263 | 0.1653 | 0.4846 | 0.2413 | 0.5071 |
| 4.0614 | 23.0 | 2461 | 3.4420 | 0.2009 | 0.3516 | 0.1954 | 0.18 | 0.1826 | 0.4079 | 0.2378 | 0.4519 | 0.5514 | 0.3677 | 0.4749 | 0.7416 | 0.1585 | 0.6919 | 0.1463 | 0.5342 | 0.2631 | 0.5263 | 0.1687 | 0.4969 | 0.2678 | 0.5076 |
| 3.8839 | 24.0 | 2568 | 3.3910 | 0.2333 | 0.4059 | 0.2262 | 0.1423 | 0.1911 | 0.4386 | 0.2506 | 0.4617 | 0.5561 | 0.3432 | 0.486 | 0.7486 | 0.2541 | 0.7036 | 0.1829 | 0.5304 | 0.2639 | 0.542 | 0.1909 | 0.4954 | 0.2748 | 0.5093 |
| 3.8839 | 25.0 | 2675 | 3.4301 | 0.222 | 0.3794 | 0.2246 | 0.152 | 0.1998 | 0.3831 | 0.245 | 0.4505 | 0.5346 | 0.3242 | 0.4696 | 0.722 | 0.2289 | 0.6932 | 0.1822 | 0.4873 | 0.2853 | 0.5554 | 0.1424 | 0.4431 | 0.2713 | 0.4938 |
| 3.8839 | 26.0 | 2782 | 3.4264 | 0.203 | 0.3622 | 0.1955 | 0.1845 | 0.1811 | 0.388 | 0.2422 | 0.4466 | 0.5467 | 0.3682 | 0.4734 | 0.7354 | 0.1886 | 0.6874 | 0.1586 | 0.4886 | 0.2498 | 0.5348 | 0.1347 | 0.52 | 0.2834 | 0.5027 |
| 3.8839 | 27.0 | 2889 | 3.3817 | 0.2254 | 0.3928 | 0.2197 | 0.203 | 0.1945 | 0.4092 | 0.2524 | 0.4532 | 0.5564 | 0.3589 | 0.4826 | 0.74 | 0.1935 | 0.6806 | 0.1759 | 0.5215 | 0.2837 | 0.5513 | 0.182 | 0.5185 | 0.2919 | 0.5102 |
| 3.8839 | 28.0 | 2996 | 3.4003 | 0.2256 | 0.3852 | 0.2182 | 0.1903 | 0.2088 | 0.4035 | 0.2552 | 0.4636 | 0.5604 | 0.3932 | 0.4856 | 0.7376 | 0.146 | 0.6932 | 0.2443 | 0.5468 | 0.2793 | 0.5509 | 0.1566 | 0.5046 | 0.302 | 0.5062 |
| 3.7638 | 29.0 | 3103 | 3.3650 | 0.2342 | 0.4079 | 0.2297 | 0.1691 | 0.2007 | 0.4231 | 0.2638 | 0.4664 | 0.5654 | 0.3828 | 0.4885 | 0.7596 | 0.2185 | 0.7063 | 0.2383 | 0.5608 | 0.2726 | 0.5558 | 0.1471 | 0.5046 | 0.2948 | 0.4996 |
| 3.7638 | 30.0 | 3210 | 3.3953 | 0.2297 | 0.4083 | 0.2263 | 0.1908 | 0.2014 | 0.4143 | 0.2484 | 0.4648 | 0.5632 | 0.3895 | 0.4812 | 0.7554 | 0.1799 | 0.705 | 0.2465 | 0.5481 | 0.2694 | 0.5594 | 0.1581 | 0.4892 | 0.2944 | 0.5142 |
| 3.7638 | 31.0 | 3317 | 3.3695 | 0.2321 | 0.4052 | 0.2357 | 0.1867 | 0.2086 | 0.4357 | 0.2659 | 0.4676 | 0.5651 | 0.39 | 0.491 | 0.7463 | 0.1686 | 0.6937 | 0.2441 | 0.5519 | 0.2729 | 0.5384 | 0.1775 | 0.5354 | 0.2975 | 0.5062 |
| 3.7638 | 32.0 | 3424 | 3.3628 | 0.2376 | 0.4163 | 0.2363 | 0.1931 | 0.2083 | 0.4274 | 0.2681 | 0.4661 | 0.5522 | 0.354 | 0.4712 | 0.7547 | 0.1974 | 0.6995 | 0.2428 | 0.5127 | 0.2792 | 0.5397 | 0.1609 | 0.4954 | 0.3077 | 0.5138 |
| 3.6477 | 33.0 | 3531 | 3.3567 | 0.2392 | 0.4142 | 0.2331 | 0.2006 | 0.2027 | 0.4335 | 0.2702 | 0.4717 | 0.5703 | 0.3965 | 0.4984 | 0.7471 | 0.1983 | 0.695 | 0.2472 | 0.5633 | 0.275 | 0.5629 | 0.1709 | 0.5169 | 0.3045 | 0.5133 |
| 3.6477 | 34.0 | 3638 | 3.3581 | 0.2357 | 0.4126 | 0.2298 | 0.1897 | 0.2126 | 0.4235 | 0.2594 | 0.4593 | 0.5509 | 0.3477 | 0.473 | 0.7392 | 0.178 | 0.6932 | 0.2369 | 0.5076 | 0.2834 | 0.5571 | 0.1692 | 0.48 | 0.311 | 0.5164 |
| 3.6477 | 35.0 | 3745 | 3.3458 | 0.2396 | 0.4177 | 0.2337 | 0.1989 | 0.2046 | 0.4346 | 0.254 | 0.4651 | 0.5626 | 0.3517 | 0.493 | 0.7448 | 0.2062 | 0.709 | 0.2244 | 0.5228 | 0.2838 | 0.5625 | 0.1758 | 0.5092 | 0.308 | 0.5093 |
| 3.6477 | 36.0 | 3852 | 3.3376 | 0.2385 | 0.419 | 0.2413 | 0.192 | 0.2111 | 0.4405 | 0.25 | 0.4648 | 0.5564 | 0.3456 | 0.4818 | 0.7521 | 0.1978 | 0.7009 | 0.2216 | 0.5063 | 0.2831 | 0.5491 | 0.1734 | 0.5154 | 0.3167 | 0.5102 |
| 3.6477 | 37.0 | 3959 | 3.3495 | 0.2346 | 0.4103 | 0.2254 | 0.2041 | 0.211 | 0.4315 | 0.2689 | 0.4662 | 0.5579 | 0.3562 | 0.4948 | 0.7437 | 0.1723 | 0.6833 | 0.2407 | 0.5278 | 0.2739 | 0.5487 | 0.1757 | 0.5169 | 0.3106 | 0.5129 |
| 3.5391 | 38.0 | 4066 | 3.3427 | 0.2369 | 0.4094 | 0.2328 | 0.1985 | 0.2118 | 0.4304 | 0.2582 | 0.4669 | 0.5656 | 0.3609 | 0.4957 | 0.7469 | 0.174 | 0.7041 | 0.24 | 0.538 | 0.2738 | 0.5522 | 0.1868 | 0.5215 | 0.3097 | 0.5124 |
| 3.5391 | 39.0 | 4173 | 3.3297 | 0.2371 | 0.4149 | 0.2301 | 0.1992 | 0.2096 | 0.4322 | 0.2619 | 0.4668 | 0.5626 | 0.3571 | 0.493 | 0.7464 | 0.1956 | 0.7005 | 0.2222 | 0.5253 | 0.2832 | 0.5661 | 0.1759 | 0.5077 | 0.3089 | 0.5133 |
| 3.5391 | 40.0 | 4280 | 3.3385 | 0.2371 | 0.4143 | 0.2309 | 0.1937 | 0.211 | 0.4367 | 0.2604 | 0.4634 | 0.5594 | 0.3529 | 0.4907 | 0.7388 | 0.1874 | 0.6986 | 0.2384 | 0.5241 | 0.2805 | 0.5652 | 0.1742 | 0.5 | 0.3048 | 0.5093 |
### Framework versions
- Transformers 4.50.0.dev0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
s87thafe/table-transformer-finetuned-gfs-tables-v1 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"table",
"table column",
"table row",
"table column header",
"table projected row header",
"table spanning cell"
] |
rsfsjsu/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.49.0
- Pytorch 2.5.1
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
franperic/yolos-datamatrix |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
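Until this section is filled in, a minimal sketch of running the checkpoint through the 🤗 `object-detection` pipeline might look like the following. The score threshold and the image path are assumptions for illustration, not values documented in this card:

```python
def detect_datamatrix(image_path, score_threshold=0.5):
    """Run the fine-tuned checkpoint on a single image.

    The transformers import is kept inside the function so the pure-Python
    helper below stays usable without the library installed.
    """
    from transformers import pipeline  # requires `pip install transformers`

    detector = pipeline("object-detection", model="franperic/yolos-datamatrix")
    return filter_by_score(detector(image_path), score_threshold)


def filter_by_score(detections, score_threshold):
    """Keep only detections whose confidence clears the threshold."""
    return [d for d in detections if d["score"] >= score_threshold]
```

The pipeline returns a list of dicts with `score`, `label`, and `box` keys; filtering low-confidence hits afterwards is a common post-processing step.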
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"background",
"datamatrix"
] |
madhava-yallanki/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.6375
- Map: 0.0618
- Map 50: 0.1727
- Map 75: 0.0707
- Map Small: 0.0
- Map Medium: 0.1642
- Map Large: 0.0627
- Mar 1: 0.0908
- Mar 10: 0.1208
- Mar 100: 0.1608
- Mar Small: 0.0
- Mar Medium: 0.1983
- Mar Large: 0.2208
- Map Coverall: 0.2826
- Mar 100 Coverall: 0.625
- Map Face Shield: 0.0
- Mar 100 Face Shield: 0.0
- Map Gloves: 0.023
- Mar 100 Gloves: 0.1625
- Map Goggles: 0.0
- Mar 100 Goggles: 0.0
- Map Mask: 0.0034
- Mar 100 Mask: 0.0167
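The `Small`/`Medium`/`Large` breakdown above follows the COCO convention of bucketing objects by pixel area, with thresholds at 32² and 96² pixels. A quick sketch of that rule:

```python
def coco_size_bucket(width, height):
    """Classify a box as small/medium/large using COCO's 32^2 and 96^2 pixel-area thresholds."""
    area = width * height
    if area < 32 ** 2:
        return "small"
    if area < 96 ** 2:
        return "medium"
    return "large"
```

A `Map Small` of 0.0, as reported here, therefore means the model found essentially no objects smaller than 32×32 pixels.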
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
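With the cosine scheduler listed above, the learning rate decays from 5e-05 toward zero over training. A sketch of the schedule (assuming no warmup, which the hyperparameters above do not mention, though the exact Trainer configuration is not documented here):

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-5):
    """Cosine-annealed learning rate, decaying smoothly from base_lr to 0."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

The slow tail of the cosine curve is consistent with the nearly identical metrics in the final few epochs of the table below.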
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 2 | 3.6370 | 0.0139 | 0.0385 | 0.0064 | 0.0 | 0.0447 | 0.0205 | 0.015 | 0.065 | 0.0725 | 0.0 | 0.13 | 0.075 | 0.0694 | 0.35 | 0.0 | 0.0 | 0.0002 | 0.0125 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 2.0 | 4 | 3.3372 | 0.0084 | 0.023 | 0.0106 | 0.0 | 0.0261 | 0.0077 | 0.0 | 0.08 | 0.0975 | 0.0 | 0.185 | 0.0833 | 0.0404 | 0.4 | 0.0 | 0.0 | 0.0016 | 0.0875 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 3.0 | 6 | 3.2809 | 0.0175 | 0.0686 | 0.0096 | 0.0 | 0.0692 | 0.0062 | 0.01 | 0.08 | 0.0975 | 0.0 | 0.21 | 0.075 | 0.0862 | 0.4 | 0.0 | 0.0 | 0.0015 | 0.0875 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 4.0 | 8 | 3.0734 | 0.0305 | 0.0958 | 0.0172 | 0.0 | 0.1039 | 0.0224 | 0.05 | 0.0925 | 0.1025 | 0.0 | 0.21 | 0.0833 | 0.149 | 0.425 | 0.0 | 0.0 | 0.0035 | 0.0875 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 5.0 | 10 | 3.1212 | 0.0098 | 0.0457 | 0.007 | 0.0 | 0.0514 | 0.0082 | 0.01 | 0.05 | 0.1 | 0.0 | 0.18 | 0.0917 | 0.0476 | 0.425 | 0.0 | 0.0 | 0.0015 | 0.075 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 6.0 | 12 | 2.9352 | 0.0084 | 0.03 | 0.0086 | 0.0 | 0.0386 | 0.0073 | 0.0 | 0.08 | 0.115 | 0.0 | 0.18 | 0.1167 | 0.0405 | 0.5 | 0.0 | 0.0 | 0.0014 | 0.075 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 7.0 | 14 | 2.8460 | 0.0099 | 0.039 | 0.002 | 0.0 | 0.0451 | 0.0084 | 0.0 | 0.0875 | 0.1275 | 0.0 | 0.18 | 0.125 | 0.044 | 0.5 | 0.0 | 0.0 | 0.0054 | 0.1375 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 8.0 | 16 | 2.8413 | 0.0071 | 0.0285 | 0.0028 | 0.0 | 0.0164 | 0.0149 | 0.005 | 0.095 | 0.13 | 0.0 | 0.15 | 0.1583 | 0.0273 | 0.5 | 0.0 | 0.0 | 0.0082 | 0.15 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 9.0 | 18 | 2.6854 | 0.0123 | 0.0493 | 0.0054 | 0.0 | 0.0219 | 0.0249 | 0.01 | 0.095 | 0.135 | 0.0 | 0.15 | 0.1833 | 0.0517 | 0.5 | 0.0 | 0.0 | 0.01 | 0.175 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 10.0 | 20 | 2.7651 | 0.0344 | 0.0962 | 0.0379 | 0.0 | 0.0791 | 0.0287 | 0.03 | 0.1025 | 0.155 | 0.0 | 0.2 | 0.1833 | 0.0629 | 0.5 | 0.0 | 0.0 | 0.0082 | 0.175 | 0.0 | 0.0 | 0.101 | 0.1 |
| No log | 11.0 | 22 | 2.7936 | 0.0226 | 0.0829 | 0.018 | 0.0 | 0.0438 | 0.0431 | 0.035 | 0.09 | 0.1 | 0.0 | 0.155 | 0.1333 | 0.1078 | 0.375 | 0.0 | 0.0 | 0.005 | 0.125 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 12.0 | 24 | 2.7867 | 0.0216 | 0.0842 | 0.0055 | 0.0 | 0.0647 | 0.0309 | 0.015 | 0.09 | 0.12 | 0.0 | 0.16 | 0.1542 | 0.1029 | 0.475 | 0.0 | 0.0 | 0.005 | 0.125 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 13.0 | 26 | 2.9817 | 0.0112 | 0.0372 | 0.0055 | 0.0 | 0.0442 | 0.0144 | 0.0 | 0.0975 | 0.1175 | 0.0 | 0.16 | 0.1417 | 0.0509 | 0.475 | 0.0 | 0.0 | 0.0053 | 0.1125 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 14.0 | 28 | 2.9225 | 0.0278 | 0.1074 | 0.0148 | 0.0 | 0.0775 | 0.0425 | 0.005 | 0.1067 | 0.1617 | 0.0 | 0.1817 | 0.2 | 0.1119 | 0.65 | 0.0 | 0.0 | 0.01 | 0.125 | 0.0 | 0.0 | 0.0168 | 0.0333 |
| No log | 15.0 | 30 | 2.8893 | 0.0347 | 0.1374 | 0.0193 | 0.0 | 0.0831 | 0.0566 | 0.0558 | 0.1092 | 0.1442 | 0.0 | 0.1817 | 0.1792 | 0.1356 | 0.55 | 0.0 | 0.0 | 0.0126 | 0.1375 | 0.0 | 0.0 | 0.0252 | 0.0333 |
| No log | 16.0 | 32 | 2.8845 | 0.0328 | 0.1313 | 0.0151 | 0.0 | 0.088 | 0.0427 | 0.0617 | 0.1092 | 0.1442 | 0.0 | 0.1817 | 0.1792 | 0.1163 | 0.55 | 0.0 | 0.0 | 0.0139 | 0.1375 | 0.0 | 0.0 | 0.0337 | 0.0333 |
| No log | 17.0 | 34 | 2.8390 | 0.0383 | 0.1469 | 0.0128 | 0.0 | 0.1033 | 0.0365 | 0.02 | 0.1125 | 0.1475 | 0.0 | 0.19 | 0.1792 | 0.1056 | 0.55 | 0.0 | 0.0 | 0.0356 | 0.1375 | 0.0 | 0.0 | 0.0505 | 0.05 |
| No log | 18.0 | 36 | 2.8089 | 0.0364 | 0.1392 | 0.0129 | 0.0 | 0.0877 | 0.0362 | 0.0175 | 0.1125 | 0.1525 | 0.0 | 0.19 | 0.1875 | 0.1296 | 0.575 | 0.0 | 0.0 | 0.027 | 0.1375 | 0.0 | 0.0 | 0.0252 | 0.05 |
| No log | 19.0 | 38 | 2.8321 | 0.0329 | 0.1149 | 0.0352 | 0.0 | 0.0884 | 0.0327 | 0.0117 | 0.1142 | 0.1542 | 0.0 | 0.2017 | 0.2 | 0.1363 | 0.6 | 0.0 | 0.0 | 0.0199 | 0.1375 | 0.0 | 0.0 | 0.0084 | 0.0333 |
| No log | 20.0 | 40 | 2.8311 | 0.0498 | 0.1556 | 0.0568 | 0.0 | 0.1641 | 0.0433 | 0.0417 | 0.115 | 0.155 | 0.0 | 0.205 | 0.2 | 0.2192 | 0.6 | 0.0 | 0.0 | 0.0181 | 0.125 | 0.0 | 0.0 | 0.0118 | 0.05 |
| No log | 21.0 | 42 | 2.7355 | 0.0488 | 0.1556 | 0.0538 | 0.0 | 0.1617 | 0.0444 | 0.0417 | 0.1117 | 0.1517 | 0.0 | 0.1967 | 0.2 | 0.2192 | 0.6 | 0.0 | 0.0 | 0.0166 | 0.125 | 0.0 | 0.0 | 0.0084 | 0.0333 |
| No log | 22.0 | 44 | 2.6950 | 0.0484 | 0.1527 | 0.0538 | 0.0 | 0.1605 | 0.0451 | 0.0383 | 0.1167 | 0.1567 | 0.0 | 0.2017 | 0.2125 | 0.2192 | 0.6 | 0.0 | 0.0 | 0.0181 | 0.15 | 0.0 | 0.0 | 0.0049 | 0.0333 |
| No log | 23.0 | 46 | 2.6900 | 0.0484 | 0.1527 | 0.0538 | 0.0 | 0.1605 | 0.045 | 0.0383 | 0.1167 | 0.1567 | 0.0 | 0.2017 | 0.2125 | 0.2191 | 0.6 | 0.0 | 0.0 | 0.0182 | 0.15 | 0.0 | 0.0 | 0.0049 | 0.0333 |
| No log | 24.0 | 48 | 2.6738 | 0.0481 | 0.1539 | 0.0538 | 0.0 | 0.1598 | 0.0451 | 0.0383 | 0.1133 | 0.1533 | 0.0 | 0.1933 | 0.2125 | 0.2192 | 0.6 | 0.0 | 0.0 | 0.0182 | 0.15 | 0.0 | 0.0 | 0.0034 | 0.0167 |
| No log | 25.0 | 50 | 2.6657 | 0.061 | 0.1663 | 0.0707 | 0.0 | 0.1603 | 0.0632 | 0.0883 | 0.1208 | 0.1608 | 0.0 | 0.1983 | 0.2208 | 0.2826 | 0.625 | 0.0 | 0.0 | 0.0192 | 0.1625 | 0.0 | 0.0 | 0.0034 | 0.0167 |
| No log | 26.0 | 52 | 2.6576 | 0.0627 | 0.1831 | 0.0707 | 0.0 | 0.1632 | 0.0627 | 0.0875 | 0.1208 | 0.1608 | 0.0 | 0.1983 | 0.2208 | 0.2826 | 0.625 | 0.0 | 0.0 | 0.0288 | 0.1625 | 0.0 | 0.0 | 0.0021 | 0.0167 |
| No log | 27.0 | 54 | 2.6461 | 0.0631 | 0.1856 | 0.0707 | 0.0 | 0.1642 | 0.0628 | 0.0908 | 0.1208 | 0.1608 | 0.0 | 0.1983 | 0.2208 | 0.2827 | 0.625 | 0.0 | 0.0 | 0.0295 | 0.1625 | 0.0 | 0.0 | 0.0034 | 0.0167 |
| No log | 28.0 | 56 | 2.6412 | 0.0618 | 0.1727 | 0.0707 | 0.0 | 0.1642 | 0.0627 | 0.0908 | 0.1208 | 0.1608 | 0.0 | 0.1983 | 0.2208 | 0.2826 | 0.625 | 0.0 | 0.0 | 0.023 | 0.1625 | 0.0 | 0.0 | 0.0034 | 0.0167 |
| No log | 29.0 | 58 | 2.6383 | 0.0618 | 0.1727 | 0.0707 | 0.0 | 0.1642 | 0.0627 | 0.0908 | 0.1208 | 0.1608 | 0.0 | 0.1983 | 0.2208 | 0.2826 | 0.625 | 0.0 | 0.0 | 0.023 | 0.1625 | 0.0 | 0.0 | 0.0034 | 0.0167 |
| No log | 30.0 | 60 | 2.6375 | 0.0618 | 0.1727 | 0.0707 | 0.0 | 0.1642 | 0.0627 | 0.0908 | 0.1208 | 0.1608 | 0.0 | 0.1983 | 0.2208 | 0.2826 | 0.625 | 0.0 | 0.0 | 0.023 | 0.1625 | 0.0 | 0.0 | 0.0034 | 0.0167 |
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
mstrautk/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
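With the linear scheduler listed above, the learning rate decays linearly from 1e-05 to zero across training. As a sketch (again assuming no warmup, which these hyperparameters do not mention):

```python
def linear_lr(step, total_steps, base_lr=1e-5):
    """Linearly decay the learning rate from base_lr to 0 over total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)
```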
### Training results
### Framework versions
- Transformers 4.49.0
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
kine1004/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3956
- Map: 0.0065
- Map 50: 0.0182
- Map 75: 0.0036
- Map Small: 0.0006
- Map Medium: 0.014
- Map Large: 0.007
- Mar 1: 0.0123
- Mar 10: 0.0366
- Mar 100: 0.067
- Mar Small: 0.0072
- Mar Medium: 0.0388
- Mar Large: 0.0714
- Map Coverall: 0.0323
- Mar 100 Coverall: 0.3018
- Map Face Shield: 0.0
- Mar 100 Face Shield: 0.0
- Map Gloves: 0.0001
- Mar 100 Gloves: 0.0165
- Map Goggles: 0.0
- Mar 100 Goggles: 0.0062
- Map Mask: 0.0
- Mar 100 Mask: 0.0107
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 2.6211 | 0.0025 | 0.0088 | 0.0008 | 0.0 | 0.008 | 0.0025 | 0.0098 | 0.0172 | 0.0569 | 0.0 | 0.0309 | 0.0621 | 0.0124 | 0.2802 | 0.0 | 0.0025 | 0.0 | 0.0004 | 0.0 | 0.0015 | 0.0 | 0.0 |
| No log | 2.0 | 214 | 2.4397 | 0.0058 | 0.0156 | 0.0032 | 0.0004 | 0.014 | 0.0062 | 0.0081 | 0.0352 | 0.0525 | 0.0098 | 0.0338 | 0.0524 | 0.0287 | 0.2297 | 0.0 | 0.0 | 0.0001 | 0.029 | 0.0 | 0.0 | 0.0 | 0.0036 |
| No log | 3.0 | 321 | 2.3956 | 0.0065 | 0.0182 | 0.0036 | 0.0006 | 0.014 | 0.007 | 0.0123 | 0.0366 | 0.067 | 0.0072 | 0.0388 | 0.0714 | 0.0323 | 0.3018 | 0.0 | 0.0 | 0.0001 | 0.0165 | 0.0 | 0.0062 | 0.0 | 0.0107 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
TowardsUtopia/detr-finetuned-historic-v1 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8"
] |
Danilegovich/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
toukapy/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [toukapy/detr_finetuned_cppe5](https://huggingface.co/toukapy/detr_finetuned_cppe5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1348
- Map: 0.2218
- Map 50: 0.3828
- Map 75: 0.2171
- Map Small: 0.25
- Map Medium: 0.1585
- Map Large: 0.2409
- Mar 1: 0.2038
- Mar 10: 0.5442
- Mar 100: 0.6037
- Mar Small: 0.4333
- Mar Medium: 0.4272
- Mar Large: 0.6401
- Map Coverall: 0.4068
- Mar 100 Coverall: 0.6207
- Map Face Shield: 0.1384
- Mar 100 Face Shield: 0.5684
- Map Gloves: 0.2036
- Mar 100 Gloves: 0.6545
- Map Goggles: 0.1505
- Mar 100 Goggles: 0.5462
- Map Mask: 0.2099
- Mar 100 Mask: 0.6289
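The `Map 50` and `Map 75` figures above are mAP at IoU thresholds of 0.50 and 0.75: a prediction counts as a true positive only if its box overlaps the ground-truth box by at least that fraction. A sketch of the IoU computation for `(x1, y1, x2, y2)` boxes:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

The drop from `Map 50` (0.3828) to `Map 75` (0.2171) indicates that many detections overlap the ground truth loosely but not tightly.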
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9,0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 60
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 1.4728 | 0.0576 | 0.1377 | 0.0406 | 0.0513 | 0.105 | 0.0581 | 0.081 | 0.3151 | 0.3917 | 0.3 | 0.2766 | 0.4112 | 0.1353 | 0.4766 | 0.013 | 0.2949 | 0.061 | 0.4509 | 0.012 | 0.2554 | 0.0667 | 0.4809 |
| No log | 2.0 | 214 | 1.3775 | 0.0856 | 0.1821 | 0.0689 | 0.1145 | 0.1106 | 0.0881 | 0.1261 | 0.3593 | 0.4296 | 0.4 | 0.316 | 0.4493 | 0.2013 | 0.5986 | 0.0258 | 0.3203 | 0.0968 | 0.4263 | 0.0196 | 0.2738 | 0.0846 | 0.5289 |
| No log | 3.0 | 321 | 1.4530 | 0.072 | 0.1514 | 0.0567 | 0.0959 | 0.1114 | 0.0754 | 0.0935 | 0.3448 | 0.4296 | 0.3444 | 0.3285 | 0.4492 | 0.1892 | 0.5342 | 0.0189 | 0.3392 | 0.0526 | 0.3362 | 0.0278 | 0.3831 | 0.0717 | 0.5551 |
| No log | 4.0 | 428 | 1.4093 | 0.078 | 0.1703 | 0.064 | 0.0631 | 0.0937 | 0.0802 | 0.0979 | 0.3576 | 0.4514 | 0.3111 | 0.3132 | 0.4766 | 0.2188 | 0.5896 | 0.0143 | 0.3266 | 0.0654 | 0.4656 | 0.0163 | 0.3754 | 0.075 | 0.4996 |
| 1.2913 | 5.0 | 535 | 1.3523 | 0.0935 | 0.1946 | 0.0813 | 0.1433 | 0.0977 | 0.0983 | 0.1373 | 0.3945 | 0.4669 | 0.4667 | 0.3354 | 0.4883 | 0.2406 | 0.5775 | 0.0264 | 0.3658 | 0.0824 | 0.5312 | 0.0186 | 0.3446 | 0.0992 | 0.5151 |
| 1.2913 | 6.0 | 642 | 1.3167 | 0.1168 | 0.2413 | 0.0967 | 0.2318 | 0.1052 | 0.1257 | 0.144 | 0.406 | 0.4793 | 0.5111 | 0.3539 | 0.4988 | 0.2589 | 0.6032 | 0.0318 | 0.3101 | 0.1211 | 0.558 | 0.0347 | 0.3954 | 0.1372 | 0.5298 |
| 1.2913 | 7.0 | 749 | 1.3705 | 0.0867 | 0.179 | 0.0789 | 0.1412 | 0.0866 | 0.0908 | 0.1079 | 0.3493 | 0.4314 | 0.4333 | 0.3251 | 0.4481 | 0.2583 | 0.5991 | 0.0217 | 0.3266 | 0.0579 | 0.4496 | 0.0192 | 0.3015 | 0.0766 | 0.4804 |
| 1.2913 | 8.0 | 856 | 1.3039 | 0.1226 | 0.2434 | 0.1086 | 0.2239 | 0.1174 | 0.1324 | 0.1441 | 0.4055 | 0.4866 | 0.4333 | 0.3496 | 0.5108 | 0.271 | 0.5793 | 0.049 | 0.3367 | 0.1065 | 0.5491 | 0.0383 | 0.3892 | 0.1481 | 0.5787 |
| 1.2913 | 9.0 | 963 | 1.3037 | 0.1285 | 0.2571 | 0.1136 | 0.2093 | 0.1105 | 0.1389 | 0.1331 | 0.4177 | 0.4944 | 0.4 | 0.3407 | 0.5242 | 0.2914 | 0.5901 | 0.033 | 0.3228 | 0.119 | 0.5652 | 0.0486 | 0.4154 | 0.1505 | 0.5787 |
| 1.2347 | 10.0 | 1070 | 1.3257 | 0.1182 | 0.2333 | 0.1088 | 0.1854 | 0.1144 | 0.1254 | 0.1297 | 0.4006 | 0.4731 | 0.4 | 0.3409 | 0.4955 | 0.2874 | 0.5865 | 0.0498 | 0.3671 | 0.1011 | 0.5696 | 0.0258 | 0.3446 | 0.1269 | 0.4978 |
| 1.2347 | 11.0 | 1177 | 1.3714 | 0.1009 | 0.2112 | 0.0859 | 0.2207 | 0.1094 | 0.105 | 0.1146 | 0.3726 | 0.4489 | 0.4222 | 0.3339 | 0.4708 | 0.2602 | 0.5306 | 0.0248 | 0.3342 | 0.0933 | 0.4848 | 0.0196 | 0.34 | 0.1065 | 0.5551 |
| 1.2347 | 12.0 | 1284 | 1.2914 | 0.114 | 0.2345 | 0.0975 | 0.2238 | 0.1157 | 0.1198 | 0.1135 | 0.4098 | 0.4941 | 0.5333 | 0.3461 | 0.5214 | 0.2967 | 0.582 | 0.0365 | 0.3734 | 0.0878 | 0.5496 | 0.033 | 0.3769 | 0.116 | 0.5884 |
| 1.2347 | 13.0 | 1391 | 1.2971 | 0.1276 | 0.2576 | 0.115 | 0.0742 | 0.1371 | 0.1376 | 0.1431 | 0.4174 | 0.4908 | 0.2778 | 0.3695 | 0.5134 | 0.3022 | 0.5901 | 0.0589 | 0.3595 | 0.1094 | 0.5714 | 0.0399 | 0.3862 | 0.1277 | 0.5467 |
| 1.2347 | 14.0 | 1498 | 1.2865 | 0.1261 | 0.2526 | 0.1238 | 0.2737 | 0.1314 | 0.1335 | 0.1451 | 0.4131 | 0.4924 | 0.4556 | 0.3545 | 0.5167 | 0.2817 | 0.6032 | 0.0381 | 0.3633 | 0.1448 | 0.542 | 0.0458 | 0.3754 | 0.1201 | 0.5782 |
| 1.1962 | 15.0 | 1605 | 1.2636 | 0.1133 | 0.2255 | 0.1028 | 0.177 | 0.1201 | 0.1213 | 0.1229 | 0.3957 | 0.4991 | 0.4444 | 0.3547 | 0.5259 | 0.3056 | 0.5766 | 0.0305 | 0.357 | 0.0826 | 0.5562 | 0.0295 | 0.3969 | 0.1185 | 0.6089 |
| 1.1962 | 16.0 | 1712 | 1.2621 | 0.1225 | 0.2436 | 0.1101 | 0.1751 | 0.1179 | 0.1282 | 0.1194 | 0.424 | 0.5104 | 0.3667 | 0.3851 | 0.5341 | 0.2991 | 0.582 | 0.0368 | 0.3937 | 0.1276 | 0.5839 | 0.0274 | 0.3938 | 0.1215 | 0.5987 |
| 1.1962 | 17.0 | 1819 | 1.2575 | 0.1134 | 0.2228 | 0.103 | 0.2661 | 0.1192 | 0.1207 | 0.1168 | 0.4327 | 0.5074 | 0.4333 | 0.3504 | 0.5362 | 0.2773 | 0.5887 | 0.0337 | 0.4177 | 0.1106 | 0.5634 | 0.0263 | 0.3769 | 0.1191 | 0.5902 |
| 1.1962 | 18.0 | 1926 | 1.2579 | 0.1396 | 0.2539 | 0.136 | 0.2361 | 0.1246 | 0.1469 | 0.1634 | 0.4522 | 0.5309 | 0.4333 | 0.3661 | 0.5624 | 0.3251 | 0.5941 | 0.06 | 0.481 | 0.1185 | 0.5714 | 0.038 | 0.4385 | 0.1561 | 0.5693 |
| 1.1126 | 19.0 | 2033 | 1.2450 | 0.1353 | 0.2652 | 0.1277 | 0.2612 | 0.1217 | 0.1454 | 0.1354 | 0.4442 | 0.5224 | 0.4444 | 0.3859 | 0.5471 | 0.3129 | 0.5734 | 0.0564 | 0.4646 | 0.1203 | 0.6004 | 0.0523 | 0.4277 | 0.1348 | 0.5458 |
| 1.1126 | 20.0 | 2140 | 1.2278 | 0.1362 | 0.2649 | 0.1322 | 0.2935 | 0.1242 | 0.1435 | 0.1564 | 0.452 | 0.5258 | 0.3667 | 0.3525 | 0.5566 | 0.3109 | 0.605 | 0.05 | 0.457 | 0.1409 | 0.5844 | 0.038 | 0.4123 | 0.1409 | 0.5702 |
| 1.1126 | 21.0 | 2247 | 1.2046 | 0.1359 | 0.2631 | 0.1215 | 0.3752 | 0.1267 | 0.1432 | 0.1434 | 0.4576 | 0.5391 | 0.5222 | 0.4336 | 0.5596 | 0.3291 | 0.5955 | 0.0458 | 0.457 | 0.1223 | 0.5906 | 0.0447 | 0.4631 | 0.1377 | 0.5893 |
| 1.1126 | 22.0 | 2354 | 1.2224 | 0.1524 | 0.2793 | 0.1477 | 0.2993 | 0.133 | 0.1616 | 0.1557 | 0.4721 | 0.5387 | 0.5 | 0.365 | 0.5694 | 0.3378 | 0.6005 | 0.055 | 0.4873 | 0.1613 | 0.5777 | 0.0696 | 0.4585 | 0.1385 | 0.5693 |
| 1.1126 | 23.0 | 2461 | 1.1996 | 0.1568 | 0.2861 | 0.1499 | 0.2102 | 0.1397 | 0.1709 | 0.1636 | 0.4779 | 0.5574 | 0.3333 | 0.403 | 0.5883 | 0.3615 | 0.6108 | 0.0621 | 0.4646 | 0.1275 | 0.6076 | 0.0643 | 0.4892 | 0.1683 | 0.6147 |
| 1.0432 | 24.0 | 2568 | 1.2302 | 0.1552 | 0.2941 | 0.1441 | 0.2166 | 0.12 | 0.1671 | 0.1635 | 0.4638 | 0.5405 | 0.3444 | 0.3936 | 0.5716 | 0.3594 | 0.6239 | 0.0549 | 0.4506 | 0.1452 | 0.5714 | 0.0693 | 0.4892 | 0.1474 | 0.5676 |
| 1.0432 | 25.0 | 2675 | 1.1967 | 0.1747 | 0.3135 | 0.177 | 0.2256 | 0.1508 | 0.1863 | 0.1772 | 0.4987 | 0.562 | 0.3667 | 0.384 | 0.595 | 0.3741 | 0.6167 | 0.0693 | 0.5481 | 0.1844 | 0.6161 | 0.076 | 0.4415 | 0.1698 | 0.5876 |
| 1.0432 | 26.0 | 2782 | 1.1815 | 0.1576 | 0.2967 | 0.1497 | 0.3017 | 0.1551 | 0.1633 | 0.1453 | 0.4962 | 0.5588 | 0.4222 | 0.424 | 0.5855 | 0.3463 | 0.6023 | 0.0465 | 0.457 | 0.1508 | 0.633 | 0.0698 | 0.4969 | 0.1744 | 0.6049 |
| 1.0432 | 27.0 | 2889 | 1.1788 | 0.1561 | 0.2869 | 0.1427 | 0.3365 | 0.1197 | 0.1658 | 0.1492 | 0.4796 | 0.571 | 0.4667 | 0.4035 | 0.6024 | 0.3687 | 0.6257 | 0.0506 | 0.5177 | 0.1464 | 0.6379 | 0.0617 | 0.4615 | 0.153 | 0.612 |
| 1.0432 | 28.0 | 2996 | 1.1831 | 0.1634 | 0.298 | 0.1602 | 0.41 | 0.1289 | 0.1742 | 0.16 | 0.4992 | 0.5716 | 0.4889 | 0.388 | 0.6072 | 0.3625 | 0.6198 | 0.0638 | 0.481 | 0.1614 | 0.6379 | 0.0719 | 0.5292 | 0.1574 | 0.5902 |
| 0.9815 | 29.0 | 3103 | 1.1504 | 0.1819 | 0.3319 | 0.1842 | 0.385 | 0.1385 | 0.1991 | 0.1933 | 0.5125 | 0.5859 | 0.5444 | 0.4026 | 0.6205 | 0.3634 | 0.5968 | 0.1012 | 0.5392 | 0.1867 | 0.6326 | 0.069 | 0.5308 | 0.1889 | 0.6302 |
| 0.9815 | 30.0 | 3210 | 1.1551 | 0.1685 | 0.305 | 0.1678 | 0.2734 | 0.1439 | 0.1829 | 0.1764 | 0.5099 | 0.5807 | 0.4667 | 0.4089 | 0.6162 | 0.3566 | 0.6315 | 0.0858 | 0.5418 | 0.1558 | 0.6085 | 0.0651 | 0.5077 | 0.179 | 0.6138 |
| 0.9815 | 31.0 | 3317 | 1.1503 | 0.1716 | 0.3165 | 0.1654 | 0.2964 | 0.1458 | 0.1828 | 0.1714 | 0.5028 | 0.5738 | 0.5 | 0.4066 | 0.6053 | 0.3815 | 0.6176 | 0.0809 | 0.5291 | 0.1791 | 0.6509 | 0.0589 | 0.48 | 0.1575 | 0.5916 |
| 0.9815 | 32.0 | 3424 | 1.1930 | 0.1617 | 0.3028 | 0.1523 | 0.2053 | 0.15 | 0.1725 | 0.1466 | 0.5019 | 0.5698 | 0.4222 | 0.4119 | 0.601 | 0.3764 | 0.609 | 0.0664 | 0.5228 | 0.1473 | 0.6263 | 0.0669 | 0.52 | 0.1514 | 0.5711 |
| 0.9111 | 33.0 | 3531 | 1.1475 | 0.1843 | 0.3324 | 0.1749 | 0.2151 | 0.1673 | 0.1954 | 0.1782 | 0.518 | 0.5827 | 0.3667 | 0.4129 | 0.6159 | 0.3722 | 0.6095 | 0.0625 | 0.5139 | 0.2035 | 0.6429 | 0.0792 | 0.5215 | 0.2041 | 0.6258 |
| 0.9111 | 34.0 | 3638 | 1.1393 | 0.1896 | 0.3416 | 0.1848 | 0.2793 | 0.1843 | 0.1996 | 0.1779 | 0.5045 | 0.5834 | 0.4556 | 0.4326 | 0.6121 | 0.3853 | 0.6374 | 0.0859 | 0.557 | 0.188 | 0.6384 | 0.0964 | 0.4631 | 0.1922 | 0.6213 |
| 0.9111 | 35.0 | 3745 | 1.1390 | 0.1821 | 0.3296 | 0.1772 | 0.1976 | 0.1588 | 0.1951 | 0.1766 | 0.5242 | 0.5913 | 0.3778 | 0.4181 | 0.6268 | 0.3804 | 0.6239 | 0.0891 | 0.5519 | 0.1824 | 0.6348 | 0.08 | 0.5215 | 0.1787 | 0.6244 |
| 0.9111 | 36.0 | 3852 | 1.1436 | 0.1796 | 0.3259 | 0.1734 | 0.2158 | 0.1483 | 0.1927 | 0.1773 | 0.5039 | 0.5781 | 0.3889 | 0.4132 | 0.612 | 0.3821 | 0.6144 | 0.0908 | 0.5152 | 0.1612 | 0.6482 | 0.0841 | 0.4985 | 0.1799 | 0.6142 |
| 0.9111 | 37.0 | 3959 | 1.1582 | 0.1926 | 0.3435 | 0.1946 | 0.2217 | 0.1719 | 0.2054 | 0.1912 | 0.5207 | 0.5882 | 0.4 | 0.4268 | 0.6215 | 0.4011 | 0.6243 | 0.1066 | 0.5494 | 0.1681 | 0.6344 | 0.0878 | 0.5308 | 0.1993 | 0.6022 |
| 0.8415 | 38.0 | 4066 | 1.1340 | 0.1934 | 0.3428 | 0.188 | 0.2015 | 0.1754 | 0.2055 | 0.1922 | 0.5233 | 0.5914 | 0.3556 | 0.4282 | 0.6251 | 0.3868 | 0.6063 | 0.1029 | 0.5405 | 0.1725 | 0.6518 | 0.1214 | 0.5338 | 0.1834 | 0.6244 |
| 0.8415 | 39.0 | 4173 | 1.1374 | 0.1939 | 0.3408 | 0.1842 | 0.2191 | 0.1596 | 0.2085 | 0.1849 | 0.5219 | 0.5947 | 0.4111 | 0.4206 | 0.6303 | 0.3979 | 0.6261 | 0.0979 | 0.5342 | 0.1807 | 0.6571 | 0.113 | 0.5415 | 0.18 | 0.6147 |
| 0.8415 | 40.0 | 4280 | 1.1291 | 0.1993 | 0.3499 | 0.1909 | 0.2219 | 0.1538 | 0.217 | 0.1944 | 0.535 | 0.5949 | 0.4222 | 0.426 | 0.6303 | 0.3866 | 0.6194 | 0.1319 | 0.5367 | 0.1814 | 0.6536 | 0.0976 | 0.5369 | 0.1989 | 0.628 |
| 0.8415 | 41.0 | 4387 | 1.1614 | 0.2023 | 0.3533 | 0.2052 | 0.1967 | 0.176 | 0.2177 | 0.1885 | 0.5219 | 0.5903 | 0.3444 | 0.4128 | 0.6275 | 0.4135 | 0.6347 | 0.1297 | 0.543 | 0.1814 | 0.6469 | 0.0997 | 0.5369 | 0.1872 | 0.5902 |
| 0.8415 | 42.0 | 4494 | 1.1548 | 0.189 | 0.3374 | 0.1904 | 0.2249 | 0.1539 | 0.2037 | 0.183 | 0.5192 | 0.5862 | 0.4 | 0.4137 | 0.6217 | 0.3969 | 0.6171 | 0.0915 | 0.5038 | 0.1744 | 0.6353 | 0.0989 | 0.5569 | 0.1835 | 0.6178 |
| 0.7841 | 43.0 | 4601 | 1.1335 | 0.2184 | 0.3761 | 0.2232 | 0.2424 | 0.1704 | 0.2331 | 0.2047 | 0.5403 | 0.5983 | 0.4444 | 0.418 | 0.6347 | 0.4118 | 0.6216 | 0.1268 | 0.543 | 0.21 | 0.6536 | 0.143 | 0.5615 | 0.2002 | 0.6116 |
| 0.7841 | 44.0 | 4708 | 1.1404 | 0.2084 | 0.366 | 0.2102 | 0.3102 | 0.1644 | 0.2242 | 0.2008 | 0.5308 | 0.5908 | 0.4556 | 0.4021 | 0.6289 | 0.4013 | 0.6131 | 0.123 | 0.5354 | 0.1965 | 0.6478 | 0.1267 | 0.54 | 0.1944 | 0.6178 |
| 0.7841 | 45.0 | 4815 | 1.1449 | 0.2062 | 0.3577 | 0.2024 | 0.2299 | 0.1483 | 0.2249 | 0.1992 | 0.5339 | 0.5944 | 0.4222 | 0.405 | 0.6334 | 0.4039 | 0.6158 | 0.1361 | 0.5456 | 0.1846 | 0.6469 | 0.1132 | 0.5446 | 0.1933 | 0.6191 |
| 0.7841 | 46.0 | 4922 | 1.1400 | 0.2072 | 0.3586 | 0.2053 | 0.2425 | 0.1587 | 0.2246 | 0.1998 | 0.5361 | 0.5945 | 0.4222 | 0.4179 | 0.6313 | 0.4006 | 0.6194 | 0.1301 | 0.5481 | 0.1836 | 0.6379 | 0.1237 | 0.5369 | 0.1982 | 0.6302 |
| 0.7353 | 47.0 | 5029 | 1.1411 | 0.2068 | 0.3624 | 0.2035 | 0.2337 | 0.1474 | 0.2251 | 0.2036 | 0.5326 | 0.595 | 0.3889 | 0.4249 | 0.63 | 0.4014 | 0.6194 | 0.1261 | 0.543 | 0.1922 | 0.6536 | 0.1153 | 0.5431 | 0.1988 | 0.616 |
| 0.7353 | 48.0 | 5136 | 1.1374 | 0.2084 | 0.3613 | 0.2021 | 0.2473 | 0.1526 | 0.227 | 0.2025 | 0.5344 | 0.5959 | 0.3667 | 0.4298 | 0.6303 | 0.4052 | 0.6203 | 0.1197 | 0.557 | 0.1924 | 0.6504 | 0.1233 | 0.5369 | 0.2011 | 0.6151 |
| 0.7353 | 49.0 | 5243 | 1.1376 | 0.2062 | 0.3565 | 0.1993 | 0.2511 | 0.151 | 0.2239 | 0.1943 | 0.5358 | 0.6022 | 0.4222 | 0.4257 | 0.6381 | 0.4026 | 0.6203 | 0.12 | 0.5671 | 0.1868 | 0.6478 | 0.1275 | 0.5523 | 0.1941 | 0.6236 |
| 0.7353 | 50.0 | 5350 | 1.1358 | 0.2144 | 0.372 | 0.2115 | 0.2518 | 0.1576 | 0.2323 | 0.2051 | 0.5387 | 0.5982 | 0.4 | 0.4209 | 0.6355 | 0.4063 | 0.6149 | 0.127 | 0.5557 | 0.1935 | 0.6487 | 0.1375 | 0.5446 | 0.2079 | 0.6271 |
| 0.7353 | 51.0 | 5457 | 1.1373 | 0.2139 | 0.372 | 0.2107 | 0.2413 | 0.161 | 0.2322 | 0.2041 | 0.5384 | 0.6004 | 0.4222 | 0.4198 | 0.6373 | 0.4054 | 0.6194 | 0.1292 | 0.5557 | 0.1934 | 0.6513 | 0.1333 | 0.5446 | 0.2079 | 0.6311 |
| 0.6951 | 52.0 | 5564 | 1.1437 | 0.2146 | 0.375 | 0.2091 | 0.2376 | 0.1652 | 0.2326 | 0.2007 | 0.5317 | 0.5968 | 0.3778 | 0.4268 | 0.6326 | 0.4053 | 0.6113 | 0.1289 | 0.5443 | 0.1985 | 0.6527 | 0.134 | 0.5492 | 0.2065 | 0.6267 |
| 0.6951 | 53.0 | 5671 | 1.1379 | 0.2181 | 0.3803 | 0.2151 | 0.2481 | 0.1592 | 0.2357 | 0.2077 | 0.5359 | 0.5989 | 0.4333 | 0.4255 | 0.6346 | 0.4043 | 0.6162 | 0.1341 | 0.5532 | 0.2027 | 0.6504 | 0.1389 | 0.5477 | 0.2103 | 0.6271 |
| 0.6951 | 54.0 | 5778 | 1.1353 | 0.2166 | 0.3763 | 0.2097 | 0.25 | 0.1647 | 0.2338 | 0.2037 | 0.5376 | 0.5988 | 0.4333 | 0.4263 | 0.6338 | 0.4067 | 0.6203 | 0.126 | 0.543 | 0.1993 | 0.6522 | 0.1404 | 0.5508 | 0.2107 | 0.6276 |
| 0.6951 | 55.0 | 5885 | 1.1323 | 0.2189 | 0.3786 | 0.2139 | 0.2498 | 0.1584 | 0.2367 | 0.2038 | 0.5379 | 0.6029 | 0.4333 | 0.4258 | 0.6388 | 0.4089 | 0.6212 | 0.1319 | 0.5557 | 0.2011 | 0.6522 | 0.1423 | 0.5538 | 0.2102 | 0.6316 |
| 0.6951 | 56.0 | 5992 | 1.1332 | 0.2201 | 0.3802 | 0.2153 | 0.2497 | 0.1609 | 0.2388 | 0.202 | 0.5406 | 0.6045 | 0.4333 | 0.4292 | 0.6409 | 0.4053 | 0.6216 | 0.1385 | 0.5658 | 0.2017 | 0.654 | 0.1444 | 0.5492 | 0.2107 | 0.632 |
| 0.6751 | 57.0 | 6099 | 1.1343 | 0.2219 | 0.3832 | 0.2166 | 0.25 | 0.1581 | 0.241 | 0.2043 | 0.5418 | 0.6027 | 0.4333 | 0.428 | 0.6387 | 0.4054 | 0.6189 | 0.1386 | 0.5646 | 0.204 | 0.6545 | 0.1511 | 0.5477 | 0.2103 | 0.628 |
| 0.6751 | 58.0 | 6206 | 1.1348 | 0.2217 | 0.383 | 0.217 | 0.25 | 0.1584 | 0.2405 | 0.2038 | 0.5425 | 0.6031 | 0.4333 | 0.4272 | 0.6394 | 0.4062 | 0.6194 | 0.138 | 0.5658 | 0.2034 | 0.6545 | 0.1507 | 0.5462 | 0.2102 | 0.6298 |
| 0.6751 | 59.0 | 6313 | 1.1348 | 0.2219 | 0.3828 | 0.2171 | 0.25 | 0.1586 | 0.2409 | 0.2038 | 0.5432 | 0.6039 | 0.4333 | 0.4272 | 0.6403 | 0.407 | 0.6207 | 0.1384 | 0.5696 | 0.2036 | 0.654 | 0.1504 | 0.5462 | 0.2101 | 0.6289 |
| 0.6751 | 60.0 | 6420 | 1.1348 | 0.2218 | 0.3828 | 0.2171 | 0.25 | 0.1585 | 0.2409 | 0.2038 | 0.5442 | 0.6037 | 0.4333 | 0.4272 | 0.6401 | 0.4068 | 0.6207 | 0.1384 | 0.5684 | 0.2036 | 0.6545 | 0.1505 | 0.5462 | 0.2099 | 0.6289 |
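The per-class AP/AR columns above are computed on post-processed detections. DETR-family models predict normalized `(cx, cy, w, h)` boxes that must be converted to absolute corner coordinates before COCO-style evaluation; the snippet below is a minimal NumPy sketch of that conversion (a simplified re-implementation for illustration, not the `transformers` post-processing API):

```python
import numpy as np

def cxcywh_to_xyxy(boxes, img_w, img_h):
    """Convert normalized (cx, cy, w, h) boxes to absolute (x0, y0, x1, y1)."""
    cx, cy, w, h = boxes.T
    x0 = (cx - w / 2) * img_w
    y0 = (cy - h / 2) * img_h
    x1 = (cx + w / 2) * img_w
    y1 = (cy + h / 2) * img_h
    return np.stack([x0, y0, x1, y1], axis=1)

# one predicted box centred in a 640x480 image, half the image's width/height
pred = np.array([[0.5, 0.5, 0.5, 0.5]])
print(cxcywh_to_xyxy(pred, 640, 480))  # [[160. 120. 480. 360.]]
```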
### Framework versions
- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
cgr/fasterrcnn_resnet50_fpn_custom |
A custom Faster R-CNN (ResNet-50 FPN) model for detecting panels in P3D. | [
"background",
"pfd",
"mfd",
"ufcp",
"hud",
"eicas"
] |
pneupane/table-transformer-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# table-transformer-finetuned
This model is a fine-tuned version of [microsoft/table-structure-recognition-v1.1-all](https://huggingface.co/microsoft/table-structure-recognition-v1.1-all) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.2890
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
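With `train_batch_size: 1` and `gradient_accumulation_steps: 8`, gradients from eight single-example micro-batches are averaged before each optimizer step, which is what `total_train_batch_size: 8` reflects. A NumPy sketch of why the averaged micro-batch gradients equal a single batch-of-8 gradient (an illustrative linear model, not the actual Trainer loop):

```python
import numpy as np

rng = np.random.default_rng(42)
X, y = rng.normal(size=(8, 3)), rng.normal(size=8)
w = np.zeros(3)

def grad(Xb, yb, w):
    """Mean-squared-error gradient for a linear model on one (micro-)batch."""
    return 2 * Xb.T @ (Xb @ w - yb) / len(yb)

# 8 micro-batches of size 1, gradients averaged -> same update as one batch of 8
accum = sum(grad(X[i:i + 1], y[i:i + 1], w) for i in range(8)) / 8
full = grad(X, y, w)
print(np.allclose(accum, full))  # True
```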
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-------:|:----:|:---------------:|
| 9.2523 | 1.0 | 14 | 9.3207 |
| 7.9949 | 2.0 | 28 | 9.0533 |
| 8.2292 | 3.0 | 42 | 8.6547 |
| 7.1495 | 4.0 | 56 | 8.0844 |
| 6.9055 | 5.0 | 70 | 7.3339 |
| 7.0771 | 6.0 | 84 | 7.2344 |
| 5.6554 | 7.0 | 98 | 6.6026 |
| 5.5633 | 8.0 | 112 | 6.3610 |
| 4.9506 | 9.0 | 126 | 5.9567 |
| 4.0856 | 10.0 | 140 | 5.9191 |
| 4.4286 | 11.0 | 154 | 5.6374 |
| 2.9043 | 12.0 | 168 | 4.9525 |
| 3.4755 | 13.0 | 182 | 4.6846 |
| 3.4152 | 14.0 | 196 | 4.3171 |
| 2.8456 | 15.0 | 210 | 3.5661 |
| 2.4149 | 16.0 | 224 | 3.6150 |
| 1.9328 | 17.0 | 238 | 3.2264 |
| 1.8503 | 18.0 | 252 | 2.8540 |
| 1.596 | 18.5981 | 260 | 2.2890 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"table",
"table column",
"table row",
"table column header",
"table projected row header",
"table spanning cell"
] |
pranavvmurthy26/detr-finetuned-cars-03052025 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19",
"label_20",
"label_21",
"label_22",
"label_23",
"label_24",
"label_25",
"label_26",
"label_27",
"label_28",
"label_29",
"label_30",
"label_31",
"label_32",
"label_33",
"label_34",
"label_35",
"label_36",
"label_37",
"label_38",
"label_39",
"label_40",
"label_41",
"label_42",
"label_43",
"label_44",
"label_45",
"label_46",
"label_47",
"label_48",
"label_49",
"label_50",
"label_51",
"label_52",
"label_53",
"label_54",
"label_55",
"label_56",
"label_57",
"label_58",
"label_59",
"label_60",
"label_61",
"label_62",
"label_63",
"label_64"
] |
toukapy/detr_finetuned_kitti_mots |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_kitti_mots
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the KITTI MOTS dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5468
- Map: 0.5754
- Map 50: 0.8688
- Map 75: 0.625
- Map Small: 0.3516
- Map Medium: 0.5857
- Map Large: 0.783
- Mar 1: 0.1695
- Mar 10: 0.6195
- Mar 100: 0.6862
- Mar Small: 0.5362
- Mar Medium: 0.6977
- Mar Large: 0.8372
- Map Car: 0.6938
- Mar 100 Car: 0.769
- Map Pedestrian: 0.4569
- Mar 100 Pedestrian: 0.6034
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 50
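The `cosine` scheduler decays the learning rate from `0.0001` toward zero over training along a half-cosine curve. A small sketch of that schedule (warmup, which this card does not configure, is ignored); the total step count of 37150 (50 epochs × 743 steps per epoch) is taken from the results table below:

```python
import math

def cosine_lr(step, total_steps, base_lr=1e-4):
    """Half-cosine decay from base_lr to 0, as in lr_scheduler_type: cosine (no warmup)."""
    return base_lr * 0.5 * (1 + math.cos(math.pi * step / total_steps))

total = 37150  # 50 epochs x 743 steps
print(cosine_lr(0, total))           # 1e-4 at the start
print(cosine_lr(total // 2, total))  # half the base rate at the midpoint
print(cosine_lr(total, total))       # ~0 at the end
```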
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Car | Mar 100 Car | Map Pedestrian | Mar 100 Pedestrian |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-------:|:-----------:|:--------------:|:------------------:|
| 1.5904 | 1.0 | 743 | 1.2276 | 0.1644 | 0.3572 | 0.1419 | 0.0253 | 0.1267 | 0.3984 | 0.08 | 0.2466 | 0.3977 | 0.1364 | 0.4022 | 0.7009 | 0.277 | 0.4472 | 0.0518 | 0.3482 |
| 1.3176 | 2.0 | 1486 | 1.1301 | 0.2261 | 0.5107 | 0.1714 | 0.0479 | 0.1929 | 0.5 | 0.1024 | 0.3066 | 0.4201 | 0.201 | 0.4196 | 0.6825 | 0.3095 | 0.4709 | 0.1426 | 0.3694 |
| 1.1187 | 3.0 | 2229 | 1.0565 | 0.2699 | 0.5705 | 0.2147 | 0.0416 | 0.2365 | 0.5833 | 0.1142 | 0.332 | 0.4328 | 0.1875 | 0.4434 | 0.7003 | 0.3674 | 0.4924 | 0.1725 | 0.3733 |
| 1.0835 | 4.0 | 2972 | 1.0326 | 0.2851 | 0.5903 | 0.2363 | 0.0572 | 0.2557 | 0.6048 | 0.116 | 0.3481 | 0.4474 | 0.2339 | 0.4447 | 0.7149 | 0.3781 | 0.5115 | 0.1921 | 0.3832 |
| 1.0681 | 5.0 | 3715 | 1.0095 | 0.3017 | 0.5998 | 0.2728 | 0.0538 | 0.2878 | 0.6051 | 0.1205 | 0.3625 | 0.4664 | 0.2237 | 0.4787 | 0.7298 | 0.4003 | 0.5288 | 0.2031 | 0.404 |
| 1.042 | 6.0 | 4458 | 1.0315 | 0.271 | 0.6062 | 0.1981 | 0.0628 | 0.2597 | 0.5656 | 0.1073 | 0.3465 | 0.4393 | 0.2258 | 0.4469 | 0.6762 | 0.3601 | 0.4929 | 0.182 | 0.3857 |
| 1.0051 | 7.0 | 5201 | 0.9796 | 0.2875 | 0.6189 | 0.2331 | 0.0695 | 0.2789 | 0.5719 | 0.1122 | 0.367 | 0.473 | 0.2403 | 0.4912 | 0.7076 | 0.3936 | 0.5423 | 0.1814 | 0.4036 |
| 0.9798 | 8.0 | 5944 | 0.9567 | 0.3178 | 0.6257 | 0.2915 | 0.0823 | 0.3041 | 0.6211 | 0.124 | 0.379 | 0.485 | 0.2552 | 0.5059 | 0.7105 | 0.4214 | 0.5633 | 0.2142 | 0.4068 |
| 0.9814 | 9.0 | 6687 | 1.0122 | 0.2799 | 0.579 | 0.239 | 0.0493 | 0.2585 | 0.5948 | 0.1124 | 0.3488 | 0.4607 | 0.2195 | 0.4725 | 0.7226 | 0.3877 | 0.5279 | 0.1721 | 0.3934 |
| 1.0057 | 10.0 | 7430 | 0.9464 | 0.3156 | 0.6365 | 0.2797 | 0.081 | 0.3083 | 0.6042 | 0.1201 | 0.3814 | 0.4929 | 0.275 | 0.5011 | 0.736 | 0.4326 | 0.5579 | 0.1986 | 0.428 |
| 0.9267 | 11.0 | 8173 | 0.9137 | 0.3364 | 0.6634 | 0.2955 | 0.118 | 0.326 | 0.6291 | 0.1276 | 0.3995 | 0.5073 | 0.3059 | 0.5147 | 0.7301 | 0.4369 | 0.5728 | 0.2359 | 0.4417 |
| 0.8938 | 12.0 | 8916 | 0.8808 | 0.3622 | 0.6811 | 0.3348 | 0.1147 | 0.3497 | 0.6639 | 0.1358 | 0.4187 | 0.5212 | 0.2956 | 0.536 | 0.7572 | 0.4784 | 0.5986 | 0.246 | 0.4438 |
| 0.8575 | 13.0 | 9659 | 0.8632 | 0.3614 | 0.6888 | 0.34 | 0.1005 | 0.3647 | 0.6565 | 0.1305 | 0.4254 | 0.5258 | 0.2989 | 0.5449 | 0.7523 | 0.4629 | 0.5893 | 0.2599 | 0.4624 |
| 0.8442 | 14.0 | 10402 | 0.8544 | 0.3746 | 0.6922 | 0.3628 | 0.1129 | 0.3759 | 0.6419 | 0.1356 | 0.4284 | 0.5352 | 0.3325 | 0.5545 | 0.7302 | 0.4878 | 0.6007 | 0.2614 | 0.4696 |
| 0.8213 | 15.0 | 11145 | 0.8441 | 0.3792 | 0.7034 | 0.3561 | 0.1321 | 0.3742 | 0.6534 | 0.1365 | 0.4317 | 0.5378 | 0.324 | 0.56 | 0.7363 | 0.5039 | 0.6128 | 0.2544 | 0.4628 |
| 0.8115 | 16.0 | 11888 | 0.8210 | 0.3888 | 0.709 | 0.3807 | 0.128 | 0.392 | 0.6675 | 0.1399 | 0.4413 | 0.5468 | 0.3479 | 0.5616 | 0.7491 | 0.5037 | 0.6212 | 0.2739 | 0.4723 |
| 0.8093 | 17.0 | 12631 | 0.8127 | 0.385 | 0.7092 | 0.3751 | 0.1383 | 0.3858 | 0.6626 | 0.1381 | 0.4399 | 0.5493 | 0.3673 | 0.556 | 0.7529 | 0.5141 | 0.6257 | 0.2559 | 0.4729 |
| 0.7893 | 18.0 | 13374 | 0.8204 | 0.3901 | 0.7194 | 0.3675 | 0.1295 | 0.3842 | 0.6851 | 0.1378 | 0.4404 | 0.5416 | 0.3486 | 0.5481 | 0.7602 | 0.5048 | 0.6092 | 0.2755 | 0.474 |
| 0.7459 | 19.0 | 14117 | 0.7880 | 0.4058 | 0.729 | 0.4085 | 0.1582 | 0.4043 | 0.6709 | 0.1407 | 0.46 | 0.5627 | 0.3686 | 0.5794 | 0.7528 | 0.53 | 0.6421 | 0.2816 | 0.4832 |
| 0.7383 | 20.0 | 14860 | 0.7477 | 0.4264 | 0.7551 | 0.4264 | 0.1563 | 0.4328 | 0.7092 | 0.1441 | 0.48 | 0.5888 | 0.3989 | 0.6004 | 0.788 | 0.538 | 0.6577 | 0.3149 | 0.5199 |
| 0.7207 | 21.0 | 15603 | 0.7688 | 0.4188 | 0.7599 | 0.4087 | 0.1591 | 0.419 | 0.7057 | 0.1419 | 0.4691 | 0.5641 | 0.3832 | 0.5686 | 0.772 | 0.5306 | 0.6361 | 0.307 | 0.492 |
| 0.7127 | 22.0 | 16346 | 0.7450 | 0.4368 | 0.7622 | 0.4379 | 0.1733 | 0.4368 | 0.7099 | 0.1483 | 0.4828 | 0.5834 | 0.3873 | 0.5959 | 0.7875 | 0.5601 | 0.6532 | 0.3136 | 0.5136 |
| 0.698 | 23.0 | 17089 | 0.7429 | 0.4401 | 0.7626 | 0.4515 | 0.1739 | 0.4456 | 0.7194 | 0.1498 | 0.4841 | 0.5901 | 0.3878 | 0.6079 | 0.7878 | 0.5536 | 0.6524 | 0.3266 | 0.5278 |
| 0.6836 | 24.0 | 17832 | 0.7642 | 0.422 | 0.7565 | 0.4217 | 0.154 | 0.428 | 0.7066 | 0.1433 | 0.47 | 0.5743 | 0.3941 | 0.5791 | 0.7801 | 0.5423 | 0.6404 | 0.3017 | 0.5081 |
| 0.6684 | 25.0 | 18575 | 0.7016 | 0.4599 | 0.7889 | 0.4717 | 0.1873 | 0.4706 | 0.7334 | 0.1497 | 0.5093 | 0.6116 | 0.4221 | 0.6271 | 0.8006 | 0.5767 | 0.6837 | 0.3431 | 0.5395 |
| 0.6471 | 26.0 | 19318 | 0.6890 | 0.4724 | 0.791 | 0.4869 | 0.2053 | 0.4825 | 0.7304 | 0.1552 | 0.516 | 0.6068 | 0.4245 | 0.6179 | 0.7994 | 0.6004 | 0.6895 | 0.3443 | 0.5241 |
| 0.6259 | 27.0 | 20061 | 0.6788 | 0.4726 | 0.7816 | 0.4914 | 0.2182 | 0.481 | 0.7202 | 0.1555 | 0.5212 | 0.6199 | 0.4374 | 0.6387 | 0.7919 | 0.6137 | 0.7058 | 0.3316 | 0.5339 |
| 0.6038 | 28.0 | 20804 | 0.6664 | 0.4844 | 0.8057 | 0.5206 | 0.221 | 0.4985 | 0.7352 | 0.1539 | 0.5336 | 0.6236 | 0.4423 | 0.6382 | 0.8047 | 0.6049 | 0.6987 | 0.3639 | 0.5486 |
| 0.5963 | 29.0 | 21547 | 0.6562 | 0.4952 | 0.8143 | 0.5265 | 0.2398 | 0.5087 | 0.7436 | 0.1549 | 0.54 | 0.6299 | 0.4495 | 0.6445 | 0.8094 | 0.6177 | 0.7083 | 0.3728 | 0.5514 |
| 0.5821 | 30.0 | 22290 | 0.6533 | 0.5019 | 0.8238 | 0.5289 | 0.2385 | 0.5151 | 0.7553 | 0.1579 | 0.5441 | 0.6328 | 0.4478 | 0.6471 | 0.8184 | 0.6045 | 0.6965 | 0.3992 | 0.5691 |
| 0.5642 | 31.0 | 23033 | 0.6434 | 0.506 | 0.8291 | 0.532 | 0.2451 | 0.5199 | 0.7421 | 0.1575 | 0.5505 | 0.6342 | 0.4598 | 0.6482 | 0.8086 | 0.6226 | 0.7103 | 0.3893 | 0.558 |
| 0.5547 | 32.0 | 23776 | 0.6382 | 0.5041 | 0.8261 | 0.5454 | 0.2484 | 0.5193 | 0.7529 | 0.1566 | 0.551 | 0.6357 | 0.4587 | 0.6488 | 0.8154 | 0.6221 | 0.7123 | 0.3861 | 0.5591 |
| 0.536 | 33.0 | 24519 | 0.6175 | 0.5188 | 0.8382 | 0.5556 | 0.2732 | 0.5318 | 0.753 | 0.1607 | 0.565 | 0.644 | 0.4731 | 0.659 | 0.8112 | 0.6352 | 0.7259 | 0.4024 | 0.5621 |
| 0.5231 | 34.0 | 25262 | 0.6037 | 0.531 | 0.8421 | 0.5703 | 0.2879 | 0.5412 | 0.765 | 0.1636 | 0.5774 | 0.6578 | 0.4848 | 0.6731 | 0.827 | 0.6485 | 0.7353 | 0.4134 | 0.5803 |
| 0.5042 | 35.0 | 26005 | 0.5947 | 0.5373 | 0.846 | 0.5869 | 0.3039 | 0.5495 | 0.7621 | 0.1622 | 0.583 | 0.6614 | 0.507 | 0.6715 | 0.8218 | 0.6552 | 0.7398 | 0.4194 | 0.5829 |
| 0.4956 | 36.0 | 26748 | 0.5955 | 0.5379 | 0.8496 | 0.5835 | 0.2997 | 0.5484 | 0.7641 | 0.1651 | 0.5818 | 0.6599 | 0.498 | 0.6726 | 0.8222 | 0.6606 | 0.7391 | 0.4152 | 0.5808 |
| 0.4838 | 37.0 | 27491 | 0.5849 | 0.549 | 0.8514 | 0.591 | 0.3076 | 0.5609 | 0.7749 | 0.1669 | 0.5937 | 0.6703 | 0.5038 | 0.6858 | 0.8311 | 0.6618 | 0.7439 | 0.4362 | 0.5967 |
| 0.4586 | 38.0 | 28234 | 0.5708 | 0.5568 | 0.8591 | 0.6044 | 0.3296 | 0.5658 | 0.7782 | 0.1678 | 0.6007 | 0.6776 | 0.5205 | 0.6896 | 0.8356 | 0.6762 | 0.7563 | 0.4374 | 0.5989 |
| 0.455 | 39.0 | 28977 | 0.5749 | 0.5525 | 0.8581 | 0.6 | 0.3221 | 0.5597 | 0.7801 | 0.1669 | 0.5965 | 0.6695 | 0.5138 | 0.6786 | 0.834 | 0.6722 | 0.7511 | 0.4327 | 0.5879 |
| 0.4426 | 40.0 | 29720 | 0.5670 | 0.5605 | 0.8626 | 0.6086 | 0.3276 | 0.573 | 0.7795 | 0.1678 | 0.6052 | 0.6765 | 0.5188 | 0.6894 | 0.8331 | 0.6771 | 0.7574 | 0.4439 | 0.5957 |
| 0.4337 | 41.0 | 30463 | 0.5652 | 0.5621 | 0.8631 | 0.6117 | 0.3368 | 0.5715 | 0.7822 | 0.1677 | 0.6074 | 0.6774 | 0.5267 | 0.686 | 0.8367 | 0.6783 | 0.7547 | 0.4459 | 0.6001 |
| 0.4181 | 42.0 | 31206 | 0.5612 | 0.5625 | 0.8635 | 0.6096 | 0.3284 | 0.5754 | 0.782 | 0.1677 | 0.6074 | 0.678 | 0.5179 | 0.6906 | 0.8381 | 0.6785 | 0.7569 | 0.4466 | 0.5992 |
| 0.4198 | 43.0 | 31949 | 0.5575 | 0.5692 | 0.8651 | 0.6197 | 0.3385 | 0.5803 | 0.7836 | 0.1698 | 0.6139 | 0.6811 | 0.5308 | 0.6905 | 0.8384 | 0.6867 | 0.7632 | 0.4516 | 0.599 |
| 0.4058 | 44.0 | 32692 | 0.5524 | 0.5706 | 0.8675 | 0.6212 | 0.3436 | 0.5816 | 0.7814 | 0.1692 | 0.6154 | 0.6847 | 0.5345 | 0.6962 | 0.8357 | 0.6871 | 0.7652 | 0.4541 | 0.6042 |
| 0.4026 | 45.0 | 33435 | 0.5508 | 0.5722 | 0.8664 | 0.6223 | 0.3448 | 0.5838 | 0.7817 | 0.1686 | 0.618 | 0.6842 | 0.5337 | 0.6959 | 0.8356 | 0.6902 | 0.767 | 0.4541 | 0.6015 |
| 0.398 | 46.0 | 34178 | 0.5498 | 0.5733 | 0.8701 | 0.6257 | 0.348 | 0.5843 | 0.7826 | 0.169 | 0.6185 | 0.6856 | 0.5347 | 0.6974 | 0.8371 | 0.6898 | 0.7665 | 0.4569 | 0.6048 |
| 0.3954 | 47.0 | 34921 | 0.5471 | 0.5749 | 0.8687 | 0.6261 | 0.3515 | 0.5856 | 0.784 | 0.1695 | 0.6195 | 0.6866 | 0.5355 | 0.6977 | 0.8395 | 0.6925 | 0.7683 | 0.4573 | 0.6049 |
| 0.3967 | 48.0 | 35664 | 0.5491 | 0.5736 | 0.87 | 0.6228 | 0.3481 | 0.5848 | 0.7816 | 0.169 | 0.6181 | 0.6848 | 0.5329 | 0.697 | 0.8362 | 0.6918 | 0.7674 | 0.4554 | 0.6022 |
| 0.3874 | 49.0 | 36407 | 0.5468 | 0.5756 | 0.8689 | 0.6253 | 0.3525 | 0.5857 | 0.783 | 0.1697 | 0.62 | 0.6863 | 0.5366 | 0.6977 | 0.8372 | 0.694 | 0.7691 | 0.4571 | 0.6035 |
| 0.3861 | 50.0 | 37150 | 0.5468 | 0.5754 | 0.8688 | 0.625 | 0.3516 | 0.5857 | 0.783 | 0.1695 | 0.6195 | 0.6862 | 0.5362 | 0.6977 | 0.8372 | 0.6938 | 0.769 | 0.4569 | 0.6034 |
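The `Map 50` and `Map 75` columns above are COCO-style average precision at IoU thresholds of 0.5 and 0.75: a detection counts as a true positive only if its overlap with a ground-truth box clears the threshold, which is why `Map 75` is consistently lower. A minimal IoU sketch (plain Python, for illustration only):

```python
def iou(a, b):
    """Intersection-over-union of two (x0, y0, x1, y1) boxes."""
    ix0, iy0 = max(a[0], b[0]), max(a[1], b[1])
    ix1, iy1 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix1 - ix0) * max(0, iy1 - iy0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

pred, gt = (0, 0, 10, 10), (2, 0, 12, 10)
print(round(iou(pred, gt), 3))  # 0.667 -> a match at IoU 0.5, a miss at IoU 0.75
```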
### Framework versions
- Transformers 4.49.0
- Pytorch 2.5.1+cu121
- Datasets 3.3.2
- Tokenizers 0.21.0
| [
"car",
"pedestrian"
] |
uisikdag/autotrain-detr-resnet-50-excavator-detection |
# Model Trained Using AutoTrain
- Problem type: Object Detection
## Validation Metrics
loss: 0.6406955122947693
map: 0.4249
map_50: 0.5213
map_75: 0.4799
map_small: -1.0
map_medium: 0.0574
map_large: 0.4568
mar_1: 0.5772
mar_10: 0.7076
mar_100: 0.7407
mar_small: -1.0
mar_medium: 0.3029
mar_large: 0.7729
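The `-1.0` values for `map_small` and `mar_small` follow the COCO evaluation convention: the metric is reported as -1 when the validation set has no ground-truth boxes in that size bucket. The buckets are defined by pixel area using the standard COCO thresholds, sketched below:

```python
def coco_size_bucket(w, h):
    """COCO scale buckets: small < 32**2 <= medium < 96**2 <= large (pixel area)."""
    area = w * h
    if area < 32 ** 2:
        return "small"
    if area < 96 ** 2:
        return "medium"
    return "large"

print(coco_size_bucket(20, 20))   # small  (area 400)
print(coco_size_bucket(50, 50))   # medium (area 2500)
print(coco_size_bucket(120, 90))  # large  (area 10800)
```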
| [
"class_0",
"class_1",
"class_2",
"class_3"
] |
uisikdag/autotrain-detr-resnet-50-apple-detection |
# Model Trained Using AutoTrain
- Problem type: Object Detection
## Validation Metrics
loss: 0.8038374781608582
map: 0.3049
map_50: 0.497
map_75: 0.3381
map_small: -1.0
map_medium: 0.3577
map_large: 0.3559
mar_1: 0.2076
mar_10: 0.5234
mar_100: 0.5494
mar_small: -1.0
mar_medium: 0.4669
mar_large: 0.5793
| [
"class_0",
"class_1",
"class_2"
] |