| model_id (string, 9-102 chars) | model_card (string, 4-343k chars) | model_labels (list, 2-50.8k entries) |
|---|---|---|
joe611/chickens-composite-403232323232-150-epochs-wo-transform-metrics-test |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# chickens-composite-403232323232-150-epochs-wo-transform-metrics-test
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified (`None`) dataset.
It achieves the following results on the evaluation set (a sketch of how such COCO-style detection metrics can be computed follows the list):
- Loss: 0.2554
- Map: 0.8454
- Map 50: 0.9638
- Map 75: 0.9414
- Map Small: 0.3586
- Map Medium: 0.8525
- Map Large: 0.8191
- Mar 1: 0.3399
- Mar 10: 0.8795
- Mar 100: 0.8819
- Mar Small: 0.5029
- Mar Medium: 0.8889
- Mar Large: 0.8593
- Map Chicken: 0.8417
- Mar 100 Chicken: 0.8813
- Map Duck: 0.7981
- Mar 100 Duck: 0.8407
- Map Plant: 0.8964
- Mar 100 Plant: 0.9237
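The card does not document how these COCO-style detection metrics were produced. A minimal sketch, assuming an evaluation path built on `torchmetrics`' `MeanAveragePrecision` (the label mapping and dummy boxes below are illustrative, not taken from this card), could look like:

```python
# Hypothetical sketch: COCO-style mAP/mAR with a per-class breakdown via torchmetrics.
# The Trainer's actual evaluation pipeline is not documented in this card.
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# Dummy prediction/target for one image; boxes are [xmin, ymin, xmax, ymax].
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),   # 0 = chicken, 1 = duck, 2 = plant (per model_labels)
}]
targets = [{
    "boxes": torch.tensor([[12.0, 8.0, 52.0, 48.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"])    # overall AP values
print(results["map_per_class"], results["mar_100_per_class"])  # per-class values
```

`class_metrics=True` is what yields the per-class numbers reported above (Map/Mar 100 for Chicken, Duck, and Plant).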
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch reproducing them follows the list):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 150
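A minimal sketch of expressing these values through the `transformers` `TrainingArguments` API, assuming the standard `Trainer` setup (the `output_dir` and everything not listed above are placeholders, not taken from the card):

```python
# Hypothetical sketch mirroring the listed hyperparameters with transformers.TrainingArguments.
# Only the values shown in the card are reproduced; the rest is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-chickens-finetune",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                  # AdamW; betas=(0.9, 0.999) and eps=1e-8 are the defaults
    lr_scheduler_type="cosine",
    num_train_epochs=150,
)
```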
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.0777 | 1.0 | 1000 | 1.0389 | 0.2068 | 0.2913 | 0.2391 | 0.013 | 0.1388 | 0.2388 | 0.0854 | 0.2648 | 0.2788 | 0.0549 | 0.2419 | 0.2929 | 0.0635 | 0.1076 | 0.0 | 0.0 | 0.5568 | 0.7287 |
| 0.8605 | 2.0 | 2000 | 0.8134 | 0.3533 | 0.5097 | 0.4179 | 0.0132 | 0.2947 | 0.3849 | 0.1266 | 0.4611 | 0.4654 | 0.0593 | 0.4277 | 0.4824 | 0.4202 | 0.6698 | 0.0 | 0.0 | 0.6397 | 0.7265 |
| 0.8163 | 3.0 | 3000 | 0.6903 | 0.3885 | 0.5318 | 0.4618 | 0.0471 | 0.3484 | 0.4085 | 0.1319 | 0.4922 | 0.4948 | 0.1039 | 0.4679 | 0.5117 | 0.4633 | 0.7131 | 0.0 | 0.0 | 0.7023 | 0.7713 |
| 0.7015 | 4.0 | 4000 | 0.5761 | 0.4228 | 0.565 | 0.5003 | 0.0365 | 0.389 | 0.4316 | 0.1365 | 0.5115 | 0.5139 | 0.1026 | 0.4867 | 0.5243 | 0.5112 | 0.7286 | 0.0001 | 0.0046 | 0.7572 | 0.8084 |
| 0.5928 | 5.0 | 5000 | 0.5544 | 0.438 | 0.5832 | 0.5134 | 0.0648 | 0.404 | 0.4456 | 0.1392 | 0.5173 | 0.5211 | 0.1939 | 0.4958 | 0.5281 | 0.5463 | 0.7431 | 0.0002 | 0.0046 | 0.7676 | 0.8157 |
| 0.6028 | 6.0 | 6000 | 0.5337 | 0.4498 | 0.5828 | 0.5342 | 0.1047 | 0.4231 | 0.4426 | 0.144 | 0.5279 | 0.5315 | 0.2775 | 0.5092 | 0.5247 | 0.5573 | 0.7543 | 0.0 | 0.0021 | 0.7921 | 0.838 |
| 0.5511 | 7.0 | 7000 | 0.5070 | 0.4707 | 0.6037 | 0.5519 | 0.1308 | 0.4376 | 0.4764 | 0.1481 | 0.5353 | 0.5372 | 0.2447 | 0.5114 | 0.5461 | 0.6194 | 0.7702 | 0.0001 | 0.0077 | 0.7925 | 0.8336 |
| 0.5221 | 8.0 | 8000 | 0.4854 | 0.496 | 0.6353 | 0.5799 | 0.1876 | 0.4756 | 0.5014 | 0.1495 | 0.5374 | 0.5398 | 0.286 | 0.5207 | 0.547 | 0.6793 | 0.7672 | 0.0 | 0.0015 | 0.8087 | 0.8507 |
| 0.5255 | 9.0 | 9000 | 0.4765 | 0.5201 | 0.6555 | 0.6127 | 0.154 | 0.4918 | 0.52 | 0.1641 | 0.5524 | 0.5555 | 0.3129 | 0.5299 | 0.5556 | 0.7176 | 0.7829 | 0.0299 | 0.0335 | 0.8127 | 0.8502 |
| 0.5094 | 10.0 | 10000 | 0.4482 | 0.6952 | 0.935 | 0.8324 | 0.1606 | 0.6718 | 0.7034 | 0.2909 | 0.7386 | 0.7411 | 0.2664 | 0.7321 | 0.7442 | 0.6869 | 0.7366 | 0.5971 | 0.6469 | 0.8016 | 0.8397 |
| 0.4806 | 11.0 | 11000 | 0.4050 | 0.7173 | 0.9272 | 0.8508 | 0.1944 | 0.7026 | 0.6995 | 0.2989 | 0.7653 | 0.7668 | 0.3625 | 0.7569 | 0.7501 | 0.7195 | 0.7714 | 0.6175 | 0.6727 | 0.8148 | 0.8563 |
| 0.4399 | 12.0 | 12000 | 0.3718 | 0.7302 | 0.9368 | 0.8634 | 0.1579 | 0.7164 | 0.7401 | 0.2978 | 0.7738 | 0.7781 | 0.3615 | 0.7704 | 0.7846 | 0.7396 | 0.7907 | 0.6266 | 0.6789 | 0.8243 | 0.8648 |
| 0.4352 | 13.0 | 13000 | 0.3683 | 0.7381 | 0.9387 | 0.8804 | 0.1006 | 0.7237 | 0.7379 | 0.3038 | 0.7804 | 0.7877 | 0.2985 | 0.782 | 0.7847 | 0.7423 | 0.7905 | 0.6523 | 0.7098 | 0.8196 | 0.8629 |
| 0.3661 | 14.0 | 14000 | 0.3637 | 0.7371 | 0.9533 | 0.8831 | 0.2483 | 0.7218 | 0.7479 | 0.3036 | 0.7782 | 0.7834 | 0.4209 | 0.7736 | 0.7855 | 0.7363 | 0.7843 | 0.65 | 0.7031 | 0.8251 | 0.8627 |
| 0.3787 | 15.0 | 15000 | 0.3630 | 0.7421 | 0.95 | 0.8744 | 0.2192 | 0.7345 | 0.7165 | 0.3009 | 0.7818 | 0.7868 | 0.3649 | 0.7837 | 0.7659 | 0.7441 | 0.7879 | 0.6515 | 0.7021 | 0.8306 | 0.8703 |
| 0.4016 | 16.0 | 16000 | 0.3423 | 0.7638 | 0.963 | 0.9129 | 0.2469 | 0.7567 | 0.7516 | 0.3139 | 0.8077 | 0.8116 | 0.4551 | 0.8085 | 0.7946 | 0.7511 | 0.801 | 0.7178 | 0.768 | 0.8224 | 0.8658 |
| 0.3734 | 17.0 | 17000 | 0.3366 | 0.7627 | 0.9572 | 0.8953 | 0.2228 | 0.7498 | 0.7609 | 0.3135 | 0.8018 | 0.8073 | 0.4384 | 0.7996 | 0.8067 | 0.7595 | 0.8111 | 0.6948 | 0.7366 | 0.8338 | 0.8744 |
| 0.4089 | 18.0 | 18000 | 0.3535 | 0.7547 | 0.9596 | 0.9035 | 0.2335 | 0.7443 | 0.7635 | 0.3108 | 0.7979 | 0.8037 | 0.3826 | 0.7987 | 0.8063 | 0.7372 | 0.7853 | 0.7054 | 0.7613 | 0.8215 | 0.8645 |
| 0.386 | 19.0 | 19000 | 0.3192 | 0.776 | 0.9557 | 0.901 | 0.1504 | 0.7675 | 0.7732 | 0.3187 | 0.8141 | 0.8202 | 0.37 | 0.8164 | 0.8149 | 0.77 | 0.8137 | 0.7253 | 0.7727 | 0.8326 | 0.8742 |
| 0.3379 | 20.0 | 20000 | 0.3019 | 0.7837 | 0.9568 | 0.9125 | 0.2331 | 0.7658 | 0.7835 | 0.3238 | 0.8228 | 0.8276 | 0.4099 | 0.8195 | 0.8256 | 0.7882 | 0.8318 | 0.7176 | 0.768 | 0.8452 | 0.8831 |
| 0.3237 | 21.0 | 21000 | 0.3427 | 0.7534 | 0.9543 | 0.8973 | 0.234 | 0.7377 | 0.76 | 0.306 | 0.7913 | 0.7963 | 0.4244 | 0.7846 | 0.8012 | 0.7278 | 0.7746 | 0.698 | 0.7459 | 0.8345 | 0.8683 |
| 0.3787 | 22.0 | 22000 | 0.3177 | 0.7791 | 0.9631 | 0.921 | 0.3373 | 0.7714 | 0.7715 | 0.3184 | 0.818 | 0.8219 | 0.4727 | 0.8171 | 0.8123 | 0.7613 | 0.8054 | 0.7344 | 0.7804 | 0.8415 | 0.8799 |
| 0.363 | 23.0 | 23000 | 0.3142 | 0.7761 | 0.955 | 0.9036 | 0.3077 | 0.7687 | 0.7433 | 0.3168 | 0.8141 | 0.8173 | 0.4544 | 0.8183 | 0.7843 | 0.7704 | 0.8074 | 0.7183 | 0.7613 | 0.8397 | 0.883 |
| 0.3256 | 24.0 | 24000 | 0.3298 | 0.7648 | 0.9578 | 0.8992 | 0.3243 | 0.7578 | 0.7536 | 0.3114 | 0.809 | 0.8121 | 0.467 | 0.8099 | 0.7978 | 0.751 | 0.7966 | 0.7092 | 0.7608 | 0.8343 | 0.8789 |
| 0.3629 | 25.0 | 25000 | 0.3238 | 0.7746 | 0.9575 | 0.9013 | 0.2561 | 0.7578 | 0.773 | 0.3179 | 0.8117 | 0.8165 | 0.4143 | 0.8055 | 0.8169 | 0.7679 | 0.806 | 0.721 | 0.7732 | 0.8349 | 0.8702 |
| 0.3193 | 26.0 | 26000 | 0.3196 | 0.7699 | 0.9585 | 0.9099 | 0.3111 | 0.7654 | 0.775 | 0.3129 | 0.8108 | 0.8151 | 0.469 | 0.8158 | 0.8147 | 0.7563 | 0.798 | 0.7077 | 0.7639 | 0.8456 | 0.8834 |
| 0.3119 | 27.0 | 27000 | 0.3063 | 0.7866 | 0.9609 | 0.9063 | 0.2972 | 0.7846 | 0.773 | 0.3224 | 0.8255 | 0.8295 | 0.4342 | 0.8284 | 0.8149 | 0.7844 | 0.8233 | 0.731 | 0.7804 | 0.8444 | 0.8849 |
| 0.3644 | 28.0 | 28000 | 0.3047 | 0.789 | 0.9631 | 0.9188 | 0.3042 | 0.7771 | 0.7968 | 0.3216 | 0.8292 | 0.8346 | 0.4279 | 0.8264 | 0.8385 | 0.778 | 0.8249 | 0.746 | 0.7959 | 0.843 | 0.883 |
| 0.312 | 29.0 | 29000 | 0.3091 | 0.7825 | 0.9551 | 0.9123 | 0.3037 | 0.7805 | 0.764 | 0.3189 | 0.8218 | 0.8252 | 0.4506 | 0.8265 | 0.8011 | 0.7715 | 0.8155 | 0.728 | 0.7727 | 0.848 | 0.8875 |
| 0.2541 | 30.0 | 30000 | 0.3137 | 0.7853 | 0.9639 | 0.9196 | 0.2953 | 0.7781 | 0.7713 | 0.3156 | 0.8207 | 0.8256 | 0.4451 | 0.821 | 0.8114 | 0.7712 | 0.8125 | 0.7305 | 0.7732 | 0.8542 | 0.8911 |
| 0.3204 | 31.0 | 31000 | 0.2987 | 0.7912 | 0.9557 | 0.9176 | 0.2836 | 0.7826 | 0.7955 | 0.3236 | 0.8258 | 0.8285 | 0.4021 | 0.8251 | 0.8285 | 0.789 | 0.826 | 0.7328 | 0.7727 | 0.8517 | 0.8869 |
| 0.314 | 32.0 | 32000 | 0.2865 | 0.7905 | 0.9563 | 0.9137 | 0.2431 | 0.7889 | 0.7756 | 0.3239 | 0.8278 | 0.8312 | 0.3641 | 0.8324 | 0.8173 | 0.7991 | 0.839 | 0.7176 | 0.7655 | 0.8548 | 0.8892 |
| 0.3041 | 33.0 | 33000 | 0.2888 | 0.7928 | 0.9579 | 0.9221 | 0.2753 | 0.7879 | 0.7912 | 0.3216 | 0.8318 | 0.8348 | 0.3892 | 0.8368 | 0.8271 | 0.7846 | 0.8252 | 0.7294 | 0.7809 | 0.8644 | 0.8983 |
| 0.311 | 34.0 | 34000 | 0.3012 | 0.7866 | 0.9631 | 0.9211 | 0.3232 | 0.7823 | 0.776 | 0.317 | 0.8222 | 0.826 | 0.4078 | 0.8275 | 0.8139 | 0.7746 | 0.8135 | 0.7308 | 0.7722 | 0.8545 | 0.8924 |
| 0.2986 | 35.0 | 35000 | 0.2985 | 0.7789 | 0.9575 | 0.9083 | 0.2946 | 0.7748 | 0.7589 | 0.3175 | 0.816 | 0.8206 | 0.4241 | 0.8223 | 0.8018 | 0.7725 | 0.8219 | 0.7044 | 0.7438 | 0.8597 | 0.8961 |
| 0.2779 | 36.0 | 36000 | 0.2815 | 0.7984 | 0.9577 | 0.9214 | 0.3043 | 0.7975 | 0.7964 | 0.3264 | 0.8364 | 0.8402 | 0.4018 | 0.8414 | 0.8337 | 0.7837 | 0.8258 | 0.7479 | 0.7959 | 0.8636 | 0.899 |
| 0.3041 | 37.0 | 37000 | 0.2968 | 0.7835 | 0.9652 | 0.9151 | 0.2779 | 0.7822 | 0.7796 | 0.3181 | 0.8251 | 0.8294 | 0.3964 | 0.8296 | 0.8234 | 0.7722 | 0.8207 | 0.7261 | 0.7768 | 0.8522 | 0.8907 |
| 0.2759 | 38.0 | 38000 | 0.3044 | 0.7804 | 0.96 | 0.9298 | 0.357 | 0.7765 | 0.7682 | 0.3136 | 0.82 | 0.825 | 0.4764 | 0.8242 | 0.806 | 0.7703 | 0.8189 | 0.7096 | 0.7613 | 0.8613 | 0.8946 |
| 0.2659 | 39.0 | 39000 | 0.2878 | 0.802 | 0.9627 | 0.926 | 0.3142 | 0.799 | 0.789 | 0.3279 | 0.841 | 0.845 | 0.4201 | 0.8447 | 0.8291 | 0.792 | 0.8344 | 0.7472 | 0.8005 | 0.8667 | 0.9 |
| 0.287 | 40.0 | 40000 | 0.2871 | 0.7942 | 0.964 | 0.9279 | 0.3398 | 0.7903 | 0.7839 | 0.3204 | 0.8352 | 0.8393 | 0.4533 | 0.8389 | 0.8277 | 0.784 | 0.8312 | 0.7389 | 0.7881 | 0.8597 | 0.8985 |
| 0.3186 | 41.0 | 41000 | 0.2942 | 0.795 | 0.96 | 0.9199 | 0.3527 | 0.7891 | 0.7935 | 0.3218 | 0.8335 | 0.837 | 0.4792 | 0.8329 | 0.8288 | 0.7732 | 0.8187 | 0.7463 | 0.7923 | 0.8656 | 0.9 |
| 0.318 | 42.0 | 42000 | 0.2868 | 0.7997 | 0.9574 | 0.9266 | 0.3171 | 0.7981 | 0.7881 | 0.324 | 0.8406 | 0.8452 | 0.4519 | 0.845 | 0.8307 | 0.797 | 0.8485 | 0.7407 | 0.7892 | 0.8615 | 0.898 |
| 0.2938 | 43.0 | 43000 | 0.2847 | 0.7943 | 0.9655 | 0.9252 | 0.3267 | 0.7888 | 0.7861 | 0.3237 | 0.834 | 0.8371 | 0.457 | 0.8359 | 0.8237 | 0.7879 | 0.8306 | 0.7315 | 0.782 | 0.8634 | 0.8987 |
| 0.2896 | 44.0 | 44000 | 0.2987 | 0.7884 | 0.9518 | 0.9183 | 0.354 | 0.7829 | 0.772 | 0.3174 | 0.8259 | 0.8296 | 0.464 | 0.8259 | 0.8082 | 0.7842 | 0.8288 | 0.7213 | 0.7655 | 0.8596 | 0.8945 |
| 0.2588 | 45.0 | 45000 | 0.2865 | 0.8004 | 0.9594 | 0.922 | 0.3865 | 0.8 | 0.7771 | 0.3217 | 0.8396 | 0.8419 | 0.4955 | 0.8438 | 0.813 | 0.7962 | 0.8451 | 0.7376 | 0.7799 | 0.8675 | 0.9009 |
| 0.2979 | 46.0 | 46000 | 0.3028 | 0.7692 | 0.9613 | 0.9167 | 0.3461 | 0.7702 | 0.7511 | 0.3139 | 0.8158 | 0.8197 | 0.4684 | 0.82 | 0.7991 | 0.7498 | 0.806 | 0.7057 | 0.7644 | 0.8521 | 0.8885 |
| 0.2467 | 47.0 | 47000 | 0.2854 | 0.8015 | 0.9609 | 0.9304 | 0.3137 | 0.7975 | 0.8061 | 0.3248 | 0.8371 | 0.8409 | 0.4401 | 0.8391 | 0.8429 | 0.7927 | 0.8354 | 0.7457 | 0.7892 | 0.8661 | 0.8981 |
| 0.3189 | 48.0 | 48000 | 0.2804 | 0.8037 | 0.966 | 0.9236 | 0.3198 | 0.8022 | 0.7992 | 0.3237 | 0.8408 | 0.8448 | 0.4561 | 0.843 | 0.8367 | 0.7804 | 0.8249 | 0.7561 | 0.8046 | 0.8746 | 0.9049 |
| 0.2527 | 49.0 | 49000 | 0.2588 | 0.8172 | 0.9648 | 0.9392 | 0.3083 | 0.82 | 0.7967 | 0.3312 | 0.8566 | 0.8609 | 0.4853 | 0.865 | 0.8372 | 0.8058 | 0.8491 | 0.7695 | 0.8227 | 0.8761 | 0.9109 |
| 0.2682 | 50.0 | 50000 | 0.2740 | 0.8047 | 0.9608 | 0.9262 | 0.3255 | 0.8015 | 0.7942 | 0.3274 | 0.8435 | 0.8463 | 0.4503 | 0.8446 | 0.8355 | 0.8002 | 0.8419 | 0.756 | 0.8021 | 0.8578 | 0.8951 |
| 0.2709 | 51.0 | 51000 | 0.2839 | 0.8034 | 0.9627 | 0.9375 | 0.3237 | 0.8041 | 0.7891 | 0.3238 | 0.8433 | 0.8473 | 0.4685 | 0.8493 | 0.8242 | 0.7958 | 0.8364 | 0.7432 | 0.801 | 0.8711 | 0.9044 |
| 0.2605 | 52.0 | 52000 | 0.2930 | 0.7915 | 0.9559 | 0.92 | 0.2725 | 0.7932 | 0.7729 | 0.3206 | 0.8321 | 0.8358 | 0.4688 | 0.8374 | 0.8145 | 0.7878 | 0.8348 | 0.7183 | 0.7716 | 0.8683 | 0.9009 |
| 0.2383 | 53.0 | 53000 | 0.2681 | 0.8148 | 0.9555 | 0.9313 | 0.3031 | 0.815 | 0.7972 | 0.3299 | 0.8531 | 0.8559 | 0.4438 | 0.8565 | 0.8359 | 0.8156 | 0.8571 | 0.7567 | 0.8067 | 0.8721 | 0.9039 |
| 0.2372 | 54.0 | 54000 | 0.2740 | 0.8139 | 0.955 | 0.9275 | 0.2924 | 0.818 | 0.7979 | 0.332 | 0.851 | 0.8556 | 0.4377 | 0.859 | 0.8352 | 0.8068 | 0.8495 | 0.7583 | 0.8077 | 0.8765 | 0.9095 |
| 0.2498 | 55.0 | 55000 | 0.2761 | 0.8101 | 0.9575 | 0.9276 | 0.3276 | 0.8163 | 0.7922 | 0.3296 | 0.851 | 0.8546 | 0.4877 | 0.8606 | 0.8295 | 0.8028 | 0.8485 | 0.7492 | 0.8015 | 0.8784 | 0.9138 |
| 0.233 | 56.0 | 56000 | 0.2814 | 0.8116 | 0.9548 | 0.9196 | 0.2684 | 0.8109 | 0.801 | 0.3301 | 0.8497 | 0.8555 | 0.4748 | 0.8568 | 0.8435 | 0.8139 | 0.8571 | 0.7518 | 0.8041 | 0.8691 | 0.9051 |
| 0.2464 | 57.0 | 57000 | 0.2623 | 0.8174 | 0.9624 | 0.9295 | 0.3325 | 0.8219 | 0.8034 | 0.33 | 0.8532 | 0.8565 | 0.4635 | 0.8607 | 0.8406 | 0.8134 | 0.8535 | 0.761 | 0.8062 | 0.8779 | 0.9099 |
| 0.2549 | 58.0 | 58000 | 0.2771 | 0.8033 | 0.965 | 0.935 | 0.3208 | 0.8014 | 0.795 | 0.3243 | 0.8397 | 0.8427 | 0.4593 | 0.8402 | 0.8308 | 0.7849 | 0.8249 | 0.75 | 0.7985 | 0.8751 | 0.9048 |
| 0.2956 | 59.0 | 59000 | 0.2846 | 0.7956 | 0.9549 | 0.9152 | 0.342 | 0.7951 | 0.7762 | 0.3243 | 0.8358 | 0.8387 | 0.47 | 0.8406 | 0.8193 | 0.7906 | 0.8374 | 0.7276 | 0.7768 | 0.8688 | 0.9017 |
| 0.2468 | 60.0 | 60000 | 0.2522 | 0.8268 | 0.9656 | 0.9312 | 0.359 | 0.833 | 0.7982 | 0.3345 | 0.8631 | 0.8663 | 0.4718 | 0.8726 | 0.8386 | 0.8226 | 0.8632 | 0.7741 | 0.8211 | 0.8837 | 0.9146 |
| 0.258 | 61.0 | 61000 | 0.2676 | 0.8138 | 0.9597 | 0.931 | 0.349 | 0.8216 | 0.7884 | 0.3294 | 0.8541 | 0.8571 | 0.4606 | 0.864 | 0.8268 | 0.8143 | 0.8563 | 0.7525 | 0.8077 | 0.8747 | 0.9071 |
| 0.2439 | 62.0 | 62000 | 0.2643 | 0.8146 | 0.9612 | 0.9303 | 0.3543 | 0.8204 | 0.7861 | 0.3305 | 0.8562 | 0.8588 | 0.4796 | 0.8636 | 0.8297 | 0.8115 | 0.8561 | 0.757 | 0.8108 | 0.8753 | 0.9095 |
| 0.2355 | 63.0 | 63000 | 0.2709 | 0.8123 | 0.9634 | 0.9229 | 0.3778 | 0.813 | 0.7858 | 0.3303 | 0.8483 | 0.8528 | 0.4972 | 0.855 | 0.8264 | 0.8108 | 0.8479 | 0.7523 | 0.8031 | 0.8738 | 0.9074 |
| 0.2522 | 64.0 | 64000 | 0.2627 | 0.8184 | 0.9612 | 0.9221 | 0.3852 | 0.8206 | 0.7973 | 0.3337 | 0.8562 | 0.8586 | 0.4958 | 0.8622 | 0.8351 | 0.8092 | 0.8507 | 0.7626 | 0.8103 | 0.8832 | 0.9147 |
| 0.2428 | 65.0 | 65000 | 0.2917 | 0.8024 | 0.9618 | 0.9159 | 0.2852 | 0.8015 | 0.7915 | 0.3257 | 0.8411 | 0.8441 | 0.4207 | 0.8445 | 0.8292 | 0.7914 | 0.8354 | 0.7549 | 0.8031 | 0.8611 | 0.8939 |
| 0.2405 | 66.0 | 66000 | 0.2771 | 0.81 | 0.962 | 0.9272 | 0.3362 | 0.8129 | 0.7865 | 0.3294 | 0.8512 | 0.8544 | 0.4887 | 0.8582 | 0.8285 | 0.8114 | 0.8563 | 0.7559 | 0.8077 | 0.8627 | 0.8993 |
| 0.255 | 67.0 | 67000 | 0.2727 | 0.8104 | 0.9623 | 0.926 | 0.3047 | 0.8139 | 0.7856 | 0.3283 | 0.8447 | 0.8491 | 0.4225 | 0.8529 | 0.8272 | 0.8019 | 0.8449 | 0.7536 | 0.7959 | 0.8757 | 0.9066 |
| 0.2475 | 68.0 | 68000 | 0.2639 | 0.8162 | 0.9556 | 0.9204 | 0.3243 | 0.8188 | 0.7926 | 0.331 | 0.8545 | 0.8577 | 0.4737 | 0.8609 | 0.8351 | 0.8143 | 0.8598 | 0.755 | 0.801 | 0.8793 | 0.9124 |
| 0.2041 | 69.0 | 69000 | 0.2542 | 0.8289 | 0.9595 | 0.9288 | 0.3341 | 0.833 | 0.8189 | 0.3353 | 0.8623 | 0.8663 | 0.4711 | 0.87 | 0.8509 | 0.8292 | 0.8696 | 0.7689 | 0.8124 | 0.8885 | 0.917 |
| 0.2391 | 70.0 | 70000 | 0.2644 | 0.8188 | 0.9622 | 0.9296 | 0.3309 | 0.8203 | 0.8009 | 0.3339 | 0.8559 | 0.8601 | 0.444 | 0.8605 | 0.8428 | 0.8081 | 0.8563 | 0.7641 | 0.8108 | 0.8843 | 0.9131 |
| 0.1918 | 71.0 | 71000 | 0.2626 | 0.8182 | 0.9631 | 0.9182 | 0.3108 | 0.826 | 0.7918 | 0.3331 | 0.8582 | 0.8617 | 0.4313 | 0.8668 | 0.8396 | 0.8135 | 0.8586 | 0.7589 | 0.8129 | 0.8822 | 0.9137 |
| 0.2418 | 72.0 | 72000 | 0.2565 | 0.8287 | 0.9627 | 0.9296 | 0.3695 | 0.8276 | 0.8099 | 0.3355 | 0.8638 | 0.8683 | 0.5079 | 0.8691 | 0.8483 | 0.8292 | 0.8694 | 0.7733 | 0.8216 | 0.8835 | 0.914 |
| 0.2341 | 73.0 | 73000 | 0.2736 | 0.8156 | 0.9598 | 0.931 | 0.3116 | 0.8172 | 0.7993 | 0.3301 | 0.8541 | 0.8566 | 0.4335 | 0.8559 | 0.838 | 0.8113 | 0.8513 | 0.7619 | 0.8129 | 0.8735 | 0.9055 |
| 0.2384 | 74.0 | 74000 | 0.2652 | 0.8241 | 0.9617 | 0.9293 | 0.3518 | 0.8213 | 0.8063 | 0.3337 | 0.8598 | 0.8621 | 0.4717 | 0.8596 | 0.8437 | 0.814 | 0.8545 | 0.7789 | 0.8227 | 0.8795 | 0.9092 |
| 0.2204 | 75.0 | 75000 | 0.2631 | 0.828 | 0.9668 | 0.9335 | 0.3388 | 0.8291 | 0.8109 | 0.3329 | 0.8635 | 0.8674 | 0.5036 | 0.8668 | 0.8509 | 0.8216 | 0.8632 | 0.7807 | 0.8289 | 0.8817 | 0.9102 |
| 0.2124 | 76.0 | 76000 | 0.2679 | 0.8189 | 0.9638 | 0.928 | 0.3074 | 0.8241 | 0.7996 | 0.3308 | 0.8591 | 0.862 | 0.4452 | 0.8679 | 0.8441 | 0.8011 | 0.8465 | 0.7689 | 0.8242 | 0.8865 | 0.9154 |
| 0.2254 | 77.0 | 77000 | 0.2620 | 0.8233 | 0.9644 | 0.9317 | 0.3056 | 0.8228 | 0.812 | 0.3332 | 0.8603 | 0.8634 | 0.4343 | 0.864 | 0.8499 | 0.8119 | 0.8561 | 0.7773 | 0.8237 | 0.8808 | 0.9103 |
| 0.2171 | 78.0 | 78000 | 0.2613 | 0.8292 | 0.9634 | 0.9308 | 0.3425 | 0.8314 | 0.8124 | 0.3355 | 0.8666 | 0.8694 | 0.4835 | 0.8718 | 0.8512 | 0.8186 | 0.8604 | 0.7792 | 0.8294 | 0.8897 | 0.9185 |
| 0.2361 | 79.0 | 79000 | 0.2700 | 0.8236 | 0.9606 | 0.924 | 0.3619 | 0.8287 | 0.7972 | 0.3327 | 0.861 | 0.8641 | 0.4759 | 0.8719 | 0.8367 | 0.8184 | 0.8622 | 0.7702 | 0.8149 | 0.8824 | 0.9151 |
| 0.2112 | 80.0 | 80000 | 0.2678 | 0.8251 | 0.9632 | 0.925 | 0.3404 | 0.8294 | 0.8013 | 0.3315 | 0.8641 | 0.867 | 0.4954 | 0.8675 | 0.8457 | 0.8223 | 0.8658 | 0.7722 | 0.8227 | 0.8808 | 0.9125 |
| 0.2106 | 81.0 | 81000 | 0.2684 | 0.8197 | 0.9621 | 0.9325 | 0.3081 | 0.8237 | 0.7928 | 0.331 | 0.8555 | 0.859 | 0.4629 | 0.8627 | 0.8278 | 0.8161 | 0.8596 | 0.7587 | 0.8041 | 0.8843 | 0.9132 |
| 0.2021 | 82.0 | 82000 | 0.2732 | 0.8164 | 0.9624 | 0.936 | 0.3383 | 0.8179 | 0.7907 | 0.3303 | 0.8543 | 0.8586 | 0.4965 | 0.8593 | 0.8344 | 0.8119 | 0.8561 | 0.7643 | 0.8124 | 0.8729 | 0.9073 |
| 0.1938 | 83.0 | 83000 | 0.2688 | 0.8262 | 0.9618 | 0.9244 | 0.281 | 0.8355 | 0.7942 | 0.3356 | 0.8644 | 0.8685 | 0.4293 | 0.8748 | 0.8382 | 0.8152 | 0.8632 | 0.7902 | 0.8356 | 0.873 | 0.9068 |
| 0.2231 | 84.0 | 84000 | 0.2531 | 0.8342 | 0.9652 | 0.9373 | 0.3177 | 0.8448 | 0.7984 | 0.3356 | 0.8695 | 0.8721 | 0.4551 | 0.879 | 0.8432 | 0.8315 | 0.8728 | 0.7866 | 0.8309 | 0.8846 | 0.9127 |
| 0.2314 | 85.0 | 85000 | 0.2723 | 0.8151 | 0.9587 | 0.925 | 0.3363 | 0.8207 | 0.7865 | 0.3289 | 0.8503 | 0.8541 | 0.4551 | 0.8602 | 0.8276 | 0.8125 | 0.8569 | 0.7536 | 0.7938 | 0.8792 | 0.9115 |
| 0.2052 | 86.0 | 86000 | 0.2638 | 0.826 | 0.9656 | 0.9323 | 0.3508 | 0.8314 | 0.8039 | 0.334 | 0.8627 | 0.8661 | 0.4971 | 0.87 | 0.844 | 0.8171 | 0.8612 | 0.7766 | 0.8237 | 0.8842 | 0.9134 |
| 0.2455 | 87.0 | 87000 | 0.2559 | 0.8287 | 0.9634 | 0.9376 | 0.3345 | 0.8368 | 0.8071 | 0.3335 | 0.8652 | 0.8686 | 0.4934 | 0.8729 | 0.8473 | 0.823 | 0.868 | 0.7753 | 0.8206 | 0.8879 | 0.917 |
| 0.2092 | 88.0 | 88000 | 0.2528 | 0.8323 | 0.9653 | 0.935 | 0.3103 | 0.8377 | 0.8141 | 0.335 | 0.8653 | 0.8696 | 0.4575 | 0.8733 | 0.851 | 0.824 | 0.864 | 0.7837 | 0.8273 | 0.8891 | 0.9175 |
| 0.2307 | 89.0 | 89000 | 0.2581 | 0.8313 | 0.9631 | 0.9325 | 0.3615 | 0.8363 | 0.8091 | 0.3369 | 0.8671 | 0.8691 | 0.4948 | 0.8745 | 0.8477 | 0.8244 | 0.8672 | 0.7815 | 0.8216 | 0.888 | 0.9183 |
| 0.202 | 90.0 | 90000 | 0.2578 | 0.8324 | 0.9648 | 0.9372 | 0.3184 | 0.8376 | 0.805 | 0.3365 | 0.8686 | 0.8722 | 0.4423 | 0.8799 | 0.8457 | 0.8226 | 0.8656 | 0.7872 | 0.833 | 0.8873 | 0.9179 |
| 0.199 | 91.0 | 91000 | 0.2610 | 0.8245 | 0.964 | 0.93 | 0.3308 | 0.8253 | 0.8027 | 0.3342 | 0.8618 | 0.8654 | 0.4759 | 0.8669 | 0.8453 | 0.8155 | 0.8618 | 0.7703 | 0.818 | 0.8878 | 0.9164 |
| 0.2106 | 92.0 | 92000 | 0.2545 | 0.832 | 0.9644 | 0.9406 | 0.3802 | 0.8344 | 0.8012 | 0.3365 | 0.8683 | 0.8714 | 0.5107 | 0.8758 | 0.8441 | 0.8226 | 0.8654 | 0.7831 | 0.8299 | 0.8903 | 0.9189 |
| 0.1869 | 93.0 | 93000 | 0.2589 | 0.8285 | 0.9636 | 0.938 | 0.3702 | 0.8361 | 0.8014 | 0.3361 | 0.8649 | 0.8686 | 0.5086 | 0.8758 | 0.8438 | 0.8214 | 0.8656 | 0.7779 | 0.8247 | 0.8862 | 0.9156 |
| 0.2032 | 94.0 | 94000 | 0.2597 | 0.8349 | 0.9664 | 0.9383 | 0.4111 | 0.8417 | 0.7981 | 0.338 | 0.8722 | 0.8757 | 0.5746 | 0.8828 | 0.8415 | 0.83 | 0.871 | 0.7888 | 0.8371 | 0.886 | 0.9189 |
| 0.211 | 95.0 | 95000 | 0.2447 | 0.8411 | 0.9676 | 0.9384 | 0.3862 | 0.85 | 0.8085 | 0.3392 | 0.8765 | 0.8808 | 0.5183 | 0.8908 | 0.847 | 0.8365 | 0.8779 | 0.7967 | 0.8423 | 0.8903 | 0.9221 |
| 0.1982 | 96.0 | 96000 | 0.2555 | 0.8309 | 0.9676 | 0.9437 | 0.3621 | 0.834 | 0.8032 | 0.3347 | 0.8646 | 0.8684 | 0.509 | 0.8735 | 0.8408 | 0.8224 | 0.8604 | 0.7882 | 0.8304 | 0.8822 | 0.9144 |
| 0.1964 | 97.0 | 97000 | 0.2496 | 0.8376 | 0.9662 | 0.9431 | 0.393 | 0.8461 | 0.8067 | 0.3389 | 0.8717 | 0.8754 | 0.5202 | 0.884 | 0.8459 | 0.8263 | 0.8678 | 0.7966 | 0.8381 | 0.8898 | 0.9202 |
| 0.188 | 98.0 | 98000 | 0.2560 | 0.8329 | 0.963 | 0.9372 | 0.342 | 0.8379 | 0.8092 | 0.3374 | 0.8673 | 0.872 | 0.4843 | 0.8767 | 0.8494 | 0.8204 | 0.867 | 0.7864 | 0.8294 | 0.892 | 0.9195 |
| 0.2084 | 99.0 | 99000 | 0.2566 | 0.8353 | 0.9652 | 0.9402 | 0.3626 | 0.8444 | 0.8013 | 0.3365 | 0.8707 | 0.8745 | 0.4925 | 0.882 | 0.8442 | 0.823 | 0.8654 | 0.7933 | 0.8387 | 0.8896 | 0.9195 |
| 0.2004 | 100.0 | 100000 | 0.2680 | 0.825 | 0.9633 | 0.9384 | 0.3756 | 0.8293 | 0.7981 | 0.3326 | 0.8611 | 0.8644 | 0.5153 | 0.8696 | 0.8375 | 0.8143 | 0.8555 | 0.7725 | 0.8201 | 0.8883 | 0.9175 |
| 0.1818 | 101.0 | 101000 | 0.2498 | 0.8432 | 0.9643 | 0.9405 | 0.3612 | 0.8505 | 0.805 | 0.3391 | 0.8769 | 0.8797 | 0.4994 | 0.8871 | 0.8442 | 0.8327 | 0.873 | 0.8057 | 0.8459 | 0.8912 | 0.9202 |
| 0.178 | 102.0 | 102000 | 0.2545 | 0.8381 | 0.9604 | 0.9382 | 0.3649 | 0.8464 | 0.8108 | 0.3389 | 0.8734 | 0.8757 | 0.4773 | 0.8831 | 0.8518 | 0.8292 | 0.8702 | 0.7921 | 0.8345 | 0.8931 | 0.9223 |
| 0.1937 | 103.0 | 103000 | 0.2556 | 0.8349 | 0.9647 | 0.9407 | 0.3542 | 0.8371 | 0.813 | 0.3361 | 0.8697 | 0.8733 | 0.5009 | 0.8765 | 0.8536 | 0.8255 | 0.8652 | 0.7893 | 0.8351 | 0.89 | 0.9198 |
| 0.1879 | 104.0 | 104000 | 0.2592 | 0.8356 | 0.9672 | 0.9342 | 0.3424 | 0.8393 | 0.8126 | 0.3354 | 0.8696 | 0.8734 | 0.49 | 0.878 | 0.8504 | 0.8264 | 0.8664 | 0.7942 | 0.8366 | 0.8863 | 0.9173 |
| 0.2098 | 105.0 | 105000 | 0.2523 | 0.8367 | 0.9649 | 0.9434 | 0.3683 | 0.8408 | 0.8066 | 0.3376 | 0.8717 | 0.8747 | 0.5092 | 0.8792 | 0.8444 | 0.8333 | 0.8744 | 0.7882 | 0.832 | 0.8886 | 0.9176 |
| 0.1813 | 106.0 | 106000 | 0.2571 | 0.8383 | 0.9641 | 0.9338 | 0.3601 | 0.8416 | 0.8167 | 0.3378 | 0.8721 | 0.8746 | 0.5145 | 0.8786 | 0.8533 | 0.8352 | 0.8736 | 0.7895 | 0.8309 | 0.8902 | 0.9194 |
| 0.2125 | 107.0 | 107000 | 0.2517 | 0.8389 | 0.9655 | 0.9326 | 0.381 | 0.8432 | 0.8084 | 0.3387 | 0.8742 | 0.8775 | 0.5383 | 0.8819 | 0.8486 | 0.8346 | 0.8742 | 0.7882 | 0.8361 | 0.894 | 0.9221 |
| 0.1987 | 108.0 | 108000 | 0.2592 | 0.8335 | 0.9582 | 0.9326 | 0.3625 | 0.8386 | 0.8135 | 0.338 | 0.8694 | 0.8723 | 0.4925 | 0.877 | 0.8552 | 0.8261 | 0.868 | 0.7815 | 0.8284 | 0.8929 | 0.9207 |
| 0.2048 | 109.0 | 109000 | 0.2571 | 0.8435 | 0.9681 | 0.9365 | 0.3467 | 0.8487 | 0.8149 | 0.3404 | 0.8767 | 0.8806 | 0.4947 | 0.8874 | 0.8505 | 0.8319 | 0.872 | 0.8028 | 0.8469 | 0.8958 | 0.9229 |
| 0.1861 | 110.0 | 110000 | 0.2564 | 0.8409 | 0.9643 | 0.9365 | 0.3893 | 0.8418 | 0.8189 | 0.3404 | 0.8735 | 0.876 | 0.5313 | 0.8792 | 0.8579 | 0.8333 | 0.874 | 0.7955 | 0.834 | 0.8939 | 0.9198 |
| 0.1834 | 111.0 | 111000 | 0.2541 | 0.8398 | 0.9644 | 0.938 | 0.3715 | 0.8442 | 0.822 | 0.339 | 0.8725 | 0.8759 | 0.5182 | 0.8816 | 0.8587 | 0.8327 | 0.8702 | 0.796 | 0.8356 | 0.8907 | 0.9218 |
| 0.1925 | 112.0 | 112000 | 0.2557 | 0.8383 | 0.964 | 0.9352 | 0.3695 | 0.8419 | 0.8198 | 0.3398 | 0.8754 | 0.8784 | 0.5056 | 0.8811 | 0.8632 | 0.8367 | 0.8775 | 0.7873 | 0.8371 | 0.891 | 0.9205 |
| 0.1996 | 113.0 | 113000 | 0.2539 | 0.8412 | 0.9631 | 0.9437 | 0.3637 | 0.8459 | 0.8177 | 0.3394 | 0.8754 | 0.8783 | 0.4989 | 0.8848 | 0.8566 | 0.8363 | 0.8744 | 0.7943 | 0.8392 | 0.8931 | 0.9214 |
| 0.1771 | 114.0 | 114000 | 0.2525 | 0.8473 | 0.9616 | 0.9437 | 0.365 | 0.8512 | 0.8305 | 0.3413 | 0.8792 | 0.8831 | 0.4942 | 0.8892 | 0.8672 | 0.8445 | 0.8827 | 0.8009 | 0.8423 | 0.8963 | 0.9245 |
| 0.1794 | 115.0 | 115000 | 0.2547 | 0.8401 | 0.9666 | 0.9407 | 0.3566 | 0.8433 | 0.819 | 0.3392 | 0.8747 | 0.8782 | 0.4861 | 0.8833 | 0.8583 | 0.829 | 0.8698 | 0.7972 | 0.8423 | 0.8942 | 0.9226 |
| 0.1915 | 116.0 | 116000 | 0.2512 | 0.8442 | 0.9639 | 0.9369 | 0.3772 | 0.8491 | 0.8249 | 0.3396 | 0.8756 | 0.8787 | 0.4985 | 0.8852 | 0.8613 | 0.8439 | 0.8813 | 0.794 | 0.832 | 0.8948 | 0.9227 |
| 0.1716 | 117.0 | 117000 | 0.2520 | 0.8415 | 0.9634 | 0.9411 | 0.395 | 0.8444 | 0.8165 | 0.3397 | 0.8764 | 0.8799 | 0.5354 | 0.8843 | 0.8585 | 0.8375 | 0.8783 | 0.7957 | 0.8407 | 0.8914 | 0.9207 |
| 0.2288 | 118.0 | 118000 | 0.2557 | 0.8364 | 0.9624 | 0.9329 | 0.3636 | 0.8423 | 0.8172 | 0.3382 | 0.871 | 0.8751 | 0.5037 | 0.8821 | 0.8563 | 0.8338 | 0.874 | 0.7838 | 0.8294 | 0.8915 | 0.922 |
| 0.1847 | 119.0 | 119000 | 0.2550 | 0.8475 | 0.9675 | 0.9419 | 0.3875 | 0.8515 | 0.8227 | 0.3403 | 0.8803 | 0.8835 | 0.5371 | 0.8886 | 0.8619 | 0.8386 | 0.8775 | 0.81 | 0.85 | 0.8939 | 0.9231 |
| 0.1883 | 120.0 | 120000 | 0.2592 | 0.842 | 0.9671 | 0.9376 | 0.3497 | 0.8446 | 0.8199 | 0.3392 | 0.8768 | 0.8799 | 0.5026 | 0.8851 | 0.8609 | 0.8365 | 0.8763 | 0.7942 | 0.8407 | 0.8954 | 0.9229 |
| 0.1745 | 121.0 | 121000 | 0.2545 | 0.8424 | 0.9646 | 0.9365 | 0.3549 | 0.8515 | 0.815 | 0.3399 | 0.8771 | 0.8797 | 0.4978 | 0.8887 | 0.8557 | 0.8372 | 0.8775 | 0.7945 | 0.8376 | 0.8955 | 0.924 |
| 0.1779 | 122.0 | 122000 | 0.2553 | 0.8468 | 0.9652 | 0.9393 | 0.3822 | 0.8529 | 0.8214 | 0.3409 | 0.8804 | 0.883 | 0.5073 | 0.89 | 0.8588 | 0.8367 | 0.8748 | 0.8066 | 0.8505 | 0.897 | 0.9236 |
| 0.185 | 123.0 | 123000 | 0.2605 | 0.8413 | 0.966 | 0.9416 | 0.3636 | 0.848 | 0.8138 | 0.3386 | 0.8763 | 0.8789 | 0.4996 | 0.886 | 0.8525 | 0.833 | 0.873 | 0.7977 | 0.8423 | 0.8933 | 0.9215 |
| 0.1827 | 124.0 | 124000 | 0.2546 | 0.8445 | 0.9649 | 0.9423 | 0.3812 | 0.8516 | 0.8209 | 0.3396 | 0.88 | 0.8826 | 0.5116 | 0.8898 | 0.8575 | 0.8396 | 0.8791 | 0.7995 | 0.8459 | 0.8945 | 0.923 |
| 0.1875 | 125.0 | 125000 | 0.2500 | 0.8475 | 0.9674 | 0.9385 | 0.3571 | 0.8549 | 0.821 | 0.3411 | 0.8817 | 0.8843 | 0.5009 | 0.8928 | 0.8582 | 0.8422 | 0.8811 | 0.8041 | 0.8485 | 0.8963 | 0.9234 |
| 0.1694 | 126.0 | 126000 | 0.2610 | 0.8388 | 0.9629 | 0.9428 | 0.3501 | 0.8462 | 0.8117 | 0.3384 | 0.8738 | 0.8761 | 0.4843 | 0.8827 | 0.8517 | 0.8313 | 0.8702 | 0.7915 | 0.8371 | 0.8936 | 0.9211 |
| 0.1899 | 127.0 | 127000 | 0.2542 | 0.8414 | 0.9645 | 0.9408 | 0.3818 | 0.8475 | 0.8125 | 0.3384 | 0.876 | 0.8781 | 0.4918 | 0.8858 | 0.8508 | 0.8398 | 0.8779 | 0.7893 | 0.8335 | 0.895 | 0.9229 |
| 0.1861 | 128.0 | 128000 | 0.2551 | 0.8387 | 0.963 | 0.9424 | 0.3713 | 0.8465 | 0.813 | 0.3378 | 0.8757 | 0.8782 | 0.4943 | 0.8867 | 0.8539 | 0.8353 | 0.8759 | 0.7848 | 0.8351 | 0.8959 | 0.9237 |
| 0.1945 | 129.0 | 129000 | 0.2532 | 0.8441 | 0.9643 | 0.9438 | 0.3788 | 0.8541 | 0.8202 | 0.3405 | 0.88 | 0.8822 | 0.4978 | 0.8919 | 0.8583 | 0.8406 | 0.8797 | 0.795 | 0.8418 | 0.8966 | 0.925 |
| 0.179 | 130.0 | 130000 | 0.2606 | 0.8417 | 0.964 | 0.942 | 0.354 | 0.8486 | 0.8147 | 0.3403 | 0.8773 | 0.8793 | 0.4922 | 0.886 | 0.8545 | 0.8365 | 0.8742 | 0.795 | 0.8423 | 0.8935 | 0.9213 |
| 0.1717 | 131.0 | 131000 | 0.2546 | 0.8423 | 0.9642 | 0.9417 | 0.3611 | 0.849 | 0.8174 | 0.3393 | 0.8766 | 0.8792 | 0.4865 | 0.8866 | 0.8547 | 0.8344 | 0.874 | 0.7952 | 0.8402 | 0.8972 | 0.9234 |
| 0.1802 | 132.0 | 132000 | 0.2572 | 0.8419 | 0.9641 | 0.9436 | 0.3438 | 0.8481 | 0.8167 | 0.3392 | 0.8765 | 0.879 | 0.491 | 0.8851 | 0.8564 | 0.8337 | 0.8726 | 0.7959 | 0.8412 | 0.8961 | 0.923 |
| 0.1993 | 133.0 | 133000 | 0.2560 | 0.8421 | 0.9642 | 0.9403 | 0.3594 | 0.8499 | 0.8141 | 0.3391 | 0.8768 | 0.8795 | 0.5064 | 0.8873 | 0.8545 | 0.8368 | 0.8763 | 0.7912 | 0.8371 | 0.8984 | 0.925 |
| 0.1877 | 134.0 | 134000 | 0.2565 | 0.8434 | 0.9644 | 0.943 | 0.3654 | 0.8479 | 0.8161 | 0.3393 | 0.8762 | 0.8789 | 0.5068 | 0.8853 | 0.8543 | 0.8391 | 0.8769 | 0.7949 | 0.8366 | 0.8961 | 0.9233 |
| 0.1557 | 135.0 | 135000 | 0.2516 | 0.8447 | 0.9644 | 0.9404 | 0.3534 | 0.8503 | 0.8183 | 0.3399 | 0.8782 | 0.881 | 0.4918 | 0.8874 | 0.8583 | 0.8391 | 0.8801 | 0.799 | 0.8402 | 0.8961 | 0.9227 |
| 0.1825 | 136.0 | 136000 | 0.2555 | 0.8453 | 0.9643 | 0.9439 | 0.3752 | 0.852 | 0.8187 | 0.3399 | 0.8783 | 0.8809 | 0.4963 | 0.8881 | 0.8602 | 0.8406 | 0.8793 | 0.7989 | 0.8402 | 0.8963 | 0.9231 |
| 0.1785 | 137.0 | 137000 | 0.2565 | 0.8431 | 0.9642 | 0.9407 | 0.3736 | 0.8496 | 0.8168 | 0.339 | 0.8771 | 0.8794 | 0.4897 | 0.8868 | 0.8573 | 0.8381 | 0.8779 | 0.796 | 0.8371 | 0.8953 | 0.9231 |
| 0.1839 | 138.0 | 138000 | 0.2572 | 0.8405 | 0.9635 | 0.9393 | 0.3572 | 0.847 | 0.8149 | 0.3388 | 0.877 | 0.8792 | 0.4918 | 0.8865 | 0.8562 | 0.8358 | 0.8769 | 0.7904 | 0.8371 | 0.8954 | 0.9236 |
| 0.1637 | 139.0 | 139000 | 0.2531 | 0.8423 | 0.9625 | 0.9426 | 0.3606 | 0.8493 | 0.8205 | 0.3401 | 0.8789 | 0.8813 | 0.4918 | 0.8892 | 0.8605 | 0.8397 | 0.8801 | 0.7909 | 0.8397 | 0.8963 | 0.9242 |
| 0.1719 | 140.0 | 140000 | 0.2543 | 0.843 | 0.9638 | 0.9409 | 0.3549 | 0.8504 | 0.8201 | 0.3395 | 0.8777 | 0.8804 | 0.4999 | 0.8873 | 0.8604 | 0.8372 | 0.8783 | 0.7956 | 0.8392 | 0.8963 | 0.9239 |
| 0.1795 | 141.0 | 141000 | 0.2555 | 0.8433 | 0.9641 | 0.9404 | 0.3562 | 0.8503 | 0.8177 | 0.3396 | 0.8781 | 0.8805 | 0.4961 | 0.8878 | 0.8583 | 0.8384 | 0.8787 | 0.7943 | 0.8392 | 0.8972 | 0.9237 |
| 0.1657 | 142.0 | 142000 | 0.2554 | 0.8429 | 0.9632 | 0.9389 | 0.363 | 0.8488 | 0.8188 | 0.3392 | 0.8777 | 0.8801 | 0.5015 | 0.8868 | 0.8581 | 0.8388 | 0.8785 | 0.7935 | 0.8387 | 0.8965 | 0.9231 |
| 0.1977 | 143.0 | 143000 | 0.2539 | 0.8461 | 0.9641 | 0.9436 | 0.3616 | 0.854 | 0.82 | 0.3407 | 0.8802 | 0.8827 | 0.5015 | 0.8905 | 0.8604 | 0.8415 | 0.8813 | 0.7995 | 0.8428 | 0.8972 | 0.9242 |
| 0.1938 | 144.0 | 144000 | 0.2543 | 0.8456 | 0.9639 | 0.9396 | 0.3635 | 0.8532 | 0.8196 | 0.3403 | 0.8795 | 0.8819 | 0.505 | 0.8891 | 0.8596 | 0.8425 | 0.8817 | 0.7984 | 0.8407 | 0.896 | 0.9234 |
| 0.1728 | 145.0 | 145000 | 0.2548 | 0.8447 | 0.9641 | 0.9415 | 0.3581 | 0.8517 | 0.819 | 0.3394 | 0.8787 | 0.8811 | 0.5029 | 0.8882 | 0.8587 | 0.8413 | 0.8809 | 0.7963 | 0.8387 | 0.8965 | 0.9237 |
| 0.1656 | 146.0 | 146000 | 0.2552 | 0.8449 | 0.9639 | 0.9452 | 0.3586 | 0.8516 | 0.8188 | 0.3395 | 0.879 | 0.8814 | 0.5029 | 0.8884 | 0.8591 | 0.8404 | 0.8805 | 0.7978 | 0.8402 | 0.8963 | 0.9234 |
| 0.1753 | 147.0 | 147000 | 0.2554 | 0.8449 | 0.9639 | 0.9414 | 0.3586 | 0.8516 | 0.8192 | 0.3395 | 0.8788 | 0.8813 | 0.5029 | 0.8882 | 0.8593 | 0.8402 | 0.8801 | 0.7972 | 0.8397 | 0.8972 | 0.924 |
| 0.1928 | 148.0 | 148000 | 0.2554 | 0.8457 | 0.9638 | 0.9414 | 0.3586 | 0.8528 | 0.8199 | 0.3399 | 0.8797 | 0.8822 | 0.5029 | 0.8893 | 0.8598 | 0.8425 | 0.8819 | 0.7981 | 0.8407 | 0.8964 | 0.9239 |
| 0.1662 | 149.0 | 149000 | 0.2554 | 0.8454 | 0.9638 | 0.9414 | 0.3586 | 0.8525 | 0.8191 | 0.3399 | 0.8795 | 0.8819 | 0.5029 | 0.8889 | 0.8593 | 0.8417 | 0.8813 | 0.7981 | 0.8407 | 0.8964 | 0.9237 |
| 0.1556 | 150.0 | 150000 | 0.2554 | 0.8454 | 0.9638 | 0.9414 | 0.3586 | 0.8525 | 0.8191 | 0.3399 | 0.8795 | 0.8819 | 0.5029 | 0.8889 | 0.8593 | 0.8417 | 0.8813 | 0.7981 | 0.8407 | 0.8964 | 0.9237 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.1
| ["chicken", "duck", "plant"] |
joe611/chickens-composite-02020202020-150-epochs-w-transform-metrics-test |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# chickens-composite-02020202020-150-epochs-w-transform-metrics-test
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified (`None`) dataset.
It achieves the following results on the evaluation set (an inference sketch follows the list):
- Loss: 0.2867
- Map: 0.7918
- Map 50: 0.9463
- Map 75: 0.8926
- Map Small: 0.2343
- Map Medium: 0.7785
- Map Large: 0.8114
- Mar 1: 0.3499
- Mar 10: 0.8327
- Mar 100: 0.8371
- Mar Small: 0.2786
- Mar Medium: 0.8275
- Mar Large: 0.842
- Map Chicken: 0.7868
- Mar 100 Chicken: 0.8356
- Map Duck: 0.7203
- Mar 100 Duck: 0.7758
- Map Plant: 0.8682
- Mar 100 Plant: 0.9
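The usage sections below are still placeholders. Assuming the checkpoint is published on the Hub under the repo id shown for this row, a minimal inference sketch with the standard `transformers` object-detection API (the image path and score threshold are illustrative) could look like:

```python
# Hypothetical inference sketch; assumes the checkpoint is available on the Hub under the
# repo id from this row and uses the standard DETR post-processing API from transformers.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "joe611/chickens-composite-02020202020-150-epochs-w-transform-metrics-test"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("flock.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to thresholded detections in absolute pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```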
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the cosine learning-rate schedule follows the list):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 150
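A minimal sketch of the cosine learning-rate schedule named above, using the `transformers` scheduler helper. The zero warmup is the `Trainer` default and the total step count is inferred from the 500-steps-per-epoch results table below; both are assumptions rather than documented values.

```python
# Hypothetical sketch of the "cosine" learning-rate schedule listed above.
# Warmup and total steps are assumptions, not values stated in the card.
import torch
from transformers import get_cosine_schedule_with_warmup

param = torch.nn.Parameter(torch.zeros(1))  # stand-in for model parameters
optimizer = torch.optim.AdamW([param], lr=1e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=75_000  # ~500 steps/epoch * 150 epochs
)

for step in range(3):  # peek at the first few learning-rate values
    optimizer.step()
    scheduler.step()
    print(step, scheduler.get_last_lr())  # decays along a cosine curve from 1e-5 toward 0
```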
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.4028 | 1.0 | 500 | 1.4007 | 0.1032 | 0.149 | 0.1199 | 0.0019 | 0.0595 | 0.12 | 0.0805 | 0.2536 | 0.3566 | 0.1119 | 0.3419 | 0.3611 | 0.0572 | 0.5017 | 0.0 | 0.0 | 0.2524 | 0.568 |
| 1.1569 | 2.0 | 1000 | 1.2683 | 0.2107 | 0.2975 | 0.2472 | 0.0032 | 0.1265 | 0.2298 | 0.101 | 0.3546 | 0.4324 | 0.1655 | 0.384 | 0.4503 | 0.1044 | 0.5301 | 0.0 | 0.0 | 0.5277 | 0.7672 |
| 1.1556 | 3.0 | 1500 | 1.1637 | 0.2445 | 0.3539 | 0.2909 | 0.0142 | 0.1511 | 0.2787 | 0.1108 | 0.3825 | 0.4565 | 0.1702 | 0.4125 | 0.4691 | 0.1606 | 0.6004 | 0.0 | 0.0 | 0.5729 | 0.7692 |
| 1.0682 | 4.0 | 2000 | 1.0133 | 0.3096 | 0.4465 | 0.3615 | 0.0255 | 0.2184 | 0.3511 | 0.1187 | 0.4326 | 0.4626 | 0.0976 | 0.4169 | 0.493 | 0.2763 | 0.618 | 0.0 | 0.0 | 0.6524 | 0.7698 |
| 0.8943 | 5.0 | 2500 | 0.9855 | 0.3031 | 0.4609 | 0.3381 | 0.0948 | 0.2224 | 0.3382 | 0.116 | 0.4171 | 0.4384 | 0.1179 | 0.4006 | 0.4565 | 0.259 | 0.5707 | 0.0 | 0.0 | 0.6503 | 0.7444 |
| 0.8934 | 6.0 | 3000 | 0.9259 | 0.3341 | 0.4896 | 0.3878 | 0.0461 | 0.2597 | 0.3773 | 0.124 | 0.4412 | 0.4595 | 0.0833 | 0.4209 | 0.5054 | 0.3227 | 0.6222 | 0.0 | 0.0 | 0.6798 | 0.7562 |
| 0.7969 | 7.0 | 3500 | 0.8708 | 0.3652 | 0.5141 | 0.4262 | 0.0087 | 0.2969 | 0.413 | 0.1331 | 0.4642 | 0.4781 | 0.0226 | 0.4437 | 0.5125 | 0.3829 | 0.6569 | 0.0 | 0.0 | 0.7128 | 0.7775 |
| 0.8765 | 8.0 | 4000 | 0.8635 | 0.3498 | 0.5025 | 0.4051 | 0.0193 | 0.2876 | 0.3941 | 0.1221 | 0.4575 | 0.4779 | 0.1 | 0.4359 | 0.5054 | 0.3436 | 0.6636 | 0.0 | 0.0 | 0.7059 | 0.7701 |
| 0.7567 | 9.0 | 4500 | 0.7950 | 0.3589 | 0.5031 | 0.4159 | 0.0334 | 0.3002 | 0.3935 | 0.129 | 0.4714 | 0.4908 | 0.0964 | 0.4391 | 0.5252 | 0.3617 | 0.6929 | 0.0 | 0.0 | 0.715 | 0.7796 |
| 0.8705 | 10.0 | 5000 | 0.7733 | 0.3744 | 0.5364 | 0.4346 | 0.004 | 0.3259 | 0.4146 | 0.1276 | 0.4737 | 0.4815 | 0.0762 | 0.4385 | 0.5143 | 0.4088 | 0.6749 | 0.0 | 0.0 | 0.7144 | 0.7695 |
| 0.8581 | 11.0 | 5500 | 0.7106 | 0.4042 | 0.5579 | 0.4747 | 0.0098 | 0.3586 | 0.4354 | 0.1295 | 0.493 | 0.5007 | 0.0571 | 0.4624 | 0.527 | 0.479 | 0.7092 | 0.0 | 0.0 | 0.7337 | 0.7929 |
| 0.8615 | 12.0 | 6000 | 0.7182 | 0.4043 | 0.5639 | 0.4749 | 0.0042 | 0.3559 | 0.4375 | 0.131 | 0.4874 | 0.496 | 0.0905 | 0.4549 | 0.5188 | 0.4919 | 0.7054 | 0.0 | 0.0 | 0.7211 | 0.7825 |
| 0.7232 | 13.0 | 6500 | 0.6980 | 0.4015 | 0.5636 | 0.461 | 0.011 | 0.3574 | 0.4236 | 0.1274 | 0.4847 | 0.4942 | 0.1048 | 0.4544 | 0.5214 | 0.4784 | 0.6946 | 0.0 | 0.0 | 0.7259 | 0.7882 |
| 0.9433 | 14.0 | 7000 | 0.6518 | 0.4203 | 0.5701 | 0.4899 | 0.0138 | 0.3713 | 0.4559 | 0.138 | 0.4957 | 0.5015 | 0.0952 | 0.4593 | 0.5255 | 0.525 | 0.7167 | 0.0 | 0.0 | 0.7359 | 0.7879 |
| 0.7484 | 15.0 | 7500 | 0.6229 | 0.4354 | 0.5736 | 0.5186 | 0.0036 | 0.3859 | 0.4695 | 0.1398 | 0.509 | 0.5159 | 0.0952 | 0.4701 | 0.5409 | 0.5498 | 0.7427 | 0.0 | 0.0 | 0.7563 | 0.805 |
| 0.6236 | 16.0 | 8000 | 0.6744 | 0.4065 | 0.5691 | 0.4806 | 0.0016 | 0.3602 | 0.4482 | 0.137 | 0.4804 | 0.4901 | 0.0429 | 0.4497 | 0.5244 | 0.4729 | 0.6762 | 0.0 | 0.0 | 0.7467 | 0.7941 |
| 0.7592 | 17.0 | 8500 | 0.6446 | 0.4212 | 0.5814 | 0.492 | 0.0081 | 0.3725 | 0.4562 | 0.1454 | 0.4895 | 0.4973 | 0.0714 | 0.4603 | 0.5196 | 0.5266 | 0.7088 | 0.0 | 0.0 | 0.737 | 0.7831 |
| 0.676 | 18.0 | 9000 | 0.6180 | 0.4296 | 0.5808 | 0.5064 | 0.0042 | 0.3813 | 0.4602 | 0.1479 | 0.5018 | 0.5087 | 0.0714 | 0.4676 | 0.5342 | 0.5383 | 0.7234 | 0.0 | 0.0 | 0.7505 | 0.8027 |
| 0.6221 | 19.0 | 9500 | 0.6123 | 0.4305 | 0.5833 | 0.503 | 0.0066 | 0.3893 | 0.4492 | 0.1472 | 0.4987 | 0.5045 | 0.1048 | 0.4735 | 0.5184 | 0.536 | 0.7109 | 0.0 | 0.0 | 0.7556 | 0.8027 |
| 0.5736 | 20.0 | 10000 | 0.5780 | 0.446 | 0.586 | 0.5308 | 0.0178 | 0.4027 | 0.4708 | 0.1467 | 0.5137 | 0.5208 | 0.1238 | 0.4822 | 0.5413 | 0.567 | 0.7477 | 0.0 | 0.0 | 0.7711 | 0.8148 |
| 0.6252 | 21.0 | 10500 | 0.5930 | 0.4359 | 0.5933 | 0.5344 | 0.0123 | 0.3891 | 0.4559 | 0.1444 | 0.495 | 0.5015 | 0.1 | 0.4613 | 0.5173 | 0.5647 | 0.7151 | 0.0 | 0.0 | 0.743 | 0.7893 |
| 0.6602 | 22.0 | 11000 | 0.5519 | 0.4522 | 0.5909 | 0.5234 | 0.0074 | 0.4068 | 0.4801 | 0.151 | 0.5182 | 0.5264 | 0.1048 | 0.4923 | 0.545 | 0.5809 | 0.7573 | 0.0 | 0.0 | 0.7757 | 0.8219 |
| 0.6418 | 23.0 | 11500 | 0.5593 | 0.4536 | 0.5918 | 0.5364 | 0.0122 | 0.4053 | 0.4819 | 0.1502 | 0.5134 | 0.5178 | 0.1 | 0.4835 | 0.5342 | 0.5961 | 0.7456 | 0.0 | 0.0 | 0.7647 | 0.8077 |
| 0.5511 | 24.0 | 12000 | 0.5629 | 0.4534 | 0.6039 | 0.5374 | 0.0114 | 0.4046 | 0.4859 | 0.153 | 0.5207 | 0.529 | 0.1095 | 0.4981 | 0.5474 | 0.5877 | 0.7502 | 0.0057 | 0.0198 | 0.7668 | 0.8172 |
| 0.657 | 25.0 | 12500 | 0.5526 | 0.4536 | 0.602 | 0.5313 | 0.0543 | 0.4096 | 0.4728 | 0.1491 | 0.5126 | 0.5195 | 0.1357 | 0.4839 | 0.527 | 0.5908 | 0.746 | 0.0069 | 0.0044 | 0.7632 | 0.808 |
| 0.6453 | 26.0 | 13000 | 0.5527 | 0.4535 | 0.5965 | 0.5329 | 0.0074 | 0.4053 | 0.486 | 0.1496 | 0.5147 | 0.5218 | 0.0714 | 0.4844 | 0.5427 | 0.5939 | 0.749 | 0.0001 | 0.0011 | 0.7667 | 0.8154 |
| 0.5256 | 27.0 | 13500 | 0.5611 | 0.4564 | 0.6062 | 0.5467 | 0.0166 | 0.408 | 0.4885 | 0.1525 | 0.5144 | 0.5187 | 0.0952 | 0.4777 | 0.5425 | 0.6018 | 0.7406 | 0.0015 | 0.0088 | 0.7658 | 0.8068 |
| 0.5827 | 28.0 | 14000 | 0.5226 | 0.4712 | 0.6153 | 0.5485 | 0.0311 | 0.4306 | 0.503 | 0.1576 | 0.5286 | 0.5337 | 0.0857 | 0.5024 | 0.5537 | 0.6247 | 0.7644 | 0.0036 | 0.0055 | 0.7853 | 0.8311 |
| 0.5469 | 29.0 | 14500 | 0.5364 | 0.4698 | 0.6207 | 0.5634 | 0.07 | 0.4254 | 0.4954 | 0.1591 | 0.5242 | 0.5303 | 0.1488 | 0.5015 | 0.5386 | 0.6274 | 0.7586 | 0.0031 | 0.011 | 0.779 | 0.8213 |
| 0.5167 | 30.0 | 15000 | 0.5388 | 0.4614 | 0.6096 | 0.5525 | 0.0186 | 0.4109 | 0.4979 | 0.1533 | 0.5192 | 0.5233 | 0.1 | 0.4786 | 0.5544 | 0.6108 | 0.7423 | 0.0024 | 0.0132 | 0.7711 | 0.8145 |
| 0.5829 | 31.0 | 15500 | 0.5306 | 0.4713 | 0.6245 | 0.5616 | 0.0081 | 0.4265 | 0.4991 | 0.1544 | 0.5234 | 0.5304 | 0.119 | 0.5005 | 0.5409 | 0.6422 | 0.7665 | 0.0029 | 0.011 | 0.7688 | 0.8136 |
| 0.5343 | 32.0 | 16000 | 0.5142 | 0.481 | 0.6364 | 0.5825 | 0.0703 | 0.4456 | 0.5031 | 0.1605 | 0.5315 | 0.5384 | 0.1583 | 0.5133 | 0.5438 | 0.649 | 0.7632 | 0.0091 | 0.0231 | 0.7847 | 0.829 |
| 0.6008 | 33.0 | 16500 | 0.5275 | 0.4836 | 0.6435 | 0.575 | 0.0123 | 0.4449 | 0.5109 | 0.1629 | 0.534 | 0.5394 | 0.0857 | 0.5094 | 0.5517 | 0.6567 | 0.7611 | 0.0172 | 0.0374 | 0.777 | 0.8198 |
| 0.5171 | 34.0 | 17000 | 0.5114 | 0.5137 | 0.7007 | 0.5899 | 0.0855 | 0.4833 | 0.5404 | 0.1896 | 0.5779 | 0.5853 | 0.1738 | 0.5637 | 0.585 | 0.6844 | 0.7711 | 0.0717 | 0.1571 | 0.7851 | 0.8275 |
| 0.5217 | 35.0 | 17500 | 0.5236 | 0.5015 | 0.6923 | 0.5814 | 0.0922 | 0.4624 | 0.5269 | 0.1853 | 0.5718 | 0.5784 | 0.1774 | 0.5503 | 0.5766 | 0.6545 | 0.7427 | 0.0672 | 0.1659 | 0.7829 | 0.8266 |
| 0.5971 | 36.0 | 18000 | 0.4942 | 0.5181 | 0.6977 | 0.599 | 0.0494 | 0.4871 | 0.5287 | 0.1864 | 0.5759 | 0.583 | 0.1821 | 0.56 | 0.5844 | 0.6911 | 0.7678 | 0.0771 | 0.156 | 0.7862 | 0.8251 |
| 0.6785 | 37.0 | 18500 | 0.4917 | 0.5261 | 0.6974 | 0.6082 | 0.0645 | 0.4874 | 0.5511 | 0.1917 | 0.58 | 0.5845 | 0.1417 | 0.5472 | 0.6057 | 0.6916 | 0.7611 | 0.0965 | 0.1659 | 0.7902 | 0.8263 |
| 0.5474 | 38.0 | 19000 | 0.4861 | 0.5849 | 0.7862 | 0.6779 | 0.0677 | 0.5609 | 0.5807 | 0.2369 | 0.6282 | 0.6344 | 0.1619 | 0.6148 | 0.6238 | 0.6802 | 0.7427 | 0.2843 | 0.3319 | 0.7903 | 0.8287 |
| 0.5205 | 39.0 | 19500 | 0.4513 | 0.6675 | 0.8701 | 0.7964 | 0.1495 | 0.6457 | 0.6641 | 0.2983 | 0.7164 | 0.7209 | 0.2393 | 0.7036 | 0.7133 | 0.6895 | 0.7481 | 0.5075 | 0.567 | 0.8054 | 0.8476 |
| 0.5673 | 40.0 | 20000 | 0.4444 | 0.697 | 0.8958 | 0.8348 | 0.0546 | 0.6706 | 0.7229 | 0.3196 | 0.747 | 0.7515 | 0.1488 | 0.7254 | 0.7721 | 0.7095 | 0.7644 | 0.5829 | 0.6527 | 0.7987 | 0.8373 |
| 0.506 | 41.0 | 20500 | 0.4403 | 0.6973 | 0.9101 | 0.855 | 0.0914 | 0.6657 | 0.7401 | 0.3198 | 0.7508 | 0.7558 | 0.1857 | 0.7269 | 0.7937 | 0.7077 | 0.7598 | 0.5827 | 0.6637 | 0.8016 | 0.8438 |
| 0.4758 | 42.0 | 21000 | 0.4174 | 0.7063 | 0.9019 | 0.8437 | 0.1032 | 0.6837 | 0.7274 | 0.3188 | 0.7529 | 0.7593 | 0.206 | 0.7392 | 0.7777 | 0.7228 | 0.7682 | 0.5886 | 0.6615 | 0.8074 | 0.8482 |
| 0.5359 | 43.0 | 21500 | 0.4246 | 0.7034 | 0.9155 | 0.8383 | 0.0681 | 0.6742 | 0.7279 | 0.321 | 0.7539 | 0.7584 | 0.175 | 0.7343 | 0.7685 | 0.7167 | 0.764 | 0.602 | 0.6769 | 0.7915 | 0.8343 |
| 0.494 | 44.0 | 22000 | 0.4142 | 0.701 | 0.9109 | 0.8539 | 0.0457 | 0.6612 | 0.7498 | 0.315 | 0.7458 | 0.7508 | 0.1679 | 0.7183 | 0.7933 | 0.7141 | 0.7665 | 0.5939 | 0.6516 | 0.7951 | 0.8343 |
| 0.6382 | 45.0 | 22500 | 0.4074 | 0.7136 | 0.9207 | 0.8607 | 0.0343 | 0.6869 | 0.752 | 0.3211 | 0.7618 | 0.768 | 0.1429 | 0.7493 | 0.7882 | 0.7058 | 0.7623 | 0.6174 | 0.6912 | 0.8175 | 0.8506 |
| 0.6811 | 46.0 | 23000 | 0.3996 | 0.7303 | 0.9415 | 0.8514 | 0.1423 | 0.7073 | 0.7583 | 0.3313 | 0.7811 | 0.7871 | 0.2369 | 0.7679 | 0.8027 | 0.7323 | 0.7854 | 0.6482 | 0.7231 | 0.8104 | 0.853 |
| 0.443 | 47.0 | 23500 | 0.4059 | 0.7173 | 0.93 | 0.8484 | 0.1302 | 0.692 | 0.7507 | 0.328 | 0.7649 | 0.7692 | 0.2214 | 0.7505 | 0.7897 | 0.6986 | 0.7448 | 0.6363 | 0.7088 | 0.817 | 0.8541 |
| 0.4663 | 48.0 | 24000 | 0.3971 | 0.7165 | 0.935 | 0.8313 | 0.0973 | 0.6911 | 0.7414 | 0.3183 | 0.7631 | 0.7695 | 0.2048 | 0.7496 | 0.7852 | 0.7186 | 0.7686 | 0.6184 | 0.6901 | 0.8125 | 0.8497 |
| 0.4728 | 49.0 | 24500 | 0.3779 | 0.7371 | 0.9421 | 0.8685 | 0.0804 | 0.7129 | 0.7624 | 0.3296 | 0.7801 | 0.7874 | 0.1857 | 0.7702 | 0.8069 | 0.7437 | 0.7895 | 0.6525 | 0.722 | 0.8152 | 0.8506 |
| 0.6411 | 50.0 | 25000 | 0.4020 | 0.7219 | 0.9356 | 0.8507 | 0.0823 | 0.689 | 0.7456 | 0.3268 | 0.7675 | 0.7739 | 0.175 | 0.7479 | 0.787 | 0.716 | 0.764 | 0.6405 | 0.7088 | 0.8092 | 0.8488 |
| 0.3769 | 51.0 | 25500 | 0.3804 | 0.7326 | 0.9373 | 0.8639 | 0.1069 | 0.7067 | 0.7672 | 0.3251 | 0.7778 | 0.7848 | 0.2405 | 0.7659 | 0.8083 | 0.7317 | 0.7799 | 0.6519 | 0.7242 | 0.8142 | 0.8503 |
| 0.4945 | 52.0 | 26000 | 0.3946 | 0.7261 | 0.9375 | 0.8597 | 0.0862 | 0.6976 | 0.7559 | 0.3221 | 0.7665 | 0.7728 | 0.2298 | 0.7489 | 0.7926 | 0.7245 | 0.7682 | 0.638 | 0.7011 | 0.8157 | 0.8491 |
| 0.4687 | 53.0 | 26500 | 0.3805 | 0.7251 | 0.9252 | 0.874 | 0.1258 | 0.6911 | 0.7687 | 0.3181 | 0.7681 | 0.775 | 0.2679 | 0.747 | 0.8089 | 0.7335 | 0.7862 | 0.6226 | 0.6824 | 0.8192 | 0.8565 |
| 0.5792 | 54.0 | 27000 | 0.3738 | 0.7424 | 0.9394 | 0.8667 | 0.1461 | 0.7139 | 0.7751 | 0.3318 | 0.7855 | 0.7921 | 0.2714 | 0.7738 | 0.8146 | 0.7472 | 0.7962 | 0.6557 | 0.7231 | 0.8243 | 0.8571 |
| 0.4655 | 55.0 | 27500 | 0.3995 | 0.7202 | 0.9248 | 0.8552 | 0.1029 | 0.6947 | 0.7694 | 0.3257 | 0.7624 | 0.7682 | 0.1845 | 0.7491 | 0.8026 | 0.7066 | 0.7536 | 0.6387 | 0.6989 | 0.8154 | 0.8521 |
| 0.4438 | 56.0 | 28000 | 0.3878 | 0.7365 | 0.9339 | 0.8643 | 0.1023 | 0.7126 | 0.7633 | 0.3299 | 0.7831 | 0.7865 | 0.1964 | 0.7708 | 0.8006 | 0.7329 | 0.7787 | 0.6555 | 0.7209 | 0.821 | 0.8601 |
| 0.4894 | 57.0 | 28500 | 0.3993 | 0.7228 | 0.937 | 0.8443 | 0.1098 | 0.6878 | 0.7457 | 0.326 | 0.7675 | 0.7737 | 0.2083 | 0.7489 | 0.7834 | 0.6996 | 0.7456 | 0.6523 | 0.7242 | 0.8164 | 0.8512 |
| 0.4617 | 58.0 | 29000 | 0.3812 | 0.7374 | 0.9223 | 0.8637 | 0.1468 | 0.7072 | 0.7766 | 0.333 | 0.7837 | 0.7876 | 0.2512 | 0.7679 | 0.8131 | 0.7423 | 0.8008 | 0.6414 | 0.6989 | 0.8284 | 0.863 |
| 0.4108 | 59.0 | 29500 | 0.4103 | 0.7132 | 0.9307 | 0.8565 | 0.1059 | 0.6858 | 0.7395 | 0.3272 | 0.7602 | 0.7638 | 0.225 | 0.7483 | 0.7815 | 0.6928 | 0.7393 | 0.6588 | 0.7297 | 0.7878 | 0.8225 |
| 0.5917 | 60.0 | 30000 | 0.3873 | 0.7326 | 0.93 | 0.8635 | 0.1354 | 0.7103 | 0.7562 | 0.3248 | 0.7759 | 0.782 | 0.2155 | 0.7653 | 0.7954 | 0.7366 | 0.787 | 0.6427 | 0.7033 | 0.8186 | 0.8556 |
| 0.4408 | 61.0 | 30500 | 0.3685 | 0.7349 | 0.9404 | 0.8731 | 0.1531 | 0.707 | 0.7531 | 0.333 | 0.7799 | 0.785 | 0.2345 | 0.7605 | 0.7958 | 0.7255 | 0.772 | 0.6599 | 0.7242 | 0.8192 | 0.8589 |
| 0.4881 | 62.0 | 31000 | 0.3971 | 0.724 | 0.9318 | 0.8628 | 0.1548 | 0.7056 | 0.736 | 0.3272 | 0.7677 | 0.7721 | 0.1786 | 0.7569 | 0.778 | 0.7131 | 0.7577 | 0.6594 | 0.7187 | 0.7993 | 0.8399 |
| 0.4307 | 63.0 | 31500 | 0.3745 | 0.7365 | 0.9371 | 0.8813 | 0.1343 | 0.7095 | 0.7581 | 0.3298 | 0.7775 | 0.7836 | 0.2238 | 0.7614 | 0.7997 | 0.7164 | 0.7632 | 0.6726 | 0.7319 | 0.8206 | 0.8556 |
| 0.4612 | 64.0 | 32000 | 0.3491 | 0.7564 | 0.9367 | 0.8746 | 0.1445 | 0.7346 | 0.7809 | 0.3398 | 0.7958 | 0.8002 | 0.2631 | 0.7795 | 0.8247 | 0.7598 | 0.7996 | 0.6749 | 0.7286 | 0.8345 | 0.8725 |
| 0.4558 | 65.0 | 32500 | 0.3588 | 0.7472 | 0.9361 | 0.869 | 0.1337 | 0.7244 | 0.7719 | 0.3354 | 0.7892 | 0.7946 | 0.2571 | 0.7777 | 0.8145 | 0.7494 | 0.7954 | 0.6708 | 0.7297 | 0.8215 | 0.8589 |
| 0.392 | 66.0 | 33000 | 0.3584 | 0.7459 | 0.9422 | 0.873 | 0.1929 | 0.7292 | 0.7584 | 0.3346 | 0.7862 | 0.7923 | 0.2595 | 0.7785 | 0.8018 | 0.7515 | 0.7975 | 0.6634 | 0.7198 | 0.8227 | 0.8598 |
| 0.4172 | 67.0 | 33500 | 0.3857 | 0.7264 | 0.949 | 0.8732 | 0.1647 | 0.7101 | 0.7442 | 0.3225 | 0.7655 | 0.7716 | 0.2476 | 0.7633 | 0.7822 | 0.7163 | 0.7607 | 0.6415 | 0.6978 | 0.8214 | 0.8562 |
| 0.3736 | 68.0 | 34000 | 0.3592 | 0.7448 | 0.9417 | 0.8782 | 0.1767 | 0.7168 | 0.7687 | 0.3336 | 0.7847 | 0.7896 | 0.2798 | 0.7683 | 0.8046 | 0.7402 | 0.7816 | 0.6592 | 0.7209 | 0.8348 | 0.8663 |
| 0.3744 | 69.0 | 34500 | 0.3572 | 0.7365 | 0.945 | 0.8647 | 0.1704 | 0.7109 | 0.7661 | 0.3323 | 0.7833 | 0.7888 | 0.2131 | 0.7684 | 0.8093 | 0.7369 | 0.7879 | 0.6481 | 0.7176 | 0.8246 | 0.8609 |
| 0.4157 | 70.0 | 35000 | 0.3494 | 0.7447 | 0.939 | 0.8636 | 0.1402 | 0.7182 | 0.7657 | 0.3335 | 0.7918 | 0.7975 | 0.2655 | 0.7759 | 0.8124 | 0.7466 | 0.7992 | 0.6571 | 0.7242 | 0.8305 | 0.8692 |
| 0.4944 | 71.0 | 35500 | 0.3328 | 0.7626 | 0.9474 | 0.8651 | 0.1845 | 0.7392 | 0.7887 | 0.3457 | 0.8103 | 0.8142 | 0.2214 | 0.7962 | 0.8287 | 0.7515 | 0.8008 | 0.6971 | 0.7681 | 0.8393 | 0.8737 |
| 0.4611 | 72.0 | 36000 | 0.3360 | 0.7657 | 0.9481 | 0.8754 | 0.1742 | 0.742 | 0.7843 | 0.3417 | 0.8094 | 0.8129 | 0.2821 | 0.7942 | 0.8182 | 0.7708 | 0.8192 | 0.6881 | 0.7473 | 0.8382 | 0.8722 |
| 0.4413 | 73.0 | 36500 | 0.3490 | 0.75 | 0.9387 | 0.8815 | 0.2018 | 0.7324 | 0.7719 | 0.3336 | 0.7964 | 0.8005 | 0.2774 | 0.7863 | 0.8092 | 0.7501 | 0.795 | 0.6662 | 0.7352 | 0.8338 | 0.8713 |
| 0.4712 | 74.0 | 37000 | 0.3358 | 0.7601 | 0.9452 | 0.8747 | 0.1039 | 0.7411 | 0.777 | 0.3431 | 0.8044 | 0.8091 | 0.1536 | 0.7924 | 0.8196 | 0.7468 | 0.7933 | 0.6947 | 0.7593 | 0.8388 | 0.8746 |
| 0.3787 | 75.0 | 37500 | 0.3430 | 0.7593 | 0.9458 | 0.8907 | 0.1535 | 0.74 | 0.7817 | 0.3366 | 0.8001 | 0.8053 | 0.2012 | 0.7898 | 0.8173 | 0.7597 | 0.8063 | 0.6808 | 0.7363 | 0.8374 | 0.8734 |
| 0.4252 | 76.0 | 38000 | 0.3431 | 0.7571 | 0.9431 | 0.8806 | 0.1021 | 0.7366 | 0.7799 | 0.3411 | 0.7994 | 0.8033 | 0.2 | 0.7848 | 0.817 | 0.7454 | 0.7908 | 0.6912 | 0.7495 | 0.8347 | 0.8695 |
| 0.3564 | 77.0 | 38500 | 0.3485 | 0.7473 | 0.9223 | 0.8796 | 0.1115 | 0.7236 | 0.7709 | 0.3269 | 0.7853 | 0.79 | 0.2262 | 0.7692 | 0.8029 | 0.7636 | 0.8138 | 0.6393 | 0.6824 | 0.8391 | 0.8737 |
| 0.4422 | 78.0 | 39000 | 0.3266 | 0.7695 | 0.945 | 0.8811 | 0.1326 | 0.7496 | 0.7889 | 0.3426 | 0.8126 | 0.8179 | 0.1845 | 0.8005 | 0.8325 | 0.7717 | 0.8197 | 0.6951 | 0.7582 | 0.8416 | 0.8757 |
| 0.4254 | 79.0 | 39500 | 0.3294 | 0.7682 | 0.9474 | 0.8808 | 0.1473 | 0.7499 | 0.7884 | 0.3431 | 0.8115 | 0.8169 | 0.2917 | 0.806 | 0.825 | 0.7624 | 0.8126 | 0.7019 | 0.7626 | 0.8402 | 0.8754 |
| 0.5614 | 80.0 | 40000 | 0.3243 | 0.7744 | 0.9485 | 0.9041 | 0.1867 | 0.7562 | 0.7969 | 0.3436 | 0.8148 | 0.8196 | 0.2714 | 0.8054 | 0.8302 | 0.7636 | 0.8142 | 0.7131 | 0.7637 | 0.8463 | 0.8808 |
| 0.3283 | 81.0 | 40500 | 0.3264 | 0.77 | 0.9512 | 0.8837 | 0.1983 | 0.7526 | 0.8022 | 0.3448 | 0.8126 | 0.818 | 0.2833 | 0.8048 | 0.8352 | 0.7619 | 0.8084 | 0.7068 | 0.7659 | 0.8414 | 0.8796 |
| 0.3889 | 82.0 | 41000 | 0.3263 | 0.7817 | 0.9491 | 0.8921 | 0.2169 | 0.766 | 0.8069 | 0.3469 | 0.8168 | 0.8229 | 0.2667 | 0.812 | 0.8382 | 0.7828 | 0.8247 | 0.7145 | 0.7626 | 0.8479 | 0.8814 |
| 0.3442 | 83.0 | 41500 | 0.3267 | 0.7691 | 0.9467 | 0.89 | 0.2088 | 0.7538 | 0.7843 | 0.3389 | 0.8153 | 0.8196 | 0.2595 | 0.8093 | 0.819 | 0.77 | 0.8205 | 0.6952 | 0.7593 | 0.8422 | 0.879 |
| 0.3858 | 84.0 | 42000 | 0.3347 | 0.7635 | 0.9443 | 0.892 | 0.1941 | 0.7414 | 0.7981 | 0.338 | 0.808 | 0.8126 | 0.2833 | 0.7943 | 0.837 | 0.7533 | 0.8067 | 0.6939 | 0.7516 | 0.8433 | 0.8796 |
| 0.3639 | 85.0 | 42500 | 0.3283 | 0.7709 | 0.9471 | 0.8992 | 0.195 | 0.7559 | 0.7866 | 0.3398 | 0.8123 | 0.819 | 0.281 | 0.8038 | 0.8244 | 0.7711 | 0.8192 | 0.6999 | 0.7593 | 0.8416 | 0.8784 |
| 0.3761 | 86.0 | 43000 | 0.3210 | 0.7698 | 0.9442 | 0.8898 | 0.197 | 0.7508 | 0.7919 | 0.3406 | 0.8118 | 0.8173 | 0.2512 | 0.8057 | 0.8238 | 0.7644 | 0.8121 | 0.7072 | 0.7626 | 0.838 | 0.8772 |
| 0.4861 | 87.0 | 43500 | 0.3160 | 0.7821 | 0.9503 | 0.899 | 0.196 | 0.7671 | 0.7959 | 0.3451 | 0.8209 | 0.8265 | 0.2369 | 0.8148 | 0.8258 | 0.7821 | 0.8259 | 0.7174 | 0.767 | 0.8469 | 0.8864 |
| 0.3586 | 88.0 | 44000 | 0.3202 | 0.7715 | 0.9468 | 0.8912 | 0.2025 | 0.7482 | 0.7994 | 0.3402 | 0.8125 | 0.817 | 0.2274 | 0.7963 | 0.835 | 0.7715 | 0.8105 | 0.6988 | 0.7593 | 0.8441 | 0.8811 |
| 0.4547 | 89.0 | 44500 | 0.3250 | 0.773 | 0.9468 | 0.8984 | 0.2276 | 0.7456 | 0.8057 | 0.3444 | 0.8162 | 0.8212 | 0.2679 | 0.7979 | 0.8446 | 0.7746 | 0.823 | 0.7049 | 0.7626 | 0.8397 | 0.8778 |
| 0.3646 | 90.0 | 45000 | 0.3179 | 0.7704 | 0.9508 | 0.8899 | 0.1969 | 0.7533 | 0.7924 | 0.3422 | 0.8123 | 0.8175 | 0.2548 | 0.8055 | 0.829 | 0.7697 | 0.8092 | 0.6979 | 0.7604 | 0.8435 | 0.8828 |
| 0.3942 | 91.0 | 45500 | 0.3112 | 0.7769 | 0.9422 | 0.8929 | 0.213 | 0.7553 | 0.8103 | 0.3443 | 0.8184 | 0.8231 | 0.2845 | 0.8056 | 0.8429 | 0.7797 | 0.8234 | 0.6981 | 0.7582 | 0.8529 | 0.8876 |
| 0.3932 | 92.0 | 46000 | 0.3057 | 0.7802 | 0.9483 | 0.8893 | 0.2187 | 0.7659 | 0.8095 | 0.3462 | 0.8232 | 0.828 | 0.281 | 0.8147 | 0.8472 | 0.7801 | 0.8222 | 0.7111 | 0.7758 | 0.8493 | 0.8861 |
| 0.4528 | 93.0 | 46500 | 0.3111 | 0.7768 | 0.948 | 0.8954 | 0.1844 | 0.7549 | 0.8161 | 0.3448 | 0.8217 | 0.827 | 0.2357 | 0.809 | 0.8512 | 0.7812 | 0.8285 | 0.6964 | 0.7626 | 0.8527 | 0.8899 |
| 0.4239 | 94.0 | 47000 | 0.3126 | 0.78 | 0.9474 | 0.8895 | 0.2248 | 0.7666 | 0.805 | 0.3461 | 0.823 | 0.8279 | 0.2988 | 0.8157 | 0.8424 | 0.7827 | 0.8301 | 0.7099 | 0.7681 | 0.8474 | 0.8855 |
| 0.3341 | 95.0 | 47500 | 0.3240 | 0.7673 | 0.9526 | 0.9017 | 0.2292 | 0.7526 | 0.7891 | 0.3391 | 0.8118 | 0.8157 | 0.2952 | 0.8035 | 0.8289 | 0.7622 | 0.8096 | 0.7014 | 0.7626 | 0.8383 | 0.8749 |
| 0.3849 | 96.0 | 48000 | 0.3116 | 0.7703 | 0.9468 | 0.8931 | 0.2012 | 0.7506 | 0.798 | 0.341 | 0.8123 | 0.8168 | 0.2369 | 0.7984 | 0.8381 | 0.7773 | 0.8264 | 0.6834 | 0.7385 | 0.8501 | 0.8855 |
| 0.4644 | 97.0 | 48500 | 0.3258 | 0.7674 | 0.9535 | 0.9073 | 0.1899 | 0.7485 | 0.792 | 0.3411 | 0.8113 | 0.8153 | 0.2274 | 0.7985 | 0.8317 | 0.767 | 0.8113 | 0.7033 | 0.7593 | 0.8321 | 0.8751 |
| 0.3149 | 98.0 | 49000 | 0.3149 | 0.779 | 0.9549 | 0.9193 | 0.2234 | 0.7595 | 0.8007 | 0.3426 | 0.8186 | 0.8226 | 0.2619 | 0.8089 | 0.834 | 0.7738 | 0.8197 | 0.7167 | 0.7637 | 0.8466 | 0.8843 |
| 0.3366 | 99.0 | 49500 | 0.3108 | 0.7792 | 0.9435 | 0.8995 | 0.1821 | 0.7552 | 0.8211 | 0.3447 | 0.8199 | 0.8236 | 0.2357 | 0.8039 | 0.8528 | 0.7786 | 0.8264 | 0.7071 | 0.7549 | 0.852 | 0.8896 |
| 0.4285 | 100.0 | 50000 | 0.3148 | 0.7793 | 0.9518 | 0.8912 | 0.1654 | 0.7618 | 0.7996 | 0.3437 | 0.8209 | 0.8255 | 0.2012 | 0.8141 | 0.8321 | 0.7704 | 0.813 | 0.7169 | 0.7758 | 0.8507 | 0.8876 |
| 0.338 | 101.0 | 50500 | 0.2999 | 0.789 | 0.9553 | 0.9009 | 0.2188 | 0.7707 | 0.8101 | 0.3477 | 0.8354 | 0.8397 | 0.2619 | 0.8259 | 0.8479 | 0.7843 | 0.8347 | 0.7291 | 0.7945 | 0.8538 | 0.8899 |
| 0.4276 | 102.0 | 51000 | 0.3197 | 0.7727 | 0.9493 | 0.8895 | 0.2196 | 0.7529 | 0.8002 | 0.3398 | 0.8179 | 0.8224 | 0.2714 | 0.8071 | 0.8352 | 0.766 | 0.8159 | 0.6999 | 0.7637 | 0.8521 | 0.8876 |
| 0.3458 | 103.0 | 51500 | 0.3131 | 0.776 | 0.9461 | 0.8902 | 0.2038 | 0.7555 | 0.8043 | 0.3415 | 0.8193 | 0.8248 | 0.2976 | 0.8067 | 0.8408 | 0.78 | 0.8297 | 0.6984 | 0.7582 | 0.8494 | 0.8864 |
| 0.3632 | 104.0 | 52000 | 0.3111 | 0.7706 | 0.9429 | 0.8902 | 0.2237 | 0.7502 | 0.7957 | 0.3397 | 0.8139 | 0.8183 | 0.2571 | 0.8025 | 0.8306 | 0.7744 | 0.8243 | 0.6862 | 0.7418 | 0.8511 | 0.8888 |
| 0.3418 | 105.0 | 52500 | 0.2995 | 0.7866 | 0.9533 | 0.8959 | 0.2163 | 0.773 | 0.7984 | 0.3482 | 0.8267 | 0.8312 | 0.2726 | 0.8204 | 0.8316 | 0.7871 | 0.8322 | 0.7198 | 0.7725 | 0.8528 | 0.8888 |
| 0.4659 | 106.0 | 53000 | 0.3084 | 0.7809 | 0.9461 | 0.8917 | 0.2058 | 0.764 | 0.7998 | 0.3442 | 0.8173 | 0.8221 | 0.2405 | 0.8071 | 0.8331 | 0.7829 | 0.8226 | 0.7091 | 0.7571 | 0.8506 | 0.8867 |
| 0.3353 | 107.0 | 53500 | 0.3096 | 0.7724 | 0.941 | 0.8869 | 0.2076 | 0.7529 | 0.7959 | 0.3409 | 0.8136 | 0.82 | 0.2667 | 0.8055 | 0.8362 | 0.7739 | 0.8234 | 0.691 | 0.744 | 0.8523 | 0.8926 |
| 0.4245 | 108.0 | 54000 | 0.3102 | 0.7815 | 0.9454 | 0.8897 | 0.2151 | 0.7601 | 0.8095 | 0.3418 | 0.8217 | 0.8271 | 0.2679 | 0.8108 | 0.843 | 0.7797 | 0.8264 | 0.7083 | 0.7615 | 0.8565 | 0.8935 |
| 0.3782 | 109.0 | 54500 | 0.3119 | 0.7723 | 0.9472 | 0.8848 | 0.2107 | 0.7563 | 0.7891 | 0.3407 | 0.8148 | 0.8197 | 0.231 | 0.8085 | 0.8266 | 0.7684 | 0.8184 | 0.6932 | 0.7495 | 0.8552 | 0.8911 |
| 0.3651 | 110.0 | 55000 | 0.2999 | 0.7909 | 0.9535 | 0.9105 | 0.1801 | 0.7794 | 0.8125 | 0.347 | 0.8301 | 0.8359 | 0.2107 | 0.8304 | 0.844 | 0.7845 | 0.8335 | 0.7307 | 0.7802 | 0.8575 | 0.8941 |
| 0.2835 | 111.0 | 55500 | 0.3068 | 0.7864 | 0.9536 | 0.898 | 0.2101 | 0.7707 | 0.8047 | 0.3453 | 0.8264 | 0.8318 | 0.2536 | 0.8228 | 0.8375 | 0.7747 | 0.8255 | 0.7242 | 0.7769 | 0.8603 | 0.8929 |
| 0.3702 | 112.0 | 56000 | 0.3100 | 0.7764 | 0.9481 | 0.8884 | 0.2008 | 0.7606 | 0.7983 | 0.344 | 0.8192 | 0.8244 | 0.2405 | 0.8142 | 0.8301 | 0.7667 | 0.8126 | 0.7063 | 0.7692 | 0.8561 | 0.8914 |
| 0.3475 | 113.0 | 56500 | 0.2957 | 0.7895 | 0.9499 | 0.8996 | 0.2189 | 0.7707 | 0.8125 | 0.3459 | 0.8293 | 0.835 | 0.2667 | 0.8223 | 0.845 | 0.7887 | 0.8372 | 0.7203 | 0.7736 | 0.8595 | 0.8941 |
| 0.4056 | 114.0 | 57000 | 0.3000 | 0.7834 | 0.9456 | 0.893 | 0.2203 | 0.7659 | 0.805 | 0.3443 | 0.8236 | 0.8292 | 0.2798 | 0.8183 | 0.8355 | 0.782 | 0.8297 | 0.7073 | 0.7615 | 0.8609 | 0.8964 |
| 0.3708 | 115.0 | 57500 | 0.2999 | 0.7834 | 0.9461 | 0.8906 | 0.2253 | 0.7644 | 0.8069 | 0.3467 | 0.8256 | 0.83 | 0.2714 | 0.8162 | 0.8417 | 0.7827 | 0.8305 | 0.7082 | 0.7659 | 0.8592 | 0.8935 |
| 0.365 | 116.0 | 58000 | 0.3010 | 0.7841 | 0.9443 | 0.8948 | 0.2201 | 0.7647 | 0.8019 | 0.3444 | 0.8243 | 0.8283 | 0.2607 | 0.8166 | 0.8345 | 0.7735 | 0.8218 | 0.7153 | 0.7681 | 0.8635 | 0.895 |
| 0.6735 | 117.0 | 58500 | 0.2989 | 0.7816 | 0.9433 | 0.8879 | 0.2312 | 0.765 | 0.7929 | 0.3428 | 0.8237 | 0.8286 | 0.3167 | 0.82 | 0.8265 | 0.7816 | 0.8347 | 0.7014 | 0.7549 | 0.8618 | 0.8962 |
| 0.3886 | 118.0 | 59000 | 0.3009 | 0.7801 | 0.9493 | 0.8908 | 0.2309 | 0.7666 | 0.7872 | 0.3436 | 0.8234 | 0.8274 | 0.2893 | 0.8166 | 0.8262 | 0.7695 | 0.8205 | 0.7099 | 0.767 | 0.861 | 0.8947 |
| 0.4127 | 119.0 | 59500 | 0.2982 | 0.7874 | 0.9496 | 0.893 | 0.2301 | 0.771 | 0.8011 | 0.3482 | 0.8301 | 0.8346 | 0.2798 | 0.8237 | 0.8364 | 0.7818 | 0.8301 | 0.7181 | 0.778 | 0.8624 | 0.8956 |
| 0.3387 | 120.0 | 60000 | 0.2970 | 0.7884 | 0.9474 | 0.8815 | 0.2342 | 0.7709 | 0.8079 | 0.3493 | 0.8296 | 0.8343 | 0.2845 | 0.8198 | 0.8406 | 0.7811 | 0.8251 | 0.7189 | 0.7802 | 0.8653 | 0.8976 |
| 0.3816 | 121.0 | 60500 | 0.2993 | 0.7831 | 0.9469 | 0.8864 | 0.2278 | 0.7687 | 0.795 | 0.3458 | 0.8241 | 0.8291 | 0.2762 | 0.8182 | 0.8301 | 0.7806 | 0.8268 | 0.7087 | 0.767 | 0.8601 | 0.8935 |
| 0.3813 | 122.0 | 61000 | 0.2972 | 0.786 | 0.9465 | 0.8914 | 0.2353 | 0.7676 | 0.8009 | 0.3462 | 0.8273 | 0.8325 | 0.2976 | 0.8184 | 0.8399 | 0.7873 | 0.8347 | 0.7079 | 0.767 | 0.8626 | 0.8959 |
| 0.3785 | 123.0 | 61500 | 0.2949 | 0.7896 | 0.9484 | 0.8975 | 0.23 | 0.776 | 0.8025 | 0.3475 | 0.831 | 0.8352 | 0.2833 | 0.8218 | 0.8402 | 0.7832 | 0.8301 | 0.7223 | 0.778 | 0.8632 | 0.8973 |
| 0.3602 | 124.0 | 62000 | 0.2936 | 0.7902 | 0.9496 | 0.8968 | 0.2422 | 0.7785 | 0.8011 | 0.3494 | 0.8306 | 0.8355 | 0.2833 | 0.825 | 0.8368 | 0.7852 | 0.8301 | 0.7183 | 0.7769 | 0.867 | 0.8994 |
| 0.3403 | 125.0 | 62500 | 0.2882 | 0.7907 | 0.9451 | 0.8956 | 0.2436 | 0.7753 | 0.8116 | 0.3495 | 0.8332 | 0.8378 | 0.2976 | 0.8255 | 0.8476 | 0.7891 | 0.836 | 0.7172 | 0.7791 | 0.8657 | 0.8982 |
| 0.39 | 126.0 | 63000 | 0.2917 | 0.7919 | 0.949 | 0.8946 | 0.2401 | 0.7759 | 0.8135 | 0.349 | 0.8313 | 0.8362 | 0.2881 | 0.8242 | 0.849 | 0.783 | 0.8305 | 0.7224 | 0.7758 | 0.8704 | 0.9024 |
| 0.3643 | 127.0 | 63500 | 0.2932 | 0.7887 | 0.9491 | 0.8942 | 0.237 | 0.7749 | 0.8047 | 0.3466 | 0.8302 | 0.8343 | 0.2893 | 0.823 | 0.8402 | 0.7812 | 0.8289 | 0.7195 | 0.7758 | 0.8654 | 0.8982 |
| 0.3822 | 128.0 | 64000 | 0.2889 | 0.7937 | 0.9457 | 0.9002 | 0.2327 | 0.7757 | 0.8141 | 0.3512 | 0.8345 | 0.8397 | 0.2893 | 0.8281 | 0.8491 | 0.7926 | 0.841 | 0.7228 | 0.778 | 0.8656 | 0.9 |
| 0.377 | 129.0 | 64500 | 0.2926 | 0.7891 | 0.9477 | 0.8955 | 0.2366 | 0.773 | 0.8104 | 0.3492 | 0.83 | 0.8349 | 0.2976 | 0.8235 | 0.8443 | 0.7861 | 0.8356 | 0.718 | 0.7725 | 0.8631 | 0.8967 |
| 0.3162 | 130.0 | 65000 | 0.2890 | 0.7921 | 0.9485 | 0.8954 | 0.2376 | 0.7722 | 0.8173 | 0.3513 | 0.8332 | 0.8378 | 0.3024 | 0.8237 | 0.85 | 0.7905 | 0.8393 | 0.7211 | 0.7769 | 0.8646 | 0.897 |
| 0.3399 | 131.0 | 65500 | 0.2894 | 0.7912 | 0.9466 | 0.8924 | 0.237 | 0.7737 | 0.8152 | 0.3493 | 0.8312 | 0.8356 | 0.2976 | 0.8241 | 0.8459 | 0.7856 | 0.831 | 0.7222 | 0.7769 | 0.8658 | 0.8988 |
| 0.3201 | 132.0 | 66000 | 0.2857 | 0.7934 | 0.948 | 0.899 | 0.2417 | 0.7749 | 0.8127 | 0.3512 | 0.8338 | 0.8384 | 0.306 | 0.8254 | 0.8456 | 0.7885 | 0.836 | 0.7219 | 0.778 | 0.8698 | 0.9012 |
| 0.3627 | 133.0 | 66500 | 0.2849 | 0.7937 | 0.9479 | 0.8927 | 0.2158 | 0.777 | 0.8169 | 0.3511 | 0.8333 | 0.838 | 0.2798 | 0.8265 | 0.8497 | 0.7869 | 0.8356 | 0.7235 | 0.7769 | 0.8707 | 0.9015 |
| 0.356 | 134.0 | 67000 | 0.2875 | 0.7904 | 0.946 | 0.8934 | 0.2285 | 0.7735 | 0.8125 | 0.3494 | 0.83 | 0.8345 | 0.2845 | 0.823 | 0.8433 | 0.7857 | 0.8339 | 0.7184 | 0.7714 | 0.8672 | 0.8982 |
| 0.3514 | 135.0 | 67500 | 0.2868 | 0.7909 | 0.9464 | 0.8928 | 0.226 | 0.7745 | 0.8119 | 0.3497 | 0.8305 | 0.8351 | 0.2786 | 0.8233 | 0.8429 | 0.785 | 0.8351 | 0.7197 | 0.7714 | 0.8679 | 0.8988 |
| 0.3632 | 136.0 | 68000 | 0.2867 | 0.7917 | 0.9463 | 0.8935 | 0.226 | 0.7771 | 0.8123 | 0.3506 | 0.8319 | 0.8366 | 0.2702 | 0.8267 | 0.8451 | 0.7888 | 0.836 | 0.72 | 0.7747 | 0.8663 | 0.8991 |
| 0.3077 | 137.0 | 68500 | 0.2863 | 0.7915 | 0.9468 | 0.893 | 0.2342 | 0.7769 | 0.8136 | 0.3496 | 0.8319 | 0.8367 | 0.2786 | 0.8254 | 0.847 | 0.7886 | 0.8372 | 0.7194 | 0.7736 | 0.8665 | 0.8991 |
| 0.2911 | 138.0 | 69000 | 0.2887 | 0.7917 | 0.9478 | 0.8926 | 0.2376 | 0.7762 | 0.8091 | 0.349 | 0.8322 | 0.8368 | 0.2976 | 0.8258 | 0.8432 | 0.7879 | 0.8351 | 0.7191 | 0.7747 | 0.8679 | 0.9006 |
| 0.4159 | 139.0 | 69500 | 0.2869 | 0.7926 | 0.9485 | 0.8932 | 0.2376 | 0.7763 | 0.8119 | 0.3499 | 0.8334 | 0.8377 | 0.2976 | 0.8267 | 0.8421 | 0.7898 | 0.8364 | 0.7208 | 0.7769 | 0.8671 | 0.8997 |
| 0.6002 | 140.0 | 70000 | 0.2845 | 0.7939 | 0.9468 | 0.8933 | 0.2345 | 0.7803 | 0.8121 | 0.3505 | 0.8344 | 0.8386 | 0.2786 | 0.8289 | 0.8429 | 0.7912 | 0.8368 | 0.7211 | 0.7769 | 0.8694 | 0.9021 |
| 0.4506 | 141.0 | 70500 | 0.2858 | 0.793 | 0.9463 | 0.8927 | 0.226 | 0.7788 | 0.8129 | 0.35 | 0.8342 | 0.8387 | 0.2702 | 0.829 | 0.8447 | 0.7905 | 0.8385 | 0.7209 | 0.7769 | 0.8677 | 0.9006 |
| 0.3234 | 142.0 | 71000 | 0.2849 | 0.7936 | 0.9464 | 0.893 | 0.226 | 0.7792 | 0.8134 | 0.3504 | 0.8342 | 0.8389 | 0.2702 | 0.8293 | 0.8435 | 0.7897 | 0.8381 | 0.7211 | 0.7769 | 0.8699 | 0.9018 |
| 0.3553 | 143.0 | 71500 | 0.2860 | 0.7929 | 0.9465 | 0.8928 | 0.2342 | 0.7799 | 0.8114 | 0.3499 | 0.8335 | 0.8382 | 0.2786 | 0.8293 | 0.8422 | 0.7891 | 0.8377 | 0.7206 | 0.7758 | 0.8688 | 0.9012 |
| 0.3854 | 144.0 | 72000 | 0.2862 | 0.7931 | 0.9466 | 0.8929 | 0.2342 | 0.7799 | 0.8126 | 0.3503 | 0.8336 | 0.8381 | 0.2786 | 0.8283 | 0.844 | 0.7891 | 0.8364 | 0.7216 | 0.7769 | 0.8686 | 0.9009 |
| 0.3787 | 145.0 | 72500 | 0.2859 | 0.7928 | 0.9465 | 0.8928 | 0.2342 | 0.7793 | 0.8131 | 0.3505 | 0.8335 | 0.838 | 0.2786 | 0.8282 | 0.844 | 0.789 | 0.8372 | 0.7208 | 0.7758 | 0.8687 | 0.9009 |
| 0.4225 | 146.0 | 73000 | 0.2864 | 0.7919 | 0.9465 | 0.8929 | 0.2342 | 0.7784 | 0.8117 | 0.3495 | 0.8325 | 0.837 | 0.2786 | 0.8275 | 0.8421 | 0.7873 | 0.8356 | 0.7197 | 0.7747 | 0.8688 | 0.9006 |
| 0.389 | 147.0 | 73500 | 0.2867 | 0.7921 | 0.9463 | 0.8927 | 0.2342 | 0.7789 | 0.8116 | 0.3499 | 0.833 | 0.8377 | 0.2786 | 0.828 | 0.8422 | 0.7869 | 0.8356 | 0.7212 | 0.7769 | 0.8681 | 0.9006 |
| 0.3307 | 148.0 | 74000 | 0.2867 | 0.7921 | 0.9464 | 0.8927 | 0.2343 | 0.779 | 0.8114 | 0.3499 | 0.8331 | 0.8375 | 0.2786 | 0.828 | 0.842 | 0.787 | 0.8356 | 0.7212 | 0.7769 | 0.8682 | 0.9 |
| 0.3403 | 149.0 | 74500 | 0.2867 | 0.7918 | 0.9463 | 0.8926 | 0.2343 | 0.7785 | 0.8114 | 0.3499 | 0.8327 | 0.8371 | 0.2786 | 0.8275 | 0.842 | 0.7868 | 0.8356 | 0.7203 | 0.7758 | 0.8682 | 0.9 |
| 0.4439 | 150.0 | 75000 | 0.2867 | 0.7918 | 0.9463 | 0.8926 | 0.2343 | 0.7785 | 0.8114 | 0.3499 | 0.8327 | 0.8371 | 0.2786 | 0.8275 | 0.842 | 0.7868 | 0.8356 | 0.7203 | 0.7758 | 0.8682 | 0.9 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.1
| [
"chicken",
"duck",
"plant"
] |
ARG-NCTU/detr-resnet-50-finetuned-30-epochs-boat-dataset |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-finetuned-30-epochs-boat-dataset
This model is a fine-tuned version of [ARG-NCTU/detr-resnet-50-finetuned-30-epochs-boat-dataset](https://huggingface.co/ARG-NCTU/detr-resnet-50-finetuned-30-epochs-boat-dataset) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
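The configuration above roughly maps onto a 🤗 `TrainingArguments` setup like the sketch below; the output directory is a placeholder and the dataset/`Trainer` wiring is omitted, so treat it as an approximation rather than the exact training script.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above
# ("detr-boat-finetune" is a placeholder output directory).
training_args = TrainingArguments(
    output_dir="detr-boat-finetune",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=25,
)
```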
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
| [
"ballonboat",
"bigboat",
"boat",
"jetski",
"katamaran",
"sailboat",
"smallboat",
"speedboat",
"wam_v",
"container_ship",
"tugship",
"yacht",
"blueboat"
] |
madhutry/detr-finetuned-other-98samples-4 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
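The original card leaves this section empty. As a minimal sketch (assuming the checkpoint exposes a standard object-detection head; the image path and the 0.5 score threshold are placeholders, not taken from the card), the model can be loaded with 🤗 Transformers as follows:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "madhutry/detr-finetuned-other-98samples-4"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to (score, label, box) tuples in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```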
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12"
] |
joe611/chickens-composite-403232323232-150-epochs-w-transform-metrics-test |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# chickens-composite-403232323232-150-epochs-w-transform-metrics-test
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2394
- Map: 0.8413
- Map 50: 0.9641
- Map 75: 0.9341
- Map Small: 0.3268
- Map Medium: 0.8408
- Map Large: 0.8507
- Mar 1: 0.3376
- Mar 10: 0.8711
- Mar 100: 0.8749
- Mar Small: 0.3947
- Mar Medium: 0.8792
- Mar Large: 0.881
- Map Chicken: 0.8309
- Mar 100 Chicken: 0.8738
- Map Duck: 0.7956
- Mar 100 Duck: 0.8294
- Map Plant: 0.8973
- Mar 100 Plant: 0.9215
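The Map/Mar figures above follow the usual COCO-style definitions: mean average precision averaged over IoU thresholds, AP at IoU 0.50 and 0.75, size-stratified variants, and per-class values for chicken, duck, and plant. A hedged sketch of how comparable numbers can be computed with `torchmetrics` is shown below; the boxes, scores, and labels are toy placeholders rather than the actual validation data.

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# class_metrics=True yields per-class AP/AR, analogous to the
# Map Chicken / Map Duck / Map Plant values above (toy data only).
metric = MeanAveragePrecision(box_format="xyxy", iou_type="bbox", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 60.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),  # 0 = chicken, 1 = duck, 2 = plant
}]
targets = [{
    "boxes": torch.tensor([[12.0, 8.0, 52.0, 62.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
print(metric.compute())  # includes map, map_50, map_75, mar_100, map_per_class, ...
```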
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 150
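The optimizer/schedule pairing listed above (adamw_torch with a cosine learning-rate schedule over 150 epochs, i.e. 150000 steps in the results table below) can be sketched roughly as follows; the `torch.nn.Linear` module stands in for the actual DETR model and the warmup step count is an assumption.

```python
import torch
from transformers import get_cosine_schedule_with_warmup

model = torch.nn.Linear(4, 4)  # stand-in for the DETR model, illustration only

# adamw_torch with the betas/epsilon listed above.
optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-5, betas=(0.9, 0.999), eps=1e-8
)

# Cosine decay over the full run; 150 epochs x 1000 steps/epoch = 150000 steps
# (see the step column below). num_warmup_steps=0 is an assumption.
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=150_000
)
```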
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.1267 | 1.0 | 1000 | 1.2143 | 0.2128 | 0.3053 | 0.2462 | 0.0212 | 0.1247 | 0.2574 | 0.1042 | 0.3253 | 0.3593 | 0.0767 | 0.3264 | 0.3725 | 0.1029 | 0.3376 | 0.0023 | 0.0036 | 0.5333 | 0.7365 |
| 1.1423 | 2.0 | 2000 | 1.0583 | 0.2633 | 0.384 | 0.2877 | 0.0549 | 0.1816 | 0.2767 | 0.1024 | 0.3946 | 0.4425 | 0.13 | 0.4032 | 0.4552 | 0.1457 | 0.5795 | 0.0 | 0.0 | 0.6442 | 0.7479 |
| 0.9168 | 3.0 | 3000 | 0.9335 | 0.3008 | 0.4485 | 0.341 | 0.0694 | 0.2511 | 0.3019 | 0.1107 | 0.4104 | 0.4208 | 0.1154 | 0.3965 | 0.4023 | 0.2437 | 0.534 | 0.0 | 0.0 | 0.6588 | 0.7284 |
| 0.8732 | 4.0 | 4000 | 0.8630 | 0.3268 | 0.4634 | 0.3815 | 0.0599 | 0.2883 | 0.3552 | 0.1214 | 0.4573 | 0.49 | 0.1374 | 0.4629 | 0.5136 | 0.2845 | 0.7165 | 0.0 | 0.0 | 0.6958 | 0.7536 |
| 0.7036 | 5.0 | 5000 | 0.7595 | 0.3359 | 0.4642 | 0.3931 | 0.0739 | 0.2912 | 0.3764 | 0.1255 | 0.4737 | 0.5037 | 0.1679 | 0.4693 | 0.5219 | 0.2899 | 0.7404 | 0.0 | 0.0 | 0.7178 | 0.7707 |
| 0.979 | 6.0 | 6000 | 0.7069 | 0.3728 | 0.5214 | 0.4438 | 0.0768 | 0.3332 | 0.4012 | 0.1296 | 0.4833 | 0.4882 | 0.1172 | 0.4582 | 0.5065 | 0.4058 | 0.7012 | 0.0 | 0.0 | 0.7125 | 0.7633 |
| 0.7254 | 7.0 | 7000 | 0.6566 | 0.3939 | 0.5385 | 0.4648 | 0.0613 | 0.3571 | 0.4192 | 0.1369 | 0.4969 | 0.5026 | 0.153 | 0.4737 | 0.5184 | 0.4487 | 0.7254 | 0.0 | 0.0 | 0.7331 | 0.7825 |
| 0.6696 | 8.0 | 8000 | 0.6276 | 0.4213 | 0.5797 | 0.5037 | 0.0573 | 0.3791 | 0.4437 | 0.1378 | 0.4964 | 0.4996 | 0.1033 | 0.467 | 0.5195 | 0.5311 | 0.7203 | 0.0 | 0.0 | 0.7327 | 0.7786 |
| 0.6583 | 9.0 | 9000 | 0.6021 | 0.4331 | 0.5873 | 0.5108 | 0.0796 | 0.3986 | 0.4557 | 0.1397 | 0.5046 | 0.5091 | 0.1539 | 0.478 | 0.5262 | 0.5463 | 0.7264 | 0.0 | 0.0 | 0.7528 | 0.8009 |
| 0.5876 | 10.0 | 10000 | 0.5796 | 0.4447 | 0.6032 | 0.5274 | 0.1098 | 0.4084 | 0.4701 | 0.1426 | 0.5064 | 0.5112 | 0.1833 | 0.4843 | 0.5263 | 0.5865 | 0.7376 | 0.0 | 0.0 | 0.7475 | 0.7961 |
| 0.4736 | 11.0 | 11000 | 0.5645 | 0.4457 | 0.6013 | 0.5331 | 0.1092 | 0.4073 | 0.4769 | 0.1397 | 0.5073 | 0.5115 | 0.2014 | 0.4811 | 0.5278 | 0.5734 | 0.7258 | 0.0 | 0.0 | 0.7638 | 0.8087 |
| 0.6139 | 12.0 | 12000 | 0.5577 | 0.4428 | 0.5984 | 0.5244 | 0.0509 | 0.4043 | 0.4745 | 0.1392 | 0.5128 | 0.5177 | 0.1709 | 0.4876 | 0.5341 | 0.5648 | 0.7408 | 0.0 | 0.0 | 0.7636 | 0.8124 |
| 0.7356 | 13.0 | 13000 | 0.5289 | 0.4651 | 0.6128 | 0.5378 | 0.048 | 0.4393 | 0.489 | 0.146 | 0.5238 | 0.5272 | 0.1102 | 0.5075 | 0.5375 | 0.6242 | 0.762 | 0.0 | 0.0 | 0.7711 | 0.8197 |
| 0.5112 | 14.0 | 14000 | 0.5340 | 0.4658 | 0.6239 | 0.5556 | 0.0889 | 0.4309 | 0.4849 | 0.1444 | 0.5135 | 0.5164 | 0.1865 | 0.4892 | 0.5316 | 0.6346 | 0.7404 | 0.0 | 0.0 | 0.7627 | 0.8087 |
| 0.5302 | 15.0 | 15000 | 0.5103 | 0.478 | 0.6328 | 0.5738 | 0.0727 | 0.4478 | 0.4967 | 0.1478 | 0.5216 | 0.5244 | 0.2081 | 0.498 | 0.54 | 0.6635 | 0.7571 | 0.0 | 0.0 | 0.7705 | 0.816 |
| 0.4855 | 16.0 | 16000 | 0.5183 | 0.5038 | 0.6727 | 0.6016 | 0.1285 | 0.4726 | 0.5257 | 0.1738 | 0.5454 | 0.5486 | 0.2061 | 0.5213 | 0.5649 | 0.6608 | 0.7408 | 0.0921 | 0.0928 | 0.7587 | 0.8121 |
| 0.4891 | 17.0 | 17000 | 0.4886 | 0.6311 | 0.8473 | 0.7764 | 0.1735 | 0.612 | 0.6078 | 0.2544 | 0.6734 | 0.6769 | 0.2801 | 0.6651 | 0.6446 | 0.676 | 0.7467 | 0.4514 | 0.4758 | 0.7659 | 0.8082 |
| 0.5348 | 18.0 | 18000 | 0.4468 | 0.6909 | 0.8959 | 0.8328 | 0.1362 | 0.6815 | 0.6633 | 0.2864 | 0.733 | 0.7378 | 0.2592 | 0.7358 | 0.701 | 0.7055 | 0.7714 | 0.586 | 0.6165 | 0.7811 | 0.8256 |
| 0.5136 | 19.0 | 19000 | 0.4364 | 0.6945 | 0.9219 | 0.8428 | 0.1052 | 0.6706 | 0.73 | 0.2905 | 0.7425 | 0.7467 | 0.254 | 0.7333 | 0.7721 | 0.6837 | 0.7541 | 0.6198 | 0.6603 | 0.78 | 0.8256 |
| 0.7763 | 20.0 | 20000 | 0.4125 | 0.7082 | 0.9235 | 0.8439 | 0.115 | 0.6876 | 0.729 | 0.2988 | 0.7549 | 0.7599 | 0.2393 | 0.7483 | 0.7776 | 0.6954 | 0.763 | 0.6381 | 0.6794 | 0.791 | 0.8374 |
| 0.531 | 21.0 | 21000 | 0.4182 | 0.7035 | 0.9164 | 0.8414 | 0.1179 | 0.6748 | 0.7277 | 0.297 | 0.7502 | 0.7537 | 0.2354 | 0.7369 | 0.7777 | 0.6977 | 0.7618 | 0.6212 | 0.6639 | 0.7917 | 0.8354 |
| 0.5738 | 22.0 | 22000 | 0.4124 | 0.704 | 0.924 | 0.852 | 0.1622 | 0.6842 | 0.7077 | 0.2951 | 0.7477 | 0.7512 | 0.2837 | 0.7373 | 0.7539 | 0.6921 | 0.7501 | 0.6329 | 0.6696 | 0.787 | 0.8338 |
| 0.4659 | 23.0 | 23000 | 0.3881 | 0.7229 | 0.9359 | 0.8721 | 0.1295 | 0.6982 | 0.7482 | 0.3033 | 0.7637 | 0.7699 | 0.3028 | 0.7558 | 0.7945 | 0.7215 | 0.7793 | 0.6537 | 0.6887 | 0.7935 | 0.8418 |
| 0.476 | 24.0 | 24000 | 0.4041 | 0.6974 | 0.9329 | 0.8277 | 0.1692 | 0.6708 | 0.7309 | 0.2946 | 0.7399 | 0.7446 | 0.2606 | 0.7272 | 0.7784 | 0.6745 | 0.7354 | 0.6167 | 0.6562 | 0.8008 | 0.8422 |
| 0.4144 | 25.0 | 25000 | 0.3697 | 0.7384 | 0.9476 | 0.8756 | 0.1688 | 0.7133 | 0.7618 | 0.3101 | 0.7763 | 0.7809 | 0.2761 | 0.7666 | 0.8109 | 0.7223 | 0.773 | 0.6905 | 0.7237 | 0.8024 | 0.846 |
| 0.6979 | 26.0 | 26000 | 0.3699 | 0.7347 | 0.9514 | 0.8711 | 0.1661 | 0.7181 | 0.73 | 0.3042 | 0.7751 | 0.7798 | 0.274 | 0.7739 | 0.7747 | 0.7269 | 0.7813 | 0.6696 | 0.7113 | 0.8076 | 0.8469 |
| 0.5105 | 27.0 | 27000 | 0.3672 | 0.7439 | 0.9498 | 0.8953 | 0.1973 | 0.7226 | 0.7565 | 0.3107 | 0.7836 | 0.7882 | 0.3234 | 0.7824 | 0.7989 | 0.7272 | 0.7801 | 0.7013 | 0.7397 | 0.8031 | 0.845 |
| 0.4635 | 28.0 | 28000 | 0.3588 | 0.7511 | 0.9547 | 0.8886 | 0.246 | 0.7298 | 0.7617 | 0.3085 | 0.7913 | 0.7971 | 0.3761 | 0.7844 | 0.8112 | 0.7263 | 0.7821 | 0.7119 | 0.7521 | 0.815 | 0.8571 |
| 0.5447 | 29.0 | 29000 | 0.4269 | 0.7002 | 0.9507 | 0.8595 | 0.2218 | 0.6811 | 0.7176 | 0.2901 | 0.7457 | 0.75 | 0.3412 | 0.7405 | 0.7706 | 0.6529 | 0.7137 | 0.6597 | 0.7052 | 0.7882 | 0.8313 |
| 0.3582 | 30.0 | 30000 | 0.3582 | 0.7525 | 0.9477 | 0.8899 | 0.1913 | 0.7345 | 0.7843 | 0.3133 | 0.791 | 0.7963 | 0.3302 | 0.785 | 0.8261 | 0.7305 | 0.7865 | 0.7022 | 0.7356 | 0.8248 | 0.8668 |
| 0.5428 | 31.0 | 31000 | 0.3839 | 0.7306 | 0.9549 | 0.8861 | 0.157 | 0.707 | 0.7562 | 0.3038 | 0.772 | 0.7767 | 0.291 | 0.7621 | 0.8003 | 0.7048 | 0.766 | 0.6953 | 0.7356 | 0.7918 | 0.8285 |
| 0.4257 | 32.0 | 32000 | 0.3435 | 0.7691 | 0.9579 | 0.8979 | 0.1462 | 0.7552 | 0.7886 | 0.3172 | 0.806 | 0.8112 | 0.2958 | 0.8032 | 0.8352 | 0.7551 | 0.8058 | 0.732 | 0.7675 | 0.8203 | 0.8601 |
| 0.4555 | 33.0 | 33000 | 0.3404 | 0.7594 | 0.9571 | 0.9004 | 0.1775 | 0.7488 | 0.7696 | 0.3116 | 0.7999 | 0.8043 | 0.2972 | 0.7983 | 0.8178 | 0.7524 | 0.8014 | 0.7026 | 0.7495 | 0.8232 | 0.8622 |
| 0.4041 | 34.0 | 34000 | 0.3376 | 0.7636 | 0.9597 | 0.8954 | 0.2114 | 0.7438 | 0.7815 | 0.3155 | 0.8009 | 0.8066 | 0.335 | 0.7991 | 0.8226 | 0.754 | 0.8028 | 0.7136 | 0.7562 | 0.8233 | 0.8608 |
| 0.4214 | 35.0 | 35000 | 0.3321 | 0.763 | 0.9569 | 0.9045 | 0.2104 | 0.7405 | 0.7754 | 0.3153 | 0.8026 | 0.8058 | 0.3039 | 0.7932 | 0.8216 | 0.7431 | 0.8006 | 0.7167 | 0.7521 | 0.8291 | 0.8648 |
| 0.4448 | 36.0 | 36000 | 0.3371 | 0.7682 | 0.9618 | 0.9098 | 0.2417 | 0.7433 | 0.7862 | 0.3155 | 0.8044 | 0.8084 | 0.33 | 0.7944 | 0.8259 | 0.755 | 0.799 | 0.7253 | 0.7613 | 0.8242 | 0.8648 |
| 0.4959 | 37.0 | 37000 | 0.3453 | 0.7548 | 0.9564 | 0.9069 | 0.1768 | 0.7334 | 0.7937 | 0.3097 | 0.7944 | 0.7977 | 0.3251 | 0.7796 | 0.8376 | 0.7553 | 0.8038 | 0.701 | 0.7371 | 0.8082 | 0.8521 |
| 0.4388 | 38.0 | 38000 | 0.3390 | 0.7615 | 0.961 | 0.9044 | 0.1391 | 0.7481 | 0.7754 | 0.311 | 0.7998 | 0.8038 | 0.277 | 0.7954 | 0.8243 | 0.7523 | 0.8012 | 0.7146 | 0.7531 | 0.8175 | 0.8572 |
| 0.364 | 39.0 | 39000 | 0.3301 | 0.7624 | 0.95 | 0.8996 | 0.1968 | 0.758 | 0.7724 | 0.3122 | 0.8012 | 0.8046 | 0.2741 | 0.8034 | 0.8187 | 0.7593 | 0.8119 | 0.7028 | 0.7371 | 0.825 | 0.8649 |
| 0.4423 | 40.0 | 40000 | 0.3265 | 0.7682 | 0.951 | 0.8942 | 0.1885 | 0.7628 | 0.7604 | 0.3178 | 0.8099 | 0.8139 | 0.3056 | 0.8098 | 0.8128 | 0.7557 | 0.8115 | 0.7204 | 0.7577 | 0.8285 | 0.8726 |
| 0.3772 | 41.0 | 41000 | 0.3485 | 0.7493 | 0.9639 | 0.9004 | 0.2278 | 0.7341 | 0.7674 | 0.3049 | 0.7884 | 0.7944 | 0.3943 | 0.7842 | 0.8144 | 0.7131 | 0.7694 | 0.7034 | 0.7443 | 0.8314 | 0.8694 |
| 0.4682 | 42.0 | 42000 | 0.3570 | 0.7437 | 0.9572 | 0.894 | 0.2024 | 0.7235 | 0.7549 | 0.3045 | 0.782 | 0.7869 | 0.331 | 0.7728 | 0.8017 | 0.7434 | 0.794 | 0.6911 | 0.7299 | 0.7965 | 0.837 |
| 0.4829 | 43.0 | 43000 | 0.3295 | 0.7733 | 0.9559 | 0.9061 | 0.2047 | 0.7635 | 0.7853 | 0.3174 | 0.8091 | 0.8123 | 0.2933 | 0.8077 | 0.8238 | 0.77 | 0.8153 | 0.7311 | 0.767 | 0.8189 | 0.8546 |
| 0.4646 | 44.0 | 44000 | 0.3219 | 0.7697 | 0.9622 | 0.9075 | 0.2313 | 0.756 | 0.7799 | 0.3135 | 0.808 | 0.8135 | 0.367 | 0.8049 | 0.8236 | 0.7647 | 0.8129 | 0.7122 | 0.7577 | 0.8321 | 0.87 |
| 0.3236 | 45.0 | 45000 | 0.3212 | 0.7713 | 0.961 | 0.9025 | 0.2236 | 0.7623 | 0.7637 | 0.3156 | 0.8108 | 0.8148 | 0.3744 | 0.8115 | 0.8092 | 0.7576 | 0.8139 | 0.7256 | 0.7624 | 0.8307 | 0.8683 |
| 0.3759 | 46.0 | 46000 | 0.3281 | 0.756 | 0.9604 | 0.8943 | 0.2171 | 0.7466 | 0.7606 | 0.3069 | 0.7945 | 0.8001 | 0.3131 | 0.7971 | 0.8076 | 0.7556 | 0.8042 | 0.6797 | 0.7237 | 0.8326 | 0.8725 |
| 0.3627 | 47.0 | 47000 | 0.3181 | 0.7674 | 0.9546 | 0.9086 | 0.2258 | 0.7583 | 0.7736 | 0.3161 | 0.8091 | 0.8149 | 0.3537 | 0.8108 | 0.8225 | 0.7595 | 0.8161 | 0.7131 | 0.7577 | 0.8297 | 0.8709 |
| 0.4146 | 48.0 | 48000 | 0.3072 | 0.7807 | 0.9641 | 0.9209 | 0.2599 | 0.7662 | 0.8023 | 0.3178 | 0.8212 | 0.8257 | 0.3962 | 0.8184 | 0.8447 | 0.7758 | 0.8249 | 0.7334 | 0.7758 | 0.833 | 0.8763 |
| 0.4113 | 49.0 | 49000 | 0.3452 | 0.74 | 0.9649 | 0.8877 | 0.258 | 0.7318 | 0.7372 | 0.3014 | 0.7885 | 0.793 | 0.374 | 0.784 | 0.7922 | 0.7012 | 0.7678 | 0.6958 | 0.7443 | 0.8229 | 0.867 |
| 0.4146 | 50.0 | 50000 | 0.3091 | 0.7822 | 0.9588 | 0.9016 | 0.2296 | 0.7753 | 0.7786 | 0.3203 | 0.82 | 0.8255 | 0.3581 | 0.8178 | 0.8262 | 0.772 | 0.8284 | 0.7344 | 0.7716 | 0.8404 | 0.8766 |
| 0.3777 | 51.0 | 51000 | 0.3168 | 0.7772 | 0.9559 | 0.9072 | 0.1974 | 0.7684 | 0.7844 | 0.3203 | 0.8163 | 0.8209 | 0.2859 | 0.816 | 0.8346 | 0.7685 | 0.8183 | 0.7287 | 0.7696 | 0.8343 | 0.8747 |
| 0.3417 | 52.0 | 52000 | 0.3135 | 0.7745 | 0.958 | 0.9068 | 0.2364 | 0.7548 | 0.804 | 0.3164 | 0.8137 | 0.8183 | 0.3284 | 0.8071 | 0.8433 | 0.7776 | 0.828 | 0.7113 | 0.7515 | 0.8347 | 0.8754 |
| 0.4088 | 53.0 | 53000 | 0.3145 | 0.7689 | 0.9569 | 0.9183 | 0.2712 | 0.7593 | 0.7631 | 0.3134 | 0.8108 | 0.8147 | 0.3583 | 0.8093 | 0.8093 | 0.7593 | 0.8117 | 0.7116 | 0.7557 | 0.8358 | 0.8767 |
| 0.4384 | 54.0 | 54000 | 0.2973 | 0.7848 | 0.9618 | 0.9157 | 0.2523 | 0.7746 | 0.7875 | 0.3201 | 0.8243 | 0.8284 | 0.3324 | 0.8227 | 0.8343 | 0.768 | 0.8203 | 0.7407 | 0.782 | 0.8456 | 0.8828 |
| 0.3848 | 55.0 | 55000 | 0.3071 | 0.7806 | 0.9572 | 0.911 | 0.2747 | 0.7711 | 0.7916 | 0.3205 | 0.8198 | 0.8234 | 0.3651 | 0.8158 | 0.8349 | 0.7773 | 0.8288 | 0.7303 | 0.767 | 0.834 | 0.8744 |
| 0.4163 | 56.0 | 56000 | 0.3055 | 0.7775 | 0.9568 | 0.9054 | 0.2209 | 0.7647 | 0.784 | 0.3201 | 0.8194 | 0.8221 | 0.3228 | 0.8135 | 0.8257 | 0.7653 | 0.8181 | 0.7328 | 0.7789 | 0.8343 | 0.8694 |
| 0.4013 | 57.0 | 57000 | 0.3098 | 0.7777 | 0.9568 | 0.9094 | 0.2633 | 0.764 | 0.8032 | 0.3181 | 0.8165 | 0.8202 | 0.3683 | 0.8118 | 0.8402 | 0.7674 | 0.8219 | 0.7292 | 0.7696 | 0.8365 | 0.8691 |
| 0.2877 | 58.0 | 58000 | 0.2941 | 0.7872 | 0.9582 | 0.9177 | 0.2115 | 0.781 | 0.7936 | 0.3202 | 0.8214 | 0.8249 | 0.2915 | 0.8268 | 0.8291 | 0.7804 | 0.8294 | 0.7313 | 0.7629 | 0.8499 | 0.8825 |
| 0.4487 | 59.0 | 59000 | 0.2830 | 0.7979 | 0.9649 | 0.9219 | 0.2545 | 0.7887 | 0.8174 | 0.322 | 0.835 | 0.8389 | 0.377 | 0.8374 | 0.8519 | 0.7902 | 0.8378 | 0.7438 | 0.7881 | 0.8597 | 0.8907 |
| 0.3549 | 60.0 | 60000 | 0.3007 | 0.7947 | 0.9604 | 0.9171 | 0.3052 | 0.7754 | 0.8257 | 0.3241 | 0.8311 | 0.8339 | 0.3913 | 0.8184 | 0.8606 | 0.7931 | 0.8414 | 0.7457 | 0.783 | 0.8452 | 0.8771 |
| 0.3725 | 61.0 | 61000 | 0.3180 | 0.7636 | 0.9581 | 0.9079 | 0.2728 | 0.7514 | 0.7981 | 0.3094 | 0.8045 | 0.8077 | 0.3622 | 0.798 | 0.8388 | 0.7511 | 0.8074 | 0.701 | 0.7433 | 0.8388 | 0.8723 |
| 0.3535 | 62.0 | 62000 | 0.3056 | 0.7781 | 0.962 | 0.9129 | 0.2116 | 0.7632 | 0.8043 | 0.3189 | 0.8165 | 0.8191 | 0.3281 | 0.8074 | 0.8377 | 0.7639 | 0.8173 | 0.7345 | 0.7686 | 0.8359 | 0.8713 |
| 0.4038 | 63.0 | 63000 | 0.2947 | 0.7899 | 0.9584 | 0.9148 | 0.2826 | 0.7853 | 0.8087 | 0.3239 | 0.8263 | 0.8289 | 0.3443 | 0.8273 | 0.8401 | 0.7778 | 0.8282 | 0.7416 | 0.7784 | 0.8502 | 0.8802 |
| 0.4424 | 64.0 | 64000 | 0.2922 | 0.7939 | 0.9606 | 0.9113 | 0.241 | 0.7862 | 0.817 | 0.3225 | 0.8305 | 0.8342 | 0.3485 | 0.8289 | 0.8462 | 0.7892 | 0.8402 | 0.742 | 0.7804 | 0.8505 | 0.882 |
| 0.3878 | 65.0 | 65000 | 0.2872 | 0.799 | 0.9605 | 0.9221 | 0.2855 | 0.7922 | 0.8179 | 0.3216 | 0.8324 | 0.8358 | 0.3819 | 0.8297 | 0.8544 | 0.7892 | 0.837 | 0.7498 | 0.7809 | 0.858 | 0.8895 |
| 0.3628 | 66.0 | 66000 | 0.2975 | 0.7879 | 0.9592 | 0.9135 | 0.2664 | 0.7765 | 0.8108 | 0.3237 | 0.8232 | 0.8273 | 0.3577 | 0.8191 | 0.8456 | 0.791 | 0.8364 | 0.7272 | 0.7655 | 0.8456 | 0.8799 |
| 0.3657 | 67.0 | 67000 | 0.2714 | 0.8096 | 0.9656 | 0.9242 | 0.2913 | 0.799 | 0.8191 | 0.3297 | 0.8438 | 0.8486 | 0.4078 | 0.8418 | 0.8561 | 0.8102 | 0.8543 | 0.7621 | 0.8036 | 0.8566 | 0.8879 |
| 0.3254 | 68.0 | 68000 | 0.2934 | 0.793 | 0.9615 | 0.919 | 0.2837 | 0.7855 | 0.8032 | 0.3232 | 0.8291 | 0.8325 | 0.3423 | 0.8287 | 0.8382 | 0.7734 | 0.8199 | 0.7553 | 0.7948 | 0.8502 | 0.8828 |
| 0.4403 | 69.0 | 69000 | 0.2675 | 0.8138 | 0.9639 | 0.9218 | 0.2836 | 0.8104 | 0.8154 | 0.3296 | 0.8477 | 0.8539 | 0.4037 | 0.8523 | 0.856 | 0.8099 | 0.8592 | 0.7637 | 0.801 | 0.8677 | 0.9016 |
| 0.3707 | 70.0 | 70000 | 0.2790 | 0.8017 | 0.9582 | 0.923 | 0.2908 | 0.7982 | 0.8042 | 0.3243 | 0.8376 | 0.8421 | 0.398 | 0.8389 | 0.8458 | 0.8024 | 0.8515 | 0.7398 | 0.7809 | 0.8629 | 0.8939 |
| 0.3141 | 71.0 | 71000 | 0.2897 | 0.791 | 0.9656 | 0.9244 | 0.2887 | 0.7846 | 0.81 | 0.3221 | 0.8283 | 0.8329 | 0.3876 | 0.8297 | 0.8479 | 0.7686 | 0.8171 | 0.7479 | 0.7892 | 0.8564 | 0.8924 |
| 0.3931 | 72.0 | 72000 | 0.2896 | 0.7999 | 0.9558 | 0.9205 | 0.2829 | 0.7915 | 0.8152 | 0.3241 | 0.8333 | 0.8371 | 0.3537 | 0.8314 | 0.8531 | 0.7957 | 0.8416 | 0.7506 | 0.784 | 0.8534 | 0.8857 |
| 0.3108 | 73.0 | 73000 | 0.2754 | 0.8067 | 0.9674 | 0.9317 | 0.309 | 0.8031 | 0.802 | 0.3267 | 0.8413 | 0.8465 | 0.4119 | 0.8443 | 0.8441 | 0.7807 | 0.8288 | 0.7709 | 0.8124 | 0.8684 | 0.8983 |
| 0.3259 | 74.0 | 74000 | 0.2741 | 0.8073 | 0.9645 | 0.9314 | 0.2921 | 0.7985 | 0.8422 | 0.327 | 0.8429 | 0.8477 | 0.4056 | 0.8404 | 0.8717 | 0.8031 | 0.8497 | 0.7619 | 0.8031 | 0.857 | 0.8902 |
| 0.3673 | 75.0 | 75000 | 0.2774 | 0.8075 | 0.9697 | 0.9246 | 0.2445 | 0.8 | 0.822 | 0.3252 | 0.8427 | 0.8478 | 0.4028 | 0.8408 | 0.8602 | 0.7883 | 0.8348 | 0.7663 | 0.8098 | 0.8681 | 0.8988 |
| 0.3785 | 76.0 | 76000 | 0.2867 | 0.8005 | 0.962 | 0.9259 | 0.2625 | 0.7912 | 0.8152 | 0.3243 | 0.8353 | 0.8393 | 0.3776 | 0.8332 | 0.8547 | 0.789 | 0.8382 | 0.7507 | 0.7887 | 0.8619 | 0.8911 |
| 0.3441 | 77.0 | 77000 | 0.2762 | 0.8081 | 0.9584 | 0.9263 | 0.3086 | 0.802 | 0.8231 | 0.3269 | 0.8434 | 0.8469 | 0.3839 | 0.8449 | 0.8575 | 0.7948 | 0.8435 | 0.7573 | 0.7964 | 0.8721 | 0.9009 |
| 0.344 | 78.0 | 78000 | 0.2863 | 0.8003 | 0.9617 | 0.9286 | 0.2783 | 0.793 | 0.8183 | 0.3227 | 0.8338 | 0.8376 | 0.3869 | 0.8327 | 0.8544 | 0.785 | 0.8336 | 0.751 | 0.7866 | 0.8648 | 0.8926 |
| 0.339 | 79.0 | 79000 | 0.2687 | 0.8186 | 0.9613 | 0.9276 | 0.2672 | 0.8113 | 0.8268 | 0.3303 | 0.8499 | 0.8542 | 0.3919 | 0.85 | 0.8578 | 0.8079 | 0.8519 | 0.7733 | 0.8098 | 0.8747 | 0.901 |
| 0.2642 | 80.0 | 80000 | 0.2587 | 0.8227 | 0.9658 | 0.9254 | 0.2459 | 0.8182 | 0.8269 | 0.3321 | 0.8543 | 0.8583 | 0.39 | 0.8576 | 0.858 | 0.8123 | 0.8551 | 0.78 | 0.8165 | 0.8757 | 0.9033 |
| 0.3122 | 81.0 | 81000 | 0.2692 | 0.8148 | 0.9609 | 0.9193 | 0.2916 | 0.8076 | 0.8143 | 0.3303 | 0.8479 | 0.8517 | 0.3864 | 0.8495 | 0.851 | 0.8096 | 0.8557 | 0.7596 | 0.7974 | 0.8751 | 0.902 |
| 0.3475 | 82.0 | 82000 | 0.2805 | 0.797 | 0.9577 | 0.9257 | 0.2179 | 0.789 | 0.816 | 0.3231 | 0.8329 | 0.8366 | 0.3324 | 0.831 | 0.8536 | 0.7938 | 0.8421 | 0.7365 | 0.7789 | 0.8607 | 0.8888 |
| 0.4223 | 83.0 | 83000 | 0.2652 | 0.8082 | 0.9656 | 0.9314 | 0.2965 | 0.8025 | 0.8239 | 0.3252 | 0.8455 | 0.8485 | 0.3701 | 0.8477 | 0.8643 | 0.797 | 0.8447 | 0.7561 | 0.8005 | 0.8714 | 0.9003 |
| 0.3136 | 84.0 | 84000 | 0.2655 | 0.8157 | 0.9692 | 0.9236 | 0.3004 | 0.8136 | 0.8269 | 0.3272 | 0.85 | 0.8539 | 0.4066 | 0.8524 | 0.8615 | 0.8092 | 0.8573 | 0.7676 | 0.8031 | 0.8703 | 0.9012 |
| 0.2849 | 85.0 | 85000 | 0.2659 | 0.8126 | 0.9654 | 0.9284 | 0.2584 | 0.8039 | 0.8435 | 0.3268 | 0.8465 | 0.85 | 0.3664 | 0.8451 | 0.8762 | 0.7995 | 0.8459 | 0.7634 | 0.7995 | 0.8748 | 0.9045 |
| 0.3634 | 86.0 | 86000 | 0.2642 | 0.8194 | 0.9604 | 0.9231 | 0.2498 | 0.8143 | 0.8379 | 0.3308 | 0.8509 | 0.8538 | 0.2906 | 0.8519 | 0.8711 | 0.8109 | 0.8541 | 0.7685 | 0.8021 | 0.8789 | 0.9051 |
| 0.4086 | 87.0 | 87000 | 0.2655 | 0.8124 | 0.9649 | 0.924 | 0.3069 | 0.8076 | 0.8237 | 0.3279 | 0.8446 | 0.8484 | 0.3824 | 0.845 | 0.8606 | 0.7972 | 0.8421 | 0.7569 | 0.7938 | 0.8832 | 0.9095 |
| 0.3238 | 88.0 | 88000 | 0.2543 | 0.822 | 0.9675 | 0.9315 | 0.2809 | 0.8157 | 0.8393 | 0.3303 | 0.8551 | 0.8586 | 0.3553 | 0.8548 | 0.8723 | 0.8092 | 0.8541 | 0.7755 | 0.8144 | 0.8814 | 0.9073 |
| 0.465 | 89.0 | 89000 | 0.2690 | 0.818 | 0.9661 | 0.9275 | 0.3431 | 0.8111 | 0.8314 | 0.3274 | 0.8482 | 0.8524 | 0.4379 | 0.8484 | 0.8649 | 0.7979 | 0.8396 | 0.7725 | 0.8046 | 0.8835 | 0.913 |
| 0.37 | 90.0 | 90000 | 0.2602 | 0.8221 | 0.9593 | 0.9235 | 0.2761 | 0.8197 | 0.8417 | 0.3301 | 0.8531 | 0.8564 | 0.3389 | 0.8569 | 0.8744 | 0.8138 | 0.8584 | 0.7712 | 0.801 | 0.8813 | 0.9098 |
| 0.3063 | 91.0 | 91000 | 0.2617 | 0.8144 | 0.9619 | 0.9289 | 0.3302 | 0.8134 | 0.8147 | 0.3274 | 0.8485 | 0.8515 | 0.4172 | 0.8521 | 0.8532 | 0.8069 | 0.8575 | 0.7583 | 0.7902 | 0.8779 | 0.9068 |
| 0.2721 | 92.0 | 92000 | 0.2699 | 0.8123 | 0.961 | 0.9226 | 0.2827 | 0.8072 | 0.8225 | 0.3289 | 0.8461 | 0.8492 | 0.3659 | 0.848 | 0.86 | 0.8041 | 0.8467 | 0.7592 | 0.8 | 0.8736 | 0.9009 |
| 0.2704 | 93.0 | 93000 | 0.2531 | 0.8251 | 0.9587 | 0.9243 | 0.3119 | 0.8192 | 0.8461 | 0.3351 | 0.8579 | 0.8611 | 0.3758 | 0.8575 | 0.8786 | 0.8164 | 0.864 | 0.7768 | 0.8113 | 0.882 | 0.908 |
| 0.3274 | 94.0 | 94000 | 0.2599 | 0.8229 | 0.968 | 0.9292 | 0.3198 | 0.8176 | 0.8409 | 0.3306 | 0.8551 | 0.8583 | 0.4022 | 0.8564 | 0.8716 | 0.8147 | 0.8602 | 0.7744 | 0.8082 | 0.8797 | 0.9066 |
| 0.3198 | 95.0 | 95000 | 0.2561 | 0.8264 | 0.9661 | 0.9288 | 0.303 | 0.8232 | 0.8412 | 0.3327 | 0.859 | 0.8625 | 0.3882 | 0.8619 | 0.8739 | 0.8153 | 0.8628 | 0.7869 | 0.8191 | 0.8768 | 0.9057 |
| 0.3286 | 96.0 | 96000 | 0.2624 | 0.8178 | 0.9671 | 0.9258 | 0.3039 | 0.8123 | 0.8334 | 0.3303 | 0.8535 | 0.8567 | 0.3876 | 0.8552 | 0.87 | 0.817 | 0.8624 | 0.7705 | 0.8082 | 0.8658 | 0.8996 |
| 0.35 | 97.0 | 97000 | 0.2436 | 0.8304 | 0.9659 | 0.9291 | 0.3281 | 0.8274 | 0.8375 | 0.3336 | 0.866 | 0.8697 | 0.4291 | 0.8701 | 0.8707 | 0.8168 | 0.8644 | 0.7886 | 0.8299 | 0.8857 | 0.9147 |
| 0.3377 | 98.0 | 98000 | 0.2626 | 0.819 | 0.9551 | 0.921 | 0.294 | 0.8177 | 0.819 | 0.3325 | 0.8525 | 0.8559 | 0.3603 | 0.8563 | 0.8551 | 0.8114 | 0.8588 | 0.7658 | 0.8005 | 0.8798 | 0.9083 |
| 0.3617 | 99.0 | 99000 | 0.2673 | 0.8161 | 0.9609 | 0.9234 | 0.2854 | 0.8086 | 0.8305 | 0.3311 | 0.8506 | 0.8545 | 0.3632 | 0.8498 | 0.87 | 0.8023 | 0.8485 | 0.7725 | 0.8134 | 0.8736 | 0.9016 |
| 0.364 | 100.0 | 100000 | 0.2605 | 0.824 | 0.9626 | 0.9338 | 0.323 | 0.8195 | 0.8453 | 0.3306 | 0.8547 | 0.8589 | 0.4057 | 0.8573 | 0.8753 | 0.8203 | 0.861 | 0.7746 | 0.8093 | 0.8771 | 0.9064 |
| 0.3617 | 101.0 | 101000 | 0.2504 | 0.8272 | 0.9633 | 0.9247 | 0.2991 | 0.8225 | 0.8427 | 0.3337 | 0.859 | 0.8635 | 0.3882 | 0.8625 | 0.8814 | 0.8188 | 0.862 | 0.7768 | 0.8139 | 0.886 | 0.9146 |
| 0.2855 | 102.0 | 102000 | 0.2508 | 0.8213 | 0.9647 | 0.9329 | 0.3075 | 0.8192 | 0.8351 | 0.3294 | 0.8561 | 0.86 | 0.392 | 0.8619 | 0.8727 | 0.8165 | 0.865 | 0.7631 | 0.8031 | 0.8843 | 0.9118 |
| 0.3384 | 103.0 | 103000 | 0.2512 | 0.828 | 0.9619 | 0.934 | 0.3208 | 0.8276 | 0.8439 | 0.3343 | 0.8606 | 0.8646 | 0.4164 | 0.8657 | 0.8757 | 0.8158 | 0.8602 | 0.7848 | 0.8216 | 0.8833 | 0.9119 |
| 0.3331 | 104.0 | 104000 | 0.2545 | 0.8195 | 0.9621 | 0.9333 | 0.3061 | 0.8183 | 0.8263 | 0.331 | 0.8535 | 0.8564 | 0.4039 | 0.8573 | 0.8602 | 0.813 | 0.8569 | 0.7652 | 0.8052 | 0.8803 | 0.9071 |
| 0.3158 | 105.0 | 105000 | 0.2531 | 0.8304 | 0.9634 | 0.9254 | 0.2998 | 0.8263 | 0.8411 | 0.3356 | 0.8632 | 0.8671 | 0.3968 | 0.865 | 0.8794 | 0.8222 | 0.867 | 0.7788 | 0.8196 | 0.89 | 0.9148 |
| 0.301 | 106.0 | 106000 | 0.2596 | 0.8221 | 0.961 | 0.9276 | 0.2968 | 0.8191 | 0.8335 | 0.3325 | 0.8551 | 0.8587 | 0.3629 | 0.8586 | 0.8703 | 0.8082 | 0.8543 | 0.7767 | 0.8149 | 0.8815 | 0.9067 |
| 0.3579 | 107.0 | 107000 | 0.2434 | 0.8339 | 0.9647 | 0.9304 | 0.3098 | 0.832 | 0.8464 | 0.3344 | 0.8641 | 0.8685 | 0.3989 | 0.8705 | 0.877 | 0.8228 | 0.8678 | 0.7883 | 0.8227 | 0.8905 | 0.915 |
| 0.3682 | 108.0 | 108000 | 0.2440 | 0.8324 | 0.9624 | 0.9341 | 0.3185 | 0.8311 | 0.8428 | 0.3348 | 0.8644 | 0.8681 | 0.404 | 0.8697 | 0.8751 | 0.8208 | 0.8648 | 0.7831 | 0.8227 | 0.8932 | 0.9167 |
| 0.3234 | 109.0 | 109000 | 0.2532 | 0.8224 | 0.9605 | 0.9338 | 0.3491 | 0.8203 | 0.8253 | 0.3316 | 0.8569 | 0.8609 | 0.4301 | 0.8603 | 0.8643 | 0.8134 | 0.8622 | 0.7699 | 0.8108 | 0.884 | 0.9098 |
| 0.3412 | 110.0 | 110000 | 0.2400 | 0.8375 | 0.966 | 0.9346 | 0.3187 | 0.8398 | 0.8387 | 0.3368 | 0.8692 | 0.8733 | 0.4148 | 0.878 | 0.8725 | 0.829 | 0.8722 | 0.7912 | 0.8309 | 0.8923 | 0.9167 |
| 0.4866 | 111.0 | 111000 | 0.2558 | 0.8248 | 0.9649 | 0.9282 | 0.3024 | 0.8222 | 0.8365 | 0.3305 | 0.8537 | 0.8579 | 0.3818 | 0.858 | 0.8692 | 0.8161 | 0.8596 | 0.7728 | 0.8052 | 0.8855 | 0.9089 |
| 0.2781 | 112.0 | 112000 | 0.2461 | 0.8324 | 0.9616 | 0.9294 | 0.3114 | 0.8333 | 0.8408 | 0.3349 | 0.8623 | 0.8659 | 0.3745 | 0.8679 | 0.8706 | 0.8247 | 0.8686 | 0.7794 | 0.8124 | 0.893 | 0.9166 |
| 0.3233 | 113.0 | 113000 | 0.2467 | 0.8333 | 0.9634 | 0.9328 | 0.3187 | 0.8308 | 0.8462 | 0.3342 | 0.865 | 0.8689 | 0.4206 | 0.8696 | 0.8788 | 0.8201 | 0.867 | 0.7881 | 0.8222 | 0.8917 | 0.9176 |
| 0.2915 | 114.0 | 114000 | 0.2393 | 0.8366 | 0.9605 | 0.9276 | 0.3392 | 0.8387 | 0.8412 | 0.336 | 0.8677 | 0.8707 | 0.3934 | 0.8751 | 0.8747 | 0.8273 | 0.8734 | 0.79 | 0.8216 | 0.8925 | 0.917 |
| 0.3298 | 115.0 | 115000 | 0.2474 | 0.8323 | 0.9637 | 0.9268 | 0.3252 | 0.8334 | 0.8287 | 0.3324 | 0.8629 | 0.8667 | 0.4067 | 0.8709 | 0.8656 | 0.8222 | 0.8672 | 0.7779 | 0.8129 | 0.8966 | 0.9199 |
| 0.3928 | 116.0 | 116000 | 0.2425 | 0.8386 | 0.9669 | 0.9343 | 0.3302 | 0.8384 | 0.8459 | 0.336 | 0.8695 | 0.8739 | 0.4186 | 0.8764 | 0.879 | 0.8272 | 0.872 | 0.7927 | 0.8284 | 0.896 | 0.9213 |
| 0.3156 | 117.0 | 117000 | 0.2514 | 0.8285 | 0.9597 | 0.9311 | 0.3177 | 0.8317 | 0.827 | 0.3333 | 0.8605 | 0.8645 | 0.392 | 0.8681 | 0.8634 | 0.8205 | 0.8646 | 0.7744 | 0.8119 | 0.8907 | 0.917 |
| 0.3184 | 118.0 | 118000 | 0.2497 | 0.8294 | 0.9636 | 0.9301 | 0.3286 | 0.8321 | 0.8352 | 0.3334 | 0.8631 | 0.8669 | 0.3995 | 0.8691 | 0.8738 | 0.8231 | 0.8672 | 0.7759 | 0.817 | 0.8892 | 0.9166 |
| 0.2561 | 119.0 | 119000 | 0.2440 | 0.8339 | 0.963 | 0.929 | 0.3126 | 0.8324 | 0.8349 | 0.3359 | 0.8661 | 0.8701 | 0.3844 | 0.8711 | 0.8724 | 0.8266 | 0.8702 | 0.785 | 0.8232 | 0.8901 | 0.917 |
| 0.2776 | 120.0 | 120000 | 0.2442 | 0.8358 | 0.9622 | 0.9283 | 0.3166 | 0.8371 | 0.835 | 0.3366 | 0.8677 | 0.8717 | 0.3809 | 0.8759 | 0.8706 | 0.8278 | 0.8724 | 0.7874 | 0.8242 | 0.8921 | 0.9183 |
| 0.2591 | 121.0 | 121000 | 0.2473 | 0.8302 | 0.9627 | 0.93 | 0.3066 | 0.8283 | 0.8306 | 0.3334 | 0.8615 | 0.8653 | 0.3879 | 0.8663 | 0.8695 | 0.819 | 0.863 | 0.7774 | 0.8144 | 0.8942 | 0.9185 |
| 0.3241 | 122.0 | 122000 | 0.2518 | 0.831 | 0.9615 | 0.9294 | 0.3046 | 0.8319 | 0.8359 | 0.3329 | 0.8637 | 0.8668 | 0.3782 | 0.8693 | 0.8747 | 0.82 | 0.8662 | 0.7825 | 0.818 | 0.8905 | 0.9163 |
| 0.3599 | 123.0 | 123000 | 0.2362 | 0.84 | 0.9602 | 0.9317 | 0.3253 | 0.8382 | 0.8521 | 0.3385 | 0.8713 | 0.8747 | 0.3742 | 0.8767 | 0.8869 | 0.8325 | 0.8775 | 0.7897 | 0.8253 | 0.8978 | 0.9214 |
| 0.2938 | 124.0 | 124000 | 0.2403 | 0.84 | 0.9615 | 0.935 | 0.3254 | 0.8403 | 0.8457 | 0.337 | 0.8711 | 0.8742 | 0.3856 | 0.8781 | 0.8804 | 0.8314 | 0.8753 | 0.7929 | 0.8263 | 0.8958 | 0.921 |
| 0.2533 | 125.0 | 125000 | 0.2422 | 0.8363 | 0.9619 | 0.9322 | 0.3405 | 0.8376 | 0.8377 | 0.3355 | 0.8677 | 0.8709 | 0.4067 | 0.8744 | 0.8757 | 0.8245 | 0.8686 | 0.7883 | 0.8232 | 0.8962 | 0.9208 |
| 0.3822 | 126.0 | 126000 | 0.2427 | 0.8376 | 0.9645 | 0.9307 | 0.3178 | 0.8354 | 0.8477 | 0.3366 | 0.8695 | 0.8726 | 0.4005 | 0.8739 | 0.8793 | 0.8244 | 0.8696 | 0.7953 | 0.8299 | 0.8932 | 0.9182 |
| 0.3135 | 127.0 | 127000 | 0.2462 | 0.8335 | 0.9645 | 0.9305 | 0.3185 | 0.8314 | 0.8431 | 0.3348 | 0.8659 | 0.8691 | 0.4007 | 0.8697 | 0.8765 | 0.8224 | 0.8674 | 0.7884 | 0.8242 | 0.8898 | 0.9156 |
| 0.4718 | 128.0 | 128000 | 0.2414 | 0.8367 | 0.9644 | 0.9302 | 0.3142 | 0.8356 | 0.8441 | 0.3355 | 0.8685 | 0.8717 | 0.3959 | 0.8733 | 0.8785 | 0.8257 | 0.8702 | 0.7906 | 0.8258 | 0.8938 | 0.9191 |
| 0.2618 | 129.0 | 129000 | 0.2431 | 0.8372 | 0.9628 | 0.9374 | 0.3165 | 0.8355 | 0.8481 | 0.336 | 0.8684 | 0.8716 | 0.39 | 0.8742 | 0.88 | 0.8263 | 0.8688 | 0.7917 | 0.8268 | 0.8936 | 0.9191 |
| 0.3085 | 130.0 | 130000 | 0.2409 | 0.8389 | 0.9647 | 0.9338 | 0.3262 | 0.8395 | 0.8435 | 0.3365 | 0.8714 | 0.8748 | 0.4021 | 0.8795 | 0.8782 | 0.8298 | 0.8738 | 0.7931 | 0.8299 | 0.8938 | 0.9207 |
| 0.3682 | 131.0 | 131000 | 0.2402 | 0.8385 | 0.9641 | 0.9329 | 0.3243 | 0.8393 | 0.8421 | 0.3366 | 0.8699 | 0.8734 | 0.3914 | 0.8773 | 0.8787 | 0.8265 | 0.871 | 0.7923 | 0.8278 | 0.8966 | 0.9213 |
| 0.2999 | 132.0 | 132000 | 0.2411 | 0.8374 | 0.9641 | 0.9296 | 0.3115 | 0.8377 | 0.8448 | 0.3362 | 0.869 | 0.8722 | 0.3776 | 0.8761 | 0.8773 | 0.8243 | 0.8702 | 0.7922 | 0.8263 | 0.8957 | 0.9201 |
| 0.3393 | 133.0 | 133000 | 0.2399 | 0.839 | 0.9641 | 0.9345 | 0.3157 | 0.838 | 0.847 | 0.3372 | 0.87 | 0.8732 | 0.3913 | 0.8758 | 0.8797 | 0.8276 | 0.8706 | 0.7974 | 0.8309 | 0.8919 | 0.918 |
| 0.3064 | 134.0 | 134000 | 0.2377 | 0.8417 | 0.964 | 0.9382 | 0.3154 | 0.8393 | 0.8561 | 0.3387 | 0.8727 | 0.8754 | 0.3835 | 0.8772 | 0.8869 | 0.8329 | 0.8769 | 0.7981 | 0.8304 | 0.8943 | 0.9191 |
| 0.2612 | 135.0 | 135000 | 0.2375 | 0.8423 | 0.9644 | 0.9339 | 0.3126 | 0.842 | 0.849 | 0.3391 | 0.8727 | 0.8761 | 0.3803 | 0.8801 | 0.8812 | 0.8308 | 0.8748 | 0.7974 | 0.8314 | 0.8986 | 0.922 |
| 0.2906 | 136.0 | 136000 | 0.2385 | 0.8392 | 0.9641 | 0.9341 | 0.3276 | 0.8415 | 0.8443 | 0.3364 | 0.8705 | 0.874 | 0.3938 | 0.8796 | 0.8775 | 0.8299 | 0.8742 | 0.7909 | 0.8258 | 0.8969 | 0.9218 |
| 0.2954 | 137.0 | 137000 | 0.2363 | 0.8422 | 0.9641 | 0.9382 | 0.3286 | 0.8414 | 0.8493 | 0.3381 | 0.8725 | 0.8764 | 0.4059 | 0.8793 | 0.8806 | 0.8326 | 0.8757 | 0.7944 | 0.8304 | 0.8996 | 0.9231 |
| 0.304 | 138.0 | 138000 | 0.2413 | 0.8374 | 0.9641 | 0.934 | 0.3224 | 0.8376 | 0.8419 | 0.3362 | 0.8684 | 0.872 | 0.3962 | 0.8749 | 0.8758 | 0.8264 | 0.8712 | 0.7891 | 0.8253 | 0.8966 | 0.9195 |
| 0.2716 | 139.0 | 139000 | 0.2420 | 0.8401 | 0.964 | 0.9343 | 0.3285 | 0.8406 | 0.8493 | 0.3376 | 0.8709 | 0.8745 | 0.3978 | 0.8779 | 0.88 | 0.8264 | 0.8716 | 0.7967 | 0.8309 | 0.8971 | 0.9208 |
| 0.3027 | 140.0 | 140000 | 0.2401 | 0.8416 | 0.9644 | 0.9344 | 0.3279 | 0.8409 | 0.8482 | 0.3375 | 0.8715 | 0.8755 | 0.3965 | 0.8797 | 0.8787 | 0.8312 | 0.874 | 0.795 | 0.8304 | 0.8985 | 0.922 |
| 0.2667 | 141.0 | 141000 | 0.2400 | 0.8399 | 0.9641 | 0.9341 | 0.3252 | 0.8405 | 0.8463 | 0.3374 | 0.8704 | 0.8743 | 0.3943 | 0.8781 | 0.8801 | 0.8283 | 0.8728 | 0.7945 | 0.8294 | 0.8968 | 0.9208 |
| 0.2245 | 142.0 | 142000 | 0.2408 | 0.8404 | 0.9641 | 0.9337 | 0.3176 | 0.8406 | 0.8463 | 0.3371 | 0.8702 | 0.874 | 0.3929 | 0.8783 | 0.878 | 0.8276 | 0.8716 | 0.797 | 0.8304 | 0.8967 | 0.9201 |
| 0.3448 | 143.0 | 143000 | 0.2394 | 0.8421 | 0.9641 | 0.9344 | 0.3271 | 0.8427 | 0.851 | 0.3385 | 0.8725 | 0.8763 | 0.3978 | 0.8804 | 0.8815 | 0.8322 | 0.8753 | 0.7968 | 0.8325 | 0.8974 | 0.9213 |
| 0.3681 | 144.0 | 144000 | 0.2401 | 0.8413 | 0.9642 | 0.9341 | 0.3267 | 0.8413 | 0.8491 | 0.3377 | 0.8713 | 0.8749 | 0.396 | 0.879 | 0.8802 | 0.8309 | 0.8738 | 0.7958 | 0.8299 | 0.8974 | 0.921 |
| 0.2593 | 145.0 | 145000 | 0.2393 | 0.8414 | 0.9641 | 0.9344 | 0.3271 | 0.8411 | 0.8522 | 0.3378 | 0.8713 | 0.875 | 0.3947 | 0.879 | 0.8821 | 0.8313 | 0.8742 | 0.7955 | 0.8294 | 0.8973 | 0.9213 |
| 0.4266 | 146.0 | 146000 | 0.2398 | 0.8403 | 0.9641 | 0.934 | 0.3289 | 0.8405 | 0.8493 | 0.3368 | 0.8705 | 0.8745 | 0.4008 | 0.8786 | 0.8799 | 0.8302 | 0.8728 | 0.7933 | 0.8289 | 0.8974 | 0.9218 |
| 0.2411 | 147.0 | 147000 | 0.2392 | 0.8412 | 0.9641 | 0.934 | 0.3283 | 0.8403 | 0.8507 | 0.3374 | 0.8709 | 0.8747 | 0.3978 | 0.8787 | 0.8811 | 0.8315 | 0.8736 | 0.7949 | 0.8289 | 0.8973 | 0.9217 |
| 0.2675 | 148.0 | 148000 | 0.2393 | 0.8412 | 0.9641 | 0.934 | 0.3268 | 0.8407 | 0.8507 | 0.3374 | 0.871 | 0.8748 | 0.3947 | 0.879 | 0.881 | 0.8315 | 0.8738 | 0.795 | 0.8289 | 0.8973 | 0.9215 |
| 0.2945 | 149.0 | 149000 | 0.2394 | 0.8413 | 0.9641 | 0.9341 | 0.3268 | 0.8408 | 0.8507 | 0.3376 | 0.8711 | 0.8749 | 0.3947 | 0.8792 | 0.881 | 0.8309 | 0.8738 | 0.7956 | 0.8294 | 0.8973 | 0.9215 |
| 0.2848 | 150.0 | 150000 | 0.2394 | 0.8413 | 0.9641 | 0.9341 | 0.3268 | 0.8408 | 0.8507 | 0.3376 | 0.8711 | 0.8749 | 0.3947 | 0.8792 | 0.881 | 0.8309 | 0.8738 | 0.7956 | 0.8294 | 0.8973 | 0.9215 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.1
| [
"chicken",
"duck",
"plant"
] |
markytools/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2500
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
| [
"p1",
"p2",
"p3"
] |
Jeasun/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.44.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.19.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
madhutry/detr-finetuned-scrn-98samples-1 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
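The card leaves this section blank; as a hedged sketch (assuming a standard DETR object-detection head, with a placeholder image path and an arbitrary 0.5 score threshold), the checkpoint can be exercised through the `object-detection` pipeline:

```python
from transformers import pipeline

# Sketch only: assumes the checkpoint is a standard object-detection model.
detector = pipeline("object-detection", model="madhutry/detr-finetuned-scrn-98samples-1")

detections = detector("example.jpg", threshold=0.5)  # placeholder image path
for det in detections:
    print(det["label"], round(det["score"], 3), det["box"])
```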
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11"
] |
BjngChjjljng/detr-finetuned |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
BjngChjjljng/detr-finetuned_v2 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
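In the absence of an author-provided snippet, a minimal inference sketch is given below. It assumes the checkpoint follows the standard 🤗 Transformers object-detection layout, as the repository name (`detr-finetuned_v2`) suggests; the image path and the 0.5 score threshold are illustrative.
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Assumed to be a DETR-style object-detection checkpoint (see repository name).
checkpoint = "BjngChjjljng/detr-finetuned_v2"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")  # any local test image

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/box predictions into thresholded (score, label, box) triples.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```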
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
Sabbasi-11/results |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# results
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1423
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
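Assuming the run used the 🤗 `Trainer`, the hyperparameters above correspond roughly to the following `TrainingArguments`. This is a hedged sketch rather than the original training script: the `output_dir` is illustrative and every argument not listed on the card is left at its default.
```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="results",          # illustrative
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```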
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 2 | 5.4207 |
| No log | 2.0 | 4 | 3.5959 |
| No log | 3.0 | 6 | 2.6546 |
| No log | 4.0 | 8 | 2.5924 |
| 4.7135 | 5.0 | 10 | 2.4246 |
| 4.7135 | 6.0 | 12 | 2.3156 |
| 4.7135 | 7.0 | 14 | 2.0630 |
| 4.7135 | 8.0 | 16 | 1.8700 |
| 4.7135 | 9.0 | 18 | 1.9057 |
| 1.9751 | 10.0 | 20 | 1.8555 |
| 1.9751 | 11.0 | 22 | 1.7879 |
| 1.9751 | 12.0 | 24 | 1.6941 |
| 1.9751 | 13.0 | 26 | 1.6313 |
| 1.9751 | 14.0 | 28 | 1.5788 |
| 1.6067 | 15.0 | 30 | 1.5095 |
| 1.6067 | 16.0 | 32 | 1.4926 |
| 1.6067 | 17.0 | 34 | 1.3962 |
| 1.6067 | 18.0 | 36 | 1.3799 |
| 1.6067 | 19.0 | 38 | 1.4024 |
| 1.3628 | 20.0 | 40 | 1.3731 |
| 1.3628 | 21.0 | 42 | 1.3351 |
| 1.3628 | 22.0 | 44 | 1.2939 |
| 1.3628 | 23.0 | 46 | 1.3056 |
| 1.3628 | 24.0 | 48 | 1.2118 |
| 1.2051 | 25.0 | 50 | 1.1925 |
| 1.2051 | 26.0 | 52 | 1.1810 |
| 1.2051 | 27.0 | 54 | 1.1621 |
| 1.2051 | 28.0 | 56 | 1.1491 |
| 1.2051 | 29.0 | 58 | 1.1433 |
| 1.0991 | 30.0 | 60 | 1.1423 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.19.1
| [
"n/a",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"street sign",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"hat",
"backpack",
"umbrella",
"shoe",
"eye glasses",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"plate",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"mirror",
"dining table",
"window",
"desk",
"toilet",
"door",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"blender",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
joe611/chickens-composite-8044444-150-epochs-w-transform-metrics-test |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# chickens-composite-8044444-150-epochs-w-transform-metrics-test
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2398
- Map: 0.824
- Map 50: 0.9727
- Map 75: 0.9165
- Map Small: 0.4223
- Map Medium: 0.8232
- Map Large: 0.9015
- Mar 1: 0.305
- Mar 10: 0.8488
- Mar 100: 0.8578
- Mar Small: 0.5127
- Mar Medium: 0.8612
- Mar Large: 0.9219
- Map Chicken: 0.8138
- Mar 100 Chicken: 0.8522
- Map Duck: 0.7742
- Mar 100 Duck: 0.8072
- Map Plant: 0.884
- Mar 100 Plant: 0.9139
## Model description
More information needed
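What the card does state is a `facebook/detr-resnet-50` backbone fine-tuned on three classes (chicken, duck, plant). A hedged sketch of how such a head replacement is typically initialized is shown below; it reconstructs the setup from the card and is not the author's script.
```python
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Label set taken from the per-class metrics and label list on this card.
id2label = {0: "chicken", 1: "duck", 2: "plant"}
label2id = {name: idx for idx, name in id2label.items()}

processor = AutoImageProcessor.from_pretrained("facebook/detr-resnet-50")
model = AutoModelForObjectDetection.from_pretrained(
    "facebook/detr-resnet-50",
    id2label=id2label,
    label2id=label2id,
    ignore_mismatched_sizes=True,  # swap the 91-class COCO head for a 3-class head
)
```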
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 150
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.3258 | 1.0 | 500 | 1.4495 | 0.1028 | 0.1496 | 0.1164 | 0.0079 | 0.0455 | 0.1405 | 0.088 | 0.2289 | 0.2836 | 0.0714 | 0.2474 | 0.3068 | 0.0139 | 0.0627 | 0.0148 | 0.0629 | 0.2798 | 0.7251 |
| 1.2172 | 2.0 | 1000 | 1.2176 | 0.1438 | 0.2013 | 0.1592 | 0.0055 | 0.0876 | 0.2159 | 0.0789 | 0.2432 | 0.2699 | 0.0532 | 0.245 | 0.3211 | 0.0106 | 0.0346 | 0.0124 | 0.0299 | 0.4083 | 0.7451 |
| 1.1316 | 3.0 | 1500 | 1.1163 | 0.1459 | 0.2041 | 0.1661 | 0.0148 | 0.0912 | 0.1655 | 0.0757 | 0.2291 | 0.2558 | 0.0484 | 0.237 | 0.2575 | 0.0085 | 0.032 | 0.0 | 0.0 | 0.4291 | 0.7353 |
| 0.8258 | 4.0 | 2000 | 0.9967 | 0.2301 | 0.3246 | 0.2716 | 0.0105 | 0.1904 | 0.1937 | 0.1471 | 0.35 | 0.374 | 0.0571 | 0.3659 | 0.2702 | 0.0435 | 0.1123 | 0.1368 | 0.2402 | 0.5099 | 0.7694 |
| 0.8826 | 5.0 | 2500 | 0.9241 | 0.258 | 0.3617 | 0.3112 | 0.01 | 0.204 | 0.273 | 0.1345 | 0.3216 | 0.3326 | 0.1008 | 0.3088 | 0.2973 | 0.0403 | 0.1066 | 0.0816 | 0.1351 | 0.6521 | 0.7561 |
| 1.1515 | 6.0 | 3000 | 0.8450 | 0.3557 | 0.5129 | 0.4146 | 0.0094 | 0.3342 | 0.3448 | 0.1735 | 0.4761 | 0.4883 | 0.1294 | 0.4755 | 0.3812 | 0.2899 | 0.4996 | 0.0998 | 0.2093 | 0.6773 | 0.7561 |
| 0.8812 | 7.0 | 3500 | 0.7954 | 0.3922 | 0.5562 | 0.4736 | 0.0061 | 0.3716 | 0.4021 | 0.1557 | 0.4886 | 0.4982 | 0.073 | 0.4877 | 0.4682 | 0.4053 | 0.6456 | 0.0721 | 0.0918 | 0.6991 | 0.7572 |
| 0.7966 | 8.0 | 4000 | 0.7400 | 0.421 | 0.5833 | 0.4935 | 0.0221 | 0.3907 | 0.4147 | 0.1454 | 0.5 | 0.5065 | 0.1095 | 0.4871 | 0.4727 | 0.4963 | 0.6882 | 0.0323 | 0.0433 | 0.7345 | 0.7882 |
| 0.8197 | 9.0 | 4500 | 0.8054 | 0.3719 | 0.5572 | 0.4327 | 0.0144 | 0.3497 | 0.3799 | 0.1213 | 0.4525 | 0.4561 | 0.0484 | 0.4421 | 0.4387 | 0.4323 | 0.6368 | 0.0 | 0.001 | 0.6834 | 0.7303 |
| 0.6943 | 10.0 | 5000 | 0.7024 | 0.423 | 0.5892 | 0.501 | 0.0456 | 0.4037 | 0.4403 | 0.141 | 0.4909 | 0.4961 | 0.0754 | 0.4782 | 0.5038 | 0.5316 | 0.7018 | 0.0173 | 0.0155 | 0.72 | 0.7711 |
| 0.7918 | 11.0 | 5500 | 0.6560 | 0.4241 | 0.5829 | 0.5038 | 0.0122 | 0.3919 | 0.497 | 0.135 | 0.4891 | 0.4958 | 0.0944 | 0.4696 | 0.5622 | 0.5372 | 0.7088 | 0.0 | 0.0 | 0.7352 | 0.7786 |
| 0.7057 | 12.0 | 6000 | 0.6571 | 0.4207 | 0.5943 | 0.4955 | 0.0344 | 0.3927 | 0.4853 | 0.1325 | 0.4821 | 0.4873 | 0.1071 | 0.4669 | 0.5378 | 0.5155 | 0.6794 | 0.0139 | 0.0072 | 0.7327 | 0.7754 |
| 0.7291 | 13.0 | 6500 | 0.6155 | 0.442 | 0.5991 | 0.5201 | 0.0386 | 0.4151 | 0.4586 | 0.1355 | 0.4977 | 0.5045 | 0.1476 | 0.4859 | 0.5374 | 0.5666 | 0.7105 | 0.0007 | 0.001 | 0.7586 | 0.802 |
| 0.6841 | 14.0 | 7000 | 0.6251 | 0.4453 | 0.6092 | 0.5368 | 0.0301 | 0.4199 | 0.4712 | 0.1436 | 0.4932 | 0.4982 | 0.15 | 0.4751 | 0.5405 | 0.5808 | 0.7039 | 0.0158 | 0.0082 | 0.7392 | 0.7824 |
| 0.7706 | 15.0 | 7500 | 0.5625 | 0.4656 | 0.625 | 0.5586 | 0.0519 | 0.4389 | 0.4879 | 0.1497 | 0.5127 | 0.519 | 0.1302 | 0.4962 | 0.548 | 0.6247 | 0.736 | 0.015 | 0.0196 | 0.7571 | 0.8014 |
| 0.5477 | 16.0 | 8000 | 0.6342 | 0.441 | 0.6213 | 0.5428 | 0.0194 | 0.4154 | 0.4709 | 0.1344 | 0.4847 | 0.4929 | 0.1421 | 0.478 | 0.509 | 0.5621 | 0.6754 | 0.0139 | 0.0072 | 0.7471 | 0.796 |
| 0.5659 | 17.0 | 8500 | 0.5588 | 0.486 | 0.6467 | 0.5788 | 0.0506 | 0.456 | 0.5136 | 0.1568 | 0.52 | 0.5281 | 0.1976 | 0.5092 | 0.5522 | 0.6651 | 0.7465 | 0.0316 | 0.0278 | 0.7614 | 0.8101 |
| 0.5889 | 18.0 | 9000 | 0.5632 | 0.5271 | 0.7073 | 0.6332 | 0.0837 | 0.499 | 0.5429 | 0.187 | 0.5601 | 0.5648 | 0.1543 | 0.5438 | 0.5602 | 0.6564 | 0.7184 | 0.1669 | 0.1763 | 0.758 | 0.7997 |
| 0.6217 | 19.0 | 9500 | 0.5608 | 0.5621 | 0.7535 | 0.666 | 0.0361 | 0.5417 | 0.5104 | 0.2008 | 0.5924 | 0.5971 | 0.1071 | 0.5822 | 0.535 | 0.675 | 0.7289 | 0.2595 | 0.2691 | 0.7518 | 0.7934 |
| 0.5486 | 20.0 | 10000 | 0.5374 | 0.5916 | 0.7987 | 0.716 | 0.0318 | 0.5746 | 0.5224 | 0.2156 | 0.6249 | 0.6302 | 0.1016 | 0.6171 | 0.5488 | 0.6805 | 0.7404 | 0.3386 | 0.3557 | 0.7556 | 0.7945 |
| 0.7596 | 21.0 | 10500 | 0.5062 | 0.6677 | 0.8908 | 0.815 | 0.1271 | 0.6657 | 0.5097 | 0.2509 | 0.7101 | 0.7147 | 0.1984 | 0.7203 | 0.5405 | 0.6848 | 0.7434 | 0.5532 | 0.5907 | 0.7651 | 0.8098 |
| 0.6202 | 22.0 | 11000 | 0.4852 | 0.6878 | 0.919 | 0.847 | 0.0815 | 0.6882 | 0.672 | 0.2688 | 0.7283 | 0.7339 | 0.2248 | 0.7411 | 0.6929 | 0.6872 | 0.7364 | 0.5974 | 0.6423 | 0.779 | 0.8231 |
| 0.5244 | 23.0 | 11500 | 0.4770 | 0.6907 | 0.9157 | 0.8412 | 0.1031 | 0.6915 | 0.6892 | 0.2629 | 0.7257 | 0.7324 | 0.2208 | 0.7444 | 0.7067 | 0.7031 | 0.7461 | 0.606 | 0.6443 | 0.763 | 0.8069 |
| 0.5388 | 24.0 | 12000 | 0.4780 | 0.6913 | 0.9393 | 0.8303 | 0.0933 | 0.6918 | 0.8095 | 0.2723 | 0.7367 | 0.7462 | 0.2711 | 0.7496 | 0.8402 | 0.6766 | 0.7338 | 0.6428 | 0.6969 | 0.7545 | 0.8078 |
| 0.5626 | 25.0 | 12500 | 0.4525 | 0.7036 | 0.942 | 0.8388 | 0.067 | 0.6948 | 0.7415 | 0.2692 | 0.7399 | 0.7482 | 0.2529 | 0.7493 | 0.7609 | 0.6981 | 0.7522 | 0.6246 | 0.667 | 0.7879 | 0.8254 |
| 0.5437 | 26.0 | 13000 | 0.4455 | 0.6936 | 0.942 | 0.8324 | 0.1783 | 0.6799 | 0.7869 | 0.2661 | 0.7313 | 0.7399 | 0.33 | 0.7337 | 0.8079 | 0.6602 | 0.7206 | 0.6347 | 0.6711 | 0.7859 | 0.828 |
| 0.4593 | 27.0 | 13500 | 0.4121 | 0.7152 | 0.9424 | 0.8505 | 0.1477 | 0.6971 | 0.8166 | 0.2728 | 0.7549 | 0.7608 | 0.3173 | 0.7484 | 0.8455 | 0.7176 | 0.768 | 0.6484 | 0.6938 | 0.7795 | 0.8205 |
| 0.443 | 28.0 | 14000 | 0.3894 | 0.7317 | 0.958 | 0.856 | 0.1687 | 0.7199 | 0.79 | 0.2815 | 0.7672 | 0.7745 | 0.3371 | 0.7672 | 0.8131 | 0.7262 | 0.7697 | 0.6736 | 0.7155 | 0.7952 | 0.8384 |
| 0.5382 | 29.0 | 14500 | 0.3988 | 0.7239 | 0.9536 | 0.8693 | 0.154 | 0.7142 | 0.784 | 0.2761 | 0.7592 | 0.7657 | 0.3257 | 0.7626 | 0.8013 | 0.7275 | 0.768 | 0.6591 | 0.7082 | 0.7851 | 0.8208 |
| 0.5577 | 30.0 | 15000 | 0.4155 | 0.7009 | 0.9393 | 0.838 | 0.1252 | 0.6847 | 0.7812 | 0.2699 | 0.7415 | 0.7478 | 0.2841 | 0.7402 | 0.8342 | 0.6869 | 0.7456 | 0.6331 | 0.6722 | 0.7827 | 0.8257 |
| 0.4783 | 31.0 | 15500 | 0.3982 | 0.7292 | 0.9532 | 0.8735 | 0.1013 | 0.7186 | 0.842 | 0.2801 | 0.768 | 0.7785 | 0.3257 | 0.7694 | 0.8752 | 0.7144 | 0.7697 | 0.6747 | 0.7268 | 0.7986 | 0.839 |
| 0.6258 | 32.0 | 16000 | 0.4124 | 0.7153 | 0.9477 | 0.8638 | 0.1807 | 0.7081 | 0.8077 | 0.2802 | 0.751 | 0.7613 | 0.359 | 0.7584 | 0.8379 | 0.6965 | 0.7518 | 0.662 | 0.7093 | 0.7876 | 0.8228 |
| 0.5925 | 33.0 | 16500 | 0.4105 | 0.7075 | 0.9482 | 0.8358 | 0.1448 | 0.6937 | 0.7069 | 0.2756 | 0.7472 | 0.758 | 0.3602 | 0.7468 | 0.7396 | 0.6625 | 0.7263 | 0.6728 | 0.7165 | 0.7873 | 0.8312 |
| 0.4881 | 34.0 | 17000 | 0.3894 | 0.7286 | 0.943 | 0.8578 | 0.1267 | 0.7228 | 0.7782 | 0.2807 | 0.76 | 0.7698 | 0.2954 | 0.7682 | 0.8119 | 0.7401 | 0.7846 | 0.6453 | 0.6866 | 0.8004 | 0.8382 |
| 0.4997 | 35.0 | 17500 | 0.4187 | 0.6981 | 0.9429 | 0.8356 | 0.1088 | 0.6881 | 0.7562 | 0.2702 | 0.7402 | 0.7487 | 0.264 | 0.7429 | 0.793 | 0.6582 | 0.725 | 0.6499 | 0.6938 | 0.7861 | 0.8272 |
| 0.5615 | 36.0 | 18000 | 0.3873 | 0.7175 | 0.9394 | 0.8574 | 0.1488 | 0.7163 | 0.8008 | 0.2718 | 0.7508 | 0.762 | 0.2957 | 0.7638 | 0.8383 | 0.7029 | 0.761 | 0.6459 | 0.6773 | 0.8037 | 0.8477 |
| 0.4264 | 37.0 | 18500 | 0.3824 | 0.7249 | 0.9506 | 0.8461 | 0.1428 | 0.7257 | 0.7932 | 0.277 | 0.7608 | 0.771 | 0.294 | 0.7772 | 0.8397 | 0.7128 | 0.7671 | 0.6637 | 0.7021 | 0.7982 | 0.8439 |
| 0.483 | 38.0 | 19000 | 0.4033 | 0.7047 | 0.948 | 0.8554 | 0.1284 | 0.6928 | 0.7695 | 0.2696 | 0.7413 | 0.7532 | 0.3059 | 0.7505 | 0.8006 | 0.6985 | 0.7478 | 0.6188 | 0.667 | 0.7967 | 0.8448 |
| 0.494 | 39.0 | 19500 | 0.3950 | 0.7128 | 0.9568 | 0.8444 | 0.1775 | 0.6964 | 0.7949 | 0.2714 | 0.7482 | 0.7608 | 0.3522 | 0.7512 | 0.828 | 0.7056 | 0.7592 | 0.6511 | 0.7 | 0.7815 | 0.8231 |
| 0.4404 | 40.0 | 20000 | 0.3621 | 0.748 | 0.9601 | 0.8748 | 0.1922 | 0.7384 | 0.8074 | 0.2814 | 0.784 | 0.7923 | 0.3511 | 0.7915 | 0.8372 | 0.746 | 0.7943 | 0.7071 | 0.7495 | 0.7909 | 0.8332 |
| 0.4331 | 41.0 | 20500 | 0.3805 | 0.7126 | 0.9569 | 0.8584 | 0.1783 | 0.7077 | 0.8299 | 0.2718 | 0.7491 | 0.7585 | 0.2976 | 0.7608 | 0.8553 | 0.6886 | 0.7434 | 0.6482 | 0.6928 | 0.8009 | 0.8393 |
| 0.4512 | 42.0 | 21000 | 0.3552 | 0.734 | 0.9532 | 0.8672 | 0.1147 | 0.7298 | 0.8301 | 0.2789 | 0.7722 | 0.7826 | 0.2963 | 0.782 | 0.8568 | 0.7234 | 0.7811 | 0.6643 | 0.7155 | 0.8144 | 0.8512 |
| 0.5316 | 43.0 | 21500 | 0.3719 | 0.7167 | 0.9517 | 0.8343 | 0.203 | 0.7047 | 0.855 | 0.2734 | 0.756 | 0.7657 | 0.3625 | 0.7612 | 0.8785 | 0.7022 | 0.7662 | 0.6293 | 0.6742 | 0.8184 | 0.8566 |
| 0.4831 | 44.0 | 22000 | 0.3764 | 0.7162 | 0.9592 | 0.8589 | 0.164 | 0.6979 | 0.7871 | 0.2724 | 0.7591 | 0.7676 | 0.347 | 0.7586 | 0.8204 | 0.6976 | 0.7632 | 0.6524 | 0.7021 | 0.7987 | 0.8376 |
| 0.4695 | 45.0 | 22500 | 0.3649 | 0.7209 | 0.953 | 0.8495 | 0.2409 | 0.7035 | 0.8185 | 0.2745 | 0.7626 | 0.7713 | 0.4152 | 0.7601 | 0.8455 | 0.6928 | 0.7575 | 0.6664 | 0.7144 | 0.8034 | 0.8419 |
| 0.4496 | 46.0 | 23000 | 0.3299 | 0.7595 | 0.961 | 0.8566 | 0.1556 | 0.7544 | 0.8008 | 0.2878 | 0.7909 | 0.8002 | 0.3222 | 0.8011 | 0.835 | 0.7521 | 0.8031 | 0.7055 | 0.7361 | 0.821 | 0.8616 |
| 0.5266 | 47.0 | 23500 | 0.3426 | 0.7417 | 0.9492 | 0.8594 | 0.1904 | 0.7334 | 0.8306 | 0.2857 | 0.7793 | 0.7896 | 0.3432 | 0.7875 | 0.8591 | 0.7259 | 0.789 | 0.6848 | 0.7216 | 0.8144 | 0.8581 |
| 0.4198 | 48.0 | 24000 | 0.3321 | 0.7541 | 0.958 | 0.8727 | 0.2462 | 0.741 | 0.8369 | 0.2848 | 0.7888 | 0.7966 | 0.3871 | 0.7899 | 0.8648 | 0.7331 | 0.7908 | 0.704 | 0.7361 | 0.8254 | 0.863 |
| 0.3626 | 49.0 | 24500 | 0.3264 | 0.7526 | 0.9569 | 0.8613 | 0.1831 | 0.7531 | 0.7752 | 0.2867 | 0.7845 | 0.7938 | 0.331 | 0.7976 | 0.8121 | 0.757 | 0.8079 | 0.6785 | 0.7144 | 0.8224 | 0.859 |
| 0.5079 | 50.0 | 25000 | 0.3391 | 0.7483 | 0.9632 | 0.877 | 0.2876 | 0.743 | 0.825 | 0.2848 | 0.7823 | 0.7915 | 0.3716 | 0.7893 | 0.8643 | 0.7183 | 0.7732 | 0.6981 | 0.7371 | 0.8285 | 0.8642 |
| 0.4655 | 51.0 | 25500 | 0.3227 | 0.769 | 0.9639 | 0.8888 | 0.1495 | 0.7743 | 0.7826 | 0.2865 | 0.8006 | 0.8104 | 0.3386 | 0.8163 | 0.8161 | 0.7759 | 0.8219 | 0.711 | 0.7495 | 0.8203 | 0.8598 |
| 0.4147 | 52.0 | 26000 | 0.3369 | 0.7476 | 0.9649 | 0.8747 | 0.2097 | 0.7431 | 0.7928 | 0.2805 | 0.7806 | 0.7893 | 0.3446 | 0.7865 | 0.8191 | 0.7364 | 0.7807 | 0.6806 | 0.7227 | 0.826 | 0.8645 |
| 0.3522 | 53.0 | 26500 | 0.3338 | 0.7504 | 0.964 | 0.8823 | 0.2502 | 0.7382 | 0.8241 | 0.2841 | 0.7822 | 0.7902 | 0.3556 | 0.7838 | 0.8573 | 0.7311 | 0.7803 | 0.6929 | 0.7278 | 0.8272 | 0.8624 |
| 0.5069 | 54.0 | 27000 | 0.3415 | 0.742 | 0.9592 | 0.8822 | 0.161 | 0.7375 | 0.768 | 0.2819 | 0.7726 | 0.7821 | 0.3387 | 0.7823 | 0.7963 | 0.7358 | 0.7833 | 0.6772 | 0.7155 | 0.8129 | 0.8474 |
| 0.4392 | 55.0 | 27500 | 0.3394 | 0.743 | 0.9635 | 0.8703 | 0.1928 | 0.7382 | 0.7685 | 0.2788 | 0.7778 | 0.7882 | 0.3948 | 0.7849 | 0.8008 | 0.715 | 0.7715 | 0.691 | 0.7309 | 0.8229 | 0.8621 |
| 0.3583 | 56.0 | 28000 | 0.3197 | 0.7741 | 0.9549 | 0.8902 | 0.1839 | 0.7766 | 0.826 | 0.292 | 0.8031 | 0.8146 | 0.3824 | 0.8209 | 0.8559 | 0.761 | 0.8057 | 0.7256 | 0.766 | 0.8358 | 0.8723 |
| 0.4463 | 57.0 | 28500 | 0.3288 | 0.7552 | 0.9614 | 0.8838 | 0.2159 | 0.7462 | 0.8201 | 0.2843 | 0.7881 | 0.801 | 0.4111 | 0.7975 | 0.8501 | 0.7623 | 0.8114 | 0.6776 | 0.7309 | 0.8256 | 0.8607 |
| 0.3672 | 58.0 | 29000 | 0.3026 | 0.7795 | 0.9689 | 0.878 | 0.2143 | 0.7773 | 0.8153 | 0.2964 | 0.8114 | 0.8214 | 0.3852 | 0.8216 | 0.8528 | 0.7779 | 0.8219 | 0.732 | 0.7753 | 0.8287 | 0.8671 |
| 0.3754 | 59.0 | 29500 | 0.3061 | 0.7658 | 0.9601 | 0.8813 | 0.1573 | 0.761 | 0.8092 | 0.2903 | 0.7974 | 0.8059 | 0.3337 | 0.8084 | 0.8381 | 0.7717 | 0.814 | 0.6843 | 0.7309 | 0.8414 | 0.8728 |
| 0.5756 | 60.0 | 30000 | 0.3290 | 0.7457 | 0.9618 | 0.8737 | 0.1866 | 0.7343 | 0.7888 | 0.2792 | 0.7776 | 0.7897 | 0.3741 | 0.7858 | 0.8148 | 0.7412 | 0.786 | 0.6674 | 0.7196 | 0.8287 | 0.8636 |
| 0.3744 | 61.0 | 30500 | 0.3089 | 0.7651 | 0.9677 | 0.8934 | 0.2155 | 0.7577 | 0.8068 | 0.2882 | 0.7962 | 0.8077 | 0.4148 | 0.8063 | 0.8347 | 0.7644 | 0.8118 | 0.7017 | 0.7464 | 0.8291 | 0.8647 |
| 0.45 | 62.0 | 31000 | 0.3171 | 0.7538 | 0.9736 | 0.8901 | 0.2144 | 0.7503 | 0.7972 | 0.2835 | 0.7896 | 0.7982 | 0.4362 | 0.798 | 0.8175 | 0.7557 | 0.7987 | 0.676 | 0.7278 | 0.8298 | 0.8682 |
| 0.4322 | 63.0 | 31500 | 0.3263 | 0.7569 | 0.9686 | 0.8958 | 0.2233 | 0.744 | 0.8092 | 0.2863 | 0.7881 | 0.7998 | 0.4005 | 0.7923 | 0.8367 | 0.7421 | 0.7868 | 0.7049 | 0.7474 | 0.8237 | 0.865 |
| 0.3976 | 64.0 | 32000 | 0.3135 | 0.7689 | 0.9586 | 0.8947 | 0.2697 | 0.7697 | 0.8111 | 0.2865 | 0.8001 | 0.8066 | 0.3951 | 0.8119 | 0.8434 | 0.7767 | 0.8118 | 0.7075 | 0.7474 | 0.8225 | 0.8604 |
| 0.4146 | 65.0 | 32500 | 0.3080 | 0.7638 | 0.9691 | 0.8854 | 0.3276 | 0.7586 | 0.8485 | 0.2831 | 0.7994 | 0.8059 | 0.4362 | 0.8056 | 0.8737 | 0.7392 | 0.7864 | 0.7146 | 0.7567 | 0.8374 | 0.8746 |
| 0.3933 | 66.0 | 33000 | 0.3147 | 0.7686 | 0.9718 | 0.8914 | 0.3364 | 0.7589 | 0.8308 | 0.2871 | 0.8003 | 0.8072 | 0.4403 | 0.8031 | 0.8662 | 0.7434 | 0.7904 | 0.7271 | 0.7588 | 0.8354 | 0.8725 |
| 0.3177 | 67.0 | 33500 | 0.3117 | 0.7689 | 0.9674 | 0.8938 | 0.3073 | 0.7627 | 0.8084 | 0.2878 | 0.7971 | 0.8053 | 0.42 | 0.8056 | 0.837 | 0.7578 | 0.7961 | 0.7195 | 0.7577 | 0.8292 | 0.8621 |
| 0.4222 | 68.0 | 34000 | 0.3019 | 0.7821 | 0.9568 | 0.8939 | 0.2657 | 0.7769 | 0.8733 | 0.2948 | 0.8094 | 0.8188 | 0.363 | 0.8194 | 0.9045 | 0.7759 | 0.8167 | 0.7297 | 0.767 | 0.8407 | 0.8728 |
| 0.3707 | 69.0 | 34500 | 0.3024 | 0.7882 | 0.9666 | 0.8876 | 0.2674 | 0.7788 | 0.8222 | 0.2924 | 0.8138 | 0.8227 | 0.4344 | 0.8215 | 0.8503 | 0.7807 | 0.8211 | 0.7386 | 0.7711 | 0.8452 | 0.876 |
| 0.3954 | 70.0 | 35000 | 0.2981 | 0.7838 | 0.9702 | 0.8976 | 0.2649 | 0.773 | 0.8496 | 0.2922 | 0.8091 | 0.818 | 0.4121 | 0.8157 | 0.8765 | 0.7846 | 0.8202 | 0.7261 | 0.7567 | 0.8406 | 0.8772 |
| 0.4 | 71.0 | 35500 | 0.3075 | 0.7771 | 0.9684 | 0.8829 | 0.2776 | 0.7716 | 0.8157 | 0.2932 | 0.8027 | 0.8117 | 0.4168 | 0.8131 | 0.8481 | 0.7657 | 0.8039 | 0.7289 | 0.7598 | 0.8368 | 0.8714 |
| 0.4862 | 72.0 | 36000 | 0.2756 | 0.8 | 0.9711 | 0.8878 | 0.2811 | 0.7969 | 0.8532 | 0.2989 | 0.8256 | 0.8366 | 0.4486 | 0.8378 | 0.8932 | 0.7937 | 0.8311 | 0.7594 | 0.7959 | 0.8469 | 0.8827 |
| 0.3944 | 73.0 | 36500 | 0.2948 | 0.7764 | 0.971 | 0.8765 | 0.3132 | 0.7653 | 0.8221 | 0.2897 | 0.8053 | 0.8147 | 0.4729 | 0.8098 | 0.8608 | 0.7571 | 0.8022 | 0.7261 | 0.7588 | 0.846 | 0.8832 |
| 0.3581 | 74.0 | 37000 | 0.2841 | 0.79 | 0.9709 | 0.901 | 0.3747 | 0.7752 | 0.8603 | 0.2955 | 0.8245 | 0.8336 | 0.5056 | 0.8245 | 0.8912 | 0.7733 | 0.8228 | 0.7485 | 0.7938 | 0.8483 | 0.8841 |
| 0.4114 | 75.0 | 37500 | 0.3027 | 0.7876 | 0.9665 | 0.8956 | 0.3452 | 0.7811 | 0.8481 | 0.2944 | 0.8162 | 0.8254 | 0.4776 | 0.8238 | 0.8813 | 0.7753 | 0.8171 | 0.75 | 0.7866 | 0.8376 | 0.8725 |
| 0.3524 | 76.0 | 38000 | 0.2774 | 0.8019 | 0.9669 | 0.8993 | 0.3286 | 0.7904 | 0.8576 | 0.3026 | 0.8325 | 0.8422 | 0.4741 | 0.8391 | 0.8858 | 0.8009 | 0.8417 | 0.7617 | 0.8021 | 0.8431 | 0.8829 |
| 0.3696 | 77.0 | 38500 | 0.2794 | 0.7996 | 0.9665 | 0.8968 | 0.2717 | 0.7941 | 0.8639 | 0.3006 | 0.8271 | 0.8378 | 0.4351 | 0.839 | 0.8956 | 0.7828 | 0.8254 | 0.7629 | 0.799 | 0.8532 | 0.889 |
| 0.4521 | 78.0 | 39000 | 0.2840 | 0.7874 | 0.9643 | 0.8887 | 0.309 | 0.7841 | 0.8069 | 0.2966 | 0.8159 | 0.8252 | 0.446 | 0.8286 | 0.8454 | 0.7665 | 0.814 | 0.7391 | 0.7711 | 0.8566 | 0.8905 |
| 0.408 | 79.0 | 39500 | 0.2931 | 0.781 | 0.9736 | 0.8996 | 0.338 | 0.7757 | 0.8284 | 0.2924 | 0.8105 | 0.8209 | 0.4787 | 0.8207 | 0.8706 | 0.7646 | 0.8096 | 0.7383 | 0.7732 | 0.8399 | 0.8798 |
| 0.407 | 80.0 | 40000 | 0.2896 | 0.7899 | 0.9661 | 0.8926 | 0.2783 | 0.7868 | 0.8533 | 0.2952 | 0.8179 | 0.8269 | 0.4105 | 0.8322 | 0.8838 | 0.7781 | 0.8175 | 0.7476 | 0.7825 | 0.844 | 0.8806 |
| 0.3787 | 81.0 | 40500 | 0.2987 | 0.785 | 0.9641 | 0.903 | 0.3313 | 0.7732 | 0.8315 | 0.2909 | 0.8152 | 0.8235 | 0.4644 | 0.8196 | 0.853 | 0.7599 | 0.807 | 0.7483 | 0.7794 | 0.8468 | 0.8841 |
| 0.5486 | 82.0 | 41000 | 0.2717 | 0.8044 | 0.9717 | 0.8995 | 0.321 | 0.8003 | 0.8397 | 0.2978 | 0.8282 | 0.8389 | 0.4363 | 0.8421 | 0.8717 | 0.792 | 0.8316 | 0.7632 | 0.7918 | 0.8581 | 0.8934 |
| 0.3667 | 83.0 | 41500 | 0.2861 | 0.7927 | 0.9671 | 0.9075 | 0.3308 | 0.7884 | 0.8494 | 0.2959 | 0.8201 | 0.8279 | 0.4811 | 0.8286 | 0.8695 | 0.7646 | 0.8092 | 0.7556 | 0.7856 | 0.858 | 0.889 |
| 0.3308 | 84.0 | 42000 | 0.2853 | 0.7904 | 0.969 | 0.9019 | 0.349 | 0.7802 | 0.8418 | 0.2958 | 0.8188 | 0.827 | 0.4798 | 0.8284 | 0.8622 | 0.774 | 0.8149 | 0.7391 | 0.7742 | 0.8582 | 0.8919 |
| 0.5384 | 85.0 | 42500 | 0.2897 | 0.7874 | 0.9632 | 0.9013 | 0.3242 | 0.7807 | 0.8498 | 0.2927 | 0.818 | 0.8255 | 0.4692 | 0.8267 | 0.8806 | 0.7721 | 0.8175 | 0.7396 | 0.7732 | 0.8504 | 0.8858 |
| 0.3107 | 86.0 | 43000 | 0.2720 | 0.802 | 0.9671 | 0.891 | 0.3156 | 0.8007 | 0.8421 | 0.297 | 0.8282 | 0.8374 | 0.4338 | 0.8408 | 0.8744 | 0.8001 | 0.8395 | 0.7502 | 0.7825 | 0.8557 | 0.8902 |
| 0.3186 | 87.0 | 43500 | 0.2684 | 0.8023 | 0.9688 | 0.9073 | 0.3675 | 0.7995 | 0.8198 | 0.2955 | 0.8299 | 0.8398 | 0.5105 | 0.8417 | 0.8509 | 0.7962 | 0.8377 | 0.75 | 0.7835 | 0.8606 | 0.8983 |
| 0.3108 | 88.0 | 44000 | 0.2636 | 0.8033 | 0.972 | 0.9095 | 0.3473 | 0.8027 | 0.8541 | 0.2968 | 0.8316 | 0.8409 | 0.4879 | 0.8437 | 0.8805 | 0.7949 | 0.8355 | 0.7539 | 0.7876 | 0.861 | 0.8994 |
| 0.3512 | 89.0 | 44500 | 0.2755 | 0.8006 | 0.9701 | 0.9125 | 0.3356 | 0.7962 | 0.8639 | 0.2967 | 0.8256 | 0.834 | 0.4579 | 0.8368 | 0.8794 | 0.7964 | 0.8338 | 0.7427 | 0.7742 | 0.8627 | 0.8939 |
| 0.3915 | 90.0 | 45000 | 0.2700 | 0.808 | 0.9657 | 0.9233 | 0.3649 | 0.8039 | 0.8509 | 0.2993 | 0.8332 | 0.8414 | 0.4452 | 0.8448 | 0.8725 | 0.795 | 0.8346 | 0.7641 | 0.7918 | 0.8649 | 0.8977 |
| 0.3837 | 91.0 | 45500 | 0.2740 | 0.8022 | 0.9682 | 0.9052 | 0.3671 | 0.7965 | 0.8576 | 0.2964 | 0.8295 | 0.8388 | 0.4827 | 0.8401 | 0.8745 | 0.7923 | 0.8325 | 0.7551 | 0.7918 | 0.8594 | 0.8922 |
| 0.3216 | 92.0 | 46000 | 0.2698 | 0.8058 | 0.9737 | 0.894 | 0.3404 | 0.8006 | 0.8636 | 0.2993 | 0.8297 | 0.8396 | 0.4484 | 0.8412 | 0.8727 | 0.7913 | 0.8294 | 0.7615 | 0.7928 | 0.8646 | 0.8965 |
| 0.3808 | 93.0 | 46500 | 0.2834 | 0.7832 | 0.9649 | 0.8966 | 0.3481 | 0.7796 | 0.8617 | 0.2891 | 0.8137 | 0.8225 | 0.4492 | 0.8247 | 0.8733 | 0.7735 | 0.8184 | 0.7219 | 0.7598 | 0.8542 | 0.8893 |
| 0.4463 | 94.0 | 47000 | 0.2720 | 0.8029 | 0.9683 | 0.896 | 0.3484 | 0.7967 | 0.836 | 0.2989 | 0.8282 | 0.8394 | 0.4786 | 0.8408 | 0.8642 | 0.7937 | 0.8333 | 0.7655 | 0.8 | 0.8497 | 0.885 |
| 0.3548 | 95.0 | 47500 | 0.2720 | 0.8029 | 0.9716 | 0.9152 | 0.3798 | 0.8008 | 0.8577 | 0.2958 | 0.8311 | 0.8397 | 0.5203 | 0.8432 | 0.8773 | 0.7945 | 0.8346 | 0.7543 | 0.7907 | 0.8599 | 0.8936 |
| 0.396 | 96.0 | 48000 | 0.2735 | 0.8003 | 0.9705 | 0.9095 | 0.375 | 0.7942 | 0.863 | 0.2977 | 0.827 | 0.8385 | 0.5043 | 0.8391 | 0.8905 | 0.7894 | 0.8311 | 0.7524 | 0.7928 | 0.8591 | 0.8916 |
| 0.424 | 97.0 | 48500 | 0.2673 | 0.7994 | 0.9742 | 0.9125 | 0.4045 | 0.7918 | 0.8614 | 0.2929 | 0.8257 | 0.8355 | 0.5294 | 0.832 | 0.8928 | 0.7848 | 0.8263 | 0.7457 | 0.7835 | 0.8678 | 0.8968 |
| 0.3468 | 98.0 | 49000 | 0.2518 | 0.8156 | 0.9747 | 0.9192 | 0.3678 | 0.8131 | 0.8647 | 0.3008 | 0.8399 | 0.8495 | 0.4881 | 0.8532 | 0.89 | 0.8002 | 0.8382 | 0.7718 | 0.8062 | 0.8747 | 0.904 |
| 0.3835 | 99.0 | 49500 | 0.2703 | 0.7944 | 0.9748 | 0.9169 | 0.3729 | 0.7933 | 0.8509 | 0.2911 | 0.8209 | 0.8295 | 0.4898 | 0.8333 | 0.8776 | 0.7866 | 0.8263 | 0.7333 | 0.767 | 0.8634 | 0.8951 |
| 0.499 | 100.0 | 50000 | 0.2582 | 0.811 | 0.9757 | 0.9167 | 0.3739 | 0.8101 | 0.8642 | 0.2979 | 0.836 | 0.8467 | 0.5243 | 0.85 | 0.8938 | 0.7986 | 0.8368 | 0.7574 | 0.7969 | 0.877 | 0.9064 |
| 0.4067 | 101.0 | 50500 | 0.2530 | 0.8125 | 0.9685 | 0.9179 | 0.3733 | 0.8088 | 0.8878 | 0.2982 | 0.8379 | 0.849 | 0.5092 | 0.8488 | 0.9123 | 0.812 | 0.85 | 0.7471 | 0.7897 | 0.8785 | 0.9072 |
| 0.4467 | 102.0 | 51000 | 0.2668 | 0.8024 | 0.9672 | 0.9127 | 0.3878 | 0.7976 | 0.8682 | 0.2978 | 0.8295 | 0.8389 | 0.4976 | 0.8377 | 0.9024 | 0.791 | 0.8333 | 0.7463 | 0.7825 | 0.87 | 0.9009 |
| 0.3025 | 103.0 | 51500 | 0.2481 | 0.8182 | 0.9701 | 0.9189 | 0.3928 | 0.8116 | 0.8758 | 0.3017 | 0.8428 | 0.8528 | 0.5133 | 0.8518 | 0.902 | 0.8124 | 0.85 | 0.7613 | 0.7969 | 0.881 | 0.9116 |
| 0.2758 | 104.0 | 52000 | 0.2630 | 0.8078 | 0.9699 | 0.9124 | 0.3751 | 0.8054 | 0.8835 | 0.2982 | 0.8365 | 0.8461 | 0.5049 | 0.8474 | 0.9112 | 0.7963 | 0.8373 | 0.7562 | 0.7959 | 0.871 | 0.9052 |
| 0.3517 | 105.0 | 52500 | 0.2664 | 0.8041 | 0.9683 | 0.9128 | 0.3931 | 0.8011 | 0.8829 | 0.2962 | 0.8308 | 0.8404 | 0.4978 | 0.8432 | 0.91 | 0.7847 | 0.825 | 0.7588 | 0.7938 | 0.8689 | 0.9023 |
| 0.2906 | 106.0 | 53000 | 0.2545 | 0.8094 | 0.967 | 0.9151 | 0.3886 | 0.8038 | 0.8793 | 0.2985 | 0.8338 | 0.844 | 0.5008 | 0.8435 | 0.8997 | 0.7964 | 0.8368 | 0.7547 | 0.7897 | 0.877 | 0.9055 |
| 0.3454 | 107.0 | 53500 | 0.2541 | 0.8125 | 0.9708 | 0.9065 | 0.389 | 0.8083 | 0.8838 | 0.3014 | 0.8363 | 0.8465 | 0.4857 | 0.8484 | 0.9074 | 0.805 | 0.8395 | 0.7618 | 0.7959 | 0.8706 | 0.904 |
| 0.3432 | 108.0 | 54000 | 0.2567 | 0.8133 | 0.9673 | 0.9154 | 0.4185 | 0.8071 | 0.8756 | 0.3013 | 0.8385 | 0.8481 | 0.5214 | 0.8482 | 0.8986 | 0.8017 | 0.8395 | 0.767 | 0.801 | 0.8711 | 0.9038 |
| 0.4038 | 109.0 | 54500 | 0.2582 | 0.8065 | 0.9706 | 0.9082 | 0.3954 | 0.7986 | 0.9028 | 0.3001 | 0.8322 | 0.8429 | 0.4938 | 0.84 | 0.9283 | 0.8001 | 0.8404 | 0.7512 | 0.7876 | 0.8684 | 0.9009 |
| 0.3918 | 110.0 | 55000 | 0.2604 | 0.8122 | 0.9683 | 0.9129 | 0.4252 | 0.8073 | 0.8937 | 0.2996 | 0.8383 | 0.8482 | 0.5379 | 0.8493 | 0.9186 | 0.8036 | 0.8417 | 0.7662 | 0.799 | 0.8669 | 0.904 |
| 0.362 | 111.0 | 55500 | 0.2574 | 0.8129 | 0.9684 | 0.9126 | 0.4005 | 0.8052 | 0.8564 | 0.3015 | 0.8385 | 0.8481 | 0.4965 | 0.8454 | 0.8864 | 0.809 | 0.8474 | 0.7638 | 0.7979 | 0.8659 | 0.8988 |
| 0.321 | 112.0 | 56000 | 0.2501 | 0.8189 | 0.968 | 0.9192 | 0.4027 | 0.8143 | 0.8606 | 0.3037 | 0.842 | 0.8519 | 0.486 | 0.8528 | 0.8876 | 0.8122 | 0.8513 | 0.7682 | 0.7979 | 0.8762 | 0.9064 |
| 0.4514 | 113.0 | 56500 | 0.2479 | 0.8246 | 0.968 | 0.9158 | 0.3999 | 0.823 | 0.8624 | 0.3052 | 0.8488 | 0.8581 | 0.4849 | 0.8611 | 0.9002 | 0.8149 | 0.8522 | 0.784 | 0.8144 | 0.875 | 0.9078 |
| 0.373 | 114.0 | 57000 | 0.2444 | 0.8199 | 0.9683 | 0.9222 | 0.3971 | 0.8164 | 0.8793 | 0.3038 | 0.8447 | 0.8536 | 0.4862 | 0.8543 | 0.909 | 0.8077 | 0.8496 | 0.7743 | 0.8041 | 0.8777 | 0.9072 |
| 0.2791 | 115.0 | 57500 | 0.2639 | 0.7969 | 0.972 | 0.9146 | 0.4228 | 0.7907 | 0.8981 | 0.2962 | 0.8262 | 0.8351 | 0.5179 | 0.8325 | 0.922 | 0.7874 | 0.8294 | 0.7398 | 0.7784 | 0.8635 | 0.8977 |
| 0.3787 | 116.0 | 58000 | 0.2659 | 0.8032 | 0.9685 | 0.9167 | 0.382 | 0.7967 | 0.8761 | 0.2984 | 0.8297 | 0.8387 | 0.4848 | 0.8373 | 0.8987 | 0.7956 | 0.8338 | 0.7443 | 0.7814 | 0.8696 | 0.9009 |
| 0.368 | 117.0 | 58500 | 0.2533 | 0.8154 | 0.9683 | 0.92 | 0.4096 | 0.8114 | 0.9057 | 0.3032 | 0.843 | 0.8532 | 0.4967 | 0.8514 | 0.927 | 0.8017 | 0.8425 | 0.7644 | 0.8062 | 0.8803 | 0.911 |
| 0.2714 | 118.0 | 59000 | 0.2525 | 0.8163 | 0.9671 | 0.9233 | 0.4065 | 0.8136 | 0.893 | 0.3026 | 0.8407 | 0.8496 | 0.48 | 0.8525 | 0.9158 | 0.8053 | 0.8434 | 0.7693 | 0.8 | 0.8745 | 0.9055 |
| 0.3878 | 119.0 | 59500 | 0.2498 | 0.8199 | 0.9668 | 0.9214 | 0.3979 | 0.8193 | 0.8989 | 0.302 | 0.8439 | 0.8538 | 0.4843 | 0.8568 | 0.9223 | 0.8079 | 0.8504 | 0.7695 | 0.799 | 0.8822 | 0.9121 |
| 0.335 | 120.0 | 60000 | 0.2511 | 0.8164 | 0.9706 | 0.9206 | 0.4013 | 0.8135 | 0.898 | 0.3002 | 0.8404 | 0.8491 | 0.4906 | 0.8504 | 0.9176 | 0.8044 | 0.8461 | 0.7665 | 0.7918 | 0.8784 | 0.9095 |
| 0.3062 | 121.0 | 60500 | 0.2500 | 0.8148 | 0.9717 | 0.9214 | 0.3963 | 0.8109 | 0.8758 | 0.3007 | 0.8413 | 0.8496 | 0.4932 | 0.8507 | 0.8987 | 0.808 | 0.8465 | 0.7658 | 0.799 | 0.8705 | 0.9035 |
| 0.3787 | 122.0 | 61000 | 0.2432 | 0.8252 | 0.9719 | 0.9315 | 0.421 | 0.8199 | 0.8788 | 0.3037 | 0.8479 | 0.8559 | 0.5075 | 0.8563 | 0.9011 | 0.8116 | 0.8465 | 0.7849 | 0.8113 | 0.8792 | 0.9098 |
| 0.3067 | 123.0 | 61500 | 0.2471 | 0.8182 | 0.9737 | 0.9127 | 0.409 | 0.813 | 0.8976 | 0.3021 | 0.8429 | 0.8523 | 0.5124 | 0.8522 | 0.9162 | 0.8067 | 0.8434 | 0.7726 | 0.8082 | 0.8754 | 0.9052 |
| 0.3869 | 124.0 | 62000 | 0.2395 | 0.8252 | 0.9748 | 0.9183 | 0.3996 | 0.8215 | 0.9094 | 0.3048 | 0.8487 | 0.86 | 0.489 | 0.861 | 0.9274 | 0.8142 | 0.8513 | 0.7769 | 0.8155 | 0.8846 | 0.9133 |
| 0.2933 | 125.0 | 62500 | 0.2452 | 0.8177 | 0.9751 | 0.9128 | 0.4009 | 0.814 | 0.9069 | 0.3004 | 0.8426 | 0.8524 | 0.4887 | 0.8539 | 0.9259 | 0.81 | 0.8461 | 0.765 | 0.801 | 0.8781 | 0.9101 |
| 0.2843 | 126.0 | 63000 | 0.2433 | 0.8201 | 0.975 | 0.9189 | 0.4255 | 0.818 | 0.8811 | 0.3013 | 0.8439 | 0.8539 | 0.5219 | 0.856 | 0.9016 | 0.8068 | 0.8443 | 0.7691 | 0.8031 | 0.8845 | 0.9142 |
| 0.2707 | 127.0 | 63500 | 0.2418 | 0.8194 | 0.972 | 0.9179 | 0.4057 | 0.8184 | 0.9003 | 0.3022 | 0.8457 | 0.8555 | 0.504 | 0.8583 | 0.9217 | 0.8134 | 0.8509 | 0.7654 | 0.8031 | 0.8794 | 0.9124 |
| 0.323 | 128.0 | 64000 | 0.2434 | 0.8196 | 0.9715 | 0.9161 | 0.4199 | 0.8163 | 0.9008 | 0.3005 | 0.846 | 0.8556 | 0.5219 | 0.856 | 0.9217 | 0.8136 | 0.85 | 0.766 | 0.8052 | 0.8793 | 0.9116 |
| 0.3649 | 129.0 | 64500 | 0.2445 | 0.817 | 0.9719 | 0.9154 | 0.425 | 0.8123 | 0.88 | 0.3018 | 0.8419 | 0.852 | 0.526 | 0.8518 | 0.9008 | 0.8075 | 0.8443 | 0.7638 | 0.801 | 0.8796 | 0.9107 |
| 0.3522 | 130.0 | 65000 | 0.2426 | 0.8204 | 0.9718 | 0.9152 | 0.4168 | 0.8196 | 0.8845 | 0.3019 | 0.8447 | 0.8549 | 0.5141 | 0.858 | 0.9049 | 0.807 | 0.8443 | 0.773 | 0.8072 | 0.8811 | 0.9133 |
| 0.3731 | 131.0 | 65500 | 0.2398 | 0.8253 | 0.9719 | 0.9183 | 0.4157 | 0.8234 | 0.8846 | 0.3034 | 0.8488 | 0.8593 | 0.5229 | 0.8612 | 0.9049 | 0.8182 | 0.8544 | 0.776 | 0.8113 | 0.8817 | 0.9121 |
| 0.3208 | 132.0 | 66000 | 0.2410 | 0.8252 | 0.9718 | 0.9181 | 0.4215 | 0.8234 | 0.8849 | 0.3036 | 0.8484 | 0.8576 | 0.5227 | 0.8606 | 0.9051 | 0.8168 | 0.8526 | 0.7746 | 0.8062 | 0.8842 | 0.9139 |
| 0.3514 | 133.0 | 66500 | 0.2463 | 0.8214 | 0.9719 | 0.9197 | 0.426 | 0.8183 | 0.9069 | 0.3018 | 0.8448 | 0.8539 | 0.5194 | 0.8559 | 0.9255 | 0.8111 | 0.8465 | 0.7707 | 0.8031 | 0.8825 | 0.9121 |
| 0.3643 | 134.0 | 67000 | 0.2414 | 0.8235 | 0.9728 | 0.9229 | 0.4333 | 0.8209 | 0.9016 | 0.3031 | 0.8475 | 0.8573 | 0.5306 | 0.8588 | 0.9222 | 0.8128 | 0.8491 | 0.7732 | 0.8082 | 0.8844 | 0.9145 |
| 0.3888 | 135.0 | 67500 | 0.2433 | 0.8235 | 0.9719 | 0.9216 | 0.4276 | 0.8219 | 0.901 | 0.3027 | 0.848 | 0.8581 | 0.5227 | 0.8604 | 0.9216 | 0.813 | 0.8509 | 0.7746 | 0.8103 | 0.883 | 0.913 |
| 0.3093 | 136.0 | 68000 | 0.2406 | 0.823 | 0.9719 | 0.9179 | 0.4344 | 0.8217 | 0.9009 | 0.3034 | 0.8477 | 0.8572 | 0.5249 | 0.8595 | 0.9216 | 0.8112 | 0.8504 | 0.7739 | 0.8082 | 0.884 | 0.913 |
| 0.3101 | 137.0 | 68500 | 0.2417 | 0.8229 | 0.9727 | 0.9197 | 0.4321 | 0.8214 | 0.8839 | 0.3032 | 0.847 | 0.8565 | 0.5241 | 0.8593 | 0.9047 | 0.8129 | 0.8509 | 0.7724 | 0.8062 | 0.8832 | 0.9124 |
| 0.2872 | 138.0 | 69000 | 0.2437 | 0.8209 | 0.9726 | 0.9197 | 0.4247 | 0.8188 | 0.9006 | 0.302 | 0.8453 | 0.8552 | 0.5184 | 0.8577 | 0.9215 | 0.8078 | 0.8461 | 0.7724 | 0.8072 | 0.8826 | 0.9124 |
| 0.3664 | 139.0 | 69500 | 0.2422 | 0.8231 | 0.9726 | 0.9185 | 0.4299 | 0.8198 | 0.9026 | 0.3035 | 0.848 | 0.8576 | 0.516 | 0.8594 | 0.9226 | 0.8106 | 0.85 | 0.7738 | 0.8082 | 0.885 | 0.9145 |
| 0.345 | 140.0 | 70000 | 0.2415 | 0.8243 | 0.9726 | 0.918 | 0.4296 | 0.8219 | 0.9019 | 0.3041 | 0.8492 | 0.8582 | 0.5216 | 0.8608 | 0.9222 | 0.8119 | 0.8509 | 0.7759 | 0.8093 | 0.8851 | 0.9145 |
| 0.2741 | 141.0 | 70500 | 0.2404 | 0.825 | 0.9727 | 0.918 | 0.43 | 0.8237 | 0.9018 | 0.3047 | 0.8496 | 0.8585 | 0.516 | 0.8614 | 0.9223 | 0.8106 | 0.8496 | 0.7792 | 0.8113 | 0.8853 | 0.9147 |
| 0.4387 | 142.0 | 71000 | 0.2400 | 0.8252 | 0.9727 | 0.918 | 0.4245 | 0.8249 | 0.9016 | 0.3048 | 0.8496 | 0.8585 | 0.5137 | 0.8622 | 0.9222 | 0.8141 | 0.8513 | 0.7767 | 0.8093 | 0.8847 | 0.915 |
| 0.3424 | 143.0 | 71500 | 0.2410 | 0.8246 | 0.9726 | 0.9145 | 0.4218 | 0.8233 | 0.901 | 0.3048 | 0.8489 | 0.8577 | 0.5079 | 0.8617 | 0.9216 | 0.8132 | 0.8504 | 0.7774 | 0.8093 | 0.8831 | 0.9133 |
| 0.2974 | 144.0 | 72000 | 0.2409 | 0.824 | 0.9726 | 0.9164 | 0.4222 | 0.823 | 0.901 | 0.3049 | 0.8488 | 0.8576 | 0.5103 | 0.8615 | 0.9216 | 0.8141 | 0.8513 | 0.7754 | 0.8082 | 0.8826 | 0.9133 |
| 0.4419 | 145.0 | 72500 | 0.2415 | 0.8235 | 0.9727 | 0.9165 | 0.4223 | 0.8225 | 0.9011 | 0.3048 | 0.8483 | 0.8573 | 0.5127 | 0.8613 | 0.9217 | 0.8126 | 0.8504 | 0.7745 | 0.8072 | 0.8835 | 0.9142 |
| 0.3411 | 146.0 | 73000 | 0.2407 | 0.8239 | 0.9727 | 0.9165 | 0.4216 | 0.8227 | 0.9015 | 0.3048 | 0.8485 | 0.8576 | 0.5127 | 0.8611 | 0.9219 | 0.8134 | 0.8518 | 0.7742 | 0.8072 | 0.8839 | 0.9139 |
| 0.3353 | 147.0 | 73500 | 0.2400 | 0.8236 | 0.9727 | 0.9165 | 0.4223 | 0.8227 | 0.9015 | 0.3048 | 0.8484 | 0.8573 | 0.5127 | 0.8607 | 0.9219 | 0.8126 | 0.8509 | 0.7742 | 0.8072 | 0.884 | 0.9139 |
| 0.4092 | 148.0 | 74000 | 0.2398 | 0.824 | 0.9727 | 0.9165 | 0.4223 | 0.8232 | 0.9015 | 0.305 | 0.8488 | 0.8578 | 0.5127 | 0.8612 | 0.9219 | 0.8139 | 0.8522 | 0.7742 | 0.8072 | 0.884 | 0.9139 |
| 0.3577 | 149.0 | 74500 | 0.2398 | 0.824 | 0.9727 | 0.9165 | 0.4223 | 0.8232 | 0.9015 | 0.305 | 0.8488 | 0.8578 | 0.5127 | 0.8612 | 0.9219 | 0.8138 | 0.8522 | 0.7742 | 0.8072 | 0.884 | 0.9139 |
| 0.3363 | 150.0 | 75000 | 0.2398 | 0.824 | 0.9727 | 0.9165 | 0.4223 | 0.8232 | 0.9015 | 0.305 | 0.8488 | 0.8578 | 0.5127 | 0.8612 | 0.9219 | 0.8138 | 0.8522 | 0.7742 | 0.8072 | 0.884 | 0.9139 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.1
| [
"chicken",
"duck",
"plant"
] |
joe611/chickens-composite-8044444-150-epochs-wo-transform-metrics-test |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# chickens-composite-8044444-150-epochs-wo-transform-metrics-test
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2292
- Map: 0.8325
- Map 50: 0.9667
- Map 75: 0.9216
- Map Small: 0.3829
- Map Medium: 0.8475
- Map Large: 0.8697
- Mar 1: 0.3088
- Mar 10: 0.862
- Mar 100: 0.8694
- Mar Small: 0.4616
- Mar Medium: 0.8815
- Mar Large: 0.898
- Map Chicken: 0.821
- Mar 100 Chicken: 0.8702
- Map Duck: 0.7881
- Mar 100 Duck: 0.8196
- Map Plant: 0.8885
- Mar 100 Plant: 0.9185
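The values above are the standard COCO detection metrics (mAP/mAR overall, per object size, and per class). As a hedged illustration of how such numbers are computed, here is a minimal `torchmetrics` sketch with made-up boxes; the real evaluation runs over the full annotated validation set.
```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# class_metrics=True yields the per-class breakdown (chicken / duck / plant) reported above.
metric = MeanAveragePrecision(iou_type="bbox", class_metrics=True)

# Illustrative prediction and ground truth for a single image, in xyxy pixel coordinates.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([0]),  # assuming 0 = chicken, 1 = duck, 2 = plant
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 215.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
print(metric.compute())  # map, map_50, map_75, map_small/medium/large, mar_1/10/100, per-class values
```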
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 150
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.3665 | 1.0 | 500 | 1.2582 | 0.1438 | 0.2022 | 0.1612 | 0.0167 | 0.0697 | 0.1576 | 0.0926 | 0.264 | 0.3047 | 0.0913 | 0.282 | 0.2644 | 0.064 | 0.1654 | 0.0 | 0.0 | 0.3675 | 0.7488 |
| 1.0808 | 2.0 | 1000 | 1.1152 | 0.3091 | 0.4427 | 0.364 | 0.0108 | 0.2714 | 0.2898 | 0.1214 | 0.4314 | 0.4494 | 0.1159 | 0.4481 | 0.375 | 0.3302 | 0.6289 | 0.0 | 0.0 | 0.5973 | 0.7194 |
| 1.1646 | 3.0 | 1500 | 1.1408 | 0.3658 | 0.5123 | 0.4385 | 0.0194 | 0.3393 | 0.3661 | 0.1336 | 0.4454 | 0.4531 | 0.1063 | 0.4416 | 0.4383 | 0.4551 | 0.6474 | 0.0 | 0.0 | 0.6424 | 0.7118 |
| 0.7699 | 4.0 | 2000 | 0.7439 | 0.4294 | 0.5912 | 0.5051 | 0.017 | 0.4129 | 0.3907 | 0.1419 | 0.4823 | 0.4936 | 0.0771 | 0.4789 | 0.4893 | 0.5743 | 0.6974 | 0.0026 | 0.0021 | 0.7114 | 0.7815 |
| 0.6842 | 5.0 | 2500 | 0.7106 | 0.4509 | 0.6047 | 0.5126 | 0.0687 | 0.441 | 0.3752 | 0.1493 | 0.5017 | 0.5073 | 0.1516 | 0.5 | 0.4411 | 0.5772 | 0.6943 | 0.0376 | 0.0309 | 0.738 | 0.7965 |
| 0.7341 | 6.0 | 3000 | 0.6190 | 0.5818 | 0.787 | 0.6866 | 0.0719 | 0.5818 | 0.4839 | 0.2076 | 0.6268 | 0.6327 | 0.1465 | 0.6381 | 0.5287 | 0.6662 | 0.736 | 0.3327 | 0.3649 | 0.7465 | 0.7971 |
| 0.6874 | 7.0 | 3500 | 0.5968 | 0.6621 | 0.8818 | 0.7921 | 0.1147 | 0.666 | 0.5083 | 0.2505 | 0.7082 | 0.7114 | 0.1732 | 0.7182 | 0.5332 | 0.6796 | 0.7364 | 0.5434 | 0.5938 | 0.7633 | 0.804 |
| 0.5629 | 8.0 | 4000 | 0.4814 | 0.6924 | 0.9199 | 0.8353 | 0.1931 | 0.694 | 0.6502 | 0.2642 | 0.7303 | 0.7378 | 0.2541 | 0.7426 | 0.6854 | 0.6963 | 0.7443 | 0.6074 | 0.6536 | 0.7735 | 0.8156 |
| 0.5402 | 9.0 | 4500 | 0.4782 | 0.6851 | 0.9326 | 0.8199 | 0.1449 | 0.6855 | 0.71 | 0.2695 | 0.7302 | 0.738 | 0.2183 | 0.7465 | 0.7443 | 0.6692 | 0.7237 | 0.6302 | 0.6856 | 0.7558 | 0.8046 |
| 0.5441 | 10.0 | 5000 | 0.4672 | 0.6996 | 0.9249 | 0.8338 | 0.1596 | 0.6963 | 0.726 | 0.2742 | 0.7365 | 0.7455 | 0.2484 | 0.75 | 0.777 | 0.705 | 0.7526 | 0.621 | 0.668 | 0.7728 | 0.8159 |
| 0.5064 | 11.0 | 5500 | 0.4573 | 0.6829 | 0.9423 | 0.8361 | 0.1564 | 0.6839 | 0.7062 | 0.2671 | 0.7234 | 0.7298 | 0.2135 | 0.7354 | 0.7519 | 0.6741 | 0.7206 | 0.5991 | 0.6536 | 0.7754 | 0.8153 |
| 0.4986 | 12.0 | 6000 | 0.4355 | 0.7122 | 0.9471 | 0.859 | 0.1343 | 0.7156 | 0.7748 | 0.2769 | 0.7513 | 0.7606 | 0.3159 | 0.7661 | 0.8076 | 0.7058 | 0.7526 | 0.635 | 0.6928 | 0.7959 | 0.8364 |
| 0.4475 | 13.0 | 6500 | 0.4073 | 0.7194 | 0.9374 | 0.8445 | 0.1866 | 0.7153 | 0.7717 | 0.279 | 0.7597 | 0.7664 | 0.2921 | 0.7686 | 0.8002 | 0.7074 | 0.7548 | 0.6432 | 0.6979 | 0.8075 | 0.8465 |
| 0.4608 | 14.0 | 7000 | 0.4035 | 0.7112 | 0.9522 | 0.8556 | 0.1984 | 0.7069 | 0.7929 | 0.2746 | 0.7522 | 0.76 | 0.3516 | 0.7578 | 0.8268 | 0.6808 | 0.7382 | 0.6508 | 0.7021 | 0.802 | 0.8399 |
| 0.4381 | 15.0 | 7500 | 0.3794 | 0.7421 | 0.9437 | 0.8736 | 0.1956 | 0.7471 | 0.8036 | 0.2831 | 0.7792 | 0.7868 | 0.3598 | 0.7905 | 0.8309 | 0.7245 | 0.7702 | 0.6826 | 0.732 | 0.8191 | 0.8584 |
| 0.4408 | 16.0 | 8000 | 0.3875 | 0.7223 | 0.9515 | 0.8639 | 0.1728 | 0.7225 | 0.7922 | 0.2799 | 0.7651 | 0.7726 | 0.2813 | 0.7773 | 0.8122 | 0.705 | 0.7535 | 0.6551 | 0.7155 | 0.8069 | 0.8488 |
| 0.4457 | 17.0 | 8500 | 0.3790 | 0.727 | 0.9475 | 0.8658 | 0.1998 | 0.7343 | 0.8161 | 0.2768 | 0.7703 | 0.7781 | 0.3524 | 0.7828 | 0.8426 | 0.7086 | 0.7654 | 0.6544 | 0.7093 | 0.8179 | 0.8595 |
| 0.4147 | 18.0 | 9000 | 0.3879 | 0.7153 | 0.9512 | 0.8664 | 0.1327 | 0.7234 | 0.7512 | 0.2835 | 0.7547 | 0.7629 | 0.2871 | 0.7766 | 0.782 | 0.6957 | 0.7513 | 0.6502 | 0.6969 | 0.7999 | 0.8405 |
| 0.4486 | 19.0 | 9500 | 0.3716 | 0.7228 | 0.9498 | 0.8644 | 0.2172 | 0.734 | 0.8005 | 0.2764 | 0.7643 | 0.7701 | 0.3906 | 0.7806 | 0.8357 | 0.7037 | 0.7601 | 0.6591 | 0.7041 | 0.8055 | 0.846 |
| 0.3893 | 20.0 | 10000 | 0.3230 | 0.7585 | 0.9652 | 0.8814 | 0.1642 | 0.7585 | 0.7941 | 0.29 | 0.7952 | 0.804 | 0.3671 | 0.807 | 0.8238 | 0.7519 | 0.7982 | 0.7046 | 0.7546 | 0.8189 | 0.8592 |
| 0.4252 | 21.0 | 10500 | 0.3389 | 0.7442 | 0.9533 | 0.8747 | 0.2651 | 0.7454 | 0.7785 | 0.2829 | 0.7794 | 0.7849 | 0.3351 | 0.7909 | 0.8031 | 0.7377 | 0.7816 | 0.6733 | 0.7134 | 0.8216 | 0.8598 |
| 0.3994 | 22.0 | 11000 | 0.3339 | 0.7498 | 0.9536 | 0.8878 | 0.2776 | 0.7565 | 0.7892 | 0.288 | 0.787 | 0.7927 | 0.376 | 0.8032 | 0.8295 | 0.7357 | 0.782 | 0.6972 | 0.7351 | 0.8166 | 0.861 |
| 0.394 | 23.0 | 11500 | 0.3255 | 0.7622 | 0.9586 | 0.8878 | 0.2854 | 0.7615 | 0.7528 | 0.2871 | 0.7953 | 0.8025 | 0.4003 | 0.8064 | 0.8029 | 0.76 | 0.8026 | 0.7038 | 0.7423 | 0.8226 | 0.8627 |
| 0.3649 | 24.0 | 12000 | 0.3400 | 0.75 | 0.9491 | 0.8873 | 0.2846 | 0.7498 | 0.7916 | 0.2815 | 0.7837 | 0.7931 | 0.4268 | 0.8003 | 0.8172 | 0.7444 | 0.7921 | 0.6814 | 0.7247 | 0.8244 | 0.8624 |
| 0.3888 | 25.0 | 12500 | 0.3268 | 0.7588 | 0.9633 | 0.8935 | 0.2659 | 0.7578 | 0.7577 | 0.284 | 0.7911 | 0.7992 | 0.3956 | 0.7976 | 0.8004 | 0.7536 | 0.8 | 0.6944 | 0.7361 | 0.8285 | 0.8616 |
| 0.3799 | 26.0 | 13000 | 0.3458 | 0.7695 | 0.9553 | 0.8786 | 0.2255 | 0.7735 | 0.7984 | 0.295 | 0.8002 | 0.8062 | 0.3092 | 0.8188 | 0.8266 | 0.7645 | 0.8075 | 0.7051 | 0.7392 | 0.8388 | 0.872 |
| 0.3497 | 27.0 | 13500 | 0.3204 | 0.7616 | 0.9684 | 0.8839 | 0.2739 | 0.7562 | 0.7911 | 0.2835 | 0.797 | 0.8051 | 0.4668 | 0.8004 | 0.8259 | 0.7689 | 0.8154 | 0.6865 | 0.733 | 0.8294 | 0.8671 |
| 0.3791 | 28.0 | 14000 | 0.3331 | 0.7461 | 0.9533 | 0.8785 | 0.3349 | 0.7513 | 0.7643 | 0.2829 | 0.7821 | 0.7885 | 0.4184 | 0.7983 | 0.7884 | 0.7324 | 0.7807 | 0.6742 | 0.7196 | 0.8317 | 0.8653 |
| 0.3625 | 29.0 | 14500 | 0.3239 | 0.7652 | 0.9533 | 0.895 | 0.2719 | 0.7671 | 0.7787 | 0.286 | 0.8008 | 0.8093 | 0.447 | 0.8173 | 0.8331 | 0.742 | 0.7978 | 0.7126 | 0.7567 | 0.8411 | 0.8734 |
| 0.3868 | 30.0 | 15000 | 0.3399 | 0.7558 | 0.9525 | 0.876 | 0.2418 | 0.7654 | 0.7898 | 0.2856 | 0.7956 | 0.8022 | 0.3935 | 0.8156 | 0.8413 | 0.7362 | 0.7912 | 0.6916 | 0.7392 | 0.8397 | 0.8763 |
| 0.3328 | 31.0 | 15500 | 0.3207 | 0.7668 | 0.946 | 0.8966 | 0.3242 | 0.774 | 0.77 | 0.2897 | 0.8011 | 0.8076 | 0.4367 | 0.8177 | 0.8033 | 0.7661 | 0.8066 | 0.6867 | 0.732 | 0.8476 | 0.8844 |
| 0.3572 | 32.0 | 16000 | 0.3117 | 0.7613 | 0.9559 | 0.8975 | 0.3224 | 0.7684 | 0.8026 | 0.2879 | 0.798 | 0.8042 | 0.3973 | 0.8145 | 0.8442 | 0.7571 | 0.8057 | 0.6949 | 0.7381 | 0.832 | 0.8688 |
| 0.3516 | 33.0 | 16500 | 0.3172 | 0.7555 | 0.9523 | 0.8998 | 0.3281 | 0.7577 | 0.7618 | 0.2859 | 0.7921 | 0.8005 | 0.4671 | 0.8099 | 0.7872 | 0.7406 | 0.7895 | 0.6831 | 0.7351 | 0.8429 | 0.8769 |
| 0.3601 | 34.0 | 17000 | 0.3115 | 0.7653 | 0.9697 | 0.8887 | 0.2614 | 0.7687 | 0.7992 | 0.2904 | 0.8038 | 0.8117 | 0.424 | 0.8194 | 0.8366 | 0.7594 | 0.8044 | 0.7116 | 0.7608 | 0.8248 | 0.8699 |
| 0.3492 | 35.0 | 17500 | 0.2925 | 0.7773 | 0.9598 | 0.8891 | 0.2556 | 0.787 | 0.7397 | 0.2968 | 0.8111 | 0.8199 | 0.3889 | 0.8312 | 0.7956 | 0.7592 | 0.8048 | 0.7232 | 0.7711 | 0.8496 | 0.8838 |
| 0.3272 | 36.0 | 18000 | 0.2892 | 0.771 | 0.9544 | 0.894 | 0.3564 | 0.7711 | 0.7458 | 0.291 | 0.8081 | 0.8178 | 0.5102 | 0.8196 | 0.7988 | 0.7593 | 0.8123 | 0.7088 | 0.7598 | 0.845 | 0.8812 |
| 0.3489 | 37.0 | 18500 | 0.2874 | 0.7815 | 0.9564 | 0.9088 | 0.4035 | 0.7794 | 0.7943 | 0.2929 | 0.8114 | 0.8195 | 0.4833 | 0.8208 | 0.8209 | 0.7762 | 0.8171 | 0.7203 | 0.7577 | 0.8481 | 0.8835 |
| 0.3114 | 38.0 | 19000 | 0.3157 | 0.7713 | 0.9465 | 0.9017 | 0.3277 | 0.767 | 0.7637 | 0.2877 | 0.802 | 0.811 | 0.5103 | 0.812 | 0.7926 | 0.7698 | 0.818 | 0.6907 | 0.7299 | 0.8534 | 0.8853 |
| 0.3189 | 39.0 | 19500 | 0.2966 | 0.7794 | 0.9614 | 0.9169 | 0.3714 | 0.779 | 0.7577 | 0.2906 | 0.8141 | 0.82 | 0.4944 | 0.8206 | 0.7914 | 0.7818 | 0.8237 | 0.7162 | 0.7557 | 0.84 | 0.8806 |
| 0.3044 | 40.0 | 20000 | 0.2893 | 0.781 | 0.9664 | 0.8992 | 0.3004 | 0.7809 | 0.7865 | 0.2912 | 0.8145 | 0.8205 | 0.4671 | 0.8213 | 0.8221 | 0.7855 | 0.8259 | 0.7101 | 0.7505 | 0.8474 | 0.885 |
| 0.3039 | 41.0 | 20500 | 0.3026 | 0.7698 | 0.9614 | 0.9166 | 0.3763 | 0.7665 | 0.7726 | 0.2864 | 0.804 | 0.8122 | 0.5071 | 0.8088 | 0.805 | 0.764 | 0.8026 | 0.6944 | 0.7505 | 0.8508 | 0.8835 |
| 0.3059 | 42.0 | 21000 | 0.2805 | 0.7868 | 0.9626 | 0.9021 | 0.3435 | 0.791 | 0.8266 | 0.2946 | 0.8152 | 0.8235 | 0.4825 | 0.8291 | 0.8553 | 0.7831 | 0.8259 | 0.7197 | 0.7546 | 0.8574 | 0.8899 |
| 0.3155 | 43.0 | 21500 | 0.2971 | 0.7792 | 0.965 | 0.9147 | 0.2983 | 0.7769 | 0.8119 | 0.2899 | 0.8135 | 0.8246 | 0.5089 | 0.8241 | 0.8384 | 0.7801 | 0.8263 | 0.721 | 0.7732 | 0.8364 | 0.8743 |
| 0.3091 | 44.0 | 22000 | 0.2909 | 0.7753 | 0.9571 | 0.9104 | 0.3672 | 0.7777 | 0.8133 | 0.2923 | 0.8127 | 0.8188 | 0.4624 | 0.823 | 0.8436 | 0.7657 | 0.8215 | 0.7065 | 0.7464 | 0.8537 | 0.8884 |
| 0.3035 | 45.0 | 22500 | 0.2967 | 0.7661 | 0.9546 | 0.8953 | 0.3482 | 0.7704 | 0.6473 | 0.2859 | 0.8009 | 0.8076 | 0.434 | 0.8164 | 0.6756 | 0.772 | 0.8136 | 0.674 | 0.7237 | 0.8521 | 0.8855 |
| 0.2954 | 46.0 | 23000 | 0.2702 | 0.7968 | 0.9627 | 0.9146 | 0.345 | 0.8002 | 0.8122 | 0.2973 | 0.8283 | 0.8354 | 0.4727 | 0.8385 | 0.8493 | 0.7862 | 0.8329 | 0.7476 | 0.7814 | 0.8565 | 0.8919 |
| 0.3337 | 47.0 | 23500 | 0.2990 | 0.7864 | 0.9567 | 0.911 | 0.2928 | 0.7954 | 0.8273 | 0.2955 | 0.814 | 0.8209 | 0.3768 | 0.8358 | 0.8518 | 0.7839 | 0.8276 | 0.7164 | 0.7433 | 0.8588 | 0.8919 |
| 0.2959 | 48.0 | 24000 | 0.3018 | 0.7746 | 0.9606 | 0.9037 | 0.3431 | 0.7729 | 0.7805 | 0.2896 | 0.81 | 0.8164 | 0.429 | 0.8181 | 0.7999 | 0.757 | 0.8022 | 0.7117 | 0.7567 | 0.8551 | 0.8902 |
| 0.3146 | 49.0 | 24500 | 0.2912 | 0.7802 | 0.9646 | 0.9048 | 0.323 | 0.7799 | 0.8511 | 0.2907 | 0.8131 | 0.8202 | 0.4632 | 0.8249 | 0.8774 | 0.7898 | 0.8325 | 0.7061 | 0.7474 | 0.8448 | 0.8806 |
| 0.2862 | 50.0 | 25000 | 0.2872 | 0.794 | 0.9603 | 0.9028 | 0.3087 | 0.801 | 0.8453 | 0.2989 | 0.8253 | 0.8319 | 0.3862 | 0.8461 | 0.8632 | 0.7834 | 0.8232 | 0.7278 | 0.7711 | 0.871 | 0.9014 |
| 0.3084 | 51.0 | 25500 | 0.2613 | 0.8039 | 0.9636 | 0.9141 | 0.3628 | 0.8051 | 0.8528 | 0.3035 | 0.8339 | 0.8403 | 0.4544 | 0.8472 | 0.8757 | 0.7959 | 0.8355 | 0.7462 | 0.7856 | 0.8696 | 0.8997 |
| 0.2911 | 52.0 | 26000 | 0.3012 | 0.786 | 0.9471 | 0.9043 | 0.32 | 0.7933 | 0.8539 | 0.2948 | 0.8175 | 0.8249 | 0.4362 | 0.8346 | 0.8743 | 0.7782 | 0.8237 | 0.7166 | 0.7546 | 0.8631 | 0.8962 |
| 0.3434 | 53.0 | 26500 | 0.2772 | 0.7991 | 0.9617 | 0.9078 | 0.3542 | 0.808 | 0.8579 | 0.3002 | 0.8277 | 0.835 | 0.4795 | 0.8466 | 0.8734 | 0.7729 | 0.8154 | 0.7554 | 0.7897 | 0.869 | 0.9 |
| 0.2702 | 54.0 | 27000 | 0.3074 | 0.7844 | 0.951 | 0.9099 | 0.219 | 0.7892 | 0.8343 | 0.2926 | 0.8132 | 0.8197 | 0.3275 | 0.8268 | 0.8663 | 0.7674 | 0.8136 | 0.7324 | 0.7588 | 0.8532 | 0.8867 |
| 0.2631 | 55.0 | 27500 | 0.2868 | 0.7836 | 0.9578 | 0.9106 | 0.3111 | 0.7922 | 0.8246 | 0.2938 | 0.8163 | 0.824 | 0.406 | 0.834 | 0.8603 | 0.7789 | 0.8246 | 0.7195 | 0.7577 | 0.8523 | 0.8896 |
| 0.2772 | 56.0 | 28000 | 0.2640 | 0.8057 | 0.9597 | 0.9232 | 0.3827 | 0.8113 | 0.8482 | 0.2997 | 0.8325 | 0.8427 | 0.5098 | 0.8502 | 0.8646 | 0.7848 | 0.8329 | 0.7591 | 0.7887 | 0.8732 | 0.9066 |
| 0.2674 | 57.0 | 28500 | 0.2989 | 0.7828 | 0.9526 | 0.9086 | 0.2358 | 0.793 | 0.8113 | 0.296 | 0.8143 | 0.8201 | 0.3081 | 0.8341 | 0.8295 | 0.7747 | 0.8189 | 0.7257 | 0.7608 | 0.8479 | 0.8806 |
| 0.3017 | 58.0 | 29000 | 0.2915 | 0.7922 | 0.961 | 0.9158 | 0.3729 | 0.8016 | 0.8193 | 0.2945 | 0.8207 | 0.8284 | 0.4871 | 0.8396 | 0.8398 | 0.774 | 0.8215 | 0.7346 | 0.7649 | 0.8681 | 0.8988 |
| 0.2769 | 59.0 | 29500 | 0.2767 | 0.8062 | 0.977 | 0.9227 | 0.4494 | 0.802 | 0.8261 | 0.2961 | 0.8392 | 0.8466 | 0.5863 | 0.8448 | 0.8573 | 0.7938 | 0.8386 | 0.7642 | 0.8052 | 0.8605 | 0.896 |
| 0.294 | 60.0 | 30000 | 0.2988 | 0.7853 | 0.9564 | 0.9147 | 0.3306 | 0.7951 | 0.8104 | 0.2928 | 0.8177 | 0.8264 | 0.4733 | 0.8352 | 0.8365 | 0.7791 | 0.8259 | 0.7171 | 0.7557 | 0.8597 | 0.8977 |
| 0.2964 | 61.0 | 30500 | 0.2834 | 0.7995 | 0.9569 | 0.9145 | 0.2904 | 0.813 | 0.8063 | 0.3038 | 0.8307 | 0.839 | 0.3917 | 0.8523 | 0.8431 | 0.7748 | 0.8219 | 0.756 | 0.7938 | 0.8676 | 0.9012 |
| 0.2735 | 62.0 | 31000 | 0.2701 | 0.813 | 0.9623 | 0.9217 | 0.3599 | 0.819 | 0.8607 | 0.3069 | 0.8429 | 0.8512 | 0.4662 | 0.8565 | 0.8815 | 0.7981 | 0.8395 | 0.7726 | 0.8113 | 0.8682 | 0.9029 |
| 0.27 | 63.0 | 31500 | 0.2853 | 0.7947 | 0.9607 | 0.9078 | 0.2996 | 0.8078 | 0.8146 | 0.2994 | 0.8242 | 0.8324 | 0.4003 | 0.8487 | 0.8296 | 0.7817 | 0.8268 | 0.7339 | 0.768 | 0.8686 | 0.9023 |
| 0.2792 | 64.0 | 32000 | 0.2743 | 0.8163 | 0.9637 | 0.9112 | 0.3217 | 0.8296 | 0.8224 | 0.3083 | 0.8467 | 0.8528 | 0.4316 | 0.8677 | 0.8473 | 0.7815 | 0.8294 | 0.7957 | 0.8237 | 0.8717 | 0.9052 |
| 0.249 | 65.0 | 32500 | 0.2722 | 0.8115 | 0.9631 | 0.9126 | 0.3639 | 0.82 | 0.8432 | 0.305 | 0.8398 | 0.8463 | 0.4657 | 0.8561 | 0.8628 | 0.7918 | 0.8311 | 0.7729 | 0.8062 | 0.8698 | 0.9014 |
| 0.251 | 66.0 | 33000 | 0.2777 | 0.8003 | 0.9625 | 0.9139 | 0.3511 | 0.8095 | 0.8273 | 0.2994 | 0.8306 | 0.8379 | 0.4279 | 0.8486 | 0.8646 | 0.7872 | 0.8333 | 0.7548 | 0.7887 | 0.8589 | 0.8916 |
| 0.2392 | 67.0 | 33500 | 0.2714 | 0.8056 | 0.9525 | 0.9092 | 0.3448 | 0.8166 | 0.8115 | 0.3016 | 0.8331 | 0.8405 | 0.4073 | 0.8551 | 0.8419 | 0.7923 | 0.8342 | 0.7571 | 0.7866 | 0.8676 | 0.9006 |
| 0.2833 | 68.0 | 34000 | 0.2719 | 0.7984 | 0.9566 | 0.9162 | 0.2997 | 0.8051 | 0.8329 | 0.2999 | 0.8282 | 0.8347 | 0.3994 | 0.8446 | 0.8569 | 0.7765 | 0.8215 | 0.7517 | 0.7804 | 0.8671 | 0.9023 |
| 0.2544 | 69.0 | 34500 | 0.2496 | 0.818 | 0.9707 | 0.9164 | 0.3735 | 0.8301 | 0.8329 | 0.3061 | 0.8456 | 0.8544 | 0.4594 | 0.8668 | 0.8574 | 0.7961 | 0.8408 | 0.7812 | 0.8113 | 0.8766 | 0.911 |
| 0.2748 | 70.0 | 35000 | 0.2602 | 0.8062 | 0.9567 | 0.917 | 0.3879 | 0.8125 | 0.8251 | 0.2981 | 0.833 | 0.84 | 0.4465 | 0.8462 | 0.8586 | 0.7886 | 0.8329 | 0.7534 | 0.7794 | 0.8767 | 0.9078 |
| 0.2762 | 71.0 | 35500 | 0.2855 | 0.7889 | 0.9624 | 0.906 | 0.3154 | 0.7979 | 0.8197 | 0.2967 | 0.8219 | 0.8275 | 0.3848 | 0.8378 | 0.8507 | 0.7675 | 0.8162 | 0.7346 | 0.7701 | 0.8647 | 0.8962 |
| 0.2585 | 72.0 | 36000 | 0.2694 | 0.8046 | 0.9597 | 0.9051 | 0.2921 | 0.8163 | 0.8159 | 0.307 | 0.837 | 0.8427 | 0.3862 | 0.8572 | 0.842 | 0.7959 | 0.8417 | 0.7506 | 0.7887 | 0.8674 | 0.8977 |
| 0.2707 | 73.0 | 36500 | 0.2479 | 0.8165 | 0.971 | 0.9107 | 0.3561 | 0.8219 | 0.8195 | 0.3061 | 0.8462 | 0.8538 | 0.4729 | 0.8638 | 0.8536 | 0.7914 | 0.8333 | 0.7805 | 0.8175 | 0.8777 | 0.9104 |
| 0.2503 | 74.0 | 37000 | 0.2666 | 0.8225 | 0.9662 | 0.9142 | 0.375 | 0.8339 | 0.8479 | 0.3097 | 0.8526 | 0.8585 | 0.4516 | 0.8713 | 0.8751 | 0.8 | 0.8439 | 0.7995 | 0.8299 | 0.8679 | 0.9017 |
| 0.2435 | 75.0 | 37500 | 0.2530 | 0.8135 | 0.9734 | 0.913 | 0.3323 | 0.832 | 0.8232 | 0.3045 | 0.8405 | 0.8474 | 0.4187 | 0.8678 | 0.8424 | 0.8041 | 0.843 | 0.7584 | 0.7907 | 0.8779 | 0.9084 |
| 0.2311 | 76.0 | 38000 | 0.2581 | 0.82 | 0.9607 | 0.9149 | 0.383 | 0.8297 | 0.8657 | 0.3063 | 0.8496 | 0.8553 | 0.4719 | 0.8653 | 0.8931 | 0.8105 | 0.8539 | 0.7784 | 0.8082 | 0.8711 | 0.9038 |
| 0.2483 | 77.0 | 38500 | 0.2830 | 0.8084 | 0.9654 | 0.9087 | 0.34 | 0.8272 | 0.8526 | 0.3068 | 0.8397 | 0.8463 | 0.4403 | 0.866 | 0.8728 | 0.7967 | 0.8404 | 0.7594 | 0.7959 | 0.8692 | 0.9026 |
| 0.3111 | 78.0 | 39000 | 0.2762 | 0.8095 | 0.9605 | 0.9036 | 0.3142 | 0.8246 | 0.8748 | 0.3054 | 0.8383 | 0.8449 | 0.3989 | 0.862 | 0.8954 | 0.8002 | 0.8425 | 0.7592 | 0.7907 | 0.8691 | 0.9014 |
| 0.2526 | 79.0 | 39500 | 0.2583 | 0.8148 | 0.9573 | 0.9147 | 0.3846 | 0.8293 | 0.8642 | 0.3042 | 0.8421 | 0.8503 | 0.4497 | 0.866 | 0.8839 | 0.7984 | 0.843 | 0.7691 | 0.801 | 0.877 | 0.9069 |
| 0.2369 | 80.0 | 40000 | 0.2718 | 0.8094 | 0.9558 | 0.9041 | 0.2824 | 0.8265 | 0.8939 | 0.3044 | 0.8386 | 0.8447 | 0.3975 | 0.8604 | 0.9112 | 0.8047 | 0.8456 | 0.7587 | 0.7918 | 0.8647 | 0.8968 |
| 0.2489 | 81.0 | 40500 | 0.2595 | 0.8181 | 0.9593 | 0.914 | 0.3263 | 0.83 | 0.8471 | 0.3075 | 0.8455 | 0.8518 | 0.4137 | 0.8663 | 0.8686 | 0.8116 | 0.8474 | 0.7701 | 0.8031 | 0.8727 | 0.9049 |
| 0.2473 | 82.0 | 41000 | 0.2571 | 0.8139 | 0.9579 | 0.9144 | 0.3845 | 0.8209 | 0.8674 | 0.3034 | 0.8423 | 0.8486 | 0.4654 | 0.8572 | 0.8893 | 0.8031 | 0.8421 | 0.7656 | 0.8 | 0.8732 | 0.9038 |
| 0.2458 | 83.0 | 41500 | 0.2593 | 0.8127 | 0.9534 | 0.9088 | 0.3364 | 0.8221 | 0.8661 | 0.3072 | 0.8419 | 0.8499 | 0.449 | 0.8584 | 0.8821 | 0.7992 | 0.8452 | 0.7644 | 0.8 | 0.8744 | 0.9046 |
| 0.2541 | 84.0 | 42000 | 0.2355 | 0.8216 | 0.9615 | 0.9242 | 0.4101 | 0.837 | 0.8918 | 0.3079 | 0.8493 | 0.8577 | 0.469 | 0.8705 | 0.9133 | 0.8032 | 0.85 | 0.7743 | 0.8062 | 0.8873 | 0.9171 |
| 0.2492 | 85.0 | 42500 | 0.2382 | 0.8326 | 0.9598 | 0.9181 | 0.4088 | 0.8463 | 0.8602 | 0.31 | 0.8587 | 0.8668 | 0.523 | 0.879 | 0.8785 | 0.8142 | 0.8579 | 0.8008 | 0.8278 | 0.8828 | 0.9147 |
| 0.2452 | 86.0 | 43000 | 0.2320 | 0.8279 | 0.961 | 0.9274 | 0.4008 | 0.8426 | 0.8327 | 0.3076 | 0.855 | 0.8629 | 0.5048 | 0.8756 | 0.874 | 0.8156 | 0.8618 | 0.7871 | 0.8134 | 0.8809 | 0.9136 |
| 0.2256 | 87.0 | 43500 | 0.2459 | 0.8166 | 0.9583 | 0.9184 | 0.3619 | 0.8318 | 0.8595 | 0.3012 | 0.8453 | 0.8526 | 0.4719 | 0.8673 | 0.8846 | 0.8093 | 0.8557 | 0.77 | 0.7938 | 0.8706 | 0.9084 |
| 0.2496 | 88.0 | 44000 | 0.2447 | 0.8135 | 0.9564 | 0.9078 | 0.3749 | 0.8285 | 0.8531 | 0.3045 | 0.8453 | 0.8551 | 0.4708 | 0.8687 | 0.8782 | 0.8016 | 0.8548 | 0.7654 | 0.799 | 0.8734 | 0.9116 |
| 0.2172 | 89.0 | 44500 | 0.2376 | 0.8192 | 0.9676 | 0.9264 | 0.4072 | 0.8306 | 0.8319 | 0.3046 | 0.8483 | 0.8553 | 0.509 | 0.8675 | 0.8639 | 0.8129 | 0.8535 | 0.7728 | 0.8041 | 0.872 | 0.9084 |
| 0.2778 | 90.0 | 45000 | 0.2361 | 0.8261 | 0.9642 | 0.9185 | 0.392 | 0.8394 | 0.8524 | 0.3092 | 0.8529 | 0.8602 | 0.4825 | 0.8724 | 0.877 | 0.8169 | 0.861 | 0.788 | 0.8124 | 0.8734 | 0.9072 |
| 0.223 | 91.0 | 45500 | 0.2505 | 0.8261 | 0.9679 | 0.9274 | 0.3941 | 0.8407 | 0.8661 | 0.3054 | 0.8536 | 0.8608 | 0.4989 | 0.8737 | 0.8911 | 0.813 | 0.8583 | 0.7795 | 0.8082 | 0.8856 | 0.9159 |
| 0.2241 | 92.0 | 46000 | 0.2486 | 0.8201 | 0.962 | 0.929 | 0.3875 | 0.8342 | 0.8669 | 0.3041 | 0.8496 | 0.8571 | 0.4873 | 0.8701 | 0.8918 | 0.807 | 0.8518 | 0.7676 | 0.8021 | 0.8856 | 0.9173 |
| 0.2465 | 93.0 | 46500 | 0.2443 | 0.8274 | 0.9602 | 0.9227 | 0.3638 | 0.8447 | 0.8942 | 0.3079 | 0.8522 | 0.8599 | 0.4797 | 0.8741 | 0.9122 | 0.8219 | 0.8632 | 0.7749 | 0.801 | 0.8854 | 0.9156 |
| 0.2513 | 94.0 | 47000 | 0.2493 | 0.8188 | 0.9637 | 0.9304 | 0.4143 | 0.8287 | 0.8448 | 0.3039 | 0.846 | 0.8558 | 0.5559 | 0.8644 | 0.8809 | 0.8109 | 0.8544 | 0.7691 | 0.8041 | 0.8764 | 0.909 |
| 0.2251 | 95.0 | 47500 | 0.2414 | 0.8218 | 0.9599 | 0.9229 | 0.3853 | 0.8393 | 0.8641 | 0.304 | 0.8507 | 0.8573 | 0.4756 | 0.8728 | 0.8939 | 0.8112 | 0.8553 | 0.7759 | 0.8041 | 0.8783 | 0.9124 |
| 0.2473 | 96.0 | 48000 | 0.2571 | 0.8181 | 0.957 | 0.9135 | 0.3424 | 0.8368 | 0.8717 | 0.3061 | 0.8454 | 0.8541 | 0.47 | 0.8713 | 0.8973 | 0.8045 | 0.85 | 0.7723 | 0.8021 | 0.8774 | 0.9101 |
| 0.23 | 97.0 | 48500 | 0.2362 | 0.8293 | 0.9631 | 0.9154 | 0.3715 | 0.8414 | 0.8603 | 0.3115 | 0.8557 | 0.8646 | 0.513 | 0.8766 | 0.8791 | 0.8114 | 0.8566 | 0.7916 | 0.8216 | 0.885 | 0.9156 |
| 0.2115 | 98.0 | 49000 | 0.2341 | 0.828 | 0.9636 | 0.9116 | 0.3231 | 0.8434 | 0.8604 | 0.3089 | 0.8542 | 0.862 | 0.4581 | 0.8761 | 0.8799 | 0.8214 | 0.8654 | 0.7776 | 0.8052 | 0.8849 | 0.9156 |
| 0.2281 | 99.0 | 49500 | 0.2287 | 0.8254 | 0.9729 | 0.9287 | 0.4068 | 0.8346 | 0.8709 | 0.3102 | 0.8542 | 0.8614 | 0.4911 | 0.8711 | 0.8873 | 0.812 | 0.8566 | 0.7823 | 0.8155 | 0.882 | 0.9121 |
| 0.2183 | 100.0 | 50000 | 0.2383 | 0.8248 | 0.9626 | 0.9233 | 0.3947 | 0.8319 | 0.8726 | 0.3071 | 0.8508 | 0.8584 | 0.5062 | 0.8665 | 0.8924 | 0.8154 | 0.861 | 0.7742 | 0.8 | 0.8849 | 0.9142 |
| 0.2259 | 101.0 | 50500 | 0.2291 | 0.837 | 0.9651 | 0.9289 | 0.3979 | 0.8485 | 0.8745 | 0.3113 | 0.8615 | 0.8699 | 0.5163 | 0.881 | 0.9002 | 0.8291 | 0.8711 | 0.794 | 0.8227 | 0.8879 | 0.9159 |
| 0.2213 | 102.0 | 51000 | 0.2472 | 0.8261 | 0.9706 | 0.9235 | 0.3687 | 0.8347 | 0.8724 | 0.3074 | 0.8525 | 0.8599 | 0.4962 | 0.868 | 0.8961 | 0.8135 | 0.8566 | 0.7778 | 0.8103 | 0.8871 | 0.9127 |
| 0.229 | 103.0 | 51500 | 0.2338 | 0.8285 | 0.9699 | 0.9166 | 0.3596 | 0.8393 | 0.8645 | 0.3064 | 0.8551 | 0.8637 | 0.5306 | 0.8755 | 0.8833 | 0.8253 | 0.8697 | 0.7703 | 0.8041 | 0.8898 | 0.9173 |
| 0.2281 | 104.0 | 52000 | 0.2300 | 0.8339 | 0.9635 | 0.9255 | 0.3881 | 0.8448 | 0.8693 | 0.3098 | 0.8596 | 0.8678 | 0.4989 | 0.8774 | 0.8845 | 0.8208 | 0.8662 | 0.7905 | 0.8206 | 0.8904 | 0.9165 |
| 0.209 | 105.0 | 52500 | 0.2423 | 0.8171 | 0.9679 | 0.9212 | 0.3746 | 0.8249 | 0.8637 | 0.3033 | 0.8451 | 0.8537 | 0.4962 | 0.8608 | 0.8826 | 0.8078 | 0.8539 | 0.7647 | 0.7979 | 0.8788 | 0.9092 |
| 0.2213 | 106.0 | 53000 | 0.2394 | 0.8256 | 0.9635 | 0.9075 | 0.3323 | 0.8394 | 0.8743 | 0.3078 | 0.8508 | 0.859 | 0.4359 | 0.8722 | 0.8881 | 0.811 | 0.8566 | 0.778 | 0.8062 | 0.888 | 0.9142 |
| 0.227 | 107.0 | 53500 | 0.2417 | 0.8279 | 0.9694 | 0.9158 | 0.3544 | 0.8416 | 0.8716 | 0.3085 | 0.8556 | 0.8637 | 0.4657 | 0.8761 | 0.8875 | 0.8167 | 0.8614 | 0.7846 | 0.8165 | 0.8825 | 0.9133 |
| 0.2084 | 108.0 | 54000 | 0.2341 | 0.8277 | 0.9711 | 0.9257 | 0.3836 | 0.8382 | 0.8664 | 0.3067 | 0.8539 | 0.8629 | 0.4692 | 0.8724 | 0.8838 | 0.8085 | 0.8596 | 0.7903 | 0.8165 | 0.8843 | 0.9124 |
| 0.2095 | 109.0 | 54500 | 0.2300 | 0.8307 | 0.964 | 0.9304 | 0.4225 | 0.8405 | 0.8951 | 0.3072 | 0.8592 | 0.8664 | 0.494 | 0.8755 | 0.9142 | 0.8151 | 0.8596 | 0.7875 | 0.8216 | 0.8896 | 0.9179 |
| 0.2103 | 110.0 | 55000 | 0.2246 | 0.8273 | 0.9684 | 0.9252 | 0.4302 | 0.8377 | 0.8706 | 0.3084 | 0.8589 | 0.8661 | 0.5087 | 0.878 | 0.8973 | 0.8141 | 0.8592 | 0.7812 | 0.8186 | 0.8866 | 0.9205 |
| 0.2245 | 111.0 | 55500 | 0.2338 | 0.8309 | 0.9709 | 0.9249 | 0.4221 | 0.8386 | 0.8575 | 0.3068 | 0.8595 | 0.867 | 0.5368 | 0.8744 | 0.8762 | 0.8142 | 0.8588 | 0.7897 | 0.8247 | 0.8887 | 0.9173 |
| 0.2281 | 112.0 | 56000 | 0.2262 | 0.8354 | 0.9673 | 0.9293 | 0.432 | 0.8445 | 0.8699 | 0.3067 | 0.8635 | 0.8715 | 0.5317 | 0.8806 | 0.8851 | 0.8237 | 0.8719 | 0.7926 | 0.8227 | 0.89 | 0.9199 |
| 0.2205 | 113.0 | 56500 | 0.2347 | 0.823 | 0.9683 | 0.9043 | 0.3592 | 0.8366 | 0.8696 | 0.3071 | 0.8519 | 0.86 | 0.4729 | 0.8716 | 0.8896 | 0.8103 | 0.8588 | 0.7685 | 0.8021 | 0.8903 | 0.9191 |
| 0.2374 | 114.0 | 57000 | 0.2360 | 0.8295 | 0.9629 | 0.9078 | 0.3395 | 0.8457 | 0.8675 | 0.3095 | 0.8598 | 0.8676 | 0.4697 | 0.881 | 0.8841 | 0.8151 | 0.8667 | 0.7891 | 0.8206 | 0.8842 | 0.9156 |
| 0.204 | 115.0 | 57500 | 0.2358 | 0.8231 | 0.9718 | 0.9045 | 0.3297 | 0.8352 | 0.8432 | 0.3064 | 0.8518 | 0.8601 | 0.4546 | 0.8713 | 0.8681 | 0.8017 | 0.8504 | 0.7839 | 0.8155 | 0.8836 | 0.9145 |
| 0.2502 | 116.0 | 58000 | 0.2258 | 0.831 | 0.9681 | 0.921 | 0.4177 | 0.8448 | 0.8674 | 0.3096 | 0.8606 | 0.8691 | 0.5117 | 0.882 | 0.8842 | 0.8169 | 0.8654 | 0.787 | 0.8227 | 0.889 | 0.9194 |
| 0.238 | 117.0 | 58500 | 0.2288 | 0.8291 | 0.9616 | 0.9182 | 0.4143 | 0.8382 | 0.9019 | 0.3079 | 0.8568 | 0.865 | 0.4976 | 0.8735 | 0.9192 | 0.824 | 0.8697 | 0.7749 | 0.8062 | 0.8885 | 0.9191 |
| 0.2226 | 118.0 | 59000 | 0.2374 | 0.8306 | 0.9665 | 0.9135 | 0.3645 | 0.8445 | 0.8667 | 0.311 | 0.862 | 0.8692 | 0.4611 | 0.8811 | 0.8849 | 0.816 | 0.8671 | 0.7866 | 0.8237 | 0.8893 | 0.9168 |
| 0.2259 | 119.0 | 59500 | 0.2232 | 0.8308 | 0.9663 | 0.9136 | 0.3674 | 0.844 | 0.8746 | 0.3108 | 0.8612 | 0.8687 | 0.4486 | 0.881 | 0.8903 | 0.8175 | 0.8693 | 0.7854 | 0.8175 | 0.8894 | 0.9194 |
| 0.2047 | 120.0 | 60000 | 0.2327 | 0.829 | 0.9629 | 0.9237 | 0.3909 | 0.8393 | 0.8677 | 0.3065 | 0.8587 | 0.8662 | 0.4817 | 0.876 | 0.8855 | 0.8135 | 0.8654 | 0.7854 | 0.8155 | 0.888 | 0.9179 |
| 0.2223 | 121.0 | 60500 | 0.2336 | 0.8325 | 0.9631 | 0.9231 | 0.4079 | 0.8459 | 0.8669 | 0.309 | 0.8616 | 0.8694 | 0.4995 | 0.8792 | 0.8847 | 0.8203 | 0.8697 | 0.7898 | 0.8216 | 0.8873 | 0.9168 |
| 0.1936 | 122.0 | 61000 | 0.2319 | 0.8321 | 0.964 | 0.9165 | 0.3641 | 0.8493 | 0.8615 | 0.3108 | 0.8599 | 0.8685 | 0.4417 | 0.8819 | 0.8814 | 0.8139 | 0.8632 | 0.7924 | 0.8237 | 0.89 | 0.9185 |
| 0.2038 | 123.0 | 61500 | 0.2330 | 0.8267 | 0.9608 | 0.9099 | 0.35 | 0.8442 | 0.8677 | 0.3092 | 0.8565 | 0.8646 | 0.433 | 0.8791 | 0.8852 | 0.8105 | 0.8627 | 0.7817 | 0.8134 | 0.8879 | 0.9176 |
| 0.2036 | 124.0 | 62000 | 0.2300 | 0.8318 | 0.9631 | 0.9257 | 0.3749 | 0.8444 | 0.8667 | 0.3108 | 0.8599 | 0.8687 | 0.4695 | 0.8799 | 0.8866 | 0.8169 | 0.8645 | 0.7867 | 0.8206 | 0.8918 | 0.9211 |
| 0.1994 | 125.0 | 62500 | 0.2295 | 0.8299 | 0.9608 | 0.9107 | 0.3536 | 0.848 | 0.8695 | 0.3095 | 0.86 | 0.8682 | 0.4451 | 0.8833 | 0.8856 | 0.8142 | 0.8658 | 0.7852 | 0.8186 | 0.8904 | 0.9202 |
| 0.2026 | 126.0 | 63000 | 0.2343 | 0.829 | 0.9626 | 0.9233 | 0.3791 | 0.844 | 0.8677 | 0.308 | 0.8592 | 0.8667 | 0.4624 | 0.879 | 0.8844 | 0.8149 | 0.8671 | 0.7859 | 0.8175 | 0.8862 | 0.9156 |
| 0.2 | 127.0 | 63500 | 0.2327 | 0.8339 | 0.9658 | 0.9133 | 0.3706 | 0.8496 | 0.878 | 0.311 | 0.8629 | 0.8713 | 0.4579 | 0.8839 | 0.9031 | 0.8217 | 0.8711 | 0.7882 | 0.8216 | 0.8917 | 0.9211 |
| 0.2144 | 128.0 | 64000 | 0.2318 | 0.8291 | 0.9655 | 0.9154 | 0.3751 | 0.844 | 0.877 | 0.3098 | 0.8591 | 0.8678 | 0.4717 | 0.8796 | 0.9019 | 0.8184 | 0.8693 | 0.7795 | 0.8155 | 0.8893 | 0.9185 |
| 0.2059 | 129.0 | 64500 | 0.2329 | 0.8341 | 0.9658 | 0.9201 | 0.384 | 0.8478 | 0.8706 | 0.3099 | 0.8624 | 0.871 | 0.4875 | 0.881 | 0.8984 | 0.8218 | 0.8706 | 0.7891 | 0.8227 | 0.8913 | 0.9197 |
| 0.1823 | 130.0 | 65000 | 0.2337 | 0.8335 | 0.9655 | 0.9249 | 0.3836 | 0.8476 | 0.8754 | 0.3092 | 0.8624 | 0.8714 | 0.476 | 0.8829 | 0.8906 | 0.8168 | 0.8697 | 0.7905 | 0.8227 | 0.893 | 0.9217 |
| 0.2173 | 131.0 | 65500 | 0.2296 | 0.8324 | 0.9667 | 0.9218 | 0.3771 | 0.8467 | 0.8773 | 0.3092 | 0.8616 | 0.8698 | 0.4705 | 0.881 | 0.9024 | 0.8186 | 0.8693 | 0.788 | 0.8206 | 0.8907 | 0.9194 |
| 0.2017 | 132.0 | 66000 | 0.2282 | 0.8301 | 0.9667 | 0.9236 | 0.3857 | 0.8453 | 0.8682 | 0.3096 | 0.8605 | 0.8688 | 0.479 | 0.8803 | 0.8855 | 0.822 | 0.8728 | 0.7787 | 0.8144 | 0.8894 | 0.9191 |
| 0.1973 | 133.0 | 66500 | 0.2262 | 0.8327 | 0.9668 | 0.9231 | 0.3903 | 0.8475 | 0.8708 | 0.3106 | 0.8638 | 0.8718 | 0.4762 | 0.8828 | 0.8986 | 0.8198 | 0.8702 | 0.7871 | 0.8247 | 0.8913 | 0.9205 |
| 0.2199 | 134.0 | 67000 | 0.2275 | 0.8325 | 0.9669 | 0.9223 | 0.3866 | 0.8461 | 0.878 | 0.3097 | 0.8628 | 0.8709 | 0.4716 | 0.8813 | 0.9031 | 0.8197 | 0.8702 | 0.7858 | 0.8216 | 0.892 | 0.9208 |
| 0.2184 | 135.0 | 67500 | 0.2289 | 0.8334 | 0.9659 | 0.9238 | 0.3801 | 0.8461 | 0.871 | 0.3085 | 0.8627 | 0.8708 | 0.47 | 0.8811 | 0.8988 | 0.8215 | 0.8697 | 0.7881 | 0.8227 | 0.8904 | 0.9199 |
| 0.1923 | 136.0 | 68000 | 0.2295 | 0.8342 | 0.9658 | 0.9282 | 0.4046 | 0.8454 | 0.8787 | 0.3096 | 0.8635 | 0.8714 | 0.4843 | 0.8808 | 0.9031 | 0.8221 | 0.8711 | 0.7878 | 0.8227 | 0.8926 | 0.9205 |
| 0.2154 | 137.0 | 68500 | 0.2292 | 0.8336 | 0.9652 | 0.9277 | 0.3918 | 0.847 | 0.8714 | 0.3089 | 0.8627 | 0.8705 | 0.4765 | 0.8813 | 0.8986 | 0.8232 | 0.8719 | 0.7862 | 0.8196 | 0.8912 | 0.9199 |
| 0.1954 | 138.0 | 69000 | 0.2295 | 0.8327 | 0.9668 | 0.9283 | 0.392 | 0.8453 | 0.8628 | 0.3085 | 0.861 | 0.869 | 0.4767 | 0.8789 | 0.8818 | 0.8228 | 0.8719 | 0.785 | 0.8165 | 0.8903 | 0.9185 |
| 0.2247 | 139.0 | 69500 | 0.2279 | 0.8338 | 0.9668 | 0.924 | 0.3881 | 0.8476 | 0.8625 | 0.309 | 0.8623 | 0.87 | 0.4729 | 0.8814 | 0.8818 | 0.8217 | 0.8715 | 0.7876 | 0.8186 | 0.8922 | 0.9199 |
| 0.2036 | 140.0 | 70000 | 0.2297 | 0.833 | 0.9668 | 0.9249 | 0.3989 | 0.8455 | 0.8692 | 0.3097 | 0.8613 | 0.8689 | 0.4806 | 0.8795 | 0.8859 | 0.8214 | 0.8697 | 0.7864 | 0.8175 | 0.8912 | 0.9194 |
| 0.1877 | 141.0 | 70500 | 0.2284 | 0.8348 | 0.9667 | 0.9249 | 0.3994 | 0.8487 | 0.8618 | 0.3101 | 0.8633 | 0.8708 | 0.4806 | 0.8821 | 0.8814 | 0.822 | 0.8711 | 0.7921 | 0.8227 | 0.8902 | 0.9188 |
| 0.2172 | 142.0 | 71000 | 0.2302 | 0.833 | 0.9667 | 0.9249 | 0.3958 | 0.8468 | 0.8766 | 0.3098 | 0.8609 | 0.8684 | 0.4725 | 0.8801 | 0.9021 | 0.8208 | 0.8675 | 0.7873 | 0.8186 | 0.8909 | 0.9191 |
| 0.2075 | 143.0 | 71500 | 0.2295 | 0.8336 | 0.9667 | 0.9217 | 0.3773 | 0.8481 | 0.8767 | 0.309 | 0.8623 | 0.8699 | 0.4624 | 0.8819 | 0.9023 | 0.8211 | 0.8689 | 0.7898 | 0.8216 | 0.89 | 0.9191 |
| 0.2151 | 144.0 | 72000 | 0.2293 | 0.8337 | 0.9667 | 0.9217 | 0.381 | 0.8486 | 0.8767 | 0.3093 | 0.8628 | 0.8702 | 0.4648 | 0.882 | 0.9024 | 0.822 | 0.8706 | 0.7886 | 0.8206 | 0.8904 | 0.9194 |
| 0.2604 | 145.0 | 72500 | 0.2288 | 0.8325 | 0.9667 | 0.9216 | 0.3788 | 0.848 | 0.8695 | 0.3091 | 0.8624 | 0.8698 | 0.4592 | 0.8822 | 0.8978 | 0.8211 | 0.8706 | 0.7884 | 0.8206 | 0.8881 | 0.9182 |
| 0.2113 | 146.0 | 73000 | 0.2287 | 0.8329 | 0.9667 | 0.9216 | 0.384 | 0.8477 | 0.8697 | 0.3089 | 0.8621 | 0.8696 | 0.4616 | 0.8816 | 0.898 | 0.8222 | 0.8706 | 0.7881 | 0.8196 | 0.8885 | 0.9185 |
| 0.2192 | 147.0 | 73500 | 0.2292 | 0.8329 | 0.9667 | 0.9216 | 0.384 | 0.8476 | 0.8696 | 0.3088 | 0.8623 | 0.8697 | 0.4616 | 0.8818 | 0.898 | 0.8221 | 0.8711 | 0.7881 | 0.8196 | 0.8884 | 0.9185 |
| 0.2251 | 148.0 | 74000 | 0.2292 | 0.8326 | 0.9667 | 0.9216 | 0.3829 | 0.8476 | 0.8697 | 0.3088 | 0.8621 | 0.8696 | 0.4616 | 0.8816 | 0.898 | 0.8212 | 0.8706 | 0.7881 | 0.8196 | 0.8885 | 0.9185 |
| 0.2084 | 149.0 | 74500 | 0.2292 | 0.8325 | 0.9667 | 0.9216 | 0.3829 | 0.8475 | 0.8697 | 0.3088 | 0.862 | 0.8694 | 0.4616 | 0.8815 | 0.898 | 0.821 | 0.8702 | 0.7881 | 0.8196 | 0.8885 | 0.9185 |
| 0.2152 | 150.0 | 75000 | 0.2292 | 0.8325 | 0.9667 | 0.9216 | 0.3829 | 0.8475 | 0.8697 | 0.3088 | 0.862 | 0.8694 | 0.4616 | 0.8815 | 0.898 | 0.821 | 0.8702 | 0.7881 | 0.8196 | 0.8885 | 0.9185 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.1
| [
"chicken",
"duck",
"plant"
] |
3essa214142/detr-finetuned-malaria-v2 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
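Until the authors fill this section in, here is a minimal, hedged sketch of how a checkpoint like this is usually loaded with 🤗 Transformers. The repository ID is taken from this card, but everything else (the assumption that this is a DETR-style object-detection model, suggested only by the repository name and the `label_0`–`label_2` classes, plus the input image path and the 0.5 score threshold) is illustrative and unverified.
```python
# Hedged sketch, not an official example: assumes this repo is a DETR-style
# object-detection checkpoint; the image path and threshold are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "3essa214142/detr-finetuned-malaria-v2"  # repository ID from this card
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("blood_smear.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) tuples in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```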
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2"
] |
3essa214142/detr-finetuned-malaria-v3 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
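No usage snippet has been provided yet. As a hedged placeholder, the `pipeline` API below is the shortest way to try the checkpoint, assuming it is an object-detection model (suggested only by the repository name); the image path and score threshold are illustrative.
```python
# Hedged sketch: high-level pipeline usage, assuming an object-detection checkpoint.
from transformers import pipeline

detector = pipeline("object-detection", model="3essa214142/detr-finetuned-malaria-v3")

# "sample.jpg" is a hypothetical local image; an image URL also works.
for det in detector("sample.jpg", threshold=0.5):
    print(det["label"], round(det["score"], 3), det["box"])
```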
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
3essa214142/detr-finetuned-malaria-v4 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
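This section is still a placeholder. A hedged sketch for visualising predictions is shown below; it assumes a DETR-style object-detection checkpoint (inferred from the repository name only), and the file names and threshold are illustrative.
```python
# Hedged sketch: draw predicted boxes onto the input image. Assumes a DETR-style
# object-detection checkpoint; "slide.png" and the threshold are placeholders.
import torch
from PIL import Image, ImageDraw
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "3essa214142/detr-finetuned-malaria-v4"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("slide.png").convert("RGB")
with torch.no_grad():
    outputs = model(**processor(images=image, return_tensors="pt"))

result = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=torch.tensor([image.size[::-1]])
)[0]

draw = ImageDraw.Draw(image)
for score, label, box in zip(result["scores"], result["labels"], result["boxes"]):
    x0, y0, x1, y1 = box.tolist()
    draw.rectangle((x0, y0, x1, y1), outline="red", width=2)
    draw.text((x0, y0), f"{model.config.id2label[label.item()]} {score:.2f}", fill="red")
image.save("slide_annotated.png")
```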
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
Rookieez/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1426
- Map: 0.0
- Map 50: 0.0
- Map 75: 0.0
- Map Small: -1.0
- Map Medium: 0.0
- Map Large: -1.0
- Mar 1: 0.0
- Mar 10: 0.0
- Mar 100: 0.0
- Mar Small: -1.0
- Mar Medium: 0.0
- Mar Large: -1.0
- Map Grey Star: -1.0
- Mar 100 Grey Star: -1.0
- Map Insect: 0.0
- Mar 100 Insect: 0.0
- Map Moon: 0.0
- Mar 100 Moon: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
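The listed hyperparameters can also be reproduced outside the `Trainer`. The sketch below rebuilds the same AdamW optimizer and cosine schedule by hand; the 9 steps per epoch come from the step column in the results table that follows, while the base checkpoint, the label count, and the omitted training loop are assumptions.
```python
# Hedged sketch: the optimizer/scheduler described above, built by hand.
# The base checkpoint and label count are assumptions; the training loop is omitted.
from torch.optim import AdamW
from transformers import AutoModelForObjectDetection, get_cosine_schedule_with_warmup

model = AutoModelForObjectDetection.from_pretrained(
    "facebook/detr-resnet-50",
    num_labels=8,                  # matches the 8 labels listed for this model
    ignore_mismatched_sizes=True,  # swap out the COCO classification head
)

optimizer = AdamW(model.parameters(), lr=5e-5, betas=(0.9, 0.999), eps=1e-8)

num_epochs = 30
steps_per_epoch = 9  # 270 total steps over 30 epochs, per the results table below
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=num_epochs * steps_per_epoch
)

# Inside the (omitted) training loop:
#   loss.backward(); optimizer.step(); scheduler.step(); optimizer.zero_grad()
```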
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Black Star | Mar 100 Black Star | Map Cat | Mar 100 Cat | Map Grey Star | Mar 100 Grey Star | Map Insect | Mar 100 Insect | Map Moon | Mar 100 Moon | Map Unicorn Head | Mar 100 Unicorn Head | Map Unicorn Whole | Mar 100 Unicorn Whole |
|:-------------:|:-----:|:----:|:---------------:|:---:|:------:|:------:|:---------:|:----------:|:---------:|:-----:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------:|:------------------:|:-------:|:-----------:|:-------------:|:-----------------:|:----------:|:--------------:|:--------:|:------------:|:----------------:|:--------------------:|:-----------------:|:---------------------:|
| No log | 1.0 | 9 | 3.0843 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 2.0 | 18 | 2.8919 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 3.0 | 27 | 2.7564 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 4.0 | 36 | 2.7363 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 5.0 | 45 | 2.6145 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 6.0 | 54 | 2.5328 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 7.0 | 63 | 2.5044 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 8.0 | 72 | 2.4637 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 9.0 | 81 | 2.5407 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 10.0 | 90 | 2.4309 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 11.0 | 99 | 2.4180 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 12.0 | 108 | 2.5998 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 13.0 | 117 | 2.4657 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 14.0 | 126 | 2.3398 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 15.0 | 135 | 2.3136 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 16.0 | 144 | 2.2952 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 17.0 | 153 | 2.3616 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 18.0 | 162 | 2.4010 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 19.0 | 171 | 2.3679 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 20.0 | 180 | 2.3450 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 21.0 | 189 | 2.3824 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 22.0 | 198 | 2.2668 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 23.0 | 207 | 2.1832 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 24.0 | 216 | 2.1715 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 25.0 | 225 | 2.1695 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 26.0 | 234 | 2.1456 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 27.0 | 243 | 2.1490 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 28.0 | 252 | 2.1432 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 29.0 | 261 | 2.1424 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 30.0 | 270 | 2.1426 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.46.1
- Pytorch 2.5.1
- Datasets 3.1.0
- Tokenizers 0.20.2
| [
"black_star",
"cat",
"grey_star",
"insect",
"moon",
"owl",
"unicorn_head",
"unicorn_whole"
] |
atmiaxue/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 0
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cpu
- Datasets 2.19.2
- Tokenizers 0.20.3
| [
"black_star",
"cat",
"grey_star",
"insect",
"moon",
"owl",
"unicorn_head",
"unicorn_whole"
] |
BjngChjjljng/detr-finetuned_v3 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
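No code has been added here yet. As a hedged starting point, the sketch below simply loads the checkpoint and inspects its configuration; the expectation that `model_type` is `detr` comes only from the repository name, not from this card.
```python
# Hedged sketch: inspect the checkpoint before using it. The "detr" expectation
# is an assumption based on the repository name, not on this card.
from transformers import AutoConfig, AutoModelForObjectDetection

repo_id = "BjngChjjljng/detr-finetuned_v3"
config = AutoConfig.from_pretrained(repo_id)
print(config.model_type)   # expected: "detr"
print(config.id2label)     # e.g. {0: "label_0", 1: "label_1", ...}

model = AutoModelForObjectDetection.from_pretrained(repo_id)
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.1f}M parameters")
```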
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
pabloOmega/equations-detection |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
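This section has not been filled in. Given the repository name, the model presumably detects equation regions in document images; the hedged sketch below crops each detected region out of a page image. The file names and threshold are illustrative, and the object-detection assumption is unverified.
```python
# Hedged sketch: crop each detected region (presumably an equation, judging by the
# repository name) out of a page image. File names and threshold are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "pabloOmega/equations-detection"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

page = Image.open("page.png").convert("RGB")
with torch.no_grad():
    outputs = model(**processor(images=page, return_tensors="pt"))

result = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=torch.tensor([page.size[::-1]])
)[0]

for i, box in enumerate(result["boxes"]):
    x0, y0, x1, y1 = (int(v) for v in box.tolist())
    page.crop((x0, y0, x1, y1)).save(f"equation_{i}.png")
```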
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
joe611/chickens-composite-201616161616-150-epochs-w-hybrid-transform-metrics-test |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# chickens-composite-201616161616-150-epochs-w-hybrid-transform-metrics-test
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2826
- Map: 0.8176
- Map 50: 0.9579
- Map 75: 0.9306
- Map Small: 0.3875
- Map Medium: 0.8219
- Map Large: 0.8195
- Mar 1: 0.326
- Mar 10: 0.8548
- Mar 100: 0.8598
- Mar Small: 0.5
- Mar Medium: 0.8639
- Mar Large: 0.8516
- Map Chicken: 0.8172
- Mar 100 Chicken: 0.8631
- Map Duck: 0.7548
- Mar 100 Duck: 0.8103
- Map Plant: 0.8807
- Mar 100 Plant: 0.9061
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 150
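The hyperparameters listed above map directly onto `TrainingArguments`; a hedged sketch is given below. The output directory mirrors this card's title for illustration, and the dataset, image processor, and collate function that would complete the `Trainer` call are not part of this card.
```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters listed above.
# The output directory is a placeholder; dataset wiring is intentionally omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="chickens-composite-201616161616-150-epochs-w-hybrid-transform-metrics-test",
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # betas=(0.9, 0.999) and eps=1e-8 are the defaults
    lr_scheduler_type="cosine",
    num_train_epochs=150,
    remove_unused_columns=False,  # typical for detection fine-tuning; an assumption here
)
# training_args would then be passed to a Trainer together with the model,
# the image processor, a collate_fn, and the (unpublished) dataset.
```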
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.3637 | 1.0 | 500 | 1.3375 | 0.2022 | 0.2836 | 0.2296 | 0.0079 | 0.1517 | 0.2202 | 0.0888 | 0.2702 | 0.3003 | 0.0857 | 0.2804 | 0.2934 | 0.0406 | 0.1329 | 0.0 | 0.0 | 0.5659 | 0.7679 |
| 1.3442 | 2.0 | 1000 | 1.0745 | 0.2876 | 0.4185 | 0.3389 | 0.0443 | 0.2462 | 0.3099 | 0.1281 | 0.3932 | 0.4098 | 0.0719 | 0.3902 | 0.3825 | 0.2497 | 0.4857 | 0.0 | 0.0 | 0.6132 | 0.7436 |
| 0.9363 | 3.0 | 1500 | 0.8883 | 0.3672 | 0.5234 | 0.4237 | 0.0059 | 0.3185 | 0.4115 | 0.1346 | 0.4657 | 0.4762 | 0.0733 | 0.4505 | 0.5145 | 0.4617 | 0.6905 | 0.0 | 0.0 | 0.6399 | 0.7382 |
| 0.89 | 4.0 | 2000 | 0.7873 | 0.4078 | 0.5706 | 0.4813 | 0.0096 | 0.3748 | 0.4286 | 0.1375 | 0.4847 | 0.4937 | 0.0924 | 0.4683 | 0.5261 | 0.5262 | 0.7206 | 0.0 | 0.0 | 0.6972 | 0.7606 |
| 0.8308 | 5.0 | 2500 | 0.7448 | 0.4216 | 0.5847 | 0.4779 | 0.009 | 0.3892 | 0.4485 | 0.1391 | 0.4851 | 0.4939 | 0.13 | 0.4674 | 0.5251 | 0.5319 | 0.7063 | 0.0 | 0.0 | 0.7329 | 0.7755 |
| 0.778 | 6.0 | 3000 | 0.6862 | 0.4308 | 0.6027 | 0.5111 | 0.0344 | 0.4084 | 0.4212 | 0.1423 | 0.4912 | 0.4957 | 0.141 | 0.4752 | 0.5122 | 0.5414 | 0.6933 | 0.0158 | 0.0103 | 0.7352 | 0.7836 |
| 0.7088 | 7.0 | 3500 | 0.6767 | 0.4691 | 0.6565 | 0.5456 | 0.042 | 0.4453 | 0.4841 | 0.1676 | 0.5327 | 0.5368 | 0.1286 | 0.5151 | 0.5653 | 0.621 | 0.7206 | 0.0456 | 0.1082 | 0.7406 | 0.7815 |
| 0.7226 | 8.0 | 4000 | 0.6517 | 0.4727 | 0.7023 | 0.5455 | 0.0235 | 0.4553 | 0.4898 | 0.1744 | 0.5315 | 0.5381 | 0.1267 | 0.5221 | 0.5627 | 0.6112 | 0.6917 | 0.0952 | 0.1629 | 0.7116 | 0.7597 |
| 0.6153 | 9.0 | 4500 | 0.5861 | 0.502 | 0.7073 | 0.5879 | 0.0876 | 0.478 | 0.5352 | 0.1748 | 0.5451 | 0.5485 | 0.1638 | 0.5274 | 0.5779 | 0.6671 | 0.7298 | 0.0972 | 0.132 | 0.7418 | 0.7836 |
| 0.6657 | 10.0 | 5000 | 0.5325 | 0.6021 | 0.8248 | 0.706 | 0.0865 | 0.5977 | 0.5955 | 0.2438 | 0.6653 | 0.6714 | 0.2181 | 0.6692 | 0.6794 | 0.6827 | 0.7397 | 0.3456 | 0.4515 | 0.778 | 0.823 |
| 0.5313 | 11.0 | 5500 | 0.5259 | 0.624 | 0.8498 | 0.744 | 0.0867 | 0.6118 | 0.6409 | 0.2389 | 0.6733 | 0.68 | 0.2086 | 0.6693 | 0.6879 | 0.6732 | 0.7329 | 0.4293 | 0.4938 | 0.7695 | 0.8133 |
| 0.5726 | 12.0 | 6000 | 0.5454 | 0.6278 | 0.8783 | 0.7558 | 0.0719 | 0.6309 | 0.6167 | 0.25 | 0.6767 | 0.6844 | 0.2052 | 0.6879 | 0.6757 | 0.6544 | 0.7123 | 0.4677 | 0.5351 | 0.7613 | 0.8058 |
| 0.5241 | 13.0 | 6500 | 0.4902 | 0.6769 | 0.9015 | 0.8088 | 0.1959 | 0.6814 | 0.6693 | 0.278 | 0.7256 | 0.733 | 0.3619 | 0.732 | 0.7283 | 0.7033 | 0.7607 | 0.5484 | 0.6155 | 0.779 | 0.8227 |
| 0.6324 | 14.0 | 7000 | 0.4576 | 0.6937 | 0.9148 | 0.8439 | 0.1523 | 0.6985 | 0.6877 | 0.2846 | 0.7398 | 0.7457 | 0.3148 | 0.7466 | 0.7375 | 0.7229 | 0.7714 | 0.5792 | 0.6433 | 0.7791 | 0.8224 |
| 0.5425 | 15.0 | 7500 | 0.4671 | 0.6819 | 0.9082 | 0.8144 | 0.1523 | 0.6779 | 0.6896 | 0.2749 | 0.7227 | 0.7299 | 0.2771 | 0.7295 | 0.7222 | 0.6927 | 0.7437 | 0.5561 | 0.6082 | 0.7969 | 0.8379 |
| 0.4946 | 16.0 | 8000 | 0.4771 | 0.6759 | 0.9287 | 0.8051 | 0.1942 | 0.664 | 0.6773 | 0.2758 | 0.7211 | 0.7304 | 0.3419 | 0.7206 | 0.7289 | 0.6853 | 0.7429 | 0.5681 | 0.634 | 0.7741 | 0.8142 |
| 0.4592 | 17.0 | 8500 | 0.4605 | 0.6986 | 0.9305 | 0.8387 | 0.1326 | 0.696 | 0.7166 | 0.2842 | 0.7451 | 0.7496 | 0.2586 | 0.7471 | 0.7666 | 0.7156 | 0.7639 | 0.5969 | 0.6619 | 0.7834 | 0.823 |
| 0.4969 | 18.0 | 9000 | 0.4115 | 0.7144 | 0.9368 | 0.8357 | 0.1913 | 0.7049 | 0.7354 | 0.2886 | 0.7592 | 0.7666 | 0.3181 | 0.7568 | 0.7811 | 0.7215 | 0.7722 | 0.6137 | 0.6794 | 0.808 | 0.8482 |
| 0.4928 | 19.0 | 9500 | 0.4163 | 0.7035 | 0.9355 | 0.8452 | 0.1962 | 0.7061 | 0.6968 | 0.2862 | 0.7529 | 0.7584 | 0.2681 | 0.7582 | 0.7548 | 0.6955 | 0.7504 | 0.6098 | 0.6794 | 0.8051 | 0.8455 |
| 0.5201 | 20.0 | 10000 | 0.4534 | 0.6923 | 0.924 | 0.8385 | 0.2083 | 0.6837 | 0.703 | 0.2802 | 0.7335 | 0.7376 | 0.3029 | 0.7304 | 0.7484 | 0.6918 | 0.7413 | 0.5991 | 0.6526 | 0.7861 | 0.8191 |
| 0.5026 | 21.0 | 10500 | 0.3920 | 0.7237 | 0.9333 | 0.8567 | 0.2019 | 0.7263 | 0.7093 | 0.2905 | 0.7621 | 0.7662 | 0.2771 | 0.7727 | 0.749 | 0.7321 | 0.7802 | 0.6233 | 0.667 | 0.8157 | 0.8515 |
| 0.6227 | 22.0 | 11000 | 0.4082 | 0.7172 | 0.9319 | 0.8411 | 0.0973 | 0.7143 | 0.7056 | 0.2866 | 0.7566 | 0.7596 | 0.2224 | 0.759 | 0.7403 | 0.741 | 0.7841 | 0.609 | 0.6567 | 0.8016 | 0.8379 |
| 0.46 | 23.0 | 11500 | 0.4134 | 0.7211 | 0.9334 | 0.867 | 0.2071 | 0.7073 | 0.7728 | 0.2958 | 0.7614 | 0.7667 | 0.2967 | 0.7554 | 0.8106 | 0.7142 | 0.7627 | 0.6374 | 0.6887 | 0.8116 | 0.8488 |
| 0.3883 | 24.0 | 12000 | 0.4062 | 0.725 | 0.9453 | 0.8536 | 0.1324 | 0.7104 | 0.7854 | 0.3012 | 0.7681 | 0.7732 | 0.2967 | 0.7581 | 0.8239 | 0.7191 | 0.7643 | 0.6501 | 0.7134 | 0.8058 | 0.8418 |
| 0.4362 | 25.0 | 12500 | 0.3831 | 0.7137 | 0.94 | 0.8691 | 0.2175 | 0.6978 | 0.7459 | 0.2904 | 0.7618 | 0.7694 | 0.3457 | 0.7566 | 0.7837 | 0.7168 | 0.7726 | 0.6148 | 0.6814 | 0.8095 | 0.8542 |
| 0.4348 | 26.0 | 13000 | 0.3933 | 0.7303 | 0.9396 | 0.8519 | 0.2368 | 0.7314 | 0.7378 | 0.2937 | 0.7692 | 0.7761 | 0.3295 | 0.776 | 0.78 | 0.7511 | 0.7944 | 0.6293 | 0.6856 | 0.8104 | 0.8482 |
| 0.3909 | 27.0 | 13500 | 0.3736 | 0.7413 | 0.9409 | 0.8734 | 0.2818 | 0.7467 | 0.6951 | 0.2976 | 0.7825 | 0.7877 | 0.4105 | 0.7886 | 0.7395 | 0.7515 | 0.7988 | 0.6571 | 0.7124 | 0.8153 | 0.8518 |
| 0.3944 | 28.0 | 14000 | 0.3780 | 0.7305 | 0.9325 | 0.872 | 0.2714 | 0.7218 | 0.7149 | 0.2936 | 0.7701 | 0.7788 | 0.4614 | 0.7695 | 0.7534 | 0.7512 | 0.7972 | 0.6358 | 0.6959 | 0.8046 | 0.8433 |
| 0.3858 | 29.0 | 14500 | 0.3911 | 0.7133 | 0.9471 | 0.8613 | 0.1945 | 0.7042 | 0.7527 | 0.2898 | 0.7603 | 0.7666 | 0.29 | 0.7605 | 0.795 | 0.7013 | 0.7548 | 0.6264 | 0.6959 | 0.8123 | 0.8491 |
| 0.3998 | 30.0 | 15000 | 0.3728 | 0.7316 | 0.9269 | 0.8793 | 0.2131 | 0.7387 | 0.7112 | 0.2941 | 0.771 | 0.7763 | 0.3071 | 0.7887 | 0.7482 | 0.7395 | 0.7905 | 0.6496 | 0.6938 | 0.8056 | 0.8445 |
| 0.3604 | 31.0 | 15500 | 0.3699 | 0.7325 | 0.95 | 0.8751 | 0.229 | 0.7248 | 0.7741 | 0.293 | 0.7753 | 0.7801 | 0.3529 | 0.7724 | 0.8156 | 0.7261 | 0.771 | 0.6521 | 0.7144 | 0.8192 | 0.8548 |
| 0.3902 | 32.0 | 16000 | 0.3573 | 0.7518 | 0.9508 | 0.8913 | 0.197 | 0.7475 | 0.7941 | 0.3019 | 0.7907 | 0.7958 | 0.3152 | 0.7903 | 0.8282 | 0.7471 | 0.7925 | 0.6842 | 0.733 | 0.8241 | 0.8618 |
| 0.4054 | 33.0 | 16500 | 0.3459 | 0.7527 | 0.9461 | 0.903 | 0.2512 | 0.7555 | 0.7825 | 0.3012 | 0.7889 | 0.7971 | 0.4448 | 0.7978 | 0.8216 | 0.7604 | 0.8063 | 0.671 | 0.7227 | 0.8266 | 0.8621 |
| 0.4273 | 34.0 | 17000 | 0.3686 | 0.7342 | 0.9373 | 0.8812 | 0.2655 | 0.726 | 0.7763 | 0.2977 | 0.7802 | 0.7849 | 0.3833 | 0.7737 | 0.8162 | 0.7497 | 0.7972 | 0.6367 | 0.7031 | 0.8161 | 0.8542 |
| 0.4061 | 35.0 | 17500 | 0.3784 | 0.7399 | 0.9397 | 0.8932 | 0.241 | 0.7295 | 0.7874 | 0.3016 | 0.7832 | 0.7923 | 0.4824 | 0.7787 | 0.8219 | 0.7375 | 0.7857 | 0.6564 | 0.7268 | 0.8259 | 0.8642 |
| 0.3534 | 36.0 | 18000 | 0.3852 | 0.7407 | 0.9353 | 0.8765 | 0.3013 | 0.7307 | 0.7745 | 0.3057 | 0.7917 | 0.7947 | 0.4233 | 0.785 | 0.8182 | 0.7425 | 0.7937 | 0.6614 | 0.7371 | 0.8181 | 0.8533 |
| 0.3802 | 37.0 | 18500 | 0.3701 | 0.7473 | 0.951 | 0.8693 | 0.2757 | 0.7422 | 0.7779 | 0.3065 | 0.7909 | 0.7976 | 0.4676 | 0.7884 | 0.8179 | 0.7667 | 0.8071 | 0.6716 | 0.7402 | 0.8036 | 0.8455 |
| 0.4238 | 38.0 | 19000 | 0.3718 | 0.7385 | 0.9331 | 0.8621 | 0.2269 | 0.7285 | 0.7558 | 0.2966 | 0.7766 | 0.7845 | 0.411 | 0.7737 | 0.7854 | 0.7463 | 0.7937 | 0.6503 | 0.7021 | 0.8188 | 0.8579 |
| 0.384 | 39.0 | 19500 | 0.3475 | 0.7573 | 0.9468 | 0.8936 | 0.2334 | 0.7534 | 0.7792 | 0.3119 | 0.795 | 0.802 | 0.401 | 0.7979 | 0.8192 | 0.7515 | 0.7929 | 0.6929 | 0.7474 | 0.8274 | 0.8658 |
| 0.3767 | 40.0 | 20000 | 0.3546 | 0.748 | 0.9558 | 0.8997 | 0.298 | 0.7391 | 0.7842 | 0.3009 | 0.7946 | 0.8035 | 0.4962 | 0.7937 | 0.8265 | 0.7671 | 0.8123 | 0.6575 | 0.734 | 0.8193 | 0.8642 |
| 0.3587 | 41.0 | 20500 | 0.3592 | 0.75 | 0.9459 | 0.9021 | 0.2815 | 0.7384 | 0.7797 | 0.3056 | 0.7883 | 0.7966 | 0.4676 | 0.7826 | 0.8161 | 0.7547 | 0.802 | 0.6879 | 0.7423 | 0.8075 | 0.8455 |
| 0.3892 | 42.0 | 21000 | 0.3502 | 0.7557 | 0.9496 | 0.9071 | 0.256 | 0.7472 | 0.7944 | 0.3067 | 0.7939 | 0.8033 | 0.4557 | 0.7924 | 0.8283 | 0.7556 | 0.7968 | 0.6892 | 0.7515 | 0.8222 | 0.8615 |
| 0.3965 | 43.0 | 21500 | 0.3423 | 0.7619 | 0.9494 | 0.8955 | 0.2201 | 0.7643 | 0.7944 | 0.3109 | 0.8013 | 0.8067 | 0.3619 | 0.8075 | 0.8299 | 0.7579 | 0.804 | 0.7127 | 0.7608 | 0.815 | 0.8555 |
| 0.4117 | 44.0 | 22000 | 0.3461 | 0.7709 | 0.9514 | 0.8986 | 0.2226 | 0.7728 | 0.7921 | 0.3123 | 0.8104 | 0.8151 | 0.3595 | 0.8182 | 0.8228 | 0.7658 | 0.8071 | 0.7176 | 0.7722 | 0.8293 | 0.8661 |
| 0.371 | 45.0 | 22500 | 0.3394 | 0.7575 | 0.951 | 0.9036 | 0.2713 | 0.7565 | 0.7873 | 0.3067 | 0.7961 | 0.8056 | 0.4729 | 0.8002 | 0.8273 | 0.7655 | 0.8143 | 0.6865 | 0.7412 | 0.8205 | 0.8612 |
| 0.3745 | 46.0 | 23000 | 0.3338 | 0.7585 | 0.9491 | 0.905 | 0.314 | 0.755 | 0.7677 | 0.3104 | 0.7998 | 0.8059 | 0.4686 | 0.8018 | 0.8021 | 0.7661 | 0.8155 | 0.6791 | 0.7371 | 0.8303 | 0.8652 |
| 0.3535 | 47.0 | 23500 | 0.3374 | 0.7599 | 0.943 | 0.8887 | 0.3107 | 0.7491 | 0.7831 | 0.3106 | 0.7977 | 0.8052 | 0.4733 | 0.7967 | 0.8156 | 0.7645 | 0.8139 | 0.6824 | 0.734 | 0.8329 | 0.8676 |
| 0.3852 | 48.0 | 24000 | 0.3219 | 0.7721 | 0.9565 | 0.9012 | 0.3121 | 0.7702 | 0.802 | 0.313 | 0.8129 | 0.8196 | 0.4281 | 0.8151 | 0.8358 | 0.7631 | 0.8127 | 0.7105 | 0.7639 | 0.8428 | 0.8821 |
| 0.3774 | 49.0 | 24500 | 0.3365 | 0.7641 | 0.956 | 0.9064 | 0.2601 | 0.7618 | 0.7848 | 0.3103 | 0.8038 | 0.8093 | 0.401 | 0.8075 | 0.821 | 0.7504 | 0.7968 | 0.7116 | 0.766 | 0.8304 | 0.8652 |
| 0.328 | 50.0 | 25000 | 0.3349 | 0.7651 | 0.956 | 0.9021 | 0.2856 | 0.7596 | 0.7859 | 0.3055 | 0.8064 | 0.8135 | 0.4576 | 0.8057 | 0.8259 | 0.7504 | 0.7956 | 0.7111 | 0.7753 | 0.8338 | 0.8697 |
| 0.3735 | 51.0 | 25500 | 0.3249 | 0.7643 | 0.9307 | 0.8785 | 0.2775 | 0.7617 | 0.7686 | 0.3034 | 0.7993 | 0.8073 | 0.3895 | 0.804 | 0.7965 | 0.7851 | 0.8306 | 0.6666 | 0.7155 | 0.8412 | 0.8758 |
| 0.4171 | 52.0 | 26000 | 0.3363 | 0.7597 | 0.9441 | 0.8874 | 0.2759 | 0.7622 | 0.7774 | 0.309 | 0.7961 | 0.8026 | 0.4248 | 0.8004 | 0.8073 | 0.7732 | 0.8179 | 0.6757 | 0.7216 | 0.8302 | 0.8682 |
| 0.3355 | 53.0 | 26500 | 0.3175 | 0.7735 | 0.9515 | 0.9071 | 0.3236 | 0.774 | 0.7715 | 0.3121 | 0.816 | 0.8226 | 0.4905 | 0.8181 | 0.8123 | 0.781 | 0.8234 | 0.6936 | 0.7649 | 0.846 | 0.8794 |
| 0.3006 | 54.0 | 27000 | 0.3109 | 0.7736 | 0.9556 | 0.8946 | 0.2718 | 0.7771 | 0.7868 | 0.3137 | 0.814 | 0.8195 | 0.4167 | 0.8179 | 0.8304 | 0.7769 | 0.8183 | 0.6959 | 0.7588 | 0.8481 | 0.8815 |
| 0.3252 | 55.0 | 27500 | 0.3157 | 0.7742 | 0.9552 | 0.9055 | 0.295 | 0.7787 | 0.7803 | 0.3077 | 0.8152 | 0.821 | 0.4471 | 0.8222 | 0.8187 | 0.7697 | 0.8143 | 0.7089 | 0.768 | 0.844 | 0.8806 |
| 0.3341 | 56.0 | 28000 | 0.3221 | 0.7661 | 0.9575 | 0.9019 | 0.2773 | 0.7711 | 0.7937 | 0.3093 | 0.8101 | 0.8169 | 0.4243 | 0.8156 | 0.8359 | 0.7617 | 0.8048 | 0.6915 | 0.7639 | 0.8452 | 0.8821 |
| 0.2994 | 57.0 | 28500 | 0.3109 | 0.7773 | 0.9507 | 0.9021 | 0.3057 | 0.7762 | 0.7781 | 0.3111 | 0.8129 | 0.8205 | 0.479 | 0.8209 | 0.812 | 0.7857 | 0.825 | 0.6957 | 0.7515 | 0.8505 | 0.8848 |
| 0.3167 | 58.0 | 29000 | 0.3244 | 0.7758 | 0.9514 | 0.901 | 0.2373 | 0.7804 | 0.7872 | 0.313 | 0.8148 | 0.8208 | 0.3957 | 0.8239 | 0.8167 | 0.7712 | 0.8151 | 0.7126 | 0.7649 | 0.8437 | 0.8824 |
| 0.3074 | 59.0 | 29500 | 0.3182 | 0.7809 | 0.96 | 0.9199 | 0.2609 | 0.7805 | 0.7921 | 0.3125 | 0.8191 | 0.828 | 0.4752 | 0.8269 | 0.8308 | 0.7695 | 0.8143 | 0.7246 | 0.7845 | 0.8487 | 0.8852 |
| 0.3369 | 60.0 | 30000 | 0.3242 | 0.7739 | 0.9497 | 0.9078 | 0.2622 | 0.7731 | 0.7783 | 0.3113 | 0.8107 | 0.8201 | 0.4729 | 0.8185 | 0.8077 | 0.7768 | 0.8226 | 0.7029 | 0.7588 | 0.8422 | 0.8788 |
| 0.3005 | 61.0 | 30500 | 0.3319 | 0.7724 | 0.9573 | 0.9041 | 0.2995 | 0.7733 | 0.7921 | 0.3076 | 0.8102 | 0.8187 | 0.4633 | 0.8189 | 0.8296 | 0.7655 | 0.8067 | 0.7055 | 0.767 | 0.8462 | 0.8824 |
| 0.3328 | 62.0 | 31000 | 0.3273 | 0.7812 | 0.9559 | 0.9051 | 0.2928 | 0.7776 | 0.8125 | 0.3112 | 0.8205 | 0.8262 | 0.4476 | 0.8246 | 0.842 | 0.7686 | 0.8119 | 0.7238 | 0.7814 | 0.851 | 0.8852 |
| 0.3873 | 63.0 | 31500 | 0.3161 | 0.7829 | 0.9567 | 0.9063 | 0.2979 | 0.7736 | 0.8104 | 0.3151 | 0.819 | 0.827 | 0.471 | 0.8191 | 0.8416 | 0.7796 | 0.8206 | 0.7247 | 0.7835 | 0.8443 | 0.877 |
| 0.3308 | 64.0 | 32000 | 0.3166 | 0.7859 | 0.9579 | 0.9142 | 0.3263 | 0.7833 | 0.7926 | 0.3148 | 0.8224 | 0.8302 | 0.4838 | 0.8258 | 0.8243 | 0.7821 | 0.8206 | 0.7178 | 0.7845 | 0.8579 | 0.8855 |
| 0.3222 | 65.0 | 32500 | 0.3202 | 0.7827 | 0.9594 | 0.9056 | 0.275 | 0.7796 | 0.8137 | 0.314 | 0.821 | 0.8276 | 0.4148 | 0.8252 | 0.8498 | 0.7679 | 0.8079 | 0.7263 | 0.7876 | 0.854 | 0.8873 |
| 0.4033 | 66.0 | 33000 | 0.3232 | 0.7636 | 0.9516 | 0.895 | 0.2858 | 0.7661 | 0.7817 | 0.3043 | 0.8063 | 0.811 | 0.37 | 0.8125 | 0.8294 | 0.769 | 0.8111 | 0.675 | 0.7454 | 0.8468 | 0.8767 |
| 0.3163 | 67.0 | 33500 | 0.3201 | 0.783 | 0.9552 | 0.9034 | 0.3358 | 0.7797 | 0.8046 | 0.3146 | 0.825 | 0.83 | 0.44 | 0.8275 | 0.8378 | 0.7897 | 0.8254 | 0.7044 | 0.7773 | 0.8549 | 0.8873 |
| 0.3443 | 68.0 | 34000 | 0.3102 | 0.783 | 0.9546 | 0.8955 | 0.2943 | 0.7845 | 0.8009 | 0.311 | 0.8232 | 0.8279 | 0.3895 | 0.8291 | 0.8357 | 0.8062 | 0.8425 | 0.6863 | 0.7557 | 0.8565 | 0.8855 |
| 0.2685 | 69.0 | 34500 | 0.3266 | 0.773 | 0.9546 | 0.8951 | 0.2206 | 0.773 | 0.8143 | 0.3098 | 0.8114 | 0.8146 | 0.31 | 0.8146 | 0.848 | 0.7705 | 0.8091 | 0.6992 | 0.7557 | 0.8493 | 0.8791 |
| 0.254 | 70.0 | 35000 | 0.3053 | 0.7816 | 0.9447 | 0.8764 | 0.314 | 0.7861 | 0.7917 | 0.3136 | 0.8187 | 0.8233 | 0.3986 | 0.827 | 0.8207 | 0.8054 | 0.8452 | 0.6802 | 0.7351 | 0.8594 | 0.8897 |
| 0.292 | 71.0 | 35500 | 0.3023 | 0.792 | 0.9538 | 0.9122 | 0.2953 | 0.7958 | 0.8023 | 0.317 | 0.8277 | 0.8329 | 0.4014 | 0.8373 | 0.8342 | 0.8057 | 0.8468 | 0.7129 | 0.7649 | 0.8573 | 0.887 |
| 0.2921 | 72.0 | 36000 | 0.3102 | 0.7772 | 0.9515 | 0.9175 | 0.3279 | 0.7742 | 0.7945 | 0.3114 | 0.8166 | 0.8219 | 0.4195 | 0.8243 | 0.8227 | 0.7752 | 0.8179 | 0.7025 | 0.7629 | 0.8539 | 0.8848 |
| 0.3412 | 73.0 | 36500 | 0.3100 | 0.7817 | 0.9597 | 0.913 | 0.3416 | 0.785 | 0.8018 | 0.3123 | 0.8224 | 0.8276 | 0.4333 | 0.8306 | 0.8333 | 0.782 | 0.8302 | 0.7155 | 0.7711 | 0.8476 | 0.8815 |
| 0.3075 | 74.0 | 37000 | 0.3014 | 0.7946 | 0.9561 | 0.9164 | 0.3 | 0.7964 | 0.8209 | 0.3182 | 0.834 | 0.8385 | 0.3986 | 0.8407 | 0.8561 | 0.7948 | 0.8373 | 0.7279 | 0.7897 | 0.8612 | 0.8885 |
| 0.3023 | 75.0 | 37500 | 0.2935 | 0.7947 | 0.9603 | 0.9196 | 0.3455 | 0.7988 | 0.8218 | 0.3168 | 0.8343 | 0.8398 | 0.4629 | 0.8436 | 0.8527 | 0.8062 | 0.8464 | 0.7108 | 0.7773 | 0.8673 | 0.8958 |
| 0.3002 | 76.0 | 38000 | 0.3102 | 0.7838 | 0.9577 | 0.9181 | 0.318 | 0.7851 | 0.796 | 0.3158 | 0.8263 | 0.833 | 0.4776 | 0.8332 | 0.834 | 0.7756 | 0.8194 | 0.7227 | 0.7959 | 0.8531 | 0.8836 |
| 0.3307 | 77.0 | 38500 | 0.2983 | 0.7925 | 0.9555 | 0.9009 | 0.298 | 0.7967 | 0.807 | 0.317 | 0.8314 | 0.8342 | 0.3962 | 0.838 | 0.8397 | 0.7937 | 0.8373 | 0.7208 | 0.7753 | 0.8631 | 0.89 |
| 0.3413 | 78.0 | 39000 | 0.3021 | 0.7865 | 0.946 | 0.9058 | 0.3466 | 0.7881 | 0.7897 | 0.3155 | 0.8245 | 0.8299 | 0.4881 | 0.8297 | 0.8258 | 0.7951 | 0.8433 | 0.6961 | 0.7485 | 0.8682 | 0.8979 |
| 0.3045 | 79.0 | 39500 | 0.3084 | 0.7888 | 0.9472 | 0.8942 | 0.3235 | 0.7925 | 0.7943 | 0.3175 | 0.83 | 0.8338 | 0.3967 | 0.8379 | 0.8229 | 0.7896 | 0.8341 | 0.7124 | 0.7742 | 0.8644 | 0.893 |
| 0.2488 | 80.0 | 40000 | 0.2982 | 0.7953 | 0.9528 | 0.914 | 0.325 | 0.7916 | 0.8208 | 0.3202 | 0.8348 | 0.8383 | 0.4086 | 0.8356 | 0.8579 | 0.8057 | 0.8437 | 0.7147 | 0.7773 | 0.8655 | 0.8939 |
| 0.3217 | 81.0 | 40500 | 0.3044 | 0.7914 | 0.9556 | 0.9082 | 0.2782 | 0.7956 | 0.7996 | 0.3157 | 0.8283 | 0.8326 | 0.3581 | 0.8366 | 0.8352 | 0.7964 | 0.8385 | 0.7142 | 0.7691 | 0.8636 | 0.8903 |
| 0.3004 | 82.0 | 41000 | 0.3022 | 0.7843 | 0.9572 | 0.9058 | 0.3107 | 0.7879 | 0.7954 | 0.3137 | 0.8267 | 0.8324 | 0.4329 | 0.8352 | 0.834 | 0.7847 | 0.8274 | 0.7074 | 0.7794 | 0.8609 | 0.8903 |
| 0.3093 | 83.0 | 41500 | 0.3071 | 0.785 | 0.9517 | 0.9143 | 0.3427 | 0.7862 | 0.8026 | 0.3152 | 0.8254 | 0.8297 | 0.4381 | 0.8312 | 0.8425 | 0.7865 | 0.8341 | 0.7104 | 0.766 | 0.8581 | 0.8891 |
| 0.2752 | 84.0 | 42000 | 0.2960 | 0.7935 | 0.9572 | 0.9176 | 0.3257 | 0.7954 | 0.812 | 0.3188 | 0.835 | 0.8383 | 0.4529 | 0.8402 | 0.8476 | 0.7907 | 0.8361 | 0.7309 | 0.7887 | 0.859 | 0.89 |
| 0.26 | 85.0 | 42500 | 0.3140 | 0.789 | 0.9532 | 0.913 | 0.3043 | 0.7915 | 0.8005 | 0.3192 | 0.8303 | 0.8344 | 0.4133 | 0.8353 | 0.8375 | 0.7715 | 0.8159 | 0.7299 | 0.7928 | 0.8657 | 0.8945 |
| 0.258 | 86.0 | 43000 | 0.2883 | 0.8003 | 0.9557 | 0.9097 | 0.3498 | 0.8041 | 0.8061 | 0.3181 | 0.8406 | 0.8455 | 0.4662 | 0.8498 | 0.8382 | 0.801 | 0.8413 | 0.7255 | 0.7948 | 0.8743 | 0.9003 |
| 0.3089 | 87.0 | 43500 | 0.2843 | 0.8069 | 0.9607 | 0.9228 | 0.3816 | 0.812 | 0.8019 | 0.3221 | 0.8448 | 0.8493 | 0.4976 | 0.8535 | 0.8391 | 0.806 | 0.8468 | 0.7421 | 0.801 | 0.8725 | 0.9 |
| 0.2824 | 88.0 | 44000 | 0.2950 | 0.7961 | 0.9524 | 0.9114 | 0.3053 | 0.8021 | 0.801 | 0.3197 | 0.8362 | 0.8409 | 0.4262 | 0.8465 | 0.8378 | 0.7861 | 0.8337 | 0.7303 | 0.7918 | 0.8717 | 0.8973 |
| 0.2414 | 89.0 | 44500 | 0.3011 | 0.7851 | 0.9545 | 0.903 | 0.276 | 0.7891 | 0.8036 | 0.3158 | 0.8263 | 0.8307 | 0.4038 | 0.834 | 0.838 | 0.7813 | 0.8278 | 0.7174 | 0.7753 | 0.8564 | 0.8891 |
| 0.2478 | 90.0 | 45000 | 0.2817 | 0.8068 | 0.9522 | 0.9161 | 0.333 | 0.8107 | 0.8157 | 0.3217 | 0.8425 | 0.8468 | 0.4214 | 0.8526 | 0.847 | 0.811 | 0.8512 | 0.7381 | 0.7897 | 0.8714 | 0.8994 |
| 0.2355 | 91.0 | 45500 | 0.2872 | 0.8034 | 0.9469 | 0.9103 | 0.3361 | 0.811 | 0.8067 | 0.3188 | 0.8388 | 0.8449 | 0.46 | 0.8512 | 0.8393 | 0.813 | 0.8552 | 0.7202 | 0.7753 | 0.8769 | 0.9042 |
| 0.2681 | 92.0 | 46000 | 0.2875 | 0.7976 | 0.955 | 0.9142 | 0.368 | 0.8074 | 0.7996 | 0.3144 | 0.834 | 0.8383 | 0.4381 | 0.8488 | 0.8328 | 0.7937 | 0.8357 | 0.7256 | 0.7773 | 0.8735 | 0.9018 |
| 0.2674 | 93.0 | 46500 | 0.2918 | 0.8012 | 0.9527 | 0.913 | 0.3776 | 0.8067 | 0.8022 | 0.3192 | 0.8376 | 0.8417 | 0.4676 | 0.8463 | 0.8334 | 0.8069 | 0.8492 | 0.7304 | 0.7814 | 0.8662 | 0.8945 |
| 0.2494 | 94.0 | 47000 | 0.2939 | 0.8009 | 0.9619 | 0.9184 | 0.3542 | 0.8047 | 0.8084 | 0.317 | 0.8377 | 0.8427 | 0.4429 | 0.8479 | 0.8372 | 0.7996 | 0.8409 | 0.7361 | 0.7907 | 0.867 | 0.8964 |
| 0.2763 | 95.0 | 47500 | 0.3031 | 0.8011 | 0.958 | 0.9032 | 0.3648 | 0.8039 | 0.8072 | 0.3179 | 0.838 | 0.8426 | 0.4481 | 0.8464 | 0.8361 | 0.7969 | 0.8345 | 0.7369 | 0.7948 | 0.8697 | 0.8985 |
| 0.2984 | 96.0 | 48000 | 0.3025 | 0.7951 | 0.9582 | 0.9122 | 0.3507 | 0.7975 | 0.7983 | 0.3193 | 0.8328 | 0.8355 | 0.4119 | 0.8387 | 0.8301 | 0.7936 | 0.8353 | 0.7192 | 0.7732 | 0.8726 | 0.8979 |
| 0.3038 | 97.0 | 48500 | 0.2947 | 0.7968 | 0.9548 | 0.9133 | 0.3387 | 0.7983 | 0.8028 | 0.3233 | 0.8417 | 0.845 | 0.4286 | 0.8499 | 0.8395 | 0.8039 | 0.848 | 0.7194 | 0.7907 | 0.8671 | 0.8964 |
| 0.247 | 98.0 | 49000 | 0.2914 | 0.8014 | 0.9593 | 0.9175 | 0.3421 | 0.8014 | 0.8069 | 0.3215 | 0.842 | 0.8462 | 0.4705 | 0.8469 | 0.8422 | 0.8089 | 0.8544 | 0.7278 | 0.7876 | 0.8675 | 0.8967 |
| 0.2909 | 99.0 | 49500 | 0.2928 | 0.8014 | 0.9546 | 0.9145 | 0.3573 | 0.801 | 0.8208 | 0.3248 | 0.8428 | 0.8466 | 0.4371 | 0.8486 | 0.8505 | 0.8066 | 0.8548 | 0.7311 | 0.7897 | 0.8666 | 0.8955 |
| 0.3016 | 100.0 | 50000 | 0.2921 | 0.8036 | 0.9515 | 0.9174 | 0.3575 | 0.8037 | 0.8162 | 0.3212 | 0.8438 | 0.8473 | 0.4233 | 0.851 | 0.85 | 0.8045 | 0.8508 | 0.7345 | 0.7887 | 0.8718 | 0.9024 |
| 0.302 | 101.0 | 50500 | 0.2868 | 0.7982 | 0.9508 | 0.9107 | 0.36 | 0.8029 | 0.8121 | 0.3155 | 0.8376 | 0.8411 | 0.4543 | 0.8452 | 0.8431 | 0.8013 | 0.85 | 0.717 | 0.768 | 0.8763 | 0.9052 |
| 0.2353 | 102.0 | 51000 | 0.2846 | 0.8089 | 0.9553 | 0.9112 | 0.3669 | 0.8158 | 0.8189 | 0.3221 | 0.8484 | 0.8514 | 0.4329 | 0.8579 | 0.8485 | 0.8159 | 0.8563 | 0.7359 | 0.7948 | 0.8751 | 0.903 |
| 0.2575 | 103.0 | 51500 | 0.2977 | 0.8054 | 0.952 | 0.915 | 0.3537 | 0.8087 | 0.8065 | 0.325 | 0.8431 | 0.8461 | 0.4095 | 0.8511 | 0.8371 | 0.8038 | 0.8484 | 0.7449 | 0.7938 | 0.8676 | 0.8961 |
| 0.291 | 104.0 | 52000 | 0.2912 | 0.7997 | 0.9466 | 0.9142 | 0.3904 | 0.8025 | 0.7997 | 0.3223 | 0.8387 | 0.8426 | 0.4748 | 0.8455 | 0.8343 | 0.8042 | 0.854 | 0.7241 | 0.7732 | 0.8707 | 0.9006 |
| 0.28 | 105.0 | 52500 | 0.2882 | 0.8063 | 0.9571 | 0.9279 | 0.3691 | 0.8113 | 0.8243 | 0.3234 | 0.8448 | 0.8489 | 0.4429 | 0.8524 | 0.8556 | 0.8032 | 0.8468 | 0.7458 | 0.801 | 0.87 | 0.8988 |
| 0.3112 | 106.0 | 53000 | 0.2837 | 0.811 | 0.9572 | 0.9217 | 0.3889 | 0.8121 | 0.8262 | 0.3231 | 0.8466 | 0.851 | 0.4819 | 0.8529 | 0.8575 | 0.8068 | 0.8548 | 0.7538 | 0.7979 | 0.8725 | 0.9003 |
| 0.2614 | 107.0 | 53500 | 0.2850 | 0.8081 | 0.9573 | 0.9274 | 0.4009 | 0.8095 | 0.8211 | 0.3244 | 0.8457 | 0.851 | 0.5033 | 0.8523 | 0.8508 | 0.8 | 0.8429 | 0.7493 | 0.8072 | 0.875 | 0.903 |
| 0.2612 | 108.0 | 54000 | 0.2851 | 0.8042 | 0.9568 | 0.9155 | 0.3603 | 0.808 | 0.8256 | 0.3219 | 0.8436 | 0.8476 | 0.4343 | 0.8535 | 0.8555 | 0.8003 | 0.8452 | 0.7427 | 0.7979 | 0.8697 | 0.8997 |
| 0.3001 | 109.0 | 54500 | 0.2800 | 0.8074 | 0.9564 | 0.919 | 0.3767 | 0.8097 | 0.8226 | 0.3235 | 0.8455 | 0.8497 | 0.4752 | 0.8517 | 0.8574 | 0.8099 | 0.854 | 0.7405 | 0.7938 | 0.8719 | 0.9012 |
| 0.2585 | 110.0 | 55000 | 0.2918 | 0.8069 | 0.9565 | 0.9162 | 0.3777 | 0.8099 | 0.8226 | 0.3229 | 0.8464 | 0.8503 | 0.4605 | 0.8545 | 0.8548 | 0.7991 | 0.8448 | 0.7462 | 0.8041 | 0.8755 | 0.9018 |
| 0.273 | 111.0 | 55500 | 0.2890 | 0.8083 | 0.9558 | 0.927 | 0.3654 | 0.81 | 0.8263 | 0.3251 | 0.8488 | 0.8522 | 0.4295 | 0.8559 | 0.8572 | 0.8041 | 0.85 | 0.7464 | 0.8062 | 0.8744 | 0.9003 |
| 0.2339 | 112.0 | 56000 | 0.2911 | 0.8039 | 0.9557 | 0.9259 | 0.3828 | 0.805 | 0.8195 | 0.3169 | 0.844 | 0.8497 | 0.4876 | 0.8522 | 0.8502 | 0.8027 | 0.85 | 0.7341 | 0.7959 | 0.8748 | 0.9033 |
| 0.2383 | 113.0 | 56500 | 0.2991 | 0.8106 | 0.9585 | 0.918 | 0.3764 | 0.813 | 0.8248 | 0.3212 | 0.8479 | 0.8527 | 0.461 | 0.8534 | 0.8558 | 0.8049 | 0.8488 | 0.7547 | 0.8082 | 0.8721 | 0.9009 |
| 0.2731 | 114.0 | 57000 | 0.2857 | 0.8121 | 0.9561 | 0.9195 | 0.3659 | 0.817 | 0.8208 | 0.3236 | 0.8521 | 0.8555 | 0.4376 | 0.8612 | 0.8529 | 0.8037 | 0.8484 | 0.7544 | 0.8134 | 0.8781 | 0.9045 |
| 0.2248 | 115.0 | 57500 | 0.2981 | 0.8019 | 0.9566 | 0.9271 | 0.3887 | 0.805 | 0.8082 | 0.3184 | 0.8437 | 0.8479 | 0.4733 | 0.8533 | 0.8453 | 0.7973 | 0.8429 | 0.7365 | 0.8 | 0.8719 | 0.9009 |
| 0.252 | 116.0 | 58000 | 0.2910 | 0.8106 | 0.9552 | 0.9285 | 0.3823 | 0.8141 | 0.816 | 0.3219 | 0.8483 | 0.8526 | 0.4757 | 0.8572 | 0.8489 | 0.8031 | 0.8508 | 0.7536 | 0.8052 | 0.8749 | 0.9018 |
| 0.2847 | 117.0 | 58500 | 0.2856 | 0.8084 | 0.9559 | 0.9225 | 0.3742 | 0.8131 | 0.8173 | 0.3197 | 0.848 | 0.8512 | 0.449 | 0.8561 | 0.8494 | 0.803 | 0.8488 | 0.7511 | 0.8052 | 0.8711 | 0.8997 |
| 0.2934 | 118.0 | 59000 | 0.2856 | 0.8109 | 0.9559 | 0.9263 | 0.3909 | 0.8168 | 0.8112 | 0.3219 | 0.8517 | 0.855 | 0.4786 | 0.8607 | 0.8459 | 0.8046 | 0.8524 | 0.7513 | 0.8093 | 0.8767 | 0.9033 |
| 0.2435 | 119.0 | 59500 | 0.2891 | 0.8084 | 0.9561 | 0.9247 | 0.3774 | 0.8149 | 0.808 | 0.3191 | 0.8481 | 0.8526 | 0.469 | 0.8584 | 0.8417 | 0.8032 | 0.8492 | 0.7475 | 0.8052 | 0.8746 | 0.9033 |
| 0.2808 | 120.0 | 60000 | 0.2873 | 0.8088 | 0.9571 | 0.92 | 0.3884 | 0.8126 | 0.8071 | 0.3201 | 0.8484 | 0.8535 | 0.4914 | 0.8576 | 0.8408 | 0.808 | 0.8532 | 0.7427 | 0.8031 | 0.8756 | 0.9042 |
| 0.2391 | 121.0 | 60500 | 0.2912 | 0.81 | 0.9559 | 0.9162 | 0.3701 | 0.8166 | 0.8145 | 0.3214 | 0.8487 | 0.8536 | 0.449 | 0.8603 | 0.848 | 0.8077 | 0.8532 | 0.7471 | 0.8041 | 0.8753 | 0.9036 |
| 0.2206 | 122.0 | 61000 | 0.2914 | 0.8064 | 0.9555 | 0.9144 | 0.3813 | 0.8154 | 0.8055 | 0.3218 | 0.8467 | 0.8506 | 0.439 | 0.8587 | 0.8384 | 0.7992 | 0.8472 | 0.7458 | 0.8021 | 0.8742 | 0.9024 |
| 0.2755 | 123.0 | 61500 | 0.2921 | 0.8068 | 0.9537 | 0.9236 | 0.3731 | 0.814 | 0.8138 | 0.3236 | 0.8474 | 0.8505 | 0.4281 | 0.8578 | 0.8479 | 0.8023 | 0.8504 | 0.7424 | 0.7979 | 0.8758 | 0.903 |
| 0.237 | 124.0 | 62000 | 0.2860 | 0.8114 | 0.9568 | 0.9227 | 0.3928 | 0.8187 | 0.8167 | 0.3225 | 0.8502 | 0.854 | 0.4633 | 0.8611 | 0.8476 | 0.8072 | 0.8528 | 0.7493 | 0.8041 | 0.8777 | 0.9052 |
| 0.2732 | 125.0 | 62500 | 0.2879 | 0.8097 | 0.9555 | 0.9235 | 0.3676 | 0.8152 | 0.8118 | 0.3225 | 0.8478 | 0.8535 | 0.4524 | 0.8598 | 0.8432 | 0.806 | 0.852 | 0.7459 | 0.8031 | 0.8771 | 0.9055 |
| 0.2499 | 126.0 | 63000 | 0.2894 | 0.8091 | 0.9561 | 0.9229 | 0.3738 | 0.8154 | 0.8136 | 0.3229 | 0.8491 | 0.854 | 0.4738 | 0.8594 | 0.847 | 0.808 | 0.8563 | 0.7471 | 0.8052 | 0.8722 | 0.9006 |
| 0.318 | 127.0 | 63500 | 0.2878 | 0.8127 | 0.9561 | 0.9202 | 0.3881 | 0.8165 | 0.818 | 0.3268 | 0.8497 | 0.855 | 0.4886 | 0.8581 | 0.8482 | 0.8084 | 0.8552 | 0.7542 | 0.8062 | 0.8754 | 0.9036 |
| 0.2367 | 128.0 | 64000 | 0.2858 | 0.8111 | 0.957 | 0.92 | 0.387 | 0.8192 | 0.8107 | 0.3243 | 0.8491 | 0.8536 | 0.4657 | 0.8598 | 0.8466 | 0.8081 | 0.8532 | 0.7502 | 0.8041 | 0.875 | 0.9036 |
| 0.2424 | 129.0 | 64500 | 0.2847 | 0.8136 | 0.9571 | 0.9228 | 0.381 | 0.8184 | 0.8178 | 0.3259 | 0.8498 | 0.8541 | 0.4571 | 0.8596 | 0.8479 | 0.8073 | 0.8524 | 0.757 | 0.8062 | 0.8765 | 0.9036 |
| 0.2599 | 130.0 | 65000 | 0.2825 | 0.8158 | 0.9572 | 0.9229 | 0.38 | 0.822 | 0.8196 | 0.3246 | 0.8525 | 0.8569 | 0.4719 | 0.8618 | 0.8522 | 0.8094 | 0.8556 | 0.7587 | 0.8093 | 0.8794 | 0.9058 |
| 0.2459 | 131.0 | 65500 | 0.2810 | 0.8172 | 0.9569 | 0.9268 | 0.3685 | 0.8237 | 0.8197 | 0.3275 | 0.8538 | 0.8584 | 0.4738 | 0.864 | 0.8506 | 0.8141 | 0.8599 | 0.7584 | 0.8093 | 0.879 | 0.9061 |
| 0.2522 | 132.0 | 66000 | 0.2825 | 0.8188 | 0.9595 | 0.9316 | 0.3846 | 0.8266 | 0.8149 | 0.3243 | 0.8522 | 0.8576 | 0.4933 | 0.8634 | 0.8476 | 0.8138 | 0.8603 | 0.7643 | 0.8062 | 0.8784 | 0.9064 |
| 0.2804 | 133.0 | 66500 | 0.2833 | 0.8211 | 0.9595 | 0.9293 | 0.3892 | 0.8275 | 0.8209 | 0.3261 | 0.8555 | 0.8604 | 0.5 | 0.8656 | 0.8503 | 0.8164 | 0.8623 | 0.7675 | 0.8124 | 0.8792 | 0.9067 |
| 0.2773 | 134.0 | 67000 | 0.2819 | 0.817 | 0.9553 | 0.9304 | 0.3784 | 0.8219 | 0.8222 | 0.3246 | 0.8534 | 0.8579 | 0.489 | 0.8623 | 0.8539 | 0.8153 | 0.8615 | 0.755 | 0.8052 | 0.8808 | 0.907 |
| 0.2379 | 135.0 | 67500 | 0.2811 | 0.8152 | 0.9553 | 0.9274 | 0.3972 | 0.8191 | 0.8207 | 0.3254 | 0.8528 | 0.8571 | 0.5 | 0.8607 | 0.8519 | 0.8108 | 0.8571 | 0.7529 | 0.8072 | 0.8818 | 0.907 |
| 0.2451 | 136.0 | 68000 | 0.2830 | 0.8179 | 0.9566 | 0.9304 | 0.3849 | 0.824 | 0.8208 | 0.3278 | 0.8549 | 0.8594 | 0.4886 | 0.8646 | 0.8531 | 0.8181 | 0.8627 | 0.7577 | 0.8103 | 0.8781 | 0.9052 |
| 0.2712 | 137.0 | 68500 | 0.2809 | 0.8139 | 0.9562 | 0.9306 | 0.377 | 0.8198 | 0.8168 | 0.3252 | 0.851 | 0.8551 | 0.4724 | 0.861 | 0.849 | 0.8121 | 0.8575 | 0.7509 | 0.8021 | 0.8788 | 0.9058 |
| 0.2524 | 138.0 | 69000 | 0.2816 | 0.8191 | 0.9576 | 0.9277 | 0.379 | 0.8246 | 0.8252 | 0.3269 | 0.8561 | 0.8606 | 0.4819 | 0.8659 | 0.8543 | 0.8156 | 0.8611 | 0.7612 | 0.8134 | 0.8804 | 0.9073 |
| 0.2524 | 139.0 | 69500 | 0.2821 | 0.8183 | 0.9581 | 0.9278 | 0.3799 | 0.8224 | 0.8204 | 0.3261 | 0.8547 | 0.8598 | 0.49 | 0.8638 | 0.8505 | 0.8162 | 0.8619 | 0.7592 | 0.8113 | 0.8796 | 0.9061 |
| 0.2704 | 140.0 | 70000 | 0.2833 | 0.816 | 0.9579 | 0.9278 | 0.3801 | 0.8204 | 0.8161 | 0.3256 | 0.8532 | 0.8584 | 0.4967 | 0.8627 | 0.8481 | 0.815 | 0.8615 | 0.7535 | 0.8082 | 0.8796 | 0.9055 |
| 0.2495 | 141.0 | 70500 | 0.2830 | 0.8168 | 0.9579 | 0.9307 | 0.3807 | 0.8203 | 0.8169 | 0.3255 | 0.8534 | 0.8586 | 0.5 | 0.8625 | 0.8495 | 0.8166 | 0.8619 | 0.7516 | 0.8072 | 0.8822 | 0.9067 |
| 0.2503 | 142.0 | 71000 | 0.2827 | 0.8166 | 0.9581 | 0.9308 | 0.3842 | 0.8214 | 0.8196 | 0.3257 | 0.8539 | 0.8589 | 0.4967 | 0.8634 | 0.8515 | 0.816 | 0.8615 | 0.7536 | 0.8093 | 0.8801 | 0.9061 |
| 0.2408 | 143.0 | 71500 | 0.2819 | 0.8174 | 0.9579 | 0.9278 | 0.3841 | 0.8222 | 0.8192 | 0.3263 | 0.8543 | 0.8594 | 0.4967 | 0.8639 | 0.8505 | 0.8149 | 0.8611 | 0.7567 | 0.8113 | 0.8805 | 0.9058 |
| 0.2237 | 144.0 | 72000 | 0.2823 | 0.817 | 0.9578 | 0.9305 | 0.3875 | 0.8209 | 0.8204 | 0.326 | 0.8545 | 0.8596 | 0.5 | 0.8632 | 0.8526 | 0.8161 | 0.8627 | 0.7548 | 0.8103 | 0.8801 | 0.9058 |
| 0.2405 | 145.0 | 72500 | 0.2828 | 0.8162 | 0.9578 | 0.9305 | 0.3875 | 0.8205 | 0.8196 | 0.3251 | 0.8537 | 0.8588 | 0.5 | 0.8628 | 0.8515 | 0.8141 | 0.8603 | 0.7545 | 0.8103 | 0.88 | 0.9058 |
| 0.2662 | 146.0 | 73000 | 0.2822 | 0.8172 | 0.9578 | 0.9306 | 0.3842 | 0.8209 | 0.82 | 0.3252 | 0.8545 | 0.8596 | 0.5 | 0.8632 | 0.8518 | 0.8158 | 0.8623 | 0.7546 | 0.8103 | 0.8812 | 0.9061 |
| 0.3253 | 147.0 | 73500 | 0.2825 | 0.8173 | 0.9579 | 0.9306 | 0.3874 | 0.8217 | 0.8199 | 0.326 | 0.8546 | 0.8597 | 0.4967 | 0.8637 | 0.8518 | 0.8162 | 0.8623 | 0.7547 | 0.8103 | 0.881 | 0.9064 |
| 0.2588 | 148.0 | 74000 | 0.2826 | 0.8175 | 0.9579 | 0.9307 | 0.3875 | 0.8217 | 0.8195 | 0.326 | 0.8546 | 0.8597 | 0.5 | 0.8637 | 0.8516 | 0.8168 | 0.8627 | 0.7548 | 0.8103 | 0.8807 | 0.9061 |
| 0.2447 | 149.0 | 74500 | 0.2826 | 0.8176 | 0.9579 | 0.9306 | 0.3875 | 0.8219 | 0.8195 | 0.326 | 0.8548 | 0.8598 | 0.5 | 0.8639 | 0.8516 | 0.8172 | 0.8631 | 0.7548 | 0.8103 | 0.8807 | 0.9061 |
| 0.2683 | 150.0 | 75000 | 0.2826 | 0.8176 | 0.9579 | 0.9306 | 0.3875 | 0.8219 | 0.8195 | 0.326 | 0.8548 | 0.8598 | 0.5 | 0.8639 | 0.8516 | 0.8172 | 0.8631 | 0.7548 | 0.8103 | 0.8807 | 0.9061 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.3
| [
"chicken",
"duck",
"plant"
] |
alekhyavinni12/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6547
- Map: 0.2667
- Map 50: 0.3333
- Map 75: 0.3333
- Map Small: 0.4
- Map Medium: 0.2667
- Map Large: -1.0
- Mar 1: 0.4
- Mar 10: 0.5333
- Mar 100: 0.5333
- Mar Small: 0.5333
- Mar Medium: 0.8
- Mar Large: -1.0
- Map Black Star: -1.0
- Mar 100 Black Star: -1.0
- Map Cat: 0.4
- Mar 100 Cat: 0.8
- Map Grey Star: -1.0
- Mar 100 Grey Star: -1.0
- Map Insect: 0.4
- Mar 100 Insect: 0.8
- Map Moon: 0.0
- Mar 100 Moon: 0.0
- Map Owl: -1.0
- Mar 100 Owl: -1.0
- Map Unicorn Head: -1.0
- Mar 100 Unicorn Head: -1.0
- Map Unicorn Whole: -1.0
- Mar 100 Unicorn Whole: -1.0
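A minimal inference sketch is shown below, assuming the standard `transformers` object-detection API; the image path is a placeholder and the snippet is illustrative rather than taken from the original card.

```python
# Hypothetical usage sketch: load the fine-tuned checkpoint and detect objects in one image.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "alekhyavinni12/detr_finetuned_cppe5"  # this repository's id
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) detections above a confidence threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score.item():.2f} at {box.tolist()}")
```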
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 300
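These values map onto a standard `transformers.TrainingArguments` configuration. The sketch below is a hypothetical reconstruction, not the original training script; the output directory and the `remove_unused_columns` flag are assumptions.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="detr_finetuned_cppe5",   # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                      # matches the listed Adam betas and epsilon
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=300,
    remove_unused_columns=False,         # assumption: typically needed for detection collators
)
```

With a suitable data collator, these arguments can be passed to a `Trainer` to run the 300-epoch cosine schedule whose per-epoch results are listed below.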
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Black Star | Mar 100 Black Star | Map Cat | Mar 100 Cat | Map Grey Star | Mar 100 Grey Star | Map Insect | Mar 100 Insect | Map Moon | Mar 100 Moon | Map Owl | Mar 100 Owl | Map Unicorn Head | Mar 100 Unicorn Head | Map Unicorn Whole | Mar 100 Unicorn Whole |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------:|:------------------:|:-------:|:-----------:|:-------------:|:-----------------:|:----------:|:--------------:|:--------:|:------------:|:-------:|:-----------:|:----------------:|:--------------------:|:-----------------:|:---------------------:|
| No log | 1.0 | 2 | 1.2274 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 2.0 | 4 | 1.2465 | 0.0186 | 0.0316 | 0.0094 | 0.0276 | 0.0 | -1.0 | 0.0 | 0.1333 | 0.3 | 0.4333 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0224 | 0.4 | 0.0333 | 0.5 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 3.0 | 6 | 1.1943 | 0.0368 | 0.0486 | 0.0486 | 0.0328 | 0.2667 | -1.0 | 0.0 | 0.1333 | 0.5 | 0.5 | 0.8 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0667 | 0.8 | 0.0437 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 4.0 | 8 | 1.1972 | 0.0241 | 0.0345 | 0.0345 | 0.0317 | 0.0 | -1.0 | 0.0 | 0.1167 | 0.35 | 0.4667 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0186 | 0.35 | 0.0538 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 5.0 | 10 | 1.1821 | 0.0213 | 0.0276 | 0.0276 | 0.0268 | 0.0 | -1.0 | 0.0 | 0.1167 | 0.3833 | 0.5 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0168 | 0.35 | 0.0471 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 6.0 | 12 | 1.1269 | 0.0195 | 0.0279 | 0.0279 | 0.0292 | 0.0 | -1.0 | 0.0 | 0.0 | 0.35 | 0.5 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0252 | 0.45 | 0.0333 | 0.6 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 7.0 | 14 | 1.1229 | 0.016 | 0.0269 | 0.0084 | 0.0237 | 0.0 | -1.0 | 0.0 | 0.0 | 0.3 | 0.4333 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0202 | 0.4 | 0.0278 | 0.5 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 8.0 | 16 | 1.0829 | 0.0376 | 0.0514 | 0.0514 | 0.0278 | 0.45 | -1.0 | 0.0 | 0.2667 | 0.5 | 0.4667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0715 | 0.8 | 0.0412 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 9.0 | 18 | 1.1243 | 0.0377 | 0.0475 | 0.0475 | 0.0304 | 0.4 | -1.0 | 0.0 | 0.1167 | 0.55 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0631 | 0.75 | 0.05 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 10.0 | 20 | 1.1698 | 0.0352 | 0.0463 | 0.0463 | 0.0283 | 0.2667 | -1.0 | 0.0 | 0.1333 | 0.5 | 0.5 | 0.8 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0667 | 0.8 | 0.0389 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 11.0 | 22 | 1.1318 | 0.0359 | 0.0463 | 0.0463 | 0.0282 | 0.4 | -1.0 | 0.0 | 0.15 | 0.5167 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0689 | 0.85 | 0.0389 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 12.0 | 24 | 1.0970 | 0.0356 | 0.0444 | 0.0444 | 0.0294 | 0.4 | -1.0 | 0.0 | 0.1333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0667 | 0.8 | 0.04 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 13.0 | 26 | 1.1264 | 0.0323 | 0.0442 | 0.0442 | 0.0258 | 0.45 | -1.0 | 0.0 | 0.15 | 0.5167 | 0.4667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0549 | 0.75 | 0.0421 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 14.0 | 28 | 1.1112 | 0.0324 | 0.0442 | 0.0442 | 0.0252 | 0.3 | -1.0 | 0.0 | 0.15 | 0.5 | 0.4667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0602 | 0.8 | 0.0368 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 15.0 | 30 | 1.1563 | 0.0183 | 0.0273 | 0.0273 | 0.0258 | 0.0 | -1.0 | 0.0 | 0.35 | 0.35 | 0.5 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0175 | 0.45 | 0.0375 | 0.6 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 16.0 | 32 | 1.1635 | 0.0191 | 0.0263 | 0.0263 | 0.0253 | 0.0 | -1.0 | 0.0 | 0.1333 | 0.3667 | 0.5 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0162 | 0.4 | 0.0412 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 17.0 | 34 | 1.0852 | 0.0216 | 0.0278 | 0.0278 | 0.0273 | 0.0 | -1.0 | 0.0 | 0.3833 | 0.3833 | 0.5 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0147 | 0.35 | 0.05 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 18.0 | 36 | 1.0761 | 0.0249 | 0.0311 | 0.0311 | 0.0317 | 0.0 | -1.0 | 0.0 | 0.4 | 0.4 | 0.5333 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0176 | 0.4 | 0.0571 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 19.0 | 38 | 1.0665 | 0.0215 | 0.0269 | 0.0269 | 0.0294 | 0.0 | -1.0 | 0.0 | 0.1333 | 0.4 | 0.5333 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0176 | 0.4 | 0.0471 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 20.0 | 40 | 1.0589 | 0.0228 | 0.0269 | 0.0269 | 0.0299 | 0.0 | -1.0 | 0.0 | 0.1167 | 0.4167 | 0.5333 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0154 | 0.35 | 0.0529 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 21.0 | 42 | 1.0659 | 0.022 | 0.0285 | 0.0285 | 0.0283 | 0.0 | -1.0 | 0.0 | 0.3833 | 0.3833 | 0.5 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0161 | 0.35 | 0.05 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 22.0 | 44 | 1.0784 | 0.0217 | 0.0299 | 0.0299 | 0.0283 | 0.0 | -1.0 | 0.0 | 0.1333 | 0.3667 | 0.5 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0184 | 0.4 | 0.0467 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 23.0 | 46 | 0.9880 | 0.0213 | 0.0266 | 0.0266 | 0.0273 | 0.0 | -1.0 | 0.0 | 0.1333 | 0.4 | 0.5333 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0168 | 0.4 | 0.0471 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 24.0 | 48 | 1.0116 | 0.0204 | 0.0263 | 0.0263 | 0.0254 | 0.0 | -1.0 | 0.0 | 0.1167 | 0.3833 | 0.5 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0141 | 0.35 | 0.0471 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 25.0 | 50 | 1.0435 | 0.033 | 0.0434 | 0.0434 | 0.0254 | 0.2 | -1.0 | 0.0 | 0.1167 | 0.5167 | 0.5 | 0.8 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0518 | 0.75 | 0.0471 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 26.0 | 52 | 1.0467 | 0.0374 | 0.046 | 0.046 | 0.0294 | 0.225 | -1.0 | 0.0 | 0.4 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0589 | 0.85 | 0.0533 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 27.0 | 54 | 1.0652 | 0.0346 | 0.046 | 0.046 | 0.0283 | 0.2 | -1.0 | 0.0 | 0.3667 | 0.5 | 0.5 | 0.8 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0571 | 0.8 | 0.0467 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 28.0 | 56 | 1.0341 | 0.035 | 0.0444 | 0.0444 | 0.0294 | 0.1967 | -1.0 | 0.0 | 0.3833 | 0.55 | 0.5 | 1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0517 | 0.85 | 0.0533 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 29.0 | 58 | 1.0602 | 0.0363 | 0.0468 | 0.0468 | 0.0302 | 0.18 | -1.0 | 0.0 | 0.3833 | 0.5333 | 0.5 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0518 | 0.8 | 0.0571 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 30.0 | 60 | 1.0678 | 0.0398 | 0.0513 | 0.0513 | 0.0328 | 0.18 | -1.0 | 0.0 | 0.3833 | 0.5333 | 0.5 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0577 | 0.8 | 0.0615 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 31.0 | 62 | 1.0135 | 0.0398 | 0.0513 | 0.0513 | 0.0322 | 0.18 | -1.0 | 0.0 | 0.3833 | 0.5333 | 0.5 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0577 | 0.8 | 0.0615 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 32.0 | 64 | 1.0206 | 0.0281 | 0.038 | 0.038 | 0.0362 | 0.0 | -1.0 | 0.0 | 0.3833 | 0.3833 | 0.5333 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0207 | 0.45 | 0.0636 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 33.0 | 66 | 1.0374 | 0.0296 | 0.0403 | 0.0403 | 0.037 | 0.0 | -1.0 | 0.0 | 0.3833 | 0.3833 | 0.5333 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0189 | 0.45 | 0.07 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 34.0 | 68 | 1.0254 | 0.0278 | 0.0348 | 0.0348 | 0.0343 | 0.0 | -1.0 | 0.0 | 0.4 | 0.4 | 0.5333 | 0.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0168 | 0.4 | 0.0667 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 35.0 | 70 | 0.9879 | 0.0326 | 0.0415 | 0.0415 | 0.0258 | 0.2 | -1.0 | 0.0 | 0.1167 | 0.55 | 0.5 | 1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0535 | 0.85 | 0.0444 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 36.0 | 72 | 0.9183 | 0.0378 | 0.0442 | 0.0442 | 0.0298 | 0.225 | -1.0 | 0.0 | 0.1333 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0635 | 0.85 | 0.05 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 37.0 | 74 | 0.9336 | 0.0358 | 0.0432 | 0.0432 | 0.0278 | 0.25 | -1.0 | 0.0 | 0.1333 | 0.5667 | 0.5333 | 1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.063 | 0.9 | 0.0444 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 38.0 | 76 | 0.9307 | 0.0351 | 0.0422 | 0.0422 | 0.0273 | 0.25 | -1.0 | 0.0 | 0.0 | 0.5667 | 0.5333 | 1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0631 | 0.9 | 0.0421 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 39.0 | 78 | 0.9378 | 0.0339 | 0.0415 | 0.0415 | 0.0289 | 0.18 | -1.0 | 0.0 | 0.4167 | 0.5667 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0518 | 0.8 | 0.05 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 40.0 | 80 | 0.9379 | 0.0365 | 0.0423 | 0.0423 | 0.0307 | 0.15 | -1.0 | 0.0 | 0.3 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0532 | 0.85 | 0.0562 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 41.0 | 82 | 0.9257 | 0.0393 | 0.0431 | 0.0431 | 0.033 | 0.1429 | -1.0 | 0.0 | 0.3 | 0.6167 | 0.6 | 1.0 | -1.0 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0617 | 0.95 | 0.0562 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 42.0 | 84 | 0.9214 | 0.067 | 0.083 | 0.083 | 0.0611 | 0.15 | -1.0 | 0.0 | 0.2667 | 0.8167 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1 | 0.8 | -1.0 | -1.0 | 0.0589 | 0.85 | 0.0421 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 43.0 | 86 | 0.9455 | 0.0703 | 0.0813 | 0.0813 | 0.0648 | 0.1429 | -1.0 | 0.0 | 0.3 | 0.8833 | 0.8333 | 1.0 | -1.0 | -1.0 | -1.0 | 0.1125 | 0.9 | -1.0 | -1.0 | 0.0535 | 0.85 | 0.045 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 44.0 | 88 | 0.9512 | 0.0703 | 0.0813 | 0.0813 | 0.0639 | 0.1667 | -1.0 | 0.0 | 0.3 | 0.8833 | 0.8333 | 1.0 | -1.0 | -1.0 | -1.0 | 0.1125 | 0.9 | -1.0 | -1.0 | 0.0556 | 0.85 | 0.0429 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 45.0 | 90 | 0.9421 | 0.0723 | 0.0822 | 0.0822 | 0.0663 | 0.1286 | -1.0 | 0.0 | 0.3 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1125 | 0.9 | -1.0 | -1.0 | 0.0569 | 0.85 | 0.0474 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 46.0 | 92 | 0.9449 | 0.0705 | 0.0821 | 0.0821 | 0.0642 | 0.1286 | -1.0 | 0.0 | 0.3 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1125 | 0.9 | -1.0 | -1.0 | 0.0589 | 0.85 | 0.04 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 47.0 | 94 | 0.9339 | 0.0671 | 0.0784 | 0.0784 | 0.0617 | 0.1286 | -1.0 | 0.0 | 0.3 | 0.8667 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1 | 0.9 | -1.0 | -1.0 | 0.0539 | 0.8 | 0.0474 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 48.0 | 96 | 0.9383 | 0.0666 | 0.076 | 0.076 | 0.0633 | 0.1125 | -1.0 | 0.0 | 0.3 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1 | 0.9 | -1.0 | -1.0 | 0.059 | 0.85 | 0.0409 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 49.0 | 98 | 0.9569 | 0.0664 | 0.0776 | 0.0776 | 0.0631 | 0.1143 | -1.0 | 0.0 | 0.3 | 0.85 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.1 | 0.9 | -1.0 | -1.0 | 0.0612 | 0.85 | 0.0381 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 50.0 | 100 | 0.9588 | 0.0721 | 0.0812 | 0.0812 | 0.0667 | 0.1286 | -1.0 | 0.0 | 0.3 | 0.8833 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1 | 0.9 | -1.0 | -1.0 | 0.0741 | 0.95 | 0.0421 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 51.0 | 102 | 0.9490 | 0.0656 | 0.0759 | 0.0759 | 0.0609 | 0.15 | -1.0 | 0.0 | 0.2667 | 0.8833 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.08 | 0.8 | -1.0 | -1.0 | 0.074 | 0.95 | 0.0429 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 52.0 | 104 | 0.9267 | 0.0709 | 0.077 | 0.077 | 0.0662 | 0.1286 | -1.0 | 0.0 | 0.3 | 0.9333 | 0.9333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.09 | 0.9 | -1.0 | -1.0 | 0.075 | 0.9 | 0.0476 | 1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 53.0 | 106 | 0.9466 | 0.067 | 0.0767 | 0.0767 | 0.0625 | 0.1286 | -1.0 | 0.0 | 0.3 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.09 | 0.9 | -1.0 | -1.0 | 0.0661 | 0.85 | 0.045 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 54.0 | 108 | 0.9340 | 0.0683 | 0.0804 | 0.0804 | 0.0641 | 0.1286 | -1.0 | 0.0 | 0.2667 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0889 | 0.8 | -1.0 | -1.0 | 0.066 | 0.85 | 0.05 | 1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 55.0 | 110 | 0.9496 | 0.0736 | 0.089 | 0.089 | 0.0715 | 0.1143 | -1.0 | 0.0 | 0.2667 | 0.85 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.1143 | 0.8 | -1.0 | -1.0 | 0.0592 | 0.85 | 0.0474 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 56.0 | 112 | 0.9287 | 0.0751 | 0.0843 | 0.0843 | 0.0771 | 0.0889 | -1.0 | 0.0 | 0.3 | 0.9 | 0.9333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.1125 | 0.9 | -1.0 | -1.0 | 0.0598 | 0.9 | 0.0529 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 57.0 | 114 | 0.9671 | 0.0663 | 0.0775 | 0.0775 | 0.0658 | 0.08 | -1.0 | 0.0 | 0.3 | 0.85 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.1 | 0.9 | -1.0 | -1.0 | 0.0519 | 0.85 | 0.0471 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 58.0 | 116 | 0.9205 | 0.0716 | 0.081 | 0.081 | 0.0739 | 0.0727 | -1.0 | 0.0 | 0.3 | 0.8833 | 0.9 | 0.8 | -1.0 | -1.0 | -1.0 | 0.1125 | 0.9 | -1.0 | -1.0 | 0.0522 | 0.85 | 0.05 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 59.0 | 118 | 0.9027 | 0.0702 | 0.0832 | 0.0832 | 0.0735 | 0.08 | -1.0 | 0.0 | 0.4 | 0.8667 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.1 | 0.8 | -1.0 | -1.0 | 0.0552 | 0.8 | 0.0556 | 1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 60.0 | 120 | 0.8984 | 0.0673 | 0.0768 | 0.0768 | 0.071 | 0.0891 | -1.0 | 0.0 | 0.4333 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.09 | 0.9 | -1.0 | -1.0 | 0.0589 | 0.85 | 0.0529 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 61.0 | 122 | 0.9150 | 0.0684 | 0.0844 | 0.0844 | 0.0775 | 0.1125 | -1.0 | 0.0 | 0.4167 | 0.8 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0889 | 0.8 | -1.0 | -1.0 | 0.075 | 0.9 | 0.0412 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 62.0 | 124 | 0.9707 | 0.0688 | 0.0898 | 0.0898 | 0.0804 | 0.15 | -1.0 | 0.0 | 0.5333 | 0.7667 | 0.7667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0875 | 0.7 | -1.0 | -1.0 | 0.0801 | 0.9 | 0.0389 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 63.0 | 126 | 0.9841 | 0.0733 | 0.0962 | 0.0962 | 0.0908 | 0.15 | -1.0 | 0.0 | 0.3833 | 0.7667 | 0.7667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1 | 0.7 | -1.0 | -1.0 | 0.0811 | 0.9 | 0.0389 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 64.0 | 128 | 0.9543 | 0.0665 | 0.0814 | 0.0814 | 0.0786 | 0.0889 | -1.0 | 0.0 | 0.4167 | 0.8167 | 0.8333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.0889 | 0.8 | -1.0 | -1.0 | 0.0662 | 0.85 | 0.0444 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 65.0 | 130 | 0.9361 | 0.075 | 0.0869 | 0.0869 | 0.0957 | 0.1 | -1.0 | 0.0 | 0.5833 | 0.85 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.09 | 0.9 | -1.0 | -1.0 | 0.0928 | 0.85 | 0.0421 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 66.0 | 132 | 0.8495 | 0.0974 | 0.1083 | 0.1083 | 0.1467 | 0.18 | -1.0 | 0.15 | 0.6 | 0.9 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.09 | 0.9 | -1.0 | -1.0 | 0.1594 | 0.9 | 0.0429 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 67.0 | 134 | 0.8382 | 0.1284 | 0.1468 | 0.1468 | 0.1463 | 0.3 | -1.0 | 0.15 | 0.5667 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0889 | 0.8 | -1.0 | -1.0 | 0.2571 | 0.9 | 0.0391 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 68.0 | 136 | 0.8237 | 0.1183 | 0.1355 | 0.1355 | 0.1473 | 0.225 | -1.0 | 0.15 | 0.5667 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0889 | 0.8 | -1.0 | -1.0 | 0.225 | 0.9 | 0.0409 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 69.0 | 138 | 0.8878 | 0.0953 | 0.1075 | 0.1075 | 0.1457 | 0.18 | -1.0 | 0.15 | 0.6 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.09 | 0.9 | -1.0 | -1.0 | 0.1594 | 0.9 | 0.0364 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 70.0 | 140 | 0.9115 | 0.1006 | 0.1159 | 0.1159 | 0.1473 | 0.18 | -1.0 | 0.15 | 0.5667 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.0889 | 0.8 | -1.0 | -1.0 | 0.1719 | 0.9 | 0.0409 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 71.0 | 142 | 0.9379 | 0.1317 | 0.1679 | 0.1679 | 0.1347 | 0.2667 | -1.0 | 0.1333 | 0.5 | 0.8 | 0.8 | 0.8 | -1.0 | -1.0 | -1.0 | 0.0875 | 0.7 | -1.0 | -1.0 | 0.2667 | 0.8 | 0.0409 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 72.0 | 144 | 0.9314 | 0.1399 | 0.1687 | 0.1687 | 0.151 | 0.2667 | -1.0 | 0.15 | 0.55 | 0.85 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.1 | 0.8 | -1.0 | -1.0 | 0.2768 | 0.85 | 0.0429 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 73.0 | 146 | 0.9406 | 0.1574 | 0.1724 | 0.1724 | 0.1594 | 0.3 | -1.0 | 0.15 | 0.9 | 0.9 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.125 | 1.0 | -1.0 | -1.0 | 0.3 | 0.9 | 0.0471 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 74.0 | 148 | 0.9458 | 0.1574 | 0.1724 | 0.1724 | 0.1583 | 0.3 | -1.0 | 0.15 | 0.6333 | 0.9 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.125 | 1.0 | -1.0 | -1.0 | 0.3 | 0.9 | 0.0471 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 75.0 | 150 | 0.8777 | 0.1542 | 0.1713 | 0.1713 | 0.1551 | 0.3 | -1.0 | 0.15 | 0.6 | 0.9 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1125 | 0.9 | -1.0 | -1.0 | 0.3 | 0.9 | 0.05 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 76.0 | 152 | 0.8415 | 0.1502 | 0.1687 | 0.1687 | 0.1523 | 0.3 | -1.0 | 0.15 | 0.6 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1125 | 0.9 | -1.0 | -1.0 | 0.3 | 0.9 | 0.0381 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 77.0 | 154 | 0.8335 | 0.1562 | 0.1754 | 0.1754 | 0.1569 | 0.3 | -1.0 | 0.15 | 0.6 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1286 | 0.9 | -1.0 | -1.0 | 0.3 | 0.9 | 0.04 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 78.0 | 156 | 0.8349 | 0.1603 | 0.1746 | 0.1746 | 0.1624 | 0.3 | -1.0 | 0.15 | 0.6333 | 0.9 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1429 | 1.0 | -1.0 | -1.0 | 0.3 | 0.9 | 0.0381 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 79.0 | 158 | 0.8439 | 0.1523 | 0.1679 | 0.1679 | 0.1539 | 0.3 | -1.0 | 0.15 | 0.6333 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.125 | 1.0 | -1.0 | -1.0 | 0.3 | 0.9 | 0.0318 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 80.0 | 160 | 0.8159 | 0.1545 | 0.1732 | 0.1732 | 0.2062 | 0.3 | -1.0 | 0.15 | 0.6 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1286 | 0.9 | -1.0 | -1.0 | 0.3 | 0.9 | 0.0348 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 81.0 | 162 | 0.8051 | 0.1674 | 0.1961 | 0.1961 | 0.1912 | 0.3 | -1.0 | 0.15 | 0.5833 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1286 | 0.9 | -1.0 | -1.0 | 0.3326 | 0.85 | 0.0409 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 82.0 | 164 | 0.7991 | 0.1679 | 0.1976 | 0.1976 | 0.192 | 0.3 | -1.0 | 0.1333 | 0.5833 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1286 | 0.9 | -1.0 | -1.0 | 0.3301 | 0.85 | 0.045 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 83.0 | 166 | 0.7923 | 0.1556 | 0.1746 | 0.1746 | 0.2069 | 0.3 | -1.0 | 0.15 | 0.6 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1286 | 0.9 | -1.0 | -1.0 | 0.3 | 0.9 | 0.0381 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 84.0 | 168 | 0.7915 | 0.1479 | 0.1754 | 0.1754 | 0.1466 | 0.3 | -1.0 | 0.1333 | 0.5833 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1286 | 0.9 | -1.0 | -1.0 | 0.2751 | 0.85 | 0.04 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 85.0 | 170 | 0.7987 | 0.1511 | 0.1739 | 0.1739 | 0.1503 | 0.3 | -1.0 | 0.15 | 0.5833 | 0.9167 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1286 | 0.9 | -1.0 | -1.0 | 0.2793 | 0.85 | 0.0455 | 1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 86.0 | 172 | 0.7922 | 0.1503 | 0.1739 | 0.1739 | 0.1503 | 0.3 | -1.0 | 0.15 | 0.5833 | 0.9167 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.1286 | 0.9 | -1.0 | -1.0 | 0.2768 | 0.85 | 0.0455 | 1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 87.0 | 174 | 0.7992 | 0.1439 | 0.1739 | 0.1739 | 0.1466 | 0.2667 | -1.0 | 0.1333 | 0.5667 | 0.8333 | 0.8333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.1286 | 0.9 | -1.0 | -1.0 | 0.2667 | 0.8 | 0.0364 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 88.0 | 176 | 0.7846 | 0.1644 | 0.1937 | 0.1937 | 0.1637 | 0.3 | -1.0 | 0.1333 | 0.5833 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.18 | 0.9 | -1.0 | -1.0 | 0.2751 | 0.85 | 0.0381 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 89.0 | 178 | 0.7767 | 0.166 | 0.1937 | 0.1937 | 0.1656 | 0.3 | -1.0 | 0.1333 | 0.5833 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.18 | 0.9 | -1.0 | -1.0 | 0.2751 | 0.85 | 0.0429 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 90.0 | 180 | 0.7721 | 0.1788 | 0.2096 | 0.2096 | 0.1787 | 0.3 | -1.0 | 0.1333 | 0.5833 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.225 | 0.9 | -1.0 | -1.0 | 0.2751 | 0.85 | 0.0364 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 91.0 | 182 | 0.7921 | 0.1778 | 0.2083 | 0.2083 | 0.1779 | 0.3 | -1.0 | 0.1333 | 0.5833 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.225 | 0.9 | -1.0 | -1.0 | 0.2751 | 0.85 | 0.0333 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 92.0 | 184 | 0.7830 | 0.1764 | 0.2083 | 0.2083 | 0.1797 | 0.2667 | -1.0 | 0.1333 | 0.5667 | 0.8667 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.225 | 0.9 | -1.0 | -1.0 | 0.2667 | 0.8 | 0.0375 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 93.0 | 186 | 0.7622 | 0.2068 | 0.2374 | 0.2374 | 0.2085 | 0.3 | -1.0 | 0.4333 | 0.5833 | 0.9167 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.2751 | 0.85 | 0.0455 | 1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 94.0 | 188 | 0.7823 | 0.2047 | 0.2367 | 0.2367 | 0.2065 | 0.3 | -1.0 | 0.4333 | 0.5833 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.2751 | 0.85 | 0.0391 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 95.0 | 190 | 0.7928 | 0.2005 | 0.2367 | 0.2367 | 0.2046 | 0.2667 | -1.0 | 0.4333 | 0.5667 | 0.8333 | 0.8333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.2667 | 0.8 | 0.0348 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 96.0 | 192 | 0.7853 | 0.213 | 0.2367 | 0.2367 | 0.2176 | 0.2667 | -1.0 | 0.4667 | 0.6 | 0.9 | 0.9 | 0.8 | -1.0 | -1.0 | -1.0 | 0.3333 | 1.0 | -1.0 | -1.0 | 0.2667 | 0.8 | 0.0391 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 97.0 | 194 | 0.8034 | 0.18 | 0.2003 | 0.2003 | 0.1754 | 0.2667 | -1.0 | 0.4667 | 0.5833 | 0.8833 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.3333 | 1.0 | -1.0 | -1.0 | 0.1657 | 0.75 | 0.0409 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 98.0 | 196 | 0.8144 | 0.1689 | 0.2003 | 0.2003 | 0.1643 | 0.2667 | -1.0 | 0.4333 | 0.55 | 0.85 | 0.8333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.1657 | 0.75 | 0.0409 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 99.0 | 198 | 0.7820 | 0.1932 | 0.2215 | 0.2215 | 0.2065 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.2387 | 0.85 | 0.0409 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 100.0 | 200 | 0.7825 | 0.2044 | 0.2374 | 0.2374 | 0.2046 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.2768 | 0.85 | 0.0364 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 101.0 | 202 | 0.7756 | 0.2127 | 0.2381 | 0.2381 | 0.2157 | 0.3 | -1.0 | 0.45 | 0.6 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3 | 0.9 | 0.0381 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 102.0 | 204 | 0.7455 | 0.2243 | 0.2603 | 0.2603 | 0.251 | 0.3 | -1.0 | 0.4333 | 0.5833 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3301 | 0.85 | 0.0429 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 103.0 | 206 | 0.7586 | 0.2311 | 0.2937 | 0.2937 | 0.3399 | 0.3 | -1.0 | 0.4167 | 0.5333 | 0.8333 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.2667 | 0.8 | -1.0 | -1.0 | 0.3837 | 0.8 | 0.0429 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 104.0 | 208 | 0.7458 | 0.2415 | 0.2929 | 0.2929 | 0.351 | 0.3 | -1.0 | 0.45 | 0.5667 | 0.8667 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3837 | 0.8 | 0.0409 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 105.0 | 210 | 0.7275 | 0.2363 | 0.2603 | 0.2603 | 0.2632 | 0.3 | -1.0 | 0.4833 | 0.6167 | 0.9167 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3333 | 1.0 | -1.0 | -1.0 | 0.3326 | 0.85 | 0.0429 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 106.0 | 212 | 0.7315 | 0.2316 | 0.2589 | 0.2589 | 0.2657 | 0.3 | -1.0 | 0.45 | 0.6 | 0.8667 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.36 | 0.9 | 0.0348 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 107.0 | 214 | 0.7477 | 0.2482 | 0.2911 | 0.2911 | 0.4133 | 0.2667 | -1.0 | 0.4333 | 0.5833 | 0.85 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.4126 | 0.85 | 0.032 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 108.0 | 216 | 0.7716 | 0.2333 | 0.2917 | 0.2917 | 0.3689 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.8 | 0.8 | 0.8 | -1.0 | -1.0 | -1.0 | 0.2667 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0333 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 109.0 | 218 | 0.7454 | 0.2151 | 0.2583 | 0.2583 | 0.2398 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.2667 | 0.8 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0417 | 1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 110.0 | 220 | 0.7304 | 0.2345 | 0.2583 | 0.2583 | 0.2611 | 0.3 | -1.0 | 0.4833 | 0.6167 | 0.9167 | 0.9 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3333 | 1.0 | -1.0 | -1.0 | 0.3326 | 0.85 | 0.0375 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 111.0 | 222 | 0.7373 | 0.2294 | 0.2589 | 0.2589 | 0.2601 | 0.2667 | -1.0 | 0.4667 | 0.6 | 0.8667 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.3333 | 1.0 | -1.0 | -1.0 | 0.32 | 0.8 | 0.0348 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 112.0 | 224 | 0.7518 | 0.2244 | 0.2596 | 0.2596 | 0.249 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0364 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 113.0 | 226 | 0.7707 | 0.2012 | 0.2444 | 0.2444 | 0.2222 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.2667 | 0.8 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 114.0 | 228 | 0.7341 | 0.2109 | 0.2444 | 0.2444 | 0.2333 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3326 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 115.0 | 230 | 0.7356 | 0.2 | 0.2222 | 0.2222 | 0.2 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 116.0 | 232 | 0.7527 | 0.205 | 0.2381 | 0.2381 | 0.2056 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.2768 | 0.85 | 0.0381 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 117.0 | 234 | 0.7377 | 0.2065 | 0.2381 | 0.2381 | 0.2076 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.8833 | 0.8667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.2768 | 0.85 | 0.0429 | 0.9 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 118.0 | 236 | 0.7546 | 0.2405 | 0.2937 | 0.2937 | 0.2379 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.8167 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.2835 | 0.85 | 0.0381 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 119.0 | 238 | 0.7682 | 0.2521 | 0.3152 | 0.3152 | 0.2824 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.8 | 0.8 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.32 | 0.8 | 0.0364 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 120.0 | 240 | 0.7387 | 0.2688 | 0.3152 | 0.3152 | 0.299 | 0.2667 | -1.0 | 0.4333 | 0.5667 | 0.8333 | 0.8333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.32 | 0.8 | 0.0364 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 121.0 | 242 | 0.7265 | 0.2855 | 0.3152 | 0.3152 | 0.3167 | 0.2667 | -1.0 | 0.4667 | 0.6 | 0.8667 | 0.8667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.5 | 1.0 | -1.0 | -1.0 | 0.32 | 0.8 | 0.0364 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 122.0 | 244 | 0.7544 | 0.222 | 0.2603 | 0.2603 | 0.2479 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.8167 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3326 | 0.85 | 0.0333 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 123.0 | 246 | 0.7399 | 0.2173 | 0.2603 | 0.2603 | 0.2333 | 0.3 | -1.0 | 0.45 | 0.5667 | 0.8333 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3137 | 0.8 | 0.0381 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 124.0 | 248 | 0.7335 | 0.2167 | 0.2596 | 0.2596 | 0.2324 | 0.3 | -1.0 | 0.45 | 0.5667 | 0.8333 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3137 | 0.8 | 0.0364 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 125.0 | 250 | 0.7374 | 0.2118 | 0.2596 | 0.2596 | 0.2352 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.7833 | 0.7667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.2667 | 0.8 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0318 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 126.0 | 252 | 0.7278 | 0.2224 | 0.2589 | 0.2589 | 0.2471 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.8167 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0304 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 127.0 | 254 | 0.7320 | 0.2123 | 0.2444 | 0.2444 | 0.2333 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 128.0 | 256 | 0.7381 | 0.2178 | 0.2444 | 0.2444 | 0.2444 | 0.2667 | -1.0 | 0.4667 | 0.6 | 0.6 | 0.6 | 0.8 | -1.0 | -1.0 | -1.0 | 0.3333 | 1.0 | -1.0 | -1.0 | 0.32 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 129.0 | 258 | 0.7364 | 0.2157 | 0.2444 | 0.2444 | 0.2278 | 0.3 | -1.0 | 0.4833 | 0.6 | 0.6 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.3333 | 1.0 | -1.0 | -1.0 | 0.3137 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 130.0 | 260 | 0.7288 | 0.2652 | 0.3133 | 0.3133 | 0.2807 | 0.3 | -1.0 | 0.45 | 0.5667 | 0.8333 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3137 | 0.8 | 0.032 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 131.0 | 262 | 0.7496 | 0.2719 | 0.3467 | 0.3467 | 0.3807 | 0.3 | -1.0 | 0.4167 | 0.5333 | 0.8 | 0.7667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.3837 | 0.8 | 0.032 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 132.0 | 264 | 0.7517 | 0.2778 | 0.3472 | 0.3472 | 0.4148 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.8 | 0.8 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0333 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 133.0 | 266 | 0.7232 | 0.3001 | 0.3472 | 0.3472 | 0.4315 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0333 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 134.0 | 268 | 0.7281 | 0.2729 | 0.3133 | 0.3133 | 0.2974 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.032 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 135.0 | 270 | 0.7369 | 0.2546 | 0.3 | 0.3 | 0.2667 | 0.3 | -1.0 | 0.45 | 0.5667 | 0.5667 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3137 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 136.0 | 272 | 0.7434 | 0.2779 | 0.3333 | 0.3333 | 0.3833 | 0.3 | -1.0 | 0.45 | 0.5667 | 0.5667 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3837 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 137.0 | 274 | 0.7278 | 0.2456 | 0.3 | 0.3 | 0.2667 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 138.0 | 276 | 0.7170 | 0.2623 | 0.3 | 0.3 | 0.2833 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 139.0 | 278 | 0.7312 | 0.27 | 0.3 | 0.3 | 0.3 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.36 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 140.0 | 280 | 0.7492 | 0.2716 | 0.3133 | 0.3133 | 0.2956 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.8167 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.028 | 0.7 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 141.0 | 282 | 0.7549 | 0.2559 | 0.3128 | 0.3128 | 0.2807 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.8167 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0308 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 142.0 | 284 | 0.7636 | 0.2456 | 0.3 | 0.3 | 0.2667 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 143.0 | 286 | 0.7544 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 144.0 | 288 | 0.7138 | 0.2833 | 0.3333 | 0.3333 | 0.4167 | 0.2667 | -1.0 | 0.4333 | 0.5667 | 0.5667 | 0.5667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 145.0 | 290 | 0.7023 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 146.0 | 292 | 0.7070 | 0.3056 | 0.3333 | 0.3333 | 0.4333 | 0.3 | -1.0 | 0.4833 | 0.6167 | 0.6167 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.5 | 1.0 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 147.0 | 294 | 0.7356 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 148.0 | 296 | 0.7607 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 149.0 | 298 | 0.7531 | 0.2779 | 0.3333 | 0.3333 | 0.3833 | 0.3 | -1.0 | 0.45 | 0.5667 | 0.5667 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3837 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 150.0 | 300 | 0.7105 | 0.3056 | 0.3333 | 0.3333 | 0.4333 | 0.3 | -1.0 | 0.4833 | 0.6167 | 0.6167 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.5 | 1.0 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 151.0 | 302 | 0.7177 | 0.3056 | 0.3333 | 0.3333 | 0.4333 | 0.3 | -1.0 | 0.4833 | 0.6167 | 0.6167 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.5 | 1.0 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 152.0 | 304 | 0.7032 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 153.0 | 306 | 0.7232 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 154.0 | 308 | 0.7311 | 0.2723 | 0.3333 | 0.3333 | 0.3833 | 0.2667 | -1.0 | 0.4333 | 0.55 | 0.55 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3668 | 0.75 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 155.0 | 310 | 0.7147 | 0.3 | 0.3333 | 0.3333 | 0.4333 | 0.2667 | -1.0 | 0.4667 | 0.6 | 0.6 | 0.6 | 0.8 | -1.0 | -1.0 | -1.0 | 0.5 | 1.0 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 156.0 | 312 | 0.7170 | 0.3 | 0.3333 | 0.3333 | 0.4333 | 0.2667 | -1.0 | 0.4667 | 0.6 | 0.6 | 0.6 | 0.8 | -1.0 | -1.0 | -1.0 | 0.5 | 1.0 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 157.0 | 314 | 0.7112 | 0.3 | 0.3333 | 0.3333 | 0.4333 | 0.2667 | -1.0 | 0.4667 | 0.6 | 0.6 | 0.6 | 0.8 | -1.0 | -1.0 | -1.0 | 0.5 | 1.0 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 158.0 | 316 | 0.7134 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 159.0 | 318 | 0.7141 | 0.27 | 0.3 | 0.3 | 0.3 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.36 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 160.0 | 320 | 0.7000 | 0.27 | 0.3 | 0.3 | 0.3 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.36 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 161.0 | 322 | 0.7068 | 0.2789 | 0.3 | 0.3 | 0.3 | 0.3 | -1.0 | 0.4833 | 0.6167 | 0.6167 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.5 | 1.0 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 162.0 | 324 | 0.7004 | 0.2623 | 0.3 | 0.3 | 0.2833 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 163.0 | 326 | 0.7056 | 0.2563 | 0.3133 | 0.3133 | 0.2815 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.8167 | 0.8 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.032 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 164.0 | 328 | 0.6894 | 0.2734 | 0.3139 | 0.3139 | 0.299 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.85 | 0.8333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0333 | 0.8 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 165.0 | 330 | 0.6989 | 0.2623 | 0.3 | 0.3 | 0.2833 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 166.0 | 332 | 0.6876 | 0.2623 | 0.3 | 0.3 | 0.2833 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 167.0 | 334 | 0.6904 | 0.2623 | 0.3 | 0.3 | 0.2833 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 168.0 | 336 | 0.7001 | 0.2623 | 0.3 | 0.3 | 0.2833 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 169.0 | 338 | 0.6969 | 0.2659 | 0.3 | 0.3 | 0.2889 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3478 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 170.0 | 340 | 0.6970 | 0.2646 | 0.3 | 0.3 | 0.2881 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3439 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 171.0 | 342 | 0.7053 | 0.2623 | 0.3 | 0.3 | 0.2833 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 172.0 | 344 | 0.7235 | 0.2623 | 0.3 | 0.3 | 0.2833 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 173.0 | 346 | 0.7236 | 0.2623 | 0.3 | 0.3 | 0.2833 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3368 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 174.0 | 348 | 0.7160 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 175.0 | 350 | 0.7156 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 176.0 | 352 | 0.7172 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 177.0 | 354 | 0.7085 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 178.0 | 356 | 0.7073 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 179.0 | 358 | 0.6975 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 180.0 | 360 | 0.6852 | 0.2738 | 0.3333 | 0.3333 | 0.4028 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.5667 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4213 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 181.0 | 362 | 0.6740 | 0.291 | 0.3333 | 0.3333 | 0.42 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.423 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 182.0 | 364 | 0.6865 | 0.2833 | 0.3333 | 0.3333 | 0.4167 | 0.2667 | -1.0 | 0.4333 | 0.5667 | 0.5667 | 0.5667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 183.0 | 366 | 0.6932 | 0.2833 | 0.3333 | 0.3333 | 0.4167 | 0.2667 | -1.0 | 0.4333 | 0.5667 | 0.5667 | 0.5667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 184.0 | 368 | 0.6771 | 0.3 | 0.3333 | 0.3333 | 0.45 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.45 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 185.0 | 370 | 0.6664 | 0.3 | 0.3333 | 0.3333 | 0.45 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.45 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 186.0 | 372 | 0.6920 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 187.0 | 374 | 0.7042 | 0.2627 | 0.3333 | 0.3333 | 0.3694 | 0.3 | -1.0 | 0.4167 | 0.5333 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.388 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 188.0 | 376 | 0.7066 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 189.0 | 378 | 0.7077 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 190.0 | 380 | 0.7138 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 191.0 | 382 | 0.7155 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 192.0 | 384 | 0.7146 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 193.0 | 386 | 0.7228 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 194.0 | 388 | 0.7120 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 195.0 | 390 | 0.6936 | 0.2612 | 0.3333 | 0.3333 | 0.3667 | 0.3 | -1.0 | 0.4167 | 0.5333 | 0.5333 | 0.5 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.3837 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 196.0 | 392 | 0.6674 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 197.0 | 394 | 0.6600 | 0.2946 | 0.3333 | 0.3333 | 0.4167 | 0.3333 | -1.0 | 0.4667 | 0.6 | 0.6 | 0.5667 | 1.0 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4337 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 198.0 | 396 | 0.6611 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 199.0 | 398 | 0.6740 | 0.2779 | 0.3333 | 0.3333 | 0.3833 | 0.3 | -1.0 | 0.45 | 0.5667 | 0.5667 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3837 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 200.0 | 400 | 0.6869 | 0.2779 | 0.3333 | 0.3333 | 0.3833 | 0.3 | -1.0 | 0.45 | 0.5667 | 0.5667 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.3837 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 201.0 | 402 | 0.6872 | 0.2612 | 0.3333 | 0.3333 | 0.3667 | 0.3 | -1.0 | 0.4167 | 0.5333 | 0.5333 | 0.5 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.3837 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 202.0 | 404 | 0.6741 | 0.2612 | 0.3333 | 0.3333 | 0.3667 | 0.3 | -1.0 | 0.4167 | 0.5333 | 0.5333 | 0.5 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.3837 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 203.0 | 406 | 0.6904 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 204.0 | 408 | 0.6990 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 205.0 | 410 | 0.7041 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 206.0 | 412 | 0.7022 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 207.0 | 414 | 0.6918 | 0.2889 | 0.3333 | 0.3333 | 0.4167 | 0.3 | -1.0 | 0.45 | 0.5833 | 0.5833 | 0.5667 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 208.0 | 416 | 0.6790 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 209.0 | 418 | 0.6642 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 210.0 | 420 | 0.6621 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 211.0 | 422 | 0.6693 | 0.2905 | 0.3333 | 0.3333 | 0.4197 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4215 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 212.0 | 424 | 0.6705 | 0.2907 | 0.3333 | 0.3333 | 0.4197 | 0.3 | -1.0 | 0.45 | 0.6 | 0.6 | 0.6 | 0.9 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.422 | 0.9 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 213.0 | 426 | 0.6744 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 214.0 | 428 | 0.6662 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 215.0 | 430 | 0.6588 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 216.0 | 432 | 0.6607 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 217.0 | 434 | 0.6601 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 218.0 | 436 | 0.6571 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 219.0 | 438 | 0.6558 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 220.0 | 440 | 0.6541 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 221.0 | 442 | 0.6528 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 222.0 | 444 | 0.6544 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 223.0 | 446 | 0.6618 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 224.0 | 448 | 0.6637 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 225.0 | 450 | 0.6641 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 226.0 | 452 | 0.6668 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 227.0 | 454 | 0.6693 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 228.0 | 456 | 0.6716 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 229.0 | 458 | 0.6795 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 230.0 | 460 | 0.6849 | 0.2833 | 0.3333 | 0.3333 | 0.4167 | 0.2667 | -1.0 | 0.4333 | 0.5667 | 0.5667 | 0.5667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 231.0 | 462 | 0.6854 | 0.2833 | 0.3333 | 0.3333 | 0.4167 | 0.2667 | -1.0 | 0.4333 | 0.5667 | 0.5667 | 0.5667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 232.0 | 464 | 0.6850 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 233.0 | 466 | 0.6808 | 0.2833 | 0.3333 | 0.3333 | 0.4167 | 0.2667 | -1.0 | 0.4333 | 0.5667 | 0.5667 | 0.5667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 234.0 | 468 | 0.6785 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 235.0 | 470 | 0.6800 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 236.0 | 472 | 0.6794 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 237.0 | 474 | 0.6783 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 238.0 | 476 | 0.6766 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 239.0 | 478 | 0.6743 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 240.0 | 480 | 0.6736 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 241.0 | 482 | 0.6736 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 242.0 | 484 | 0.6750 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 243.0 | 486 | 0.6735 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 244.0 | 488 | 0.6726 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 245.0 | 490 | 0.6723 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 246.0 | 492 | 0.6741 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 247.0 | 494 | 0.6786 | 0.2833 | 0.3333 | 0.3333 | 0.4167 | 0.2667 | -1.0 | 0.4333 | 0.5667 | 0.5667 | 0.5667 | 0.8 | -1.0 | -1.0 | -1.0 | 0.45 | 0.9 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 248.0 | 496 | 0.6811 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| No log | 249.0 | 498 | 0.6810 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 250.0 | 500 | 0.6814 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 251.0 | 502 | 0.6830 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 252.0 | 504 | 0.6843 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 253.0 | 506 | 0.6844 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 254.0 | 508 | 0.6861 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 255.0 | 510 | 0.6867 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 256.0 | 512 | 0.6871 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 257.0 | 514 | 0.6868 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 258.0 | 516 | 0.6863 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 259.0 | 518 | 0.6832 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 260.0 | 520 | 0.6789 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 261.0 | 522 | 0.6734 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 262.0 | 524 | 0.6692 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 263.0 | 526 | 0.6664 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 264.0 | 528 | 0.6646 | 0.2723 | 0.3333 | 0.3333 | 0.4 | 0.3 | -1.0 | 0.4167 | 0.55 | 0.55 | 0.5333 | 0.9 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4168 | 0.85 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 265.0 | 530 | 0.6628 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 266.0 | 532 | 0.6608 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 267.0 | 534 | 0.6597 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 268.0 | 536 | 0.6589 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 269.0 | 538 | 0.6573 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 270.0 | 540 | 0.6553 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 271.0 | 542 | 0.6541 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 272.0 | 544 | 0.6529 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 273.0 | 546 | 0.6510 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 274.0 | 548 | 0.6508 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 275.0 | 550 | 0.6514 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 276.0 | 552 | 0.6519 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 277.0 | 554 | 0.6523 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 278.0 | 556 | 0.6529 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 279.0 | 558 | 0.6532 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 280.0 | 560 | 0.6536 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 281.0 | 562 | 0.6538 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 282.0 | 564 | 0.6539 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 283.0 | 566 | 0.6540 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 284.0 | 568 | 0.6540 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 285.0 | 570 | 0.6542 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 286.0 | 572 | 0.6546 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 287.0 | 574 | 0.6546 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 288.0 | 576 | 0.6546 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 289.0 | 578 | 0.6547 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 290.0 | 580 | 0.6548 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 291.0 | 582 | 0.6548 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 292.0 | 584 | 0.6549 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 293.0 | 586 | 0.6549 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 294.0 | 588 | 0.6549 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 295.0 | 590 | 0.6548 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 296.0 | 592 | 0.6548 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 297.0 | 594 | 0.6548 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 298.0 | 596 | 0.6548 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 299.0 | 598 | 0.6547 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
| 0.6008 | 300.0 | 600 | 0.6547 | 0.2667 | 0.3333 | 0.3333 | 0.4 | 0.2667 | -1.0 | 0.4 | 0.5333 | 0.5333 | 0.5333 | 0.8 | -1.0 | -1.0 | -1.0 | 0.4 | 0.8 | -1.0 | -1.0 | 0.4 | 0.8 | 0.0 | 0.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 | -1.0 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.19.1
| [
"black_star",
"cat",
"grey_star",
"insect",
"moon",
"owl",
"unicorn_head",
"unicorn_whole"
] |
sahand1/detr_finetuned_candy |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_candy
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the imagefolder dataset.
It achieves the following results on the evaluation set (a note on how these metrics are computed follows the list):
- Loss: 2.2689
- Map: 0.007
- Map 50: 0.0117
- Map 75: 0.0077
- Map Small: 0.0065
- Map Medium: 0.0184
- Map Large: -1.0
- Mar 1: 0.0
- Mar 10: 0.1143
- Mar 100: 0.2286
- Mar Small: 0.2
- Mar Medium: 0.24
- Mar Large: -1.0
- Map Black Star: 0.0
- Mar 100 Black Star: 0.0
- Map Cat: 0.0381
- Mar 100 Cat: 0.8
- Map Grey Star: 0.0034
- Mar 100 Grey Star: 0.4
- Map Insect: 0.0
- Mar 100 Insect: 0.0
- Map Moon: 0.0
- Mar 100 Moon: 0.0
- Map Owl: 0.0
- Mar 100 Owl: 0.0
- Map Unicorn Head: 0.0077
- Mar 100 Unicorn Head: 0.4
- Map Unicorn Whole: -1.0
- Mar 100 Unicorn Whole: -1.0
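The -1.0 entries above are the usual COCO-evaluator sentinel for a class or size bucket with no ground-truth boxes in the evaluation split, not a real score. As a sketch of how such Map/Mar values are typically computed (this card does not state the exact evaluation tool; `torchmetrics` is assumed here, and the boxes below are toy placeholders):

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# class_metrics=True also yields the per-class map / mar_100 columns reported above.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([1]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 11.0, 48.0, 52.0]]),
    "labels": torch.tensor([1]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["mar_100"])
print(results["map_per_class"], results["mar_100_per_class"])
```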
## Model description
More information needed
## Intended uses & limitations
More information needed
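Given the very low mAP reported above, detections from this checkpoint should be treated as exploratory rather than production-ready. A minimal inference sketch, assuming the standard `transformers` object-detection API and a hypothetical local image `example.jpg` (not part of this repository):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

model_id = "sahand1/detr_finetuned_candy"
# Assumes the image processor was saved alongside the model weights.
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForObjectDetection.from_pretrained(model_id)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and boxes into (score, label, box) in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```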
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 100
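As an illustrative sketch only (the original training script is not included in this card), the values above map onto a `TrainingArguments` configuration like the following; the Adam betas and epsilon listed above are the library defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr_finetuned_candy",   # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=100,
    lr_scheduler_type="cosine",
    seed=42,
    remove_unused_columns=False,  # commonly required for object-detection collators
)
```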
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Black Star | Mar 100 Black Star | Map Cat | Mar 100 Cat | Map Grey Star | Mar 100 Grey Star | Map Insect | Mar 100 Insect | Map Moon | Mar 100 Moon | Map Owl | Mar 100 Owl | Map Unicorn Head | Mar 100 Unicorn Head | Map Unicorn Whole | Mar 100 Unicorn Whole |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:-----:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------:|:------------------:|:-------:|:-----------:|:-------------:|:-----------------:|:----------:|:--------------:|:--------:|:------------:|:-------:|:-----------:|:----------------:|:--------------------:|:-----------------:|:---------------------:|
| No log | 1.0 | 1 | 43.4627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 2.0 | 2 | 38.1840 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 3.0 | 3 | 34.5770 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 4.0 | 4 | 31.3127 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 5.0 | 5 | 28.3996 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 6.0 | 6 | 25.8281 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 7.0 | 7 | 23.5803 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 8.0 | 8 | 21.5154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 9.0 | 9 | 19.7246 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 10.0 | 10 | 18.0319 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 11.0 | 11 | 16.4552 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 12.0 | 12 | 14.9394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 13.0 | 13 | 13.5355 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 14.0 | 14 | 12.1913 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 15.0 | 15 | 11.0468 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 16.0 | 16 | 9.9526 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 17.0 | 17 | 8.9056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 18.0 | 18 | 8.1271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 19.0 | 19 | 7.4517 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 20.0 | 20 | 6.8534 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 21.0 | 21 | 6.2685 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 22.0 | 22 | 5.7899 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 23.0 | 23 | 5.3402 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 24.0 | 24 | 5.0099 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 25.0 | 25 | 4.8475 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 26.0 | 26 | 4.6449 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 27.0 | 27 | 4.4136 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 28.0 | 28 | 4.1757 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 29.0 | 29 | 3.9868 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 30.0 | 30 | 3.8385 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 31.0 | 31 | 3.6915 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 32.0 | 32 | 3.5659 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| No log | 33.0 | 33 | 3.4609 | 0.0003 | 0.0011 | 0.0 | 0.0 | 0.0005 | -1.0 | 0.0 | 0.0 | 0.0214 | 0.0 | 0.03 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0023 | 0.15 | -1.0 | -1.0 |
| No log | 34.0 | 34 | 3.3716 | 0.0005 | 0.0014 | 0.0 | 0.0 | 0.0008 | -1.0 | 0.0 | 0.0 | 0.0286 | 0.0 | 0.04 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0032 | 0.2 | -1.0 | -1.0 |
| No log | 35.0 | 35 | 3.2758 | 0.0006 | 0.0019 | 0.0 | 0.0 | 0.001 | -1.0 | 0.0 | 0.0 | 0.0286 | 0.0 | 0.04 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0043 | 0.2 | -1.0 | -1.0 |
| No log | 36.0 | 36 | 3.1869 | 0.0008 | 0.0027 | 0.0 | 0.0 | 0.0014 | -1.0 | 0.0 | 0.0 | 0.0286 | 0.0 | 0.04 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0058 | 0.2 | -1.0 | -1.0 |
| No log | 37.0 | 37 | 3.1176 | 0.0008 | 0.003 | 0.0 | 0.0 | 0.0014 | -1.0 | 0.0 | 0.0 | 0.0214 | 0.0 | 0.03 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0054 | 0.15 | -1.0 | -1.0 |
| No log | 38.0 | 38 | 3.0594 | 0.0047 | 0.0083 | 0.0045 | 0.0 | 0.0092 | -1.0 | 0.0 | 0.0 | 0.1429 | 0.0 | 0.2 | -1.0 | 0.0 | 0.0 | 0.025 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.2 | -1.0 | -1.0 |
| No log | 39.0 | 39 | 3.0267 | 0.0056 | 0.0096 | 0.0053 | 0.0 | 0.0095 | -1.0 | 0.0 | 0.1143 | 0.1571 | 0.0 | 0.22 | -1.0 | 0.0 | 0.0 | 0.0296 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0097 | 0.3 | -1.0 | -1.0 |
| No log | 40.0 | 40 | 3.0128 | 0.0059 | 0.0096 | 0.0049 | 0.0 | 0.0095 | -1.0 | 0.0 | 0.1286 | 0.1714 | 0.0 | 0.24 | -1.0 | 0.0 | 0.0 | 0.031 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0101 | 0.3 | -1.0 | -1.0 |
| No log | 41.0 | 41 | 2.9745 | 0.0048 | 0.0082 | 0.0039 | 0.0 | 0.0076 | -1.0 | 0.0 | 0.0 | 0.1714 | 0.0 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0243 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0094 | 0.3 | -1.0 | -1.0 |
| No log | 42.0 | 42 | 2.8860 | 0.0062 | 0.0098 | 0.0055 | 0.0 | 0.0108 | -1.0 | 0.0 | 0.1286 | 0.1714 | 0.0 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0346 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0087 | 0.3 | -1.0 | -1.0 |
| No log | 43.0 | 43 | 2.8396 | 0.0067 | 0.0108 | 0.0084 | 0.0 | 0.0127 | -1.0 | 0.0 | 0.1143 | 0.1643 | 0.0 | 0.23 | -1.0 | 0.0 | 0.0 | 0.0381 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.009 | 0.35 | -1.0 | -1.0 |
| No log | 44.0 | 44 | 2.7898 | 0.0072 | 0.0104 | 0.0068 | 0.0 | 0.0135 | -1.0 | 0.0 | 0.1286 | 0.1714 | 0.0 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0429 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0075 | 0.3 | -1.0 | -1.0 |
| No log | 45.0 | 45 | 2.7496 | 0.0055 | 0.008 | 0.0066 | 0.0 | 0.0102 | -1.0 | 0.0 | 0.0 | 0.1786 | 0.0 | 0.25 | -1.0 | 0.0 | 0.0 | 0.03 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0087 | 0.35 | -1.0 | -1.0 |
| No log | 46.0 | 46 | 2.7201 | 0.004 | 0.0055 | 0.0055 | 0.0 | 0.0072 | -1.0 | 0.0 | 0.0 | 0.1571 | 0.0 | 0.22 | -1.0 | 0.0 | 0.0 | 0.019 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0089 | 0.3 | -1.0 | -1.0 |
| No log | 47.0 | 47 | 2.7139 | 0.0035 | 0.005 | 0.005 | 0.0 | 0.0055 | -1.0 | 0.0 | 0.0 | 0.1571 | 0.0 | 0.22 | -1.0 | 0.0 | 0.0 | 0.0131 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0112 | 0.3 | -1.0 | -1.0 |
| No log | 48.0 | 48 | 2.7079 | 0.0034 | 0.0049 | 0.0049 | 0.0 | 0.0053 | -1.0 | 0.0 | 0.0 | 0.1571 | 0.0 | 0.22 | -1.0 | 0.0 | 0.0 | 0.0125 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0112 | 0.3 | -1.0 | -1.0 |
| No log | 49.0 | 49 | 2.6761 | 0.004 | 0.006 | 0.0033 | 0.0 | 0.0065 | -1.0 | 0.0 | 0.0 | 0.15 | 0.0 | 0.21 | -1.0 | 0.0 | 0.0 | 0.0186 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0094 | 0.25 | -1.0 | -1.0 |
| No log | 50.0 | 50 | 2.6451 | 0.0047 | 0.0065 | 0.0037 | 0.0 | 0.0081 | -1.0 | 0.0 | 0.0 | 0.1643 | 0.0 | 0.23 | -1.0 | 0.0 | 0.0 | 0.0231 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0101 | 0.25 | -1.0 | -1.0 |
| No log | 51.0 | 51 | 2.6156 | 0.0048 | 0.0064 | 0.004 | 0.0 | 0.0077 | -1.0 | 0.0 | 0.0 | 0.1643 | 0.0 | 0.23 | -1.0 | 0.0 | 0.0 | 0.025 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0084 | 0.25 | -1.0 | -1.0 |
| No log | 52.0 | 52 | 2.5724 | 0.0054 | 0.0074 | 0.0062 | 0.0 | 0.0103 | -1.0 | 0.0 | 0.1143 | 0.1571 | 0.0 | 0.22 | -1.0 | 0.0 | 0.0 | 0.0296 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.3 | -1.0 | -1.0 |
| No log | 53.0 | 53 | 2.5404 | 0.0052 | 0.0072 | 0.0055 | 0.0 | 0.0128 | -1.0 | 0.0 | 0.1143 | 0.15 | 0.0 | 0.21 | -1.0 | 0.0 | 0.0 | 0.0308 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0054 | 0.25 | -1.0 | -1.0 |
| No log | 54.0 | 54 | 2.5318 | 0.005 | 0.007 | 0.0053 | 0.0 | 0.0148 | -1.0 | 0.0 | 0.1143 | 0.15 | 0.0 | 0.21 | -1.0 | 0.0 | 0.0 | 0.0296 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0053 | 0.25 | -1.0 | -1.0 |
| No log | 55.0 | 55 | 2.5240 | 0.0049 | 0.0067 | 0.0051 | 0.0 | 0.0162 | -1.0 | 0.0 | 0.1143 | 0.15 | 0.0 | 0.21 | -1.0 | 0.0 | 0.0 | 0.0286 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0057 | 0.25 | -1.0 | -1.0 |
| No log | 56.0 | 56 | 2.5011 | 0.0051 | 0.0062 | 0.0062 | 0.0 | 0.0131 | -1.0 | 0.0 | 0.0 | 0.1714 | 0.0 | 0.24 | -1.0 | 0.0 | 0.0 | 0.029 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0066 | 0.3 | -1.0 | -1.0 |
| No log | 57.0 | 57 | 2.4993 | 0.0046 | 0.0068 | 0.0059 | 0.0 | 0.0084 | -1.0 | 0.0 | 0.0 | 0.1714 | 0.0 | 0.24 | -1.0 | 0.0 | 0.0 | 0.025 | 0.8 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.4 | -1.0 | -1.0 |
| No log | 58.0 | 58 | 2.4976 | 0.0011 | 0.0025 | 0.0014 | 0.0 | 0.0018 | -1.0 | 0.0 | 0.0 | 0.0571 | 0.0 | 0.08 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.4 | -1.0 | -1.0 |
| No log | 59.0 | 59 | 2.4791 | 0.0012 | 0.0033 | 0.0013 | 0.0 | 0.0021 | -1.0 | 0.0 | 0.0 | 0.0571 | 0.0 | 0.08 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0083 | 0.4 | -1.0 | -1.0 |
| No log | 60.0 | 60 | 2.4503 | 0.0012 | 0.0039 | 0.0012 | 0.0 | 0.0023 | -1.0 | 0.0 | 0.0 | 0.0643 | 0.0 | 0.09 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0083 | 0.45 | -1.0 | -1.0 |
| No log | 61.0 | 61 | 2.4290 | 0.0012 | 0.0043 | 0.0012 | 0.0 | 0.0024 | -1.0 | 0.0 | 0.0 | 0.0643 | 0.0 | 0.09 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.45 | -1.0 | -1.0 |
| No log | 62.0 | 62 | 2.4172 | 0.0011 | 0.0042 | 0.0011 | 0.0 | 0.0024 | -1.0 | 0.0 | 0.0 | 0.0571 | 0.0 | 0.08 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.4 | -1.0 | -1.0 |
| No log | 63.0 | 63 | 2.3938 | 0.001 | 0.0039 | 0.001 | 0.0 | 0.0022 | -1.0 | 0.0 | 0.0 | 0.0571 | 0.0 | 0.08 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.007 | 0.4 | -1.0 | -1.0 |
| No log | 64.0 | 64 | 2.3736 | 0.0011 | 0.0036 | 0.0009 | 0.0 | 0.0022 | -1.0 | 0.0 | 0.0 | 0.0571 | 0.0 | 0.08 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.4 | -1.0 | -1.0 |
| No log | 65.0 | 65 | 2.3663 | 0.0011 | 0.0035 | 0.0009 | 0.0 | 0.002 | -1.0 | 0.0 | 0.0 | 0.0571 | 0.0 | 0.08 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.4 | -1.0 | -1.0 |
| No log | 66.0 | 66 | 2.3599 | 0.0013 | 0.0035 | 0.0009 | 0.0 | 0.0023 | -1.0 | 0.0 | 0.0 | 0.0643 | 0.0 | 0.09 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0093 | 0.45 | -1.0 | -1.0 |
| No log | 67.0 | 67 | 2.3547 | 0.0014 | 0.0037 | 0.0009 | 0.0 | 0.0024 | -1.0 | 0.0 | 0.0 | 0.0643 | 0.0 | 0.09 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0096 | 0.45 | -1.0 | -1.0 |
| No log | 68.0 | 68 | 2.3477 | 0.0011 | 0.0037 | 0.0009 | 0.0 | 0.0019 | -1.0 | 0.0 | 0.0 | 0.0571 | 0.0 | 0.08 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.4 | -1.0 | -1.0 |
| No log | 69.0 | 69 | 2.3357 | 0.0011 | 0.0038 | 0.001 | 0.0 | 0.002 | -1.0 | 0.0 | 0.0 | 0.0571 | 0.0 | 0.08 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.008 | 0.4 | -1.0 | -1.0 |
| No log | 70.0 | 70 | 2.3244 | 0.0012 | 0.0039 | 0.001 | 0.0 | 0.0023 | -1.0 | 0.0 | 0.0 | 0.0571 | 0.0 | 0.08 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0081 | 0.4 | -1.0 | -1.0 |
| No log | 71.0 | 71 | 2.3189 | 0.0012 | 0.0039 | 0.001 | 0.0 | 0.0025 | -1.0 | 0.0 | 0.0 | 0.0571 | 0.0 | 0.08 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0082 | 0.4 | -1.0 | -1.0 |
| No log | 72.0 | 72 | 2.3207 | 0.0009 | 0.004 | 0.001 | 0.0 | 0.0021 | -1.0 | 0.0 | 0.0 | 0.05 | 0.0 | 0.07 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0063 | 0.35 | -1.0 | -1.0 |
| No log | 73.0 | 73 | 2.3184 | 0.0052 | 0.0087 | 0.0058 | 0.0 | 0.0159 | -1.0 | 0.0 | 0.0 | 0.1786 | 0.0 | 0.25 | -1.0 | 0.0 | 0.0 | 0.03 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0063 | 0.35 | -1.0 | -1.0 |
| No log | 74.0 | 74 | 2.3131 | 0.0059 | 0.0096 | 0.0065 | 0.0 | 0.0183 | -1.0 | 0.0 | 0.1286 | 0.1786 | 0.0 | 0.25 | -1.0 | 0.0 | 0.0 | 0.0346 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0065 | 0.35 | -1.0 | -1.0 |
| No log | 75.0 | 75 | 2.3047 | 0.0062 | 0.0095 | 0.0065 | 0.0 | 0.0164 | -1.0 | 0.0 | 0.1286 | 0.1857 | 0.0 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0346 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.4 | -1.0 | -1.0 |
| No log | 76.0 | 76 | 2.2978 | 0.0064 | 0.0098 | 0.0067 | 0.0 | 0.0152 | -1.0 | 0.0 | 0.1286 | 0.1857 | 0.0 | 0.26 | -1.0 | 0.0 | 0.0 | 0.036 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0086 | 0.4 | -1.0 | -1.0 |
| No log | 77.0 | 77 | 2.2943 | 0.0066 | 0.0099 | 0.007 | 0.0 | 0.0135 | -1.0 | 0.0 | 0.1286 | 0.1857 | 0.0 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0375 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.4 | -1.0 | -1.0 |
| No log | 78.0 | 78 | 2.2913 | 0.0065 | 0.0098 | 0.007 | 0.0 | 0.0133 | -1.0 | 0.0 | 0.1286 | 0.1857 | 0.0 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0375 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0083 | 0.4 | -1.0 | -1.0 |
| No log | 79.0 | 79 | 2.2875 | 0.0066 | 0.0099 | 0.007 | 0.0 | 0.0134 | -1.0 | 0.0 | 0.1286 | 0.1857 | 0.0 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0375 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0084 | 0.4 | -1.0 | -1.0 |
| No log | 80.0 | 80 | 2.2839 | 0.0068 | 0.0101 | 0.0072 | 0.0 | 0.0141 | -1.0 | 0.0 | 0.1286 | 0.1857 | 0.0 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0391 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0082 | 0.4 | -1.0 | -1.0 |
| No log | 81.0 | 81 | 2.2800 | 0.007 | 0.0104 | 0.0075 | 0.0 | 0.0151 | -1.0 | 0.0 | 0.1286 | 0.1857 | 0.0 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0409 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0081 | 0.4 | -1.0 | -1.0 |
| No log | 82.0 | 82 | 2.2785 | 0.0079 | 0.0113 | 0.0085 | 0.0 | 0.0203 | -1.0 | 0.0 | 0.1286 | 0.1857 | 0.0 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0474 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.008 | 0.4 | -1.0 | -1.0 |
| No log | 83.0 | 83 | 2.2781 | 0.0079 | 0.0113 | 0.0085 | 0.0 | 0.0224 | -1.0 | 0.0 | 0.1286 | 0.1857 | 0.0 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0474 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.4 | -1.0 | -1.0 |
| No log | 84.0 | 84 | 2.2775 | 0.0079 | 0.0112 | 0.0085 | 0.0 | 0.0224 | -1.0 | 0.0 | 0.1286 | 0.1857 | 0.0 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0474 | 0.9 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.4 | -1.0 | -1.0 |
| No log | 85.0 | 85 | 2.2796 | 0.0084 | 0.0125 | 0.0085 | 0.0067 | 0.0249 | -1.0 | 0.0 | 0.1286 | 0.2429 | 0.2 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0474 | 0.9 | 0.0036 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.4 | -1.0 | -1.0 |
| No log | 86.0 | 86 | 2.2808 | 0.0084 | 0.0124 | 0.0084 | 0.0059 | 0.0282 | -1.0 | 0.0 | 0.1286 | 0.2429 | 0.2 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0474 | 0.9 | 0.0035 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.4 | -1.0 | -1.0 |
| No log | 87.0 | 87 | 2.2814 | 0.0084 | 0.0124 | 0.0084 | 0.0059 | 0.0283 | -1.0 | 0.0 | 0.1286 | 0.2429 | 0.2 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0474 | 0.9 | 0.0036 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.4 | -1.0 | -1.0 |
| No log | 88.0 | 88 | 2.2804 | 0.0084 | 0.0124 | 0.0084 | 0.0059 | 0.0283 | -1.0 | 0.0 | 0.1286 | 0.2429 | 0.2 | 0.26 | -1.0 | 0.0 | 0.0 | 0.0474 | 0.9 | 0.0036 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.4 | -1.0 | -1.0 |
| No log | 89.0 | 89 | 2.2784 | 0.0076 | 0.0124 | 0.0084 | 0.0059 | 0.0254 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0421 | 0.8 | 0.0035 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.4 | -1.0 | -1.0 |
| No log | 90.0 | 90 | 2.2764 | 0.0076 | 0.0124 | 0.0084 | 0.0059 | 0.0254 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0421 | 0.8 | 0.0035 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.4 | -1.0 | -1.0 |
| No log | 91.0 | 91 | 2.2744 | 0.0076 | 0.0124 | 0.0084 | 0.0059 | 0.0254 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0421 | 0.8 | 0.0035 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.4 | -1.0 | -1.0 |
| No log | 92.0 | 92 | 2.2729 | 0.0076 | 0.0124 | 0.0084 | 0.0063 | 0.0224 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0421 | 0.8 | 0.0035 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.4 | -1.0 | -1.0 |
| No log | 93.0 | 93 | 2.2717 | 0.0076 | 0.0124 | 0.0084 | 0.0065 | 0.0224 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0421 | 0.8 | 0.0035 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.4 | -1.0 | -1.0 |
| No log | 94.0 | 94 | 2.2708 | 0.0076 | 0.0124 | 0.0084 | 0.0065 | 0.0202 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0421 | 0.8 | 0.0035 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.4 | -1.0 | -1.0 |
| No log | 95.0 | 95 | 2.2700 | 0.0076 | 0.0124 | 0.0084 | 0.0065 | 0.0202 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0421 | 0.8 | 0.0034 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.4 | -1.0 | -1.0 |
| No log | 96.0 | 96 | 2.2696 | 0.0073 | 0.012 | 0.0081 | 0.0065 | 0.0184 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.04 | 0.8 | 0.0034 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.4 | -1.0 | -1.0 |
| No log | 97.0 | 97 | 2.2692 | 0.007 | 0.0117 | 0.0077 | 0.0065 | 0.0184 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0381 | 0.8 | 0.0034 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.4 | -1.0 | -1.0 |
| No log | 98.0 | 98 | 2.2690 | 0.007 | 0.0117 | 0.0077 | 0.0065 | 0.0184 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0381 | 0.8 | 0.0034 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.4 | -1.0 | -1.0 |
| No log | 99.0 | 99 | 2.2689 | 0.007 | 0.0117 | 0.0077 | 0.0065 | 0.0184 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0381 | 0.8 | 0.0034 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.4 | -1.0 | -1.0 |
| No log | 100.0 | 100 | 2.2689 | 0.007 | 0.0117 | 0.0077 | 0.0065 | 0.0184 | -1.0 | 0.0 | 0.1143 | 0.2286 | 0.2 | 0.24 | -1.0 | 0.0 | 0.0 | 0.0381 | 0.8 | 0.0034 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.4 | -1.0 | -1.0 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.19.1
| [
"black_star",
"cat",
"grey_star",
"insect",
"moon",
"owl",
"unicorn_head",
"unicorn_whole"
] |
omarelsayeed/t |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"caption",
"footnote",
"formula",
"list-item",
"page-footer",
"page-header",
"picture",
"section-header",
"table",
"text",
"title"
] |
jadechoghari/RT-DETRv2 |
This is the HF transformers implementation of RT-DETRv2.
Model: RT-DETRv2-S
RT-DETRv2 is an improved Real-Time DEtection TRansformer (RT-DETR). It builds upon the previous state-of-the-art real-time detector, RT-DETR, opening up a set of bag-of-freebies for flexibility and practicality and optimizing the training strategy for enhanced performance. To improve flexibility, we suggest setting a distinct number of sampling points for features at different scales in the deformable attention, so that the decoder performs selective multi-scale feature extraction.
Usage:
```python
import torch
import requests
from PIL import Image
from transformers import RTDetrForObjectDetection, RTDetrImageProcessor

# Load a sample COCO image
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

# Load the image processor and the RT-DETRv2 checkpoint
image_processor = RTDetrImageProcessor.from_pretrained("jadechoghari/RT-DETRv2")
model = RTDetrForObjectDetection.from_pretrained("jadechoghari/RT-DETRv2")

# Preprocess the image and run inference
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to boxes in image coordinates, keeping detections with score > 0.3
results = image_processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.3
)

for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
| [
"person",
"bicycle",
"car",
"motorbike",
"aeroplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"sofa",
"pottedplant",
"bed",
"diningtable",
"toilet",
"tvmonitor",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
BjngChjjljng/detr-daytime-fold-0 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
arfrie22/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
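For orientation, a minimal sketch of how the hyperparameters above might be expressed with 🤗 `TrainingArguments` — this is not the exact training script, and the output directory is a placeholder:
```python
# Hedged sketch: values mirror the hyperparameters listed above; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-resnet-50_finetuned_cppe5",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999) and epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,                    # "Native AMP" mixed-precision training
)
```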
### Training results
### Framework versions
- Transformers 4.47.0.dev0
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
arfrie22/detr-resnet-50_finetuned_metric_screw |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_metric_screw
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.47.0.dev0
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"m2",
"m3",
"m4",
"m5",
"m6",
"m7",
"m8",
"wood"
] |
hxwk507/detr-garbage |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
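In the absence of an official snippet, a hedged, generic starting point is sketched below. It assumes the checkpoint is a DETR-style object detector loadable through the Auto classes under the Hub id `hxwk507/detr-garbage`; `litter.jpg` is a placeholder image path.
```python
# Hedged sketch, not the authors' code: assumes a DETR-style object-detection checkpoint.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

processor = AutoImageProcessor.from_pretrained("hxwk507/detr-garbage")
model = AutoModelForObjectDetection.from_pretrained("hxwk507/detr-garbage")

image = Image.open("litter.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Keep detections above a confidence threshold and map label ids to class names
results = processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 2), [round(c, 1) for c in box.tolist()])
```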
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"aluminium foil",
"battery",
"aluminium blister pack",
"carded blister pack",
"other plastic bottle",
"clear plastic bottle",
"glass bottle",
"plastic bottle cap",
"metal bottle cap",
"broken glass",
"food can",
"aerosol",
"drink can",
"toilet tube",
"other carton",
"egg carton",
"drink carton",
"corrugated carton",
"meal carton",
"pizza box",
"paper cup",
"disposable plastic cup",
"foam cup",
"glass cup",
"other plastic cup",
"food waste",
"glass jar",
"plastic lid",
"metal lid",
"other plastic",
"magazine paper",
"tissues",
"wrapping paper",
"normal paper",
"paper bag",
"plastified paper bag",
"plastic film",
"six pack rings",
"garbage bag",
"other plastic wrapper",
"single-use carrier bag",
"polypropylene bag",
"crisp packet",
"spread tub",
"tupperware",
"disposable food container",
"foam food container",
"other plastic container",
"plastic glooves",
"plastic utensils",
"pop tab",
"rope & strings",
"scrap metal",
"shoe",
"squeezable tube",
"plastic straw",
"paper straw",
"styrofoam piece",
"unlabeled litter",
"cigarette"
] |
joe611/chickens-composite-201616161616-150-epochs-w-transform-metrics-test-shfld |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# chickens-composite-201616161616-150-epochs-w-transform-metrics-test-shfld
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2904
- Map: 0.8051
- Map 50: 0.9551
- Map 75: 0.9034
- Map Small: 0.279
- Map Medium: 0.8008
- Map Large: 0.8412
- Mar 1: 0.3255
- Mar 10: 0.8376
- Mar 100: 0.8429
- Mar Small: 0.3733
- Mar Medium: 0.8409
- Mar Large: 0.8686
- Map Chicken: 0.8056
- Mar 100 Chicken: 0.8421
- Map Duck: 0.7416
- Mar 100 Duck: 0.7918
- Map Plant: 0.8682
- Mar 100 Plant: 0.8948
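The Map/Mar values above are standard COCO-style detection metrics: mean average precision at different IoU thresholds (50, 75) and object sizes (small, medium, large), plus mean average recall at 1, 10, and 100 detections, reported overall and per class (chicken, duck, plant). As a hedged illustration — not this card's actual evaluation code — such numbers can be computed with `torchmetrics`; the tensors below are toy placeholders, and `pycocotools` is required as the backend:
```python
# Hedged sketch of COCO-style mAP/mAR computation; boxes, scores, and labels are toy values.
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(class_metrics=True)  # also report per-class map/mar

preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0]]),   # xyxy, absolute pixels
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),                          # e.g. 0 = chicken
}]
targets = [{
    "boxes": torch.tensor([[12.0, 11.0, 48.0, 52.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()  # dict with "map", "map_50", "map_75", "mar_100", "map_per_class", ...
print(results["map"], results["map_50"], results["mar_100"])
```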
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 150
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.2525 | 1.0 | 500 | 1.4914 | 0.0431 | 0.0683 | 0.0461 | 0.0038 | 0.0195 | 0.0695 | 0.0492 | 0.1277 | 0.2728 | 0.1105 | 0.2434 | 0.2835 | 0.0272 | 0.123 | 0.0 | 0.0 | 0.1019 | 0.6955 |
| 1.155 | 2.0 | 1000 | 1.2507 | 0.0894 | 0.1307 | 0.1082 | 0.0004 | 0.0255 | 0.1241 | 0.0475 | 0.1784 | 0.2424 | 0.0319 | 0.2168 | 0.254 | 0.0017 | 0.0151 | 0.0 | 0.0 | 0.2665 | 0.7121 |
| 1.0293 | 3.0 | 1500 | 1.3024 | 0.1578 | 0.2269 | 0.1836 | 0.0048 | 0.0574 | 0.1977 | 0.0638 | 0.2155 | 0.2325 | 0.0624 | 0.2181 | 0.245 | 0.0136 | 0.0306 | 0.0 | 0.0 | 0.4599 | 0.667 |
| 1.082 | 4.0 | 2000 | 1.0289 | 0.2117 | 0.294 | 0.2408 | 0.012 | 0.1469 | 0.2285 | 0.084 | 0.2638 | 0.2736 | 0.0671 | 0.2563 | 0.2688 | 0.0444 | 0.0905 | 0.0 | 0.0 | 0.5908 | 0.7303 |
| 0.942 | 5.0 | 2500 | 0.9964 | 0.2573 | 0.3856 | 0.2933 | 0.036 | 0.2099 | 0.3008 | 0.1076 | 0.3839 | 0.4097 | 0.0733 | 0.3949 | 0.4279 | 0.1877 | 0.4913 | 0.0 | 0.0 | 0.5843 | 0.7379 |
| 0.9893 | 6.0 | 3000 | 0.9405 | 0.3097 | 0.4517 | 0.3646 | 0.0382 | 0.2711 | 0.3334 | 0.1153 | 0.4333 | 0.4494 | 0.1205 | 0.4235 | 0.4446 | 0.3348 | 0.6028 | 0.0 | 0.0 | 0.5944 | 0.7455 |
| 0.8888 | 7.0 | 3500 | 0.9067 | 0.3208 | 0.4688 | 0.3763 | 0.0064 | 0.2687 | 0.3556 | 0.1217 | 0.4436 | 0.4556 | 0.0867 | 0.4122 | 0.4733 | 0.3538 | 0.6512 | 0.0 | 0.0 | 0.6085 | 0.7158 |
| 0.8665 | 8.0 | 4000 | 0.9043 | 0.3405 | 0.497 | 0.3869 | 0.0582 | 0.2792 | 0.3696 | 0.1201 | 0.4413 | 0.4434 | 0.1329 | 0.3964 | 0.4453 | 0.3641 | 0.6028 | 0.0 | 0.0 | 0.6576 | 0.7273 |
| 0.7811 | 9.0 | 4500 | 0.8501 | 0.3469 | 0.49 | 0.4044 | 0.0308 | 0.2774 | 0.3713 | 0.1276 | 0.4542 | 0.458 | 0.121 | 0.3951 | 0.4696 | 0.3969 | 0.6639 | 0.0 | 0.0 | 0.6438 | 0.71 |
| 0.9009 | 10.0 | 5000 | 0.8122 | 0.3619 | 0.5128 | 0.4292 | 0.0547 | 0.3132 | 0.3705 | 0.1261 | 0.4682 | 0.4721 | 0.1233 | 0.4335 | 0.4843 | 0.4101 | 0.6671 | 0.0 | 0.0 | 0.6757 | 0.7491 |
| 0.8164 | 11.0 | 5500 | 0.7989 | 0.3614 | 0.5269 | 0.4129 | 0.0461 | 0.3348 | 0.3692 | 0.1215 | 0.4636 | 0.4706 | 0.1414 | 0.4403 | 0.4945 | 0.408 | 0.669 | 0.0 | 0.0 | 0.6763 | 0.7427 |
| 0.6442 | 12.0 | 6000 | 0.7173 | 0.3907 | 0.5439 | 0.4495 | 0.0361 | 0.3623 | 0.4158 | 0.132 | 0.4932 | 0.4956 | 0.1314 | 0.4665 | 0.5239 | 0.4743 | 0.7222 | 0.0 | 0.0 | 0.6978 | 0.7645 |
| 0.7746 | 13.0 | 6500 | 0.7465 | 0.3797 | 0.5401 | 0.4237 | 0.0892 | 0.3473 | 0.3955 | 0.1272 | 0.4764 | 0.4788 | 0.1862 | 0.4487 | 0.5006 | 0.4399 | 0.6782 | 0.0 | 0.0 | 0.6993 | 0.7582 |
| 0.674 | 14.0 | 7000 | 0.7455 | 0.3763 | 0.5547 | 0.4313 | 0.0721 | 0.3451 | 0.3911 | 0.128 | 0.4666 | 0.4691 | 0.1005 | 0.4456 | 0.4882 | 0.4421 | 0.6647 | 0.0 | 0.0 | 0.6869 | 0.7427 |
| 0.6078 | 15.0 | 7500 | 0.6729 | 0.3935 | 0.5477 | 0.4558 | 0.0364 | 0.3554 | 0.4275 | 0.1323 | 0.4865 | 0.4927 | 0.1381 | 0.4688 | 0.5204 | 0.4547 | 0.6944 | 0.0 | 0.0 | 0.7257 | 0.7836 |
| 0.7171 | 16.0 | 8000 | 0.6612 | 0.3927 | 0.5546 | 0.4655 | 0.056 | 0.3656 | 0.4249 | 0.1276 | 0.4824 | 0.4874 | 0.1567 | 0.4603 | 0.5148 | 0.4729 | 0.7004 | 0.0 | 0.0 | 0.7053 | 0.7618 |
| 0.6641 | 17.0 | 8500 | 0.6510 | 0.4041 | 0.5669 | 0.4742 | 0.0569 | 0.3677 | 0.4294 | 0.1327 | 0.488 | 0.4929 | 0.1081 | 0.4614 | 0.5184 | 0.4918 | 0.7048 | 0.0 | 0.0 | 0.7204 | 0.7739 |
| 0.5962 | 18.0 | 9000 | 0.6768 | 0.3956 | 0.5561 | 0.4543 | 0.0178 | 0.3723 | 0.423 | 0.1351 | 0.4825 | 0.4845 | 0.1 | 0.4638 | 0.5145 | 0.4617 | 0.6778 | 0.0 | 0.0 | 0.7249 | 0.7758 |
| 0.8006 | 19.0 | 9500 | 0.6209 | 0.4092 | 0.5645 | 0.494 | 0.0796 | 0.3747 | 0.4398 | 0.1316 | 0.4937 | 0.5006 | 0.1467 | 0.4716 | 0.5279 | 0.4906 | 0.7163 | 0.0 | 0.0 | 0.7371 | 0.7855 |
| 0.7191 | 20.0 | 10000 | 0.5962 | 0.423 | 0.5709 | 0.5003 | 0.1072 | 0.3958 | 0.4585 | 0.1394 | 0.5047 | 0.5078 | 0.1976 | 0.4822 | 0.5395 | 0.5279 | 0.7313 | 0.0 | 0.0 | 0.7411 | 0.7921 |
| 0.6064 | 21.0 | 10500 | 0.6064 | 0.4197 | 0.5777 | 0.4921 | 0.0939 | 0.3888 | 0.4519 | 0.1379 | 0.4981 | 0.5023 | 0.1567 | 0.4762 | 0.5318 | 0.528 | 0.7242 | 0.0 | 0.0 | 0.7311 | 0.7827 |
| 0.6584 | 22.0 | 11000 | 0.5884 | 0.4283 | 0.583 | 0.4987 | 0.0976 | 0.3985 | 0.4536 | 0.1451 | 0.5046 | 0.5086 | 0.1752 | 0.4804 | 0.5369 | 0.5453 | 0.7377 | 0.0 | 0.0 | 0.7396 | 0.7882 |
| 0.6776 | 23.0 | 11500 | 0.5939 | 0.4122 | 0.5716 | 0.4735 | 0.111 | 0.3761 | 0.4406 | 0.136 | 0.4949 | 0.4991 | 0.1752 | 0.4691 | 0.5215 | 0.4934 | 0.706 | 0.0 | 0.0 | 0.7433 | 0.7912 |
| 0.7424 | 24.0 | 12000 | 0.6049 | 0.4175 | 0.5644 | 0.4957 | 0.1045 | 0.3825 | 0.445 | 0.1395 | 0.4954 | 0.4978 | 0.1952 | 0.4668 | 0.522 | 0.5184 | 0.7151 | 0.0 | 0.0 | 0.7341 | 0.7782 |
| 0.5716 | 25.0 | 12500 | 0.5571 | 0.4373 | 0.5819 | 0.5031 | 0.1131 | 0.4026 | 0.4641 | 0.1402 | 0.5113 | 0.5134 | 0.1638 | 0.4881 | 0.5165 | 0.5416 | 0.7306 | 0.0 | 0.0 | 0.7702 | 0.8097 |
| 0.5601 | 26.0 | 13000 | 0.5891 | 0.4207 | 0.5784 | 0.4968 | 0.1128 | 0.3859 | 0.4535 | 0.1388 | 0.4983 | 0.5021 | 0.199 | 0.4693 | 0.529 | 0.5125 | 0.7143 | 0.0 | 0.0 | 0.7496 | 0.7921 |
| 0.667 | 27.0 | 13500 | 0.5549 | 0.4509 | 0.5944 | 0.5349 | 0.1368 | 0.412 | 0.4784 | 0.1463 | 0.5155 | 0.5198 | 0.2371 | 0.4894 | 0.5335 | 0.5772 | 0.7448 | 0.0 | 0.0 | 0.7754 | 0.8145 |
| 0.6431 | 28.0 | 14000 | 0.5422 | 0.4483 | 0.5946 | 0.5341 | 0.1445 | 0.4155 | 0.4777 | 0.1474 | 0.5177 | 0.5224 | 0.2371 | 0.4958 | 0.5384 | 0.5809 | 0.7563 | 0.0 | 0.0 | 0.764 | 0.8109 |
| 0.6363 | 29.0 | 14500 | 0.5430 | 0.4508 | 0.5956 | 0.5353 | 0.1094 | 0.4191 | 0.4876 | 0.1454 | 0.5165 | 0.5203 | 0.1914 | 0.4929 | 0.5445 | 0.5873 | 0.7524 | 0.0 | 0.0 | 0.7651 | 0.8085 |
| 0.4827 | 30.0 | 15000 | 0.5539 | 0.4591 | 0.5978 | 0.5379 | 0.1309 | 0.4365 | 0.4814 | 0.1496 | 0.5198 | 0.5233 | 0.1943 | 0.503 | 0.5389 | 0.6033 | 0.7536 | 0.0 | 0.0 | 0.7741 | 0.8164 |
| 0.5673 | 31.0 | 15500 | 0.6061 | 0.4422 | 0.5905 | 0.53 | 0.1173 | 0.411 | 0.4682 | 0.1415 | 0.501 | 0.5032 | 0.1819 | 0.4766 | 0.5185 | 0.5646 | 0.7099 | 0.0 | 0.0 | 0.762 | 0.7997 |
| 0.6023 | 32.0 | 16000 | 0.5962 | 0.4496 | 0.5873 | 0.5398 | 0.0528 | 0.4152 | 0.4889 | 0.1492 | 0.5086 | 0.512 | 0.1319 | 0.4848 | 0.5448 | 0.598 | 0.7413 | 0.0 | 0.0 | 0.7507 | 0.7948 |
| 0.5884 | 33.0 | 16500 | 0.5540 | 0.4553 | 0.6086 | 0.5514 | 0.1439 | 0.4199 | 0.4892 | 0.1474 | 0.5065 | 0.5103 | 0.209 | 0.482 | 0.5335 | 0.5965 | 0.7238 | 0.0 | 0.0 | 0.7695 | 0.807 |
| 0.6104 | 34.0 | 17000 | 0.5341 | 0.471 | 0.6193 | 0.566 | 0.1262 | 0.4349 | 0.4992 | 0.1492 | 0.5192 | 0.5226 | 0.2119 | 0.4916 | 0.5402 | 0.633 | 0.7492 | 0.0 | 0.0 | 0.78 | 0.8185 |
| 0.5326 | 35.0 | 17500 | 0.5223 | 0.4764 | 0.6211 | 0.5696 | 0.1251 | 0.4471 | 0.5052 | 0.1517 | 0.5202 | 0.525 | 0.1943 | 0.5011 | 0.5445 | 0.636 | 0.7397 | 0.0 | 0.0 | 0.7932 | 0.8355 |
| 0.5088 | 36.0 | 18000 | 0.5522 | 0.4683 | 0.6307 | 0.5664 | 0.1748 | 0.4395 | 0.4858 | 0.1466 | 0.5085 | 0.5113 | 0.2495 | 0.4867 | 0.524 | 0.6335 | 0.7234 | 0.0 | 0.0 | 0.7715 | 0.8106 |
| 0.5151 | 37.0 | 18500 | 0.5133 | 0.4857 | 0.6296 | 0.5677 | 0.171 | 0.446 | 0.4995 | 0.154 | 0.5223 | 0.526 | 0.2319 | 0.4923 | 0.541 | 0.6828 | 0.7611 | 0.0 | 0.0 | 0.7744 | 0.817 |
| 0.5623 | 38.0 | 19000 | 0.5089 | 0.493 | 0.6406 | 0.577 | 0.0923 | 0.4619 | 0.5165 | 0.1569 | 0.5263 | 0.5295 | 0.2176 | 0.5052 | 0.5475 | 0.6993 | 0.7675 | 0.0099 | 0.0052 | 0.7697 | 0.8158 |
| 0.5581 | 39.0 | 19500 | 0.5457 | 0.4715 | 0.6362 | 0.576 | 0.2016 | 0.4417 | 0.4838 | 0.1464 | 0.5005 | 0.5056 | 0.2505 | 0.4794 | 0.52 | 0.6399 | 0.7107 | 0.0 | 0.0 | 0.7746 | 0.8061 |
| 0.6243 | 40.0 | 20000 | 0.4951 | 0.5333 | 0.6956 | 0.6383 | 0.1647 | 0.5093 | 0.5588 | 0.1812 | 0.5656 | 0.5687 | 0.2067 | 0.5457 | 0.5961 | 0.6794 | 0.7472 | 0.133 | 0.133 | 0.7876 | 0.8258 |
| 0.5015 | 41.0 | 20500 | 0.4635 | 0.575 | 0.7201 | 0.6742 | 0.1788 | 0.5583 | 0.5652 | 0.2041 | 0.6051 | 0.6106 | 0.2243 | 0.6007 | 0.5995 | 0.718 | 0.7762 | 0.2051 | 0.2124 | 0.802 | 0.8433 |
| 0.5282 | 42.0 | 21000 | 0.4729 | 0.6178 | 0.7922 | 0.7371 | 0.1229 | 0.61 | 0.6311 | 0.2317 | 0.6599 | 0.6631 | 0.2167 | 0.6564 | 0.6695 | 0.6817 | 0.756 | 0.3748 | 0.3969 | 0.7969 | 0.8364 |
| 0.6359 | 43.0 | 21500 | 0.4664 | 0.6376 | 0.8312 | 0.7506 | 0.1173 | 0.6386 | 0.6168 | 0.2514 | 0.6855 | 0.6889 | 0.2357 | 0.6898 | 0.657 | 0.6815 | 0.756 | 0.4443 | 0.4794 | 0.7869 | 0.8315 |
| 0.631 | 44.0 | 22000 | 0.4675 | 0.659 | 0.8476 | 0.7996 | 0.1464 | 0.6619 | 0.6155 | 0.2578 | 0.7002 | 0.7032 | 0.2143 | 0.7079 | 0.6518 | 0.6981 | 0.7548 | 0.4847 | 0.5206 | 0.7942 | 0.8342 |
| 0.469 | 45.0 | 22500 | 0.4369 | 0.6797 | 0.8849 | 0.8126 | 0.1648 | 0.6676 | 0.6938 | 0.276 | 0.7232 | 0.7256 | 0.2605 | 0.7181 | 0.7371 | 0.6949 | 0.7587 | 0.5407 | 0.5742 | 0.8035 | 0.8439 |
| 0.5085 | 46.0 | 23000 | 0.4395 | 0.6936 | 0.9072 | 0.834 | 0.1854 | 0.6844 | 0.7324 | 0.283 | 0.7337 | 0.7382 | 0.2476 | 0.7333 | 0.7692 | 0.6934 | 0.7484 | 0.5887 | 0.6289 | 0.7988 | 0.8373 |
| 0.4624 | 47.0 | 23500 | 0.4190 | 0.7036 | 0.9138 | 0.8366 | 0.1937 | 0.6866 | 0.7519 | 0.2861 | 0.743 | 0.7458 | 0.2938 | 0.7334 | 0.7887 | 0.7183 | 0.775 | 0.6022 | 0.6361 | 0.7903 | 0.8264 |
| 0.3904 | 48.0 | 24000 | 0.4220 | 0.7152 | 0.9239 | 0.8418 | 0.1614 | 0.7041 | 0.7732 | 0.2953 | 0.7598 | 0.7642 | 0.239 | 0.7578 | 0.8158 | 0.7175 | 0.7762 | 0.6305 | 0.6753 | 0.7976 | 0.8412 |
| 0.4802 | 49.0 | 24500 | 0.3970 | 0.7291 | 0.9182 | 0.8617 | 0.166 | 0.7186 | 0.7531 | 0.2976 | 0.7693 | 0.7736 | 0.2257 | 0.7688 | 0.7991 | 0.7406 | 0.7952 | 0.6479 | 0.6876 | 0.7986 | 0.8379 |
| 0.4003 | 50.0 | 25000 | 0.4036 | 0.7171 | 0.9245 | 0.8569 | 0.1795 | 0.7048 | 0.7628 | 0.2939 | 0.7624 | 0.7662 | 0.2624 | 0.759 | 0.8077 | 0.7296 | 0.7893 | 0.6139 | 0.666 | 0.8078 | 0.8433 |
| 0.409 | 51.0 | 25500 | 0.3875 | 0.7353 | 0.9263 | 0.8576 | 0.1978 | 0.7263 | 0.7541 | 0.3061 | 0.7801 | 0.7844 | 0.2795 | 0.7806 | 0.8026 | 0.753 | 0.8012 | 0.648 | 0.7062 | 0.8049 | 0.8458 |
| 0.5391 | 52.0 | 26000 | 0.4081 | 0.706 | 0.9142 | 0.8518 | 0.187 | 0.693 | 0.7246 | 0.2889 | 0.7516 | 0.7552 | 0.289 | 0.7486 | 0.7686 | 0.7041 | 0.7702 | 0.6226 | 0.6639 | 0.7914 | 0.8315 |
| 0.5191 | 53.0 | 26500 | 0.3965 | 0.723 | 0.9371 | 0.8642 | 0.1739 | 0.7105 | 0.7569 | 0.3007 | 0.7655 | 0.7722 | 0.2862 | 0.7682 | 0.8008 | 0.7151 | 0.7663 | 0.6628 | 0.7155 | 0.7913 | 0.8348 |
| 0.4349 | 54.0 | 27000 | 0.3841 | 0.7412 | 0.9379 | 0.8713 | 0.1861 | 0.7364 | 0.7825 | 0.3044 | 0.7812 | 0.7862 | 0.2495 | 0.7854 | 0.827 | 0.7366 | 0.7865 | 0.6716 | 0.7165 | 0.8155 | 0.8555 |
| 0.4715 | 55.0 | 27500 | 0.3899 | 0.7336 | 0.9492 | 0.8691 | 0.238 | 0.7221 | 0.762 | 0.2995 | 0.7798 | 0.7845 | 0.331 | 0.7807 | 0.8017 | 0.7186 | 0.7706 | 0.6576 | 0.7206 | 0.8245 | 0.8621 |
| 0.4615 | 56.0 | 28000 | 0.3896 | 0.7311 | 0.9448 | 0.8832 | 0.2497 | 0.724 | 0.7506 | 0.2987 | 0.7737 | 0.7768 | 0.3743 | 0.77 | 0.7999 | 0.7216 | 0.773 | 0.657 | 0.7052 | 0.8146 | 0.8521 |
| 0.356 | 57.0 | 28500 | 0.3897 | 0.727 | 0.9417 | 0.8547 | 0.1613 | 0.7154 | 0.7737 | 0.2958 | 0.7692 | 0.7752 | 0.2848 | 0.7672 | 0.8192 | 0.7117 | 0.7675 | 0.6589 | 0.7093 | 0.8102 | 0.8488 |
| 0.4577 | 58.0 | 29000 | 0.3857 | 0.7302 | 0.9395 | 0.8749 | 0.1832 | 0.717 | 0.7593 | 0.2989 | 0.7699 | 0.7758 | 0.3343 | 0.7662 | 0.8062 | 0.7157 | 0.7687 | 0.6696 | 0.7134 | 0.8054 | 0.8455 |
| 0.4052 | 59.0 | 29500 | 0.3916 | 0.7315 | 0.9383 | 0.8678 | 0.1507 | 0.7255 | 0.7653 | 0.2981 | 0.7703 | 0.7759 | 0.2529 | 0.773 | 0.8158 | 0.7139 | 0.7702 | 0.672 | 0.7113 | 0.8086 | 0.8461 |
| 0.4092 | 60.0 | 30000 | 0.3783 | 0.7396 | 0.9392 | 0.8705 | 0.2153 | 0.7258 | 0.7751 | 0.3039 | 0.7797 | 0.7837 | 0.2929 | 0.7755 | 0.8203 | 0.7355 | 0.7861 | 0.6733 | 0.7206 | 0.8101 | 0.8442 |
| 0.4661 | 61.0 | 30500 | 0.3638 | 0.7522 | 0.9411 | 0.8702 | 0.2431 | 0.7466 | 0.7809 | 0.3095 | 0.7943 | 0.7979 | 0.3024 | 0.7988 | 0.8261 | 0.7502 | 0.7964 | 0.681 | 0.7381 | 0.8254 | 0.8591 |
| 0.4316 | 62.0 | 31000 | 0.3944 | 0.7263 | 0.9359 | 0.8691 | 0.1639 | 0.7124 | 0.7613 | 0.2993 | 0.7656 | 0.7715 | 0.2324 | 0.7629 | 0.795 | 0.7071 | 0.7591 | 0.6617 | 0.7124 | 0.81 | 0.843 |
| 0.3565 | 63.0 | 31500 | 0.3822 | 0.7343 | 0.9423 | 0.8744 | 0.2217 | 0.7235 | 0.7592 | 0.2999 | 0.7749 | 0.7788 | 0.3067 | 0.7702 | 0.8013 | 0.7238 | 0.771 | 0.6675 | 0.7175 | 0.8114 | 0.8479 |
| 0.5392 | 64.0 | 32000 | 0.3760 | 0.7394 | 0.9428 | 0.8705 | 0.2025 | 0.7319 | 0.7789 | 0.3032 | 0.7759 | 0.7801 | 0.3014 | 0.7773 | 0.8161 | 0.7264 | 0.7718 | 0.6876 | 0.7289 | 0.8041 | 0.8397 |
| 0.4893 | 65.0 | 32500 | 0.3702 | 0.7448 | 0.9408 | 0.8701 | 0.1175 | 0.7326 | 0.7906 | 0.3043 | 0.7828 | 0.7869 | 0.1576 | 0.7819 | 0.8271 | 0.7389 | 0.779 | 0.6682 | 0.7237 | 0.8274 | 0.8579 |
| 0.3706 | 66.0 | 33000 | 0.3416 | 0.7602 | 0.9451 | 0.8736 | 0.2166 | 0.7528 | 0.7848 | 0.3104 | 0.7974 | 0.8019 | 0.2838 | 0.799 | 0.8273 | 0.757 | 0.7988 | 0.7022 | 0.7495 | 0.8215 | 0.8573 |
| 0.4715 | 67.0 | 33500 | 0.3407 | 0.7626 | 0.9528 | 0.8804 | 0.2629 | 0.7472 | 0.7868 | 0.3079 | 0.7999 | 0.8062 | 0.3638 | 0.7954 | 0.8336 | 0.7643 | 0.8079 | 0.6973 | 0.7474 | 0.8262 | 0.8633 |
| 0.4381 | 68.0 | 34000 | 0.3454 | 0.7619 | 0.9504 | 0.8741 | 0.1501 | 0.742 | 0.8009 | 0.3129 | 0.7982 | 0.8063 | 0.3519 | 0.7926 | 0.8302 | 0.7726 | 0.8147 | 0.7037 | 0.7557 | 0.8092 | 0.8485 |
| 0.3467 | 69.0 | 34500 | 0.3459 | 0.7621 | 0.9479 | 0.8972 | 0.3178 | 0.7557 | 0.7793 | 0.3082 | 0.8002 | 0.8039 | 0.4286 | 0.7985 | 0.812 | 0.757 | 0.7992 | 0.7042 | 0.7495 | 0.8252 | 0.863 |
| 0.4757 | 70.0 | 35000 | 0.3503 | 0.7575 | 0.9448 | 0.8654 | 0.2299 | 0.7521 | 0.7897 | 0.3085 | 0.7915 | 0.7992 | 0.3033 | 0.7962 | 0.8254 | 0.77 | 0.8147 | 0.6773 | 0.7206 | 0.8251 | 0.8624 |
| 0.381 | 71.0 | 35500 | 0.3437 | 0.7655 | 0.9497 | 0.8843 | 0.2601 | 0.7533 | 0.8075 | 0.3115 | 0.7993 | 0.8046 | 0.3671 | 0.7953 | 0.8413 | 0.7654 | 0.806 | 0.7148 | 0.7567 | 0.8161 | 0.8512 |
| 0.5358 | 72.0 | 36000 | 0.3380 | 0.7681 | 0.9459 | 0.8783 | 0.199 | 0.76 | 0.8157 | 0.3169 | 0.8045 | 0.8093 | 0.3014 | 0.8042 | 0.8497 | 0.7679 | 0.8159 | 0.7109 | 0.7495 | 0.8255 | 0.8624 |
| 0.4114 | 73.0 | 36500 | 0.3485 | 0.766 | 0.9454 | 0.8839 | 0.1839 | 0.7587 | 0.7939 | 0.3145 | 0.7997 | 0.8058 | 0.281 | 0.8033 | 0.8281 | 0.7688 | 0.8087 | 0.7101 | 0.7485 | 0.8189 | 0.8603 |
| 0.4304 | 74.0 | 37000 | 0.3483 | 0.759 | 0.9444 | 0.8602 | 0.2158 | 0.755 | 0.7778 | 0.3094 | 0.7973 | 0.8027 | 0.2843 | 0.8017 | 0.8247 | 0.7666 | 0.8163 | 0.6854 | 0.7278 | 0.8248 | 0.8639 |
| 0.4527 | 75.0 | 37500 | 0.3590 | 0.7587 | 0.9457 | 0.8706 | 0.235 | 0.7493 | 0.7895 | 0.313 | 0.7928 | 0.7994 | 0.2843 | 0.7963 | 0.8242 | 0.7396 | 0.7825 | 0.705 | 0.7485 | 0.8314 | 0.8673 |
| 0.4407 | 76.0 | 38000 | 0.3461 | 0.7546 | 0.9454 | 0.8871 | 0.233 | 0.7467 | 0.7934 | 0.309 | 0.7961 | 0.8006 | 0.2705 | 0.7995 | 0.839 | 0.7501 | 0.7944 | 0.6839 | 0.7381 | 0.8299 | 0.8691 |
| 0.4118 | 77.0 | 38500 | 0.3657 | 0.7495 | 0.9412 | 0.8691 | 0.1603 | 0.7361 | 0.7877 | 0.3071 | 0.784 | 0.7893 | 0.219 | 0.7798 | 0.8278 | 0.7483 | 0.7849 | 0.6836 | 0.7299 | 0.8166 | 0.853 |
| 0.4346 | 78.0 | 39000 | 0.3583 | 0.7559 | 0.9421 | 0.8753 | 0.1848 | 0.7435 | 0.8044 | 0.3082 | 0.7892 | 0.7956 | 0.2514 | 0.788 | 0.8411 | 0.7551 | 0.8 | 0.693 | 0.732 | 0.8195 | 0.8548 |
| 0.4567 | 79.0 | 39500 | 0.3487 | 0.7635 | 0.9452 | 0.8953 | 0.1983 | 0.7514 | 0.8003 | 0.3098 | 0.7963 | 0.8021 | 0.2562 | 0.7951 | 0.8322 | 0.7608 | 0.7964 | 0.7052 | 0.7505 | 0.8244 | 0.8594 |
| 0.4801 | 80.0 | 40000 | 0.3592 | 0.7641 | 0.9435 | 0.8805 | 0.1908 | 0.754 | 0.8126 | 0.3092 | 0.796 | 0.8015 | 0.2457 | 0.7935 | 0.8445 | 0.7547 | 0.7933 | 0.7062 | 0.7474 | 0.8313 | 0.8639 |
| 0.4711 | 81.0 | 40500 | 0.3317 | 0.7745 | 0.949 | 0.8958 | 0.2417 | 0.7666 | 0.8032 | 0.3152 | 0.8103 | 0.8139 | 0.2914 | 0.8106 | 0.8375 | 0.7828 | 0.819 | 0.7098 | 0.7546 | 0.831 | 0.8679 |
| 0.3737 | 82.0 | 41000 | 0.3236 | 0.7828 | 0.9529 | 0.8924 | 0.2418 | 0.7741 | 0.8258 | 0.3196 | 0.8169 | 0.8217 | 0.33 | 0.8172 | 0.8557 | 0.7876 | 0.8214 | 0.7313 | 0.7763 | 0.8293 | 0.8673 |
| 0.4275 | 83.0 | 41500 | 0.3328 | 0.7735 | 0.9466 | 0.8786 | 0.1908 | 0.7641 | 0.8002 | 0.3151 | 0.8072 | 0.8111 | 0.2424 | 0.8067 | 0.8342 | 0.7798 | 0.8131 | 0.7087 | 0.7536 | 0.8319 | 0.8667 |
| 0.3289 | 84.0 | 42000 | 0.3243 | 0.7798 | 0.9474 | 0.8848 | 0.2453 | 0.7646 | 0.8087 | 0.3186 | 0.815 | 0.8177 | 0.2929 | 0.8071 | 0.8481 | 0.7806 | 0.8198 | 0.7316 | 0.7722 | 0.8272 | 0.8612 |
| 0.4157 | 85.0 | 42500 | 0.3515 | 0.7612 | 0.9471 | 0.878 | 0.1902 | 0.7564 | 0.7963 | 0.3115 | 0.7988 | 0.8028 | 0.2538 | 0.8005 | 0.8341 | 0.7499 | 0.7948 | 0.7077 | 0.7526 | 0.8261 | 0.8609 |
| 0.3518 | 86.0 | 43000 | 0.3355 | 0.7836 | 0.9436 | 0.8976 | 0.2453 | 0.7749 | 0.8198 | 0.3175 | 0.8151 | 0.8204 | 0.3043 | 0.8139 | 0.853 | 0.7841 | 0.8198 | 0.7328 | 0.7732 | 0.834 | 0.8682 |
| 0.4637 | 87.0 | 43500 | 0.3392 | 0.7756 | 0.9454 | 0.8975 | 0.2388 | 0.7673 | 0.7974 | 0.3167 | 0.8109 | 0.8146 | 0.2986 | 0.8075 | 0.8365 | 0.7718 | 0.8119 | 0.7215 | 0.766 | 0.8333 | 0.8661 |
| 0.395 | 88.0 | 44000 | 0.3309 | 0.7747 | 0.9461 | 0.8889 | 0.2263 | 0.7687 | 0.8017 | 0.3159 | 0.8104 | 0.8147 | 0.2967 | 0.8089 | 0.8352 | 0.7686 | 0.8107 | 0.7189 | 0.7639 | 0.8368 | 0.8694 |
| 0.3863 | 89.0 | 44500 | 0.3337 | 0.7706 | 0.9494 | 0.8827 | 0.2383 | 0.7584 | 0.8078 | 0.3115 | 0.8046 | 0.8118 | 0.3167 | 0.8034 | 0.8429 | 0.7649 | 0.8083 | 0.7041 | 0.7515 | 0.8428 | 0.8755 |
| 0.3874 | 90.0 | 45000 | 0.3239 | 0.7765 | 0.9527 | 0.8834 | 0.247 | 0.7711 | 0.7965 | 0.3143 | 0.8091 | 0.8156 | 0.331 | 0.8122 | 0.8388 | 0.7754 | 0.8151 | 0.7144 | 0.7577 | 0.8398 | 0.8739 |
| 0.3986 | 91.0 | 45500 | 0.3262 | 0.779 | 0.9527 | 0.8851 | 0.2605 | 0.7739 | 0.7997 | 0.3175 | 0.8161 | 0.8213 | 0.3452 | 0.8176 | 0.8441 | 0.7647 | 0.8075 | 0.7265 | 0.7784 | 0.8459 | 0.8779 |
| 0.3324 | 92.0 | 46000 | 0.3329 | 0.7644 | 0.9524 | 0.899 | 0.2734 | 0.7522 | 0.7995 | 0.3063 | 0.8024 | 0.8072 | 0.3324 | 0.7988 | 0.8344 | 0.7656 | 0.806 | 0.6853 | 0.7402 | 0.8423 | 0.8755 |
| 0.367 | 93.0 | 46500 | 0.3169 | 0.7792 | 0.9485 | 0.8927 | 0.2618 | 0.769 | 0.8151 | 0.3155 | 0.8155 | 0.8213 | 0.339 | 0.8134 | 0.8554 | 0.7839 | 0.8234 | 0.7253 | 0.7753 | 0.8284 | 0.8652 |
| 0.3546 | 94.0 | 47000 | 0.3150 | 0.7805 | 0.9523 | 0.8935 | 0.249 | 0.769 | 0.8173 | 0.3155 | 0.817 | 0.8221 | 0.3329 | 0.8143 | 0.8515 | 0.7827 | 0.821 | 0.7113 | 0.766 | 0.8474 | 0.8794 |
| 0.426 | 95.0 | 47500 | 0.3182 | 0.7855 | 0.9518 | 0.8913 | 0.2582 | 0.7779 | 0.818 | 0.3184 | 0.8223 | 0.827 | 0.3305 | 0.8223 | 0.8516 | 0.7967 | 0.8329 | 0.7214 | 0.7722 | 0.8384 | 0.8758 |
| 0.3735 | 96.0 | 48000 | 0.3135 | 0.7842 | 0.9488 | 0.8899 | 0.2766 | 0.7786 | 0.8092 | 0.3195 | 0.8206 | 0.8236 | 0.3438 | 0.8201 | 0.8451 | 0.7971 | 0.8369 | 0.7112 | 0.7567 | 0.8444 | 0.8773 |
| 0.3777 | 97.0 | 48500 | 0.3381 | 0.765 | 0.9472 | 0.8974 | 0.2939 | 0.7578 | 0.8049 | 0.3098 | 0.8072 | 0.8098 | 0.3467 | 0.8033 | 0.8481 | 0.7608 | 0.7984 | 0.7021 | 0.7639 | 0.8321 | 0.867 |
| 0.4125 | 98.0 | 49000 | 0.3031 | 0.7901 | 0.9468 | 0.9006 | 0.2662 | 0.7802 | 0.8316 | 0.3192 | 0.8265 | 0.83 | 0.3148 | 0.8242 | 0.8677 | 0.805 | 0.8409 | 0.7173 | 0.767 | 0.8478 | 0.8821 |
| 0.495 | 99.0 | 49500 | 0.3173 | 0.7846 | 0.9444 | 0.8963 | 0.2482 | 0.7789 | 0.8213 | 0.3191 | 0.8222 | 0.8248 | 0.2919 | 0.8224 | 0.8605 | 0.7895 | 0.8298 | 0.7221 | 0.767 | 0.8421 | 0.8776 |
| 0.408 | 100.0 | 50000 | 0.3049 | 0.793 | 0.9449 | 0.8939 | 0.2319 | 0.7858 | 0.8346 | 0.3233 | 0.8251 | 0.8294 | 0.2786 | 0.8267 | 0.8651 | 0.8023 | 0.8389 | 0.7294 | 0.768 | 0.8474 | 0.8812 |
| 0.3715 | 101.0 | 50500 | 0.3064 | 0.7841 | 0.9475 | 0.8894 | 0.2867 | 0.7748 | 0.824 | 0.3194 | 0.8229 | 0.8265 | 0.3386 | 0.8209 | 0.8633 | 0.7798 | 0.8179 | 0.7206 | 0.7763 | 0.8517 | 0.8855 |
| 0.3299 | 102.0 | 51000 | 0.3219 | 0.7744 | 0.9482 | 0.8799 | 0.2572 | 0.7675 | 0.8112 | 0.3168 | 0.8088 | 0.8131 | 0.3105 | 0.8077 | 0.8501 | 0.7751 | 0.8087 | 0.7117 | 0.7588 | 0.8363 | 0.8718 |
| 0.3759 | 103.0 | 51500 | 0.3075 | 0.7822 | 0.9452 | 0.8922 | 0.2596 | 0.774 | 0.811 | 0.3186 | 0.8206 | 0.8237 | 0.3162 | 0.817 | 0.8559 | 0.7831 | 0.8198 | 0.7109 | 0.766 | 0.8525 | 0.8852 |
| 0.4056 | 104.0 | 52000 | 0.3063 | 0.7882 | 0.9475 | 0.8932 | 0.2782 | 0.7816 | 0.8218 | 0.3221 | 0.8253 | 0.828 | 0.3357 | 0.8249 | 0.8557 | 0.7916 | 0.8321 | 0.7206 | 0.7691 | 0.8524 | 0.8827 |
| 0.3461 | 105.0 | 52500 | 0.3086 | 0.7907 | 0.956 | 0.8894 | 0.2624 | 0.784 | 0.8231 | 0.3193 | 0.8261 | 0.8289 | 0.3305 | 0.8259 | 0.8547 | 0.7976 | 0.8317 | 0.7285 | 0.7784 | 0.8459 | 0.8767 |
| 0.4638 | 106.0 | 53000 | 0.2992 | 0.7884 | 0.9538 | 0.8934 | 0.25 | 0.7847 | 0.8221 | 0.3188 | 0.8244 | 0.8288 | 0.3038 | 0.8278 | 0.862 | 0.7933 | 0.8298 | 0.7208 | 0.7742 | 0.8511 | 0.8824 |
| 0.4445 | 107.0 | 53500 | 0.3113 | 0.7858 | 0.946 | 0.8944 | 0.25 | 0.7799 | 0.8221 | 0.3184 | 0.8206 | 0.825 | 0.2976 | 0.8219 | 0.8561 | 0.7928 | 0.827 | 0.7193 | 0.7711 | 0.8453 | 0.877 |
| 0.3413 | 108.0 | 54000 | 0.3113 | 0.7861 | 0.9511 | 0.9036 | 0.2611 | 0.7765 | 0.8245 | 0.3201 | 0.8222 | 0.8259 | 0.3086 | 0.8201 | 0.8613 | 0.7873 | 0.8222 | 0.7155 | 0.7701 | 0.8555 | 0.8855 |
| 0.4364 | 109.0 | 54500 | 0.3054 | 0.7899 | 0.9515 | 0.8953 | 0.2493 | 0.7833 | 0.8326 | 0.3217 | 0.8257 | 0.8298 | 0.291 | 0.8266 | 0.8668 | 0.7918 | 0.8254 | 0.7229 | 0.7784 | 0.8549 | 0.8858 |
| 0.415 | 110.0 | 55000 | 0.3016 | 0.7937 | 0.9531 | 0.9011 | 0.273 | 0.7919 | 0.8191 | 0.3249 | 0.8297 | 0.8334 | 0.3433 | 0.8354 | 0.8529 | 0.794 | 0.8313 | 0.7368 | 0.7866 | 0.8503 | 0.8821 |
| 0.3368 | 111.0 | 55500 | 0.3073 | 0.7933 | 0.9497 | 0.8963 | 0.2642 | 0.7878 | 0.8238 | 0.3212 | 0.8272 | 0.8312 | 0.3219 | 0.8289 | 0.8591 | 0.7922 | 0.8306 | 0.7356 | 0.7814 | 0.852 | 0.8815 |
| 0.3752 | 112.0 | 56000 | 0.3107 | 0.7874 | 0.9525 | 0.9007 | 0.2815 | 0.7829 | 0.815 | 0.32 | 0.8256 | 0.8296 | 0.3471 | 0.8276 | 0.8558 | 0.7791 | 0.8183 | 0.7325 | 0.7876 | 0.8506 | 0.883 |
| 0.3418 | 113.0 | 56500 | 0.2966 | 0.8001 | 0.9533 | 0.9024 | 0.2848 | 0.7956 | 0.834 | 0.3241 | 0.8371 | 0.8411 | 0.3524 | 0.8386 | 0.8686 | 0.8057 | 0.8452 | 0.7407 | 0.7928 | 0.8538 | 0.8852 |
| 0.3301 | 114.0 | 57000 | 0.2992 | 0.798 | 0.9521 | 0.8953 | 0.287 | 0.7921 | 0.8325 | 0.3237 | 0.8342 | 0.8374 | 0.3343 | 0.8361 | 0.8679 | 0.8027 | 0.8365 | 0.7357 | 0.7918 | 0.8556 | 0.8839 |
| 0.365 | 115.0 | 57500 | 0.2981 | 0.7972 | 0.9438 | 0.8946 | 0.252 | 0.7867 | 0.8336 | 0.3243 | 0.8297 | 0.8339 | 0.3243 | 0.8292 | 0.8696 | 0.8068 | 0.8401 | 0.7242 | 0.7732 | 0.8606 | 0.8885 |
| 0.3717 | 116.0 | 58000 | 0.3036 | 0.7962 | 0.9527 | 0.9068 | 0.2885 | 0.7877 | 0.8309 | 0.3236 | 0.8314 | 0.8358 | 0.3581 | 0.8313 | 0.8653 | 0.795 | 0.8317 | 0.739 | 0.7918 | 0.8547 | 0.8839 |
| 0.3575 | 117.0 | 58500 | 0.3002 | 0.798 | 0.9503 | 0.8973 | 0.2803 | 0.7928 | 0.8344 | 0.3234 | 0.8323 | 0.8361 | 0.3424 | 0.8349 | 0.863 | 0.8008 | 0.8369 | 0.7381 | 0.7856 | 0.855 | 0.8858 |
| 0.3356 | 118.0 | 59000 | 0.3043 | 0.7937 | 0.9495 | 0.8923 | 0.2801 | 0.7898 | 0.8359 | 0.3223 | 0.8288 | 0.8329 | 0.339 | 0.8316 | 0.8653 | 0.7956 | 0.8298 | 0.7307 | 0.7825 | 0.8548 | 0.8864 |
| 0.435 | 119.0 | 59500 | 0.2998 | 0.798 | 0.9484 | 0.9026 | 0.2737 | 0.7926 | 0.8313 | 0.3256 | 0.8346 | 0.8371 | 0.3248 | 0.8349 | 0.8676 | 0.8017 | 0.8345 | 0.7327 | 0.7887 | 0.8595 | 0.8882 |
| 0.3344 | 120.0 | 60000 | 0.2980 | 0.7978 | 0.9536 | 0.8983 | 0.2747 | 0.7947 | 0.8349 | 0.3247 | 0.8336 | 0.8371 | 0.3476 | 0.8349 | 0.8655 | 0.8014 | 0.8385 | 0.7418 | 0.7907 | 0.8501 | 0.8821 |
| 0.3991 | 121.0 | 60500 | 0.2983 | 0.8006 | 0.9519 | 0.9032 | 0.275 | 0.7938 | 0.8349 | 0.3255 | 0.8353 | 0.8387 | 0.3562 | 0.8332 | 0.8699 | 0.8018 | 0.8393 | 0.7424 | 0.7897 | 0.8575 | 0.8873 |
| 0.3154 | 122.0 | 61000 | 0.2995 | 0.798 | 0.9524 | 0.8895 | 0.2622 | 0.7932 | 0.8361 | 0.3242 | 0.8324 | 0.8364 | 0.3314 | 0.8335 | 0.8659 | 0.7963 | 0.8317 | 0.737 | 0.7876 | 0.8607 | 0.8897 |
| 0.3353 | 123.0 | 61500 | 0.3011 | 0.8024 | 0.9506 | 0.9045 | 0.2739 | 0.7931 | 0.851 | 0.3246 | 0.8358 | 0.8394 | 0.3552 | 0.8332 | 0.8783 | 0.8083 | 0.846 | 0.7422 | 0.7866 | 0.8569 | 0.8855 |
| 0.3539 | 124.0 | 62000 | 0.2942 | 0.8027 | 0.9534 | 0.9021 | 0.2772 | 0.7948 | 0.8438 | 0.3234 | 0.8353 | 0.8397 | 0.381 | 0.8346 | 0.8723 | 0.8092 | 0.8448 | 0.7406 | 0.7866 | 0.8582 | 0.8876 |
| 0.5115 | 125.0 | 62500 | 0.2893 | 0.8056 | 0.9559 | 0.8998 | 0.2623 | 0.8002 | 0.8465 | 0.3244 | 0.838 | 0.8432 | 0.35 | 0.8415 | 0.8722 | 0.8115 | 0.848 | 0.7385 | 0.7866 | 0.8668 | 0.8948 |
| 0.3867 | 126.0 | 63000 | 0.2880 | 0.8086 | 0.9571 | 0.9046 | 0.2768 | 0.8039 | 0.8449 | 0.3253 | 0.8398 | 0.846 | 0.3738 | 0.8435 | 0.8731 | 0.8106 | 0.848 | 0.7422 | 0.7907 | 0.873 | 0.8994 |
| 0.6307 | 127.0 | 63500 | 0.2950 | 0.8049 | 0.953 | 0.9051 | 0.253 | 0.7985 | 0.8465 | 0.3256 | 0.8353 | 0.8413 | 0.3419 | 0.8371 | 0.8745 | 0.8057 | 0.8429 | 0.739 | 0.7856 | 0.87 | 0.8955 |
| 0.4204 | 128.0 | 64000 | 0.2944 | 0.8033 | 0.9572 | 0.9046 | 0.2675 | 0.7997 | 0.8428 | 0.3255 | 0.8354 | 0.8413 | 0.3562 | 0.8381 | 0.8737 | 0.8019 | 0.8401 | 0.7401 | 0.7897 | 0.8679 | 0.8942 |
| 0.3894 | 129.0 | 64500 | 0.2931 | 0.8071 | 0.9556 | 0.9001 | 0.2736 | 0.8029 | 0.843 | 0.3249 | 0.8386 | 0.8437 | 0.3752 | 0.8417 | 0.8699 | 0.8101 | 0.8444 | 0.7427 | 0.7918 | 0.8687 | 0.8948 |
| 0.4176 | 130.0 | 65000 | 0.2906 | 0.805 | 0.9528 | 0.9028 | 0.2673 | 0.797 | 0.8459 | 0.3257 | 0.8375 | 0.842 | 0.3624 | 0.8374 | 0.8748 | 0.8088 | 0.8437 | 0.7391 | 0.7887 | 0.8671 | 0.8936 |
| 0.3171 | 131.0 | 65500 | 0.2968 | 0.8018 | 0.9555 | 0.9035 | 0.27 | 0.7973 | 0.8375 | 0.325 | 0.834 | 0.839 | 0.359 | 0.8367 | 0.8671 | 0.8011 | 0.8377 | 0.7368 | 0.7866 | 0.8675 | 0.8927 |
| 0.3034 | 132.0 | 66000 | 0.2947 | 0.8011 | 0.9553 | 0.9006 | 0.2691 | 0.7976 | 0.8323 | 0.3245 | 0.8341 | 0.8395 | 0.3557 | 0.838 | 0.8627 | 0.8015 | 0.8393 | 0.7335 | 0.7845 | 0.8682 | 0.8945 |
| 0.4137 | 133.0 | 66500 | 0.2885 | 0.8068 | 0.955 | 0.8958 | 0.2782 | 0.8007 | 0.8471 | 0.3259 | 0.8382 | 0.8436 | 0.37 | 0.8403 | 0.8737 | 0.8094 | 0.8468 | 0.7415 | 0.7897 | 0.8695 | 0.8942 |
| 0.3806 | 134.0 | 67000 | 0.2915 | 0.8067 | 0.9525 | 0.9035 | 0.2753 | 0.8007 | 0.8507 | 0.3262 | 0.839 | 0.8438 | 0.3652 | 0.8406 | 0.8767 | 0.8066 | 0.8444 | 0.7456 | 0.7928 | 0.8678 | 0.8942 |
| 0.3932 | 135.0 | 67500 | 0.2915 | 0.8059 | 0.9549 | 0.899 | 0.2735 | 0.7986 | 0.8389 | 0.3248 | 0.8375 | 0.843 | 0.3671 | 0.8389 | 0.869 | 0.8085 | 0.8448 | 0.7407 | 0.7907 | 0.8684 | 0.8933 |
| 0.3981 | 136.0 | 68000 | 0.2917 | 0.8057 | 0.9544 | 0.8957 | 0.2804 | 0.8009 | 0.8375 | 0.3259 | 0.8384 | 0.8432 | 0.3652 | 0.8413 | 0.8655 | 0.8077 | 0.8444 | 0.74 | 0.7907 | 0.8694 | 0.8945 |
| 0.3151 | 137.0 | 68500 | 0.2932 | 0.8059 | 0.9542 | 0.8999 | 0.2793 | 0.801 | 0.8393 | 0.3248 | 0.8387 | 0.8436 | 0.3686 | 0.8411 | 0.8682 | 0.8084 | 0.8444 | 0.739 | 0.7907 | 0.8702 | 0.8958 |
| 0.3825 | 138.0 | 69000 | 0.2898 | 0.8084 | 0.9548 | 0.9033 | 0.2774 | 0.8013 | 0.8493 | 0.3262 | 0.8396 | 0.8449 | 0.3719 | 0.8408 | 0.8739 | 0.8097 | 0.8452 | 0.7471 | 0.7948 | 0.8683 | 0.8945 |
| 0.2824 | 139.0 | 69500 | 0.2916 | 0.805 | 0.9548 | 0.9032 | 0.2788 | 0.7989 | 0.8376 | 0.3253 | 0.8376 | 0.8429 | 0.3752 | 0.839 | 0.8677 | 0.8052 | 0.8417 | 0.7418 | 0.7928 | 0.8679 | 0.8942 |
| 0.4594 | 140.0 | 70000 | 0.2908 | 0.8069 | 0.9546 | 0.9033 | 0.2802 | 0.8013 | 0.8465 | 0.3265 | 0.8397 | 0.8449 | 0.3848 | 0.8421 | 0.8722 | 0.8069 | 0.844 | 0.746 | 0.7948 | 0.8679 | 0.8958 |
| 0.3131 | 141.0 | 70500 | 0.2934 | 0.8052 | 0.9546 | 0.9001 | 0.279 | 0.7993 | 0.8455 | 0.3257 | 0.8383 | 0.8434 | 0.3862 | 0.8399 | 0.8724 | 0.8044 | 0.8425 | 0.7422 | 0.7918 | 0.8689 | 0.8961 |
| 0.384 | 142.0 | 71000 | 0.2905 | 0.8063 | 0.955 | 0.9033 | 0.279 | 0.7996 | 0.8499 | 0.3257 | 0.8387 | 0.8438 | 0.3767 | 0.84 | 0.8749 | 0.8057 | 0.8429 | 0.7439 | 0.7928 | 0.8695 | 0.8958 |
| 0.3811 | 143.0 | 71500 | 0.2909 | 0.805 | 0.9548 | 0.9002 | 0.279 | 0.7995 | 0.8441 | 0.3252 | 0.8377 | 0.843 | 0.3767 | 0.8397 | 0.8714 | 0.8043 | 0.8417 | 0.7413 | 0.7918 | 0.8693 | 0.8955 |
| 0.2437 | 144.0 | 72000 | 0.2898 | 0.8051 | 0.955 | 0.9003 | 0.279 | 0.8007 | 0.8417 | 0.3255 | 0.838 | 0.8433 | 0.3767 | 0.8411 | 0.869 | 0.8043 | 0.8421 | 0.7416 | 0.7918 | 0.8694 | 0.8961 |
| 0.3751 | 145.0 | 72500 | 0.2909 | 0.8055 | 0.955 | 0.9034 | 0.279 | 0.801 | 0.8414 | 0.3255 | 0.838 | 0.8433 | 0.3767 | 0.8416 | 0.8688 | 0.805 | 0.8421 | 0.7415 | 0.7918 | 0.8699 | 0.8961 |
| 0.3538 | 146.0 | 73000 | 0.2905 | 0.8052 | 0.955 | 0.9034 | 0.2791 | 0.8008 | 0.8412 | 0.3257 | 0.8378 | 0.8432 | 0.3767 | 0.841 | 0.8686 | 0.8056 | 0.8429 | 0.7416 | 0.7918 | 0.8682 | 0.8948 |
| 0.3244 | 147.0 | 73500 | 0.2904 | 0.8048 | 0.955 | 0.9034 | 0.279 | 0.8003 | 0.8412 | 0.3255 | 0.8375 | 0.8428 | 0.3733 | 0.8405 | 0.8686 | 0.8056 | 0.8421 | 0.7417 | 0.7918 | 0.8672 | 0.8945 |
| 0.3588 | 148.0 | 74000 | 0.2904 | 0.8051 | 0.9551 | 0.9034 | 0.279 | 0.8008 | 0.8412 | 0.3255 | 0.8376 | 0.8429 | 0.3733 | 0.8409 | 0.8686 | 0.8056 | 0.8421 | 0.7416 | 0.7918 | 0.8682 | 0.8948 |
| 0.347 | 149.0 | 74500 | 0.2904 | 0.8051 | 0.9551 | 0.9034 | 0.279 | 0.8008 | 0.8412 | 0.3255 | 0.8376 | 0.8429 | 0.3733 | 0.8409 | 0.8686 | 0.8056 | 0.8421 | 0.7416 | 0.7918 | 0.8682 | 0.8948 |
| 0.3688 | 150.0 | 75000 | 0.2904 | 0.8051 | 0.9551 | 0.9034 | 0.279 | 0.8008 | 0.8412 | 0.3255 | 0.8376 | 0.8429 | 0.3733 | 0.8409 | 0.8686 | 0.8056 | 0.8421 | 0.7416 | 0.7918 | 0.8682 | 0.8948 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.3
| [
"chicken",
"duck",
"plant"
] |
joe611/chickens-composite-201616161616-150-epochs-wo-transform-metrics-test-shfld |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# chickens-composite-201616161616-150-epochs-wo-transform-metrics-test-shfld
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3059
- Map: 0.8044
- Map 50: 0.9405
- Map 75: 0.9024
- Map Small: 0.2979
- Map Medium: 0.8141
- Map Large: 0.7843
- Mar 1: 0.3221
- Mar 10: 0.8382
- Mar 100: 0.8419
- Mar Small: 0.3829
- Mar Medium: 0.8546
- Mar Large: 0.8145
- Map Chicken: 0.7936
- Mar 100 Chicken: 0.844
- Map Duck: 0.7475
- Mar 100 Duck: 0.7804
- Map Plant: 0.8722
- Mar 100 Plant: 0.9012
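The evaluation numbers above are COCO-style detection metrics; as a quick orientation for using the checkpoint, a minimal inference sketch follows. The repository id is taken from this card's title, while the image path and the 0.5 score threshold are illustrative assumptions, not settings recorded in the card.
```python
# Minimal inference sketch (assumptions: the checkpoint is published on the Hub
# under the repository id below; the image path and score threshold are illustrative).
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "joe611/chickens-composite-201616161616-150-epochs-wo-transform-metrics-test-shfld"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) detections at a 0.5 threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```
With the label set listed at the end of this card, `id2label` resolves the predicted indices to `chicken`, `duck`, and `plant`.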
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 150
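Read as a Trainer configuration, the list above corresponds roughly to the sketch below; the output directory is a placeholder, and the data pipeline, model, and collator are omitted, so this is an illustration of the settings rather than the exact training script.
```python
# Rough TrainingArguments sketch mirroring the hyperparameters listed above.
# output_dir is a placeholder; adamw_torch's default betas/epsilon match the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-chickens",          # hypothetical output directory
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                 # betas=(0.9, 0.999), epsilon=1e-8 are the defaults
    lr_scheduler_type="cosine",
    num_train_epochs=150,
)
```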
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.4551 | 1.0 | 500 | 1.3226 | 0.1944 | 0.2674 | 0.2295 | 0.0337 | 0.1092 | 0.2465 | 0.0876 | 0.3262 | 0.4233 | 0.13 | 0.4015 | 0.4322 | 0.0578 | 0.5155 | 0.0 | 0.0 | 0.5256 | 0.7545 |
| 1.1004 | 2.0 | 1000 | 1.0223 | 0.2841 | 0.4037 | 0.341 | 0.0583 | 0.2422 | 0.3237 | 0.1149 | 0.4071 | 0.464 | 0.1281 | 0.4375 | 0.4936 | 0.1791 | 0.6532 | 0.0 | 0.0 | 0.6732 | 0.7388 |
| 0.9029 | 3.0 | 1500 | 0.8779 | 0.3548 | 0.5047 | 0.4101 | 0.026 | 0.3151 | 0.3991 | 0.1258 | 0.4634 | 0.4674 | 0.0933 | 0.4451 | 0.4783 | 0.3786 | 0.675 | 0.0 | 0.0 | 0.6859 | 0.7273 |
| 0.8533 | 4.0 | 2000 | 0.7691 | 0.3823 | 0.5563 | 0.4418 | 0.0372 | 0.3522 | 0.4082 | 0.1294 | 0.4671 | 0.4695 | 0.0957 | 0.4454 | 0.4898 | 0.4386 | 0.6528 | 0.0 | 0.0 | 0.7083 | 0.7558 |
| 0.7318 | 5.0 | 2500 | 0.6937 | 0.4096 | 0.5624 | 0.4792 | 0.0495 | 0.3833 | 0.4163 | 0.139 | 0.4914 | 0.4955 | 0.109 | 0.4748 | 0.5012 | 0.4942 | 0.7083 | 0.0 | 0.0 | 0.7347 | 0.7782 |
| 0.756 | 6.0 | 3000 | 0.6431 | 0.4239 | 0.5803 | 0.4923 | 0.0634 | 0.3994 | 0.4443 | 0.1419 | 0.4941 | 0.4974 | 0.131 | 0.4719 | 0.5195 | 0.5296 | 0.7095 | 0.0 | 0.0 | 0.7422 | 0.7827 |
| 0.6901 | 7.0 | 3500 | 0.6218 | 0.44 | 0.6019 | 0.5287 | 0.0735 | 0.4207 | 0.4376 | 0.142 | 0.499 | 0.5016 | 0.1362 | 0.4833 | 0.5041 | 0.582 | 0.7246 | 0.0 | 0.0 | 0.738 | 0.7803 |
| 0.666 | 8.0 | 4000 | 0.5863 | 0.4611 | 0.6199 | 0.5421 | 0.0933 | 0.438 | 0.4818 | 0.1472 | 0.5093 | 0.5128 | 0.1738 | 0.4923 | 0.528 | 0.6304 | 0.7452 | 0.0 | 0.0 | 0.7528 | 0.793 |
| 0.5968 | 9.0 | 4500 | 0.5396 | 0.4825 | 0.6337 | 0.5724 | 0.1276 | 0.4593 | 0.4868 | 0.1524 | 0.5194 | 0.5243 | 0.2205 | 0.5027 | 0.5385 | 0.6663 | 0.7563 | 0.0 | 0.0 | 0.7811 | 0.8167 |
| 0.593 | 10.0 | 5000 | 0.5442 | 0.4858 | 0.6479 | 0.5773 | 0.1346 | 0.4638 | 0.4924 | 0.1572 | 0.5185 | 0.5213 | 0.2067 | 0.5018 | 0.5245 | 0.6555 | 0.7325 | 0.0224 | 0.0186 | 0.7795 | 0.8127 |
| 0.4866 | 11.0 | 5500 | 0.5145 | 0.5446 | 0.7266 | 0.6558 | 0.1763 | 0.5315 | 0.5297 | 0.1912 | 0.5863 | 0.5881 | 0.2514 | 0.5767 | 0.5633 | 0.6654 | 0.7381 | 0.1821 | 0.2041 | 0.7864 | 0.8221 |
| 0.5661 | 12.0 | 6000 | 0.5027 | 0.6187 | 0.855 | 0.7639 | 0.1668 | 0.6262 | 0.5823 | 0.2407 | 0.6618 | 0.665 | 0.2424 | 0.6708 | 0.6198 | 0.6598 | 0.7234 | 0.423 | 0.4598 | 0.7732 | 0.8118 |
| 0.4722 | 13.0 | 6500 | 0.4694 | 0.6815 | 0.9159 | 0.8152 | 0.1688 | 0.6799 | 0.6589 | 0.2779 | 0.7283 | 0.7321 | 0.2929 | 0.7339 | 0.707 | 0.6835 | 0.7456 | 0.581 | 0.634 | 0.78 | 0.8167 |
| 0.5192 | 14.0 | 7000 | 0.4483 | 0.6856 | 0.9323 | 0.7876 | 0.2121 | 0.6727 | 0.7362 | 0.2804 | 0.7363 | 0.7404 | 0.3452 | 0.727 | 0.7832 | 0.6626 | 0.7321 | 0.597 | 0.6546 | 0.7972 | 0.8345 |
| 0.4858 | 15.0 | 7500 | 0.4415 | 0.6978 | 0.9354 | 0.8359 | 0.1552 | 0.6764 | 0.7535 | 0.2856 | 0.7474 | 0.7515 | 0.279 | 0.7398 | 0.7872 | 0.6837 | 0.7488 | 0.6176 | 0.6773 | 0.7921 | 0.8285 |
| 0.4536 | 16.0 | 8000 | 0.4759 | 0.6603 | 0.9171 | 0.7832 | 0.1173 | 0.6491 | 0.6453 | 0.2667 | 0.7049 | 0.7069 | 0.1962 | 0.7042 | 0.6825 | 0.6634 | 0.7234 | 0.544 | 0.5866 | 0.7734 | 0.8106 |
| 0.4177 | 17.0 | 8500 | 0.4158 | 0.7044 | 0.9278 | 0.8406 | 0.1923 | 0.6974 | 0.7167 | 0.2867 | 0.7507 | 0.7529 | 0.3029 | 0.751 | 0.7592 | 0.7209 | 0.7782 | 0.596 | 0.6474 | 0.7963 | 0.833 |
| 0.4261 | 18.0 | 9000 | 0.3972 | 0.716 | 0.9389 | 0.8733 | 0.2513 | 0.7088 | 0.7355 | 0.2856 | 0.7589 | 0.7616 | 0.339 | 0.7577 | 0.7763 | 0.7259 | 0.7782 | 0.6258 | 0.6742 | 0.7965 | 0.8324 |
| 0.4455 | 19.0 | 9500 | 0.4175 | 0.7075 | 0.9156 | 0.8448 | 0.1643 | 0.7041 | 0.7268 | 0.2843 | 0.7467 | 0.7517 | 0.2829 | 0.7484 | 0.7631 | 0.7165 | 0.773 | 0.6035 | 0.6423 | 0.8024 | 0.8397 |
| 0.4156 | 20.0 | 10000 | 0.4091 | 0.7115 | 0.9162 | 0.8518 | 0.2067 | 0.7045 | 0.7171 | 0.2833 | 0.7488 | 0.7532 | 0.3076 | 0.7459 | 0.761 | 0.7287 | 0.781 | 0.6108 | 0.6485 | 0.7951 | 0.8303 |
| 0.4166 | 21.0 | 10500 | 0.3732 | 0.7298 | 0.9467 | 0.8933 | 0.2131 | 0.7254 | 0.7506 | 0.2888 | 0.7713 | 0.7759 | 0.3005 | 0.7731 | 0.7839 | 0.739 | 0.7909 | 0.643 | 0.6979 | 0.8074 | 0.8388 |
| 0.4243 | 22.0 | 11000 | 0.3636 | 0.7327 | 0.9489 | 0.884 | 0.2297 | 0.7268 | 0.766 | 0.2938 | 0.7751 | 0.7777 | 0.299 | 0.7759 | 0.7997 | 0.7405 | 0.7813 | 0.6469 | 0.7072 | 0.8108 | 0.8445 |
| 0.4215 | 23.0 | 11500 | 0.3797 | 0.7256 | 0.9357 | 0.8647 | 0.209 | 0.7063 | 0.752 | 0.2923 | 0.7676 | 0.7697 | 0.2938 | 0.7585 | 0.7855 | 0.7336 | 0.7853 | 0.6427 | 0.6887 | 0.8006 | 0.8352 |
| 0.3449 | 24.0 | 12000 | 0.3634 | 0.7489 | 0.9419 | 0.9009 | 0.1651 | 0.7472 | 0.7848 | 0.3042 | 0.7905 | 0.7947 | 0.25 | 0.794 | 0.8259 | 0.7477 | 0.7984 | 0.6861 | 0.7381 | 0.8129 | 0.8476 |
| 0.3908 | 25.0 | 12500 | 0.3959 | 0.7286 | 0.945 | 0.88 | 0.2226 | 0.7222 | 0.7564 | 0.3009 | 0.7671 | 0.7702 | 0.2871 | 0.7683 | 0.7928 | 0.715 | 0.7611 | 0.6812 | 0.7278 | 0.7897 | 0.8215 |
| 0.354 | 26.0 | 13000 | 0.3740 | 0.7367 | 0.9424 | 0.8928 | 0.1496 | 0.7402 | 0.7566 | 0.2936 | 0.7748 | 0.7812 | 0.2343 | 0.7844 | 0.8003 | 0.7347 | 0.7853 | 0.6554 | 0.7052 | 0.8201 | 0.853 |
| 0.367 | 27.0 | 13500 | 0.3563 | 0.7463 | 0.9503 | 0.8908 | 0.2277 | 0.7472 | 0.7683 | 0.3011 | 0.7905 | 0.7934 | 0.3448 | 0.7939 | 0.8067 | 0.7435 | 0.7885 | 0.6722 | 0.7371 | 0.8234 | 0.8545 |
| 0.356 | 28.0 | 14000 | 0.3583 | 0.7381 | 0.9303 | 0.8776 | 0.1918 | 0.734 | 0.7519 | 0.3026 | 0.7798 | 0.7827 | 0.2867 | 0.7835 | 0.7857 | 0.7438 | 0.7929 | 0.6467 | 0.6979 | 0.8237 | 0.8573 |
| 0.3703 | 29.0 | 14500 | 0.3504 | 0.7417 | 0.9441 | 0.8654 | 0.2577 | 0.732 | 0.762 | 0.2996 | 0.7817 | 0.7854 | 0.3557 | 0.784 | 0.7927 | 0.7432 | 0.7913 | 0.6648 | 0.7103 | 0.817 | 0.8545 |
| 0.3581 | 30.0 | 15000 | 0.3627 | 0.7342 | 0.9338 | 0.8642 | 0.2271 | 0.7376 | 0.7361 | 0.2953 | 0.7752 | 0.7777 | 0.3395 | 0.7829 | 0.7677 | 0.749 | 0.7956 | 0.6315 | 0.6845 | 0.8221 | 0.853 |
| 0.3178 | 31.0 | 15500 | 0.3675 | 0.7332 | 0.958 | 0.8706 | 0.2348 | 0.7259 | 0.7627 | 0.2968 | 0.7752 | 0.7834 | 0.3681 | 0.7785 | 0.7978 | 0.7192 | 0.7694 | 0.6672 | 0.7309 | 0.8133 | 0.8497 |
| 0.3386 | 32.0 | 16000 | 0.3378 | 0.7529 | 0.9364 | 0.8612 | 0.2411 | 0.7477 | 0.7756 | 0.3052 | 0.7943 | 0.7976 | 0.3443 | 0.796 | 0.8119 | 0.7652 | 0.8143 | 0.6554 | 0.7082 | 0.838 | 0.8703 |
| 0.3606 | 33.0 | 16500 | 0.3678 | 0.7377 | 0.945 | 0.8757 | 0.1816 | 0.7305 | 0.7766 | 0.3017 | 0.7828 | 0.7874 | 0.3043 | 0.7874 | 0.8109 | 0.7223 | 0.7742 | 0.6677 | 0.7289 | 0.823 | 0.8591 |
| 0.3542 | 34.0 | 17000 | 0.3237 | 0.7678 | 0.9577 | 0.8851 | 0.2398 | 0.7603 | 0.823 | 0.3116 | 0.8118 | 0.8154 | 0.331 | 0.8133 | 0.8538 | 0.7713 | 0.8163 | 0.694 | 0.7577 | 0.8382 | 0.8721 |
| 0.3498 | 35.0 | 17500 | 0.3462 | 0.7607 | 0.9564 | 0.8941 | 0.225 | 0.7543 | 0.8052 | 0.3069 | 0.8017 | 0.8056 | 0.3152 | 0.8047 | 0.8351 | 0.7462 | 0.7948 | 0.6965 | 0.7505 | 0.8393 | 0.8715 |
| 0.3432 | 36.0 | 18000 | 0.3481 | 0.7446 | 0.9511 | 0.8872 | 0.2363 | 0.736 | 0.7854 | 0.3045 | 0.7874 | 0.7935 | 0.3386 | 0.7907 | 0.8207 | 0.7246 | 0.7702 | 0.681 | 0.7485 | 0.8281 | 0.8618 |
| 0.3356 | 37.0 | 18500 | 0.3494 | 0.7521 | 0.9476 | 0.8878 | 0.2416 | 0.7427 | 0.7653 | 0.303 | 0.7918 | 0.7968 | 0.3252 | 0.7957 | 0.7955 | 0.7637 | 0.8175 | 0.6657 | 0.7134 | 0.8267 | 0.8594 |
| 0.3211 | 38.0 | 19000 | 0.3288 | 0.7571 | 0.9543 | 0.9015 | 0.2839 | 0.7483 | 0.781 | 0.3027 | 0.7995 | 0.8041 | 0.4048 | 0.8004 | 0.8109 | 0.7496 | 0.8 | 0.6881 | 0.7423 | 0.8337 | 0.87 |
| 0.3051 | 39.0 | 19500 | 0.3266 | 0.7633 | 0.9573 | 0.8897 | 0.2715 | 0.7586 | 0.7705 | 0.301 | 0.8041 | 0.8081 | 0.3438 | 0.8078 | 0.8026 | 0.7629 | 0.8091 | 0.6927 | 0.7454 | 0.8342 | 0.8697 |
| 0.3382 | 40.0 | 20000 | 0.3310 | 0.7733 | 0.956 | 0.9068 | 0.2091 | 0.7786 | 0.8033 | 0.3103 | 0.8118 | 0.8145 | 0.291 | 0.82 | 0.836 | 0.7502 | 0.8012 | 0.7314 | 0.7722 | 0.8382 | 0.87 |
| 0.3142 | 41.0 | 20500 | 0.3367 | 0.7619 | 0.9449 | 0.8784 | 0.2308 | 0.7659 | 0.7691 | 0.3025 | 0.8005 | 0.8044 | 0.3357 | 0.8087 | 0.8008 | 0.7696 | 0.8139 | 0.6846 | 0.733 | 0.8316 | 0.8664 |
| 0.3508 | 42.0 | 21000 | 0.3400 | 0.7435 | 0.9537 | 0.8756 | 0.2596 | 0.746 | 0.758 | 0.3028 | 0.7895 | 0.7959 | 0.3938 | 0.7955 | 0.7991 | 0.7244 | 0.7857 | 0.6718 | 0.7299 | 0.8343 | 0.8721 |
| 0.3418 | 43.0 | 21500 | 0.3165 | 0.7693 | 0.9654 | 0.8953 | 0.2851 | 0.7617 | 0.8064 | 0.3075 | 0.8188 | 0.8224 | 0.4124 | 0.8151 | 0.8479 | 0.7574 | 0.8226 | 0.7105 | 0.767 | 0.8401 | 0.8776 |
| 0.3113 | 44.0 | 22000 | 0.3370 | 0.7503 | 0.9598 | 0.8954 | 0.2546 | 0.7475 | 0.7714 | 0.3057 | 0.7979 | 0.8007 | 0.311 | 0.7995 | 0.8096 | 0.7492 | 0.8032 | 0.6775 | 0.7351 | 0.8242 | 0.8639 |
| 0.3321 | 45.0 | 22500 | 0.3149 | 0.7702 | 0.9513 | 0.8936 | 0.2749 | 0.7769 | 0.7853 | 0.3116 | 0.8095 | 0.8155 | 0.3581 | 0.8213 | 0.8192 | 0.7551 | 0.8103 | 0.7094 | 0.7567 | 0.8461 | 0.8794 |
| 0.2898 | 46.0 | 23000 | 0.3268 | 0.7598 | 0.9362 | 0.8826 | 0.2491 | 0.767 | 0.7647 | 0.3047 | 0.8014 | 0.8043 | 0.3048 | 0.8129 | 0.7967 | 0.7735 | 0.8246 | 0.6665 | 0.7134 | 0.8394 | 0.8748 |
| 0.3005 | 47.0 | 23500 | 0.3154 | 0.7694 | 0.9539 | 0.9 | 0.3107 | 0.7722 | 0.7935 | 0.3095 | 0.8083 | 0.8121 | 0.3771 | 0.8155 | 0.8286 | 0.7664 | 0.8179 | 0.6945 | 0.7412 | 0.8472 | 0.8773 |
| 0.2937 | 48.0 | 24000 | 0.3359 | 0.7649 | 0.9433 | 0.8798 | 0.2748 | 0.7609 | 0.7692 | 0.3077 | 0.8046 | 0.8085 | 0.3462 | 0.8091 | 0.8109 | 0.7668 | 0.8167 | 0.6882 | 0.734 | 0.8397 | 0.8748 |
| 0.2994 | 49.0 | 24500 | 0.3217 | 0.7658 | 0.9399 | 0.9006 | 0.2809 | 0.758 | 0.7851 | 0.3086 | 0.8038 | 0.8083 | 0.3738 | 0.8032 | 0.8207 | 0.7636 | 0.8075 | 0.6925 | 0.7454 | 0.8414 | 0.8721 |
| 0.2677 | 50.0 | 25000 | 0.3322 | 0.7653 | 0.9479 | 0.8893 | 0.2232 | 0.7697 | 0.768 | 0.3096 | 0.8031 | 0.8077 | 0.3152 | 0.8123 | 0.8061 | 0.7537 | 0.8004 | 0.6944 | 0.7454 | 0.8477 | 0.8773 |
| 0.2658 | 51.0 | 25500 | 0.3119 | 0.7865 | 0.9556 | 0.9088 | 0.3102 | 0.7849 | 0.8165 | 0.3118 | 0.8204 | 0.8252 | 0.4181 | 0.8242 | 0.8406 | 0.7873 | 0.8282 | 0.7245 | 0.7722 | 0.8476 | 0.8752 |
| 0.3013 | 52.0 | 26000 | 0.3267 | 0.7682 | 0.9527 | 0.9042 | 0.299 | 0.7747 | 0.7725 | 0.3082 | 0.8062 | 0.811 | 0.3948 | 0.8158 | 0.8119 | 0.7713 | 0.8183 | 0.6935 | 0.7433 | 0.8398 | 0.8715 |
| 0.2996 | 53.0 | 26500 | 0.3058 | 0.7848 | 0.955 | 0.9014 | 0.2989 | 0.7855 | 0.8104 | 0.3172 | 0.8235 | 0.8269 | 0.371 | 0.8286 | 0.8383 | 0.7765 | 0.8242 | 0.7282 | 0.7753 | 0.8498 | 0.8812 |
| 0.2698 | 54.0 | 27000 | 0.3091 | 0.7792 | 0.958 | 0.9146 | 0.2821 | 0.7764 | 0.7964 | 0.3124 | 0.8157 | 0.8212 | 0.3819 | 0.8203 | 0.826 | 0.7657 | 0.8163 | 0.7275 | 0.7701 | 0.8442 | 0.8773 |
| 0.279 | 55.0 | 27500 | 0.3156 | 0.7872 | 0.9579 | 0.9076 | 0.303 | 0.7809 | 0.8213 | 0.3155 | 0.826 | 0.8296 | 0.421 | 0.8274 | 0.846 | 0.7743 | 0.8266 | 0.7342 | 0.7784 | 0.8532 | 0.8839 |
| 0.3081 | 56.0 | 28000 | 0.3420 | 0.7643 | 0.9598 | 0.9073 | 0.2604 | 0.7595 | 0.7944 | 0.3077 | 0.8032 | 0.8084 | 0.3362 | 0.8074 | 0.8304 | 0.7396 | 0.7881 | 0.7272 | 0.7753 | 0.8262 | 0.8618 |
| 0.2411 | 57.0 | 28500 | 0.3053 | 0.7867 | 0.9563 | 0.9061 | 0.3135 | 0.7858 | 0.8014 | 0.3175 | 0.8211 | 0.8253 | 0.389 | 0.8296 | 0.8334 | 0.785 | 0.8298 | 0.716 | 0.7588 | 0.8592 | 0.8873 |
| 0.2855 | 58.0 | 29000 | 0.3166 | 0.7775 | 0.9534 | 0.9049 | 0.3075 | 0.7825 | 0.7655 | 0.3146 | 0.8165 | 0.821 | 0.4033 | 0.8269 | 0.8077 | 0.7687 | 0.8242 | 0.7158 | 0.7598 | 0.8479 | 0.8791 |
| 0.267 | 59.0 | 29500 | 0.3122 | 0.7824 | 0.9393 | 0.8946 | 0.2954 | 0.7884 | 0.7996 | 0.3128 | 0.8206 | 0.8242 | 0.3662 | 0.8287 | 0.8325 | 0.7812 | 0.8325 | 0.7079 | 0.7495 | 0.8583 | 0.8906 |
| 0.2794 | 60.0 | 30000 | 0.3151 | 0.7828 | 0.9538 | 0.9016 | 0.2872 | 0.7855 | 0.778 | 0.3131 | 0.8211 | 0.8243 | 0.37 | 0.8286 | 0.813 | 0.7778 | 0.8242 | 0.7169 | 0.766 | 0.8538 | 0.8827 |
| 0.2753 | 61.0 | 30500 | 0.3159 | 0.7771 | 0.9501 | 0.9101 | 0.2857 | 0.7781 | 0.7739 | 0.3145 | 0.8203 | 0.824 | 0.4062 | 0.8265 | 0.8113 | 0.7801 | 0.8345 | 0.7112 | 0.7629 | 0.8401 | 0.8745 |
| 0.2723 | 62.0 | 31000 | 0.3247 | 0.7794 | 0.9358 | 0.8932 | 0.2729 | 0.7824 | 0.7706 | 0.31 | 0.8159 | 0.8191 | 0.361 | 0.8214 | 0.802 | 0.7704 | 0.8222 | 0.7175 | 0.7536 | 0.8503 | 0.8815 |
| 0.2692 | 63.0 | 31500 | 0.3120 | 0.78 | 0.9539 | 0.9024 | 0.3299 | 0.7832 | 0.7737 | 0.3081 | 0.8156 | 0.8214 | 0.4438 | 0.8239 | 0.8035 | 0.7707 | 0.8222 | 0.7095 | 0.7495 | 0.8597 | 0.8924 |
| 0.2669 | 64.0 | 32000 | 0.3148 | 0.7769 | 0.9381 | 0.8971 | 0.2555 | 0.7829 | 0.7754 | 0.3104 | 0.8145 | 0.8184 | 0.3438 | 0.825 | 0.8081 | 0.7828 | 0.8313 | 0.6954 | 0.7371 | 0.8527 | 0.8867 |
| 0.2879 | 65.0 | 32500 | 0.3274 | 0.7675 | 0.9433 | 0.8828 | 0.3322 | 0.7679 | 0.7775 | 0.3077 | 0.811 | 0.8149 | 0.4195 | 0.8156 | 0.8094 | 0.7644 | 0.8147 | 0.6933 | 0.7464 | 0.8448 | 0.8836 |
| 0.2846 | 66.0 | 33000 | 0.3176 | 0.7834 | 0.9533 | 0.8909 | 0.2865 | 0.7898 | 0.8015 | 0.3158 | 0.829 | 0.8341 | 0.3976 | 0.8383 | 0.8359 | 0.7678 | 0.823 | 0.7381 | 0.8 | 0.8442 | 0.8794 |
| 0.2543 | 67.0 | 33500 | 0.3261 | 0.7757 | 0.9517 | 0.8887 | 0.2777 | 0.7769 | 0.7902 | 0.3088 | 0.8168 | 0.8227 | 0.4229 | 0.8219 | 0.8252 | 0.7759 | 0.8274 | 0.7054 | 0.7598 | 0.8457 | 0.8809 |
| 0.2968 | 68.0 | 34000 | 0.3083 | 0.782 | 0.9658 | 0.9035 | 0.3121 | 0.7814 | 0.8046 | 0.3086 | 0.8228 | 0.828 | 0.4286 | 0.8297 | 0.8425 | 0.7727 | 0.8159 | 0.7152 | 0.7773 | 0.8581 | 0.8909 |
| 0.2598 | 69.0 | 34500 | 0.3058 | 0.7852 | 0.953 | 0.9034 | 0.3264 | 0.7911 | 0.7915 | 0.3148 | 0.8295 | 0.8336 | 0.4014 | 0.8387 | 0.8317 | 0.7819 | 0.8369 | 0.7188 | 0.7742 | 0.855 | 0.8897 |
| 0.2432 | 70.0 | 35000 | 0.3179 | 0.7686 | 0.9519 | 0.8945 | 0.3086 | 0.7737 | 0.7688 | 0.309 | 0.8117 | 0.816 | 0.4214 | 0.8218 | 0.8084 | 0.765 | 0.8194 | 0.688 | 0.7412 | 0.8528 | 0.8873 |
| 0.2625 | 71.0 | 35500 | 0.3310 | 0.7748 | 0.952 | 0.8946 | 0.2824 | 0.7826 | 0.771 | 0.3096 | 0.814 | 0.8185 | 0.369 | 0.8234 | 0.8166 | 0.7704 | 0.8202 | 0.7187 | 0.767 | 0.8352 | 0.8682 |
| 0.2869 | 72.0 | 36000 | 0.3307 | 0.7664 | 0.9427 | 0.8855 | 0.331 | 0.7779 | 0.7498 | 0.3018 | 0.8058 | 0.8102 | 0.4124 | 0.821 | 0.7887 | 0.7688 | 0.8246 | 0.6784 | 0.7196 | 0.852 | 0.8864 |
| 0.2644 | 73.0 | 36500 | 0.3320 | 0.7719 | 0.9517 | 0.8835 | 0.3091 | 0.7788 | 0.7436 | 0.3064 | 0.8095 | 0.8125 | 0.4224 | 0.8211 | 0.7816 | 0.7676 | 0.8155 | 0.6998 | 0.7381 | 0.8485 | 0.8839 |
| 0.2598 | 74.0 | 37000 | 0.3211 | 0.7792 | 0.9438 | 0.8931 | 0.2695 | 0.7865 | 0.7613 | 0.3083 | 0.8175 | 0.822 | 0.3433 | 0.8334 | 0.8011 | 0.7885 | 0.8337 | 0.6992 | 0.7433 | 0.8501 | 0.8891 |
| 0.2982 | 75.0 | 37500 | 0.3129 | 0.7714 | 0.9429 | 0.891 | 0.2886 | 0.7808 | 0.7612 | 0.3065 | 0.8116 | 0.8157 | 0.3743 | 0.8253 | 0.7992 | 0.7703 | 0.8214 | 0.6943 | 0.7402 | 0.8496 | 0.8855 |
| 0.2442 | 76.0 | 38000 | 0.3125 | 0.7798 | 0.9428 | 0.8784 | 0.2624 | 0.7839 | 0.7722 | 0.3123 | 0.8172 | 0.8222 | 0.3724 | 0.8285 | 0.8086 | 0.7815 | 0.8258 | 0.7029 | 0.7515 | 0.8551 | 0.8894 |
| 0.2609 | 77.0 | 38500 | 0.3084 | 0.7785 | 0.9488 | 0.8845 | 0.2835 | 0.7906 | 0.7825 | 0.3125 | 0.8198 | 0.8239 | 0.3833 | 0.8346 | 0.826 | 0.7794 | 0.823 | 0.706 | 0.7608 | 0.85 | 0.8879 |
| 0.276 | 78.0 | 39000 | 0.3242 | 0.7851 | 0.9398 | 0.8971 | 0.2651 | 0.7886 | 0.7793 | 0.3116 | 0.8239 | 0.8279 | 0.3433 | 0.8364 | 0.8149 | 0.783 | 0.8298 | 0.7233 | 0.768 | 0.8489 | 0.8858 |
| 0.2669 | 79.0 | 39500 | 0.3163 | 0.7804 | 0.9405 | 0.8776 | 0.2287 | 0.7854 | 0.7748 | 0.3102 | 0.8229 | 0.8266 | 0.3114 | 0.8344 | 0.8049 | 0.7806 | 0.8313 | 0.7119 | 0.7619 | 0.8487 | 0.8867 |
| 0.2178 | 80.0 | 40000 | 0.3195 | 0.7717 | 0.9477 | 0.8871 | 0.3213 | 0.7833 | 0.7551 | 0.3063 | 0.8144 | 0.8179 | 0.409 | 0.8295 | 0.791 | 0.77 | 0.821 | 0.6939 | 0.7423 | 0.8512 | 0.8903 |
| 0.2674 | 81.0 | 40500 | 0.3227 | 0.7798 | 0.9389 | 0.8906 | 0.2706 | 0.7869 | 0.7781 | 0.3123 | 0.8198 | 0.8235 | 0.3557 | 0.8306 | 0.812 | 0.7846 | 0.8341 | 0.7025 | 0.7515 | 0.8523 | 0.8848 |
| 0.267 | 82.0 | 41000 | 0.3452 | 0.7631 | 0.9325 | 0.8928 | 0.2776 | 0.7729 | 0.7402 | 0.3044 | 0.8054 | 0.8094 | 0.361 | 0.8197 | 0.7818 | 0.7606 | 0.8163 | 0.6837 | 0.7299 | 0.845 | 0.8821 |
| 0.2288 | 83.0 | 41500 | 0.3283 | 0.7862 | 0.9395 | 0.897 | 0.29 | 0.789 | 0.7729 | 0.3154 | 0.825 | 0.8282 | 0.3833 | 0.833 | 0.8075 | 0.781 | 0.8357 | 0.7269 | 0.766 | 0.8506 | 0.883 |
| 0.2467 | 84.0 | 42000 | 0.3113 | 0.7818 | 0.938 | 0.884 | 0.3061 | 0.7907 | 0.7668 | 0.3087 | 0.8224 | 0.8253 | 0.4119 | 0.8334 | 0.7987 | 0.7704 | 0.8218 | 0.716 | 0.7588 | 0.8588 | 0.8955 |
| 0.2282 | 85.0 | 42500 | 0.3234 | 0.7902 | 0.9402 | 0.898 | 0.2703 | 0.7938 | 0.7856 | 0.3133 | 0.8274 | 0.8308 | 0.3562 | 0.835 | 0.8148 | 0.7968 | 0.8421 | 0.7168 | 0.7619 | 0.8568 | 0.8885 |
| 0.2556 | 86.0 | 43000 | 0.3280 | 0.788 | 0.9457 | 0.8903 | 0.3085 | 0.7956 | 0.7863 | 0.3152 | 0.8288 | 0.8317 | 0.4019 | 0.837 | 0.8213 | 0.7853 | 0.8313 | 0.7236 | 0.7742 | 0.8552 | 0.8894 |
| 0.2462 | 87.0 | 43500 | 0.3256 | 0.7821 | 0.9447 | 0.8838 | 0.28 | 0.7892 | 0.7757 | 0.3109 | 0.8211 | 0.825 | 0.3581 | 0.8308 | 0.8074 | 0.773 | 0.8258 | 0.7146 | 0.7577 | 0.8586 | 0.8915 |
| 0.2446 | 88.0 | 44000 | 0.3304 | 0.7844 | 0.9456 | 0.8984 | 0.2819 | 0.7876 | 0.7985 | 0.3152 | 0.8237 | 0.8281 | 0.3757 | 0.8307 | 0.8297 | 0.7825 | 0.8321 | 0.722 | 0.768 | 0.8485 | 0.8842 |
| 0.2167 | 89.0 | 44500 | 0.3307 | 0.7836 | 0.9465 | 0.8931 | 0.2796 | 0.7927 | 0.7857 | 0.3172 | 0.8224 | 0.8267 | 0.3633 | 0.8337 | 0.8227 | 0.7641 | 0.8179 | 0.7295 | 0.7742 | 0.8572 | 0.8879 |
| 0.2208 | 90.0 | 45000 | 0.3141 | 0.7897 | 0.9357 | 0.8778 | 0.2966 | 0.7964 | 0.7858 | 0.3146 | 0.8255 | 0.8305 | 0.4095 | 0.8361 | 0.8162 | 0.7981 | 0.8492 | 0.7123 | 0.7515 | 0.8586 | 0.8906 |
| 0.2179 | 91.0 | 45500 | 0.3065 | 0.7973 | 0.9521 | 0.8981 | 0.3185 | 0.8027 | 0.7978 | 0.3189 | 0.8364 | 0.8395 | 0.4481 | 0.8432 | 0.8337 | 0.7911 | 0.8397 | 0.7413 | 0.7866 | 0.8597 | 0.8921 |
| 0.223 | 92.0 | 46000 | 0.3365 | 0.7768 | 0.9443 | 0.8936 | 0.3138 | 0.782 | 0.7559 | 0.3082 | 0.8158 | 0.8189 | 0.419 | 0.8218 | 0.7933 | 0.7745 | 0.8254 | 0.6993 | 0.7433 | 0.8566 | 0.8879 |
| 0.2352 | 93.0 | 46500 | 0.3110 | 0.7881 | 0.9509 | 0.8961 | 0.302 | 0.797 | 0.7847 | 0.3144 | 0.8266 | 0.8294 | 0.4052 | 0.8362 | 0.8163 | 0.7811 | 0.8298 | 0.7202 | 0.7639 | 0.863 | 0.8945 |
| 0.2379 | 94.0 | 47000 | 0.3129 | 0.7849 | 0.942 | 0.8943 | 0.2918 | 0.7914 | 0.7724 | 0.3131 | 0.8218 | 0.8257 | 0.3952 | 0.8352 | 0.8035 | 0.7789 | 0.8266 | 0.713 | 0.7557 | 0.8627 | 0.8948 |
| 0.2309 | 95.0 | 47500 | 0.3100 | 0.794 | 0.9435 | 0.907 | 0.2909 | 0.8005 | 0.7779 | 0.3159 | 0.8312 | 0.8368 | 0.4524 | 0.8441 | 0.8063 | 0.7918 | 0.8437 | 0.7237 | 0.7691 | 0.8664 | 0.8976 |
| 0.2421 | 96.0 | 48000 | 0.3244 | 0.789 | 0.9436 | 0.8999 | 0.307 | 0.7934 | 0.781 | 0.315 | 0.8261 | 0.8304 | 0.4257 | 0.8316 | 0.8173 | 0.7863 | 0.8365 | 0.7156 | 0.7608 | 0.865 | 0.8939 |
| 0.2159 | 97.0 | 48500 | 0.3186 | 0.796 | 0.9441 | 0.9013 | 0.2801 | 0.7992 | 0.8041 | 0.3212 | 0.8339 | 0.8378 | 0.3686 | 0.8442 | 0.8324 | 0.7891 | 0.8357 | 0.7372 | 0.7825 | 0.8619 | 0.8952 |
| 0.2395 | 98.0 | 49000 | 0.3188 | 0.7928 | 0.9407 | 0.8933 | 0.315 | 0.801 | 0.7774 | 0.3147 | 0.83 | 0.8329 | 0.4005 | 0.8418 | 0.8055 | 0.7878 | 0.8337 | 0.727 | 0.7691 | 0.8635 | 0.8958 |
| 0.2334 | 99.0 | 49500 | 0.2972 | 0.8062 | 0.9481 | 0.9016 | 0.3125 | 0.8137 | 0.812 | 0.3235 | 0.8437 | 0.8469 | 0.3895 | 0.8561 | 0.8396 | 0.8003 | 0.8468 | 0.7492 | 0.7918 | 0.8692 | 0.9021 |
| 0.2293 | 100.0 | 50000 | 0.3288 | 0.7856 | 0.9393 | 0.8979 | 0.2595 | 0.7885 | 0.7879 | 0.3122 | 0.8208 | 0.8248 | 0.3505 | 0.8297 | 0.8159 | 0.7777 | 0.8262 | 0.7237 | 0.7608 | 0.8554 | 0.8873 |
| 0.2218 | 101.0 | 50500 | 0.3177 | 0.7966 | 0.9464 | 0.8976 | 0.2801 | 0.8019 | 0.8037 | 0.3176 | 0.8319 | 0.8356 | 0.3643 | 0.8386 | 0.8383 | 0.7882 | 0.8333 | 0.7359 | 0.7773 | 0.8657 | 0.8961 |
| 0.207 | 102.0 | 51000 | 0.3204 | 0.7944 | 0.9457 | 0.8966 | 0.2779 | 0.8032 | 0.7741 | 0.3145 | 0.8296 | 0.8333 | 0.361 | 0.8433 | 0.8043 | 0.793 | 0.8417 | 0.726 | 0.7639 | 0.8642 | 0.8942 |
| 0.2236 | 103.0 | 51500 | 0.3233 | 0.7909 | 0.9434 | 0.8957 | 0.2628 | 0.8028 | 0.7914 | 0.3152 | 0.8281 | 0.8326 | 0.379 | 0.8406 | 0.8274 | 0.7808 | 0.8306 | 0.7275 | 0.7691 | 0.8644 | 0.8982 |
| 0.2209 | 104.0 | 52000 | 0.3113 | 0.8037 | 0.9459 | 0.915 | 0.302 | 0.8082 | 0.8123 | 0.3201 | 0.8385 | 0.8435 | 0.4129 | 0.8493 | 0.8403 | 0.7971 | 0.844 | 0.7465 | 0.7876 | 0.8674 | 0.8988 |
| 0.2005 | 105.0 | 52500 | 0.3211 | 0.8 | 0.9458 | 0.9061 | 0.3026 | 0.8039 | 0.8066 | 0.3217 | 0.8355 | 0.8388 | 0.3881 | 0.8449 | 0.8356 | 0.7872 | 0.8369 | 0.7489 | 0.7866 | 0.8639 | 0.893 |
| 0.2611 | 106.0 | 53000 | 0.3086 | 0.7984 | 0.9393 | 0.9063 | 0.3172 | 0.8085 | 0.7794 | 0.3171 | 0.8347 | 0.8377 | 0.3995 | 0.851 | 0.8088 | 0.7972 | 0.8472 | 0.7298 | 0.766 | 0.8682 | 0.9 |
| 0.2117 | 107.0 | 53500 | 0.3087 | 0.7914 | 0.9424 | 0.8985 | 0.3112 | 0.7978 | 0.7693 | 0.315 | 0.8293 | 0.833 | 0.4114 | 0.8418 | 0.8029 | 0.7812 | 0.8333 | 0.7295 | 0.768 | 0.8635 | 0.8976 |
| 0.2093 | 108.0 | 54000 | 0.3056 | 0.7981 | 0.9479 | 0.9065 | 0.2851 | 0.807 | 0.8011 | 0.3201 | 0.8326 | 0.8376 | 0.3867 | 0.8447 | 0.8322 | 0.7836 | 0.8317 | 0.7378 | 0.7794 | 0.873 | 0.9018 |
| 0.2155 | 109.0 | 54500 | 0.3124 | 0.8016 | 0.9461 | 0.9017 | 0.2928 | 0.8107 | 0.7842 | 0.3212 | 0.8377 | 0.8413 | 0.3738 | 0.8527 | 0.8171 | 0.7936 | 0.8456 | 0.7448 | 0.7835 | 0.8664 | 0.8948 |
| 0.2033 | 110.0 | 55000 | 0.3006 | 0.8014 | 0.9508 | 0.9072 | 0.2781 | 0.8032 | 0.8093 | 0.3222 | 0.8395 | 0.8436 | 0.3671 | 0.8502 | 0.8373 | 0.7922 | 0.8448 | 0.7448 | 0.7887 | 0.8671 | 0.8973 |
| 0.2083 | 111.0 | 55500 | 0.3220 | 0.799 | 0.9442 | 0.8934 | 0.2552 | 0.8021 | 0.7974 | 0.3215 | 0.8361 | 0.8386 | 0.3371 | 0.8459 | 0.8293 | 0.7975 | 0.8437 | 0.7367 | 0.7784 | 0.8629 | 0.8939 |
| 0.2128 | 112.0 | 56000 | 0.3111 | 0.7999 | 0.9453 | 0.8985 | 0.292 | 0.8055 | 0.807 | 0.3216 | 0.8357 | 0.839 | 0.369 | 0.8456 | 0.8404 | 0.8004 | 0.8472 | 0.7292 | 0.7711 | 0.8701 | 0.8988 |
| 0.2104 | 113.0 | 56500 | 0.3162 | 0.7986 | 0.9476 | 0.9022 | 0.2889 | 0.8053 | 0.8013 | 0.3201 | 0.8347 | 0.8382 | 0.3781 | 0.8441 | 0.8324 | 0.7916 | 0.8397 | 0.7354 | 0.7763 | 0.8689 | 0.8985 |
| 0.2143 | 114.0 | 57000 | 0.3173 | 0.7983 | 0.9394 | 0.9003 | 0.2735 | 0.8079 | 0.7805 | 0.3189 | 0.8345 | 0.8368 | 0.3362 | 0.8475 | 0.814 | 0.7948 | 0.8413 | 0.7299 | 0.7701 | 0.8703 | 0.8991 |
| 0.2068 | 115.0 | 57500 | 0.3119 | 0.7984 | 0.9423 | 0.9043 | 0.2954 | 0.81 | 0.7703 | 0.3175 | 0.8341 | 0.8366 | 0.3605 | 0.8496 | 0.8062 | 0.7905 | 0.8409 | 0.7332 | 0.7691 | 0.8714 | 0.9 |
| 0.229 | 116.0 | 58000 | 0.3149 | 0.7965 | 0.9423 | 0.8993 | 0.2689 | 0.8075 | 0.7753 | 0.3185 | 0.8322 | 0.836 | 0.37 | 0.8469 | 0.8078 | 0.7908 | 0.8401 | 0.7306 | 0.7701 | 0.8682 | 0.8979 |
| 0.2138 | 117.0 | 58500 | 0.3153 | 0.7996 | 0.9409 | 0.9016 | 0.2821 | 0.8053 | 0.7852 | 0.3196 | 0.8357 | 0.8388 | 0.3762 | 0.848 | 0.8162 | 0.7975 | 0.8468 | 0.7319 | 0.7701 | 0.8692 | 0.8994 |
| 0.2498 | 118.0 | 59000 | 0.3178 | 0.7947 | 0.9367 | 0.8946 | 0.3007 | 0.8062 | 0.7716 | 0.3153 | 0.8313 | 0.8343 | 0.3781 | 0.8467 | 0.8048 | 0.7845 | 0.8341 | 0.7271 | 0.767 | 0.8725 | 0.9018 |
| 0.1974 | 119.0 | 59500 | 0.3135 | 0.7969 | 0.942 | 0.8924 | 0.3104 | 0.8037 | 0.7801 | 0.3179 | 0.8314 | 0.8342 | 0.3767 | 0.8466 | 0.8106 | 0.7876 | 0.8349 | 0.7301 | 0.766 | 0.873 | 0.9018 |
| 0.2079 | 120.0 | 60000 | 0.3068 | 0.7985 | 0.9365 | 0.8994 | 0.2926 | 0.8085 | 0.772 | 0.3193 | 0.8339 | 0.8366 | 0.351 | 0.8518 | 0.8033 | 0.7931 | 0.8405 | 0.7299 | 0.768 | 0.8724 | 0.9012 |
| 0.2138 | 121.0 | 60500 | 0.3116 | 0.8011 | 0.9434 | 0.8993 | 0.3113 | 0.8077 | 0.8025 | 0.3197 | 0.8373 | 0.8408 | 0.3848 | 0.8503 | 0.8327 | 0.792 | 0.8409 | 0.7407 | 0.7814 | 0.8705 | 0.9 |
| 0.2041 | 122.0 | 61000 | 0.3138 | 0.7979 | 0.9424 | 0.8994 | 0.3121 | 0.8087 | 0.7685 | 0.3184 | 0.834 | 0.8369 | 0.3962 | 0.8509 | 0.8019 | 0.7874 | 0.8345 | 0.7341 | 0.7753 | 0.8723 | 0.9009 |
| 0.1967 | 123.0 | 61500 | 0.3122 | 0.8005 | 0.9388 | 0.9016 | 0.2981 | 0.8074 | 0.7723 | 0.3177 | 0.8352 | 0.8376 | 0.3719 | 0.8508 | 0.8013 | 0.7988 | 0.8452 | 0.7333 | 0.7691 | 0.8692 | 0.8985 |
| 0.2053 | 124.0 | 62000 | 0.3151 | 0.7946 | 0.9362 | 0.9015 | 0.2908 | 0.8052 | 0.7641 | 0.318 | 0.8299 | 0.8325 | 0.3738 | 0.8464 | 0.7943 | 0.7887 | 0.8393 | 0.7308 | 0.7639 | 0.8642 | 0.8942 |
| 0.2082 | 125.0 | 62500 | 0.3126 | 0.7999 | 0.9362 | 0.9025 | 0.2865 | 0.8106 | 0.7728 | 0.3186 | 0.8336 | 0.836 | 0.3686 | 0.8515 | 0.7985 | 0.7976 | 0.8429 | 0.7335 | 0.767 | 0.8688 | 0.8982 |
| 0.2148 | 126.0 | 63000 | 0.3068 | 0.8039 | 0.9451 | 0.9013 | 0.2967 | 0.8156 | 0.7788 | 0.3212 | 0.8382 | 0.841 | 0.3752 | 0.8559 | 0.8073 | 0.8 | 0.8476 | 0.7426 | 0.7763 | 0.869 | 0.8991 |
| 0.2408 | 127.0 | 63500 | 0.3024 | 0.8035 | 0.9472 | 0.9046 | 0.3043 | 0.8137 | 0.8039 | 0.3219 | 0.8391 | 0.8428 | 0.3862 | 0.8542 | 0.8365 | 0.7934 | 0.8444 | 0.7443 | 0.7825 | 0.8727 | 0.9015 |
| 0.2024 | 128.0 | 64000 | 0.3083 | 0.8002 | 0.9437 | 0.9056 | 0.3118 | 0.811 | 0.7795 | 0.3183 | 0.8352 | 0.8386 | 0.389 | 0.8517 | 0.8137 | 0.7872 | 0.8401 | 0.7398 | 0.7742 | 0.8737 | 0.9015 |
| 0.2157 | 129.0 | 64500 | 0.3075 | 0.8026 | 0.9416 | 0.8993 | 0.3022 | 0.8141 | 0.7809 | 0.3213 | 0.8373 | 0.8407 | 0.3781 | 0.8554 | 0.8121 | 0.7947 | 0.8444 | 0.7423 | 0.7773 | 0.8709 | 0.9003 |
| 0.2064 | 130.0 | 65000 | 0.3108 | 0.8024 | 0.9405 | 0.8981 | 0.3084 | 0.8127 | 0.7864 | 0.3208 | 0.8373 | 0.8407 | 0.3976 | 0.8542 | 0.8147 | 0.7937 | 0.844 | 0.7428 | 0.7784 | 0.8708 | 0.8997 |
| 0.1886 | 131.0 | 65500 | 0.3046 | 0.8058 | 0.9408 | 0.9039 | 0.2995 | 0.813 | 0.792 | 0.3228 | 0.8402 | 0.8437 | 0.3895 | 0.8542 | 0.8232 | 0.798 | 0.8476 | 0.7472 | 0.7825 | 0.8721 | 0.9009 |
| 0.2001 | 132.0 | 66000 | 0.3036 | 0.8018 | 0.9402 | 0.9011 | 0.298 | 0.8102 | 0.7865 | 0.3198 | 0.8374 | 0.8404 | 0.3762 | 0.8525 | 0.8175 | 0.7945 | 0.844 | 0.742 | 0.7784 | 0.869 | 0.8988 |
| 0.2174 | 133.0 | 66500 | 0.3015 | 0.8053 | 0.9437 | 0.9028 | 0.3028 | 0.8146 | 0.7837 | 0.3223 | 0.8393 | 0.8422 | 0.3795 | 0.8545 | 0.814 | 0.7967 | 0.8464 | 0.7486 | 0.7814 | 0.8705 | 0.8988 |
| 0.1916 | 134.0 | 67000 | 0.3066 | 0.8044 | 0.9404 | 0.9053 | 0.3072 | 0.8123 | 0.7829 | 0.3217 | 0.8386 | 0.8417 | 0.3943 | 0.8525 | 0.8119 | 0.7965 | 0.848 | 0.7464 | 0.7784 | 0.8704 | 0.8988 |
| 0.2129 | 135.0 | 67500 | 0.3089 | 0.8007 | 0.9374 | 0.9011 | 0.2991 | 0.8089 | 0.7809 | 0.3206 | 0.8344 | 0.838 | 0.3862 | 0.8496 | 0.8124 | 0.793 | 0.8433 | 0.7368 | 0.7701 | 0.8722 | 0.9006 |
| 0.2161 | 136.0 | 68000 | 0.3047 | 0.806 | 0.9405 | 0.9026 | 0.3 | 0.8156 | 0.7859 | 0.3214 | 0.8397 | 0.8431 | 0.3895 | 0.8561 | 0.8157 | 0.7985 | 0.8468 | 0.7468 | 0.7804 | 0.8727 | 0.9021 |
| 0.2227 | 137.0 | 68500 | 0.3070 | 0.8041 | 0.9412 | 0.9039 | 0.309 | 0.814 | 0.7813 | 0.3206 | 0.8384 | 0.8413 | 0.3824 | 0.8539 | 0.8142 | 0.7924 | 0.8433 | 0.7456 | 0.7784 | 0.8743 | 0.9024 |
| 0.219 | 138.0 | 69000 | 0.3046 | 0.8056 | 0.9405 | 0.9058 | 0.3072 | 0.816 | 0.7821 | 0.3222 | 0.8391 | 0.8421 | 0.3857 | 0.8561 | 0.8109 | 0.7956 | 0.8444 | 0.7483 | 0.7804 | 0.873 | 0.9015 |
| 0.201 | 139.0 | 69500 | 0.3036 | 0.8044 | 0.9405 | 0.9022 | 0.3104 | 0.8138 | 0.7828 | 0.3222 | 0.8385 | 0.8414 | 0.3857 | 0.8537 | 0.8135 | 0.7916 | 0.8417 | 0.7483 | 0.7814 | 0.8732 | 0.9012 |
| 0.2011 | 140.0 | 70000 | 0.3054 | 0.8033 | 0.9404 | 0.9008 | 0.2954 | 0.814 | 0.7807 | 0.3215 | 0.8369 | 0.8403 | 0.3762 | 0.8538 | 0.8122 | 0.791 | 0.8405 | 0.7461 | 0.7794 | 0.8729 | 0.9009 |
| 0.2139 | 141.0 | 70500 | 0.3038 | 0.8057 | 0.9404 | 0.9055 | 0.3014 | 0.8142 | 0.7827 | 0.3226 | 0.8397 | 0.8427 | 0.3857 | 0.8545 | 0.8134 | 0.7963 | 0.8456 | 0.7496 | 0.7825 | 0.8712 | 0.9 |
| 0.2095 | 142.0 | 71000 | 0.3056 | 0.8043 | 0.9404 | 0.9023 | 0.2945 | 0.8148 | 0.7828 | 0.3213 | 0.838 | 0.8417 | 0.3862 | 0.8545 | 0.8134 | 0.7941 | 0.8448 | 0.746 | 0.7794 | 0.8728 | 0.9009 |
| 0.2028 | 143.0 | 71500 | 0.3062 | 0.805 | 0.9404 | 0.9022 | 0.2943 | 0.8141 | 0.7824 | 0.3214 | 0.8382 | 0.8419 | 0.3829 | 0.8545 | 0.8135 | 0.7948 | 0.8444 | 0.7472 | 0.7804 | 0.8732 | 0.9009 |
| 0.2026 | 144.0 | 72000 | 0.3060 | 0.8046 | 0.9405 | 0.9049 | 0.2989 | 0.8135 | 0.7825 | 0.3213 | 0.8384 | 0.8418 | 0.3829 | 0.854 | 0.8135 | 0.7929 | 0.8433 | 0.7487 | 0.7814 | 0.8723 | 0.9006 |
| 0.2019 | 145.0 | 72500 | 0.3062 | 0.8046 | 0.9404 | 0.9017 | 0.3043 | 0.8126 | 0.7825 | 0.3214 | 0.8382 | 0.8416 | 0.3862 | 0.8536 | 0.8135 | 0.7932 | 0.8437 | 0.7477 | 0.7804 | 0.873 | 0.9006 |
| 0.1945 | 146.0 | 73000 | 0.3056 | 0.8047 | 0.9405 | 0.9024 | 0.2993 | 0.8137 | 0.7842 | 0.3221 | 0.8384 | 0.8418 | 0.3829 | 0.8542 | 0.8145 | 0.7934 | 0.844 | 0.7479 | 0.7804 | 0.8728 | 0.9009 |
| 0.1983 | 147.0 | 73500 | 0.3080 | 0.8049 | 0.9405 | 0.9024 | 0.3009 | 0.8141 | 0.7843 | 0.3218 | 0.8386 | 0.842 | 0.3829 | 0.8546 | 0.8147 | 0.7937 | 0.844 | 0.7479 | 0.7804 | 0.8731 | 0.9015 |
| 0.19 | 148.0 | 74000 | 0.3058 | 0.8044 | 0.9405 | 0.9024 | 0.2979 | 0.8141 | 0.7843 | 0.3221 | 0.8382 | 0.8419 | 0.3829 | 0.8546 | 0.8145 | 0.7936 | 0.844 | 0.7475 | 0.7804 | 0.8722 | 0.9012 |
| 0.1978 | 149.0 | 74500 | 0.3059 | 0.8044 | 0.9405 | 0.9024 | 0.2979 | 0.8141 | 0.7843 | 0.3221 | 0.8382 | 0.8419 | 0.3829 | 0.8546 | 0.8145 | 0.7936 | 0.844 | 0.7475 | 0.7804 | 0.8722 | 0.9012 |
| 0.2344 | 150.0 | 75000 | 0.3059 | 0.8044 | 0.9405 | 0.9024 | 0.2979 | 0.8141 | 0.7843 | 0.3221 | 0.8382 | 0.8419 | 0.3829 | 0.8546 | 0.8145 | 0.7936 | 0.844 | 0.7475 | 0.7804 | 0.8722 | 0.9012 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.3
| [
"chicken",
"duck",
"plant"
] |
syamgopal/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
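As a rough illustration, the settings above (including Native AMP) map onto Trainer flags like the following sketch; the output directory is a placeholder and nothing else about the training script is taken from this card.
```python
# Sketch of the Trainer flags implied by the hyperparameters above;
# fp16=True enables native AMP mixed-precision training on CUDA.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-resnet-50_finetuned_cppe5",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,  # Native AMP
)
```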
### Training results
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"n/a",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"street sign",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"hat",
"backpack",
"umbrella",
"shoe",
"eye glasses",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"plate",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"mirror",
"dining table",
"window",
"desk",
"toilet",
"door",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"blender",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
joe611/chickens-composite-101818181818-150-epochs-wo-transform-metrics-test-shfld |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# chickens-composite-101818181818-150-epochs-wo-transform-metrics-test-shfld
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2924
- Map: 0.8009
- Map 50: 0.9452
- Map 75: 0.8912
- Map Small: 0.3686
- Map Medium: 0.8037
- Map Large: 0.7895
- Mar 1: 0.3378
- Mar 10: 0.8405
- Mar 100: 0.8439
- Mar Small: 0.4542
- Mar Medium: 0.8478
- Mar Large: 0.8254
- Map Chicken: 0.8118
- Mar 100 Chicken: 0.852
- Map Duck: 0.7142
- Mar 100 Duck: 0.7705
- Map Plant: 0.8766
- Mar 100 Plant: 0.909
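The Map/Mar values above are averaged-precision and averaged-recall figures in the usual COCO convention (IoU-averaged, at 50/75 IoU, by object size, and per class). A minimal sketch of computing such metrics with torchmetrics follows; the library choice and the toy boxes are assumptions for illustration, not a statement about how this card's numbers were produced.
```python
# Minimal sketch of COCO-style mAP/mAR computation with torchmetrics.
# The toy prediction/target boxes are illustrative only.
import torch
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(class_metrics=True)  # also report per-class AP/AR

preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0]]),  # xyxy
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),  # 0 = chicken in this card's label set
}]
targets = [{
    "boxes": torch.tensor([[12.0, 12.0, 48.0, 48.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"], results["mar_100"])
```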
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9,0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 150
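With the cosine schedule listed above, the learning rate decays from 1e-05 toward zero over the run (the log below shows 500 optimizer steps per epoch, so about 75,000 steps in total). A standalone sketch using the transformers scheduler helper follows; the toy parameter, the absence of warmup, and the AdamW construction are assumptions for illustration.
```python
# Standalone sketch of the cosine learning-rate schedule listed above.
# Toy parameter group; 500 steps/epoch x 150 epochs per the training log; no warmup assumed.
import torch
from transformers import get_cosine_schedule_with_warmup

params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.AdamW(params, lr=1e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=500 * 150
)

for step in range(500 * 150):
    optimizer.step()
    scheduler.step()
    if step % 25_000 == 0:
        print(step, scheduler.get_last_lr()[0])
```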
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.4873 | 1.0 | 500 | 1.3189 | 0.1046 | 0.1417 | 0.1198 | 0.0079 | 0.0437 | 0.1252 | 0.0858 | 0.2076 | 0.3054 | 0.0875 | 0.2716 | 0.2959 | 0.0632 | 0.1294 | 0.0 | 0.0 | 0.2506 | 0.7867 |
| 1.2334 | 2.0 | 1000 | 1.1080 | 0.1805 | 0.2428 | 0.2062 | 0.0398 | 0.1034 | 0.203 | 0.1025 | 0.3048 | 0.3342 | 0.1125 | 0.3088 | 0.3295 | 0.0851 | 0.2339 | 0.0 | 0.0 | 0.4564 | 0.7687 |
| 1.0117 | 3.0 | 1500 | 0.9088 | 0.329 | 0.4497 | 0.3943 | 0.0521 | 0.2839 | 0.3325 | 0.1249 | 0.4453 | 0.4688 | 0.1125 | 0.4566 | 0.4233 | 0.3033 | 0.6214 | 0.0 | 0.0 | 0.6837 | 0.7849 |
| 0.8522 | 4.0 | 2000 | 0.8317 | 0.3819 | 0.5371 | 0.4528 | 0.0606 | 0.3474 | 0.3707 | 0.1329 | 0.4743 | 0.4798 | 0.1125 | 0.469 | 0.4547 | 0.4512 | 0.6702 | 0.0 | 0.0 | 0.6945 | 0.7693 |
| 0.7724 | 5.0 | 2500 | 0.6907 | 0.4113 | 0.5633 | 0.4944 | 0.088 | 0.3747 | 0.4143 | 0.1351 | 0.4954 | 0.5036 | 0.2417 | 0.4716 | 0.52 | 0.4888 | 0.7141 | 0.0 | 0.0 | 0.7451 | 0.7967 |
| 0.8285 | 6.0 | 3000 | 0.6632 | 0.4081 | 0.5628 | 0.5005 | 0.0813 | 0.3738 | 0.4167 | 0.1323 | 0.4884 | 0.4948 | 0.1583 | 0.4664 | 0.5137 | 0.4993 | 0.7008 | 0.0 | 0.0 | 0.725 | 0.7837 |
| 0.6798 | 7.0 | 3500 | 0.6236 | 0.4273 | 0.5842 | 0.5064 | 0.1213 | 0.4009 | 0.4369 | 0.1351 | 0.4978 | 0.5031 | 0.2375 | 0.4806 | 0.5154 | 0.524 | 0.704 | 0.0001 | 0.0011 | 0.7578 | 0.8042 |
| 0.6397 | 8.0 | 4000 | 0.5891 | 0.4363 | 0.5891 | 0.5082 | 0.1093 | 0.4045 | 0.4477 | 0.1368 | 0.5042 | 0.5114 | 0.2 | 0.4834 | 0.5302 | 0.5399 | 0.7198 | 0.0 | 0.0 | 0.7689 | 0.8145 |
| 0.6939 | 9.0 | 4500 | 0.5602 | 0.4515 | 0.5921 | 0.5389 | 0.0922 | 0.4212 | 0.4494 | 0.1418 | 0.5154 | 0.5182 | 0.1625 | 0.4892 | 0.5317 | 0.5863 | 0.7419 | 0.0 | 0.0 | 0.7681 | 0.8127 |
| 0.6099 | 10.0 | 5000 | 0.5398 | 0.4665 | 0.6047 | 0.5592 | 0.1144 | 0.4504 | 0.4588 | 0.1452 | 0.5237 | 0.5273 | 0.1708 | 0.5098 | 0.5263 | 0.6232 | 0.7569 | 0.0 | 0.0 | 0.7764 | 0.825 |
| 0.5951 | 11.0 | 5500 | 0.5795 | 0.4559 | 0.6063 | 0.5608 | 0.166 | 0.4343 | 0.4451 | 0.1433 | 0.5071 | 0.5096 | 0.2292 | 0.4834 | 0.5192 | 0.5929 | 0.7129 | 0.0 | 0.0 | 0.7749 | 0.816 |
| 0.6176 | 12.0 | 6000 | 0.5234 | 0.4817 | 0.6241 | 0.565 | 0.1185 | 0.4608 | 0.473 | 0.1453 | 0.5244 | 0.529 | 0.1792 | 0.5061 | 0.5327 | 0.6544 | 0.7548 | 0.0 | 0.0 | 0.7908 | 0.8322 |
| 0.5803 | 13.0 | 6500 | 0.5109 | 0.4904 | 0.6438 | 0.5771 | 0.1232 | 0.4688 | 0.4817 | 0.1522 | 0.5277 | 0.5305 | 0.15 | 0.5111 | 0.5293 | 0.6824 | 0.7597 | 0.0052 | 0.0095 | 0.7838 | 0.8223 |
| 0.6283 | 14.0 | 7000 | 0.5161 | 0.5518 | 0.7502 | 0.6607 | 0.1524 | 0.5428 | 0.5207 | 0.2041 | 0.5976 | 0.602 | 0.2292 | 0.6004 | 0.5634 | 0.6488 | 0.7266 | 0.2095 | 0.24 | 0.7972 | 0.8395 |
| 0.5737 | 15.0 | 7500 | 0.4550 | 0.6726 | 0.8882 | 0.8028 | 0.1205 | 0.6694 | 0.6669 | 0.2865 | 0.72 | 0.7253 | 0.1958 | 0.7249 | 0.7118 | 0.719 | 0.769 | 0.5155 | 0.5737 | 0.7833 | 0.8331 |
| 0.5311 | 16.0 | 8000 | 0.4303 | 0.6788 | 0.8948 | 0.8211 | 0.1591 | 0.6771 | 0.6665 | 0.2883 | 0.731 | 0.736 | 0.2625 | 0.7382 | 0.7103 | 0.7118 | 0.7718 | 0.5284 | 0.5947 | 0.7962 | 0.8416 |
| 0.4312 | 17.0 | 8500 | 0.4286 | 0.692 | 0.9122 | 0.8385 | 0.1173 | 0.6931 | 0.6796 | 0.2912 | 0.7434 | 0.7454 | 0.1917 | 0.7491 | 0.7307 | 0.7277 | 0.7839 | 0.5571 | 0.6158 | 0.7912 | 0.8364 |
| 0.4567 | 18.0 | 9000 | 0.3997 | 0.7067 | 0.9243 | 0.8524 | 0.1747 | 0.7073 | 0.6759 | 0.3026 | 0.7506 | 0.755 | 0.2583 | 0.7579 | 0.7277 | 0.7249 | 0.7718 | 0.5936 | 0.6505 | 0.8016 | 0.8428 |
| 0.4276 | 19.0 | 9500 | 0.3736 | 0.7236 | 0.9256 | 0.881 | 0.2577 | 0.7235 | 0.7125 | 0.3078 | 0.7653 | 0.7694 | 0.3333 | 0.7735 | 0.7503 | 0.7435 | 0.7944 | 0.6153 | 0.6568 | 0.8119 | 0.8569 |
| 0.4423 | 20.0 | 10000 | 0.3855 | 0.7146 | 0.9297 | 0.8421 | 0.3276 | 0.7097 | 0.6869 | 0.2993 | 0.7589 | 0.7616 | 0.3611 | 0.7627 | 0.7241 | 0.7263 | 0.7774 | 0.5928 | 0.6453 | 0.8249 | 0.862 |
| 0.4228 | 21.0 | 10500 | 0.3910 | 0.7109 | 0.9407 | 0.8658 | 0.2481 | 0.7158 | 0.6789 | 0.302 | 0.7603 | 0.7636 | 0.3264 | 0.771 | 0.7226 | 0.6971 | 0.7569 | 0.632 | 0.6905 | 0.8036 | 0.8434 |
| 0.4178 | 22.0 | 11000 | 0.3759 | 0.7255 | 0.9327 | 0.8677 | 0.2395 | 0.7319 | 0.689 | 0.3085 | 0.771 | 0.7743 | 0.3528 | 0.7837 | 0.7357 | 0.7266 | 0.7839 | 0.6366 | 0.6842 | 0.8134 | 0.8548 |
| 0.3919 | 23.0 | 11500 | 0.4000 | 0.7118 | 0.9322 | 0.8445 | 0.2206 | 0.7038 | 0.689 | 0.3029 | 0.7581 | 0.7614 | 0.3361 | 0.7601 | 0.7314 | 0.7281 | 0.781 | 0.6019 | 0.6547 | 0.8055 | 0.8485 |
| 0.3834 | 24.0 | 12000 | 0.3803 | 0.7277 | 0.939 | 0.8686 | 0.2209 | 0.736 | 0.6916 | 0.3112 | 0.7671 | 0.7718 | 0.2944 | 0.7831 | 0.7274 | 0.7299 | 0.7774 | 0.6358 | 0.6821 | 0.8175 | 0.856 |
| 0.4079 | 25.0 | 12500 | 0.3698 | 0.7215 | 0.9269 | 0.859 | 0.1499 | 0.734 | 0.6866 | 0.3083 | 0.7662 | 0.7713 | 0.2042 | 0.782 | 0.7313 | 0.7381 | 0.7895 | 0.6082 | 0.6695 | 0.8182 | 0.8548 |
| 0.395 | 26.0 | 13000 | 0.3549 | 0.7357 | 0.9289 | 0.8558 | 0.1795 | 0.7421 | 0.7037 | 0.3199 | 0.7755 | 0.7799 | 0.2333 | 0.7876 | 0.7433 | 0.7483 | 0.7992 | 0.6377 | 0.6811 | 0.8211 | 0.8593 |
| 0.3787 | 27.0 | 13500 | 0.3747 | 0.7156 | 0.9236 | 0.836 | 0.2138 | 0.7122 | 0.6978 | 0.3046 | 0.7615 | 0.7641 | 0.2458 | 0.7628 | 0.7364 | 0.7327 | 0.7915 | 0.6032 | 0.6516 | 0.8108 | 0.8491 |
| 0.3851 | 28.0 | 14000 | 0.3824 | 0.7073 | 0.9242 | 0.8583 | 0.1333 | 0.705 | 0.6833 | 0.3029 | 0.759 | 0.7615 | 0.1667 | 0.7677 | 0.7216 | 0.7213 | 0.7718 | 0.5919 | 0.6611 | 0.8087 | 0.8518 |
| 0.3736 | 29.0 | 14500 | 0.3583 | 0.7178 | 0.9333 | 0.847 | 0.207 | 0.7201 | 0.6811 | 0.3026 | 0.7629 | 0.7668 | 0.2653 | 0.7764 | 0.7247 | 0.7326 | 0.781 | 0.6022 | 0.6589 | 0.8188 | 0.8605 |
| 0.3312 | 30.0 | 15000 | 0.3573 | 0.7431 | 0.9485 | 0.8705 | 0.1842 | 0.744 | 0.7323 | 0.3182 | 0.7844 | 0.7882 | 0.2625 | 0.7852 | 0.7799 | 0.7373 | 0.7859 | 0.6822 | 0.7284 | 0.8099 | 0.8503 |
| 0.3459 | 31.0 | 15500 | 0.3423 | 0.743 | 0.9315 | 0.8848 | 0.2253 | 0.7506 | 0.7236 | 0.3147 | 0.7843 | 0.7891 | 0.3 | 0.7969 | 0.7605 | 0.7531 | 0.7992 | 0.6508 | 0.7021 | 0.8252 | 0.866 |
| 0.3561 | 32.0 | 16000 | 0.3421 | 0.7458 | 0.9405 | 0.8724 | 0.3071 | 0.7484 | 0.754 | 0.3188 | 0.789 | 0.792 | 0.3611 | 0.7972 | 0.7942 | 0.7629 | 0.8073 | 0.65 | 0.7032 | 0.8246 | 0.8657 |
| 0.343 | 33.0 | 16500 | 0.3619 | 0.7221 | 0.9209 | 0.8532 | 0.1383 | 0.733 | 0.6821 | 0.3098 | 0.7663 | 0.7699 | 0.1708 | 0.7839 | 0.7292 | 0.7414 | 0.7907 | 0.6074 | 0.6611 | 0.8173 | 0.8578 |
| 0.3509 | 34.0 | 17000 | 0.3502 | 0.7375 | 0.9366 | 0.8556 | 0.3321 | 0.7431 | 0.7114 | 0.3176 | 0.7858 | 0.789 | 0.4125 | 0.7961 | 0.7524 | 0.7448 | 0.7935 | 0.6481 | 0.7116 | 0.8195 | 0.8617 |
| 0.3465 | 35.0 | 17500 | 0.3514 | 0.7277 | 0.9347 | 0.8526 | 0.3686 | 0.7258 | 0.6757 | 0.3112 | 0.7753 | 0.7803 | 0.4764 | 0.7797 | 0.7193 | 0.7312 | 0.7835 | 0.6251 | 0.6905 | 0.8267 | 0.8669 |
| 0.3216 | 36.0 | 18000 | 0.3720 | 0.711 | 0.9348 | 0.8572 | 0.3728 | 0.7127 | 0.6593 | 0.3017 | 0.7599 | 0.7637 | 0.4472 | 0.7671 | 0.7063 | 0.6994 | 0.7577 | 0.6138 | 0.6747 | 0.8196 | 0.8587 |
| 0.3271 | 37.0 | 18500 | 0.3360 | 0.7556 | 0.9497 | 0.8832 | 0.3003 | 0.7617 | 0.7332 | 0.3219 | 0.7981 | 0.8021 | 0.4111 | 0.8041 | 0.7813 | 0.7605 | 0.8048 | 0.6657 | 0.7211 | 0.8407 | 0.8804 |
| 0.3059 | 38.0 | 19000 | 0.3320 | 0.7513 | 0.9522 | 0.885 | 0.1165 | 0.754 | 0.7404 | 0.3253 | 0.7971 | 0.802 | 0.2375 | 0.8024 | 0.7873 | 0.7585 | 0.8036 | 0.665 | 0.7358 | 0.8303 | 0.8666 |
| 0.3269 | 39.0 | 19500 | 0.3424 | 0.7498 | 0.9429 | 0.8806 | 0.2361 | 0.7525 | 0.7336 | 0.3164 | 0.7905 | 0.7955 | 0.3486 | 0.7985 | 0.7767 | 0.7617 | 0.8101 | 0.6445 | 0.6979 | 0.8432 | 0.8786 |
| 0.3353 | 40.0 | 20000 | 0.3500 | 0.7466 | 0.9407 | 0.8782 | 0.198 | 0.7502 | 0.738 | 0.3162 | 0.783 | 0.7886 | 0.2722 | 0.7938 | 0.7756 | 0.7552 | 0.8024 | 0.6561 | 0.6979 | 0.8285 | 0.8654 |
| 0.3552 | 41.0 | 20500 | 0.3479 | 0.7386 | 0.9394 | 0.868 | 0.1597 | 0.7359 | 0.7296 | 0.3138 | 0.7775 | 0.782 | 0.2292 | 0.7817 | 0.7679 | 0.7472 | 0.7911 | 0.6351 | 0.6853 | 0.8335 | 0.8696 |
| 0.3452 | 42.0 | 21000 | 0.3341 | 0.7416 | 0.9381 | 0.8793 | 0.2425 | 0.7508 | 0.7313 | 0.3137 | 0.7809 | 0.7865 | 0.3208 | 0.7995 | 0.7635 | 0.7616 | 0.8089 | 0.6314 | 0.68 | 0.8318 | 0.8705 |
| 0.2925 | 43.0 | 21500 | 0.3201 | 0.7695 | 0.9393 | 0.89 | 0.2844 | 0.7742 | 0.7619 | 0.3271 | 0.8067 | 0.8117 | 0.3764 | 0.8189 | 0.7915 | 0.7787 | 0.8173 | 0.6816 | 0.7305 | 0.8482 | 0.8873 |
| 0.2957 | 44.0 | 22000 | 0.3343 | 0.7589 | 0.9435 | 0.8874 | 0.2823 | 0.7619 | 0.7491 | 0.3242 | 0.7972 | 0.8023 | 0.3667 | 0.8067 | 0.7867 | 0.7535 | 0.7948 | 0.6696 | 0.7242 | 0.8536 | 0.888 |
| 0.3537 | 45.0 | 22500 | 0.3381 | 0.7523 | 0.9424 | 0.8801 | 0.259 | 0.7507 | 0.7506 | 0.3204 | 0.7948 | 0.802 | 0.3681 | 0.8037 | 0.793 | 0.7441 | 0.7919 | 0.6682 | 0.7305 | 0.8445 | 0.8834 |
| 0.3256 | 46.0 | 23000 | 0.3383 | 0.758 | 0.9428 | 0.8742 | 0.259 | 0.7624 | 0.7319 | 0.3179 | 0.7922 | 0.7999 | 0.3681 | 0.7992 | 0.7774 | 0.7697 | 0.8133 | 0.6569 | 0.7042 | 0.8473 | 0.8822 |
| 0.3277 | 47.0 | 23500 | 0.3294 | 0.7538 | 0.9464 | 0.8782 | 0.2466 | 0.7504 | 0.756 | 0.3199 | 0.7964 | 0.8021 | 0.3681 | 0.8034 | 0.789 | 0.7586 | 0.804 | 0.6624 | 0.7221 | 0.8405 | 0.8801 |
| 0.3376 | 48.0 | 24000 | 0.3391 | 0.76 | 0.9453 | 0.8918 | 0.3566 | 0.7539 | 0.7628 | 0.3217 | 0.7966 | 0.8021 | 0.4458 | 0.797 | 0.7951 | 0.7644 | 0.8016 | 0.696 | 0.74 | 0.8196 | 0.8648 |
| 0.3071 | 49.0 | 24500 | 0.3249 | 0.7703 | 0.9464 | 0.8814 | 0.2872 | 0.7669 | 0.7616 | 0.3297 | 0.8057 | 0.8104 | 0.3667 | 0.8085 | 0.7958 | 0.7673 | 0.8052 | 0.7052 | 0.7484 | 0.8383 | 0.8774 |
| 0.2748 | 50.0 | 25000 | 0.3277 | 0.7576 | 0.9317 | 0.8878 | 0.2644 | 0.7638 | 0.7097 | 0.3236 | 0.7912 | 0.7941 | 0.3569 | 0.8047 | 0.7375 | 0.7659 | 0.8073 | 0.6694 | 0.7 | 0.8375 | 0.875 |
| 0.3259 | 51.0 | 25500 | 0.3256 | 0.7676 | 0.9406 | 0.8976 | 0.2403 | 0.7709 | 0.7443 | 0.3246 | 0.7994 | 0.8049 | 0.35 | 0.8119 | 0.7745 | 0.7715 | 0.8097 | 0.6884 | 0.7221 | 0.8429 | 0.8828 |
| 0.3225 | 52.0 | 26000 | 0.3375 | 0.7625 | 0.9454 | 0.895 | 0.2682 | 0.7651 | 0.7517 | 0.3259 | 0.8006 | 0.8041 | 0.3556 | 0.8067 | 0.7863 | 0.7625 | 0.8036 | 0.6921 | 0.7337 | 0.8329 | 0.875 |
| 0.3122 | 53.0 | 26500 | 0.3385 | 0.7573 | 0.9467 | 0.8944 | 0.1974 | 0.7595 | 0.7445 | 0.3167 | 0.7947 | 0.7985 | 0.2833 | 0.8019 | 0.7866 | 0.742 | 0.7903 | 0.6967 | 0.7316 | 0.8332 | 0.8735 |
| 0.3076 | 54.0 | 27000 | 0.3175 | 0.7729 | 0.948 | 0.8753 | 0.2099 | 0.7789 | 0.7603 | 0.3301 | 0.8114 | 0.815 | 0.2681 | 0.8197 | 0.8025 | 0.7678 | 0.8117 | 0.7141 | 0.7579 | 0.8367 | 0.8753 |
| 0.3285 | 55.0 | 27500 | 0.3211 | 0.7787 | 0.9476 | 0.8956 | 0.2747 | 0.7777 | 0.7703 | 0.3297 | 0.8141 | 0.8194 | 0.3528 | 0.8237 | 0.807 | 0.7719 | 0.8125 | 0.7129 | 0.7558 | 0.8513 | 0.8901 |
| 0.2867 | 56.0 | 28000 | 0.3486 | 0.7532 | 0.9335 | 0.8763 | 0.1495 | 0.7551 | 0.7302 | 0.3179 | 0.7896 | 0.7923 | 0.1917 | 0.8003 | 0.7616 | 0.7607 | 0.8069 | 0.6681 | 0.7021 | 0.8307 | 0.8681 |
| 0.3095 | 57.0 | 28500 | 0.3197 | 0.769 | 0.9376 | 0.8841 | 0.2709 | 0.7698 | 0.7671 | 0.3216 | 0.8074 | 0.8109 | 0.3181 | 0.8177 | 0.797 | 0.773 | 0.8137 | 0.6832 | 0.7316 | 0.8508 | 0.8873 |
| 0.3225 | 58.0 | 29000 | 0.3437 | 0.7528 | 0.9403 | 0.8807 | 0.3319 | 0.7523 | 0.7328 | 0.3215 | 0.7954 | 0.7988 | 0.3903 | 0.8017 | 0.7719 | 0.7468 | 0.7879 | 0.6623 | 0.7232 | 0.8494 | 0.8852 |
| 0.3181 | 59.0 | 29500 | 0.3329 | 0.7522 | 0.9305 | 0.8661 | 0.2974 | 0.758 | 0.731 | 0.3163 | 0.7917 | 0.7943 | 0.3472 | 0.8034 | 0.7637 | 0.7607 | 0.8016 | 0.6436 | 0.6947 | 0.8524 | 0.8864 |
| 0.2679 | 60.0 | 30000 | 0.3069 | 0.778 | 0.9422 | 0.8845 | 0.3299 | 0.7781 | 0.7692 | 0.3322 | 0.8149 | 0.8184 | 0.3972 | 0.8224 | 0.8022 | 0.769 | 0.8137 | 0.7032 | 0.7432 | 0.8618 | 0.8982 |
| 0.305 | 61.0 | 30500 | 0.3137 | 0.7731 | 0.9403 | 0.882 | 0.2619 | 0.7676 | 0.7625 | 0.3277 | 0.8071 | 0.8113 | 0.3236 | 0.8079 | 0.7961 | 0.7792 | 0.8169 | 0.6819 | 0.7232 | 0.8582 | 0.8937 |
| 0.305 | 62.0 | 31000 | 0.3221 | 0.771 | 0.9467 | 0.8878 | 0.267 | 0.7718 | 0.7457 | 0.3257 | 0.8045 | 0.8085 | 0.3444 | 0.8113 | 0.787 | 0.7621 | 0.8044 | 0.7016 | 0.7379 | 0.8494 | 0.8831 |
| 0.2713 | 63.0 | 31500 | 0.3199 | 0.7769 | 0.9469 | 0.881 | 0.3265 | 0.7733 | 0.7663 | 0.3245 | 0.8094 | 0.8125 | 0.3903 | 0.811 | 0.7977 | 0.7691 | 0.8085 | 0.7101 | 0.7442 | 0.8516 | 0.8849 |
| 0.3082 | 64.0 | 32000 | 0.3163 | 0.7721 | 0.9386 | 0.8875 | 0.2628 | 0.7778 | 0.7388 | 0.3282 | 0.8093 | 0.8127 | 0.3389 | 0.8206 | 0.7791 | 0.7755 | 0.8198 | 0.6885 | 0.7295 | 0.8522 | 0.8889 |
| 0.2819 | 65.0 | 32500 | 0.3396 | 0.75 | 0.9288 | 0.8694 | 0.3187 | 0.7549 | 0.7158 | 0.3167 | 0.7943 | 0.7972 | 0.3514 | 0.804 | 0.7588 | 0.7547 | 0.804 | 0.6433 | 0.6989 | 0.8522 | 0.8886 |
| 0.2804 | 66.0 | 33000 | 0.3221 | 0.7713 | 0.9278 | 0.8964 | 0.2681 | 0.7661 | 0.7691 | 0.3239 | 0.8092 | 0.8122 | 0.3556 | 0.8122 | 0.8016 | 0.7755 | 0.8157 | 0.6765 | 0.7263 | 0.8618 | 0.8946 |
| 0.2746 | 67.0 | 33500 | 0.3125 | 0.7642 | 0.9335 | 0.8792 | 0.2764 | 0.7643 | 0.7452 | 0.3221 | 0.8037 | 0.8065 | 0.3597 | 0.8097 | 0.7825 | 0.7765 | 0.8133 | 0.6597 | 0.7158 | 0.8565 | 0.8904 |
| 0.2775 | 68.0 | 34000 | 0.3048 | 0.7767 | 0.938 | 0.8975 | 0.3299 | 0.7745 | 0.76 | 0.3252 | 0.8151 | 0.8181 | 0.4278 | 0.8202 | 0.794 | 0.7745 | 0.8145 | 0.6889 | 0.7421 | 0.8668 | 0.8976 |
| 0.2614 | 69.0 | 34500 | 0.3000 | 0.7788 | 0.9376 | 0.8972 | 0.2978 | 0.7803 | 0.7609 | 0.3271 | 0.82 | 0.8245 | 0.3833 | 0.831 | 0.7963 | 0.7777 | 0.8198 | 0.6886 | 0.7505 | 0.8702 | 0.9033 |
| 0.2706 | 70.0 | 35000 | 0.3045 | 0.7747 | 0.9347 | 0.8967 | 0.3231 | 0.7831 | 0.7555 | 0.3286 | 0.8177 | 0.8208 | 0.4125 | 0.8298 | 0.7951 | 0.7808 | 0.8278 | 0.6882 | 0.7421 | 0.8552 | 0.8925 |
| 0.2608 | 71.0 | 35500 | 0.2951 | 0.7739 | 0.9462 | 0.8902 | 0.3108 | 0.7839 | 0.7517 | 0.3274 | 0.8165 | 0.8219 | 0.3972 | 0.8331 | 0.7903 | 0.7779 | 0.8181 | 0.6857 | 0.7474 | 0.858 | 0.9003 |
| 0.3068 | 72.0 | 36000 | 0.3171 | 0.7755 | 0.9397 | 0.8788 | 0.331 | 0.7766 | 0.7614 | 0.3288 | 0.8136 | 0.8172 | 0.3944 | 0.821 | 0.7953 | 0.7786 | 0.8202 | 0.6794 | 0.7295 | 0.8687 | 0.9021 |
| 0.2893 | 73.0 | 36500 | 0.3156 | 0.7803 | 0.9442 | 0.9094 | 0.3084 | 0.7861 | 0.758 | 0.3244 | 0.8181 | 0.8213 | 0.4139 | 0.8281 | 0.7957 | 0.7773 | 0.8234 | 0.7061 | 0.7484 | 0.8575 | 0.8922 |
| 0.2462 | 74.0 | 37000 | 0.3136 | 0.7793 | 0.9348 | 0.8894 | 0.3201 | 0.7831 | 0.7552 | 0.327 | 0.8147 | 0.8183 | 0.4014 | 0.8258 | 0.7875 | 0.769 | 0.8177 | 0.7054 | 0.7411 | 0.8637 | 0.8961 |
| 0.3016 | 75.0 | 37500 | 0.3131 | 0.7762 | 0.9393 | 0.899 | 0.3175 | 0.7801 | 0.7597 | 0.326 | 0.8196 | 0.8226 | 0.4111 | 0.8315 | 0.7891 | 0.7726 | 0.8137 | 0.6937 | 0.7547 | 0.8624 | 0.8994 |
| 0.2714 | 76.0 | 38000 | 0.2986 | 0.7867 | 0.9558 | 0.907 | 0.3107 | 0.7909 | 0.7671 | 0.3287 | 0.8276 | 0.83 | 0.3875 | 0.8391 | 0.7979 | 0.7923 | 0.8274 | 0.7018 | 0.7611 | 0.8659 | 0.9015 |
| 0.2921 | 77.0 | 38500 | 0.2951 | 0.7862 | 0.9398 | 0.8882 | 0.3325 | 0.7915 | 0.7668 | 0.333 | 0.8267 | 0.8307 | 0.4042 | 0.8391 | 0.798 | 0.7939 | 0.827 | 0.6987 | 0.7611 | 0.8661 | 0.9039 |
| 0.272 | 78.0 | 39000 | 0.3125 | 0.7754 | 0.9497 | 0.899 | 0.3393 | 0.776 | 0.7751 | 0.3244 | 0.8177 | 0.8213 | 0.4167 | 0.8252 | 0.8076 | 0.7793 | 0.8238 | 0.6987 | 0.7526 | 0.8481 | 0.8873 |
| 0.2734 | 79.0 | 39500 | 0.3010 | 0.7818 | 0.9404 | 0.8915 | 0.3022 | 0.7875 | 0.7524 | 0.3324 | 0.8224 | 0.8265 | 0.3806 | 0.8342 | 0.7863 | 0.795 | 0.8331 | 0.6945 | 0.7526 | 0.8558 | 0.8937 |
| 0.2392 | 80.0 | 40000 | 0.3056 | 0.78 | 0.9379 | 0.8891 | 0.3211 | 0.7887 | 0.7444 | 0.329 | 0.8228 | 0.8259 | 0.4236 | 0.8389 | 0.7782 | 0.7918 | 0.8363 | 0.6858 | 0.7432 | 0.8624 | 0.8982 |
| 0.2743 | 81.0 | 40500 | 0.3236 | 0.7665 | 0.9343 | 0.8924 | 0.3312 | 0.7643 | 0.7528 | 0.3233 | 0.8057 | 0.8108 | 0.4278 | 0.815 | 0.7842 | 0.7782 | 0.8177 | 0.6603 | 0.7211 | 0.8611 | 0.8937 |
| 0.2462 | 82.0 | 41000 | 0.3055 | 0.7823 | 0.9306 | 0.8842 | 0.2892 | 0.7898 | 0.7693 | 0.3334 | 0.824 | 0.8267 | 0.35 | 0.8346 | 0.8045 | 0.7944 | 0.8323 | 0.6903 | 0.7495 | 0.8622 | 0.8985 |
| 0.2701 | 83.0 | 41500 | 0.3055 | 0.7759 | 0.9382 | 0.8707 | 0.2915 | 0.7864 | 0.742 | 0.3312 | 0.8154 | 0.8192 | 0.3708 | 0.8311 | 0.7816 | 0.7903 | 0.8302 | 0.6759 | 0.7305 | 0.8616 | 0.8967 |
| 0.2732 | 84.0 | 42000 | 0.3205 | 0.7712 | 0.937 | 0.8946 | 0.3145 | 0.7697 | 0.7674 | 0.3249 | 0.813 | 0.817 | 0.4083 | 0.8206 | 0.8 | 0.7749 | 0.8141 | 0.6854 | 0.74 | 0.8532 | 0.897 |
| 0.2426 | 85.0 | 42500 | 0.3125 | 0.7815 | 0.9313 | 0.8845 | 0.2965 | 0.7843 | 0.7677 | 0.332 | 0.8209 | 0.8234 | 0.3264 | 0.8293 | 0.8059 | 0.8005 | 0.8335 | 0.6896 | 0.7463 | 0.8545 | 0.8904 |
| 0.2083 | 86.0 | 43000 | 0.3144 | 0.7763 | 0.9391 | 0.8883 | 0.3368 | 0.7814 | 0.7626 | 0.3282 | 0.8165 | 0.8198 | 0.3847 | 0.8258 | 0.8027 | 0.7935 | 0.825 | 0.68 | 0.7421 | 0.8554 | 0.8922 |
| 0.2444 | 87.0 | 43500 | 0.3039 | 0.7737 | 0.9325 | 0.8839 | 0.3897 | 0.7787 | 0.7612 | 0.3241 | 0.8169 | 0.8202 | 0.4458 | 0.8264 | 0.7986 | 0.7967 | 0.8363 | 0.6664 | 0.7274 | 0.8579 | 0.897 |
| 0.2458 | 88.0 | 44000 | 0.2948 | 0.7936 | 0.935 | 0.8879 | 0.402 | 0.7963 | 0.7829 | 0.3323 | 0.8357 | 0.8395 | 0.4625 | 0.844 | 0.8208 | 0.8079 | 0.8448 | 0.7067 | 0.7705 | 0.8662 | 0.9033 |
| 0.23 | 89.0 | 44500 | 0.3180 | 0.7772 | 0.9216 | 0.886 | 0.3416 | 0.7844 | 0.749 | 0.3291 | 0.818 | 0.8212 | 0.4139 | 0.8324 | 0.7813 | 0.8016 | 0.8403 | 0.6802 | 0.7347 | 0.8499 | 0.8886 |
| 0.2379 | 90.0 | 45000 | 0.2937 | 0.7886 | 0.9319 | 0.8868 | 0.3813 | 0.7959 | 0.7689 | 0.3299 | 0.8299 | 0.8338 | 0.4514 | 0.8442 | 0.8019 | 0.8033 | 0.8423 | 0.6995 | 0.7558 | 0.8632 | 0.9033 |
| 0.2498 | 91.0 | 45500 | 0.3045 | 0.7878 | 0.9365 | 0.8871 | 0.3616 | 0.7879 | 0.786 | 0.3305 | 0.8271 | 0.83 | 0.4514 | 0.8337 | 0.8165 | 0.7968 | 0.8339 | 0.7 | 0.7547 | 0.8666 | 0.9015 |
| 0.2212 | 92.0 | 46000 | 0.3132 | 0.7854 | 0.9304 | 0.8915 | 0.3613 | 0.7886 | 0.7848 | 0.3316 | 0.8225 | 0.8277 | 0.4306 | 0.8335 | 0.8174 | 0.7909 | 0.8286 | 0.7008 | 0.7505 | 0.8646 | 0.9039 |
| 0.245 | 93.0 | 46500 | 0.3060 | 0.7807 | 0.929 | 0.8913 | 0.3292 | 0.7832 | 0.7865 | 0.3314 | 0.8244 | 0.8298 | 0.3917 | 0.8352 | 0.8236 | 0.7948 | 0.8359 | 0.6856 | 0.7526 | 0.8616 | 0.9009 |
| 0.2444 | 94.0 | 47000 | 0.2977 | 0.7888 | 0.949 | 0.895 | 0.3567 | 0.7839 | 0.7979 | 0.3315 | 0.8297 | 0.8338 | 0.4514 | 0.8331 | 0.8283 | 0.8041 | 0.8448 | 0.6994 | 0.7568 | 0.8629 | 0.8997 |
| 0.2172 | 95.0 | 47500 | 0.2981 | 0.7888 | 0.9394 | 0.8949 | 0.3713 | 0.7915 | 0.782 | 0.3313 | 0.8283 | 0.8316 | 0.4458 | 0.8361 | 0.8163 | 0.8056 | 0.8395 | 0.6946 | 0.7537 | 0.8661 | 0.9015 |
| 0.2419 | 96.0 | 48000 | 0.3086 | 0.7745 | 0.9293 | 0.8806 | 0.3155 | 0.7808 | 0.7605 | 0.3229 | 0.8155 | 0.8188 | 0.3722 | 0.8276 | 0.7955 | 0.7945 | 0.8323 | 0.6663 | 0.7242 | 0.8627 | 0.9 |
| 0.2554 | 97.0 | 48500 | 0.3050 | 0.7782 | 0.9331 | 0.8822 | 0.3421 | 0.7779 | 0.7775 | 0.3288 | 0.8197 | 0.8235 | 0.4181 | 0.8251 | 0.8137 | 0.7919 | 0.8315 | 0.6827 | 0.7411 | 0.8599 | 0.8979 |
| 0.2263 | 98.0 | 49000 | 0.3050 | 0.7836 | 0.9292 | 0.8862 | 0.314 | 0.7907 | 0.7662 | 0.3294 | 0.8226 | 0.826 | 0.3764 | 0.8367 | 0.7984 | 0.8045 | 0.8391 | 0.6868 | 0.7421 | 0.8595 | 0.8967 |
| 0.2502 | 99.0 | 49500 | 0.2972 | 0.7874 | 0.9344 | 0.8899 | 0.3418 | 0.792 | 0.7671 | 0.3294 | 0.8293 | 0.8328 | 0.4139 | 0.8413 | 0.8038 | 0.805 | 0.8464 | 0.6926 | 0.7505 | 0.8645 | 0.9015 |
| 0.2306 | 100.0 | 50000 | 0.2951 | 0.7924 | 0.9376 | 0.8906 | 0.3658 | 0.7985 | 0.7796 | 0.3368 | 0.8322 | 0.8361 | 0.4347 | 0.8433 | 0.8165 | 0.7984 | 0.8371 | 0.7094 | 0.7653 | 0.8696 | 0.906 |
| 0.2115 | 101.0 | 50500 | 0.2959 | 0.7877 | 0.9317 | 0.8894 | 0.3145 | 0.7848 | 0.7801 | 0.3342 | 0.8289 | 0.832 | 0.3972 | 0.833 | 0.8196 | 0.8021 | 0.8444 | 0.7011 | 0.7547 | 0.8599 | 0.897 |
| 0.2248 | 102.0 | 51000 | 0.2967 | 0.7945 | 0.9287 | 0.8836 | 0.3876 | 0.795 | 0.7696 | 0.3323 | 0.8325 | 0.8366 | 0.4556 | 0.8411 | 0.8036 | 0.8102 | 0.8476 | 0.6998 | 0.7589 | 0.8736 | 0.9033 |
| 0.2231 | 103.0 | 51500 | 0.2989 | 0.7901 | 0.9281 | 0.8835 | 0.3368 | 0.795 | 0.7668 | 0.3338 | 0.8311 | 0.8347 | 0.4264 | 0.8419 | 0.8029 | 0.803 | 0.8452 | 0.701 | 0.7568 | 0.8661 | 0.9021 |
| 0.2492 | 104.0 | 52000 | 0.2985 | 0.7914 | 0.9332 | 0.8919 | 0.4092 | 0.7902 | 0.7852 | 0.3331 | 0.8321 | 0.8351 | 0.475 | 0.8384 | 0.8188 | 0.7936 | 0.8343 | 0.709 | 0.7642 | 0.8715 | 0.9069 |
| 0.2405 | 105.0 | 52500 | 0.2987 | 0.7933 | 0.9324 | 0.9009 | 0.3649 | 0.7912 | 0.7877 | 0.3347 | 0.8319 | 0.8352 | 0.4431 | 0.8387 | 0.8209 | 0.8013 | 0.8427 | 0.7056 | 0.7589 | 0.873 | 0.9039 |
| 0.2129 | 106.0 | 53000 | 0.2984 | 0.7901 | 0.9333 | 0.8934 | 0.3716 | 0.7893 | 0.7883 | 0.3315 | 0.8289 | 0.832 | 0.4333 | 0.8354 | 0.8173 | 0.7985 | 0.8399 | 0.6914 | 0.7463 | 0.8805 | 0.9096 |
| 0.2065 | 107.0 | 53500 | 0.3135 | 0.7758 | 0.9245 | 0.8814 | 0.3797 | 0.7833 | 0.7469 | 0.325 | 0.817 | 0.8207 | 0.4389 | 0.8315 | 0.7781 | 0.7924 | 0.8383 | 0.6585 | 0.7168 | 0.8765 | 0.9069 |
| 0.2192 | 108.0 | 54000 | 0.3019 | 0.79 | 0.9313 | 0.8843 | 0.3845 | 0.7909 | 0.7841 | 0.333 | 0.829 | 0.8323 | 0.4611 | 0.8359 | 0.8172 | 0.8027 | 0.8435 | 0.6992 | 0.7505 | 0.8682 | 0.9027 |
| 0.2098 | 109.0 | 54500 | 0.3017 | 0.7886 | 0.9396 | 0.8942 | 0.3535 | 0.7897 | 0.7863 | 0.3332 | 0.8284 | 0.8315 | 0.3861 | 0.8367 | 0.8174 | 0.7992 | 0.8407 | 0.6953 | 0.7484 | 0.8714 | 0.9054 |
| 0.2164 | 110.0 | 55000 | 0.3019 | 0.7876 | 0.9366 | 0.8931 | 0.3397 | 0.7882 | 0.7755 | 0.3314 | 0.8281 | 0.8306 | 0.3931 | 0.8356 | 0.8115 | 0.802 | 0.8415 | 0.6901 | 0.7474 | 0.8709 | 0.903 |
| 0.2199 | 111.0 | 55500 | 0.3067 | 0.7796 | 0.9388 | 0.8826 | 0.3126 | 0.7892 | 0.7537 | 0.3303 | 0.8211 | 0.8242 | 0.3556 | 0.8358 | 0.7925 | 0.7938 | 0.8335 | 0.6785 | 0.7368 | 0.8666 | 0.9024 |
| 0.2126 | 112.0 | 56000 | 0.3036 | 0.7866 | 0.9339 | 0.888 | 0.3294 | 0.7874 | 0.7771 | 0.3326 | 0.8282 | 0.8315 | 0.3764 | 0.8381 | 0.8112 | 0.8016 | 0.8427 | 0.688 | 0.7505 | 0.8703 | 0.9012 |
| 0.2205 | 113.0 | 56500 | 0.2932 | 0.7929 | 0.9354 | 0.8873 | 0.3473 | 0.7988 | 0.7795 | 0.3348 | 0.8357 | 0.8392 | 0.4028 | 0.8482 | 0.8148 | 0.8108 | 0.8512 | 0.7006 | 0.7621 | 0.8671 | 0.9042 |
| 0.2556 | 114.0 | 57000 | 0.2975 | 0.7867 | 0.9279 | 0.885 | 0.3682 | 0.7938 | 0.7682 | 0.3325 | 0.829 | 0.8325 | 0.4361 | 0.8425 | 0.8044 | 0.797 | 0.8379 | 0.6877 | 0.7495 | 0.8754 | 0.9102 |
| 0.2086 | 115.0 | 57500 | 0.2986 | 0.7901 | 0.928 | 0.8923 | 0.3674 | 0.7945 | 0.7731 | 0.334 | 0.8303 | 0.8337 | 0.4292 | 0.8421 | 0.8063 | 0.8107 | 0.8476 | 0.6846 | 0.7463 | 0.8751 | 0.9072 |
| 0.2156 | 116.0 | 58000 | 0.3016 | 0.7879 | 0.9372 | 0.8859 | 0.3365 | 0.7909 | 0.7719 | 0.3339 | 0.8269 | 0.8304 | 0.3861 | 0.8384 | 0.8033 | 0.8026 | 0.8415 | 0.6941 | 0.7484 | 0.8669 | 0.9012 |
| 0.2057 | 117.0 | 58500 | 0.2933 | 0.79 | 0.939 | 0.889 | 0.3345 | 0.7974 | 0.7713 | 0.3348 | 0.8302 | 0.8338 | 0.3847 | 0.8445 | 0.8073 | 0.8106 | 0.8496 | 0.6892 | 0.7463 | 0.8703 | 0.9054 |
| 0.2387 | 118.0 | 59000 | 0.2904 | 0.7932 | 0.9364 | 0.8941 | 0.3489 | 0.7975 | 0.7813 | 0.3346 | 0.8343 | 0.8377 | 0.3986 | 0.8453 | 0.8186 | 0.8044 | 0.8452 | 0.7021 | 0.7611 | 0.873 | 0.9069 |
| 0.2228 | 119.0 | 59500 | 0.2963 | 0.7979 | 0.9408 | 0.8937 | 0.3477 | 0.8009 | 0.7885 | 0.336 | 0.8379 | 0.8413 | 0.4181 | 0.8474 | 0.8204 | 0.8061 | 0.8464 | 0.7098 | 0.7695 | 0.8778 | 0.9081 |
| 0.2244 | 120.0 | 60000 | 0.2944 | 0.7962 | 0.9424 | 0.8941 | 0.3427 | 0.8048 | 0.7765 | 0.3361 | 0.8368 | 0.8401 | 0.4042 | 0.8504 | 0.8138 | 0.8072 | 0.8472 | 0.7074 | 0.7663 | 0.874 | 0.9069 |
| 0.2189 | 121.0 | 60500 | 0.3028 | 0.7941 | 0.9442 | 0.8937 | 0.3097 | 0.7981 | 0.7837 | 0.3351 | 0.8334 | 0.8357 | 0.3583 | 0.842 | 0.8184 | 0.8026 | 0.8391 | 0.7102 | 0.7653 | 0.8694 | 0.9027 |
| 0.2335 | 122.0 | 61000 | 0.2959 | 0.7994 | 0.9388 | 0.8941 | 0.3443 | 0.8052 | 0.7745 | 0.336 | 0.836 | 0.8386 | 0.4014 | 0.8484 | 0.8067 | 0.8144 | 0.8496 | 0.7065 | 0.7589 | 0.8772 | 0.9072 |
| 0.2225 | 123.0 | 61500 | 0.2948 | 0.801 | 0.9353 | 0.8922 | 0.3368 | 0.8062 | 0.7927 | 0.3361 | 0.8412 | 0.844 | 0.4125 | 0.8511 | 0.8243 | 0.8166 | 0.8548 | 0.7084 | 0.7695 | 0.878 | 0.9078 |
| 0.2079 | 124.0 | 62000 | 0.2982 | 0.7952 | 0.9374 | 0.8913 | 0.3483 | 0.7968 | 0.7839 | 0.3334 | 0.8346 | 0.8381 | 0.4069 | 0.8445 | 0.8179 | 0.8062 | 0.8444 | 0.7068 | 0.7642 | 0.8726 | 0.9057 |
| 0.2281 | 125.0 | 62500 | 0.2878 | 0.7974 | 0.9473 | 0.8861 | 0.3636 | 0.8009 | 0.779 | 0.3354 | 0.837 | 0.8399 | 0.4375 | 0.8465 | 0.8144 | 0.8122 | 0.8512 | 0.7032 | 0.7611 | 0.8768 | 0.9075 |
| 0.2678 | 126.0 | 63000 | 0.2902 | 0.8014 | 0.9416 | 0.8971 | 0.3517 | 0.803 | 0.7885 | 0.3364 | 0.8406 | 0.8436 | 0.4264 | 0.8487 | 0.8251 | 0.8163 | 0.8536 | 0.7103 | 0.7674 | 0.8775 | 0.9099 |
| 0.2222 | 127.0 | 63500 | 0.2872 | 0.8017 | 0.9427 | 0.8965 | 0.3541 | 0.8036 | 0.7891 | 0.3369 | 0.841 | 0.8448 | 0.4458 | 0.8506 | 0.8244 | 0.819 | 0.8573 | 0.7072 | 0.7663 | 0.8787 | 0.9108 |
| 0.198 | 128.0 | 64000 | 0.2885 | 0.8013 | 0.9434 | 0.8973 | 0.3562 | 0.8018 | 0.7918 | 0.3358 | 0.8399 | 0.8429 | 0.4431 | 0.8472 | 0.8255 | 0.8116 | 0.852 | 0.7122 | 0.7663 | 0.8802 | 0.9102 |
| 0.2413 | 129.0 | 64500 | 0.2911 | 0.7961 | 0.9434 | 0.8966 | 0.3467 | 0.7977 | 0.7842 | 0.3347 | 0.8346 | 0.8377 | 0.4222 | 0.8429 | 0.8176 | 0.8089 | 0.8484 | 0.701 | 0.7568 | 0.8784 | 0.9078 |
| 0.2204 | 130.0 | 65000 | 0.2967 | 0.7986 | 0.9434 | 0.8959 | 0.3681 | 0.7999 | 0.7914 | 0.336 | 0.8358 | 0.8384 | 0.4292 | 0.8431 | 0.8218 | 0.8078 | 0.8444 | 0.7078 | 0.7611 | 0.8803 | 0.9099 |
| 0.2138 | 131.0 | 65500 | 0.2912 | 0.8009 | 0.9425 | 0.895 | 0.3628 | 0.8057 | 0.7866 | 0.3369 | 0.8388 | 0.8423 | 0.4431 | 0.8478 | 0.8217 | 0.8153 | 0.8528 | 0.7092 | 0.7642 | 0.8783 | 0.9099 |
| 0.2085 | 132.0 | 66000 | 0.2931 | 0.8015 | 0.9423 | 0.8942 | 0.3711 | 0.8032 | 0.789 | 0.3361 | 0.84 | 0.8436 | 0.4472 | 0.8476 | 0.8264 | 0.8154 | 0.8528 | 0.7103 | 0.7674 | 0.8787 | 0.9105 |
| 0.2352 | 133.0 | 66500 | 0.2920 | 0.7987 | 0.945 | 0.8909 | 0.3721 | 0.8023 | 0.7857 | 0.3363 | 0.8394 | 0.8426 | 0.4597 | 0.8475 | 0.8211 | 0.8133 | 0.8524 | 0.7118 | 0.7695 | 0.8711 | 0.906 |
| 0.2071 | 134.0 | 67000 | 0.2910 | 0.801 | 0.9449 | 0.8942 | 0.3732 | 0.8044 | 0.7868 | 0.3384 | 0.8403 | 0.8436 | 0.4417 | 0.8487 | 0.8247 | 0.8146 | 0.8536 | 0.7133 | 0.7695 | 0.8751 | 0.9078 |
| 0.2433 | 135.0 | 67500 | 0.2917 | 0.7991 | 0.9453 | 0.8886 | 0.3688 | 0.8023 | 0.788 | 0.3368 | 0.8389 | 0.8422 | 0.4333 | 0.8466 | 0.8233 | 0.8104 | 0.8492 | 0.7133 | 0.7695 | 0.8737 | 0.9078 |
| 0.2176 | 136.0 | 68000 | 0.2899 | 0.8004 | 0.9452 | 0.8945 | 0.3714 | 0.8054 | 0.7834 | 0.3362 | 0.8405 | 0.8438 | 0.4458 | 0.8501 | 0.8204 | 0.8137 | 0.8528 | 0.7111 | 0.7695 | 0.8764 | 0.909 |
| 0.2559 | 137.0 | 68500 | 0.2919 | 0.8002 | 0.9451 | 0.8912 | 0.3678 | 0.805 | 0.7863 | 0.3364 | 0.8409 | 0.8443 | 0.4458 | 0.8497 | 0.8224 | 0.8101 | 0.8504 | 0.7144 | 0.7737 | 0.876 | 0.9087 |
| 0.2206 | 138.0 | 69000 | 0.2931 | 0.7988 | 0.9451 | 0.8911 | 0.3735 | 0.8016 | 0.7888 | 0.3355 | 0.8386 | 0.8419 | 0.45 | 0.8461 | 0.8238 | 0.8101 | 0.8488 | 0.7129 | 0.7695 | 0.8733 | 0.9075 |
| 0.2218 | 139.0 | 69500 | 0.2930 | 0.8006 | 0.9452 | 0.8913 | 0.3737 | 0.8035 | 0.7918 | 0.3366 | 0.8403 | 0.8438 | 0.4542 | 0.848 | 0.8258 | 0.8143 | 0.8516 | 0.7129 | 0.7716 | 0.8747 | 0.9081 |
| 0.1946 | 140.0 | 70000 | 0.2917 | 0.8002 | 0.9452 | 0.8913 | 0.3739 | 0.8032 | 0.7826 | 0.3361 | 0.8399 | 0.8432 | 0.4625 | 0.8478 | 0.8208 | 0.8121 | 0.8512 | 0.7124 | 0.7695 | 0.876 | 0.909 |
| 0.2353 | 141.0 | 70500 | 0.2912 | 0.8014 | 0.9453 | 0.8914 | 0.3729 | 0.8041 | 0.7901 | 0.3368 | 0.8405 | 0.8439 | 0.4542 | 0.8481 | 0.8255 | 0.8122 | 0.8504 | 0.715 | 0.7716 | 0.8769 | 0.9096 |
| 0.241 | 142.0 | 71000 | 0.2935 | 0.8009 | 0.9453 | 0.8915 | 0.3737 | 0.8037 | 0.7878 | 0.3372 | 0.8401 | 0.8433 | 0.4625 | 0.8476 | 0.8226 | 0.8105 | 0.8496 | 0.7154 | 0.7716 | 0.8767 | 0.9087 |
| 0.1939 | 143.0 | 71500 | 0.2937 | 0.8012 | 0.9454 | 0.8915 | 0.3745 | 0.8046 | 0.7863 | 0.3372 | 0.8404 | 0.8437 | 0.4542 | 0.8492 | 0.8202 | 0.8104 | 0.8492 | 0.7155 | 0.7726 | 0.8776 | 0.9093 |
| 0.202 | 144.0 | 72000 | 0.2949 | 0.8008 | 0.9453 | 0.8912 | 0.3684 | 0.8038 | 0.788 | 0.3375 | 0.84 | 0.8433 | 0.45 | 0.8481 | 0.8236 | 0.8113 | 0.8504 | 0.7139 | 0.7705 | 0.8771 | 0.909 |
| 0.2009 | 145.0 | 72500 | 0.2939 | 0.801 | 0.9453 | 0.8912 | 0.3744 | 0.8041 | 0.7892 | 0.3378 | 0.8409 | 0.8442 | 0.45 | 0.8488 | 0.8251 | 0.812 | 0.8516 | 0.7149 | 0.7726 | 0.8761 | 0.9084 |
| 0.2481 | 146.0 | 73000 | 0.2938 | 0.801 | 0.9452 | 0.8912 | 0.3684 | 0.804 | 0.7894 | 0.3378 | 0.8406 | 0.844 | 0.45 | 0.8483 | 0.8252 | 0.8118 | 0.8516 | 0.7149 | 0.7716 | 0.8763 | 0.9087 |
| 0.2236 | 147.0 | 73500 | 0.2924 | 0.801 | 0.9453 | 0.8914 | 0.3686 | 0.8039 | 0.7895 | 0.3378 | 0.8406 | 0.844 | 0.4542 | 0.848 | 0.8254 | 0.812 | 0.8524 | 0.7144 | 0.7705 | 0.8766 | 0.909 |
| 0.2002 | 148.0 | 74000 | 0.2924 | 0.8009 | 0.9452 | 0.8912 | 0.3686 | 0.8037 | 0.7895 | 0.3378 | 0.8405 | 0.8439 | 0.4542 | 0.8478 | 0.8254 | 0.8118 | 0.852 | 0.7142 | 0.7705 | 0.8766 | 0.909 |
| 0.236 | 149.0 | 74500 | 0.2924 | 0.8009 | 0.9452 | 0.8912 | 0.3686 | 0.8037 | 0.7895 | 0.3378 | 0.8405 | 0.8439 | 0.4542 | 0.8478 | 0.8254 | 0.8118 | 0.852 | 0.7142 | 0.7705 | 0.8766 | 0.909 |
| 0.2197 | 150.0 | 75000 | 0.2924 | 0.8009 | 0.9452 | 0.8912 | 0.3686 | 0.8037 | 0.7895 | 0.3378 | 0.8405 | 0.8439 | 0.4542 | 0.8478 | 0.8254 | 0.8118 | 0.852 | 0.7142 | 0.7705 | 0.8766 | 0.909 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.3
| [
"chicken",
"duck",
"plant"
] |
joe611/chickens-composite-101818181818-150-epochs-w-transform-metrics-test-shfld |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# chickens-composite-101818181818-150-epochs-w-transform-metrics-test-shfld
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2959
- Map: 0.7997
- Map 50: 0.9529
- Map 75: 0.9125
- Map Small: 0.2883
- Map Medium: 0.7977
- Map Large: 0.8126
- Mar 1: 0.3344
- Mar 10: 0.8339
- Mar 100: 0.8362
- Mar Small: 0.3111
- Mar Medium: 0.8343
- Mar Large: 0.8457
- Map Chicken: 0.7961
- Mar 100 Chicken: 0.8395
- Map Duck: 0.7392
- Mar 100 Duck: 0.7758
- Map Plant: 0.8637
- Mar 100 Plant: 0.8934
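
The COCO-style detection metrics reported above (overall mAP/mAR, the IoU-threshold variants, the size-stratified variants, and the per-class values) can be reproduced with a standard evaluator such as `torchmetrics`. The sketch below uses made-up boxes, scores, and labels purely for illustration; it is not the actual evaluation pipeline.

```python
# Minimal sketch of how COCO-style mAP/mAR values like those above are computed.
# The boxes, scores, and labels here are illustrative placeholders, not real predictions.
import torch
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [
    {
        "boxes": torch.tensor([[10.0, 10.0, 50.0, 60.0]]),
        "scores": torch.tensor([0.92]),
        "labels": torch.tensor([0]),  # e.g. 0 = chicken, 1 = duck, 2 = plant
    }
]
targets = [
    {
        "boxes": torch.tensor([[12.0, 8.0, 52.0, 58.0]]),
        "labels": torch.tensor([0]),
    }
]

metric.update(preds, targets)
results = metric.compute()
# Keys include map, map_50, map_75, map_small/medium/large,
# mar_1, mar_10, mar_100, and per-class values when class_metrics=True.
print(results["map"], results["map_50"], results["mar_100"])
```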
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- num_epochs: 150
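
For reference, these settings correspond roughly to the 🤗 Transformers `TrainingArguments` sketch below; the output directory is a placeholder and this is not the authors' exact training script.

```python
# Sketch of a TrainingArguments configuration matching the hyperparameters above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-chickens-composite",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=150,
)
```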
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:|
| 1.5593 | 1.0 | 500 | 1.4254 | 0.0618 | 0.0998 | 0.068 | 0.0482 | 0.038 | 0.0714 | 0.0719 | 0.2004 | 0.3931 | 0.1167 | 0.3646 | 0.3877 | 0.0728 | 0.4278 | 0.0 | 0.0 | 0.1126 | 0.7515 |
| 1.1781 | 2.0 | 1000 | 1.2702 | 0.1171 | 0.178 | 0.1301 | 0.0034 | 0.0749 | 0.1283 | 0.1021 | 0.2999 | 0.4572 | 0.0583 | 0.4223 | 0.4692 | 0.1209 | 0.6089 | 0.0 | 0.0 | 0.2303 | 0.7627 |
| 1.2828 | 3.0 | 1500 | 1.1369 | 0.1986 | 0.2993 | 0.2276 | 0.012 | 0.1722 | 0.2143 | 0.1131 | 0.3631 | 0.4357 | 0.125 | 0.4166 | 0.4223 | 0.2177 | 0.5508 | 0.0 | 0.0 | 0.3781 | 0.7563 |
| 1.211 | 4.0 | 2000 | 1.0852 | 0.2357 | 0.3552 | 0.2708 | 0.0092 | 0.1875 | 0.259 | 0.1155 | 0.3965 | 0.4606 | 0.1125 | 0.4361 | 0.4658 | 0.2359 | 0.6181 | 0.0 | 0.0 | 0.4712 | 0.7636 |
| 0.8201 | 5.0 | 2500 | 0.9602 | 0.3058 | 0.4592 | 0.3621 | 0.0188 | 0.2541 | 0.3171 | 0.1184 | 0.425 | 0.4378 | 0.15 | 0.4168 | 0.4355 | 0.3077 | 0.581 | 0.0 | 0.0 | 0.6097 | 0.7322 |
| 1.0216 | 6.0 | 3000 | 1.0500 | 0.3162 | 0.4586 | 0.3713 | 0.0739 | 0.2797 | 0.3196 | 0.1193 | 0.4178 | 0.4205 | 0.0958 | 0.4006 | 0.4111 | 0.328 | 0.5669 | 0.0 | 0.0 | 0.6206 | 0.6946 |
| 0.9282 | 7.0 | 3500 | 0.8361 | 0.3673 | 0.519 | 0.4224 | 0.0731 | 0.3377 | 0.3698 | 0.1315 | 0.4652 | 0.4725 | 0.1208 | 0.4519 | 0.4738 | 0.4035 | 0.6496 | 0.0 | 0.0 | 0.6984 | 0.7678 |
| 0.744 | 8.0 | 4000 | 0.8710 | 0.3596 | 0.5289 | 0.4163 | 0.0228 | 0.3272 | 0.3628 | 0.1261 | 0.4519 | 0.4547 | 0.0542 | 0.4271 | 0.461 | 0.4232 | 0.6379 | 0.0 | 0.0 | 0.6555 | 0.7262 |
| 0.8694 | 9.0 | 4500 | 0.7759 | 0.3833 | 0.5524 | 0.4452 | 0.0822 | 0.3334 | 0.3959 | 0.1329 | 0.4717 | 0.4771 | 0.1125 | 0.4377 | 0.4887 | 0.4549 | 0.6657 | 0.0 | 0.0 | 0.695 | 0.7657 |
| 0.7675 | 10.0 | 5000 | 0.7369 | 0.3868 | 0.549 | 0.4514 | 0.0968 | 0.3583 | 0.3948 | 0.1302 | 0.4801 | 0.4845 | 0.125 | 0.4532 | 0.5004 | 0.445 | 0.6766 | 0.0 | 0.0 | 0.7153 | 0.7768 |
| 1.0047 | 11.0 | 5500 | 0.7570 | 0.3987 | 0.5494 | 0.4731 | 0.0828 | 0.3708 | 0.4196 | 0.1387 | 0.4818 | 0.4843 | 0.1333 | 0.4521 | 0.5098 | 0.4916 | 0.6911 | 0.0 | 0.0 | 0.7044 | 0.7617 |
| 0.7842 | 12.0 | 6000 | 0.7426 | 0.4039 | 0.568 | 0.4646 | 0.096 | 0.3755 | 0.4117 | 0.1379 | 0.481 | 0.4837 | 0.1583 | 0.451 | 0.5009 | 0.4936 | 0.6823 | 0.0 | 0.0 | 0.7181 | 0.769 |
| 0.7374 | 13.0 | 6500 | 0.6384 | 0.4272 | 0.5718 | 0.5012 | 0.1095 | 0.3974 | 0.4315 | 0.1428 | 0.4996 | 0.5024 | 0.1667 | 0.4709 | 0.5152 | 0.5466 | 0.7157 | 0.0 | 0.0 | 0.735 | 0.7916 |
| 0.8151 | 14.0 | 7000 | 0.6512 | 0.4108 | 0.5644 | 0.4814 | 0.1311 | 0.3736 | 0.4232 | 0.1359 | 0.4899 | 0.4929 | 0.1583 | 0.4609 | 0.5073 | 0.4984 | 0.6927 | 0.0 | 0.0 | 0.7341 | 0.7858 |
| 0.7233 | 15.0 | 7500 | 0.7068 | 0.4002 | 0.5618 | 0.4729 | 0.0743 | 0.3623 | 0.4249 | 0.1352 | 0.4785 | 0.4812 | 0.15 | 0.4459 | 0.5021 | 0.4871 | 0.6786 | 0.0 | 0.0 | 0.7135 | 0.7651 |
| 0.6345 | 16.0 | 8000 | 0.6210 | 0.424 | 0.5825 | 0.4956 | 0.1345 | 0.394 | 0.4389 | 0.1399 | 0.4959 | 0.5018 | 0.1875 | 0.4711 | 0.5256 | 0.5273 | 0.7133 | 0.0 | 0.0 | 0.7446 | 0.7922 |
| 0.5406 | 17.0 | 8500 | 0.5952 | 0.4341 | 0.5836 | 0.5014 | 0.2377 | 0.4078 | 0.4426 | 0.1419 | 0.5052 | 0.5084 | 0.2625 | 0.4795 | 0.5231 | 0.5476 | 0.7169 | 0.0 | 0.0 | 0.7548 | 0.8081 |
| 0.6986 | 18.0 | 9000 | 0.5646 | 0.4465 | 0.5893 | 0.5235 | 0.1327 | 0.4176 | 0.4565 | 0.1399 | 0.5088 | 0.5154 | 0.1625 | 0.4785 | 0.5353 | 0.576 | 0.7343 | 0.0 | 0.0 | 0.7635 | 0.812 |
| 0.6939 | 19.0 | 9500 | 0.6293 | 0.4146 | 0.5821 | 0.4897 | 0.1459 | 0.3885 | 0.4238 | 0.1343 | 0.4816 | 0.4848 | 0.1667 | 0.4535 | 0.5074 | 0.5217 | 0.6875 | 0.0 | 0.0 | 0.7222 | 0.7669 |
| 0.6532 | 20.0 | 10000 | 0.5819 | 0.4345 | 0.5819 | 0.4954 | 0.1127 | 0.4072 | 0.4364 | 0.1435 | 0.5116 | 0.5158 | 0.1958 | 0.483 | 0.5299 | 0.5501 | 0.7415 | 0.0 | 0.0 | 0.7535 | 0.8057 |
| 0.6648 | 21.0 | 10500 | 0.5717 | 0.4251 | 0.5849 | 0.5027 | 0.1448 | 0.3897 | 0.4468 | 0.1341 | 0.4993 | 0.5041 | 0.2125 | 0.4721 | 0.5222 | 0.5444 | 0.7323 | 0.0 | 0.0 | 0.7308 | 0.7801 |
| 0.7616 | 22.0 | 11000 | 0.5705 | 0.4331 | 0.5904 | 0.5126 | 0.1392 | 0.4043 | 0.4415 | 0.1365 | 0.5034 | 0.508 | 0.1833 | 0.4804 | 0.5178 | 0.5476 | 0.7234 | 0.0 | 0.0 | 0.7518 | 0.8006 |
| 0.6133 | 23.0 | 11500 | 0.5598 | 0.4461 | 0.5853 | 0.5289 | 0.0873 | 0.4143 | 0.4588 | 0.14 | 0.5158 | 0.5209 | 0.1375 | 0.4886 | 0.5394 | 0.5671 | 0.7452 | 0.0 | 0.0 | 0.7712 | 0.8175 |
| 0.7888 | 24.0 | 12000 | 0.5552 | 0.4503 | 0.5882 | 0.5292 | 0.158 | 0.4155 | 0.4656 | 0.1444 | 0.5185 | 0.5237 | 0.2167 | 0.4911 | 0.5364 | 0.5769 | 0.7609 | 0.0 | 0.0 | 0.7741 | 0.8102 |
| 0.4915 | 25.0 | 12500 | 0.5522 | 0.4361 | 0.5917 | 0.5251 | 0.0734 | 0.4055 | 0.4591 | 0.1379 | 0.5031 | 0.5094 | 0.2375 | 0.4745 | 0.5255 | 0.542 | 0.7153 | 0.0 | 0.0 | 0.7661 | 0.813 |
| 0.5545 | 26.0 | 13000 | 0.5461 | 0.4586 | 0.6022 | 0.5503 | 0.1329 | 0.4316 | 0.4647 | 0.1507 | 0.5153 | 0.5193 | 0.2292 | 0.4906 | 0.5279 | 0.6059 | 0.7492 | 0.0 | 0.0 | 0.7699 | 0.8087 |
| 0.6167 | 27.0 | 13500 | 0.5517 | 0.4488 | 0.5941 | 0.5451 | 0.1318 | 0.4116 | 0.4805 | 0.1417 | 0.5105 | 0.5152 | 0.2125 | 0.4806 | 0.5445 | 0.5815 | 0.7444 | 0.0 | 0.0 | 0.765 | 0.8012 |
| 0.7243 | 28.0 | 14000 | 0.5391 | 0.4569 | 0.5973 | 0.5348 | 0.1178 | 0.4315 | 0.4786 | 0.146 | 0.5185 | 0.5233 | 0.2 | 0.4952 | 0.5454 | 0.5923 | 0.744 | 0.0 | 0.0 | 0.7784 | 0.8259 |
| 0.6615 | 29.0 | 14500 | 0.5449 | 0.4503 | 0.6024 | 0.5494 | 0.0953 | 0.4129 | 0.4816 | 0.1439 | 0.507 | 0.5113 | 0.2 | 0.4751 | 0.5379 | 0.5879 | 0.7323 | 0.0 | 0.0 | 0.7629 | 0.8015 |
| 0.4993 | 30.0 | 15000 | 0.5131 | 0.4743 | 0.6051 | 0.5623 | 0.0918 | 0.4482 | 0.5041 | 0.1478 | 0.5275 | 0.5331 | 0.2375 | 0.5045 | 0.5539 | 0.6328 | 0.773 | 0.0 | 0.0 | 0.7901 | 0.8262 |
| 0.62 | 31.0 | 15500 | 0.5405 | 0.4646 | 0.603 | 0.5542 | 0.0607 | 0.4422 | 0.4832 | 0.1485 | 0.5201 | 0.5237 | 0.1625 | 0.5016 | 0.5361 | 0.614 | 0.7512 | 0.0 | 0.0 | 0.7798 | 0.8199 |
| 0.5616 | 32.0 | 16000 | 0.5097 | 0.4744 | 0.6111 | 0.5736 | 0.07 | 0.4423 | 0.4967 | 0.1453 | 0.5236 | 0.5288 | 0.1667 | 0.4979 | 0.544 | 0.6395 | 0.7585 | 0.0 | 0.0 | 0.7836 | 0.828 |
| 0.4979 | 33.0 | 16500 | 0.5117 | 0.4846 | 0.6197 | 0.5695 | 0.0474 | 0.4538 | 0.5076 | 0.1494 | 0.5307 | 0.5358 | 0.125 | 0.5075 | 0.5547 | 0.6651 | 0.779 | 0.0 | 0.0 | 0.7888 | 0.8283 |
| 0.5772 | 34.0 | 17000 | 0.5248 | 0.4758 | 0.6206 | 0.571 | 0.0753 | 0.4453 | 0.4902 | 0.1503 | 0.5223 | 0.5265 | 0.1958 | 0.4962 | 0.5354 | 0.6594 | 0.7669 | 0.0 | 0.0 | 0.7679 | 0.8127 |
| 0.6061 | 35.0 | 17500 | 0.5076 | 0.4867 | 0.6271 | 0.5914 | 0.0845 | 0.455 | 0.4979 | 0.1506 | 0.5254 | 0.5297 | 0.225 | 0.4979 | 0.5438 | 0.6651 | 0.7609 | 0.0 | 0.0 | 0.7951 | 0.8283 |
| 0.4872 | 36.0 | 18000 | 0.5002 | 0.4875 | 0.6307 | 0.5823 | 0.0911 | 0.4517 | 0.5101 | 0.1532 | 0.5244 | 0.528 | 0.2208 | 0.4922 | 0.5457 | 0.6716 | 0.7581 | 0.0 | 0.0 | 0.7909 | 0.8259 |
| 0.499 | 37.0 | 18500 | 0.4787 | 0.4995 | 0.6374 | 0.5907 | 0.134 | 0.4658 | 0.5097 | 0.1559 | 0.5317 | 0.5369 | 0.225 | 0.506 | 0.5468 | 0.6978 | 0.7726 | 0.0 | 0.0 | 0.8008 | 0.8383 |
| 0.4926 | 38.0 | 19000 | 0.4706 | 0.5026 | 0.6392 | 0.6018 | 0.1207 | 0.4744 | 0.5192 | 0.1567 | 0.5321 | 0.5391 | 0.2333 | 0.5156 | 0.5516 | 0.7093 | 0.7746 | 0.0 | 0.0 | 0.7986 | 0.8428 |
| 0.4842 | 39.0 | 19500 | 0.4501 | 0.5355 | 0.6675 | 0.6262 | 0.1531 | 0.5188 | 0.5258 | 0.1782 | 0.5599 | 0.5657 | 0.2167 | 0.5497 | 0.5566 | 0.7289 | 0.7831 | 0.0676 | 0.0621 | 0.8099 | 0.8518 |
| 0.4388 | 40.0 | 20000 | 0.4492 | 0.5759 | 0.7319 | 0.6917 | 0.2016 | 0.5629 | 0.5794 | 0.2133 | 0.6052 | 0.6116 | 0.2792 | 0.5968 | 0.612 | 0.7058 | 0.7609 | 0.2139 | 0.2211 | 0.808 | 0.8527 |
| 0.5973 | 41.0 | 20500 | 0.4320 | 0.6214 | 0.7675 | 0.7299 | 0.1689 | 0.6037 | 0.6191 | 0.2363 | 0.6511 | 0.6566 | 0.275 | 0.6379 | 0.6511 | 0.7474 | 0.8024 | 0.3108 | 0.3232 | 0.806 | 0.8443 |
| 0.6471 | 42.0 | 21000 | 0.4575 | 0.6216 | 0.8126 | 0.7403 | 0.1932 | 0.6134 | 0.6043 | 0.2459 | 0.6582 | 0.661 | 0.2667 | 0.6489 | 0.6509 | 0.6722 | 0.7298 | 0.3924 | 0.4137 | 0.8 | 0.8395 |
| 0.5442 | 43.0 | 21500 | 0.4265 | 0.6911 | 0.8876 | 0.842 | 0.2163 | 0.6898 | 0.693 | 0.2988 | 0.7358 | 0.7408 | 0.3333 | 0.736 | 0.7502 | 0.7061 | 0.7681 | 0.5569 | 0.6053 | 0.8103 | 0.8491 |
| 0.52 | 44.0 | 22000 | 0.4279 | 0.6918 | 0.8948 | 0.8453 | 0.1957 | 0.691 | 0.6836 | 0.2969 | 0.7344 | 0.7381 | 0.2875 | 0.7341 | 0.7327 | 0.7114 | 0.7694 | 0.5632 | 0.6021 | 0.8007 | 0.8428 |
| 0.5388 | 45.0 | 22500 | 0.4436 | 0.6881 | 0.8959 | 0.8218 | 0.1081 | 0.6807 | 0.6963 | 0.2972 | 0.7291 | 0.7321 | 0.2417 | 0.7229 | 0.7455 | 0.6986 | 0.7476 | 0.5668 | 0.6042 | 0.7991 | 0.8446 |
| 0.448 | 46.0 | 23000 | 0.4175 | 0.6923 | 0.888 | 0.8294 | 0.1616 | 0.686 | 0.7025 | 0.2932 | 0.7316 | 0.7366 | 0.2542 | 0.7282 | 0.7519 | 0.712 | 0.7665 | 0.5533 | 0.5916 | 0.8116 | 0.8518 |
| 0.4813 | 47.0 | 23500 | 0.4539 | 0.6668 | 0.869 | 0.8013 | 0.177 | 0.6492 | 0.6792 | 0.2837 | 0.7038 | 0.7076 | 0.3 | 0.6941 | 0.715 | 0.7076 | 0.7593 | 0.5094 | 0.5347 | 0.7835 | 0.8289 |
| 0.5411 | 48.0 | 24000 | 0.3942 | 0.7248 | 0.9066 | 0.8668 | 0.2003 | 0.7087 | 0.7693 | 0.3079 | 0.7636 | 0.7678 | 0.2625 | 0.7516 | 0.813 | 0.7497 | 0.7964 | 0.6227 | 0.6621 | 0.8019 | 0.8449 |
| 0.6534 | 49.0 | 24500 | 0.4056 | 0.7123 | 0.8974 | 0.8511 | 0.1611 | 0.6932 | 0.7483 | 0.304 | 0.7505 | 0.7543 | 0.2958 | 0.7338 | 0.7925 | 0.7219 | 0.7746 | 0.5993 | 0.6316 | 0.8156 | 0.8566 |
| 0.377 | 50.0 | 25000 | 0.3828 | 0.7397 | 0.936 | 0.8724 | 0.2397 | 0.724 | 0.7552 | 0.3188 | 0.7791 | 0.7832 | 0.3292 | 0.7696 | 0.8016 | 0.7286 | 0.7734 | 0.6664 | 0.7126 | 0.8241 | 0.8636 |
| 0.5198 | 51.0 | 25500 | 0.3895 | 0.7341 | 0.9234 | 0.8692 | 0.1944 | 0.722 | 0.7554 | 0.3099 | 0.7701 | 0.7733 | 0.2875 | 0.7617 | 0.802 | 0.7328 | 0.7734 | 0.6507 | 0.6926 | 0.8186 | 0.8539 |
| 0.4339 | 52.0 | 26000 | 0.3881 | 0.7396 | 0.9251 | 0.8757 | 0.1544 | 0.717 | 0.7959 | 0.3127 | 0.7743 | 0.7776 | 0.2417 | 0.7538 | 0.8312 | 0.7413 | 0.7891 | 0.6603 | 0.6895 | 0.8171 | 0.8542 |
| 0.4556 | 53.0 | 26500 | 0.3676 | 0.7385 | 0.9322 | 0.882 | 0.2223 | 0.7185 | 0.7716 | 0.3151 | 0.7726 | 0.7772 | 0.3014 | 0.7634 | 0.8087 | 0.7453 | 0.7847 | 0.6602 | 0.6968 | 0.8099 | 0.85 |
| 0.4291 | 54.0 | 27000 | 0.3836 | 0.7341 | 0.9304 | 0.8776 | 0.177 | 0.7103 | 0.7794 | 0.3179 | 0.7712 | 0.7758 | 0.3 | 0.7579 | 0.8178 | 0.7395 | 0.7895 | 0.6559 | 0.6937 | 0.8068 | 0.8443 |
| 0.4573 | 55.0 | 27500 | 0.3755 | 0.7386 | 0.9355 | 0.8918 | 0.1973 | 0.7277 | 0.7621 | 0.3135 | 0.7772 | 0.7809 | 0.2667 | 0.7733 | 0.8019 | 0.738 | 0.7847 | 0.6684 | 0.7116 | 0.8095 | 0.8464 |
| 0.4252 | 56.0 | 28000 | 0.3715 | 0.7428 | 0.9323 | 0.8989 | 0.1769 | 0.7324 | 0.7619 | 0.3141 | 0.779 | 0.783 | 0.3042 | 0.774 | 0.7984 | 0.7513 | 0.794 | 0.6641 | 0.7032 | 0.8129 | 0.8518 |
| 0.4442 | 57.0 | 28500 | 0.3634 | 0.7473 | 0.9293 | 0.8816 | 0.2133 | 0.739 | 0.7691 | 0.3148 | 0.7871 | 0.7901 | 0.2708 | 0.7827 | 0.8123 | 0.7623 | 0.804 | 0.6598 | 0.7084 | 0.8197 | 0.8578 |
| 0.6128 | 58.0 | 29000 | 0.3701 | 0.7495 | 0.9339 | 0.8731 | 0.1517 | 0.7355 | 0.7881 | 0.3166 | 0.7839 | 0.7881 | 0.2875 | 0.7748 | 0.8276 | 0.7661 | 0.806 | 0.6761 | 0.7126 | 0.8063 | 0.8455 |
| 0.5688 | 59.0 | 29500 | 0.3720 | 0.7475 | 0.9385 | 0.8895 | 0.2601 | 0.7387 | 0.7661 | 0.3215 | 0.7826 | 0.7862 | 0.3125 | 0.7791 | 0.8048 | 0.7562 | 0.794 | 0.6744 | 0.7158 | 0.8118 | 0.8488 |
| 0.4942 | 60.0 | 30000 | 0.3703 | 0.74 | 0.9318 | 0.8728 | 0.1782 | 0.726 | 0.7702 | 0.3124 | 0.7787 | 0.7821 | 0.2542 | 0.7671 | 0.817 | 0.7558 | 0.8028 | 0.6483 | 0.6895 | 0.816 | 0.8539 |
| 0.4106 | 61.0 | 30500 | 0.3685 | 0.739 | 0.934 | 0.8802 | 0.2221 | 0.7235 | 0.7501 | 0.3176 | 0.7772 | 0.7799 | 0.2667 | 0.767 | 0.7946 | 0.7401 | 0.7847 | 0.6619 | 0.7053 | 0.8151 | 0.8497 |
| 0.4923 | 62.0 | 31000 | 0.3567 | 0.7568 | 0.9428 | 0.9018 | 0.2009 | 0.7439 | 0.7907 | 0.3218 | 0.7928 | 0.7958 | 0.2875 | 0.787 | 0.8232 | 0.7691 | 0.8085 | 0.6744 | 0.7179 | 0.8267 | 0.8611 |
| 0.474 | 63.0 | 31500 | 0.3643 | 0.7388 | 0.9372 | 0.8944 | 0.1755 | 0.7311 | 0.7726 | 0.315 | 0.7806 | 0.784 | 0.2667 | 0.7816 | 0.8129 | 0.746 | 0.7915 | 0.6476 | 0.7021 | 0.8228 | 0.8584 |
| 0.4232 | 64.0 | 32000 | 0.3292 | 0.772 | 0.9443 | 0.8969 | 0.2505 | 0.755 | 0.812 | 0.3252 | 0.8062 | 0.8104 | 0.3528 | 0.7921 | 0.8463 | 0.7743 | 0.8198 | 0.7097 | 0.7442 | 0.8321 | 0.8672 |
| 0.5473 | 65.0 | 32500 | 0.3485 | 0.7562 | 0.9403 | 0.8948 | 0.2005 | 0.743 | 0.7782 | 0.3213 | 0.7979 | 0.8011 | 0.2931 | 0.7902 | 0.8205 | 0.7678 | 0.8145 | 0.6801 | 0.7274 | 0.8206 | 0.8614 |
| 0.3545 | 66.0 | 33000 | 0.3473 | 0.7548 | 0.9457 | 0.901 | 0.1938 | 0.7436 | 0.7813 | 0.3176 | 0.7922 | 0.7958 | 0.275 | 0.7861 | 0.8239 | 0.7501 | 0.798 | 0.6857 | 0.7263 | 0.8285 | 0.863 |
| 0.596 | 67.0 | 33500 | 0.3422 | 0.7591 | 0.9432 | 0.8851 | 0.2045 | 0.7441 | 0.7888 | 0.3222 | 0.796 | 0.8002 | 0.2958 | 0.787 | 0.8287 | 0.7652 | 0.8129 | 0.6755 | 0.7179 | 0.8365 | 0.8699 |
| 0.4029 | 68.0 | 34000 | 0.3423 | 0.7551 | 0.9428 | 0.8957 | 0.2309 | 0.7443 | 0.791 | 0.3184 | 0.7891 | 0.7947 | 0.3514 | 0.7851 | 0.8277 | 0.7551 | 0.8016 | 0.6732 | 0.7116 | 0.8369 | 0.8708 |
| 0.4128 | 69.0 | 34500 | 0.3450 | 0.7529 | 0.9409 | 0.8963 | 0.2611 | 0.7392 | 0.7774 | 0.3193 | 0.79 | 0.7959 | 0.3542 | 0.7835 | 0.8173 | 0.7417 | 0.7952 | 0.6837 | 0.7221 | 0.8334 | 0.8705 |
| 0.4718 | 70.0 | 35000 | 0.3278 | 0.7638 | 0.9412 | 0.8912 | 0.2465 | 0.7532 | 0.7886 | 0.3231 | 0.8006 | 0.8045 | 0.3458 | 0.7991 | 0.8211 | 0.7709 | 0.8169 | 0.6872 | 0.7274 | 0.8333 | 0.8693 |
| 0.4164 | 71.0 | 35500 | 0.3412 | 0.7612 | 0.9365 | 0.8922 | 0.1539 | 0.7533 | 0.7865 | 0.3204 | 0.7966 | 0.801 | 0.2917 | 0.7934 | 0.8223 | 0.7756 | 0.8202 | 0.6792 | 0.72 | 0.8289 | 0.863 |
| 0.403 | 72.0 | 36000 | 0.3284 | 0.7728 | 0.9476 | 0.8902 | 0.1302 | 0.7622 | 0.7966 | 0.325 | 0.8076 | 0.8099 | 0.2736 | 0.8 | 0.8286 | 0.7776 | 0.8181 | 0.6971 | 0.7368 | 0.8438 | 0.8747 |
| 0.3609 | 73.0 | 36500 | 0.3294 | 0.7637 | 0.9489 | 0.89 | 0.2091 | 0.7552 | 0.7887 | 0.3209 | 0.8006 | 0.8035 | 0.2875 | 0.7958 | 0.8287 | 0.7743 | 0.8141 | 0.6813 | 0.7263 | 0.8354 | 0.8702 |
| 0.5105 | 74.0 | 37000 | 0.3394 | 0.7624 | 0.9427 | 0.8916 | 0.2572 | 0.7482 | 0.7855 | 0.3218 | 0.8002 | 0.8023 | 0.3444 | 0.7914 | 0.8232 | 0.7715 | 0.8121 | 0.6815 | 0.7232 | 0.8341 | 0.8717 |
| 0.6515 | 75.0 | 37500 | 0.3499 | 0.7502 | 0.9314 | 0.8928 | 0.2575 | 0.746 | 0.7566 | 0.3176 | 0.7881 | 0.7907 | 0.3042 | 0.787 | 0.7984 | 0.7466 | 0.7935 | 0.6826 | 0.7242 | 0.8213 | 0.8542 |
| 0.4555 | 76.0 | 38000 | 0.3373 | 0.7616 | 0.9381 | 0.8959 | 0.1457 | 0.7485 | 0.7836 | 0.3239 | 0.7999 | 0.803 | 0.2069 | 0.7923 | 0.8227 | 0.7674 | 0.8085 | 0.6834 | 0.7337 | 0.834 | 0.8669 |
| 0.3485 | 77.0 | 38500 | 0.3352 | 0.7615 | 0.9427 | 0.8978 | 0.2851 | 0.7522 | 0.7733 | 0.3234 | 0.8016 | 0.8041 | 0.325 | 0.7968 | 0.8129 | 0.7631 | 0.8101 | 0.685 | 0.7337 | 0.8365 | 0.8687 |
| 0.3587 | 78.0 | 39000 | 0.3461 | 0.7597 | 0.9494 | 0.8958 | 0.2878 | 0.749 | 0.7867 | 0.3187 | 0.7974 | 0.8002 | 0.3375 | 0.7913 | 0.8198 | 0.7504 | 0.794 | 0.6912 | 0.7358 | 0.8375 | 0.8708 |
| 0.3367 | 79.0 | 39500 | 0.3436 | 0.7621 | 0.9427 | 0.8896 | 0.2702 | 0.7507 | 0.7792 | 0.3251 | 0.7975 | 0.8016 | 0.3944 | 0.794 | 0.8105 | 0.7562 | 0.798 | 0.7054 | 0.7463 | 0.8246 | 0.8605 |
| 0.3801 | 80.0 | 40000 | 0.3441 | 0.7589 | 0.9455 | 0.8992 | 0.2149 | 0.7475 | 0.779 | 0.3191 | 0.7943 | 0.7974 | 0.3319 | 0.788 | 0.8175 | 0.7551 | 0.7935 | 0.6854 | 0.7274 | 0.836 | 0.8714 |
| 0.3428 | 81.0 | 40500 | 0.3205 | 0.7721 | 0.9512 | 0.8923 | 0.2591 | 0.7598 | 0.7885 | 0.3272 | 0.8124 | 0.8165 | 0.4194 | 0.8055 | 0.8279 | 0.7651 | 0.8121 | 0.7035 | 0.7579 | 0.8477 | 0.8795 |
| 0.4288 | 82.0 | 41000 | 0.3386 | 0.7599 | 0.9442 | 0.8905 | 0.2776 | 0.7494 | 0.7804 | 0.3206 | 0.8012 | 0.8039 | 0.3292 | 0.7937 | 0.8175 | 0.7578 | 0.8012 | 0.6852 | 0.74 | 0.8369 | 0.8705 |
| 0.4539 | 83.0 | 41500 | 0.3316 | 0.7721 | 0.9488 | 0.9017 | 0.2794 | 0.7616 | 0.7927 | 0.325 | 0.8078 | 0.811 | 0.3653 | 0.8016 | 0.8252 | 0.7731 | 0.8137 | 0.7055 | 0.7484 | 0.8376 | 0.8708 |
| 0.4158 | 84.0 | 42000 | 0.3345 | 0.7676 | 0.9458 | 0.8937 | 0.2775 | 0.7582 | 0.7787 | 0.3238 | 0.8063 | 0.8095 | 0.3292 | 0.7991 | 0.8183 | 0.7683 | 0.8133 | 0.6969 | 0.7442 | 0.8376 | 0.8711 |
| 0.4224 | 85.0 | 42500 | 0.3407 | 0.7706 | 0.9461 | 0.9014 | 0.2233 | 0.7642 | 0.7863 | 0.3265 | 0.8048 | 0.8095 | 0.2667 | 0.804 | 0.8204 | 0.7653 | 0.8113 | 0.7094 | 0.7474 | 0.837 | 0.8699 |
| 0.3439 | 86.0 | 43000 | 0.3385 | 0.7685 | 0.9516 | 0.8996 | 0.2689 | 0.7649 | 0.777 | 0.3275 | 0.806 | 0.8098 | 0.3167 | 0.8051 | 0.8161 | 0.7562 | 0.8004 | 0.7077 | 0.7558 | 0.8414 | 0.8732 |
| 0.3483 | 87.0 | 43500 | 0.3165 | 0.7836 | 0.9498 | 0.9039 | 0.2278 | 0.7832 | 0.797 | 0.3325 | 0.815 | 0.8192 | 0.3028 | 0.8178 | 0.8328 | 0.7771 | 0.819 | 0.7286 | 0.7611 | 0.845 | 0.8777 |
| 0.5253 | 88.0 | 44000 | 0.3120 | 0.7843 | 0.9559 | 0.918 | 0.2194 | 0.7764 | 0.8012 | 0.3336 | 0.82 | 0.8236 | 0.3931 | 0.8153 | 0.8335 | 0.7689 | 0.8169 | 0.7314 | 0.7684 | 0.8527 | 0.8855 |
| 0.3953 | 89.0 | 44500 | 0.3341 | 0.7739 | 0.9514 | 0.9124 | 0.2586 | 0.7675 | 0.7808 | 0.3277 | 0.8086 | 0.8121 | 0.3917 | 0.8042 | 0.8181 | 0.7658 | 0.8044 | 0.7069 | 0.7516 | 0.8489 | 0.8804 |
| 0.3813 | 90.0 | 45000 | 0.3128 | 0.7878 | 0.9479 | 0.9026 | 0.2493 | 0.7815 | 0.8061 | 0.3305 | 0.8198 | 0.8239 | 0.3569 | 0.817 | 0.8422 | 0.7859 | 0.8315 | 0.719 | 0.7495 | 0.8585 | 0.8907 |
| 0.4071 | 91.0 | 45500 | 0.3136 | 0.7806 | 0.95 | 0.9138 | 0.2509 | 0.7757 | 0.7935 | 0.3296 | 0.8204 | 0.8226 | 0.3736 | 0.8172 | 0.8341 | 0.7747 | 0.8218 | 0.7165 | 0.7632 | 0.8505 | 0.8828 |
| 0.2818 | 92.0 | 46000 | 0.3106 | 0.7856 | 0.9452 | 0.9137 | 0.2494 | 0.781 | 0.7984 | 0.3308 | 0.8207 | 0.8241 | 0.3097 | 0.8192 | 0.8352 | 0.7843 | 0.8298 | 0.7207 | 0.7558 | 0.8518 | 0.8867 |
| 0.4456 | 93.0 | 46500 | 0.3185 | 0.7745 | 0.9492 | 0.9028 | 0.2588 | 0.7734 | 0.7772 | 0.3278 | 0.8113 | 0.8152 | 0.3111 | 0.8122 | 0.8164 | 0.7647 | 0.8141 | 0.7148 | 0.7505 | 0.8441 | 0.881 |
| 0.3401 | 94.0 | 47000 | 0.3182 | 0.7793 | 0.948 | 0.9154 | 0.2493 | 0.7752 | 0.7902 | 0.3286 | 0.8171 | 0.8202 | 0.2889 | 0.8178 | 0.8328 | 0.7733 | 0.8214 | 0.7157 | 0.7568 | 0.8489 | 0.8825 |
| 0.4676 | 95.0 | 47500 | 0.3151 | 0.7779 | 0.9452 | 0.9112 | 0.2873 | 0.7781 | 0.7884 | 0.3285 | 0.8133 | 0.8179 | 0.3222 | 0.8158 | 0.8288 | 0.7774 | 0.8222 | 0.7067 | 0.7463 | 0.8496 | 0.8852 |
| 0.3521 | 96.0 | 48000 | 0.3066 | 0.7766 | 0.9432 | 0.9063 | 0.2183 | 0.7709 | 0.7952 | 0.3272 | 0.8131 | 0.8162 | 0.2875 | 0.8059 | 0.8375 | 0.7837 | 0.8298 | 0.6937 | 0.7347 | 0.8522 | 0.884 |
| 0.441 | 97.0 | 48500 | 0.3254 | 0.772 | 0.9516 | 0.9019 | 0.2356 | 0.7668 | 0.7752 | 0.3235 | 0.8091 | 0.8123 | 0.3167 | 0.8031 | 0.8168 | 0.7692 | 0.8129 | 0.6997 | 0.7432 | 0.847 | 0.8807 |
| 0.427 | 98.0 | 49000 | 0.3106 | 0.7836 | 0.9527 | 0.9099 | 0.2817 | 0.7824 | 0.7784 | 0.3307 | 0.8202 | 0.8244 | 0.3611 | 0.8203 | 0.8181 | 0.7819 | 0.8254 | 0.7195 | 0.7621 | 0.8492 | 0.8855 |
| 0.4111 | 99.0 | 49500 | 0.3173 | 0.7738 | 0.9507 | 0.9078 | 0.2385 | 0.7695 | 0.7834 | 0.3245 | 0.815 | 0.8176 | 0.3264 | 0.8128 | 0.8225 | 0.7634 | 0.8169 | 0.7003 | 0.7474 | 0.8576 | 0.8886 |
| 0.3262 | 100.0 | 50000 | 0.3151 | 0.7825 | 0.9484 | 0.9102 | 0.2957 | 0.7775 | 0.7945 | 0.3288 | 0.8209 | 0.8241 | 0.3569 | 0.8189 | 0.8343 | 0.7853 | 0.8302 | 0.7066 | 0.7547 | 0.8555 | 0.8873 |
| 0.3494 | 101.0 | 50500 | 0.3024 | 0.7894 | 0.9502 | 0.9134 | 0.279 | 0.7801 | 0.7959 | 0.3354 | 0.8259 | 0.8289 | 0.3361 | 0.8214 | 0.8357 | 0.7921 | 0.8339 | 0.7235 | 0.7674 | 0.8526 | 0.8855 |
| 0.3275 | 102.0 | 51000 | 0.3043 | 0.7942 | 0.9492 | 0.9154 | 0.2618 | 0.7902 | 0.7998 | 0.3333 | 0.8301 | 0.8332 | 0.3431 | 0.8305 | 0.8369 | 0.7993 | 0.8375 | 0.7259 | 0.7716 | 0.8575 | 0.8907 |
| 0.4047 | 103.0 | 51500 | 0.3139 | 0.7821 | 0.9471 | 0.9041 | 0.2531 | 0.7759 | 0.795 | 0.3333 | 0.8192 | 0.8222 | 0.2972 | 0.8194 | 0.8308 | 0.7777 | 0.8218 | 0.7181 | 0.7632 | 0.8505 | 0.8816 |
| 0.4509 | 104.0 | 52000 | 0.3037 | 0.7874 | 0.949 | 0.9044 | 0.2873 | 0.7813 | 0.7962 | 0.3325 | 0.8244 | 0.8273 | 0.3444 | 0.8217 | 0.8364 | 0.782 | 0.8266 | 0.7222 | 0.7674 | 0.858 | 0.888 |
| 0.3733 | 105.0 | 52500 | 0.3045 | 0.7915 | 0.9505 | 0.9118 | 0.2813 | 0.79 | 0.7988 | 0.3367 | 0.8271 | 0.8304 | 0.3056 | 0.8302 | 0.839 | 0.7838 | 0.8306 | 0.7349 | 0.7737 | 0.8558 | 0.8867 |
| 0.288 | 106.0 | 53000 | 0.2985 | 0.7987 | 0.9451 | 0.9103 | 0.2556 | 0.7955 | 0.8187 | 0.337 | 0.831 | 0.8342 | 0.3 | 0.8316 | 0.8528 | 0.7983 | 0.8399 | 0.734 | 0.7695 | 0.8638 | 0.8931 |
| 0.3548 | 107.0 | 53500 | 0.3132 | 0.7894 | 0.9507 | 0.9072 | 0.2922 | 0.7839 | 0.8013 | 0.3329 | 0.8252 | 0.8278 | 0.3514 | 0.8205 | 0.8403 | 0.7828 | 0.8262 | 0.7297 | 0.7716 | 0.8555 | 0.8855 |
| 0.3726 | 108.0 | 54000 | 0.3111 | 0.7878 | 0.9497 | 0.9134 | 0.2736 | 0.7799 | 0.8137 | 0.3319 | 0.824 | 0.8265 | 0.3292 | 0.8182 | 0.8511 | 0.7879 | 0.8306 | 0.7213 | 0.7653 | 0.8541 | 0.8837 |
| 0.3411 | 109.0 | 54500 | 0.3119 | 0.7887 | 0.9531 | 0.9073 | 0.2727 | 0.7778 | 0.8099 | 0.3322 | 0.8222 | 0.8263 | 0.3389 | 0.8172 | 0.8445 | 0.7819 | 0.8238 | 0.7242 | 0.7684 | 0.86 | 0.8867 |
| 0.363 | 110.0 | 55000 | 0.3108 | 0.7899 | 0.9503 | 0.9081 | 0.2949 | 0.7863 | 0.7952 | 0.3326 | 0.8236 | 0.8266 | 0.3139 | 0.8226 | 0.8333 | 0.787 | 0.8274 | 0.7217 | 0.7632 | 0.8611 | 0.8892 |
| 0.4096 | 111.0 | 55500 | 0.3072 | 0.7927 | 0.954 | 0.9121 | 0.2848 | 0.7876 | 0.8045 | 0.3324 | 0.828 | 0.8316 | 0.3778 | 0.825 | 0.8429 | 0.7853 | 0.8315 | 0.7351 | 0.7768 | 0.8576 | 0.8864 |
| 0.3111 | 112.0 | 56000 | 0.3106 | 0.7929 | 0.9505 | 0.9022 | 0.2749 | 0.7829 | 0.8123 | 0.3332 | 0.8266 | 0.83 | 0.3153 | 0.8215 | 0.8478 | 0.7885 | 0.8323 | 0.7335 | 0.7726 | 0.8569 | 0.8852 |
| 0.3804 | 113.0 | 56500 | 0.3035 | 0.793 | 0.951 | 0.903 | 0.2734 | 0.7889 | 0.8084 | 0.3332 | 0.8285 | 0.8312 | 0.3097 | 0.8258 | 0.8445 | 0.7988 | 0.8399 | 0.7244 | 0.7663 | 0.8557 | 0.8873 |
| 0.3715 | 114.0 | 57000 | 0.2993 | 0.7932 | 0.9514 | 0.9094 | 0.2774 | 0.7894 | 0.8044 | 0.3333 | 0.8286 | 0.832 | 0.3347 | 0.8274 | 0.8413 | 0.7915 | 0.8351 | 0.7269 | 0.7705 | 0.8612 | 0.8904 |
| 0.315 | 115.0 | 57500 | 0.3028 | 0.7944 | 0.9507 | 0.9133 | 0.3015 | 0.7909 | 0.8033 | 0.3332 | 0.8294 | 0.8331 | 0.35 | 0.8299 | 0.8387 | 0.7897 | 0.8347 | 0.7335 | 0.7737 | 0.8601 | 0.891 |
| 0.4886 | 116.0 | 58000 | 0.3074 | 0.7883 | 0.9457 | 0.911 | 0.2345 | 0.7885 | 0.8051 | 0.3317 | 0.8214 | 0.8246 | 0.2625 | 0.8256 | 0.8396 | 0.791 | 0.8339 | 0.7182 | 0.7526 | 0.8556 | 0.8873 |
| 0.3763 | 117.0 | 58500 | 0.3135 | 0.7852 | 0.9475 | 0.9057 | 0.2393 | 0.7799 | 0.7995 | 0.3305 | 0.8187 | 0.8227 | 0.2958 | 0.8182 | 0.8352 | 0.7903 | 0.8351 | 0.7145 | 0.7505 | 0.8508 | 0.8825 |
| 0.3792 | 118.0 | 59000 | 0.3029 | 0.7906 | 0.9472 | 0.9091 | 0.236 | 0.7894 | 0.8025 | 0.3328 | 0.8241 | 0.8274 | 0.2667 | 0.8263 | 0.8388 | 0.7917 | 0.8351 | 0.7151 | 0.7526 | 0.8651 | 0.8946 |
| 0.3545 | 119.0 | 59500 | 0.3043 | 0.7888 | 0.9476 | 0.905 | 0.2429 | 0.7832 | 0.8063 | 0.3315 | 0.8241 | 0.8266 | 0.2833 | 0.8223 | 0.8427 | 0.7975 | 0.8435 | 0.712 | 0.7495 | 0.8568 | 0.8867 |
| 0.3975 | 120.0 | 60000 | 0.3022 | 0.7916 | 0.951 | 0.9125 | 0.2702 | 0.7858 | 0.8034 | 0.3329 | 0.8243 | 0.8268 | 0.2903 | 0.8232 | 0.8369 | 0.791 | 0.8335 | 0.7232 | 0.7558 | 0.8606 | 0.8913 |
| 0.4769 | 121.0 | 60500 | 0.3009 | 0.7946 | 0.9509 | 0.9107 | 0.2788 | 0.7918 | 0.8015 | 0.3354 | 0.8296 | 0.8321 | 0.3167 | 0.8309 | 0.8374 | 0.7921 | 0.8355 | 0.7321 | 0.7705 | 0.8596 | 0.8904 |
| 0.4686 | 122.0 | 61000 | 0.3002 | 0.7971 | 0.9524 | 0.9136 | 0.2965 | 0.794 | 0.8077 | 0.333 | 0.8305 | 0.833 | 0.3181 | 0.8291 | 0.8421 | 0.7987 | 0.8419 | 0.7322 | 0.7663 | 0.8602 | 0.8907 |
| 0.4104 | 123.0 | 61500 | 0.2997 | 0.7989 | 0.9524 | 0.9122 | 0.2967 | 0.795 | 0.812 | 0.3339 | 0.8311 | 0.8335 | 0.3167 | 0.8287 | 0.8457 | 0.7924 | 0.8355 | 0.7374 | 0.7705 | 0.8667 | 0.8946 |
| 0.3751 | 124.0 | 62000 | 0.2945 | 0.7978 | 0.9491 | 0.9101 | 0.249 | 0.7961 | 0.815 | 0.3351 | 0.8311 | 0.8337 | 0.2764 | 0.833 | 0.8499 | 0.7944 | 0.8383 | 0.7298 | 0.7642 | 0.8693 | 0.8985 |
| 0.3375 | 125.0 | 62500 | 0.3023 | 0.7967 | 0.9547 | 0.9132 | 0.2885 | 0.7933 | 0.8065 | 0.3341 | 0.8292 | 0.8322 | 0.3264 | 0.8286 | 0.838 | 0.789 | 0.8319 | 0.7339 | 0.7684 | 0.867 | 0.8964 |
| 0.4074 | 126.0 | 63000 | 0.2956 | 0.7958 | 0.9523 | 0.912 | 0.2951 | 0.7927 | 0.8071 | 0.3334 | 0.8314 | 0.8343 | 0.3417 | 0.8308 | 0.8451 | 0.794 | 0.8379 | 0.7292 | 0.7705 | 0.8642 | 0.8946 |
| 0.3375 | 127.0 | 63500 | 0.2982 | 0.7952 | 0.9523 | 0.9131 | 0.2948 | 0.791 | 0.8101 | 0.3314 | 0.8292 | 0.832 | 0.3236 | 0.8274 | 0.8482 | 0.7976 | 0.8387 | 0.7282 | 0.7642 | 0.8599 | 0.8931 |
| 0.3425 | 128.0 | 64000 | 0.2963 | 0.7983 | 0.9523 | 0.9082 | 0.293 | 0.7948 | 0.8129 | 0.3354 | 0.8324 | 0.8349 | 0.3278 | 0.8305 | 0.8505 | 0.798 | 0.8403 | 0.7353 | 0.7695 | 0.8616 | 0.8949 |
| 0.35 | 129.0 | 64500 | 0.2941 | 0.8 | 0.9514 | 0.9128 | 0.2972 | 0.7994 | 0.808 | 0.3364 | 0.8336 | 0.8364 | 0.3208 | 0.8348 | 0.8432 | 0.797 | 0.8391 | 0.7404 | 0.7768 | 0.8626 | 0.8934 |
| 0.3261 | 130.0 | 65000 | 0.3002 | 0.7943 | 0.9528 | 0.9095 | 0.2825 | 0.7902 | 0.808 | 0.3329 | 0.8279 | 0.8304 | 0.3083 | 0.8253 | 0.8435 | 0.7902 | 0.8331 | 0.7315 | 0.7674 | 0.8612 | 0.8907 |
| 0.4025 | 131.0 | 65500 | 0.3037 | 0.7935 | 0.9527 | 0.9092 | 0.2759 | 0.7905 | 0.8061 | 0.3325 | 0.8271 | 0.8296 | 0.3111 | 0.8254 | 0.8414 | 0.788 | 0.831 | 0.7327 | 0.7674 | 0.8597 | 0.8904 |
| 0.3304 | 132.0 | 66000 | 0.3004 | 0.7944 | 0.9528 | 0.9092 | 0.2841 | 0.7916 | 0.8062 | 0.333 | 0.8286 | 0.831 | 0.3125 | 0.8269 | 0.8432 | 0.7906 | 0.8335 | 0.7334 | 0.7695 | 0.8593 | 0.8901 |
| 0.326 | 133.0 | 66500 | 0.3042 | 0.793 | 0.9524 | 0.9089 | 0.2888 | 0.7906 | 0.8047 | 0.3326 | 0.8277 | 0.8299 | 0.3153 | 0.8267 | 0.8416 | 0.79 | 0.8343 | 0.7297 | 0.7663 | 0.8592 | 0.8892 |
| 0.4304 | 134.0 | 67000 | 0.2998 | 0.7951 | 0.9526 | 0.9123 | 0.301 | 0.7935 | 0.8057 | 0.3326 | 0.8297 | 0.8323 | 0.325 | 0.8302 | 0.8413 | 0.792 | 0.8355 | 0.7337 | 0.7695 | 0.8596 | 0.8919 |
| 0.4094 | 135.0 | 67500 | 0.2979 | 0.7998 | 0.9523 | 0.9088 | 0.2886 | 0.7976 | 0.8168 | 0.3363 | 0.834 | 0.8363 | 0.3111 | 0.8334 | 0.8508 | 0.7984 | 0.8403 | 0.7398 | 0.7758 | 0.8612 | 0.8928 |
| 0.3193 | 136.0 | 68000 | 0.2979 | 0.8006 | 0.9524 | 0.9121 | 0.2949 | 0.7985 | 0.8127 | 0.336 | 0.8353 | 0.8375 | 0.3319 | 0.8342 | 0.8505 | 0.7964 | 0.8411 | 0.7401 | 0.7768 | 0.8653 | 0.8946 |
| 0.3215 | 137.0 | 68500 | 0.3002 | 0.7984 | 0.9525 | 0.9122 | 0.3018 | 0.7963 | 0.8109 | 0.3338 | 0.8318 | 0.8341 | 0.3319 | 0.8317 | 0.845 | 0.7942 | 0.8379 | 0.7363 | 0.7695 | 0.8647 | 0.8949 |
| 0.322 | 138.0 | 69000 | 0.2994 | 0.798 | 0.9527 | 0.9091 | 0.2887 | 0.7956 | 0.812 | 0.3338 | 0.833 | 0.8354 | 0.3111 | 0.8324 | 0.8465 | 0.7984 | 0.8399 | 0.7376 | 0.7758 | 0.858 | 0.8904 |
| 0.3981 | 139.0 | 69500 | 0.2985 | 0.7989 | 0.9523 | 0.9084 | 0.2801 | 0.7977 | 0.8097 | 0.3344 | 0.8343 | 0.8366 | 0.3028 | 0.8342 | 0.8495 | 0.7985 | 0.8415 | 0.7375 | 0.7768 | 0.8607 | 0.8916 |
| 0.469 | 140.0 | 70000 | 0.2960 | 0.8016 | 0.9527 | 0.9114 | 0.2883 | 0.7992 | 0.8189 | 0.3356 | 0.8366 | 0.839 | 0.3153 | 0.8366 | 0.8534 | 0.7998 | 0.8435 | 0.7423 | 0.78 | 0.8628 | 0.8934 |
| 0.4151 | 141.0 | 70500 | 0.2954 | 0.8013 | 0.9528 | 0.9128 | 0.2854 | 0.7992 | 0.8181 | 0.3355 | 0.8359 | 0.8383 | 0.3069 | 0.8357 | 0.8508 | 0.7995 | 0.8427 | 0.7412 | 0.7789 | 0.8631 | 0.8931 |
| 0.3491 | 142.0 | 71000 | 0.2968 | 0.8001 | 0.9528 | 0.9125 | 0.2965 | 0.7979 | 0.815 | 0.335 | 0.8354 | 0.8376 | 0.3236 | 0.835 | 0.8509 | 0.797 | 0.8403 | 0.7405 | 0.7789 | 0.863 | 0.8937 |
| 0.304 | 143.0 | 71500 | 0.2961 | 0.7991 | 0.9528 | 0.9124 | 0.2946 | 0.7979 | 0.8118 | 0.3336 | 0.8339 | 0.8361 | 0.3236 | 0.8345 | 0.845 | 0.7976 | 0.8415 | 0.7384 | 0.7747 | 0.8613 | 0.8922 |
| 0.3747 | 144.0 | 72000 | 0.2962 | 0.7982 | 0.9528 | 0.9117 | 0.2883 | 0.7975 | 0.8091 | 0.3341 | 0.8334 | 0.8357 | 0.3111 | 0.8339 | 0.8454 | 0.7965 | 0.8395 | 0.7362 | 0.7747 | 0.862 | 0.8928 |
| 0.3139 | 145.0 | 72500 | 0.2960 | 0.7988 | 0.9528 | 0.9124 | 0.2801 | 0.7966 | 0.8122 | 0.3341 | 0.8333 | 0.8357 | 0.3028 | 0.8336 | 0.8475 | 0.7949 | 0.8379 | 0.7382 | 0.7758 | 0.8632 | 0.8934 |
| 0.5324 | 146.0 | 73000 | 0.2956 | 0.7999 | 0.9529 | 0.9125 | 0.2883 | 0.798 | 0.8148 | 0.3348 | 0.8352 | 0.8376 | 0.3111 | 0.8357 | 0.8502 | 0.796 | 0.8399 | 0.7405 | 0.7789 | 0.8633 | 0.894 |
| 0.3885 | 147.0 | 73500 | 0.2958 | 0.7996 | 0.9529 | 0.9125 | 0.2883 | 0.7979 | 0.8126 | 0.3344 | 0.834 | 0.8363 | 0.3111 | 0.8348 | 0.8457 | 0.796 | 0.8395 | 0.7392 | 0.7758 | 0.8637 | 0.8937 |
| 0.3726 | 148.0 | 74000 | 0.2959 | 0.7997 | 0.9529 | 0.9125 | 0.2883 | 0.7977 | 0.8126 | 0.3344 | 0.8339 | 0.8362 | 0.3111 | 0.8343 | 0.8457 | 0.7961 | 0.8395 | 0.7392 | 0.7758 | 0.8637 | 0.8934 |
| 0.4201 | 149.0 | 74500 | 0.2959 | 0.7997 | 0.9529 | 0.9125 | 0.2883 | 0.7977 | 0.8126 | 0.3344 | 0.8339 | 0.8362 | 0.3111 | 0.8343 | 0.8457 | 0.7961 | 0.8395 | 0.7392 | 0.7758 | 0.8637 | 0.8934 |
| 0.3823 | 150.0 | 75000 | 0.2959 | 0.7997 | 0.9529 | 0.9125 | 0.2883 | 0.7977 | 0.8126 | 0.3344 | 0.8339 | 0.8362 | 0.3111 | 0.8343 | 0.8457 | 0.7961 | 0.8395 | 0.7392 | 0.7758 | 0.8637 | 0.8934 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.3
| [
"chicken",
"duck",
"plant"
] |
vietlethe/bkad-deformable-detr |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
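
Until the authors provide their own snippet, the generic sketch below may help. It assumes the checkpoint follows the standard 🤗 Transformers object-detection API (as the repository name suggests a Deformable DETR model); the image path and score threshold are placeholders.

```python
# Hypothetical usage sketch; adapt once the intended inference setup is documented.
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "vietlethe/bkad-deformable-detr"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("example.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to boxes/scores/labels scaled to the original image size.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```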
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
Kelex83/finetuned-detr-resnet-50-dc5-fashionpedia |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetuned-detr-resnet-50-dc5-fashionpedia
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.2712
- Map: 0.0137
- Map 50: 0.0256
- Map 75: 0.0131
- Map Small: 0.0064
- Map Medium: 0.0183
- Map Large: 0.0123
- Mar 1: 0.0366
- Mar 10: 0.0653
- Mar 100: 0.0683
- Mar Small: 0.0264
- Mar Medium: 0.0676
- Mar Large: 0.0777
- Map Shirt, blouse: 0.0
- Mar 100 Shirt, blouse: 0.0
- Map Top, t-shirt, sweatshirt: 0.0347
- Mar 100 Top, t-shirt, sweatshirt: 0.3371
- Map Sweater: 0.0
- Mar 100 Sweater: 0.0
- Map Cardigan: 0.0
- Mar 100 Cardigan: 0.0
- Map Jacket: 0.0
- Mar 100 Jacket: 0.0
- Map Vest: 0.0
- Mar 100 Vest: 0.0
- Map Pants: 0.1262
- Mar 100 Pants: 0.6475
- Map Shorts: 0.0
- Mar 100 Shorts: 0.0
- Map Skirt: 0.0
- Mar 100 Skirt: 0.0
- Map Coat: 0.0
- Mar 100 Coat: 0.0
- Map Dress: 0.1141
- Mar 100 Dress: 0.7232
- Map Jumpsuit: 0.0
- Mar 100 Jumpsuit: 0.0
- Map Cape: 0.0
- Mar 100 Cape: 0.0
- Map Glasses: 0.0086
- Mar 100 Glasses: 0.0946
- Map Hat: 0.0
- Mar 100 Hat: 0.0
- Map Headband, head covering, hair accessory: 0.0006
- Mar 100 Headband, head covering, hair accessory: 0.0037
- Map Tie: 0.0
- Mar 100 Tie: 0.0
- Map Glove: 0.0
- Mar 100 Glove: 0.0
- Map Watch: 0.0
- Mar 100 Watch: 0.0
- Map Belt: 0.0
- Mar 100 Belt: 0.0
- Map Leg warmer: 0.0
- Mar 100 Leg warmer: 0.0
- Map Tights, stockings: 0.0
- Mar 100 Tights, stockings: 0.0
- Map Sock: 0.0
- Mar 100 Sock: 0.0
- Map Shoe: 0.2039
- Mar 100 Shoe: 0.5147
- Map Bag, wallet: 0.0134
- Mar 100 Bag, wallet: 0.0173
- Map Scarf: 0.0
- Mar 100 Scarf: 0.0
- Map Umbrella: 0.0
- Mar 100 Umbrella: 0.0
- Map Hood: 0.0
- Mar 100 Hood: 0.0
- Map Collar: 0.0
- Mar 100 Collar: 0.0
- Map Lapel: 0.0079
- Mar 100 Lapel: 0.0059
- Map Epaulette: 0.0
- Mar 100 Epaulette: 0.0
- Map Sleeve: 0.078
- Mar 100 Sleeve: 0.4796
- Map Pocket: 0.0002
- Mar 100 Pocket: 0.0606
- Map Neckline: 0.0417
- Mar 100 Neckline: 0.2583
- Map Buckle: 0.0
- Mar 100 Buckle: 0.0
- Map Zipper: 0.0
- Mar 100 Zipper: 0.0
- Map Applique: 0.0
- Mar 100 Applique: 0.0
- Map Bead: 0.0
- Mar 100 Bead: 0.0
- Map Bow: 0.0
- Mar 100 Bow: 0.0
- Map Flower: 0.0
- Mar 100 Flower: 0.0
- Map Fringe: 0.0
- Mar 100 Fringe: 0.0
- Map Ribbon: 0.0
- Mar 100 Ribbon: 0.0
- Map Rivet: 0.0
- Mar 100 Rivet: 0.0
- Map Ruffle: 0.0
- Mar 100 Ruffle: 0.0
- Map Sequin: 0.0
- Mar 100 Sequin: 0.0
- Map Tassel: 0.0
- Mar 100 Tassel: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
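
In 🤗 Transformers terms, this step-based run with native AMP corresponds roughly to the `TrainingArguments` sketch below; the output directory is a placeholder and this is not the exact script used.

```python
# Sketch for the step-based, mixed-precision setup listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-dc5-fashionpedia",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    max_steps=10_000,  # "training_steps: 10000"
    fp16=True,         # "mixed_precision_training: Native AMP"
)
```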
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Shirt, blouse | Mar 100 Shirt, blouse | Map Top, t-shirt, sweatshirt | Mar 100 Top, t-shirt, sweatshirt | Map Sweater | Mar 100 Sweater | Map Cardigan | Mar 100 Cardigan | Map Jacket | Mar 100 Jacket | Map Vest | Mar 100 Vest | Map Pants | Mar 100 Pants | Map Shorts | Mar 100 Shorts | Map Skirt | Mar 100 Skirt | Map Coat | Mar 100 Coat | Map Dress | Mar 100 Dress | Map Jumpsuit | Mar 100 Jumpsuit | Map Cape | Mar 100 Cape | Map Glasses | Mar 100 Glasses | Map Hat | Mar 100 Hat | Map Headband, head covering, hair accessory | Mar 100 Headband, head covering, hair accessory | Map Tie | Mar 100 Tie | Map Glove | Mar 100 Glove | Map Watch | Mar 100 Watch | Map Belt | Mar 100 Belt | Map Leg warmer | Mar 100 Leg warmer | Map Tights, stockings | Mar 100 Tights, stockings | Map Sock | Mar 100 Sock | Map Shoe | Mar 100 Shoe | Map Bag, wallet | Mar 100 Bag, wallet | Map Scarf | Mar 100 Scarf | Map Umbrella | Mar 100 Umbrella | Map Hood | Mar 100 Hood | Map Collar | Mar 100 Collar | Map Lapel | Mar 100 Lapel | Map Epaulette | Mar 100 Epaulette | Map Sleeve | Mar 100 Sleeve | Map Pocket | Mar 100 Pocket | Map Neckline | Mar 100 Neckline | Map Buckle | Mar 100 Buckle | Map Zipper | Mar 100 Zipper | Map Applique | Mar 100 Applique | Map Bead | Mar 100 Bead | Map Bow | Mar 100 Bow | Map Flower | Mar 100 Flower | Map Fringe | Mar 100 Fringe | Map Ribbon | Mar 100 Ribbon | Map Rivet | Mar 100 Rivet | Map Ruffle | Mar 100 Ruffle | Map Sequin | Mar 100 Sequin | Map Tassel | Mar 100 Tassel |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:----------------------------:|:--------------------------------:|:-----------:|:---------------:|:------------:|:----------------:|:----------:|:--------------:|:--------:|:------------:|:---------:|:-------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:-----------:|:---------------:|:-------:|:-----------:|:-------------------------------------------:|:-----------------------------------------------:|:-------:|:-----------:|:---------:|:-------------:|:---------:|:-------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:--------:|:------------:|:---------------:|:-------------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:----------:|:--------------:|:---------:|:-------------:|:-------------:|:-----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:--------:|:------------:|:-------:|:-----------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|
| 7.0493 | 0.0044 | 50 | 6.4615 | 0.0002 | 0.0002 | 0.0002 | 0.0 | 0.0002 | 0.0006 | 0.0004 | 0.0013 | 0.0016 | 0.0006 | 0.0004 | 0.0072 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 | 0.0 | 0.0 | 0.0001 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0193 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.008 | 0.0145 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0082 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0118 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.3509 | 0.0088 | 100 | 5.6823 | 0.0003 | 0.0003 | 0.0003 | 0.0 | 0.0002 | 0.0012 | 0.0006 | 0.001 | 0.0017 | 0.0007 | 0.0021 | 0.0053 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0087 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0193 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0119 | 0.0192 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0091 | 0.0 | 0.0 | 0.0 | 0.0057 | 0.0 | 0.0 | 0.0 | 0.0144 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.3659 | 0.0132 | 150 | 5.1351 | 0.0003 | 0.0005 | 0.0003 | 0.0 | 0.0001 | 0.0012 | 0.0009 | 0.0021 | 0.0037 | 0.001 | 0.0034 | 0.0146 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0019 | 0.0283 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0157 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0111 | 0.0178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0688 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.0 | 0.0 | 0.0 | 0.0119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0026 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.5562 | 0.0175 | 200 | 4.6970 | 0.0002 | 0.0004 | 0.0001 | 0.0 | 0.0002 | 0.0003 | 0.0003 | 0.0007 | 0.0034 | 0.0011 | 0.0044 | 0.0159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0016 | 0.0054 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.1231 | 0.0 | 0.0 | 0.0005 | 0.0107 | 0.0 | 0.0 | 0.0 | 0.0119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.181 | 0.0219 | 250 | 4.3474 | 0.0001 | 0.0003 | 0.0001 | 0.0 | 0.0002 | 0.0001 | 0.0002 | 0.0005 | 0.0035 | 0.0013 | 0.0045 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0068 | 0.0046 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.1353 | 0.0 | 0.0 | 0.0001 | 0.0124 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9616 | 0.0263 | 300 | 3.9479 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0001 | 0.0001 | 0.0001 | 0.0008 | 0.0042 | 0.0022 | 0.0051 | 0.0129 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0249 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.1518 | 0.0 | 0.0 | 0.0002 | 0.0156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7922 | 0.0307 | 350 | 3.7664 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0001 | 0.0001 | 0.0002 | 0.0009 | 0.0049 | 0.0017 | 0.0059 | 0.0182 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0243 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0013 | 0.1908 | 0.0 | 0.0 | 0.0008 | 0.0122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4752 | 0.0351 | 400 | 3.9790 | 0.0001 | 0.0004 | 0.0 | 0.0001 | 0.0002 | 0.0003 | 0.0004 | 0.0019 | 0.0075 | 0.0041 | 0.01 | 0.015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0025 | 0.1017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0018 | 0.2403 | 0.0 | 0.0 | 0.0006 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9056 | 0.0395 | 450 | 3.7476 | 0.0001 | 0.0003 | 0.0 | 0.0001 | 0.0001 | 0.0001 | 0.0004 | 0.0014 | 0.0065 | 0.0036 | 0.0078 | 0.0129 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0024 | 0.0568 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0019 | 0.2417 | 0.0 | 0.0 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0423 | 0.0438 | 500 | 3.5736 | 0.0001 | 0.0005 | 0.0 | 0.0001 | 0.0001 | 0.0001 | 0.0004 | 0.0018 | 0.0073 | 0.0043 | 0.0086 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0025 | 0.0584 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0028 | 0.2763 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3564 | 0.0482 | 550 | 3.4520 | 0.0003 | 0.0011 | 0.0001 | 0.0004 | 0.0004 | 0.0001 | 0.0007 | 0.0044 | 0.0116 | 0.0083 | 0.0139 | 0.0147 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0101 | 0.1876 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0039 | 0.3426 | 0.0 | 0.0 | 0.0007 | 0.0015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1953 | 0.0526 | 600 | 3.3816 | 0.0005 | 0.0016 | 0.0002 | 0.0006 | 0.0007 | 0.0001 | 0.001 | 0.0067 | 0.0145 | 0.0102 | 0.0185 | 0.0151 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0198 | 0.328 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0038 | 0.3381 | 0.0 | 0.0 | 0.0002 | 0.0004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5103 | 0.0570 | 650 | 3.3732 | 0.0007 | 0.0024 | 0.0003 | 0.0007 | 0.001 | 0.0001 | 0.0013 | 0.0073 | 0.0146 | 0.0104 | 0.0193 | 0.0198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0291 | 0.378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0035 | 0.2907 | 0.0 | 0.0 | 0.0005 | 0.0035 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0506 | 0.0614 | 700 | 3.3284 | 0.0007 | 0.0024 | 0.0003 | 0.0008 | 0.001 | 0.0001 | 0.0013 | 0.0081 | 0.0161 | 0.0111 | 0.0209 | 0.0164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0281 | 0.3659 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0053 | 0.3703 | 0.0 | 0.0 | 0.001 | 0.0038 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2105 | 0.0658 | 750 | 3.3047 | 0.0008 | 0.0027 | 0.0003 | 0.0008 | 0.0011 | 0.0001 | 0.0014 | 0.0082 | 0.0156 | 0.011 | 0.02 | 0.0169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0311 | 0.3509 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0049 | 0.3603 | 0.0 | 0.0 | 0.0007 | 0.0066 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6299 | 0.0701 | 800 | 3.2656 | 0.0011 | 0.0037 | 0.0004 | 0.0011 | 0.0015 | 0.0001 | 0.002 | 0.0091 | 0.0158 | 0.0102 | 0.0211 | 0.0262 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0447 | 0.3593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.006 | 0.3602 | 0.0 | 0.0 | 0.0011 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0512 | 0.0745 | 850 | 3.2161 | 0.0012 | 0.0037 | 0.0005 | 0.0011 | 0.0016 | 0.0002 | 0.0021 | 0.0105 | 0.0176 | 0.0134 | 0.023 | 0.0298 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.043 | 0.3998 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0073 | 0.3646 | 0.0 | 0.0 | 0.0034 | 0.0451 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4365 | 0.0789 | 900 | 3.1872 | 0.0013 | 0.0043 | 0.0006 | 0.0013 | 0.0018 | 0.0002 | 0.002 | 0.0108 | 0.018 | 0.0136 | 0.0234 | 0.0287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0504 | 0.3998 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.008 | 0.3953 | 0.0 | 0.0 | 0.0035 | 0.0324 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4587 | 0.0833 | 950 | 3.1776 | 0.0016 | 0.0049 | 0.0007 | 0.0013 | 0.0025 | 0.0002 | 0.0024 | 0.0111 | 0.0181 | 0.0129 | 0.0241 | 0.0283 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0556 | 0.3766 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0088 | 0.4052 | 0.0 | 0.0 | 0.008 | 0.052 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7678 | 0.0877 | 1000 | 3.1366 | 0.0017 | 0.005 | 0.0008 | 0.0013 | 0.0026 | 0.0005 | 0.0026 | 0.0124 | 0.0199 | 0.0156 | 0.0255 | 0.0334 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0553 | 0.4129 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0103 | 0.4162 | 0.0 | 0.0 | 0.0066 | 0.084 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9833 | 0.0921 | 1050 | 3.1290 | 0.002 | 0.0056 | 0.001 | 0.0016 | 0.0026 | 0.0005 | 0.003 | 0.0129 | 0.0197 | 0.0145 | 0.0265 | 0.0296 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0676 | 0.4114 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0101 | 0.4164 | 0.0 | 0.0 | 0.006 | 0.0781 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4028 | 0.0964 | 1100 | 3.0953 | 0.0019 | 0.0055 | 0.0008 | 0.0015 | 0.003 | 0.0005 | 0.0029 | 0.0135 | 0.0207 | 0.0158 | 0.0269 | 0.0338 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0051 | 0.0049 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0638 | 0.4308 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0112 | 0.424 | 0.0 | 0.0 | 0.0079 | 0.094 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9266 | 0.1008 | 1150 | 3.1042 | 0.0016 | 0.0046 | 0.0009 | 0.0012 | 0.0029 | 0.0006 | 0.0026 | 0.0128 | 0.0209 | 0.0163 | 0.0268 | 0.0316 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0056 | 0.0069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0487 | 0.4455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0119 | 0.4135 | 0.0 | 0.0 | 0.0082 | 0.0968 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.82 | 0.1052 | 1200 | 3.0522 | 0.002 | 0.0059 | 0.0009 | 0.0015 | 0.0029 | 0.0006 | 0.0031 | 0.0142 | 0.0209 | 0.0152 | 0.0277 | 0.0343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0082 | 0.0094 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0618 | 0.4117 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0131 | 0.4271 | 0.0 | 0.0 | 0.0089 | 0.112 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2205 | 0.1096 | 1250 | 3.0567 | 0.002 | 0.0059 | 0.0011 | 0.0014 | 0.003 | 0.0008 | 0.0037 | 0.0161 | 0.022 | 0.0157 | 0.0286 | 0.0336 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0115 | 0.028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0538 | 0.4408 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0191 | 0.3985 | 0.0 | 0.0 | 0.0097 | 0.1429 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.127 | 0.1140 | 1300 | 3.1491 | 0.0021 | 0.0059 | 0.0011 | 0.0018 | 0.0027 | 0.0007 | 0.004 | 0.0147 | 0.02 | 0.0127 | 0.0268 | 0.0367 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.005 | 0.0201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0583 | 0.4457 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0183 | 0.3663 | 0.0 | 0.0 | 0.0131 | 0.0899 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3857 | 0.1184 | 1350 | 3.0887 | 0.0022 | 0.0059 | 0.0013 | 0.0016 | 0.0032 | 0.0006 | 0.0041 | 0.0157 | 0.021 | 0.0147 | 0.0276 | 0.032 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0075 | 0.0222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0591 | 0.4646 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0178 | 0.3663 | 0.0 | 0.0 | 0.015 | 0.112 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4188 | 0.1227 | 1400 | 3.0303 | 0.0022 | 0.006 | 0.0011 | 0.0014 | 0.0031 | 0.001 | 0.0039 | 0.016 | 0.0214 | 0.0154 | 0.0283 | 0.0313 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0157 | 0.0268 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0533 | 0.4352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0178 | 0.3834 | 0.0 | 0.0 | 0.013 | 0.1371 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8893 | 0.1271 | 1450 | 2.9888 | 0.0025 | 0.0069 | 0.0014 | 0.002 | 0.0036 | 0.0008 | 0.0049 | 0.0178 | 0.0228 | 0.0172 | 0.0297 | 0.0353 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0058 | 0.0291 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0751 | 0.4528 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0205 | 0.406 | 0.0 | 0.0 | 0.0137 | 0.1628 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0498 | 0.1315 | 1500 | 2.9922 | 0.0026 | 0.0071 | 0.0014 | 0.002 | 0.0039 | 0.0007 | 0.005 | 0.0177 | 0.023 | 0.0163 | 0.0297 | 0.0364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0087 | 0.0443 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0776 | 0.4105 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0204 | 0.4382 | 0.0 | 0.0 | 0.0151 | 0.1627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.335 | 0.1359 | 1550 | 2.9809 | 0.0027 | 0.0073 | 0.0014 | 0.002 | 0.0038 | 0.001 | 0.0058 | 0.0189 | 0.0243 | 0.0174 | 0.03 | 0.0373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0123 | 0.086 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0778 | 0.4315 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0185 | 0.423 | 0.0 | 0.0 | 0.0137 | 0.1765 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1299 | 0.1403 | 1600 | 2.9605 | 0.0029 | 0.0079 | 0.0015 | 0.0022 | 0.0039 | 0.001 | 0.0065 | 0.0193 | 0.0245 | 0.0167 | 0.0297 | 0.0334 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0151 | 0.1219 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.087 | 0.4205 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0161 | 0.4142 | 0.0 | 0.0 | 0.0146 | 0.1696 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.189 | 0.1447 | 1650 | 2.9775 | 0.0023 | 0.007 | 0.0011 | 0.0015 | 0.003 | 0.0013 | 0.0065 | 0.0188 | 0.0241 | 0.0156 | 0.028 | 0.0343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0201 | 0.1593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0558 | 0.3799 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0157 | 0.4 | 0.0 | 0.0 | 0.0149 | 0.171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2569 | 0.1490 | 1700 | 2.9462 | 0.0028 | 0.0077 | 0.0015 | 0.002 | 0.0035 | 0.0012 | 0.0066 | 0.0191 | 0.0243 | 0.0166 | 0.029 | 0.0335 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0229 | 0.1343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0714 | 0.4231 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0149 | 0.3978 | 0.0001 | 0.0002 | 0.0207 | 0.1629 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2697 | 0.1534 | 1750 | 2.9275 | 0.0027 | 0.0075 | 0.0015 | 0.0019 | 0.0036 | 0.0014 | 0.0074 | 0.02 | 0.0255 | 0.0164 | 0.029 | 0.0387 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0023 | 0.0022 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0206 | 0.1738 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0635 | 0.4292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0195 | 0.4244 | 0.0 | 0.0 | 0.0195 | 0.1447 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8347 | 0.1578 | 1800 | 2.8975 | 0.0029 | 0.0077 | 0.0018 | 0.0021 | 0.0037 | 0.0017 | 0.0073 | 0.0208 | 0.027 | 0.0185 | 0.0303 | 0.0395 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0017 | 0.0038 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0252 | 0.1866 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.069 | 0.4611 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0163 | 0.423 | 0.0 | 0.0 | 0.0223 | 0.1674 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.336 | 0.1622 | 1850 | 2.8781 | 0.0031 | 0.0081 | 0.002 | 0.0021 | 0.0036 | 0.002 | 0.0084 | 0.0219 | 0.0279 | 0.0184 | 0.0311 | 0.0406 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0017 | 0.0061 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0313 | 0.2075 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0731 | 0.4601 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0203 | 0.4414 | 0.0001 | 0.0006 | 0.0181 | 0.1698 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1503 | 0.1666 | 1900 | 2.8516 | 0.0036 | 0.0091 | 0.0023 | 0.0023 | 0.0046 | 0.0022 | 0.0087 | 0.0234 | 0.0287 | 0.0186 | 0.03 | 0.0404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.015 | 0.0194 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0293 | 0.2441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0799 | 0.4445 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0191 | 0.4294 | 0.0 | 0.0015 | 0.0213 | 0.1803 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9179 | 0.1710 | 1950 | 2.8664 | 0.0032 | 0.0089 | 0.0018 | 0.0019 | 0.0037 | 0.0022 | 0.009 | 0.0229 | 0.0275 | 0.0154 | 0.0298 | 0.0416 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0104 | 0.0159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0279 | 0.2498 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0678 | 0.3701 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0242 | 0.4278 | 0.0001 | 0.0017 | 0.0188 | 0.2 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9207 | 0.1753 | 2000 | 2.8651 | 0.0037 | 0.0096 | 0.0023 | 0.0026 | 0.0042 | 0.0021 | 0.0094 | 0.0237 | 0.0293 | 0.018 | 0.0304 | 0.0415 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0025 | 0.0083 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0334 | 0.287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1 | 0.4208 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0201 | 0.4315 | 0.0 | 0.0 | 0.0149 | 0.2005 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7564 | 0.1797 | 2050 | 2.8356 | 0.0042 | 0.0103 | 0.0027 | 0.003 | 0.005 | 0.0027 | 0.0108 | 0.0254 | 0.0308 | 0.0169 | 0.0317 | 0.0429 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0114 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0412 | 0.3581 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1012 | 0.4047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0202 | 0.4474 | 0.004 | 0.0017 | 0.015 | 0.1903 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9197 | 0.1841 | 2100 | 2.8097 | 0.0041 | 0.0096 | 0.0028 | 0.0026 | 0.0046 | 0.0027 | 0.0105 | 0.0261 | 0.0319 | 0.0182 | 0.0328 | 0.0439 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0132 | 0.0204 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0452 | 0.3589 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0952 | 0.4425 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0179 | 0.4531 | 0.0001 | 0.0039 | 0.0164 | 0.1865 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4666 | 0.1885 | 2150 | 2.8181 | 0.0043 | 0.0099 | 0.003 | 0.0025 | 0.0044 | 0.0033 | 0.0108 | 0.0269 | 0.0323 | 0.0184 | 0.0314 | 0.0491 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0238 | 0.0404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0377 | 0.364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0985 | 0.4392 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0203 | 0.4459 | 0.0001 | 0.0022 | 0.0156 | 0.1941 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8845 | 0.1929 | 2200 | 2.8454 | 0.0038 | 0.0091 | 0.0026 | 0.0023 | 0.0044 | 0.0025 | 0.0109 | 0.0269 | 0.0319 | 0.0162 | 0.0316 | 0.0465 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0165 | 0.0414 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0358 | 0.4014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0869 | 0.4242 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0217 | 0.4278 | 0.0 | 0.0035 | 0.0148 | 0.1681 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.562 | 0.1973 | 2250 | 2.8099 | 0.0038 | 0.0094 | 0.0024 | 0.0022 | 0.0049 | 0.0026 | 0.0109 | 0.0269 | 0.032 | 0.0187 | 0.0309 | 0.0428 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0135 | 0.05 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0371 | 0.3659 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.079 | 0.4258 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0244 | 0.4242 | 0.0 | 0.0043 | 0.0189 | 0.2002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4295 | 0.2016 | 2300 | 2.8019 | 0.0043 | 0.0097 | 0.0033 | 0.0024 | 0.0047 | 0.0032 | 0.0116 | 0.0275 | 0.0326 | 0.0171 | 0.0329 | 0.0377 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0268 | 0.0771 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0432 | 0.3626 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0876 | 0.4517 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0223 | 0.4267 | 0.0 | 0.0045 | 0.0188 | 0.1776 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6915 | 0.2060 | 2350 | 2.7792 | 0.0044 | 0.0103 | 0.003 | 0.0024 | 0.0043 | 0.0036 | 0.0134 | 0.029 | 0.0341 | 0.0167 | 0.0331 | 0.0467 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0312 | 0.0952 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0452 | 0.4108 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0829 | 0.4292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0227 | 0.4323 | 0.0001 | 0.0046 | 0.0189 | 0.1954 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5396 | 0.2104 | 2400 | 2.7852 | 0.0046 | 0.0106 | 0.0033 | 0.0025 | 0.0048 | 0.0036 | 0.0138 | 0.0313 | 0.0359 | 0.0168 | 0.0326 | 0.0478 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0426 | 0.1583 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0423 | 0.4492 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0904 | 0.4539 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0202 | 0.407 | 0.0001 | 0.0069 | 0.018 | 0.1763 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9112 | 0.2148 | 2450 | 2.7817 | 0.0046 | 0.0113 | 0.0031 | 0.0024 | 0.0046 | 0.0035 | 0.0145 | 0.032 | 0.0364 | 0.0162 | 0.0321 | 0.0491 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0413 | 0.1809 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.044 | 0.4746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0798 | 0.4075 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0233 | 0.4133 | 0.0 | 0.0061 | 0.022 | 0.1929 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1891 | 0.2192 | 2500 | 2.7650 | 0.005 | 0.0117 | 0.0035 | 0.0029 | 0.0053 | 0.0035 | 0.0156 | 0.0354 | 0.0398 | 0.0175 | 0.0336 | 0.0556 | 0.0 | 0.0 | 0.0014 | 0.0021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0411 | 0.2449 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0379 | 0.5041 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1031 | 0.4429 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.022 | 0.4199 | 0.0001 | 0.0106 | 0.0225 | 0.2053 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2978 | 0.2236 | 2550 | 2.7209 | 0.0053 | 0.0122 | 0.0039 | 0.0029 | 0.0052 | 0.0042 | 0.0161 | 0.0355 | 0.0408 | 0.0197 | 0.0336 | 0.0553 | 0.0 | 0.0 | 0.0017 | 0.0038 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0428 | 0.2185 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0498 | 0.5516 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1044 | 0.4366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0214 | 0.4527 | 0.0001 | 0.0093 | 0.0215 | 0.2057 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0312 | 0.2280 | 2600 | 2.7076 | 0.0053 | 0.0123 | 0.0037 | 0.003 | 0.0053 | 0.0042 | 0.0168 | 0.0362 | 0.0409 | 0.019 | 0.0343 | 0.0552 | 0.0 | 0.0 | 0.001 | 0.0025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.044 | 0.2137 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0487 | 0.564 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1034 | 0.4483 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.024 | 0.4313 | 0.0 | 0.0058 | 0.0226 | 0.2176 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9525 | 0.2323 | 2650 | 2.7448 | 0.005 | 0.0116 | 0.0038 | 0.0029 | 0.0049 | 0.0039 | 0.0161 | 0.0362 | 0.0406 | 0.0178 | 0.0336 | 0.0594 | 0.0 | 0.0 | 0.0043 | 0.0072 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0422 | 0.2379 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0401 | 0.5549 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1004 | 0.4586 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0236 | 0.3983 | 0.0001 | 0.0095 | 0.0217 | 0.2002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3726 | 0.2367 | 2700 | 2.7169 | 0.0051 | 0.0115 | 0.0039 | 0.0024 | 0.0049 | 0.0047 | 0.0171 | 0.0366 | 0.041 | 0.0172 | 0.0335 | 0.0597 | 0.0 | 0.0 | 0.0014 | 0.0034 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0552 | 0.2497 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0461 | 0.573 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0879 | 0.4284 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0264 | 0.4167 | 0.0 | 0.0059 | 0.0197 | 0.2066 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4374 | 0.2411 | 2750 | 2.7227 | 0.0049 | 0.0115 | 0.0037 | 0.0024 | 0.0061 | 0.0043 | 0.017 | 0.0372 | 0.0416 | 0.0176 | 0.035 | 0.0594 | 0.0 | 0.0 | 0.0006 | 0.0015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0434 | 0.2532 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0517 | 0.5896 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0834 | 0.4256 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0246 | 0.4171 | 0.0001 | 0.0063 | 0.0236 | 0.221 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3248 | 0.2455 | 2800 | 2.6968 | 0.0053 | 0.0119 | 0.0039 | 0.0027 | 0.0053 | 0.0045 | 0.0182 | 0.0388 | 0.0434 | 0.0178 | 0.035 | 0.0611 | 0.0 | 0.0 | 0.001 | 0.0036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0455 | 0.2981 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0568 | 0.6079 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0934 | 0.4559 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0231 | 0.4169 | 0.0001 | 0.0078 | 0.0222 | 0.204 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9627 | 0.2499 | 2850 | 2.6914 | 0.0055 | 0.0122 | 0.0043 | 0.0027 | 0.005 | 0.0051 | 0.0179 | 0.0393 | 0.0439 | 0.0178 | 0.036 | 0.0545 | 0.0 | 0.0 | 0.009 | 0.0078 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0531 | 0.3029 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0543 | 0.6128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0939 | 0.4689 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.025 | 0.4234 | 0.0 | 0.0086 | 0.0199 | 0.1929 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5196 | 0.2543 | 2900 | 2.6557 | 0.0058 | 0.0128 | 0.0046 | 0.0028 | 0.006 | 0.0049 | 0.0189 | 0.04 | 0.0449 | 0.0199 | 0.0382 | 0.0508 | 0.0 | 0.0 | 0.0037 | 0.0101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0492 | 0.3175 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0619 | 0.5872 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1075 | 0.479 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0262 | 0.4453 | 0.0001 | 0.0138 | 0.0195 | 0.2128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6884 | 0.2586 | 2950 | 2.6729 | 0.0058 | 0.0129 | 0.0045 | 0.0029 | 0.0055 | 0.005 | 0.0186 | 0.0389 | 0.0434 | 0.019 | 0.0372 | 0.0468 | 0.0 | 0.0 | 0.0026 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0476 | 0.3408 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0682 | 0.5547 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0998 | 0.4673 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0226 | 0.4141 | 0.0001 | 0.0154 | 0.0259 | 0.198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5318 | 0.2630 | 3000 | 2.6668 | 0.006 | 0.0132 | 0.0048 | 0.0031 | 0.0059 | 0.0051 | 0.02 | 0.0415 | 0.0461 | 0.0189 | 0.0391 | 0.0548 | 0.0 | 0.0 | 0.0055 | 0.0141 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0542 | 0.3959 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0593 | 0.5821 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1066 | 0.4709 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0226 | 0.4293 | 0.0001 | 0.0162 | 0.029 | 0.2122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6744 | 0.2674 | 3050 | 2.6491 | 0.0063 | 0.0139 | 0.0051 | 0.0031 | 0.0062 | 0.0055 | 0.0203 | 0.042 | 0.0461 | 0.0195 | 0.0387 | 0.0503 | 0.0 | 0.0 | 0.0041 | 0.0234 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0644 | 0.3847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0596 | 0.5778 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1075 | 0.4797 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0262 | 0.4178 | 0.0 | 0.0143 | 0.0295 | 0.2225 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0012 | 0.2718 | 3100 | 2.6299 | 0.0064 | 0.0135 | 0.0051 | 0.0029 | 0.006 | 0.0058 | 0.0206 | 0.0425 | 0.047 | 0.02 | 0.0394 | 0.0553 | 0.0 | 0.0 | 0.0054 | 0.0286 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0699 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0625 | 0.5797 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1038 | 0.4871 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0259 | 0.4347 | 0.0 | 0.0147 | 0.0268 | 0.2162 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1837 | 0.2762 | 3150 | 2.6213 | 0.0067 | 0.0142 | 0.0058 | 0.003 | 0.0066 | 0.0062 | 0.0209 | 0.0435 | 0.0482 | 0.0215 | 0.0399 | 0.0624 | 0.0 | 0.0 | 0.0071 | 0.0356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0712 | 0.3904 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0686 | 0.6138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1119 | 0.4918 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0265 | 0.4374 | 0.0001 | 0.0162 | 0.025 | 0.2329 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9647 | 0.2806 | 3200 | 2.6316 | 0.0061 | 0.0134 | 0.0047 | 0.0028 | 0.0062 | 0.0056 | 0.0206 | 0.0424 | 0.0467 | 0.0182 | 0.0387 | 0.0545 | 0.0 | 0.0 | 0.0046 | 0.0337 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0657 | 0.4207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0621 | 0.5986 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0975 | 0.4579 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0249 | 0.4268 | 0.0 | 0.0121 | 0.0241 | 0.1992 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6275 | 0.2849 | 3250 | 2.6175 | 0.0064 | 0.0135 | 0.0053 | 0.0028 | 0.0069 | 0.0059 | 0.0219 | 0.0448 | 0.0493 | 0.0205 | 0.0412 | 0.0563 | 0.0 | 0.0 | 0.0051 | 0.0366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0643 | 0.4411 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0707 | 0.6293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1009 | 0.4827 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0262 | 0.4342 | 0.0 | 0.0154 | 0.0262 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2604 | 0.2893 | 3300 | 2.6128 | 0.0069 | 0.0142 | 0.006 | 0.0031 | 0.0081 | 0.0063 | 0.022 | 0.0453 | 0.0499 | 0.0202 | 0.0424 | 0.0657 | 0.0 | 0.0 | 0.0097 | 0.0627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0732 | 0.4268 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0696 | 0.6398 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1104 | 0.488 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0246 | 0.4383 | 0.0001 | 0.0158 | 0.0293 | 0.2249 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0361 | 0.2937 | 3350 | 2.5990 | 0.0074 | 0.0154 | 0.0065 | 0.003 | 0.0066 | 0.0073 | 0.0235 | 0.0452 | 0.0503 | 0.0208 | 0.0429 | 0.0564 | 0.0 | 0.0 | 0.0135 | 0.0827 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0882 | 0.4382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0817 | 0.6301 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1099 | 0.4832 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0226 | 0.4304 | 0.0001 | 0.0184 | 0.0262 | 0.2296 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4672 | 0.2981 | 3400 | 2.6148 | 0.0064 | 0.0137 | 0.0052 | 0.0029 | 0.0087 | 0.0055 | 0.0211 | 0.0458 | 0.05 | 0.0197 | 0.0432 | 0.0563 | 0.0 | 0.0 | 0.0108 | 0.0914 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.069 | 0.4232 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0519 | 0.6415 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1124 | 0.481 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0276 | 0.425 | 0.0001 | 0.0171 | 0.0205 | 0.221 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0081 | 0.3025 | 3450 | 2.6040 | 0.0065 | 0.0136 | 0.0055 | 0.0027 | 0.0081 | 0.0061 | 0.021 | 0.0465 | 0.0515 | 0.0213 | 0.0453 | 0.0581 | 0.0 | 0.0 | 0.0122 | 0.0869 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0688 | 0.4048 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0642 | 0.6809 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1034 | 0.4948 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0267 | 0.4558 | 0.0001 | 0.0171 | 0.0233 | 0.2309 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0952 | 0.3069 | 3500 | 2.5743 | 0.0068 | 0.0141 | 0.0058 | 0.0029 | 0.0085 | 0.0061 | 0.0233 | 0.0488 | 0.0537 | 0.0215 | 0.0473 | 0.0641 | 0.0 | 0.0 | 0.0149 | 0.0945 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0668 | 0.4742 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0682 | 0.6809 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1063 | 0.4974 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0284 | 0.4596 | 0.0001 | 0.018 | 0.0273 | 0.2457 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5016 | 0.3112 | 3550 | 2.5812 | 0.0069 | 0.0142 | 0.0059 | 0.0033 | 0.0078 | 0.0062 | 0.0242 | 0.049 | 0.0536 | 0.0205 | 0.0462 | 0.0707 | 0.0 | 0.0 | 0.0131 | 0.0848 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0719 | 0.4822 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0643 | 0.685 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1127 | 0.4992 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0315 | 0.4626 | 0.0001 | 0.0182 | 0.0261 | 0.235 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4596 | 0.3156 | 3600 | 2.5700 | 0.0073 | 0.0148 | 0.0064 | 0.003 | 0.0071 | 0.0069 | 0.0241 | 0.0492 | 0.0541 | 0.0209 | 0.0465 | 0.0706 | 0.0 | 0.0 | 0.0179 | 0.0884 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0731 | 0.4662 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.073 | 0.701 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.5056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.031 | 0.4626 | 0.0001 | 0.0216 | 0.0276 | 0.2429 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2083 | 0.3200 | 3650 | 2.5575 | 0.0075 | 0.015 | 0.0066 | 0.0033 | 0.0083 | 0.0072 | 0.0253 | 0.0499 | 0.0551 | 0.0212 | 0.047 | 0.0725 | 0.0 | 0.0 | 0.014 | 0.0893 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0844 | 0.4841 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0726 | 0.7067 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1144 | 0.5068 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0304 | 0.4744 | 0.0001 | 0.0216 | 0.028 | 0.254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2074 | 0.3244 | 3700 | 2.5718 | 0.0075 | 0.0151 | 0.0065 | 0.0035 | 0.0084 | 0.0065 | 0.0248 | 0.0503 | 0.055 | 0.0208 | 0.0482 | 0.07 | 0.0 | 0.0 | 0.0133 | 0.1141 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0727 | 0.5016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0718 | 0.6882 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.129 | 0.4864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0309 | 0.4681 | 0.0001 | 0.0195 | 0.027 | 0.2543 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8081 | 0.3288 | 3750 | 2.5592 | 0.0082 | 0.0165 | 0.0072 | 0.0039 | 0.0078 | 0.0076 | 0.025 | 0.0492 | 0.0539 | 0.0216 | 0.0472 | 0.0676 | 0.0 | 0.0 | 0.0163 | 0.1221 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0855 | 0.4443 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0809 | 0.686 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1291 | 0.4884 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0356 | 0.4576 | 0.0001 | 0.0214 | 0.028 | 0.2599 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4224 | 0.3332 | 3800 | 2.5625 | 0.0078 | 0.0159 | 0.0066 | 0.0033 | 0.0076 | 0.007 | 0.0251 | 0.0504 | 0.055 | 0.0215 | 0.0498 | 0.0692 | 0.0 | 0.0 | 0.0136 | 0.1162 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0854 | 0.4876 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.077 | 0.6866 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1173 | 0.5093 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0371 | 0.4568 | 0.0001 | 0.0247 | 0.0278 | 0.2466 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6959 | 0.3375 | 3850 | 2.5409 | 0.0077 | 0.0154 | 0.0068 | 0.0028 | 0.0077 | 0.0074 | 0.025 | 0.0506 | 0.0553 | 0.0221 | 0.0491 | 0.0658 | 0.0 | 0.0 | 0.0219 | 0.1128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0855 | 0.5099 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0776 | 0.6819 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.113 | 0.5157 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0354 | 0.4585 | 0.0001 | 0.0219 | 0.0211 | 0.2426 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7602 | 0.3419 | 3900 | 2.5424 | 0.008 | 0.016 | 0.007 | 0.0034 | 0.0085 | 0.0072 | 0.0257 | 0.051 | 0.0555 | 0.0219 | 0.0494 | 0.0668 | 0.0 | 0.0 | 0.0225 | 0.1257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0772 | 0.5121 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0761 | 0.6799 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1232 | 0.5036 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0325 | 0.4465 | 0.0001 | 0.0243 | 0.0259 | 0.255 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2808 | 0.3463 | 3950 | 2.5410 | 0.0077 | 0.0151 | 0.0069 | 0.003 | 0.0078 | 0.0072 | 0.0254 | 0.0507 | 0.0555 | 0.0212 | 0.05 | 0.0671 | 0.0 | 0.0 | 0.0228 | 0.1208 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0766 | 0.5201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0783 | 0.6799 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1105 | 0.5064 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0311 | 0.4609 | 0.0001 | 0.0232 | 0.025 | 0.2375 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8816 | 0.3507 | 4000 | 2.5517 | 0.0074 | 0.0148 | 0.0069 | 0.0033 | 0.0079 | 0.0067 | 0.0245 | 0.0505 | 0.0552 | 0.0203 | 0.0498 | 0.0694 | 0.0 | 0.0 | 0.0148 | 0.1112 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0745 | 0.5178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0774 | 0.6839 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1097 | 0.4925 | 0.0079 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0313 | 0.4563 | 0.0001 | 0.0247 | 0.0262 | 0.2496 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6022 | 0.3551 | 4050 | 2.5470 | 0.0078 | 0.0153 | 0.0071 | 0.0034 | 0.0085 | 0.0069 | 0.024 | 0.0506 | 0.0556 | 0.0216 | 0.0491 | 0.0646 | 0.0 | 0.0 | 0.015 | 0.14 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0805 | 0.5 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.076 | 0.6766 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1187 | 0.4984 | 0.0079 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.027 | 0.4652 | 0.0001 | 0.0216 | 0.0322 | 0.2511 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0763 | 0.3595 | 4100 | 2.5482 | 0.0073 | 0.0151 | 0.0063 | 0.003 | 0.0075 | 0.0066 | 0.0229 | 0.0484 | 0.0533 | 0.0204 | 0.0482 | 0.0619 | 0.0 | 0.0 | 0.0155 | 0.1528 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0714 | 0.4787 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0765 | 0.6362 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1124 | 0.4671 | 0.0079 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0268 | 0.4432 | 0.0001 | 0.0281 | 0.0253 | 0.24 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9504 | 0.3638 | 4150 | 2.5168 | 0.0083 | 0.0167 | 0.0074 | 0.0036 | 0.0081 | 0.0077 | 0.0254 | 0.0504 | 0.0553 | 0.0222 | 0.0481 | 0.0642 | 0.0 | 0.0 | 0.0169 | 0.1608 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0912 | 0.4745 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.08 | 0.6713 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1244 | 0.4818 | 0.0079 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0324 | 0.4685 | 0.0001 | 0.0262 | 0.0282 | 0.2578 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9656 | 0.3682 | 4200 | 2.5238 | 0.0076 | 0.0155 | 0.0065 | 0.0032 | 0.0082 | 0.0069 | 0.0251 | 0.0505 | 0.0555 | 0.0216 | 0.0509 | 0.0632 | 0.0 | 0.0 | 0.018 | 0.1501 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.074 | 0.4997 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0743 | 0.6638 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1124 | 0.4966 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0323 | 0.4676 | 0.0001 | 0.0247 | 0.0295 | 0.2448 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8733 | 0.3726 | 4250 | 2.5311 | 0.008 | 0.0159 | 0.0072 | 0.0033 | 0.0085 | 0.0072 | 0.0249 | 0.0495 | 0.0544 | 0.0225 | 0.0479 | 0.0615 | 0.0 | 0.0 | 0.0172 | 0.1459 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0809 | 0.4873 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0813 | 0.6553 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1168 | 0.4999 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.03 | 0.4475 | 0.0001 | 0.0255 | 0.0326 | 0.2351 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8777 | 0.3770 | 4300 | 2.5353 | 0.0074 | 0.0154 | 0.0062 | 0.0031 | 0.0082 | 0.007 | 0.0233 | 0.0482 | 0.0525 | 0.0208 | 0.0455 | 0.0605 | 0.0 | 0.0 | 0.0186 | 0.1493 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0714 | 0.4605 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0813 | 0.6384 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1126 | 0.4838 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0285 | 0.4292 | 0.0001 | 0.0266 | 0.0282 | 0.2271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.312 | 0.3814 | 4350 | 2.5155 | 0.0075 | 0.0155 | 0.0064 | 0.0034 | 0.0092 | 0.0067 | 0.024 | 0.0509 | 0.0551 | 0.0215 | 0.0495 | 0.0635 | 0.0 | 0.0 | 0.0207 | 0.1495 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0694 | 0.5089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0682 | 0.6717 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.116 | 0.491 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0377 | 0.448 | 0.0001 | 0.0234 | 0.0325 | 0.2426 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3045 | 0.3858 | 4400 | 2.5102 | 0.008 | 0.0161 | 0.0071 | 0.0035 | 0.0096 | 0.0072 | 0.0252 | 0.0518 | 0.0563 | 0.022 | 0.0501 | 0.0649 | 0.0 | 0.0 | 0.0215 | 0.1539 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0772 | 0.522 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0715 | 0.6789 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1178 | 0.5043 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0472 | 0.4614 | 0.0001 | 0.0296 | 0.0306 | 0.2376 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6469 | 0.3901 | 4450 | 2.4944 | 0.0084 | 0.0167 | 0.0078 | 0.0031 | 0.0101 | 0.0078 | 0.0254 | 0.0525 | 0.057 | 0.0219 | 0.0506 | 0.0663 | 0.0 | 0.0 | 0.0193 | 0.164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0853 | 0.5226 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0826 | 0.69 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1219 | 0.5092 | 0.0079 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0423 | 0.4549 | 0.0001 | 0.027 | 0.0257 | 0.2493 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1084 | 0.3945 | 4500 | 2.4853 | 0.0085 | 0.017 | 0.0075 | 0.0034 | 0.0101 | 0.0076 | 0.0259 | 0.0531 | 0.0578 | 0.0235 | 0.0518 | 0.0661 | 0.0 | 0.0 | 0.0199 | 0.1867 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0837 | 0.5131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0832 | 0.6923 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1273 | 0.5 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0391 | 0.4699 | 0.0001 | 0.0275 | 0.0294 | 0.2668 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.34 | 0.3989 | 4550 | 2.4771 | 0.0082 | 0.0167 | 0.0072 | 0.0032 | 0.0102 | 0.0072 | 0.0253 | 0.0529 | 0.0576 | 0.0224 | 0.0529 | 0.0665 | 0.0 | 0.0 | 0.0214 | 0.2069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0734 | 0.5172 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.081 | 0.674 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1245 | 0.4932 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0407 | 0.4679 | 0.0001 | 0.032 | 0.0294 | 0.2525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4942 | 0.4033 | 4600 | 2.4845 | 0.0085 | 0.0171 | 0.0076 | 0.0037 | 0.0095 | 0.0072 | 0.0254 | 0.0517 | 0.056 | 0.0219 | 0.0505 | 0.0645 | 0.0 | 0.0 | 0.0264 | 0.2067 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0754 | 0.5064 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0797 | 0.6435 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1355 | 0.4935 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0377 | 0.4451 | 0.0001 | 0.0322 | 0.0289 | 0.2425 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4839 | 0.4077 | 4650 | 2.4877 | 0.0088 | 0.0176 | 0.0081 | 0.0039 | 0.0103 | 0.0074 | 0.0256 | 0.0525 | 0.0569 | 0.0217 | 0.0532 | 0.0638 | 0.0 | 0.0 | 0.026 | 0.2137 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0826 | 0.514 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0792 | 0.6553 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1392 | 0.4893 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0384 | 0.4602 | 0.0001 | 0.0322 | 0.0316 | 0.2468 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5089 | 0.4121 | 4700 | 2.4807 | 0.0084 | 0.0172 | 0.0073 | 0.0034 | 0.0109 | 0.0073 | 0.0267 | 0.0531 | 0.057 | 0.0198 | 0.0528 | 0.066 | 0.0 | 0.0 | 0.0209 | 0.2181 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0797 | 0.5341 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0796 | 0.6705 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1231 | 0.4721 | 0.0079 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0448 | 0.449 | 0.0001 | 0.0333 | 0.0299 | 0.2403 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9228 | 0.4164 | 4750 | 2.4644 | 0.0088 | 0.0177 | 0.0079 | 0.0037 | 0.0113 | 0.0077 | 0.0272 | 0.0548 | 0.0588 | 0.0218 | 0.0539 | 0.0746 | 0.0 | 0.0 | 0.0242 | 0.2067 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0821 | 0.5576 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0819 | 0.685 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1319 | 0.5031 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0397 | 0.4564 | 0.0001 | 0.0286 | 0.0354 | 0.2616 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5246 | 0.4208 | 4800 | 2.4674 | 0.0088 | 0.0174 | 0.0079 | 0.0037 | 0.0105 | 0.0078 | 0.0271 | 0.0536 | 0.0574 | 0.0215 | 0.0524 | 0.066 | 0.0 | 0.0 | 0.0198 | 0.1825 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0838 | 0.5465 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.081 | 0.6722 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1314 | 0.507 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0479 | 0.4478 | 0.0001 | 0.0273 | 0.0306 | 0.2546 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6292 | 0.4252 | 4850 | 2.4685 | 0.0087 | 0.0171 | 0.0079 | 0.0039 | 0.0104 | 0.0078 | 0.0269 | 0.0545 | 0.0584 | 0.0217 | 0.0539 | 0.0682 | 0.0 | 0.0 | 0.0192 | 0.1945 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0795 | 0.55 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0814 | 0.6805 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1303 | 0.5147 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0487 | 0.4563 | 0.0001 | 0.0362 | 0.0329 | 0.2485 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0016 | 0.4296 | 4900 | 2.4470 | 0.0089 | 0.0175 | 0.0082 | 0.0037 | 0.0109 | 0.008 | 0.028 | 0.0553 | 0.0593 | 0.0223 | 0.0544 | 0.0685 | 0.0 | 0.0 | 0.0196 | 0.2017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0875 | 0.5752 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0826 | 0.6736 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1323 | 0.511 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.047 | 0.4662 | 0.0001 | 0.0375 | 0.0327 | 0.2576 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2245 | 0.4340 | 4950 | 2.4507 | 0.0093 | 0.0178 | 0.0084 | 0.0035 | 0.0104 | 0.0089 | 0.0289 | 0.0566 | 0.0606 | 0.0225 | 0.0557 | 0.0716 | 0.0 | 0.0 | 0.0212 | 0.2175 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1037 | 0.5949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0823 | 0.6874 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1295 | 0.4978 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0511 | 0.471 | 0.0001 | 0.0403 | 0.0304 | 0.273 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5848 | 0.4384 | 5000 | 2.4488 | 0.0092 | 0.0179 | 0.0085 | 0.0037 | 0.0101 | 0.0083 | 0.0284 | 0.056 | 0.0598 | 0.0224 | 0.0544 | 0.0688 | 0.0 | 0.0 | 0.0196 | 0.2164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0985 | 0.5818 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0841 | 0.6795 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1333 | 0.514 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0466 | 0.459 | 0.0001 | 0.0403 | 0.0325 | 0.2565 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7955 | 0.4427 | 5050 | 2.4566 | 0.0087 | 0.0169 | 0.0079 | 0.0037 | 0.0108 | 0.0078 | 0.0276 | 0.0558 | 0.0594 | 0.0218 | 0.0539 | 0.069 | 0.0 | 0.0 | 0.0207 | 0.2152 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0872 | 0.5828 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0749 | 0.6713 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.127 | 0.5104 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0519 | 0.4549 | 0.0001 | 0.0422 | 0.0289 | 0.2535 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6832 | 0.4471 | 5100 | 2.4586 | 0.0089 | 0.0172 | 0.0081 | 0.0037 | 0.0112 | 0.0081 | 0.0276 | 0.0558 | 0.0598 | 0.0226 | 0.0552 | 0.0698 | 0.0 | 0.0 | 0.0204 | 0.2101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0949 | 0.5834 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0754 | 0.6785 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1296 | 0.5118 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0493 | 0.4639 | 0.0002 | 0.0454 | 0.0289 | 0.2556 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5706 | 0.4515 | 5150 | 2.4522 | 0.0094 | 0.0179 | 0.0088 | 0.0038 | 0.0115 | 0.0086 | 0.0283 | 0.0566 | 0.0605 | 0.0228 | 0.0557 | 0.0705 | 0.0 | 0.0 | 0.023 | 0.2189 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1035 | 0.5901 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0814 | 0.6825 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1318 | 0.5198 | 0.0089 | 0.0065 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0527 | 0.4626 | 0.0001 | 0.0416 | 0.0295 | 0.2599 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2534 | 0.4559 | 5200 | 2.4336 | 0.0097 | 0.0182 | 0.0092 | 0.0038 | 0.0113 | 0.0088 | 0.0288 | 0.0562 | 0.06 | 0.0223 | 0.0561 | 0.0716 | 0.0 | 0.0 | 0.0225 | 0.2137 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1032 | 0.5844 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0912 | 0.6803 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1417 | 0.518 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0487 | 0.4608 | 0.0002 | 0.0422 | 0.0285 | 0.2577 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4737 | 0.4603 | 5250 | 2.4330 | 0.0097 | 0.0187 | 0.009 | 0.0039 | 0.0107 | 0.009 | 0.0288 | 0.0555 | 0.0595 | 0.0225 | 0.055 | 0.0727 | 0.0 | 0.0 | 0.0234 | 0.2162 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1033 | 0.5729 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0893 | 0.6709 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1418 | 0.5063 | 0.0099 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0422 | 0.4538 | 0.0002 | 0.0433 | 0.0347 | 0.2709 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2841 | 0.4647 | 5300 | 2.4288 | 0.0101 | 0.0192 | 0.0095 | 0.0041 | 0.0117 | 0.0095 | 0.0298 | 0.0559 | 0.06 | 0.0229 | 0.055 | 0.0729 | 0.0 | 0.0 | 0.0253 | 0.2265 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1083 | 0.5637 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0966 | 0.6864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1479 | 0.5068 | 0.0084 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0435 | 0.4686 | 0.0002 | 0.0485 | 0.0327 | 0.2566 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3675 | 0.4691 | 5350 | 2.4358 | 0.0098 | 0.0185 | 0.0094 | 0.0042 | 0.0115 | 0.009 | 0.0292 | 0.0567 | 0.0607 | 0.0223 | 0.0562 | 0.076 | 0.0 | 0.0 | 0.0218 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1008 | 0.5844 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0841 | 0.6789 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1538 | 0.5101 | 0.0113 | 0.0084 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0466 | 0.4699 | 0.0002 | 0.047 | 0.0316 | 0.2642 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7819 | 0.4734 | 5400 | 2.4224 | 0.0103 | 0.0193 | 0.0098 | 0.0041 | 0.0128 | 0.0099 | 0.0303 | 0.0574 | 0.0619 | 0.0231 | 0.0576 | 0.0757 | 0.0 | 0.0 | 0.0285 | 0.2621 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1086 | 0.5787 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0972 | 0.6943 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1524 | 0.499 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.045 | 0.4793 | 0.0002 | 0.0485 | 0.033 | 0.2824 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9625 | 0.4778 | 5450 | 2.4276 | 0.0102 | 0.0196 | 0.0094 | 0.0041 | 0.0119 | 0.0097 | 0.0299 | 0.0561 | 0.0604 | 0.0234 | 0.0557 | 0.0711 | 0.0 | 0.0 | 0.0284 | 0.2457 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1086 | 0.5455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0971 | 0.6874 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1503 | 0.4889 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0438 | 0.4737 | 0.0002 | 0.05 | 0.0341 | 0.2814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8536 | 0.4822 | 5500 | 2.4273 | 0.01 | 0.0193 | 0.0093 | 0.0041 | 0.0124 | 0.0092 | 0.03 | 0.0559 | 0.0599 | 0.0222 | 0.0566 | 0.0709 | 0.0 | 0.0 | 0.0276 | 0.2387 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1008 | 0.5592 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.095 | 0.6839 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1487 | 0.4946 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.046 | 0.459 | 0.0002 | 0.0463 | 0.0316 | 0.2682 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2057 | 0.4866 | 5550 | 2.4177 | 0.0103 | 0.0191 | 0.01 | 0.0042 | 0.013 | 0.0098 | 0.0312 | 0.0584 | 0.0628 | 0.0244 | 0.0584 | 0.0757 | 0.0 | 0.0 | 0.0274 | 0.2501 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1034 | 0.5777 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0948 | 0.7199 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1573 | 0.5148 | 0.0092 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0478 | 0.4795 | 0.0002 | 0.0452 | 0.0335 | 0.2963 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8074 | 0.4910 | 5600 | 2.4264 | 0.01 | 0.0192 | 0.0092 | 0.0042 | 0.0133 | 0.0096 | 0.0308 | 0.0576 | 0.0619 | 0.0221 | 0.0588 | 0.0846 | 0.0 | 0.0 | 0.0275 | 0.2528 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0989 | 0.5873 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0942 | 0.7171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1493 | 0.4939 | 0.0089 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0471 | 0.4634 | 0.0002 | 0.0487 | 0.0338 | 0.2793 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8717 | 0.4954 | 5650 | 2.4182 | 0.0102 | 0.0193 | 0.0095 | 0.0043 | 0.0124 | 0.0095 | 0.0311 | 0.0579 | 0.062 | 0.0226 | 0.0592 | 0.0771 | 0.0 | 0.0 | 0.0307 | 0.2699 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0963 | 0.5844 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0976 | 0.7016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1553 | 0.4987 | 0.0089 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0448 | 0.4681 | 0.0002 | 0.052 | 0.0335 | 0.2716 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8318 | 0.4997 | 5700 | 2.4191 | 0.0101 | 0.0193 | 0.0096 | 0.0043 | 0.0133 | 0.0094 | 0.0303 | 0.0583 | 0.0623 | 0.0222 | 0.0582 | 0.0774 | 0.0 | 0.0 | 0.0257 | 0.2587 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1103 | 0.6006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0847 | 0.7077 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1554 | 0.5077 | 0.0089 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0492 | 0.4665 | 0.0002 | 0.0474 | 0.0292 | 0.2723 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4833 | 0.5041 | 5750 | 2.4234 | 0.0102 | 0.0193 | 0.0098 | 0.0042 | 0.0133 | 0.0094 | 0.0297 | 0.0584 | 0.0624 | 0.0223 | 0.0574 | 0.0786 | 0.0 | 0.0 | 0.029 | 0.2773 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1111 | 0.6 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0854 | 0.7004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1599 | 0.5187 | 0.0079 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0457 | 0.4569 | 0.0002 | 0.0472 | 0.0318 | 0.2658 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1686 | 0.5085 | 5800 | 2.4080 | 0.0105 | 0.0197 | 0.0098 | 0.0043 | 0.0137 | 0.0101 | 0.0306 | 0.0585 | 0.0628 | 0.0233 | 0.0587 | 0.0786 | 0.0 | 0.0 | 0.0306 | 0.2749 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1115 | 0.6061 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0886 | 0.703 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1575 | 0.5115 | 0.0089 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0497 | 0.474 | 0.0001 | 0.0455 | 0.0343 | 0.27 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1837 | 0.5129 | 5850 | 2.4086 | 0.0105 | 0.0199 | 0.01 | 0.0044 | 0.0127 | 0.0102 | 0.0301 | 0.058 | 0.062 | 0.0233 | 0.0573 | 0.0791 | 0.0 | 0.0 | 0.0288 | 0.2752 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1071 | 0.585 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0924 | 0.6957 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.162 | 0.5129 | 0.0089 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0501 | 0.4553 | 0.0002 | 0.0506 | 0.0322 | 0.2709 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1888 | 0.5173 | 5900 | 2.4098 | 0.0108 | 0.0205 | 0.0103 | 0.0044 | 0.0138 | 0.0105 | 0.0306 | 0.0579 | 0.0619 | 0.0223 | 0.0588 | 0.0796 | 0.0 | 0.0 | 0.0319 | 0.2771 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1139 | 0.5946 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0952 | 0.6996 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1652 | 0.4996 | 0.0092 | 0.0056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0527 | 0.4529 | 0.0002 | 0.0526 | 0.0284 | 0.2638 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7038 | 0.5217 | 5950 | 2.4142 | 0.0105 | 0.02 | 0.0096 | 0.0045 | 0.0144 | 0.0097 | 0.03 | 0.0576 | 0.0611 | 0.0223 | 0.0578 | 0.0756 | 0.0 | 0.0 | 0.0289 | 0.2651 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1075 | 0.6134 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0855 | 0.6756 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1667 | 0.5049 | 0.0079 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0549 | 0.4403 | 0.0001 | 0.0489 | 0.0295 | 0.255 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0488 | 0.5260 | 6000 | 2.4054 | 0.0107 | 0.0205 | 0.01 | 0.0044 | 0.0137 | 0.0102 | 0.0311 | 0.0581 | 0.0616 | 0.0216 | 0.0576 | 0.0753 | 0.0 | 0.0 | 0.0303 | 0.2912 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1147 | 0.6166 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0861 | 0.6803 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1702 | 0.491 | 0.0091 | 0.0084 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0538 | 0.442 | 0.0001 | 0.0459 | 0.0295 | 0.2585 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.342 | 0.5304 | 6050 | 2.4073 | 0.011 | 0.0207 | 0.0106 | 0.005 | 0.0143 | 0.0106 | 0.0311 | 0.0589 | 0.0628 | 0.0228 | 0.0595 | 0.0818 | 0.0 | 0.0 | 0.0284 | 0.2771 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1212 | 0.621 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0913 | 0.7031 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1746 | 0.4893 | 0.0046 | 0.0079 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0534 | 0.4766 | 0.0001 | 0.0413 | 0.0322 | 0.2735 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9017 | 0.5348 | 6100 | 2.4063 | 0.0112 | 0.0211 | 0.0107 | 0.005 | 0.0134 | 0.0106 | 0.032 | 0.0584 | 0.0623 | 0.0228 | 0.0589 | 0.0857 | 0.0 | 0.0 | 0.0292 | 0.2771 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1198 | 0.6108 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0979 | 0.6978 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1751 | 0.4938 | 0.0088 | 0.0089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0511 | 0.4642 | 0.0002 | 0.0478 | 0.0311 | 0.2644 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.968 | 0.5392 | 6150 | 2.4001 | 0.0109 | 0.0207 | 0.0104 | 0.0048 | 0.0132 | 0.0106 | 0.031 | 0.0579 | 0.0618 | 0.0219 | 0.0602 | 0.08 | 0.0 | 0.0 | 0.0293 | 0.2739 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1123 | 0.6102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0989 | 0.6907 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.17 | 0.4812 | 0.0079 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0547 | 0.4659 | 0.0002 | 0.0481 | 0.0277 | 0.2672 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0871 | 0.5436 | 6200 | 2.3993 | 0.0108 | 0.0206 | 0.0102 | 0.0046 | 0.0133 | 0.0103 | 0.031 | 0.0577 | 0.0617 | 0.0221 | 0.0603 | 0.0775 | 0.0 | 0.0 | 0.0316 | 0.2832 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1095 | 0.6022 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1011 | 0.6904 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1674 | 0.4871 | 0.0082 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0484 | 0.4592 | 0.0002 | 0.0489 | 0.0283 | 0.2571 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4298 | 0.5480 | 6250 | 2.4031 | 0.0104 | 0.0202 | 0.0097 | 0.0047 | 0.0125 | 0.0095 | 0.0302 | 0.0573 | 0.0614 | 0.0216 | 0.0583 | 0.0745 | 0.0 | 0.0 | 0.0306 | 0.2893 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1027 | 0.615 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0916 | 0.6671 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1691 | 0.479 | 0.0079 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0455 | 0.4697 | 0.0002 | 0.0496 | 0.0317 | 0.2469 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4915 | 0.5523 | 6300 | 2.3843 | 0.0107 | 0.0204 | 0.0103 | 0.005 | 0.0144 | 0.0095 | 0.0306 | 0.0592 | 0.0631 | 0.023 | 0.0603 | 0.076 | 0.0 | 0.0 | 0.0295 | 0.2901 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1022 | 0.6194 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0881 | 0.6793 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1782 | 0.5013 | 0.0097 | 0.0084 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0489 | 0.4735 | 0.0002 | 0.0563 | 0.0359 | 0.2722 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0322 | 0.5567 | 6350 | 2.3705 | 0.0109 | 0.0209 | 0.0101 | 0.0048 | 0.0129 | 0.0106 | 0.0318 | 0.0599 | 0.0636 | 0.023 | 0.0614 | 0.0773 | 0.0 | 0.0 | 0.0292 | 0.2962 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1066 | 0.6223 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0987 | 0.699 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1685 | 0.5029 | 0.0083 | 0.0075 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0528 | 0.4662 | 0.0002 | 0.0539 | 0.0366 | 0.2768 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1661 | 0.5611 | 6400 | 2.3671 | 0.0114 | 0.0212 | 0.0109 | 0.0048 | 0.014 | 0.0115 | 0.0327 | 0.0604 | 0.0643 | 0.024 | 0.0609 | 0.077 | 0.0 | 0.0 | 0.0314 | 0.2897 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.114 | 0.6213 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1083 | 0.7175 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0023 | 0.0 | 0.0 | 0.0003 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1712 | 0.5178 | 0.0082 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0553 | 0.4702 | 0.0002 | 0.0539 | 0.0352 | 0.2764 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6323 | 0.5655 | 6450 | 2.3834 | 0.0111 | 0.0209 | 0.0103 | 0.0047 | 0.014 | 0.0113 | 0.0324 | 0.0604 | 0.0642 | 0.0238 | 0.0599 | 0.0792 | 0.0 | 0.0 | 0.0285 | 0.2836 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1109 | 0.6299 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.101 | 0.7234 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0015 | 0.0 | 0.0 | 0.0004 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.167 | 0.5123 | 0.0083 | 0.0051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0592 | 0.4676 | 0.0002 | 0.053 | 0.0339 | 0.275 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9815 | 0.5699 | 6500 | 2.3698 | 0.0109 | 0.0205 | 0.0104 | 0.0047 | 0.0146 | 0.0107 | 0.0324 | 0.0606 | 0.0644 | 0.0237 | 0.0599 | 0.0759 | 0.0 | 0.0 | 0.0324 | 0.2907 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0967 | 0.6312 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1025 | 0.7187 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1661 | 0.5286 | 0.0112 | 0.0117 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0546 | 0.4619 | 0.0001 | 0.048 | 0.0378 | 0.2698 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.7586 | 0.5743 | 6550 | 2.3746 | 0.0107 | 0.0203 | 0.0102 | 0.0046 | 0.0149 | 0.0108 | 0.0318 | 0.0604 | 0.0641 | 0.0234 | 0.0606 | 0.078 | 0.0 | 0.0 | 0.0322 | 0.2813 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1013 | 0.6153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1007 | 0.7301 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0062 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1589 | 0.5249 | 0.0082 | 0.0075 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.055 | 0.4655 | 0.0002 | 0.0474 | 0.0376 | 0.2721 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9074 | 0.5786 | 6600 | 2.3659 | 0.0111 | 0.0207 | 0.0107 | 0.0048 | 0.0135 | 0.0103 | 0.0324 | 0.0606 | 0.0645 | 0.0239 | 0.0597 | 0.0757 | 0.0 | 0.0 | 0.0315 | 0.2996 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1031 | 0.629 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1022 | 0.7142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1686 | 0.5279 | 0.0079 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.057 | 0.465 | 0.0001 | 0.0455 | 0.0383 | 0.2814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9045 | 0.5830 | 6650 | 2.3662 | 0.011 | 0.0204 | 0.0104 | 0.0049 | 0.0147 | 0.01 | 0.0321 | 0.061 | 0.0651 | 0.0236 | 0.0618 | 0.0768 | 0.0 | 0.0 | 0.0292 | 0.2981 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1008 | 0.6274 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0986 | 0.7293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1723 | 0.5241 | 0.0098 | 0.0093 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0571 | 0.4753 | 0.0002 | 0.048 | 0.0381 | 0.2743 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2403 | 0.5874 | 6700 | 2.3604 | 0.0105 | 0.02 | 0.0098 | 0.0048 | 0.0143 | 0.0098 | 0.0314 | 0.0603 | 0.064 | 0.0227 | 0.0618 | 0.0755 | 0.0 | 0.0 | 0.0288 | 0.2907 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0908 | 0.6299 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0948 | 0.7012 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1677 | 0.5225 | 0.009 | 0.0089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0587 | 0.464 | 0.0001 | 0.0489 | 0.0352 | 0.2708 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7937 | 0.5918 | 6750 | 2.3725 | 0.0101 | 0.0193 | 0.0091 | 0.0046 | 0.0144 | 0.0096 | 0.0295 | 0.0595 | 0.0633 | 0.0223 | 0.0609 | 0.0752 | 0.0 | 0.0 | 0.0273 | 0.2855 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0869 | 0.6306 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0866 | 0.6937 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1628 | 0.5151 | 0.0092 | 0.0084 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0554 | 0.459 | 0.0002 | 0.0511 | 0.0341 | 0.266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6128 | 0.5962 | 6800 | 2.3598 | 0.0104 | 0.02 | 0.0097 | 0.0045 | 0.0136 | 0.0104 | 0.0309 | 0.0593 | 0.0631 | 0.0215 | 0.0596 | 0.0764 | 0.0 | 0.0 | 0.0291 | 0.2884 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0984 | 0.6299 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0921 | 0.7069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0023 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1603 | 0.5021 | 0.0101 | 0.0098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0575 | 0.4578 | 0.0001 | 0.0485 | 0.032 | 0.2586 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3714 | 0.6006 | 6850 | 2.3492 | 0.0111 | 0.0208 | 0.0104 | 0.0046 | 0.0138 | 0.0113 | 0.0327 | 0.0613 | 0.0653 | 0.0235 | 0.0619 | 0.0782 | 0.0 | 0.0 | 0.031 | 0.2964 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1056 | 0.6363 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.098 | 0.7415 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1645 | 0.514 | 0.0093 | 0.0098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0643 | 0.4733 | 0.0002 | 0.0533 | 0.0384 | 0.2773 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3155 | 0.6049 | 6900 | 2.3474 | 0.0109 | 0.0207 | 0.0103 | 0.0047 | 0.0155 | 0.0107 | 0.0324 | 0.0611 | 0.0648 | 0.0232 | 0.0633 | 0.0765 | 0.0 | 0.0 | 0.0307 | 0.2971 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0967 | 0.6226 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0991 | 0.7384 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0015 | 0.0 | 0.0 | 0.0002 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1647 | 0.5181 | 0.0094 | 0.0098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0634 | 0.4636 | 0.0002 | 0.0554 | 0.0392 | 0.2703 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6329 | 0.6093 | 6950 | 2.3509 | 0.0113 | 0.0211 | 0.0109 | 0.0046 | 0.0152 | 0.0106 | 0.0325 | 0.0613 | 0.065 | 0.0231 | 0.064 | 0.0766 | 0.0 | 0.0 | 0.0305 | 0.2966 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1022 | 0.6226 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1019 | 0.7427 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0038 | 0.0 | 0.0 | 0.0003 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1647 | 0.5178 | 0.0182 | 0.0182 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0609 | 0.4721 | 0.0002 | 0.0528 | 0.0419 | 0.262 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9196 | 0.6137 | 7000 | 2.3432 | 0.0115 | 0.0212 | 0.0112 | 0.0046 | 0.0147 | 0.0109 | 0.0334 | 0.0618 | 0.0654 | 0.0243 | 0.063 | 0.0769 | 0.0 | 0.0 | 0.0312 | 0.2916 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1062 | 0.6166 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1046 | 0.7339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0085 | 0.0 | 0.0 | 0.0004 | 0.0028 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1681 | 0.5256 | 0.0182 | 0.021 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0595 | 0.4781 | 0.0002 | 0.0574 | 0.0388 | 0.2711 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8484 | 0.6181 | 7050 | 2.3584 | 0.0111 | 0.021 | 0.0107 | 0.0044 | 0.0144 | 0.0104 | 0.0321 | 0.0597 | 0.0632 | 0.0217 | 0.062 | 0.0753 | 0.0 | 0.0 | 0.0345 | 0.2956 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1051 | 0.622 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1044 | 0.7159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0062 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1599 | 0.5017 | 0.0143 | 0.0145 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0586 | 0.4592 | 0.0002 | 0.0526 | 0.0335 | 0.2399 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1078 | 0.6225 | 7100 | 2.3503 | 0.0111 | 0.0208 | 0.0107 | 0.0046 | 0.0139 | 0.0101 | 0.0319 | 0.0601 | 0.0637 | 0.0225 | 0.0618 | 0.075 | 0.0 | 0.0 | 0.0296 | 0.2924 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1037 | 0.6242 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1014 | 0.7132 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.01 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1644 | 0.5138 | 0.0152 | 0.0154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0614 | 0.4627 | 0.0002 | 0.0524 | 0.0352 | 0.2442 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1935 | 0.6269 | 7150 | 2.3596 | 0.0111 | 0.0207 | 0.0103 | 0.0047 | 0.0135 | 0.0104 | 0.0315 | 0.0591 | 0.0625 | 0.0232 | 0.0606 | 0.0779 | 0.0 | 0.0 | 0.029 | 0.2771 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0988 | 0.628 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1013 | 0.6858 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0108 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1622 | 0.513 | 0.0152 | 0.0154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0617 | 0.4551 | 0.0002 | 0.0532 | 0.04 | 0.2386 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1469 | 0.6312 | 7200 | 2.3577 | 0.011 | 0.0207 | 0.0107 | 0.0049 | 0.0139 | 0.0097 | 0.0309 | 0.0588 | 0.0621 | 0.0219 | 0.0599 | 0.077 | 0.0 | 0.0 | 0.0269 | 0.2718 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0965 | 0.6366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0977 | 0.6898 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0085 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1684 | 0.5091 | 0.0132 | 0.0126 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0603 | 0.4416 | 0.0002 | 0.0515 | 0.0403 | 0.2342 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8813 | 0.6356 | 7250 | 2.3379 | 0.0115 | 0.0216 | 0.0109 | 0.0048 | 0.0143 | 0.0108 | 0.0326 | 0.0613 | 0.0645 | 0.0224 | 0.0616 | 0.0791 | 0.0 | 0.0 | 0.0306 | 0.2979 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1043 | 0.6449 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0988 | 0.7238 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0077 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1728 | 0.504 | 0.0178 | 0.0192 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0648 | 0.464 | 0.0001 | 0.0483 | 0.0374 | 0.259 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1135 | 0.6400 | 7300 | 2.3347 | 0.0119 | 0.0223 | 0.0115 | 0.0049 | 0.0151 | 0.0112 | 0.0332 | 0.0616 | 0.065 | 0.0232 | 0.0636 | 0.0787 | 0.0 | 0.0 | 0.036 | 0.3029 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1128 | 0.6357 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1008 | 0.7278 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0062 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1783 | 0.5154 | 0.0159 | 0.0178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0642 | 0.4727 | 0.0002 | 0.0498 | 0.041 | 0.262 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1482 | 0.6444 | 7350 | 2.3241 | 0.0121 | 0.0225 | 0.0117 | 0.005 | 0.0159 | 0.0117 | 0.0335 | 0.0618 | 0.0655 | 0.0241 | 0.0645 | 0.0783 | 0.0 | 0.0 | 0.0334 | 0.2958 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1216 | 0.6296 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1059 | 0.7396 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0108 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1821 | 0.5192 | 0.0126 | 0.0159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0625 | 0.4755 | 0.0002 | 0.0546 | 0.0394 | 0.2698 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8763 | 0.6488 | 7400 | 2.3245 | 0.012 | 0.0227 | 0.0115 | 0.0048 | 0.0153 | 0.0117 | 0.0337 | 0.0611 | 0.0646 | 0.0234 | 0.0628 | 0.0762 | 0.0 | 0.0 | 0.0339 | 0.2899 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1217 | 0.6156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1059 | 0.7425 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.178 | 0.5158 | 0.0087 | 0.0098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0681 | 0.4732 | 0.0002 | 0.0545 | 0.0339 | 0.2575 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.6302 | 0.6532 | 7450 | 2.3228 | 0.0121 | 0.0225 | 0.0116 | 0.0051 | 0.0153 | 0.0117 | 0.0335 | 0.0612 | 0.0643 | 0.023 | 0.0615 | 0.077 | 0.0 | 0.0 | 0.0337 | 0.2956 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1221 | 0.6287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1054 | 0.7331 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 | 0.0192 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1803 | 0.5185 | 0.0089 | 0.0093 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0688 | 0.4658 | 0.0001 | 0.0478 | 0.035 | 0.2395 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6397 | 0.6575 | 7500 | 2.3219 | 0.0122 | 0.0226 | 0.0117 | 0.005 | 0.0155 | 0.0123 | 0.0338 | 0.0616 | 0.0649 | 0.0236 | 0.0615 | 0.0783 | 0.0 | 0.0 | 0.0336 | 0.2971 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.117 | 0.6293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1107 | 0.7348 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0185 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1803 | 0.5214 | 0.0109 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0704 | 0.4719 | 0.0001 | 0.0498 | 0.0387 | 0.2516 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0553 | 0.6619 | 7550 | 2.3254 | 0.0123 | 0.0229 | 0.0119 | 0.0051 | 0.0153 | 0.0121 | 0.0341 | 0.0614 | 0.0648 | 0.0235 | 0.0633 | 0.0782 | 0.0 | 0.0 | 0.0339 | 0.3166 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1139 | 0.6261 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1117 | 0.7152 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0177 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.185 | 0.5185 | 0.0135 | 0.0159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0676 | 0.4705 | 0.0002 | 0.0515 | 0.0392 | 0.2497 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5076 | 0.6663 | 7600 | 2.3156 | 0.012 | 0.0225 | 0.0116 | 0.005 | 0.0151 | 0.0114 | 0.034 | 0.0615 | 0.0647 | 0.0234 | 0.0616 | 0.0783 | 0.0 | 0.0 | 0.0296 | 0.3042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1175 | 0.6366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1059 | 0.7222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1781 | 0.5192 | 0.0085 | 0.0154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0725 | 0.4644 | 0.0001 | 0.0504 | 0.0405 | 0.2517 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2284 | 0.6707 | 7650 | 2.3124 | 0.0121 | 0.0229 | 0.0114 | 0.005 | 0.0157 | 0.0115 | 0.0343 | 0.0618 | 0.065 | 0.0234 | 0.0629 | 0.0779 | 0.0 | 0.0 | 0.03 | 0.3133 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1154 | 0.6328 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1061 | 0.7201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.18 | 0.5262 | 0.0121 | 0.0168 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0737 | 0.4635 | 0.0002 | 0.0487 | 0.0372 | 0.2535 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9091 | 0.6751 | 7700 | 2.3099 | 0.0124 | 0.0231 | 0.012 | 0.0051 | 0.0168 | 0.0118 | 0.0348 | 0.0627 | 0.0661 | 0.0238 | 0.0652 | 0.0819 | 0.0 | 0.0 | 0.0366 | 0.3234 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1194 | 0.6401 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.108 | 0.7333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0177 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1826 | 0.5278 | 0.0106 | 0.0145 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0743 | 0.4674 | 0.0002 | 0.0517 | 0.0407 | 0.2642 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.77 | 0.6795 | 7750 | 2.3025 | 0.0125 | 0.0232 | 0.012 | 0.0052 | 0.017 | 0.0118 | 0.0351 | 0.063 | 0.0663 | 0.024 | 0.0661 | 0.0802 | 0.0 | 0.0 | 0.037 | 0.3263 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.118 | 0.6401 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1105 | 0.7321 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0185 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1837 | 0.5304 | 0.0104 | 0.0126 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0727 | 0.469 | 0.0002 | 0.0558 | 0.0399 | 0.2638 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.6237 | 0.6839 | 7800 | 2.3045 | 0.0124 | 0.0232 | 0.0118 | 0.0051 | 0.0165 | 0.0118 | 0.0347 | 0.0632 | 0.0667 | 0.0244 | 0.0658 | 0.078 | 0.0 | 0.0 | 0.0358 | 0.3276 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.122 | 0.6411 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1085 | 0.7431 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.01 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1812 | 0.5303 | 0.0121 | 0.0126 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0696 | 0.4714 | 0.0002 | 0.0571 | 0.0404 | 0.2753 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7165 | 0.6882 | 7850 | 2.3041 | 0.0125 | 0.0232 | 0.012 | 0.0051 | 0.0167 | 0.0121 | 0.035 | 0.063 | 0.0665 | 0.0249 | 0.0644 | 0.0773 | 0.0 | 0.0 | 0.0333 | 0.3244 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1243 | 0.6395 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1121 | 0.735 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1805 | 0.5256 | 0.0104 | 0.0126 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0705 | 0.4737 | 0.0002 | 0.0548 | 0.0422 | 0.281 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4805 | 0.6926 | 7900 | 2.2965 | 0.0126 | 0.0234 | 0.0121 | 0.0052 | 0.0163 | 0.0122 | 0.0351 | 0.0631 | 0.0664 | 0.0247 | 0.0646 | 0.0775 | 0.0 | 0.0 | 0.0327 | 0.3248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1233 | 0.6382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.114 | 0.7348 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0014 | 0.02 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1843 | 0.5245 | 0.0124 | 0.0117 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0711 | 0.4696 | 0.0002 | 0.0556 | 0.0403 | 0.2751 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3697 | 0.6970 | 7950 | 2.3020 | 0.0125 | 0.0232 | 0.012 | 0.0053 | 0.0158 | 0.0118 | 0.0347 | 0.0625 | 0.0658 | 0.024 | 0.0641 | 0.0772 | 0.0 | 0.0 | 0.0363 | 0.3223 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1176 | 0.6325 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1118 | 0.7207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0024 | 0.0277 | 0.0 | 0.0 | 0.0001 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.186 | 0.5233 | 0.0136 | 0.015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0671 | 0.4705 | 0.0002 | 0.0541 | 0.038 | 0.258 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4367 | 0.7014 | 8000 | 2.2973 | 0.0123 | 0.0237 | 0.0113 | 0.0055 | 0.0167 | 0.0117 | 0.0345 | 0.0624 | 0.0656 | 0.0241 | 0.0636 | 0.0776 | 0.0 | 0.0 | 0.0362 | 0.3267 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1119 | 0.6341 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1093 | 0.7138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.004 | 0.0292 | 0.0 | 0.0 | 0.0001 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1838 | 0.5123 | 0.0124 | 0.014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0702 | 0.4667 | 0.0002 | 0.0558 | 0.0398 | 0.2661 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8973 | 0.7058 | 8050 | 2.2975 | 0.012 | 0.0233 | 0.0111 | 0.0055 | 0.0172 | 0.0108 | 0.0342 | 0.0629 | 0.0662 | 0.0239 | 0.0661 | 0.078 | 0.0 | 0.0 | 0.0309 | 0.3366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1076 | 0.6449 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.103 | 0.7138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0039 | 0.0223 | 0.0 | 0.0 | 0.0001 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.183 | 0.5104 | 0.0118 | 0.0154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.072 | 0.4738 | 0.0002 | 0.0584 | 0.0392 | 0.2674 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3595 | 0.7102 | 8100 | 2.2966 | 0.0123 | 0.0233 | 0.0118 | 0.0056 | 0.0178 | 0.0111 | 0.0346 | 0.0635 | 0.0668 | 0.0242 | 0.0665 | 0.0806 | 0.0 | 0.0 | 0.0316 | 0.3385 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1091 | 0.6567 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1056 | 0.7089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0045 | 0.0338 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1904 | 0.5164 | 0.0135 | 0.0154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0069 | 0.0052 | 0.0 | 0.0 | 0.0675 | 0.4757 | 0.0002 | 0.0597 | 0.0387 | 0.2618 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9927 | 0.7145 | 8150 | 2.2890 | 0.0127 | 0.0237 | 0.0121 | 0.006 | 0.0159 | 0.012 | 0.0354 | 0.0639 | 0.0673 | 0.0253 | 0.0659 | 0.0797 | 0.0 | 0.0 | 0.0341 | 0.3314 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1176 | 0.6436 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1097 | 0.7295 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0043 | 0.0385 | 0.0 | 0.0 | 0.0005 | 0.0064 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1955 | 0.5171 | 0.0129 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0695 | 0.4793 | 0.0002 | 0.0615 | 0.0408 | 0.2768 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0569 | 0.7189 | 8200 | 2.2972 | 0.0123 | 0.0236 | 0.0116 | 0.0057 | 0.0163 | 0.0113 | 0.0345 | 0.063 | 0.0664 | 0.0242 | 0.066 | 0.0784 | 0.0 | 0.0 | 0.0329 | 0.3324 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1116 | 0.6462 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1102 | 0.7191 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0033 | 0.0331 | 0.0 | 0.0 | 0.0003 | 0.0055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1863 | 0.5059 | 0.0119 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0697 | 0.4714 | 0.0002 | 0.0597 | 0.0388 | 0.2665 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3867 | 0.7233 | 8250 | 2.2923 | 0.0127 | 0.0239 | 0.0119 | 0.006 | 0.0166 | 0.0117 | 0.0345 | 0.0637 | 0.067 | 0.0246 | 0.067 | 0.0793 | 0.0 | 0.0 | 0.0333 | 0.3364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.118 | 0.6529 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1095 | 0.7163 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0033 | 0.0331 | 0.0 | 0.0 | 0.0003 | 0.0055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1926 | 0.5158 | 0.0123 | 0.0136 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.073 | 0.4783 | 0.0002 | 0.063 | 0.0406 | 0.2666 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6549 | 0.7277 | 8300 | 2.2972 | 0.0126 | 0.0238 | 0.0119 | 0.0059 | 0.0164 | 0.0116 | 0.034 | 0.0626 | 0.0657 | 0.0237 | 0.0652 | 0.0773 | 0.0 | 0.0 | 0.0337 | 0.3301 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1155 | 0.6398 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1106 | 0.7063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0029 | 0.03 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1912 | 0.5175 | 0.0128 | 0.0131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0746 | 0.4685 | 0.0002 | 0.0595 | 0.0393 | 0.2574 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2556 | 0.7321 | 8350 | 2.2949 | 0.0127 | 0.024 | 0.012 | 0.0061 | 0.0168 | 0.0113 | 0.0342 | 0.063 | 0.0662 | 0.0246 | 0.0656 | 0.078 | 0.0 | 0.0 | 0.0336 | 0.3375 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1142 | 0.6411 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1079 | 0.6998 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0041 | 0.03 | 0.0 | 0.0 | 0.0004 | 0.0055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1971 | 0.5209 | 0.0146 | 0.0164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0713 | 0.473 | 0.0002 | 0.0599 | 0.0396 | 0.2615 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4421 | 0.7365 | 8400 | 2.2933 | 0.0126 | 0.0238 | 0.0119 | 0.0059 | 0.0172 | 0.0115 | 0.0342 | 0.0633 | 0.0664 | 0.0246 | 0.066 | 0.0775 | 0.0 | 0.0 | 0.0323 | 0.3385 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1161 | 0.6484 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1073 | 0.7016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0042 | 0.0331 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1957 | 0.5213 | 0.013 | 0.0136 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0719 | 0.4724 | 0.0002 | 0.0593 | 0.0401 | 0.2672 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2309 | 0.7408 | 8450 | 2.2843 | 0.0128 | 0.024 | 0.012 | 0.0058 | 0.0173 | 0.012 | 0.0342 | 0.064 | 0.0672 | 0.0249 | 0.0659 | 0.0797 | 0.0 | 0.0 | 0.0334 | 0.3417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1228 | 0.6522 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1071 | 0.7167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0041 | 0.0331 | 0.0 | 0.0 | 0.0001 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1941 | 0.5219 | 0.0115 | 0.014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0753 | 0.4758 | 0.0002 | 0.0599 | 0.0409 | 0.2731 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2695 | 0.7452 | 8500 | 2.2861 | 0.0128 | 0.0243 | 0.0122 | 0.006 | 0.0169 | 0.012 | 0.0348 | 0.0639 | 0.067 | 0.0243 | 0.0665 | 0.0794 | 0.0 | 0.0 | 0.0333 | 0.3427 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1201 | 0.6548 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.109 | 0.7161 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0048 | 0.0362 | 0.0 | 0.0 | 0.0003 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1929 | 0.5121 | 0.0142 | 0.0173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0741 | 0.4731 | 0.0002 | 0.0602 | 0.0409 | 0.267 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1015 | 0.7496 | 8550 | 2.2875 | 0.013 | 0.0245 | 0.0126 | 0.006 | 0.0168 | 0.0119 | 0.0353 | 0.0641 | 0.0671 | 0.0247 | 0.0669 | 0.0782 | 0.0 | 0.0 | 0.0353 | 0.3417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1253 | 0.6551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1105 | 0.7096 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0048 | 0.05 | 0.0 | 0.0 | 0.0004 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1967 | 0.5134 | 0.0137 | 0.0164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0722 | 0.4744 | 0.0002 | 0.0589 | 0.0397 | 0.2628 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8313 | 0.7540 | 8600 | 2.2866 | 0.013 | 0.0243 | 0.0124 | 0.0061 | 0.017 | 0.0119 | 0.0348 | 0.0639 | 0.0671 | 0.0246 | 0.0668 | 0.0802 | 0.0 | 0.0 | 0.0341 | 0.3438 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1243 | 0.657 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1087 | 0.7033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0051 | 0.0531 | 0.0 | 0.0 | 0.0004 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1985 | 0.5121 | 0.0143 | 0.0159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0723 | 0.4736 | 0.0002 | 0.0593 | 0.0389 | 0.2635 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6549 | 0.7584 | 8650 | 2.2904 | 0.0128 | 0.024 | 0.0122 | 0.006 | 0.0167 | 0.0115 | 0.0342 | 0.063 | 0.066 | 0.024 | 0.0658 | 0.077 | 0.0 | 0.0 | 0.0347 | 0.3331 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1184 | 0.6468 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1089 | 0.6992 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0042 | 0.0515 | 0.0 | 0.0 | 0.0006 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1945 | 0.5083 | 0.0137 | 0.015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0708 | 0.465 | 0.0002 | 0.0548 | 0.0415 | 0.257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1444 | 0.7628 | 8700 | 2.2815 | 0.0133 | 0.0248 | 0.0129 | 0.0063 | 0.0181 | 0.0119 | 0.0361 | 0.0648 | 0.0678 | 0.0252 | 0.0681 | 0.0786 | 0.0 | 0.0 | 0.0344 | 0.3436 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.126 | 0.6551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1155 | 0.7191 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0061 | 0.0623 | 0.0 | 0.0 | 0.0006 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1968 | 0.5119 | 0.0128 | 0.0168 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0731 | 0.4747 | 0.0002 | 0.061 | 0.0407 | 0.2655 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2141 | 0.7671 | 8750 | 2.2809 | 0.0134 | 0.0249 | 0.0129 | 0.0061 | 0.0178 | 0.0122 | 0.0358 | 0.0642 | 0.0672 | 0.0248 | 0.0675 | 0.0786 | 0.0 | 0.0 | 0.034 | 0.3406 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1275 | 0.65 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1184 | 0.7161 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0055 | 0.0531 | 0.0 | 0.0 | 0.0007 | 0.0055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1945 | 0.5073 | 0.0126 | 0.0159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0739 | 0.474 | 0.0002 | 0.0569 | 0.0416 | 0.266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.544 | 0.7715 | 8800 | 2.2786 | 0.0135 | 0.0255 | 0.0128 | 0.0061 | 0.0181 | 0.0123 | 0.036 | 0.0648 | 0.0678 | 0.0254 | 0.0675 | 0.0773 | 0.0 | 0.0 | 0.0357 | 0.3417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1278 | 0.6516 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1173 | 0.7209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.007 | 0.0692 | 0.0 | 0.0 | 0.001 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1967 | 0.5086 | 0.0134 | 0.0164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0743 | 0.4749 | 0.0001 | 0.0552 | 0.039 | 0.2677 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5715 | 0.7759 | 8850 | 2.2770 | 0.0135 | 0.0251 | 0.0127 | 0.0061 | 0.0182 | 0.0123 | 0.0366 | 0.0651 | 0.0682 | 0.0259 | 0.0683 | 0.0782 | 0.0 | 0.0 | 0.035 | 0.344 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.128 | 0.6513 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1163 | 0.7222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.0738 | 0.0 | 0.0 | 0.0007 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1976 | 0.5111 | 0.0126 | 0.0159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0742 | 0.4798 | 0.0002 | 0.0576 | 0.039 | 0.2699 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2387 | 0.7803 | 8900 | 2.2790 | 0.0135 | 0.0255 | 0.0126 | 0.0062 | 0.0181 | 0.0125 | 0.0365 | 0.0649 | 0.068 | 0.0261 | 0.0678 | 0.0798 | 0.0 | 0.0 | 0.0359 | 0.34 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1287 | 0.6471 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1152 | 0.7248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.0815 | 0.0 | 0.0 | 0.0008 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1965 | 0.5105 | 0.0123 | 0.015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0099 | 0.0074 | 0.0 | 0.0 | 0.0728 | 0.4758 | 0.0002 | 0.0599 | 0.0396 | 0.2624 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5896 | 0.7847 | 8950 | 2.2772 | 0.0136 | 0.0256 | 0.013 | 0.0063 | 0.0175 | 0.0129 | 0.0368 | 0.0651 | 0.0683 | 0.0264 | 0.0686 | 0.078 | 0.0 | 0.0 | 0.0372 | 0.3373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1335 | 0.6417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1158 | 0.7313 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.0862 | 0.0 | 0.0 | 0.0008 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1972 | 0.5111 | 0.0121 | 0.0145 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0743 | 0.4814 | 0.0002 | 0.0595 | 0.0385 | 0.2677 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0685 | 0.7891 | 9000 | 2.2833 | 0.0135 | 0.0254 | 0.0127 | 0.0059 | 0.0173 | 0.0131 | 0.037 | 0.0646 | 0.0678 | 0.026 | 0.0685 | 0.077 | 0.0 | 0.0 | 0.0368 | 0.3326 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1378 | 0.6414 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.118 | 0.7201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0067 | 0.0877 | 0.0 | 0.0 | 0.0008 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1925 | 0.5052 | 0.0118 | 0.014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.07 | 0.4761 | 0.0002 | 0.0597 | 0.0397 | 0.2693 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.298 | 0.7934 | 9050 | 2.2835 | 0.0137 | 0.0255 | 0.0128 | 0.0061 | 0.017 | 0.013 | 0.037 | 0.0643 | 0.0674 | 0.0261 | 0.0672 | 0.0789 | 0.0 | 0.0 | 0.0375 | 0.3307 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1328 | 0.6331 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1187 | 0.714 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0075 | 0.0923 | 0.0 | 0.0 | 0.0008 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1948 | 0.5076 | 0.0127 | 0.014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0714 | 0.4758 | 0.0001 | 0.0586 | 0.0443 | 0.2648 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3469 | 0.7978 | 9100 | 2.2834 | 0.0137 | 0.0257 | 0.0132 | 0.0062 | 0.0173 | 0.0128 | 0.0368 | 0.0645 | 0.0675 | 0.0259 | 0.0672 | 0.0767 | 0.0 | 0.0 | 0.0373 | 0.3276 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1327 | 0.6427 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1169 | 0.7156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.0923 | 0.0 | 0.0 | 0.001 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1982 | 0.5096 | 0.0133 | 0.0145 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0741 | 0.4725 | 0.0001 | 0.0554 | 0.0421 | 0.2622 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1837 | 0.8022 | 9150 | 2.2845 | 0.0137 | 0.0259 | 0.0129 | 0.0063 | 0.018 | 0.0127 | 0.0365 | 0.0644 | 0.0674 | 0.0254 | 0.0668 | 0.0772 | 0.0 | 0.0 | 0.0374 | 0.3339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1332 | 0.643 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1163 | 0.7211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0087 | 0.09 | 0.0 | 0.0 | 0.0012 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.196 | 0.5033 | 0.0124 | 0.014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0753 | 0.4694 | 0.0001 | 0.058 | 0.0406 | 0.257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8285 | 0.8066 | 9200 | 2.2777 | 0.0139 | 0.026 | 0.0131 | 0.0064 | 0.0182 | 0.013 | 0.0368 | 0.0647 | 0.0679 | 0.0263 | 0.0675 | 0.0771 | 0.0 | 0.0 | 0.037 | 0.3297 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1355 | 0.6408 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1174 | 0.7222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0087 | 0.0931 | 0.0 | 0.0 | 0.001 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2 | 0.5107 | 0.0112 | 0.014 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0089 | 0.0067 | 0.0 | 0.0 | 0.0775 | 0.4758 | 0.0002 | 0.0599 | 0.041 | 0.2656 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0853 | 0.8110 | 9250 | 2.2759 | 0.0138 | 0.026 | 0.0132 | 0.0062 | 0.0181 | 0.013 | 0.0367 | 0.0649 | 0.0679 | 0.026 | 0.0677 | 0.0775 | 0.0 | 0.0 | 0.0382 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.133 | 0.6452 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1177 | 0.7232 | 0.0 | 0.0 | 0.0 | 0.0 | 0.008 | 0.0885 | 0.0 | 0.0 | 0.001 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1994 | 0.5061 | 0.011 | 0.0117 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0782 | 0.4826 | 0.0002 | 0.0599 | 0.0407 | 0.2641 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2712 | 0.8154 | 9300 | 2.2727 | 0.0139 | 0.026 | 0.0134 | 0.0064 | 0.0187 | 0.013 | 0.037 | 0.0651 | 0.0681 | 0.0261 | 0.0675 | 0.0781 | 0.0 | 0.0 | 0.0385 | 0.3413 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1312 | 0.6439 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1169 | 0.7242 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.0831 | 0.0 | 0.0 | 0.001 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2032 | 0.5132 | 0.0113 | 0.0145 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0806 | 0.4826 | 0.0002 | 0.0578 | 0.0406 | 0.2611 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4336 | 0.8197 | 9350 | 2.2726 | 0.0138 | 0.0257 | 0.0132 | 0.0064 | 0.0181 | 0.0126 | 0.0367 | 0.0651 | 0.0681 | 0.0263 | 0.0674 | 0.0777 | 0.0 | 0.0 | 0.0364 | 0.336 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1293 | 0.6439 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1156 | 0.7264 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0083 | 0.0869 | 0.0 | 0.0 | 0.0007 | 0.0046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2038 | 0.5137 | 0.0108 | 0.0121 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0807 | 0.4827 | 0.0002 | 0.0612 | 0.0402 | 0.2603 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9569 | 0.8241 | 9400 | 2.2735 | 0.0136 | 0.0255 | 0.0129 | 0.0063 | 0.018 | 0.0124 | 0.0364 | 0.0652 | 0.0682 | 0.0261 | 0.0664 | 0.078 | 0.0 | 0.0 | 0.0355 | 0.3394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1253 | 0.6465 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1149 | 0.7313 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0072 | 0.0831 | 0.0 | 0.0 | 0.0007 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2042 | 0.5148 | 0.0115 | 0.0154 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0789 | 0.4807 | 0.0001 | 0.0563 | 0.0396 | 0.259 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5639 | 0.8285 | 9450 | 2.2715 | 0.0137 | 0.0255 | 0.0132 | 0.0064 | 0.0178 | 0.0124 | 0.0366 | 0.0654 | 0.0684 | 0.0264 | 0.0668 | 0.078 | 0.0 | 0.0 | 0.0365 | 0.3448 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1252 | 0.6475 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1142 | 0.7248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.0869 | 0.0 | 0.0 | 0.0008 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2051 | 0.5168 | 0.011 | 0.0136 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0089 | 0.0067 | 0.0 | 0.0 | 0.0778 | 0.4824 | 0.0002 | 0.0595 | 0.0404 | 0.2595 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3379 | 0.8329 | 9500 | 2.2704 | 0.0136 | 0.0254 | 0.0131 | 0.0064 | 0.0176 | 0.0123 | 0.0366 | 0.0653 | 0.0684 | 0.0265 | 0.0673 | 0.0776 | 0.0 | 0.0 | 0.0358 | 0.3392 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1259 | 0.6484 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1143 | 0.7274 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0075 | 0.0877 | 0.0 | 0.0 | 0.0005 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2065 | 0.5167 | 0.0104 | 0.0126 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0089 | 0.0067 | 0.0 | 0.0 | 0.0774 | 0.4822 | 0.0002 | 0.0623 | 0.0402 | 0.2609 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8189 | 0.8373 | 9550 | 2.2692 | 0.0136 | 0.0254 | 0.0131 | 0.0064 | 0.0178 | 0.0122 | 0.0368 | 0.0655 | 0.0685 | 0.0264 | 0.0672 | 0.0778 | 0.0 | 0.0 | 0.0349 | 0.3385 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1256 | 0.6506 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1131 | 0.726 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.0869 | 0.0 | 0.0 | 0.0005 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2054 | 0.5178 | 0.0118 | 0.0164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0089 | 0.0067 | 0.0 | 0.0 | 0.0782 | 0.4844 | 0.0002 | 0.0606 | 0.0415 | 0.2602 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0404 | 0.8417 | 9600 | 2.2710 | 0.0137 | 0.0255 | 0.0131 | 0.0063 | 0.018 | 0.0121 | 0.0367 | 0.0655 | 0.0685 | 0.0263 | 0.0671 | 0.0777 | 0.0 | 0.0 | 0.0347 | 0.3402 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1273 | 0.651 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1132 | 0.7278 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0082 | 0.0854 | 0.0 | 0.0 | 0.0005 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2049 | 0.5168 | 0.0128 | 0.0173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0089 | 0.0067 | 0.0 | 0.0 | 0.0783 | 0.4819 | 0.0002 | 0.06 | 0.04 | 0.26 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9754 | 0.8460 | 9650 | 2.2730 | 0.0136 | 0.0253 | 0.013 | 0.0063 | 0.018 | 0.0121 | 0.0366 | 0.0653 | 0.0682 | 0.0261 | 0.0668 | 0.0779 | 0.0 | 0.0 | 0.0348 | 0.3415 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1261 | 0.6478 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1139 | 0.7236 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0074 | 0.0869 | 0.0 | 0.0 | 0.0006 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2046 | 0.514 | 0.0123 | 0.0164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0089 | 0.0067 | 0.0 | 0.0 | 0.0771 | 0.4787 | 0.0002 | 0.0612 | 0.0399 | 0.2588 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.088 | 0.8504 | 9700 | 2.2728 | 0.0137 | 0.0254 | 0.013 | 0.0064 | 0.0182 | 0.0121 | 0.0368 | 0.0653 | 0.0683 | 0.0264 | 0.0675 | 0.0779 | 0.0 | 0.0 | 0.0352 | 0.3419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1271 | 0.6433 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1131 | 0.7246 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0082 | 0.0946 | 0.0 | 0.0 | 0.0005 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2047 | 0.5145 | 0.0126 | 0.0168 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0089 | 0.0067 | 0.0 | 0.0 | 0.0773 | 0.479 | 0.0002 | 0.0602 | 0.041 | 0.258 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0154 | 0.8548 | 9750 | 2.2726 | 0.0136 | 0.0255 | 0.013 | 0.0064 | 0.0182 | 0.0122 | 0.0367 | 0.0654 | 0.0684 | 0.0263 | 0.0674 | 0.0778 | 0.0 | 0.0 | 0.0348 | 0.3383 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1259 | 0.6471 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1139 | 0.725 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0084 | 0.0946 | 0.0 | 0.0 | 0.0006 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2044 | 0.5151 | 0.013 | 0.0178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0772 | 0.4791 | 0.0002 | 0.06 | 0.0401 | 0.2577 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8461 | 0.8592 | 9800 | 2.2720 | 0.0136 | 0.0254 | 0.0131 | 0.0063 | 0.0182 | 0.0121 | 0.0367 | 0.0653 | 0.0683 | 0.0264 | 0.0677 | 0.0777 | 0.0 | 0.0 | 0.0349 | 0.3379 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1262 | 0.6503 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1136 | 0.7217 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0084 | 0.0938 | 0.0 | 0.0 | 0.0006 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2046 | 0.515 | 0.0129 | 0.0168 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0776 | 0.4799 | 0.0002 | 0.0608 | 0.04 | 0.2562 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6359 | 0.8636 | 9850 | 2.2715 | 0.0137 | 0.0255 | 0.0131 | 0.0064 | 0.0183 | 0.0122 | 0.0366 | 0.0653 | 0.0683 | 0.0263 | 0.0674 | 0.0777 | 0.0 | 0.0 | 0.0348 | 0.3389 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1254 | 0.6455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1144 | 0.723 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.0938 | 0.0 | 0.0 | 0.0006 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2046 | 0.5153 | 0.0132 | 0.0173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0778 | 0.4799 | 0.0002 | 0.0604 | 0.0413 | 0.2569 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5484 | 0.8680 | 9900 | 2.2710 | 0.0137 | 0.0255 | 0.0131 | 0.0063 | 0.0184 | 0.0122 | 0.0366 | 0.0654 | 0.0684 | 0.0265 | 0.0677 | 0.0777 | 0.0 | 0.0 | 0.0351 | 0.3387 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.6484 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.114 | 0.7228 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.0938 | 0.0 | 0.0 | 0.0006 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2046 | 0.5154 | 0.0132 | 0.0173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0779 | 0.4798 | 0.0002 | 0.0612 | 0.0413 | 0.258 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5602 | 0.8723 | 9950 | 2.2719 | 0.0137 | 0.0255 | 0.0131 | 0.0063 | 0.0183 | 0.0122 | 0.0366 | 0.0653 | 0.0683 | 0.0264 | 0.0675 | 0.0776 | 0.0 | 0.0 | 0.0347 | 0.3373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1259 | 0.6475 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1139 | 0.7228 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.0946 | 0.0 | 0.0 | 0.0006 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2036 | 0.5148 | 0.0133 | 0.0173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.0779 | 0.4792 | 0.0002 | 0.0604 | 0.0417 | 0.2585 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7095 | 0.8767 | 10000 | 2.2712 | 0.0137 | 0.0256 | 0.0131 | 0.0064 | 0.0183 | 0.0123 | 0.0366 | 0.0653 | 0.0683 | 0.0264 | 0.0676 | 0.0777 | 0.0 | 0.0 | 0.0347 | 0.3371 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1262 | 0.6475 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1141 | 0.7232 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0086 | 0.0946 | 0.0 | 0.0 | 0.0006 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2039 | 0.5147 | 0.0134 | 0.0173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0079 | 0.0059 | 0.0 | 0.0 | 0.078 | 0.4796 | 0.0002 | 0.0606 | 0.0417 | 0.2583 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.0+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
pabloOmega/text-entities-detection |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
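Until the card is completed, the snippet below is only a minimal sketch: the model type is not documented here, so it assumes, based on the repository name and the generic `label_0` to `label_4` label set, a token-classification (entity-detection) checkpoint loaded through the `pipeline` API. Verify the actual task before relying on it.

```python
# Minimal sketch, assuming a token-classification (entity-detection) checkpoint.
# The model type is not documented in this card, so treat this as illustrative only.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="pabloOmega/text-entities-detection",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

# Example input; the returned labels will be the generic label_0 ... label_4 names.
print(ner("Example sentence to run through the entity detector."))
```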
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4"
] |
nickoloss/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1397
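The card does not yet include usage code, so here is a minimal, hedged inference sketch for an object-detection checkpoint such as this one; the image path and the 0.5 score threshold are illustrative assumptions, not values documented in this card.

```python
# Minimal inference sketch for this fine-tuned DETR object-detection checkpoint.
# The image path and the score threshold are placeholders, not documented values.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "nickoloss/detr-resnet-50_finetuned_cppe5"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")  # any test image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits and boxes into (score, label, box) tuples in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```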
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training; an illustrative `TrainingArguments` sketch follows the list:
- learning_rate: 4.941058844013093e-07
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 1000
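As a rough guide only, the values above map onto a `TrainingArguments` configuration roughly as sketched below; `output_dir` and the per-epoch evaluation cadence are assumptions (the latter inferred from the results table), not settings documented in this card.

```python
# Rough TrainingArguments sketch mirroring the hyperparameters listed above.
# output_dir and eval_strategy are assumptions, not documented in this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-resnet-50_finetuned_cppe5",
    learning_rate=4.941058844013093e-07,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    num_train_epochs=1000,
    eval_strategy="epoch",  # the results table reports a validation loss every epoch
)
```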
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| No log | 1.0 | 10 | 7.0795 |
| No log | 2.0 | 20 | 5.7431 |
| No log | 3.0 | 30 | 6.7530 |
| No log | 4.0 | 40 | 5.3270 |
| 5.8012 | 5.0 | 50 | 6.1804 |
| 5.8012 | 6.0 | 60 | 6.0216 |
| 5.8012 | 7.0 | 70 | 5.2871 |
| 5.8012 | 8.0 | 80 | 5.1623 |
| 5.8012 | 9.0 | 90 | 6.9306 |
| 5.1916 | 10.0 | 100 | 5.7015 |
| 5.1916 | 11.0 | 110 | 7.5914 |
| 5.1916 | 12.0 | 120 | 6.0398 |
| 5.1916 | 13.0 | 130 | 5.3490 |
| 5.1916 | 14.0 | 140 | 5.8373 |
| 5.4281 | 15.0 | 150 | 4.1271 |
| 5.4281 | 16.0 | 160 | 6.2398 |
| 5.4281 | 17.0 | 170 | 4.2083 |
| 5.4281 | 18.0 | 180 | 4.7579 |
| 5.4281 | 19.0 | 190 | 5.2079 |
| 4.8531 | 20.0 | 200 | 3.9074 |
| 4.8531 | 21.0 | 210 | 5.1759 |
| 4.8531 | 22.0 | 220 | 3.8891 |
| 4.8531 | 23.0 | 230 | 3.8469 |
| 4.8531 | 24.0 | 240 | 4.0659 |
| 4.6505 | 25.0 | 250 | 4.7758 |
| 4.6505 | 26.0 | 260 | 4.2206 |
| 4.6505 | 27.0 | 270 | 4.8899 |
| 4.6505 | 28.0 | 280 | 4.6572 |
| 4.6505 | 29.0 | 290 | 4.1456 |
| 4.7175 | 30.0 | 300 | 4.3631 |
| 4.7175 | 31.0 | 310 | 3.9849 |
| 4.7175 | 32.0 | 320 | 5.4221 |
| 4.7175 | 33.0 | 330 | 3.7069 |
| 4.7175 | 34.0 | 340 | 5.2221 |
| 4.3464 | 35.0 | 350 | 4.0269 |
| 4.3464 | 36.0 | 360 | 3.8471 |
| 4.3464 | 37.0 | 370 | 4.0817 |
| 4.3464 | 38.0 | 380 | 4.0005 |
| 4.3464 | 39.0 | 390 | 3.7973 |
| 4.4884 | 40.0 | 400 | 4.1382 |
| 4.4884 | 41.0 | 410 | 3.9224 |
| 4.4884 | 42.0 | 420 | 4.3754 |
| 4.4884 | 43.0 | 430 | 3.9821 |
| 4.4884 | 44.0 | 440 | 3.9350 |
| 4.2526 | 45.0 | 450 | 4.0074 |
| 4.2526 | 46.0 | 460 | 3.6133 |
| 4.2526 | 47.0 | 470 | 3.8681 |
| 4.2526 | 48.0 | 480 | 3.9367 |
| 4.2526 | 49.0 | 490 | 3.8197 |
| 4.1344 | 50.0 | 500 | 3.5646 |
| 4.1344 | 51.0 | 510 | 3.7987 |
| 4.1344 | 52.0 | 520 | 3.9491 |
| 4.1344 | 53.0 | 530 | 4.1457 |
| 4.1344 | 54.0 | 540 | 3.6863 |
| 4.0492 | 55.0 | 550 | 3.8259 |
| 4.0492 | 56.0 | 560 | 3.8122 |
| 4.0492 | 57.0 | 570 | 4.0111 |
| 4.0492 | 58.0 | 580 | 3.7859 |
| 4.0492 | 59.0 | 590 | 3.7566 |
| 3.9168 | 60.0 | 600 | 3.6876 |
| 3.9168 | 61.0 | 610 | 3.7469 |
| 3.9168 | 62.0 | 620 | 4.0203 |
| 3.9168 | 63.0 | 630 | 4.7051 |
| 3.9168 | 64.0 | 640 | 3.6666 |
| 3.8979 | 65.0 | 650 | 3.5877 |
| 3.8979 | 66.0 | 660 | 3.5737 |
| 3.8979 | 67.0 | 670 | 3.6520 |
| 3.8979 | 68.0 | 680 | 3.4342 |
| 3.8979 | 69.0 | 690 | 3.9668 |
| 3.7801 | 70.0 | 700 | 4.0617 |
| 3.7801 | 71.0 | 710 | 3.8625 |
| 3.7801 | 72.0 | 720 | 3.3205 |
| 3.7801 | 73.0 | 730 | 4.0774 |
| 3.7801 | 74.0 | 740 | 4.0416 |
| 3.7303 | 75.0 | 750 | 3.4343 |
| 3.7303 | 76.0 | 760 | 3.5131 |
| 3.7303 | 77.0 | 770 | 3.5507 |
| 3.7303 | 78.0 | 780 | 3.9112 |
| 3.7303 | 79.0 | 790 | 3.3022 |
| 3.5806 | 80.0 | 800 | 3.8618 |
| 3.5806 | 81.0 | 810 | 3.6697 |
| 3.5806 | 82.0 | 820 | 3.5536 |
| 3.5806 | 83.0 | 830 | 3.3500 |
| 3.5806 | 84.0 | 840 | 3.8134 |
| 3.6183 | 85.0 | 850 | 3.4067 |
| 3.6183 | 86.0 | 860 | 3.4425 |
| 3.6183 | 87.0 | 870 | 3.2812 |
| 3.6183 | 88.0 | 880 | 3.3909 |
| 3.6183 | 89.0 | 890 | 3.6878 |
| 3.4767 | 90.0 | 900 | 3.5409 |
| 3.4767 | 91.0 | 910 | 3.5380 |
| 3.4767 | 92.0 | 920 | 3.8982 |
| 3.4767 | 93.0 | 930 | 3.4205 |
| 3.4767 | 94.0 | 940 | 4.0828 |
| 3.5444 | 95.0 | 950 | 3.2579 |
| 3.5444 | 96.0 | 960 | 3.3702 |
| 3.5444 | 97.0 | 970 | 4.2833 |
| 3.5444 | 98.0 | 980 | 3.4222 |
| 3.5444 | 99.0 | 990 | 3.4477 |
| 3.3821 | 100.0 | 1000 | 3.2484 |
| 3.3821 | 101.0 | 1010 | 3.3493 |
| 3.3821 | 102.0 | 1020 | 3.2192 |
| 3.3821 | 103.0 | 1030 | 3.2491 |
| 3.3821 | 104.0 | 1040 | 3.3853 |
| 3.3429 | 105.0 | 1050 | 3.4362 |
| 3.3429 | 106.0 | 1060 | 4.1587 |
| 3.3429 | 107.0 | 1070 | 3.9797 |
| 3.3429 | 108.0 | 1080 | 3.6257 |
| 3.3429 | 109.0 | 1090 | 3.4861 |
| 3.304 | 110.0 | 1100 | 3.3520 |
| 3.304 | 111.0 | 1110 | 3.0047 |
| 3.304 | 112.0 | 1120 | 3.4988 |
| 3.304 | 113.0 | 1130 | 3.4723 |
| 3.304 | 114.0 | 1140 | 3.4294 |
| 3.2826 | 115.0 | 1150 | 3.6923 |
| 3.2826 | 116.0 | 1160 | 3.2513 |
| 3.2826 | 117.0 | 1170 | 3.6769 |
| 3.2826 | 118.0 | 1180 | 3.5384 |
| 3.2826 | 119.0 | 1190 | 3.3773 |
| 3.1944 | 120.0 | 1200 | 3.2538 |
| 3.1944 | 121.0 | 1210 | 3.2896 |
| 3.1944 | 122.0 | 1220 | 3.4226 |
| 3.1944 | 123.0 | 1230 | 3.3085 |
| 3.1944 | 124.0 | 1240 | 3.1047 |
| 3.1978 | 125.0 | 1250 | 3.3142 |
| 3.1978 | 126.0 | 1260 | 3.4432 |
| 3.1978 | 127.0 | 1270 | 2.9309 |
| 3.1978 | 128.0 | 1280 | 3.3678 |
| 3.1978 | 129.0 | 1290 | 3.6156 |
| 3.3425 | 130.0 | 1300 | 3.3015 |
| 3.3425 | 131.0 | 1310 | 3.3181 |
| 3.3425 | 132.0 | 1320 | 3.2688 |
| 3.3425 | 133.0 | 1330 | 3.4590 |
| 3.3425 | 134.0 | 1340 | 3.0809 |
| 3.3654 | 135.0 | 1350 | 3.0907 |
| 3.3654 | 136.0 | 1360 | 3.2888 |
| 3.3654 | 137.0 | 1370 | 3.1504 |
| 3.3654 | 138.0 | 1380 | 3.4285 |
| 3.3654 | 139.0 | 1390 | 3.4080 |
| 3.1702 | 140.0 | 1400 | 3.1543 |
| 3.1702 | 141.0 | 1410 | 3.5154 |
| 3.1702 | 142.0 | 1420 | 3.1132 |
| 3.1702 | 143.0 | 1430 | 3.2503 |
| 3.1702 | 144.0 | 1440 | 3.6848 |
| 3.1929 | 145.0 | 1450 | 3.1961 |
| 3.1929 | 146.0 | 1460 | 3.4146 |
| 3.1929 | 147.0 | 1470 | 3.4162 |
| 3.1929 | 148.0 | 1480 | 3.2388 |
| 3.1929 | 149.0 | 1490 | 3.6281 |
| 3.0 | 150.0 | 1500 | 2.9830 |
| 3.0 | 151.0 | 1510 | 3.1817 |
| 3.0 | 152.0 | 1520 | 3.2862 |
| 3.0 | 153.0 | 1530 | 3.0465 |
| 3.0 | 154.0 | 1540 | 3.1208 |
| 2.9975 | 155.0 | 1550 | 3.3041 |
| 2.9975 | 156.0 | 1560 | 3.4944 |
| 2.9975 | 157.0 | 1570 | 3.5826 |
| 2.9975 | 158.0 | 1580 | 3.5453 |
| 2.9975 | 159.0 | 1590 | 4.0256 |
| 3.0312 | 160.0 | 1600 | 3.3678 |
| 3.0312 | 161.0 | 1610 | 2.9384 |
| 3.0312 | 162.0 | 1620 | 3.0596 |
| 3.0312 | 163.0 | 1630 | 3.3952 |
| 3.0312 | 164.0 | 1640 | 3.5299 |
| 2.9855 | 165.0 | 1650 | 2.9930 |
| 2.9855 | 166.0 | 1660 | 3.3869 |
| 2.9855 | 167.0 | 1670 | 3.1676 |
| 2.9855 | 168.0 | 1680 | 3.1330 |
| 2.9855 | 169.0 | 1690 | 3.2595 |
| 2.863 | 170.0 | 1700 | 3.1151 |
| 2.863 | 171.0 | 1710 | 3.1382 |
| 2.863 | 172.0 | 1720 | 3.7265 |
| 2.863 | 173.0 | 1730 | 2.8716 |
| 2.863 | 174.0 | 1740 | 3.0285 |
| 2.8942 | 175.0 | 1750 | 3.0285 |
| 2.8942 | 176.0 | 1760 | 3.7873 |
| 2.8942 | 177.0 | 1770 | 2.9266 |
| 2.8942 | 178.0 | 1780 | 2.9751 |
| 2.8942 | 179.0 | 1790 | 3.1875 |
| 2.7614 | 180.0 | 1800 | 2.6317 |
| 2.7614 | 181.0 | 1810 | 3.3780 |
| 2.7614 | 182.0 | 1820 | 3.1680 |
| 2.7614 | 183.0 | 1830 | 3.3270 |
| 2.7614 | 184.0 | 1840 | 3.2822 |
| 2.9 | 185.0 | 1850 | 3.0026 |
| 2.9 | 186.0 | 1860 | 3.0610 |
| 2.9 | 187.0 | 1870 | 3.2631 |
| 2.9 | 188.0 | 1880 | 2.8804 |
| 2.9 | 189.0 | 1890 | 3.2069 |
| 2.9176 | 190.0 | 1900 | 2.8339 |
| 2.9176 | 191.0 | 1910 | 2.9836 |
| 2.9176 | 192.0 | 1920 | 3.0211 |
| 2.9176 | 193.0 | 1930 | 2.8448 |
| 2.9176 | 194.0 | 1940 | 4.1654 |
| 2.8189 | 195.0 | 1950 | 3.0910 |
| 2.8189 | 196.0 | 1960 | 2.7972 |
| 2.8189 | 197.0 | 1970 | 3.5421 |
| 2.8189 | 198.0 | 1980 | 2.8334 |
| 2.8189 | 199.0 | 1990 | 3.0457 |
| 2.7236 | 200.0 | 2000 | 3.0531 |
| 2.7236 | 201.0 | 2010 | 3.0384 |
| 2.7236 | 202.0 | 2020 | 3.0183 |
| 2.7236 | 203.0 | 2030 | 3.1019 |
| 2.7236 | 204.0 | 2040 | 2.6909 |
| 2.6289 | 205.0 | 2050 | 2.8969 |
| 2.6289 | 206.0 | 2060 | 2.8063 |
| 2.6289 | 207.0 | 2070 | 3.3533 |
| 2.6289 | 208.0 | 2080 | 3.0578 |
| 2.6289 | 209.0 | 2090 | 3.0081 |
| 2.6592 | 210.0 | 2100 | 3.1674 |
| 2.6592 | 211.0 | 2110 | 3.0982 |
| 2.6592 | 212.0 | 2120 | 2.9070 |
| 2.6592 | 213.0 | 2130 | 2.8881 |
| 2.6592 | 214.0 | 2140 | 2.7869 |
| 2.6898 | 215.0 | 2150 | 2.9736 |
| 2.6898 | 216.0 | 2160 | 2.7309 |
| 2.6898 | 217.0 | 2170 | 3.2656 |
| 2.6898 | 218.0 | 2180 | 2.7734 |
| 2.6898 | 219.0 | 2190 | 2.6135 |
| 2.6117 | 220.0 | 2200 | 3.0652 |
| 2.6117 | 221.0 | 2210 | 3.0918 |
| 2.6117 | 222.0 | 2220 | 3.2191 |
| 2.6117 | 223.0 | 2230 | 2.8947 |
| 2.6117 | 224.0 | 2240 | 2.6307 |
| 2.6281 | 225.0 | 2250 | 2.6585 |
| 2.6281 | 226.0 | 2260 | 3.0801 |
| 2.6281 | 227.0 | 2270 | 2.9075 |
| 2.6281 | 228.0 | 2280 | 3.1795 |
| 2.6281 | 229.0 | 2290 | 2.8762 |
| 2.4503 | 230.0 | 2300 | 2.6883 |
| 2.4503 | 231.0 | 2310 | 3.0329 |
| 2.4503 | 232.0 | 2320 | 2.8990 |
| 2.4503 | 233.0 | 2330 | 2.7381 |
| 2.4503 | 234.0 | 2340 | 2.8102 |
| 2.5171 | 235.0 | 2350 | 3.0730 |
| 2.5171 | 236.0 | 2360 | 2.9376 |
| 2.5171 | 237.0 | 2370 | 2.5781 |
| 2.5171 | 238.0 | 2380 | 2.9466 |
| 2.5171 | 239.0 | 2390 | 2.6868 |
| 2.5004 | 240.0 | 2400 | 2.6414 |
| 2.5004 | 241.0 | 2410 | 3.0623 |
| 2.5004 | 242.0 | 2420 | 2.8071 |
| 2.5004 | 243.0 | 2430 | 2.4406 |
| 2.5004 | 244.0 | 2440 | 2.6247 |
| 2.5338 | 245.0 | 2450 | 2.7334 |
| 2.5338 | 246.0 | 2460 | 2.8576 |
| 2.5338 | 247.0 | 2470 | 2.6042 |
| 2.5338 | 248.0 | 2480 | 2.8519 |
| 2.5338 | 249.0 | 2490 | 3.0416 |
| 2.429 | 250.0 | 2500 | 2.7010 |
| 2.429 | 251.0 | 2510 | 4.0268 |
| 2.429 | 252.0 | 2520 | 2.9236 |
| 2.429 | 253.0 | 2530 | 2.5467 |
| 2.429 | 254.0 | 2540 | 2.7355 |
| 2.4368 | 255.0 | 2550 | 3.1205 |
| 2.4368 | 256.0 | 2560 | 2.8335 |
| 2.4368 | 257.0 | 2570 | 2.7752 |
| 2.4368 | 258.0 | 2580 | 2.7598 |
| 2.4368 | 259.0 | 2590 | 2.6409 |
| 2.3204 | 260.0 | 2600 | 2.7808 |
| 2.3204 | 261.0 | 2610 | 2.4784 |
| 2.3204 | 262.0 | 2620 | 2.9005 |
| 2.3204 | 263.0 | 2630 | 2.6729 |
| 2.3204 | 264.0 | 2640 | 2.6290 |
| 2.4044 | 265.0 | 2650 | 2.8760 |
| 2.4044 | 266.0 | 2660 | 2.5683 |
| 2.4044 | 267.0 | 2670 | 2.8607 |
| 2.4044 | 268.0 | 2680 | 2.5760 |
| 2.4044 | 269.0 | 2690 | 2.6616 |
| 2.3464 | 270.0 | 2700 | 2.6968 |
| 2.3464 | 271.0 | 2710 | 2.7200 |
| 2.3464 | 272.0 | 2720 | 2.7963 |
| 2.3464 | 273.0 | 2730 | 2.5230 |
| 2.3464 | 274.0 | 2740 | 2.7015 |
| 2.2999 | 275.0 | 2750 | 2.9836 |
| 2.2999 | 276.0 | 2760 | 2.6443 |
| 2.2999 | 277.0 | 2770 | 2.5045 |
| 2.2999 | 278.0 | 2780 | 3.2068 |
| 2.2999 | 279.0 | 2790 | 2.5038 |
| 2.3102 | 280.0 | 2800 | 2.7581 |
| 2.3102 | 281.0 | 2810 | 2.6092 |
| 2.3102 | 282.0 | 2820 | 2.4482 |
| 2.3102 | 283.0 | 2830 | 3.0941 |
| 2.3102 | 284.0 | 2840 | 2.3476 |
| 2.2134 | 285.0 | 2850 | 2.8535 |
| 2.2134 | 286.0 | 2860 | 2.6361 |
| 2.2134 | 287.0 | 2870 | 2.6033 |
| 2.2134 | 288.0 | 2880 | 2.4526 |
| 2.2134 | 289.0 | 2890 | 2.7966 |
| 2.3276 | 290.0 | 2900 | 2.6472 |
| 2.3276 | 291.0 | 2910 | 2.6410 |
| 2.3276 | 292.0 | 2920 | 2.5670 |
| 2.3276 | 293.0 | 2930 | 2.7832 |
| 2.3276 | 294.0 | 2940 | 2.5031 |
| 2.287 | 295.0 | 2950 | 2.5614 |
| 2.287 | 296.0 | 2960 | 3.0045 |
| 2.287 | 297.0 | 2970 | 2.5755 |
| 2.287 | 298.0 | 2980 | 2.5132 |
| 2.287 | 299.0 | 2990 | 2.6427 |
| 2.1723 | 300.0 | 3000 | 3.2675 |
| 2.1723 | 301.0 | 3010 | 2.5890 |
| 2.1723 | 302.0 | 3020 | 2.7935 |
| 2.1723 | 303.0 | 3030 | 2.5836 |
| 2.1723 | 304.0 | 3040 | 2.4359 |
| 2.237 | 305.0 | 3050 | 2.7048 |
| 2.237 | 306.0 | 3060 | 2.4640 |
| 2.237 | 307.0 | 3070 | 2.5528 |
| 2.237 | 308.0 | 3080 | 2.4919 |
| 2.237 | 309.0 | 3090 | 2.5067 |
| 2.1502 | 310.0 | 3100 | 2.6569 |
| 2.1502 | 311.0 | 3110 | 2.6649 |
| 2.1502 | 312.0 | 3120 | 2.7721 |
| 2.1502 | 313.0 | 3130 | 2.3934 |
| 2.1502 | 314.0 | 3140 | 2.4799 |
| 2.2248 | 315.0 | 3150 | 2.6882 |
| 2.2248 | 316.0 | 3160 | 2.8493 |
| 2.2248 | 317.0 | 3170 | 2.5919 |
| 2.2248 | 318.0 | 3180 | 2.4124 |
| 2.2248 | 319.0 | 3190 | 2.5997 |
| 2.2399 | 320.0 | 3200 | 2.3440 |
| 2.2399 | 321.0 | 3210 | 2.6292 |
| 2.2399 | 322.0 | 3220 | 3.2851 |
| 2.2399 | 323.0 | 3230 | 2.4422 |
| 2.2399 | 324.0 | 3240 | 2.3866 |
| 2.1759 | 325.0 | 3250 | 2.4307 |
| 2.1759 | 326.0 | 3260 | 2.2842 |
| 2.1759 | 327.0 | 3270 | 2.5418 |
| 2.1759 | 328.0 | 3280 | 2.5840 |
| 2.1759 | 329.0 | 3290 | 2.9884 |
| 2.2557 | 330.0 | 3300 | 2.5096 |
| 2.2557 | 331.0 | 3310 | 3.2382 |
| 2.2557 | 332.0 | 3320 | 2.5237 |
| 2.2557 | 333.0 | 3330 | 2.4346 |
| 2.2557 | 334.0 | 3340 | 2.4034 |
| 2.216 | 335.0 | 3350 | 2.4259 |
| 2.216 | 336.0 | 3360 | 2.4239 |
| 2.216 | 337.0 | 3370 | 2.5417 |
| 2.216 | 338.0 | 3380 | 2.7757 |
| 2.216 | 339.0 | 3390 | 2.6264 |
| 2.2112 | 340.0 | 3400 | 2.6611 |
| 2.2112 | 341.0 | 3410 | 2.6828 |
| 2.2112 | 342.0 | 3420 | 2.4541 |
| 2.2112 | 343.0 | 3430 | 2.4426 |
| 2.2112 | 344.0 | 3440 | 2.4566 |
| 2.1473 | 345.0 | 3450 | 2.8140 |
| 2.1473 | 346.0 | 3460 | 2.3079 |
| 2.1473 | 347.0 | 3470 | 2.4263 |
| 2.1473 | 348.0 | 3480 | 2.4176 |
| 2.1473 | 349.0 | 3490 | 2.5132 |
| 2.0273 | 350.0 | 3500 | 2.5695 |
| 2.0273 | 351.0 | 3510 | 2.3300 |
| 2.0273 | 352.0 | 3520 | 2.3673 |
| 2.0273 | 353.0 | 3530 | 2.4108 |
| 2.0273 | 354.0 | 3540 | 2.1937 |
| 1.9724 | 355.0 | 3550 | 2.4282 |
| 1.9724 | 356.0 | 3560 | 2.5854 |
| 1.9724 | 357.0 | 3570 | 2.3549 |
| 1.9724 | 358.0 | 3580 | 2.7288 |
| 1.9724 | 359.0 | 3590 | 2.3138 |
| 2.1214 | 360.0 | 3600 | 2.6228 |
| 2.1214 | 361.0 | 3610 | 2.5202 |
| 2.1214 | 362.0 | 3620 | 2.3395 |
| 2.1214 | 363.0 | 3630 | 2.7839 |
| 2.1214 | 364.0 | 3640 | 2.3686 |
| 2.0616 | 365.0 | 3650 | 2.1838 |
| 2.0616 | 366.0 | 3660 | 2.1441 |
| 2.0616 | 367.0 | 3670 | 2.3893 |
| 2.0616 | 368.0 | 3680 | 2.3090 |
| 2.0616 | 369.0 | 3690 | 2.5005 |
| 2.0561 | 370.0 | 3700 | 2.5149 |
| 2.0561 | 371.0 | 3710 | 2.4185 |
| 2.0561 | 372.0 | 3720 | 2.2988 |
| 2.0561 | 373.0 | 3730 | 2.5609 |
| 2.0561 | 374.0 | 3740 | 2.4859 |
| 1.9504 | 375.0 | 3750 | 2.6781 |
| 1.9504 | 376.0 | 3760 | 2.4028 |
| 1.9504 | 377.0 | 3770 | 2.2976 |
| 1.9504 | 378.0 | 3780 | 2.6518 |
| 1.9504 | 379.0 | 3790 | 2.4606 |
| 1.9662 | 380.0 | 3800 | 2.0894 |
| 1.9662 | 381.0 | 3810 | 2.7766 |
| 1.9662 | 382.0 | 3820 | 2.6676 |
| 1.9662 | 383.0 | 3830 | 2.3832 |
| 1.9662 | 384.0 | 3840 | 2.3459 |
| 2.007 | 385.0 | 3850 | 2.5191 |
| 2.007 | 386.0 | 3860 | 2.5370 |
| 2.007 | 387.0 | 3870 | 2.3437 |
| 2.007 | 388.0 | 3880 | 2.5367 |
| 2.007 | 389.0 | 3890 | 2.3221 |
| 1.9401 | 390.0 | 3900 | 2.2395 |
| 1.9401 | 391.0 | 3910 | 2.3589 |
| 1.9401 | 392.0 | 3920 | 2.3799 |
| 1.9401 | 393.0 | 3930 | 2.3295 |
| 1.9401 | 394.0 | 3940 | 2.6330 |
| 1.9375 | 395.0 | 3950 | 2.4340 |
| 1.9375 | 396.0 | 3960 | 2.5184 |
| 1.9375 | 397.0 | 3970 | 2.1730 |
| 1.9375 | 398.0 | 3980 | 2.2300 |
| 1.9375 | 399.0 | 3990 | 2.4796 |
| 1.9703 | 400.0 | 4000 | 2.2612 |
| 1.9703 | 401.0 | 4010 | 2.3175 |
| 1.9703 | 402.0 | 4020 | 2.5344 |
| 1.9703 | 403.0 | 4030 | 2.1123 |
| 1.9703 | 404.0 | 4040 | 2.2479 |
| 1.8652 | 405.0 | 4050 | 2.6316 |
| 1.8652 | 406.0 | 4060 | 2.1574 |
| 1.8652 | 407.0 | 4070 | 2.4231 |
| 1.8652 | 408.0 | 4080 | 2.1255 |
| 1.8652 | 409.0 | 4090 | 2.2994 |
| 1.9834 | 410.0 | 4100 | 2.3541 |
| 1.9834 | 411.0 | 4110 | 2.3113 |
| 1.9834 | 412.0 | 4120 | 2.3966 |
| 1.9834 | 413.0 | 4130 | 2.3865 |
| 1.9834 | 414.0 | 4140 | 3.0955 |
| 1.976 | 415.0 | 4150 | 2.6212 |
| 1.976 | 416.0 | 4160 | 2.3237 |
| 1.976 | 417.0 | 4170 | 3.3010 |
| 1.976 | 418.0 | 4180 | 2.7378 |
| 1.976 | 419.0 | 4190 | 2.4063 |
| 1.9165 | 420.0 | 4200 | 2.9853 |
| 1.9165 | 421.0 | 4210 | 2.0776 |
| 1.9165 | 422.0 | 4220 | 2.3036 |
| 1.9165 | 423.0 | 4230 | 2.1934 |
| 1.9165 | 424.0 | 4240 | 2.1535 |
| 1.9224 | 425.0 | 4250 | 2.3000 |
| 1.9224 | 426.0 | 4260 | 2.6858 |
| 1.9224 | 427.0 | 4270 | 2.4825 |
| 1.9224 | 428.0 | 4280 | 2.4776 |
| 1.9224 | 429.0 | 4290 | 2.2042 |
| 1.9091 | 430.0 | 4300 | 2.2847 |
| 1.9091 | 431.0 | 4310 | 2.0935 |
| 1.9091 | 432.0 | 4320 | 2.6040 |
| 1.9091 | 433.0 | 4330 | 2.2520 |
| 1.9091 | 434.0 | 4340 | 2.5126 |
| 1.9543 | 435.0 | 4350 | 2.3081 |
| 1.9543 | 436.0 | 4360 | 2.5018 |
| 1.9543 | 437.0 | 4370 | 2.4462 |
| 1.9543 | 438.0 | 4380 | 2.1927 |
| 1.9543 | 439.0 | 4390 | 2.1584 |
| 1.7975 | 440.0 | 4400 | 2.2996 |
| 1.7975 | 441.0 | 4410 | 2.2288 |
| 1.7975 | 442.0 | 4420 | 2.4102 |
| 1.7975 | 443.0 | 4430 | 2.3321 |
| 1.7975 | 444.0 | 4440 | 1.9341 |
| 1.9595 | 445.0 | 4450 | 2.1064 |
| 1.9595 | 446.0 | 4460 | 2.4024 |
| 1.9595 | 447.0 | 4470 | 2.1377 |
| 1.9595 | 448.0 | 4480 | 2.2580 |
| 1.9595 | 449.0 | 4490 | 2.2505 |
| 1.8746 | 450.0 | 4500 | 2.3562 |
| 1.8746 | 451.0 | 4510 | 2.2730 |
| 1.8746 | 452.0 | 4520 | 2.1447 |
| 1.8746 | 453.0 | 4530 | 2.2458 |
| 1.8746 | 454.0 | 4540 | 2.2136 |
| 2.0722 | 455.0 | 4550 | 2.1459 |
| 2.0722 | 456.0 | 4560 | 1.9991 |
| 2.0722 | 457.0 | 4570 | 2.1572 |
| 2.0722 | 458.0 | 4580 | 2.2700 |
| 2.0722 | 459.0 | 4590 | 2.3094 |
| 1.9179 | 460.0 | 4600 | 2.2721 |
| 1.9179 | 461.0 | 4610 | 2.2809 |
| 1.9179 | 462.0 | 4620 | 2.4517 |
| 1.9179 | 463.0 | 4630 | 2.2500 |
| 1.9179 | 464.0 | 4640 | 2.2107 |
| 2.0428 | 465.0 | 4650 | 2.1489 |
| 2.0428 | 466.0 | 4660 | 2.2571 |
| 2.0428 | 467.0 | 4670 | 2.2047 |
| 2.0428 | 468.0 | 4680 | 2.5041 |
| 2.0428 | 469.0 | 4690 | 2.2354 |
| 1.8738 | 470.0 | 4700 | 2.0811 |
| 1.8738 | 471.0 | 4710 | 2.1300 |
| 1.8738 | 472.0 | 4720 | 2.3041 |
| 1.8738 | 473.0 | 4730 | 2.1780 |
| 1.8738 | 474.0 | 4740 | 2.0481 |
| 1.8625 | 475.0 | 4750 | 2.2354 |
| 1.8625 | 476.0 | 4760 | 2.1670 |
| 1.8625 | 477.0 | 4770 | 2.1575 |
| 1.8625 | 478.0 | 4780 | 2.0797 |
| 1.8625 | 479.0 | 4790 | 2.2353 |
| 1.7743 | 480.0 | 4800 | 2.2478 |
| 1.7743 | 481.0 | 4810 | 2.1120 |
| 1.7743 | 482.0 | 4820 | 2.1790 |
| 1.7743 | 483.0 | 4830 | 3.1939 |
| 1.7743 | 484.0 | 4840 | 2.0575 |
| 1.7955 | 485.0 | 4850 | 2.3685 |
| 1.7955 | 486.0 | 4860 | 2.1021 |
| 1.7955 | 487.0 | 4870 | 2.3043 |
| 1.7955 | 488.0 | 4880 | 2.1155 |
| 1.7955 | 489.0 | 4890 | 2.0982 |
| 1.7685 | 490.0 | 4900 | 2.2740 |
| 1.7685 | 491.0 | 4910 | 2.1216 |
| 1.7685 | 492.0 | 4920 | 2.0764 |
| 1.7685 | 493.0 | 4930 | 2.1182 |
| 1.7685 | 494.0 | 4940 | 2.0343 |
| 1.8806 | 495.0 | 4950 | 2.0229 |
| 1.8806 | 496.0 | 4960 | 2.9971 |
| 1.8806 | 497.0 | 4970 | 2.1848 |
| 1.8806 | 498.0 | 4980 | 2.6586 |
| 1.8806 | 499.0 | 4990 | 2.3622 |
| 1.8554 | 500.0 | 5000 | 2.5255 |
| 1.8554 | 501.0 | 5010 | 2.1792 |
| 1.8554 | 502.0 | 5020 | 2.2098 |
| 1.8554 | 503.0 | 5030 | 3.0466 |
| 1.8554 | 504.0 | 5040 | 2.2054 |
| 1.8123 | 505.0 | 5050 | 2.0846 |
| 1.8123 | 506.0 | 5060 | 2.4480 |
| 1.8123 | 507.0 | 5070 | 2.1692 |
| 1.8123 | 508.0 | 5080 | 2.1262 |
| 1.8123 | 509.0 | 5090 | 2.0610 |
| 1.8189 | 510.0 | 5100 | 2.1438 |
| 1.8189 | 511.0 | 5110 | 1.9691 |
| 1.8189 | 512.0 | 5120 | 1.9818 |
| 1.8189 | 513.0 | 5130 | 2.1824 |
| 1.8189 | 514.0 | 5140 | 2.3053 |
| 1.7296 | 515.0 | 5150 | 2.0095 |
| 1.7296 | 516.0 | 5160 | 2.3895 |
| 1.7296 | 517.0 | 5170 | 2.4203 |
| 1.7296 | 518.0 | 5180 | 2.8143 |
| 1.7296 | 519.0 | 5190 | 1.9249 |
| 1.9353 | 520.0 | 5200 | 1.9745 |
| 1.9353 | 521.0 | 5210 | 2.3712 |
| 1.9353 | 522.0 | 5220 | 2.2221 |
| 1.9353 | 523.0 | 5230 | 2.3223 |
| 1.9353 | 524.0 | 5240 | 2.0649 |
| 1.8203 | 525.0 | 5250 | 2.4524 |
| 1.8203 | 526.0 | 5260 | 2.1729 |
| 1.8203 | 527.0 | 5270 | 2.3503 |
| 1.8203 | 528.0 | 5280 | 1.8859 |
| 1.8203 | 529.0 | 5290 | 2.5795 |
| 1.9042 | 530.0 | 5300 | 2.2665 |
| 1.9042 | 531.0 | 5310 | 1.9231 |
| 1.9042 | 532.0 | 5320 | 2.1896 |
| 1.9042 | 533.0 | 5330 | 2.1866 |
| 1.9042 | 534.0 | 5340 | 2.1273 |
| 1.797 | 535.0 | 5350 | 2.1864 |
| 1.797 | 536.0 | 5360 | 2.1360 |
| 1.797 | 537.0 | 5370 | 2.1195 |
| 1.797 | 538.0 | 5380 | 1.9885 |
| 1.797 | 539.0 | 5390 | 1.9990 |
| 1.8289 | 540.0 | 5400 | 2.0208 |
| 1.8289 | 541.0 | 5410 | 1.9337 |
| 1.8289 | 542.0 | 5420 | 2.0515 |
| 1.8289 | 543.0 | 5430 | 2.3292 |
| 1.8289 | 544.0 | 5440 | 1.8969 |
| 1.7952 | 545.0 | 5450 | 2.0917 |
| 1.7952 | 546.0 | 5460 | 2.2664 |
| 1.7952 | 547.0 | 5470 | 2.1886 |
| 1.7952 | 548.0 | 5480 | 2.2333 |
| 1.7952 | 549.0 | 5490 | 2.1483 |
| 1.8083 | 550.0 | 5500 | 2.2158 |
| 1.8083 | 551.0 | 5510 | 2.2681 |
| 1.8083 | 552.0 | 5520 | 2.7891 |
| 1.8083 | 553.0 | 5530 | 1.9523 |
| 1.8083 | 554.0 | 5540 | 2.2605 |
| 1.8217 | 555.0 | 5550 | 2.4190 |
| 1.8217 | 556.0 | 5560 | 2.1206 |
| 1.8217 | 557.0 | 5570 | 2.5011 |
| 1.8217 | 558.0 | 5580 | 2.1416 |
| 1.8217 | 559.0 | 5590 | 2.1722 |
| 1.7937 | 560.0 | 5600 | 2.0521 |
| 1.7937 | 561.0 | 5610 | 2.1215 |
| 1.7937 | 562.0 | 5620 | 2.7153 |
| 1.7937 | 563.0 | 5630 | 2.1914 |
| 1.7937 | 564.0 | 5640 | 2.1923 |
| 1.7143 | 565.0 | 5650 | 2.4663 |
| 1.7143 | 566.0 | 5660 | 1.9746 |
| 1.7143 | 567.0 | 5670 | 2.0240 |
| 1.7143 | 568.0 | 5680 | 2.5691 |
| 1.7143 | 569.0 | 5690 | 2.3204 |
| 1.6601 | 570.0 | 5700 | 2.1723 |
| 1.6601 | 571.0 | 5710 | 1.9296 |
| 1.6601 | 572.0 | 5720 | 2.1570 |
| 1.6601 | 573.0 | 5730 | 2.1298 |
| 1.6601 | 574.0 | 5740 | 2.3539 |
| 1.8999 | 575.0 | 5750 | 2.1365 |
| 1.8999 | 576.0 | 5760 | 2.0601 |
| 1.8999 | 577.0 | 5770 | 2.0550 |
| 1.8999 | 578.0 | 5780 | 2.5869 |
| 1.8999 | 579.0 | 5790 | 2.1311 |
| 1.6806 | 580.0 | 5800 | 1.9451 |
| 1.6806 | 581.0 | 5810 | 2.1228 |
| 1.6806 | 582.0 | 5820 | 2.3437 |
| 1.6806 | 583.0 | 5830 | 2.3398 |
| 1.6806 | 584.0 | 5840 | 2.1228 |
| 1.7643 | 585.0 | 5850 | 2.0135 |
| 1.7643 | 586.0 | 5860 | 1.9824 |
| 1.7643 | 587.0 | 5870 | 2.2028 |
| 1.7643 | 588.0 | 5880 | 2.4352 |
| 1.7643 | 589.0 | 5890 | 1.9458 |
| 1.803 | 590.0 | 5900 | 2.3152 |
| 1.803 | 591.0 | 5910 | 2.0768 |
| 1.803 | 592.0 | 5920 | 2.2836 |
| 1.803 | 593.0 | 5930 | 2.1446 |
| 1.803 | 594.0 | 5940 | 2.1702 |
| 1.6866 | 595.0 | 5950 | 2.3142 |
| 1.6866 | 596.0 | 5960 | 2.1351 |
| 1.6866 | 597.0 | 5970 | 1.9202 |
| 1.6866 | 598.0 | 5980 | 2.0712 |
| 1.6866 | 599.0 | 5990 | 1.9634 |
| 1.6967 | 600.0 | 6000 | 2.3699 |
| 1.6967 | 601.0 | 6010 | 2.1562 |
| 1.6967 | 602.0 | 6020 | 2.3168 |
| 1.6967 | 603.0 | 6030 | 2.2248 |
| 1.6967 | 604.0 | 6040 | 2.2533 |
| 1.6627 | 605.0 | 6050 | 1.8170 |
| 1.6627 | 606.0 | 6060 | 2.3989 |
| 1.6627 | 607.0 | 6070 | 2.0302 |
| 1.6627 | 608.0 | 6080 | 2.3638 |
| 1.6627 | 609.0 | 6090 | 1.9077 |
| 1.6703 | 610.0 | 6100 | 1.9806 |
| 1.6703 | 611.0 | 6110 | 1.9167 |
| 1.6703 | 612.0 | 6120 | 2.2209 |
| 1.6703 | 613.0 | 6130 | 2.2042 |
| 1.6703 | 614.0 | 6140 | 1.7366 |
| 1.6809 | 615.0 | 6150 | 2.1843 |
| 1.6809 | 616.0 | 6160 | 2.9500 |
| 1.6809 | 617.0 | 6170 | 2.1226 |
| 1.6809 | 618.0 | 6180 | 2.2124 |
| 1.6809 | 619.0 | 6190 | 2.8095 |
| 1.762 | 620.0 | 6200 | 1.9578 |
| 1.762 | 621.0 | 6210 | 2.0715 |
| 1.762 | 622.0 | 6220 | 2.1241 |
| 1.762 | 623.0 | 6230 | 2.4005 |
| 1.762 | 624.0 | 6240 | 1.9467 |
| 1.7518 | 625.0 | 6250 | 1.9363 |
| 1.7518 | 626.0 | 6260 | 2.3800 |
| 1.7518 | 627.0 | 6270 | 2.0086 |
| 1.7518 | 628.0 | 6280 | 2.0844 |
| 1.7518 | 629.0 | 6290 | 1.9936 |
| 1.7146 | 630.0 | 6300 | 2.9278 |
| 1.7146 | 631.0 | 6310 | 2.2130 |
| 1.7146 | 632.0 | 6320 | 1.8916 |
| 1.7146 | 633.0 | 6330 | 1.9770 |
| 1.7146 | 634.0 | 6340 | 1.9727 |
| 1.7078 | 635.0 | 6350 | 2.5519 |
| 1.7078 | 636.0 | 6360 | 1.8578 |
| 1.7078 | 637.0 | 6370 | 2.1396 |
| 1.7078 | 638.0 | 6380 | 2.1651 |
| 1.7078 | 639.0 | 6390 | 1.9666 |
| 1.7668 | 640.0 | 6400 | 2.1160 |
| 1.7668 | 641.0 | 6410 | 2.0328 |
| 1.7668 | 642.0 | 6420 | 2.0711 |
| 1.7668 | 643.0 | 6430 | 2.1058 |
| 1.7668 | 644.0 | 6440 | 2.0504 |
| 1.7245 | 645.0 | 6450 | 2.2605 |
| 1.7245 | 646.0 | 6460 | 2.3964 |
| 1.7245 | 647.0 | 6470 | 2.0940 |
| 1.7245 | 648.0 | 6480 | 2.4811 |
| 1.7245 | 649.0 | 6490 | 2.2603 |
| 1.66 | 650.0 | 6500 | 2.0771 |
| 1.66 | 651.0 | 6510 | 2.0068 |
| 1.66 | 652.0 | 6520 | 1.9992 |
| 1.66 | 653.0 | 6530 | 2.0482 |
| 1.66 | 654.0 | 6540 | 2.1352 |
| 1.6753 | 655.0 | 6550 | 2.0777 |
| 1.6753 | 656.0 | 6560 | 1.9601 |
| 1.6753 | 657.0 | 6570 | 2.0755 |
| 1.6753 | 658.0 | 6580 | 2.0130 |
| 1.6753 | 659.0 | 6590 | 2.5618 |
| 1.6751 | 660.0 | 6600 | 2.0391 |
| 1.6751 | 661.0 | 6610 | 1.9881 |
| 1.6751 | 662.0 | 6620 | 2.0105 |
| 1.6751 | 663.0 | 6630 | 2.0397 |
| 1.6751 | 664.0 | 6640 | 1.9171 |
| 1.7329 | 665.0 | 6650 | 2.1690 |
| 1.7329 | 666.0 | 6660 | 1.9315 |
| 1.7329 | 667.0 | 6670 | 2.4457 |
| 1.7329 | 668.0 | 6680 | 2.0552 |
| 1.7329 | 669.0 | 6690 | 2.1250 |
| 1.7304 | 670.0 | 6700 | 1.9498 |
| 1.7304 | 671.0 | 6710 | 2.1620 |
| 1.7304 | 672.0 | 6720 | 2.1663 |
| 1.7304 | 673.0 | 6730 | 2.2802 |
| 1.7304 | 674.0 | 6740 | 2.0857 |
| 1.6356 | 675.0 | 6750 | 2.2656 |
| 1.6356 | 676.0 | 6760 | 1.9959 |
| 1.6356 | 677.0 | 6770 | 2.0719 |
| 1.6356 | 678.0 | 6780 | 2.0429 |
| 1.6356 | 679.0 | 6790 | 1.9561 |
| 1.6098 | 680.0 | 6800 | 2.3071 |
| 1.6098 | 681.0 | 6810 | 2.2920 |
| 1.6098 | 682.0 | 6820 | 2.1268 |
| 1.6098 | 683.0 | 6830 | 1.9186 |
| 1.6098 | 684.0 | 6840 | 1.8820 |
| 1.6784 | 685.0 | 6850 | 2.1013 |
| 1.6784 | 686.0 | 6860 | 2.0973 |
| 1.6784 | 687.0 | 6870 | 2.3960 |
| 1.6784 | 688.0 | 6880 | 1.8338 |
| 1.6784 | 689.0 | 6890 | 2.0245 |
| 1.689 | 690.0 | 6900 | 2.1786 |
| 1.689 | 691.0 | 6910 | 2.0254 |
| 1.689 | 692.0 | 6920 | 1.9316 |
| 1.689 | 693.0 | 6930 | 1.9776 |
| 1.689 | 694.0 | 6940 | 2.1271 |
| 1.6889 | 695.0 | 6950 | 2.3542 |
| 1.6889 | 696.0 | 6960 | 2.1932 |
| 1.6889 | 697.0 | 6970 | 1.8910 |
| 1.6889 | 698.0 | 6980 | 2.1252 |
| 1.6889 | 699.0 | 6990 | 1.9726 |
| 1.7028 | 700.0 | 7000 | 2.0448 |
| 1.7028 | 701.0 | 7010 | 2.1499 |
| 1.7028 | 702.0 | 7020 | 1.8854 |
| 1.7028 | 703.0 | 7030 | 1.9297 |
| 1.7028 | 704.0 | 7040 | 2.1054 |
| 1.6484 | 705.0 | 7050 | 1.9997 |
| 1.6484 | 706.0 | 7060 | 2.0114 |
| 1.6484 | 707.0 | 7070 | 2.0139 |
| 1.6484 | 708.0 | 7080 | 2.9272 |
| 1.6484 | 709.0 | 7090 | 1.8419 |
| 1.6615 | 710.0 | 7100 | 3.2302 |
| 1.6615 | 711.0 | 7110 | 2.0337 |
| 1.6615 | 712.0 | 7120 | 2.0933 |
| 1.6615 | 713.0 | 7130 | 2.0162 |
| 1.6615 | 714.0 | 7140 | 2.0073 |
| 1.6318 | 715.0 | 7150 | 2.1256 |
| 1.6318 | 716.0 | 7160 | 1.8836 |
| 1.6318 | 717.0 | 7170 | 2.0321 |
| 1.6318 | 718.0 | 7180 | 2.0796 |
| 1.6318 | 719.0 | 7190 | 1.9985 |
| 1.7706 | 720.0 | 7200 | 2.6352 |
| 1.7706 | 721.0 | 7210 | 1.9618 |
| 1.7706 | 722.0 | 7220 | 1.8866 |
| 1.7706 | 723.0 | 7230 | 1.9311 |
| 1.7706 | 724.0 | 7240 | 2.2133 |
| 1.7221 | 725.0 | 7250 | 1.8637 |
| 1.7221 | 726.0 | 7260 | 2.1916 |
| 1.7221 | 727.0 | 7270 | 1.8545 |
| 1.7221 | 728.0 | 7280 | 2.1350 |
| 1.7221 | 729.0 | 7290 | 2.0091 |
| 1.754 | 730.0 | 7300 | 1.9316 |
| 1.754 | 731.0 | 7310 | 2.0585 |
| 1.754 | 732.0 | 7320 | 2.0417 |
| 1.754 | 733.0 | 7330 | 2.1116 |
| 1.754 | 734.0 | 7340 | 2.0630 |
| 1.6204 | 735.0 | 7350 | 1.9218 |
| 1.6204 | 736.0 | 7360 | 2.5058 |
| 1.6204 | 737.0 | 7370 | 2.2771 |
| 1.6204 | 738.0 | 7380 | 1.9493 |
| 1.6204 | 739.0 | 7390 | 2.1200 |
| 1.6891 | 740.0 | 7400 | 2.0596 |
| 1.6891 | 741.0 | 7410 | 2.0757 |
| 1.6891 | 742.0 | 7420 | 1.9904 |
| 1.6891 | 743.0 | 7430 | 2.1336 |
| 1.6891 | 744.0 | 7440 | 2.4599 |
| 1.7584 | 745.0 | 7450 | 2.1578 |
| 1.7584 | 746.0 | 7460 | 1.9749 |
| 1.7584 | 747.0 | 7470 | 2.1406 |
| 1.7584 | 748.0 | 7480 | 2.3524 |
| 1.7584 | 749.0 | 7490 | 2.0798 |
| 1.5819 | 750.0 | 7500 | 1.8948 |
| 1.5819 | 751.0 | 7510 | 1.8562 |
| 1.5819 | 752.0 | 7520 | 3.5239 |
| 1.5819 | 753.0 | 7530 | 2.2157 |
| 1.5819 | 754.0 | 7540 | 2.7353 |
| 1.6342 | 755.0 | 7550 | 2.2190 |
| 1.6342 | 756.0 | 7560 | 2.3935 |
| 1.6342 | 757.0 | 7570 | 2.0825 |
| 1.6342 | 758.0 | 7580 | 2.0174 |
| 1.6342 | 759.0 | 7590 | 1.9563 |
| 1.7279 | 760.0 | 7600 | 2.1491 |
| 1.7279 | 761.0 | 7610 | 1.9795 |
| 1.7279 | 762.0 | 7620 | 1.9805 |
| 1.7279 | 763.0 | 7630 | 1.9753 |
| 1.7279 | 764.0 | 7640 | 2.0721 |
| 1.5626 | 765.0 | 7650 | 2.1229 |
| 1.5626 | 766.0 | 7660 | 2.0831 |
| 1.5626 | 767.0 | 7670 | 2.8723 |
| 1.5626 | 768.0 | 7680 | 1.9799 |
| 1.5626 | 769.0 | 7690 | 2.0792 |
| 1.6589 | 770.0 | 7700 | 1.9836 |
| 1.6589 | 771.0 | 7710 | 1.8836 |
| 1.6589 | 772.0 | 7720 | 2.1195 |
| 1.6589 | 773.0 | 7730 | 2.2073 |
| 1.6589 | 774.0 | 7740 | 1.9880 |
| 1.641 | 775.0 | 7750 | 2.2762 |
| 1.641 | 776.0 | 7760 | 2.0996 |
| 1.641 | 777.0 | 7770 | 2.0157 |
| 1.641 | 778.0 | 7780 | 1.9012 |
| 1.641 | 779.0 | 7790 | 3.4505 |
| 1.8726 | 780.0 | 7800 | 1.9617 |
| 1.8726 | 781.0 | 7810 | 2.0913 |
| 1.8726 | 782.0 | 7820 | 1.9486 |
| 1.8726 | 783.0 | 7830 | 2.0114 |
| 1.8726 | 784.0 | 7840 | 1.9957 |
| 1.6342 | 785.0 | 7850 | 2.1678 |
| 1.6342 | 786.0 | 7860 | 2.1731 |
| 1.6342 | 787.0 | 7870 | 1.9840 |
| 1.6342 | 788.0 | 7880 | 2.2147 |
| 1.6342 | 789.0 | 7890 | 2.4845 |
| 1.6656 | 790.0 | 7900 | 2.0647 |
| 1.6656 | 791.0 | 7910 | 1.9105 |
| 1.6656 | 792.0 | 7920 | 1.9711 |
| 1.6656 | 793.0 | 7930 | 2.8114 |
| 1.6656 | 794.0 | 7940 | 2.1196 |
| 1.7298 | 795.0 | 7950 | 2.0664 |
| 1.7298 | 796.0 | 7960 | 2.2231 |
| 1.7298 | 797.0 | 7970 | 1.9946 |
| 1.7298 | 798.0 | 7980 | 2.3052 |
| 1.7298 | 799.0 | 7990 | 2.4928 |
| 1.7294 | 800.0 | 8000 | 2.0689 |
| 1.7294 | 801.0 | 8010 | 2.1222 |
| 1.7294 | 802.0 | 8020 | 1.9995 |
| 1.7294 | 803.0 | 8030 | 2.0070 |
| 1.7294 | 804.0 | 8040 | 1.8976 |
| 1.6905 | 805.0 | 8050 | 2.0889 |
| 1.6905 | 806.0 | 8060 | 2.0273 |
| 1.6905 | 807.0 | 8070 | 1.8873 |
| 1.6905 | 808.0 | 8080 | 2.5260 |
| 1.6905 | 809.0 | 8090 | 2.0703 |
| 1.6383 | 810.0 | 8100 | 2.1421 |
| 1.6383 | 811.0 | 8110 | 1.9730 |
| 1.6383 | 812.0 | 8120 | 2.2552 |
| 1.6383 | 813.0 | 8130 | 1.8962 |
| 1.6383 | 814.0 | 8140 | 2.0572 |
| 1.6897 | 815.0 | 8150 | 2.0349 |
| 1.6897 | 816.0 | 8160 | 2.0451 |
| 1.6897 | 817.0 | 8170 | 2.0762 |
| 1.6897 | 818.0 | 8180 | 2.0079 |
| 1.6897 | 819.0 | 8190 | 2.1432 |
| 1.5845 | 820.0 | 8200 | 2.5644 |
| 1.5845 | 821.0 | 8210 | 2.1259 |
| 1.5845 | 822.0 | 8220 | 2.1217 |
| 1.5845 | 823.0 | 8230 | 1.8807 |
| 1.5845 | 824.0 | 8240 | 2.2475 |
| 1.6942 | 825.0 | 8250 | 2.7079 |
| 1.6942 | 826.0 | 8260 | 2.1418 |
| 1.6942 | 827.0 | 8270 | 1.9854 |
| 1.6942 | 828.0 | 8280 | 2.1039 |
| 1.6942 | 829.0 | 8290 | 1.9488 |
| 1.5919 | 830.0 | 8300 | 2.1037 |
| 1.5919 | 831.0 | 8310 | 2.0170 |
| 1.5919 | 832.0 | 8320 | 1.8831 |
| 1.5919 | 833.0 | 8330 | 1.7501 |
| 1.5919 | 834.0 | 8340 | 2.5991 |
| 1.6626 | 835.0 | 8350 | 2.0915 |
| 1.6626 | 836.0 | 8360 | 2.0901 |
| 1.6626 | 837.0 | 8370 | 2.0779 |
| 1.6626 | 838.0 | 8380 | 1.9901 |
| 1.6626 | 839.0 | 8390 | 2.1458 |
| 1.5978 | 840.0 | 8400 | 2.1409 |
| 1.5978 | 841.0 | 8410 | 2.2341 |
| 1.5978 | 842.0 | 8420 | 2.3387 |
| 1.5978 | 843.0 | 8430 | 2.0669 |
| 1.5978 | 844.0 | 8440 | 2.1725 |
| 1.6153 | 845.0 | 8450 | 1.9977 |
| 1.6153 | 846.0 | 8460 | 2.3008 |
| 1.6153 | 847.0 | 8470 | 2.0032 |
| 1.6153 | 848.0 | 8480 | 2.0802 |
| 1.6153 | 849.0 | 8490 | 2.1358 |
| 1.6977 | 850.0 | 8500 | 2.2539 |
| 1.6977 | 851.0 | 8510 | 2.3892 |
| 1.6977 | 852.0 | 8520 | 1.8730 |
| 1.6977 | 853.0 | 8530 | 2.4494 |
| 1.6977 | 854.0 | 8540 | 1.7971 |
| 1.6117 | 855.0 | 8550 | 1.8645 |
| 1.6117 | 856.0 | 8560 | 2.1854 |
| 1.6117 | 857.0 | 8570 | 1.7846 |
| 1.6117 | 858.0 | 8580 | 2.0895 |
| 1.6117 | 859.0 | 8590 | 1.9494 |
| 1.6776 | 860.0 | 8600 | 3.0806 |
| 1.6776 | 861.0 | 8610 | 2.5941 |
| 1.6776 | 862.0 | 8620 | 1.8778 |
| 1.6776 | 863.0 | 8630 | 1.9408 |
| 1.6776 | 864.0 | 8640 | 2.0962 |
| 1.7326 | 865.0 | 8650 | 1.8876 |
| 1.7326 | 866.0 | 8660 | 1.9434 |
| 1.7326 | 867.0 | 8670 | 2.0616 |
| 1.7326 | 868.0 | 8680 | 2.4041 |
| 1.7326 | 869.0 | 8690 | 2.8890 |
| 1.6468 | 870.0 | 8700 | 2.1031 |
| 1.6468 | 871.0 | 8710 | 2.1359 |
| 1.6468 | 872.0 | 8720 | 1.8292 |
| 1.6468 | 873.0 | 8730 | 2.0762 |
| 1.6468 | 874.0 | 8740 | 2.1207 |
| 1.7116 | 875.0 | 8750 | 1.8605 |
| 1.7116 | 876.0 | 8760 | 1.8536 |
| 1.7116 | 877.0 | 8770 | 2.0260 |
| 1.7116 | 878.0 | 8780 | 2.6150 |
| 1.7116 | 879.0 | 8790 | 1.9157 |
| 1.5673 | 880.0 | 8800 | 1.9184 |
| 1.5673 | 881.0 | 8810 | 1.9319 |
| 1.5673 | 882.0 | 8820 | 2.4362 |
| 1.5673 | 883.0 | 8830 | 1.9637 |
| 1.5673 | 884.0 | 8840 | 1.8797 |
| 1.7281 | 885.0 | 8850 | 1.9358 |
| 1.7281 | 886.0 | 8860 | 2.0570 |
| 1.7281 | 887.0 | 8870 | 1.8167 |
| 1.7281 | 888.0 | 8880 | 2.4525 |
| 1.7281 | 889.0 | 8890 | 2.0002 |
| 1.6826 | 890.0 | 8900 | 2.1198 |
| 1.6826 | 891.0 | 8910 | 2.0699 |
| 1.6826 | 892.0 | 8920 | 1.9274 |
| 1.6826 | 893.0 | 8930 | 2.1415 |
| 1.6826 | 894.0 | 8940 | 2.2883 |
| 1.6198 | 895.0 | 8950 | 2.0476 |
| 1.6198 | 896.0 | 8960 | 2.2307 |
| 1.6198 | 897.0 | 8970 | 2.0366 |
| 1.6198 | 898.0 | 8980 | 2.2318 |
| 1.6198 | 899.0 | 8990 | 1.8846 |
| 1.6745 | 900.0 | 9000 | 2.1018 |
| 1.6745 | 901.0 | 9010 | 1.9280 |
| 1.6745 | 902.0 | 9020 | 1.9235 |
| 1.6745 | 903.0 | 9030 | 1.9320 |
| 1.6745 | 904.0 | 9040 | 2.0586 |
| 1.6756 | 905.0 | 9050 | 2.2404 |
| 1.6756 | 906.0 | 9060 | 1.7918 |
| 1.6756 | 907.0 | 9070 | 2.0683 |
| 1.6756 | 908.0 | 9080 | 2.1354 |
| 1.6756 | 909.0 | 9090 | 1.8801 |
| 1.6787 | 910.0 | 9100 | 1.9743 |
| 1.6787 | 911.0 | 9110 | 1.9033 |
| 1.6787 | 912.0 | 9120 | 1.9763 |
| 1.6787 | 913.0 | 9130 | 2.4240 |
| 1.6787 | 914.0 | 9140 | 2.1385 |
| 1.7097 | 915.0 | 9150 | 2.1198 |
| 1.7097 | 916.0 | 9160 | 2.0050 |
| 1.7097 | 917.0 | 9170 | 2.2088 |
| 1.7097 | 918.0 | 9180 | 2.1206 |
| 1.7097 | 919.0 | 9190 | 2.0948 |
| 1.6659 | 920.0 | 9200 | 1.8802 |
| 1.6659 | 921.0 | 9210 | 2.1338 |
| 1.6659 | 922.0 | 9220 | 2.1038 |
| 1.6659 | 923.0 | 9230 | 1.9181 |
| 1.6659 | 924.0 | 9240 | 2.7046 |
| 1.6811 | 925.0 | 9250 | 2.0183 |
| 1.6811 | 926.0 | 9260 | 1.8901 |
| 1.6811 | 927.0 | 9270 | 1.9689 |
| 1.6811 | 928.0 | 9280 | 2.0394 |
| 1.6811 | 929.0 | 9290 | 2.2120 |
| 1.6563 | 930.0 | 9300 | 2.0195 |
| 1.6563 | 931.0 | 9310 | 1.9242 |
| 1.6563 | 932.0 | 9320 | 1.9250 |
| 1.6563 | 933.0 | 9330 | 2.0381 |
| 1.6563 | 934.0 | 9340 | 2.0593 |
| 1.6305 | 935.0 | 9350 | 2.0884 |
| 1.6305 | 936.0 | 9360 | 2.2510 |
| 1.6305 | 937.0 | 9370 | 2.1661 |
| 1.6305 | 938.0 | 9380 | 2.1428 |
| 1.6305 | 939.0 | 9390 | 1.9285 |
| 1.7281 | 940.0 | 9400 | 2.2593 |
| 1.7281 | 941.0 | 9410 | 1.9035 |
| 1.7281 | 942.0 | 9420 | 2.1112 |
| 1.7281 | 943.0 | 9430 | 1.8724 |
| 1.7281 | 944.0 | 9440 | 2.1733 |
| 1.7082 | 945.0 | 9450 | 2.0155 |
| 1.7082 | 946.0 | 9460 | 2.3869 |
| 1.7082 | 947.0 | 9470 | 1.8851 |
| 1.7082 | 948.0 | 9480 | 2.0056 |
| 1.7082 | 949.0 | 9490 | 2.2667 |
| 1.6896 | 950.0 | 9500 | 1.8944 |
| 1.6896 | 951.0 | 9510 | 2.1082 |
| 1.6896 | 952.0 | 9520 | 1.9545 |
| 1.6896 | 953.0 | 9530 | 1.8668 |
| 1.6896 | 954.0 | 9540 | 2.0611 |
| 1.6217 | 955.0 | 9550 | 1.9020 |
| 1.6217 | 956.0 | 9560 | 1.9017 |
| 1.6217 | 957.0 | 9570 | 1.8864 |
| 1.6217 | 958.0 | 9580 | 1.8889 |
| 1.6217 | 959.0 | 9590 | 2.1421 |
| 1.6448 | 960.0 | 9600 | 2.0292 |
| 1.6448 | 961.0 | 9610 | 1.9317 |
| 1.6448 | 962.0 | 9620 | 2.1516 |
| 1.6448 | 963.0 | 9630 | 1.9716 |
| 1.6448 | 964.0 | 9640 | 2.1114 |
| 1.6824 | 965.0 | 9650 | 2.1036 |
| 1.6824 | 966.0 | 9660 | 2.0659 |
| 1.6824 | 967.0 | 9670 | 1.9232 |
| 1.6824 | 968.0 | 9680 | 1.9512 |
| 1.6824 | 969.0 | 9690 | 1.9665 |
| 1.6428 | 970.0 | 9700 | 2.0697 |
| 1.6428 | 971.0 | 9710 | 2.4811 |
| 1.6428 | 972.0 | 9720 | 2.2800 |
| 1.6428 | 973.0 | 9730 | 2.0109 |
| 1.6428 | 974.0 | 9740 | 1.9637 |
| 1.6795 | 975.0 | 9750 | 1.7978 |
| 1.6795 | 976.0 | 9760 | 2.2613 |
| 1.6795 | 977.0 | 9770 | 2.0626 |
| 1.6795 | 978.0 | 9780 | 1.9644 |
| 1.6795 | 979.0 | 9790 | 1.9700 |
| 1.668 | 980.0 | 9800 | 2.0342 |
| 1.668 | 981.0 | 9810 | 1.9443 |
| 1.668 | 982.0 | 9820 | 1.9675 |
| 1.668 | 983.0 | 9830 | 1.8887 |
| 1.668 | 984.0 | 9840 | 1.9073 |
| 1.6776 | 985.0 | 9850 | 2.0161 |
| 1.6776 | 986.0 | 9860 | 1.8777 |
| 1.6776 | 987.0 | 9870 | 2.4692 |
| 1.6776 | 988.0 | 9880 | 2.0462 |
| 1.6776 | 989.0 | 9890 | 1.9776 |
| 1.744 | 990.0 | 9900 | 2.0838 |
| 1.744 | 991.0 | 9910 | 2.1438 |
| 1.744 | 992.0 | 9920 | 2.2172 |
| 1.744 | 993.0 | 9930 | 2.4513 |
| 1.744 | 994.0 | 9940 | 1.8723 |
| 1.644 | 995.0 | 9950 | 2.9081 |
| 1.644 | 996.0 | 9960 | 1.8090 |
| 1.644 | 997.0 | 9970 | 1.9621 |
| 1.644 | 998.0 | 9980 | 2.1157 |
| 1.644 | 999.0 | 9990 | 1.9026 |
| 1.6751 | 1000.0 | 10000 | 2.1397 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"button",
"textfield"
] |
madhutry/detr-finetuned-scrn-38samples-1-epoch157 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
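No official snippet is provided; a minimal, hypothetical sketch using the 🤗 `pipeline` API is shown below. The model id is taken from this repository, but the image path, the score threshold, and the assumption that the checkpoint works with the generic object-detection pipeline are placeholders rather than statements from the authors.

```python
# Hypothetical usage sketch -- not the authors' official example.
from transformers import pipeline

# Model id taken from this repository; labels are the generic label_0 ... label_11.
detector = pipeline(
    "object-detection",
    model="madhutry/detr-finetuned-scrn-38samples-1-epoch157",
)

# "screenshot.png" is a placeholder path for an input image.
for det in detector("screenshot.png", threshold=0.5):
    print(det["label"], round(det["score"], 3), det["box"])
```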
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11"
] |
madhutry/detr-finetuned-scrn-28samples-1-epoch150 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11"
] |
vietlethe/bkad-deformable-detr_test |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
vietlethe/bkad-deformable-detr_last |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
vietlethe/bkad-deformable-detr_best |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
letuminhtuan/bkad-deformable-detr_last |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
letuminhtuan/bkad-deformable-detr_best |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3"
] |
mrdbourke/detr_finetuned_trashify_box_detector_with_data_aug |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_trashify_box_detector_with_data_aug
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0704
## Model description
More information needed
## Intended uses & limitations
More information needed
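In the absence of an official snippet, a minimal inference sketch is given below. It assumes the checkpoint is public on the Hub and follows the standard 🤗 object-detection API (as its conditional-DETR base model does); the image path and score threshold are placeholders.

```python
# Hedged inference sketch -- assumes the standard transformers object-detection API.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

model_id = "mrdbourke/detr_finetuned_trashify_box_detector_with_data_aug"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForObjectDetection.from_pretrained(model_id)

image = Image.open("example.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to (score, label, box) in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
result = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(result["scores"], result["labels"], result["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```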
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 25
- mixed_precision_training: Native AMP
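Expressed as 🤗 `TrainingArguments`, these settings would look roughly like the sketch below; the `output_dir` and everything not listed above (dataset, collator, model setup) are assumptions rather than details from the card.

```python
# Hedged sketch of TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr_finetuned_trashify_box_detector_with_data_aug",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=25,
    fp16=True,  # "Native AMP" mixed precision
)
```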
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 100.4735 | 1.0 | 50 | 8.0297 |
| 4.369 | 2.0 | 100 | 2.7376 |
| 2.5518 | 3.0 | 150 | 2.1839 |
| 2.2226 | 4.0 | 200 | 1.9228 |
| 1.9906 | 5.0 | 250 | 1.7408 |
| 1.8219 | 6.0 | 300 | 1.5573 |
| 1.6974 | 7.0 | 350 | 1.4779 |
| 1.6027 | 8.0 | 400 | 1.4510 |
| 1.5517 | 9.0 | 450 | 1.3711 |
| 1.4491 | 10.0 | 500 | 1.3177 |
| 1.4335 | 11.0 | 550 | 1.2811 |
| 1.3645 | 12.0 | 600 | 1.2475 |
| 1.3314 | 13.0 | 650 | 1.2060 |
| 1.2973 | 14.0 | 700 | 1.1874 |
| 1.2506 | 15.0 | 750 | 1.1794 |
| 1.2319 | 16.0 | 800 | 1.1657 |
| 1.1479 | 17.0 | 850 | 1.1300 |
| 1.1466 | 18.0 | 900 | 1.1179 |
| 1.1138 | 19.0 | 950 | 1.1095 |
| 1.1153 | 20.0 | 1000 | 1.0961 |
| 1.0894 | 21.0 | 1050 | 1.0790 |
| 1.0691 | 22.0 | 1100 | 1.0870 |
| 1.0619 | 23.0 | 1150 | 1.0804 |
| 1.0459 | 24.0 | 1200 | 1.0717 |
| 1.0363 | 25.0 | 1250 | 1.0704 |
### Framework versions
- Transformers 4.45.0.dev0
- Pytorch 2.4.0+cu124
- Datasets 2.21.0
- Tokenizers 0.19.1
| [
"bin",
"hand",
"not_bin",
"not_hand",
"not_trash",
"trash",
"trash_arm"
] |
cems-official/panels_detection_rtdetr |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# panels_detection_rtdetr
This model is a fine-tuned version of [PekingU/rtdetr_r101vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r101vd_coco_o365) on the None dataset.
It achieves the following results on the evaluation set (a sketch of how these COCO-style metrics can be computed follows the list):
- Loss: 9.5718
- Map: 0.5617
- Map 50: 0.6631
- Map 75: 0.6137
- Map Small: -1.0
- Map Medium: 0.3451
- Map Large: 0.5935
- Mar 1: 0.6546
- Mar 10: 0.7877
- Mar 100: 0.8058
- Mar Small: -1.0
- Mar Medium: 0.5802
- Mar Large: 0.8672
- Map Radar (small): 0.3509
- Mar 100 Radar (small): 0.8077
- Map Ship management system (small): 0.6748
- Mar 100 Ship management system (small): 0.8933
- Map Radar (large): 0.5846
- Mar 100 Radar (large): 0.8624
- Map Ship management system (large): 0.7577
- Mar 100 Ship management system (large): 0.9341
- Map Ship management system (top): 0.789
- Mar 100 Ship management system (top): 0.8356
- Map Ecdis (large): 0.3281
- Mar 100 Ecdis (large): 0.7652
- Map Visual observation (small): 0.585
- Mar 100 Visual observation (small): 0.902
- Map Ecdis (small): 0.7635
- Mar 100 Ecdis (small): 0.8967
- Map Ship management system (table top): 0.6306
- Mar 100 Ship management system (table top): 0.7882
- Map Thruster control: 0.4949
- Mar 100 Thruster control: 0.7447
- Map Visual observation (left): 0.6062
- Mar 100 Visual observation (left): 0.8395
- Map Visual observation (mid): 0.7946
- Mar 100 Visual observation (mid): 0.8901
- Map Visual observation (right): 0.7446
- Mar 100 Visual observation (right): 0.8966
- Map Bow thruster: 0.2392
- Mar 100 Bow thruster: 0.5167
- Map Me telegraph: 0.0825
- Mar 100 Me telegraph: 0.5143
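The Map/Mar values above are COCO-style mean average precision and recall figures. The sketch below shows, under stated assumptions, how such numbers are typically computed with `torchmetrics`; the boxes, scores, and labels are illustrative placeholders, not real panel detections.

```python
# Illustrative computation of COCO-style mAP/mAR with torchmetrics (placeholder data).
import torch
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 200.0, 180.0]]),
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([0]),  # e.g. class 0 = "radar (small)"
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 198.0, 184.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
print(metric.compute())  # map, map_50, map_75, mar_1, mar_10, mar_100, per-class values, ...
```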
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 7
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Radar (small) | Mar 100 Radar (small) | Map Ship management system (small) | Mar 100 Ship management system (small) | Map Radar (large) | Mar 100 Radar (large) | Map Ship management system (large) | Mar 100 Ship management system (large) | Map Ship management system (top) | Mar 100 Ship management system (top) | Map Ecdis (large) | Mar 100 Ecdis (large) | Map Visual observation (small) | Mar 100 Visual observation (small) | Map Ecdis (small) | Mar 100 Ecdis (small) | Map Ship management system (table top) | Mar 100 Ship management system (table top) | Map Thruster control | Mar 100 Thruster control | Map Visual observation (left) | Mar 100 Visual observation (left) | Map Visual observation (mid) | Mar 100 Visual observation (mid) | Map Visual observation (right) | Mar 100 Visual observation (right) | Map Bow thruster | Mar 100 Bow thruster | Map Me telegraph | Mar 100 Me telegraph |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:----------------------------------:|:--------------------------------------:|:-----------------:|:---------------------:|:----------------------------------:|:--------------------------------------:|:--------------------------------:|:------------------------------------:|:-----------------:|:---------------------:|:------------------------------:|:----------------------------------:|:-----------------:|:---------------------:|:--------------------------------------:|:------------------------------------------:|:--------------------:|:------------------------:|:-----------------------------:|:---------------------------------:|:----------------------------:|:--------------------------------:|:------------------------------:|:----------------------------------:|:----------------:|:--------------------:|:----------------:|:--------------------:|
| 14.2599 | 1.0 | 699 | 9.6242 | 0.4769 | 0.5404 | 0.5144 | -1.0 | 0.2274 | 0.5416 | 0.5866 | 0.755 | 0.7709 | -1.0 | 0.4884 | 0.8359 | 0.7408 | 0.92 | 0.672 | 0.8827 | 0.7054 | 0.9504 | 0.8329 | 0.926 | 0.7965 | 0.8692 | 0.3419 | 0.9571 | 0.2734 | 0.8627 | 0.1207 | 0.6933 | 0.4841 | 0.7059 | 0.3541 | 0.6947 | 0.5303 | 0.8961 | 0.8393 | 0.9342 | 0.2629 | 0.8466 | 0.1988 | 0.3583 | 0.0011 | 0.0667 |
| 8.9356 | 2.0 | 1398 | 9.1941 | 0.5527 | 0.6652 | 0.6044 | -1.0 | 0.3212 | 0.574 | 0.6512 | 0.7882 | 0.8015 | -1.0 | 0.6608 | 0.8085 | 0.6989 | 0.8862 | 0.5273 | 0.8053 | 0.7683 | 0.9145 | 0.7209 | 0.9073 | 0.7995 | 0.8644 | 0.4929 | 0.833 | 0.4034 | 0.8392 | 0.5519 | 0.8333 | 0.6453 | 0.8618 | 0.4221 | 0.6447 | 0.5734 | 0.8474 | 0.8714 | 0.8973 | 0.412 | 0.8448 | 0.3154 | 0.5333 | 0.0874 | 0.5095 |
| 8.1388 | 3.0 | 2097 | 9.7524 | 0.535 | 0.6013 | 0.5854 | -1.0 | 0.2545 | 0.574 | 0.6219 | 0.7425 | 0.7612 | -1.0 | 0.538 | 0.8183 | 0.6358 | 0.8292 | 0.5844 | 0.8013 | 0.6721 | 0.8368 | 0.7422 | 0.8829 | 0.7144 | 0.8096 | 0.4904 | 0.8562 | 0.7623 | 0.9078 | 0.5667 | 0.89 | 0.6409 | 0.7824 | 0.1853 | 0.5763 | 0.5453 | 0.7789 | 0.8362 | 0.9 | 0.5862 | 0.9207 | 0.0384 | 0.3833 | 0.0248 | 0.2619 |
| 7.5951 | 4.0 | 2796 | 9.3983 | 0.5991 | 0.7001 | 0.6587 | -1.0 | 0.3745 | 0.6167 | 0.6957 | 0.8036 | 0.8188 | -1.0 | 0.6611 | 0.8746 | 0.603 | 0.8538 | 0.626 | 0.88 | 0.6211 | 0.8496 | 0.8218 | 0.9382 | 0.8062 | 0.8433 | 0.3917 | 0.8804 | 0.6202 | 0.851 | 0.8307 | 0.9433 | 0.555 | 0.8147 | 0.5143 | 0.8 | 0.6609 | 0.8579 | 0.887 | 0.9369 | 0.7174 | 0.8759 | 0.2732 | 0.5333 | 0.0579 | 0.4238 |
| 7.1786 | 5.0 | 3495 | 9.1194 | 0.6117 | 0.7144 | 0.6689 | -1.0 | 0.3458 | 0.6476 | 0.6904 | 0.8136 | 0.8324 | -1.0 | 0.6649 | 0.8777 | 0.5 | 0.8538 | 0.6723 | 0.8733 | 0.7272 | 0.8795 | 0.778 | 0.9398 | 0.7803 | 0.8385 | 0.3389 | 0.8509 | 0.6484 | 0.8804 | 0.7914 | 0.9433 | 0.7053 | 0.8059 | 0.6257 | 0.8447 | 0.5945 | 0.8658 | 0.8411 | 0.9009 | 0.7812 | 0.9397 | 0.2863 | 0.5792 | 0.1053 | 0.4905 |
| 7.1386 | 6.0 | 4194 | 9.9394 | 0.5353 | 0.634 | 0.5921 | -1.0 | 0.3062 | 0.5549 | 0.6429 | 0.7691 | 0.7874 | -1.0 | 0.5638 | 0.8364 | 0.3431 | 0.7631 | 0.6563 | 0.8813 | 0.5789 | 0.8393 | 0.6941 | 0.9236 | 0.721 | 0.7712 | 0.4061 | 0.8018 | 0.5685 | 0.8725 | 0.7656 | 0.91 | 0.5317 | 0.8 | 0.5194 | 0.7684 | 0.5191 | 0.8039 | 0.7994 | 0.8586 | 0.6714 | 0.8793 | 0.2223 | 0.4958 | 0.0333 | 0.4429 |
| 7.0912 | 7.0 | 4893 | 9.5718 | 0.5617 | 0.6631 | 0.6137 | -1.0 | 0.3451 | 0.5935 | 0.6546 | 0.7877 | 0.8058 | -1.0 | 0.5802 | 0.8672 | 0.3509 | 0.8077 | 0.6748 | 0.8933 | 0.5846 | 0.8624 | 0.7577 | 0.9341 | 0.789 | 0.8356 | 0.3281 | 0.7652 | 0.585 | 0.902 | 0.7635 | 0.8967 | 0.6306 | 0.7882 | 0.4949 | 0.7447 | 0.6062 | 0.8395 | 0.7946 | 0.8901 | 0.7446 | 0.8966 | 0.2392 | 0.5167 | 0.0825 | 0.5143 |
### Framework versions
- Transformers 4.46.0
- Pytorch 2.5.0+cu121
- Datasets 3.0.2
- Tokenizers 0.20.1
| [
"radar (small)",
"ship management system (small)",
"radar (large)",
"ship management system (large)",
"ship management system (top)",
"ecdis (large)",
"visual observation (small)",
"ecdis (small)",
"ship management system (table top)",
"thruster control",
"visual observation (left)",
"visual observation (mid)",
"visual observation (right)",
"bow thruster",
"me telegraph"
] |
Aryan-401/tumor-detect-yolo-small |
# Model Trained Using AutoTrain
- Problem type: Object Detection
## Validation Metrics
loss: 1.1927123069763184
map: 0.2537
map_50: 0.502
map_75: 0.2303
map_small: 0.0016
map_medium: 0.1325
map_large: 0.3796
mar_1: 0.3388
mar_10: 0.4646
mar_100: 0.5012
mar_small: 0.0456
mar_medium: 0.4027
mar_large: 0.678 | [
"brain-tumor",
"label0",
"label1",
"label2"
] |
tech-aloa/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.1.0
- Tokenizers 0.19.1
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
xiuqhou/relation-detr-resnet50 |
# Relation DETR model with ResNet-50 backbone
## Model Details
The model weights are not available yet. We are working on integrating Relation-DETR into transformers and will update this card as soon as possible.
### Model Description

> This paper presents a general scheme for enhancing the convergence and performance of DETR (DEtection TRansformer).
> We investigate the slow convergence problem in transformers from a new perspective, suggesting that it arises from
> the self-attention that introduces no structural bias over inputs. To address this issue, we explore incorporating
> position relation prior as attention bias to augment object detection, following the verification of its statistical
> significance using a proposed quantitative macroscopic correlation (MC) metric. Our approach, termed Relation-DETR,
> introduces an encoder to construct position relation embeddings for progressive attention refinement, which further
> extends the traditional streaming pipeline of DETR into a contrastive relation pipeline to address the conflicts
> between non-duplicate predictions and positive supervision. Extensive experiments on both generic and task-specific
> datasets demonstrate the effectiveness of our approach. Under the same configurations, Relation-DETR achieves a
> significant improvement (+2.0% AP compared to DINO), state-of-the-art performance (51.7% AP for 1x and 52.1% AP
> for 2x settings), and a remarkably faster convergence speed (over 40% AP with only 2 training epochs) than existing
> DETR detectors on COCO val2017. Moreover, the proposed relation encoder serves as a universal plug-in-and-play component,
> bringing clear improvements for theoretically any DETR-like methods. Furthermore, we introduce a class-agnostic detection
> dataset, SA-Det-100k. The experimental results on the dataset illustrate that the proposed explicit position relation
> achieves a clear improvement of 1.3% AP, highlighting its potential towards universal object detection.
> The code and dataset are available at [this https URL](https://github.com/xiuqhou/Relation-DETR).
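For intuition only, here is a toy sketch of the core idea described in the abstract, i.e. turning pairwise box geometry into an attention bias so self-attention is no longer structure-free. The (cx, cy, w, h) box format, the tiny MLP, and the per-head scalar bias are illustrative assumptions; this is not the Relation-DETR implementation.
```python
import torch

def box_relation_bias(boxes: torch.Tensor, num_heads: int, hidden_dim: int = 32) -> torch.Tensor:
    """Toy pairwise position-relation bias (illustrative only, NOT Relation-DETR).

    Encodes relative box geometry (log-scale center offsets and size ratios)
    and projects it to one scalar bias per attention head.
    boxes: (N, 4) in (cx, cy, w, h) format, assumed positive.
    """
    cx, cy, w, h = boxes.unbind(-1)
    dx = torch.log((cx[:, None] - cx[None, :]).abs() / w[:, None] + 1e-6)
    dy = torch.log((cy[:, None] - cy[None, :]).abs() / h[:, None] + 1e-6)
    dw = torch.log(w[:, None] / w[None, :])
    dh = torch.log(h[:, None] / h[None, :])
    rel = torch.stack([dx, dy, dw, dh], dim=-1)           # (N, N, 4) relation features
    mlp = torch.nn.Sequential(                            # hypothetical tiny projection head
        torch.nn.Linear(4, hidden_dim), torch.nn.ReLU(),
        torch.nn.Linear(hidden_dim, num_heads),
    )
    return mlp(rel).permute(2, 0, 1)                      # (num_heads, N, N) bias

# The bias is added to the per-head attention logits before the softmax.
boxes = torch.rand(5, 4)                                  # 5 dummy boxes
attn_logits = torch.rand(8, 5, 5)                         # 8 heads, 5 queries
attn = torch.softmax(attn_logits + box_relation_bias(boxes, num_heads=8), dim=-1)
```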
- **Developed by:** Xiuquan Hou
- **Shared by:** Xiuquan Hou
- **Model type:** Relation DETR
- **License:** Apache-2.0
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** [https://github.com/xiuqhou/Relation-DETR](https://github.com/xiuqhou/Relation-DETR)
- **Paper:** [Relation DETR: Exploring Explicit Position Relation Prior for Object Detection](https://arxiv.org/abs/2407.11699)
<!-- - **Demo [optional]:** [More Information Needed] -->
## How to Get Started with the Model
Use the code below to get started with the model.
```python
import torch
import requests
from PIL import Image
from transformers import RelationDetrForObjectDetection, RelationDetrImageProcessor
url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)
# Load the image processor and model weights for this card's checkpoint
image_processor = RelationDetrImageProcessor.from_pretrained("xiuqhou/relation-detr-resnet50")
model = RelationDetrForObjectDetection.from_pretrained("xiuqhou/relation-detr-resnet50")
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.3)
for result in results:
for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
score, label = score.item(), label_id.item()
box = [round(i, 2) for i in box.tolist()]
print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```
This should output
```python
cat: 0.96 [343.8, 24.9, 639.52, 371.71]
cat: 0.95 [12.6, 54.34, 316.37, 471.86]
remote: 0.95 [40.09, 73.49, 175.52, 118.06]
remote: 0.90 [333.09, 76.71, 369.77, 187.4]
couch: 0.90 [0.44, 0.53, 640.44, 475.54]
```
## Training Details
The Relation DEtection TRansformer (Relation DETR) model was trained on [COCO 2017 object detection](https://cocodataset.org/#download) (118k annotated images) for 12 epochs (the 1x schedule).
## Evaluation
| Model | Backbone | Epoch | mAP | AP<sub>50 | AP<sub>75 | AP<sub>S | AP<sub>M | AP<sub>L |
| ------------------- | -------------------- | :---: | :---: | :-------: | :-------: | :------: | :------: | :------: |
| Relation DETR | ResNet50 | 12 | 51.7 | 69.1 | 56.3 | 36.1 | 55.6 | 66.1 |
| Relation DETR | Swin-L<sub>(IN-22K) | 12 | 57.8 | 76.1 | 62.9 | 41.2 | 62.1 | 74.4 |
| Relation DETR | ResNet50 | 24 | 52.1 | 69.7 | 56.6 | 36.1 | 56.0 | 66.5 |
| Relation DETR | Swin-L<sub>(IN-22K) | 24 | 58.1 | 76.4 | 63.5 | 41.8 | 63.0 | 73.5 |
| Relation-DETR<sup>† | Focal-L<sub>(IN-22K) | 4+24 | 63.5 | 80.8 | 69.1 | 47.2 | 66.9 | 77.0 |
† denotes a model finetuned on COCO after pretraining on Objects365.
## Model Architecture and Objective


## Citation and BibTeX
```
@misc{hou2024relationdetrexploringexplicit,
title={Relation DETR: Exploring Explicit Position Relation Prior for Object Detection},
author={Xiuquan Hou and Meiqin Liu and Senlin Zhang and Ping Wei and Badong Chen and Xuguang Lan},
year={2024},
eprint={2407.11699},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2407.11699},
}
```
## Model Card Authors
[xiuqhou](https://huggingface.co/xiuqhou)
| [
"none",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"none",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"none",
"backpack",
"umbrella",
"none",
"none",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"none",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"none",
"dining table",
"none",
"none",
"toilet",
"none",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"none",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
datamonster/ms-cond-detr-res-50-vehicles |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ms-cond-detr-res-50-vehicles
This model is a fine-tuned version of [datamonster/ms-cond-detr-res-50-vehicles](https://huggingface.co/datamonster/ms-cond-detr-res-50-vehicles) on the None dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.8407
- eval_map: 0.5078
- eval_map_50: 0.8475
- eval_map_75: 0.543
- eval_map_small: 0.1375
- eval_map_medium: 0.5038
- eval_map_large: 0.7143
- eval_mar_1: 0.2612
- eval_mar_10: 0.5658
- eval_mar_100: 0.6127
- eval_mar_small: 0.3327
- eval_mar_medium: 0.6156
- eval_mar_large: 0.7981
- eval_map_motorbike: 0.3689
- eval_mar_100_motorbike: 0.4835
- eval_map_car: 0.5308
- eval_mar_100_car: 0.6198
- eval_map_bus: 0.5808
- eval_mar_100_bus: 0.6911
- eval_map_container: 0.5506
- eval_mar_100_container: 0.6566
- eval_runtime: 144.8577
- eval_samples_per_second: 15.898
- eval_steps_per_second: 1.988
- epoch: 36.0
- step: 82980
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- num_epochs: 60
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0
| [
"motorbike",
"car",
"bus",
"container"
] |
apolloparty/rtdetr_v2_r101vd |
lyuwenyu's RT-DETRv2_101vd checkpoint, converted to be usable with Transformers
Origin:
- https://github.com/lyuwenyu/RT-DETR/tree/main/rtdetrv2_pytorch
Converter:
- https://huggingface.co/jadechoghari/RT-DETRv2/tree/main
- Modified model_type in config.json so the checkpoint can be loaded with Transformers
Tested and working with:
- Transformers with RTDetrForObjectDetection, RTDetrImageProcessor, AutoModelForObjectDetection, AutoImageProcessor
- supervision with SAHI and annotators (except segmentation)
Code example (coming soon)
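Until the official example lands, here is a minimal inference sketch using the classes listed above (RTDetrForObjectDetection / RTDetrImageProcessor); the sample image URL and the 0.5 confidence threshold are illustrative choices, not part of the original card.
```python
import torch
import requests
from PIL import Image
from transformers import RTDetrForObjectDetection, RTDetrImageProcessor

# Load the converted RT-DETRv2 checkpoint with the RT-DETR classes from Transformers
image_processor = RTDetrImageProcessor.from_pretrained("apolloparty/rtdetr_v2_r101vd")
model = RTDetrForObjectDetection.from_pretrained("apolloparty/rtdetr_v2_r101vd")

# Run inference on a sample COCO image
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Keep detections above the confidence threshold and map label ids back to class names
results = image_processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)
for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        box = [round(c, 2) for c in box.tolist()]
        print(f"{model.config.id2label[label_id.item()]}: {score:.2f} {box}")
```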
Source:
@misc{lv2023detrs,
title={DETRs Beat YOLOs on Real-time Object Detection},
author={Wenyu Lv and Shangliang Xu and Yian Zhao and Guanzhong Wang and Jinman Wei and Cheng Cui and Yuning Du and Qingqing Dang and Yi Liu},
year={2023},
eprint={2304.08069},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
@misc{lv2024rtdetrv2improvedbaselinebagoffreebies,
title={RT-DETRv2: Improved Baseline with Bag-of-Freebies for Real-Time Detection Transformer},
author={Wenyu Lv and Yian Zhao and Qinyao Chang and Kui Huang and Guanzhong Wang and Yi Liu},
year={2024},
eprint={2407.17140},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2407.17140},
} | [
"person",
"bicycle",
"car",
"motorbike",
"aeroplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"backpack",
"umbrella",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"sofa",
"pottedplant",
"bed",
"diningtable",
"toilet",
"tvmonitor",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
turalchik/detr-resnet-50-dc5-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"no_tb",
"tb"
] |
Poonam2110/detr-finetuned-bonefracture-5epochs |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6"
] |
Poonam2110/detr-finetuned-bonefracture-100epochs |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6"
] |
madhutry/detr-finetuned-scrn-expanded-1 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11"
] |
apkonsta/table-transformer-detection-ifrs |
# Model Card for Model ID
This repository contains a fine-tuned version of the Table Transformer model, specifically adapted for detecting tables in IFRS (International Financial Reporting Standards) PDFs. The model is based on the Table Transformer architecture, which is designed to extract tables from unstructured documents such as PDFs and images.
## Model Details
**Base Model:** microsoft/table-transformer-detection
**Library:** transformers
**Training Data:** The model was trained on a dataset consisting of 2359 IFRS scans, with a focus on detecting tables without borders.
**Classes**: The model is trained to detect two classes: 0 - table (regular tables) and 1 - table_rotated (rotated tables).
## Example Image

# Usage
```python
from transformers import TableTransformerForObjectDetection, DetrImageProcessor
from PIL import Image
import torch
# Load the image processor and model
# DetrImageProcessor is used to preprocess the images before feeding them to the model
image_processor = DetrImageProcessor()
# Load the pre-trained TableTransformer model for object detection
# This model is specifically trained for detecting tables in IFRS documents
model = TableTransformerForObjectDetection.from_pretrained(
"apkonsta/table-transformer-detection-ifrs",
)
# Prepare the image
# Open the image file and convert it to RGB format
image = Image.open("path/to/your/ifrs_pdf_page.png").convert("RGB")
# Table detection threshold
# Set a threshold for detecting tables; only detections with a confidence score above this threshold will be considered
TD_th = 0.5
# Preprocess the image using the image processor
# The image is encoded into a format that the model can understand
encoding = image_processor(image, return_tensors="pt")
# Perform inference without computing gradients (saves memory and computations)
with torch.no_grad():
outputs = model(**encoding)
# Get the probabilities for each detected object
# The softmax function is applied to the logits to get probabilities
probas = outputs.logits.softmax(-1)[0, :, :-1]
# Keep only the detections with a confidence score above the threshold
keep = probas.max(-1).values > TD_th
# Get the target sizes for post-processing
# The target sizes are the dimensions of the original image
target_sizes = torch.tensor(image.size[::-1]).unsqueeze(0)
# Post-process the model outputs to get the final bounding boxes
# The bounding boxes are scaled back to the original image size
postprocessed_outputs = image_processor.post_process(outputs, target_sizes)
bboxes_scaled = postprocessed_outputs[0]["boxes"][keep]
```
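As a small follow-up (not part of the original snippet), the scaled boxes can be used to crop each detected table out of the page; this assumes the `image` and `bboxes_scaled` variables defined above.
```python
# post_process returns boxes as (xmin, ymin, xmax, ymax) in absolute pixel coordinates,
# so each detected table can be cropped straight out of the page image.
for i, box in enumerate(bboxes_scaled.tolist()):
    xmin, ymin, xmax, ymax = (int(round(c)) for c in box)
    table_crop = image.crop((xmin, ymin, xmax, ymax))
    table_crop.save(f"table_{i}.png")
```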
| [
"table",
"table rotated"
] |
MalyO2/detr-resnet-50-dc5-baloon-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-baloon-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9290
- Map: 0.3296
- Map 50: 0.4931
- Map 75: 0.3456
- Map Small: 0.2295
- Map Medium: 0.3651
- Map Large: 0.6228
- Mar 1: 0.1023
- Mar 10: 0.4488
- Mar 100: 0.6953
- Mar Small: 0.54
- Mar Medium: 0.6792
- Mar Large: 0.9111
- Map Object: -1.0
- Mar 100 Object: -1.0
- Map Balloon: 0.3296
- Mar 100 Balloon: 0.6953
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Object | Mar 100 Object | Map Balloon | Mar 100 Balloon |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:-----------:|:---------------:|
| 2.267 | 0.1538 | 2 | 5.0531 | 0.003 | 0.0041 | 0.0041 | 0.0 | 0.0012 | 0.0115 | 0.0163 | 0.0186 | 0.0605 | 0.0 | 0.0375 | 0.1889 | -1.0 | -1.0 | 0.003 | 0.0605 |
| 6.4786 | 0.3077 | 4 | 5.0531 | 0.003 | 0.0041 | 0.0041 | 0.0 | 0.0012 | 0.0115 | 0.0163 | 0.0186 | 0.0605 | 0.0 | 0.0375 | 0.1889 | -1.0 | -1.0 | 0.003 | 0.0605 |
| 3.5436 | 0.4615 | 6 | 4.8974 | 0.0045 | 0.0057 | 0.0057 | 0.0 | 0.0035 | 0.0165 | 0.0326 | 0.0581 | 0.0814 | 0.0 | 0.0667 | 0.2111 | -1.0 | -1.0 | 0.0045 | 0.0814 |
| 2.8226 | 0.6154 | 8 | 4.7586 | 0.0039 | 0.005 | 0.005 | 0.0 | 0.004 | 0.0145 | 0.0302 | 0.0581 | 0.0814 | 0.0 | 0.0667 | 0.2111 | -1.0 | -1.0 | 0.0039 | 0.0814 |
| 1.6274 | 0.7692 | 10 | 4.3561 | 0.0092 | 0.0117 | 0.0117 | 0.0 | 0.0031 | 0.0395 | 0.0326 | 0.0744 | 0.1023 | 0.0 | 0.0667 | 0.3111 | -1.0 | -1.0 | 0.0092 | 0.1023 |
| 1.2935 | 0.9231 | 12 | 3.8875 | 0.0176 | 0.0213 | 0.0213 | 0.0 | 0.0034 | 0.0774 | 0.0349 | 0.1163 | 0.1651 | 0.0 | 0.0708 | 0.6 | -1.0 | -1.0 | 0.0176 | 0.1651 |
| 5.3947 | 1.0769 | 14 | 3.4772 | 0.0185 | 0.0214 | 0.0214 | 0.0 | 0.0036 | 0.0815 | 0.0372 | 0.1209 | 0.1698 | 0.0 | 0.075 | 0.6111 | -1.0 | -1.0 | 0.0185 | 0.1698 |
| 1.2301 | 1.2308 | 16 | 3.2641 | 0.0192 | 0.023 | 0.023 | 0.0 | 0.0035 | 0.084 | 0.0372 | 0.1209 | 0.1698 | 0.0 | 0.075 | 0.6111 | -1.0 | -1.0 | 0.0192 | 0.1698 |
| 2.3422 | 1.3846 | 18 | 3.1109 | 0.0215 | 0.0286 | 0.0252 | 0.0 | 0.0041 | 0.0934 | 0.0465 | 0.1233 | 0.1884 | 0.0 | 0.0792 | 0.6889 | -1.0 | -1.0 | 0.0215 | 0.1884 |
| 2.7393 | 1.5385 | 20 | 3.0153 | 0.0269 | 0.0348 | 0.0325 | 0.0 | 0.0082 | 0.1102 | 0.0558 | 0.1233 | 0.2349 | 0.0 | 0.1208 | 0.8 | -1.0 | -1.0 | 0.0269 | 0.2349 |
| 1.3444 | 1.6923 | 22 | 2.7122 | 0.0368 | 0.0455 | 0.0431 | 0.0 | 0.0091 | 0.1533 | 0.0581 | 0.1256 | 0.2674 | 0.0 | 0.1292 | 0.9333 | -1.0 | -1.0 | 0.0368 | 0.2674 |
| 1.9548 | 1.8462 | 24 | 2.3990 | 0.0364 | 0.0446 | 0.0419 | 0.0 | 0.0104 | 0.1508 | 0.0581 | 0.1581 | 0.2698 | 0.0 | 0.1292 | 0.9444 | -1.0 | -1.0 | 0.0364 | 0.2698 |
| 1.5859 | 2.0 | 26 | 1.8176 | 0.042 | 0.0572 | 0.0459 | 0.0 | 0.0307 | 0.149 | 0.0605 | 0.2465 | 0.3465 | 0.0 | 0.2625 | 0.9556 | -1.0 | -1.0 | 0.042 | 0.3465 |
| 1.3379 | 2.1538 | 28 | 1.7229 | 0.0492 | 0.068 | 0.0526 | 0.0 | 0.0483 | 0.1574 | 0.0605 | 0.2279 | 0.4116 | 0.0 | 0.3792 | 0.9556 | -1.0 | -1.0 | 0.0492 | 0.4116 |
| 2.0236 | 2.3077 | 30 | 1.5792 | 0.0549 | 0.0775 | 0.0535 | 0.0 | 0.0695 | 0.1612 | 0.0605 | 0.2419 | 0.4581 | 0.0 | 0.4583 | 0.9667 | -1.0 | -1.0 | 0.0549 | 0.4581 |
| 1.264 | 2.4615 | 32 | 1.4933 | 0.0599 | 0.088 | 0.0583 | 0.0035 | 0.0817 | 0.1601 | 0.0698 | 0.2558 | 0.507 | 0.02 | 0.5375 | 0.9667 | -1.0 | -1.0 | 0.0599 | 0.507 |
| 1.5731 | 2.6154 | 34 | 1.4458 | 0.0689 | 0.0984 | 0.0714 | 0.0226 | 0.0772 | 0.1897 | 0.0605 | 0.2791 | 0.5349 | 0.14 | 0.5417 | 0.9556 | -1.0 | -1.0 | 0.0689 | 0.5349 |
| 1.3227 | 2.7692 | 36 | 1.4095 | 0.0746 | 0.1005 | 0.0724 | 0.03 | 0.0891 | 0.2044 | 0.0628 | 0.2744 | 0.5605 | 0.16 | 0.5708 | 0.9778 | -1.0 | -1.0 | 0.0746 | 0.5605 |
| 1.2522 | 2.9231 | 38 | 1.3976 | 0.0757 | 0.106 | 0.0765 | 0.038 | 0.097 | 0.1949 | 0.0628 | 0.2767 | 0.5814 | 0.25 | 0.575 | 0.9667 | -1.0 | -1.0 | 0.0757 | 0.5814 |
| 2.3231 | 3.0769 | 40 | 1.3929 | 0.0817 | 0.1117 | 0.0807 | 0.0729 | 0.1033 | 0.2102 | 0.0628 | 0.3023 | 0.6047 | 0.25 | 0.6167 | 0.9667 | -1.0 | -1.0 | 0.0817 | 0.6047 |
| 1.3829 | 3.2308 | 42 | 1.3492 | 0.0882 | 0.1248 | 0.0801 | 0.0705 | 0.1411 | 0.2226 | 0.0651 | 0.2907 | 0.6326 | 0.3 | 0.6417 | 0.9778 | -1.0 | -1.0 | 0.0882 | 0.6326 |
| 1.125 | 3.3846 | 44 | 1.3130 | 0.0954 | 0.1293 | 0.0959 | 0.0407 | 0.1519 | 0.2401 | 0.0884 | 0.286 | 0.6488 | 0.34 | 0.65 | 0.9889 | -1.0 | -1.0 | 0.0954 | 0.6488 |
| 1.7321 | 3.5385 | 46 | 1.3028 | 0.0948 | 0.1339 | 0.0925 | 0.0656 | 0.1617 | 0.224 | 0.0884 | 0.2698 | 0.6698 | 0.38 | 0.675 | 0.9778 | -1.0 | -1.0 | 0.0948 | 0.6698 |
| 1.5267 | 3.6923 | 48 | 1.3073 | 0.0961 | 0.1404 | 0.091 | 0.0809 | 0.1634 | 0.2227 | 0.086 | 0.2721 | 0.6721 | 0.41 | 0.6667 | 0.9778 | -1.0 | -1.0 | 0.0961 | 0.6721 |
| 1.2051 | 3.8462 | 50 | 1.3263 | 0.0927 | 0.1375 | 0.0882 | 0.0719 | 0.1504 | 0.2218 | 0.086 | 0.2767 | 0.6581 | 0.45 | 0.6333 | 0.9556 | -1.0 | -1.0 | 0.0927 | 0.6581 |
| 1.7084 | 4.0 | 52 | 1.3281 | 0.0933 | 0.1406 | 0.0903 | 0.1042 | 0.1502 | 0.2192 | 0.0837 | 0.2884 | 0.6512 | 0.45 | 0.6208 | 0.9556 | -1.0 | -1.0 | 0.0933 | 0.6512 |
| 2.0452 | 4.1538 | 54 | 1.3303 | 0.0955 | 0.1445 | 0.0904 | 0.0902 | 0.1525 | 0.2316 | 0.0837 | 0.2907 | 0.6419 | 0.44 | 0.6083 | 0.9556 | -1.0 | -1.0 | 0.0955 | 0.6419 |
| 1.5371 | 4.3077 | 56 | 1.3210 | 0.0948 | 0.1447 | 0.0933 | 0.0891 | 0.1533 | 0.2168 | 0.0814 | 0.2837 | 0.6512 | 0.45 | 0.6292 | 0.9333 | -1.0 | -1.0 | 0.0948 | 0.6512 |
| 1.1854 | 4.4615 | 58 | 1.3097 | 0.097 | 0.1505 | 0.0949 | 0.0894 | 0.1662 | 0.2154 | 0.0814 | 0.3 | 0.6488 | 0.43 | 0.6458 | 0.9 | -1.0 | -1.0 | 0.097 | 0.6488 |
| 1.5528 | 4.6154 | 60 | 1.3089 | 0.0979 | 0.1464 | 0.1001 | 0.0558 | 0.1643 | 0.2151 | 0.0791 | 0.2907 | 0.6512 | 0.44 | 0.6458 | 0.9 | -1.0 | -1.0 | 0.0979 | 0.6512 |
| 1.1302 | 4.7692 | 62 | 1.3065 | 0.1026 | 0.1502 | 0.1049 | 0.0632 | 0.1738 | 0.2311 | 0.0814 | 0.2767 | 0.6605 | 0.44 | 0.6542 | 0.9222 | -1.0 | -1.0 | 0.1026 | 0.6605 |
| 1.5987 | 4.9231 | 64 | 1.3038 | 0.1009 | 0.1486 | 0.0992 | 0.0715 | 0.1761 | 0.2198 | 0.0814 | 0.2837 | 0.6581 | 0.46 | 0.6417 | 0.9222 | -1.0 | -1.0 | 0.1009 | 0.6581 |
| 1.5531 | 5.0769 | 66 | 1.3139 | 0.1025 | 0.15 | 0.1032 | 0.0592 | 0.1798 | 0.2253 | 0.0814 | 0.307 | 0.6442 | 0.44 | 0.6292 | 0.9111 | -1.0 | -1.0 | 0.1025 | 0.6442 |
| 1.0241 | 5.2308 | 68 | 1.3102 | 0.109 | 0.1583 | 0.1143 | 0.057 | 0.1785 | 0.2551 | 0.0791 | 0.3047 | 0.6419 | 0.45 | 0.625 | 0.9 | -1.0 | -1.0 | 0.109 | 0.6419 |
| 1.5513 | 5.3846 | 70 | 1.2913 | 0.1081 | 0.1632 | 0.1114 | 0.0694 | 0.1887 | 0.2358 | 0.0814 | 0.2907 | 0.6651 | 0.49 | 0.65 | 0.9 | -1.0 | -1.0 | 0.1081 | 0.6651 |
| 1.5939 | 5.5385 | 72 | 1.2757 | 0.1103 | 0.1643 | 0.1128 | 0.0682 | 0.1943 | 0.2361 | 0.0814 | 0.2953 | 0.6744 | 0.51 | 0.6542 | 0.9111 | -1.0 | -1.0 | 0.1103 | 0.6744 |
| 1.2404 | 5.6923 | 74 | 1.2517 | 0.1168 | 0.1719 | 0.118 | 0.0661 | 0.1847 | 0.2634 | 0.0814 | 0.3 | 0.6791 | 0.54 | 0.65 | 0.9111 | -1.0 | -1.0 | 0.1168 | 0.6791 |
| 1.5663 | 5.8462 | 76 | 1.2420 | 0.1204 | 0.178 | 0.121 | 0.0699 | 0.1945 | 0.2716 | 0.0814 | 0.3163 | 0.686 | 0.53 | 0.6667 | 0.9111 | -1.0 | -1.0 | 0.1204 | 0.686 |
| 1.6695 | 6.0 | 78 | 1.2407 | 0.1225 | 0.1825 | 0.1233 | 0.0698 | 0.1926 | 0.2767 | 0.0814 | 0.3093 | 0.6791 | 0.53 | 0.6542 | 0.9111 | -1.0 | -1.0 | 0.1225 | 0.6791 |
| 1.3143 | 6.1538 | 80 | 1.2348 | 0.1262 | 0.1869 | 0.1316 | 0.0933 | 0.1915 | 0.2904 | 0.0814 | 0.3 | 0.6698 | 0.53 | 0.6375 | 0.9111 | -1.0 | -1.0 | 0.1262 | 0.6698 |
| 1.9869 | 6.3077 | 82 | 1.2318 | 0.1289 | 0.1952 | 0.1346 | 0.0867 | 0.2023 | 0.291 | 0.0814 | 0.307 | 0.6744 | 0.53 | 0.6458 | 0.9111 | -1.0 | -1.0 | 0.1289 | 0.6744 |
| 1.0268 | 6.4615 | 84 | 1.2244 | 0.1294 | 0.1928 | 0.1401 | 0.0871 | 0.203 | 0.2885 | 0.0814 | 0.314 | 0.6744 | 0.53 | 0.6458 | 0.9111 | -1.0 | -1.0 | 0.1294 | 0.6744 |
| 1.1407 | 6.6154 | 86 | 1.2046 | 0.1331 | 0.1979 | 0.147 | 0.0998 | 0.2129 | 0.2928 | 0.0814 | 0.314 | 0.6884 | 0.55 | 0.6625 | 0.9111 | -1.0 | -1.0 | 0.1331 | 0.6884 |
| 1.5952 | 6.7692 | 88 | 1.1936 | 0.1334 | 0.1972 | 0.1492 | 0.0876 | 0.2099 | 0.2937 | 0.0814 | 0.314 | 0.6884 | 0.55 | 0.6583 | 0.9222 | -1.0 | -1.0 | 0.1334 | 0.6884 |
| 1.0276 | 6.9231 | 90 | 1.1914 | 0.1352 | 0.2002 | 0.1453 | 0.0987 | 0.2112 | 0.2927 | 0.0814 | 0.3163 | 0.6953 | 0.57 | 0.6625 | 0.9222 | -1.0 | -1.0 | 0.1352 | 0.6953 |
| 1.0125 | 7.0769 | 92 | 1.1849 | 0.1359 | 0.2069 | 0.1398 | 0.0968 | 0.2154 | 0.296 | 0.093 | 0.3163 | 0.6791 | 0.54 | 0.65 | 0.9111 | -1.0 | -1.0 | 0.1359 | 0.6791 |
| 1.3652 | 7.2308 | 94 | 1.1857 | 0.1388 | 0.2098 | 0.1433 | 0.0936 | 0.2162 | 0.303 | 0.093 | 0.3395 | 0.6791 | 0.53 | 0.6542 | 0.9111 | -1.0 | -1.0 | 0.1388 | 0.6791 |
| 0.887 | 7.3846 | 96 | 1.1771 | 0.1422 | 0.2142 | 0.1489 | 0.0994 | 0.2258 | 0.3086 | 0.1047 | 0.3419 | 0.6791 | 0.52 | 0.6542 | 0.9222 | -1.0 | -1.0 | 0.1422 | 0.6791 |
| 1.4706 | 7.5385 | 98 | 1.1680 | 0.1477 | 0.2196 | 0.1553 | 0.1118 | 0.2287 | 0.3252 | 0.1047 | 0.3279 | 0.6814 | 0.53 | 0.65 | 0.9333 | -1.0 | -1.0 | 0.1477 | 0.6814 |
| 1.6332 | 7.6923 | 100 | 1.1721 | 0.1467 | 0.2172 | 0.1519 | 0.1185 | 0.226 | 0.3213 | 0.1047 | 0.3279 | 0.6721 | 0.53 | 0.6375 | 0.9222 | -1.0 | -1.0 | 0.1467 | 0.6721 |
| 0.9161 | 7.8462 | 102 | 1.1842 | 0.1488 | 0.2204 | 0.1511 | 0.1153 | 0.2338 | 0.3223 | 0.1047 | 0.3488 | 0.6767 | 0.53 | 0.6458 | 0.9222 | -1.0 | -1.0 | 0.1488 | 0.6767 |
| 1.4722 | 8.0 | 104 | 1.1914 | 0.1514 | 0.2226 | 0.1661 | 0.0998 | 0.2422 | 0.3225 | 0.107 | 0.3442 | 0.6767 | 0.53 | 0.6417 | 0.9333 | -1.0 | -1.0 | 0.1514 | 0.6767 |
| 1.1459 | 8.1538 | 106 | 1.1916 | 0.1499 | 0.22 | 0.1656 | 0.1123 | 0.2316 | 0.3244 | 0.1023 | 0.3395 | 0.6721 | 0.53 | 0.6333 | 0.9333 | -1.0 | -1.0 | 0.1499 | 0.6721 |
| 0.918 | 8.3077 | 108 | 1.1974 | 0.153 | 0.2248 | 0.1668 | 0.1108 | 0.2392 | 0.3316 | 0.1023 | 0.3674 | 0.6651 | 0.51 | 0.6292 | 0.9333 | -1.0 | -1.0 | 0.153 | 0.6651 |
| 1.4877 | 8.4615 | 110 | 1.1883 | 0.1535 | 0.2279 | 0.1573 | 0.0866 | 0.2363 | 0.3458 | 0.1023 | 0.3581 | 0.6628 | 0.51 | 0.6208 | 0.9444 | -1.0 | -1.0 | 0.1535 | 0.6628 |
| 1.0962 | 8.6154 | 112 | 1.1676 | 0.1589 | 0.2405 | 0.1627 | 0.0984 | 0.219 | 0.3806 | 0.1023 | 0.3419 | 0.6651 | 0.51 | 0.6208 | 0.9556 | -1.0 | -1.0 | 0.1589 | 0.6651 |
| 0.8947 | 8.7692 | 114 | 1.1601 | 0.1593 | 0.2333 | 0.169 | 0.0952 | 0.221 | 0.3729 | 0.1023 | 0.3488 | 0.6651 | 0.51 | 0.625 | 0.9444 | -1.0 | -1.0 | 0.1593 | 0.6651 |
| 1.6613 | 8.9231 | 116 | 1.1451 | 0.1586 | 0.2362 | 0.1703 | 0.1229 | 0.2169 | 0.3712 | 0.1 | 0.3512 | 0.6581 | 0.51 | 0.6167 | 0.9333 | -1.0 | -1.0 | 0.1586 | 0.6581 |
| 1.0509 | 9.0769 | 118 | 1.1463 | 0.1605 | 0.2414 | 0.1666 | 0.1368 | 0.2126 | 0.3736 | 0.1 | 0.3744 | 0.6628 | 0.52 | 0.625 | 0.9222 | -1.0 | -1.0 | 0.1605 | 0.6628 |
| 1.5174 | 9.2308 | 120 | 1.1496 | 0.1618 | 0.2423 | 0.1674 | 0.1165 | 0.2188 | 0.3778 | 0.1 | 0.3767 | 0.6605 | 0.51 | 0.625 | 0.9222 | -1.0 | -1.0 | 0.1618 | 0.6605 |
| 1.0898 | 9.3846 | 122 | 1.1498 | 0.1647 | 0.2388 | 0.1712 | 0.1456 | 0.2168 | 0.3837 | 0.1 | 0.3605 | 0.6605 | 0.52 | 0.6208 | 0.9222 | -1.0 | -1.0 | 0.1647 | 0.6605 |
| 1.7163 | 9.5385 | 124 | 1.1541 | 0.163 | 0.2373 | 0.1717 | 0.1465 | 0.2126 | 0.3831 | 0.1 | 0.3605 | 0.6535 | 0.5 | 0.6167 | 0.9222 | -1.0 | -1.0 | 0.163 | 0.6535 |
| 1.5502 | 9.6923 | 126 | 1.1558 | 0.1637 | 0.2376 | 0.1753 | 0.1562 | 0.2127 | 0.3851 | 0.1 | 0.3628 | 0.6488 | 0.5 | 0.6083 | 0.9222 | -1.0 | -1.0 | 0.1637 | 0.6488 |
| 1.0774 | 9.8462 | 128 | 1.1586 | 0.1642 | 0.2374 | 0.1756 | 0.1553 | 0.2082 | 0.3823 | 0.1 | 0.3581 | 0.6488 | 0.51 | 0.6083 | 0.9111 | -1.0 | -1.0 | 0.1642 | 0.6488 |
| 1.4566 | 10.0 | 130 | 1.1572 | 0.1667 | 0.2388 | 0.1778 | 0.1729 | 0.2178 | 0.386 | 0.1 | 0.3744 | 0.6512 | 0.5 | 0.6167 | 0.9111 | -1.0 | -1.0 | 0.1667 | 0.6512 |
| 1.5876 | 10.1538 | 132 | 1.1468 | 0.1689 | 0.2406 | 0.1786 | 0.1341 | 0.2185 | 0.3807 | 0.1 | 0.3744 | 0.6558 | 0.51 | 0.6208 | 0.9111 | -1.0 | -1.0 | 0.1689 | 0.6558 |
| 1.0595 | 10.3077 | 134 | 1.1381 | 0.1656 | 0.2418 | 0.1662 | 0.1216 | 0.2378 | 0.348 | 0.1 | 0.3791 | 0.6651 | 0.5 | 0.6417 | 0.9111 | -1.0 | -1.0 | 0.1656 | 0.6651 |
| 1.4107 | 10.4615 | 136 | 1.1476 | 0.1584 | 0.2286 | 0.164 | 0.1147 | 0.2199 | 0.3475 | 0.1 | 0.3558 | 0.6535 | 0.5 | 0.6208 | 0.9111 | -1.0 | -1.0 | 0.1584 | 0.6535 |
| 1.1775 | 10.6154 | 138 | 1.1448 | 0.1636 | 0.2342 | 0.1729 | 0.1023 | 0.2216 | 0.3697 | 0.1 | 0.3349 | 0.6581 | 0.51 | 0.625 | 0.9111 | -1.0 | -1.0 | 0.1636 | 0.6581 |
| 1.0743 | 10.7692 | 140 | 1.1426 | 0.1649 | 0.2447 | 0.1744 | 0.1159 | 0.2254 | 0.3687 | 0.1 | 0.3302 | 0.6535 | 0.51 | 0.6167 | 0.9111 | -1.0 | -1.0 | 0.1649 | 0.6535 |
| 1.3376 | 10.9231 | 142 | 1.1414 | 0.1654 | 0.2413 | 0.1761 | 0.1015 | 0.2184 | 0.3761 | 0.0977 | 0.3419 | 0.6535 | 0.5 | 0.6167 | 0.9222 | -1.0 | -1.0 | 0.1654 | 0.6535 |
| 1.0473 | 11.0769 | 144 | 1.1339 | 0.1661 | 0.2514 | 0.1683 | 0.1204 | 0.212 | 0.3787 | 0.0977 | 0.3372 | 0.6558 | 0.51 | 0.6208 | 0.9111 | -1.0 | -1.0 | 0.1661 | 0.6558 |
| 1.7627 | 11.2308 | 146 | 1.1271 | 0.1835 | 0.2734 | 0.1848 | 0.1285 | 0.2194 | 0.4265 | 0.0977 | 0.3465 | 0.6605 | 0.52 | 0.625 | 0.9111 | -1.0 | -1.0 | 0.1835 | 0.6605 |
| 1.8799 | 11.3846 | 148 | 1.1196 | 0.1896 | 0.28 | 0.191 | 0.1246 | 0.2245 | 0.4347 | 0.0977 | 0.3465 | 0.6605 | 0.52 | 0.6208 | 0.9222 | -1.0 | -1.0 | 0.1896 | 0.6605 |
| 1.1134 | 11.5385 | 150 | 1.1133 | 0.1931 | 0.2824 | 0.1929 | 0.1142 | 0.2283 | 0.44 | 0.0884 | 0.3395 | 0.6674 | 0.52 | 0.6333 | 0.9222 | -1.0 | -1.0 | 0.1931 | 0.6674 |
| 0.9736 | 11.6923 | 152 | 1.1156 | 0.1933 | 0.2829 | 0.1925 | 0.1017 | 0.2258 | 0.4452 | 0.0884 | 0.3395 | 0.6744 | 0.52 | 0.6458 | 0.9222 | -1.0 | -1.0 | 0.1933 | 0.6744 |
| 1.5898 | 11.8462 | 154 | 1.1083 | 0.1976 | 0.287 | 0.1983 | 0.0979 | 0.2363 | 0.4541 | 0.0884 | 0.3419 | 0.6791 | 0.53 | 0.65 | 0.9222 | -1.0 | -1.0 | 0.1976 | 0.6791 |
| 1.4339 | 12.0 | 156 | 1.1030 | 0.1914 | 0.2746 | 0.1932 | 0.0919 | 0.235 | 0.4382 | 0.0884 | 0.3302 | 0.6791 | 0.53 | 0.65 | 0.9222 | -1.0 | -1.0 | 0.1914 | 0.6791 |
| 1.643 | 12.1538 | 158 | 1.1042 | 0.1911 | 0.2799 | 0.1952 | 0.102 | 0.2312 | 0.4372 | 0.0884 | 0.3349 | 0.6814 | 0.56 | 0.6375 | 0.9333 | -1.0 | -1.0 | 0.1911 | 0.6814 |
| 1.1006 | 12.3077 | 160 | 1.0971 | 0.1941 | 0.2756 | 0.2011 | 0.0989 | 0.2287 | 0.4471 | 0.0884 | 0.3605 | 0.6721 | 0.58 | 0.6208 | 0.9111 | -1.0 | -1.0 | 0.1941 | 0.6721 |
| 1.0992 | 12.4615 | 162 | 1.0928 | 0.1984 | 0.2813 | 0.2135 | 0.0964 | 0.2356 | 0.4492 | 0.0884 | 0.3744 | 0.6744 | 0.58 | 0.625 | 0.9111 | -1.0 | -1.0 | 0.1984 | 0.6744 |
| 1.2495 | 12.6154 | 164 | 1.0591 | 0.2067 | 0.3037 | 0.2163 | 0.1099 | 0.2495 | 0.454 | 0.0884 | 0.3628 | 0.6884 | 0.58 | 0.65 | 0.9111 | -1.0 | -1.0 | 0.2067 | 0.6884 |
| 0.7649 | 12.7692 | 166 | 1.0681 | 0.2041 | 0.3001 | 0.2217 | 0.1265 | 0.2318 | 0.4556 | 0.0884 | 0.3605 | 0.6814 | 0.58 | 0.6333 | 0.9222 | -1.0 | -1.0 | 0.2041 | 0.6814 |
| 0.8181 | 12.9231 | 168 | 1.0623 | 0.207 | 0.2937 | 0.222 | 0.1528 | 0.2209 | 0.4597 | 0.0907 | 0.3744 | 0.6744 | 0.56 | 0.625 | 0.9333 | -1.0 | -1.0 | 0.207 | 0.6744 |
| 1.0497 | 13.0769 | 170 | 1.0530 | 0.21 | 0.3092 | 0.2234 | 0.1496 | 0.2348 | 0.4629 | 0.0907 | 0.3744 | 0.6791 | 0.56 | 0.6333 | 0.9333 | -1.0 | -1.0 | 0.21 | 0.6791 |
| 0.9194 | 13.2308 | 172 | 1.0492 | 0.2135 | 0.3127 | 0.226 | 0.1405 | 0.2448 | 0.4688 | 0.0907 | 0.393 | 0.6814 | 0.56 | 0.6333 | 0.9444 | -1.0 | -1.0 | 0.2135 | 0.6814 |
| 1.0051 | 13.3846 | 174 | 1.0499 | 0.2172 | 0.312 | 0.2237 | 0.1396 | 0.2479 | 0.4848 | 0.093 | 0.393 | 0.686 | 0.56 | 0.6375 | 0.9556 | -1.0 | -1.0 | 0.2172 | 0.686 |
| 1.2519 | 13.5385 | 176 | 1.0486 | 0.2123 | 0.3146 | 0.2138 | 0.139 | 0.245 | 0.4662 | 0.0884 | 0.3814 | 0.6791 | 0.56 | 0.6375 | 0.9222 | -1.0 | -1.0 | 0.2123 | 0.6791 |
| 1.6973 | 13.6923 | 178 | 1.0484 | 0.2194 | 0.3199 | 0.2152 | 0.1492 | 0.2449 | 0.4877 | 0.0907 | 0.3674 | 0.6767 | 0.56 | 0.6292 | 0.9333 | -1.0 | -1.0 | 0.2194 | 0.6767 |
| 1.4149 | 13.8462 | 180 | 1.0446 | 0.2236 | 0.3297 | 0.2189 | 0.1898 | 0.2516 | 0.4951 | 0.0907 | 0.3907 | 0.6721 | 0.55 | 0.625 | 0.9333 | -1.0 | -1.0 | 0.2236 | 0.6721 |
| 0.8535 | 14.0 | 182 | 1.0413 | 0.225 | 0.3382 | 0.2149 | 0.1942 | 0.2475 | 0.505 | 0.0907 | 0.3884 | 0.6558 | 0.53 | 0.6042 | 0.9333 | -1.0 | -1.0 | 0.225 | 0.6558 |
| 1.1813 | 14.1538 | 184 | 1.0374 | 0.2256 | 0.3438 | 0.2182 | 0.1952 | 0.2416 | 0.5105 | 0.0907 | 0.3837 | 0.6558 | 0.53 | 0.6042 | 0.9333 | -1.0 | -1.0 | 0.2256 | 0.6558 |
| 1.4957 | 14.3077 | 186 | 1.0325 | 0.2281 | 0.3461 | 0.231 | 0.1983 | 0.249 | 0.5085 | 0.0907 | 0.386 | 0.6628 | 0.55 | 0.6083 | 0.9333 | -1.0 | -1.0 | 0.2281 | 0.6628 |
| 0.8212 | 14.4615 | 188 | 1.0285 | 0.2335 | 0.349 | 0.2424 | 0.1899 | 0.2582 | 0.5122 | 0.0907 | 0.4279 | 0.6674 | 0.55 | 0.6167 | 0.9333 | -1.0 | -1.0 | 0.2335 | 0.6674 |
| 1.2008 | 14.6154 | 190 | 1.0242 | 0.2377 | 0.3524 | 0.2452 | 0.1822 | 0.2744 | 0.5127 | 0.1023 | 0.4349 | 0.6698 | 0.52 | 0.6333 | 0.9333 | -1.0 | -1.0 | 0.2377 | 0.6698 |
| 0.7911 | 14.7692 | 192 | 1.0191 | 0.2398 | 0.3587 | 0.2462 | 0.178 | 0.2763 | 0.5147 | 0.1023 | 0.4349 | 0.6767 | 0.55 | 0.6333 | 0.9333 | -1.0 | -1.0 | 0.2398 | 0.6767 |
| 0.8986 | 14.9231 | 194 | 1.0145 | 0.2402 | 0.3574 | 0.2473 | 0.1808 | 0.2758 | 0.5151 | 0.1 | 0.414 | 0.6791 | 0.55 | 0.6375 | 0.9333 | -1.0 | -1.0 | 0.2402 | 0.6791 |
| 1.6429 | 15.0769 | 196 | 1.0122 | 0.243 | 0.3619 | 0.2503 | 0.1804 | 0.2824 | 0.5159 | 0.1023 | 0.4349 | 0.6767 | 0.54 | 0.6375 | 0.9333 | -1.0 | -1.0 | 0.243 | 0.6767 |
| 1.1183 | 15.2308 | 198 | 1.0073 | 0.2435 | 0.3626 | 0.2508 | 0.1808 | 0.2784 | 0.5075 | 0.1 | 0.4349 | 0.6767 | 0.54 | 0.6375 | 0.9333 | -1.0 | -1.0 | 0.2435 | 0.6767 |
| 0.8191 | 15.3846 | 200 | 1.0106 | 0.2482 | 0.3692 | 0.2547 | 0.1958 | 0.2749 | 0.5224 | 0.1 | 0.4349 | 0.6814 | 0.55 | 0.6417 | 0.9333 | -1.0 | -1.0 | 0.2482 | 0.6814 |
| 0.8314 | 15.5385 | 202 | 1.0094 | 0.2503 | 0.3744 | 0.2579 | 0.1776 | 0.2754 | 0.5216 | 0.1 | 0.4326 | 0.6791 | 0.54 | 0.6417 | 0.9333 | -1.0 | -1.0 | 0.2503 | 0.6791 |
| 0.9215 | 15.6923 | 204 | 1.0088 | 0.2482 | 0.3709 | 0.2579 | 0.1952 | 0.2717 | 0.5192 | 0.1 | 0.4326 | 0.6814 | 0.55 | 0.6417 | 0.9333 | -1.0 | -1.0 | 0.2482 | 0.6814 |
| 1.3008 | 15.8462 | 206 | 1.0080 | 0.2453 | 0.3668 | 0.2557 | 0.1989 | 0.2732 | 0.5121 | 0.1 | 0.4256 | 0.6767 | 0.54 | 0.6375 | 0.9333 | -1.0 | -1.0 | 0.2453 | 0.6767 |
| 1.1941 | 16.0 | 208 | 1.0116 | 0.2427 | 0.3636 | 0.2521 | 0.2044 | 0.2714 | 0.5014 | 0.1 | 0.4279 | 0.6791 | 0.54 | 0.6375 | 0.9444 | -1.0 | -1.0 | 0.2427 | 0.6791 |
| 1.1051 | 16.1538 | 210 | 1.0169 | 0.2529 | 0.3745 | 0.261 | 0.1969 | 0.2752 | 0.5312 | 0.1023 | 0.4163 | 0.6744 | 0.51 | 0.6417 | 0.9444 | -1.0 | -1.0 | 0.2529 | 0.6744 |
| 1.0767 | 16.3077 | 212 | 1.0172 | 0.2513 | 0.3769 | 0.2634 | 0.2053 | 0.2694 | 0.5322 | 0.1023 | 0.4233 | 0.6674 | 0.52 | 0.6333 | 0.9222 | -1.0 | -1.0 | 0.2513 | 0.6674 |
| 1.2547 | 16.4615 | 214 | 1.0219 | 0.2505 | 0.3589 | 0.2794 | 0.1976 | 0.2667 | 0.5263 | 0.1 | 0.407 | 0.6698 | 0.52 | 0.6417 | 0.9111 | -1.0 | -1.0 | 0.2505 | 0.6698 |
| 1.1085 | 16.6154 | 216 | 1.0196 | 0.2513 | 0.3586 | 0.2652 | 0.1836 | 0.2634 | 0.5389 | 0.1023 | 0.4093 | 0.6674 | 0.51 | 0.6375 | 0.9222 | -1.0 | -1.0 | 0.2513 | 0.6674 |
| 0.9551 | 16.7692 | 218 | 1.0195 | 0.2546 | 0.3629 | 0.283 | 0.1791 | 0.2731 | 0.5357 | 0.1 | 0.4233 | 0.6721 | 0.51 | 0.6458 | 0.9222 | -1.0 | -1.0 | 0.2546 | 0.6721 |
| 0.8029 | 16.9231 | 220 | 1.0150 | 0.257 | 0.3688 | 0.2856 | 0.1852 | 0.2768 | 0.5357 | 0.0907 | 0.4233 | 0.6744 | 0.52 | 0.6458 | 0.9222 | -1.0 | -1.0 | 0.257 | 0.6744 |
| 1.2183 | 17.0769 | 222 | 1.0136 | 0.2578 | 0.3661 | 0.2956 | 0.1829 | 0.2783 | 0.5349 | 0.0907 | 0.4233 | 0.6791 | 0.54 | 0.6458 | 0.9222 | -1.0 | -1.0 | 0.2578 | 0.6791 |
| 1.1483 | 17.2308 | 224 | 1.0128 | 0.2614 | 0.3712 | 0.2889 | 0.1558 | 0.2879 | 0.5347 | 0.0907 | 0.4256 | 0.6837 | 0.54 | 0.6542 | 0.9222 | -1.0 | -1.0 | 0.2614 | 0.6837 |
| 0.9909 | 17.3846 | 226 | 1.0095 | 0.2607 | 0.3955 | 0.2737 | 0.1783 | 0.2866 | 0.5341 | 0.093 | 0.407 | 0.6744 | 0.54 | 0.6417 | 0.9111 | -1.0 | -1.0 | 0.2607 | 0.6744 |
| 1.2678 | 17.5385 | 228 | 1.0047 | 0.2601 | 0.3961 | 0.2741 | 0.1802 | 0.286 | 0.534 | 0.093 | 0.4186 | 0.6698 | 0.53 | 0.6375 | 0.9111 | -1.0 | -1.0 | 0.2601 | 0.6698 |
| 0.6623 | 17.6923 | 230 | 0.9983 | 0.2618 | 0.3982 | 0.2727 | 0.1764 | 0.2868 | 0.5368 | 0.093 | 0.4047 | 0.6721 | 0.52 | 0.6417 | 0.9222 | -1.0 | -1.0 | 0.2618 | 0.6721 |
| 1.301 | 17.8462 | 232 | 0.9984 | 0.2655 | 0.4 | 0.2933 | 0.1769 | 0.3024 | 0.5353 | 0.093 | 0.4209 | 0.6767 | 0.52 | 0.65 | 0.9222 | -1.0 | -1.0 | 0.2655 | 0.6767 |
| 0.7999 | 18.0 | 234 | 0.9900 | 0.2669 | 0.4045 | 0.2835 | 0.181 | 0.2963 | 0.5434 | 0.093 | 0.4233 | 0.6767 | 0.53 | 0.6458 | 0.9222 | -1.0 | -1.0 | 0.2669 | 0.6767 |
| 1.1219 | 18.1538 | 236 | 0.9862 | 0.2671 | 0.4155 | 0.284 | 0.1789 | 0.2916 | 0.5419 | 0.093 | 0.414 | 0.6744 | 0.53 | 0.6458 | 0.9111 | -1.0 | -1.0 | 0.2671 | 0.6744 |
| 1.5338 | 18.3077 | 238 | 0.9822 | 0.2673 | 0.3983 | 0.2811 | 0.1764 | 0.2947 | 0.5399 | 0.0953 | 0.4186 | 0.6791 | 0.54 | 0.65 | 0.9111 | -1.0 | -1.0 | 0.2673 | 0.6791 |
| 1.3839 | 18.4615 | 240 | 0.9843 | 0.2676 | 0.3735 | 0.2865 | 0.1784 | 0.2856 | 0.5496 | 0.093 | 0.4093 | 0.6837 | 0.55 | 0.6458 | 0.9333 | -1.0 | -1.0 | 0.2676 | 0.6837 |
| 1.1335 | 18.6154 | 242 | 0.9872 | 0.2698 | 0.3762 | 0.2869 | 0.1772 | 0.2883 | 0.5527 | 0.093 | 0.407 | 0.686 | 0.55 | 0.65 | 0.9333 | -1.0 | -1.0 | 0.2698 | 0.686 |
| 1.1936 | 18.7692 | 244 | 0.9883 | 0.2694 | 0.376 | 0.2926 | 0.1791 | 0.2893 | 0.5475 | 0.093 | 0.3977 | 0.686 | 0.54 | 0.6542 | 0.9333 | -1.0 | -1.0 | 0.2694 | 0.686 |
| 0.734 | 18.9231 | 246 | 0.9846 | 0.2719 | 0.3899 | 0.2883 | 0.1918 | 0.2893 | 0.5541 | 0.093 | 0.4023 | 0.6837 | 0.56 | 0.6417 | 0.9333 | -1.0 | -1.0 | 0.2719 | 0.6837 |
| 1.4423 | 19.0769 | 248 | 0.9893 | 0.2712 | 0.3917 | 0.2812 | 0.1868 | 0.2891 | 0.5574 | 0.093 | 0.4116 | 0.6721 | 0.53 | 0.6375 | 0.9222 | -1.0 | -1.0 | 0.2712 | 0.6721 |
| 0.6381 | 19.2308 | 250 | 0.9891 | 0.2705 | 0.3876 | 0.2847 | 0.1889 | 0.2871 | 0.5544 | 0.093 | 0.414 | 0.6767 | 0.54 | 0.6375 | 0.9333 | -1.0 | -1.0 | 0.2705 | 0.6767 |
| 1.0157 | 19.3846 | 252 | 0.9894 | 0.274 | 0.3919 | 0.2915 | 0.192 | 0.2963 | 0.5548 | 0.093 | 0.4093 | 0.6814 | 0.56 | 0.6375 | 0.9333 | -1.0 | -1.0 | 0.274 | 0.6814 |
| 1.1378 | 19.5385 | 254 | 0.9888 | 0.2807 | 0.401 | 0.2983 | 0.1917 | 0.2943 | 0.5615 | 0.093 | 0.4209 | 0.6791 | 0.56 | 0.6375 | 0.9222 | -1.0 | -1.0 | 0.2807 | 0.6791 |
| 1.3551 | 19.6923 | 256 | 0.9837 | 0.2797 | 0.4007 | 0.296 | 0.2084 | 0.2921 | 0.5596 | 0.093 | 0.4279 | 0.6791 | 0.55 | 0.6375 | 0.9333 | -1.0 | -1.0 | 0.2797 | 0.6791 |
| 0.841 | 19.8462 | 258 | 0.9790 | 0.2807 | 0.4024 | 0.2967 | 0.2371 | 0.2996 | 0.5636 | 0.093 | 0.4279 | 0.6791 | 0.56 | 0.6375 | 0.9222 | -1.0 | -1.0 | 0.2807 | 0.6791 |
| 0.6378 | 20.0 | 260 | 0.9614 | 0.2938 | 0.4211 | 0.2976 | 0.2408 | 0.3153 | 0.5837 | 0.093 | 0.4326 | 0.6977 | 0.57 | 0.6625 | 0.9333 | -1.0 | -1.0 | 0.2938 | 0.6977 |
| 1.1679 | 20.1538 | 262 | 0.9663 | 0.2979 | 0.4276 | 0.3132 | 0.2364 | 0.3307 | 0.5804 | 0.1047 | 0.4326 | 0.6977 | 0.56 | 0.6667 | 0.9333 | -1.0 | -1.0 | 0.2979 | 0.6977 |
| 0.8697 | 20.3077 | 264 | 0.9692 | 0.298 | 0.4271 | 0.3073 | 0.2371 | 0.3254 | 0.5805 | 0.093 | 0.4442 | 0.6953 | 0.54 | 0.6708 | 0.9333 | -1.0 | -1.0 | 0.298 | 0.6953 |
| 1.0891 | 20.4615 | 266 | 0.9579 | 0.2991 | 0.4254 | 0.3105 | 0.2426 | 0.3244 | 0.5837 | 0.1047 | 0.4465 | 0.7047 | 0.58 | 0.6708 | 0.9333 | -1.0 | -1.0 | 0.2991 | 0.7047 |
| 0.8855 | 20.6154 | 268 | 0.9604 | 0.3053 | 0.4551 | 0.3142 | 0.2474 | 0.3401 | 0.5799 | 0.1047 | 0.4558 | 0.707 | 0.57 | 0.6833 | 0.9222 | -1.0 | -1.0 | 0.3053 | 0.707 |
| 1.6891 | 20.7692 | 270 | 0.9600 | 0.2994 | 0.4298 | 0.3179 | 0.182 | 0.3336 | 0.5803 | 0.1047 | 0.4419 | 0.6977 | 0.56 | 0.6708 | 0.9222 | -1.0 | -1.0 | 0.2994 | 0.6977 |
| 0.8072 | 20.9231 | 272 | 0.9565 | 0.3013 | 0.4318 | 0.3219 | 0.2333 | 0.331 | 0.5801 | 0.1047 | 0.4326 | 0.6977 | 0.56 | 0.6708 | 0.9222 | -1.0 | -1.0 | 0.3013 | 0.6977 |
| 0.9642 | 21.0769 | 274 | 0.9541 | 0.2973 | 0.4213 | 0.3188 | 0.2288 | 0.326 | 0.5681 | 0.1047 | 0.4302 | 0.6977 | 0.56 | 0.6708 | 0.9222 | -1.0 | -1.0 | 0.2973 | 0.6977 |
| 1.609 | 21.2308 | 276 | 0.9484 | 0.2975 | 0.4151 | 0.3224 | 0.2384 | 0.3212 | 0.5714 | 0.1047 | 0.4349 | 0.7047 | 0.58 | 0.6708 | 0.9333 | -1.0 | -1.0 | 0.2975 | 0.7047 |
| 0.7868 | 21.3846 | 278 | 0.9485 | 0.2949 | 0.4397 | 0.3157 | 0.2488 | 0.3211 | 0.5681 | 0.1023 | 0.4326 | 0.7 | 0.57 | 0.6708 | 0.9222 | -1.0 | -1.0 | 0.2949 | 0.7 |
| 1.1217 | 21.5385 | 280 | 0.9331 | 0.292 | 0.4385 | 0.3223 | 0.2503 | 0.3206 | 0.5545 | 0.1 | 0.4279 | 0.7 | 0.58 | 0.6708 | 0.9111 | -1.0 | -1.0 | 0.292 | 0.7 |
| 1.3825 | 21.6923 | 282 | 0.9428 | 0.2897 | 0.4411 | 0.3166 | 0.2076 | 0.3195 | 0.5548 | 0.1 | 0.4279 | 0.6953 | 0.56 | 0.6708 | 0.9111 | -1.0 | -1.0 | 0.2897 | 0.6953 |
| 1.0171 | 21.8462 | 284 | 0.9461 | 0.2935 | 0.4392 | 0.3166 | 0.2328 | 0.3223 | 0.5683 | 0.1023 | 0.4326 | 0.7 | 0.56 | 0.675 | 0.9222 | -1.0 | -1.0 | 0.2935 | 0.7 |
| 0.6682 | 22.0 | 286 | 0.9417 | 0.294 | 0.4489 | 0.3166 | 0.2368 | 0.3196 | 0.5681 | 0.1 | 0.4302 | 0.6977 | 0.55 | 0.675 | 0.9222 | -1.0 | -1.0 | 0.294 | 0.6977 |
| 0.9395 | 22.1538 | 288 | 0.9413 | 0.2967 | 0.4501 | 0.3377 | 0.2361 | 0.3202 | 0.5683 | 0.1 | 0.4326 | 0.6907 | 0.54 | 0.6667 | 0.9222 | -1.0 | -1.0 | 0.2967 | 0.6907 |
| 1.0958 | 22.3077 | 290 | 0.9376 | 0.2941 | 0.454 | 0.3152 | 0.2301 | 0.3224 | 0.5546 | 0.1 | 0.4279 | 0.6814 | 0.52 | 0.6625 | 0.9111 | -1.0 | -1.0 | 0.2941 | 0.6814 |
| 1.5821 | 22.4615 | 292 | 0.9374 | 0.2982 | 0.4481 | 0.3216 | 0.2297 | 0.3258 | 0.6006 | 0.1023 | 0.4302 | 0.6884 | 0.54 | 0.6625 | 0.9222 | -1.0 | -1.0 | 0.2982 | 0.6884 |
| 0.5959 | 22.6154 | 294 | 0.9360 | 0.2958 | 0.4499 | 0.3192 | 0.228 | 0.33 | 0.5808 | 0.1 | 0.4279 | 0.6837 | 0.53 | 0.6625 | 0.9111 | -1.0 | -1.0 | 0.2958 | 0.6837 |
| 1.0976 | 22.7692 | 296 | 0.9397 | 0.2984 | 0.452 | 0.3207 | 0.2268 | 0.3389 | 0.5839 | 0.1 | 0.4302 | 0.686 | 0.53 | 0.6667 | 0.9111 | -1.0 | -1.0 | 0.2984 | 0.686 |
| 0.7609 | 22.9231 | 298 | 0.9420 | 0.2953 | 0.4274 | 0.3184 | 0.2306 | 0.3317 | 0.584 | 0.1 | 0.4279 | 0.686 | 0.53 | 0.6625 | 0.9222 | -1.0 | -1.0 | 0.2953 | 0.686 |
| 1.5303 | 23.0769 | 300 | 0.9415 | 0.2945 | 0.4274 | 0.3228 | 0.233 | 0.3283 | 0.582 | 0.1 | 0.4256 | 0.686 | 0.54 | 0.6625 | 0.9111 | -1.0 | -1.0 | 0.2945 | 0.686 |
| 1.2072 | 23.2308 | 302 | 0.9459 | 0.2926 | 0.4256 | 0.3159 | 0.2314 | 0.3276 | 0.5784 | 0.1 | 0.4256 | 0.6814 | 0.53 | 0.6625 | 0.9 | -1.0 | -1.0 | 0.2926 | 0.6814 |
| 0.8314 | 23.3846 | 304 | 0.9436 | 0.2967 | 0.4301 | 0.3161 | 0.2299 | 0.3307 | 0.5788 | 0.1 | 0.4279 | 0.6884 | 0.54 | 0.6708 | 0.9 | -1.0 | -1.0 | 0.2967 | 0.6884 |
| 0.7173 | 23.5385 | 306 | 0.9545 | 0.3012 | 0.4377 | 0.3203 | 0.2303 | 0.3301 | 0.5944 | 0.1 | 0.4279 | 0.6814 | 0.52 | 0.6667 | 0.9 | -1.0 | -1.0 | 0.3012 | 0.6814 |
| 0.9787 | 23.6923 | 308 | 0.9541 | 0.3012 | 0.4348 | 0.3314 | 0.2217 | 0.3264 | 0.5945 | 0.1 | 0.4209 | 0.686 | 0.54 | 0.6625 | 0.9111 | -1.0 | -1.0 | 0.3012 | 0.686 |
| 1.6918 | 23.8462 | 310 | 0.9516 | 0.3025 | 0.439 | 0.3269 | 0.2215 | 0.3299 | 0.5973 | 0.1 | 0.4256 | 0.686 | 0.53 | 0.6708 | 0.9 | -1.0 | -1.0 | 0.3025 | 0.686 |
| 1.1985 | 24.0 | 312 | 0.9448 | 0.3034 | 0.4394 | 0.322 | 0.222 | 0.3381 | 0.5973 | 0.1 | 0.4279 | 0.6884 | 0.53 | 0.675 | 0.9 | -1.0 | -1.0 | 0.3034 | 0.6884 |
| 1.2056 | 24.1538 | 314 | 0.9487 | 0.3062 | 0.465 | 0.3207 | 0.2237 | 0.3427 | 0.5973 | 0.1 | 0.4302 | 0.693 | 0.53 | 0.6833 | 0.9 | -1.0 | -1.0 | 0.3062 | 0.693 |
| 0.8028 | 24.3077 | 316 | 0.9443 | 0.3026 | 0.4594 | 0.3159 | 0.2212 | 0.3371 | 0.5973 | 0.1 | 0.4302 | 0.6907 | 0.53 | 0.6792 | 0.9 | -1.0 | -1.0 | 0.3026 | 0.6907 |
| 1.6389 | 24.4615 | 318 | 0.9523 | 0.2983 | 0.4562 | 0.3111 | 0.2193 | 0.3288 | 0.5938 | 0.1 | 0.4279 | 0.6814 | 0.51 | 0.6708 | 0.9 | -1.0 | -1.0 | 0.2983 | 0.6814 |
| 0.7546 | 24.6154 | 320 | 0.9503 | 0.2964 | 0.4514 | 0.3144 | 0.2226 | 0.3302 | 0.5935 | 0.1 | 0.4442 | 0.6884 | 0.53 | 0.675 | 0.9 | -1.0 | -1.0 | 0.2964 | 0.6884 |
| 0.7313 | 24.7692 | 322 | 0.9438 | 0.2951 | 0.4496 | 0.3135 | 0.2207 | 0.3324 | 0.5907 | 0.1 | 0.4442 | 0.6884 | 0.53 | 0.675 | 0.9 | -1.0 | -1.0 | 0.2951 | 0.6884 |
| 0.7326 | 24.9231 | 324 | 0.9374 | 0.2994 | 0.4522 | 0.3265 | 0.2217 | 0.3324 | 0.597 | 0.1 | 0.4465 | 0.6953 | 0.55 | 0.6792 | 0.9 | -1.0 | -1.0 | 0.2994 | 0.6953 |
| 0.958 | 25.0769 | 326 | 0.9373 | 0.3004 | 0.455 | 0.3254 | 0.2194 | 0.3328 | 0.597 | 0.1 | 0.4465 | 0.693 | 0.55 | 0.675 | 0.9 | -1.0 | -1.0 | 0.3004 | 0.693 |
| 1.5074 | 25.2308 | 328 | 0.9401 | 0.303 | 0.4544 | 0.3395 | 0.2169 | 0.3399 | 0.597 | 0.1 | 0.4488 | 0.693 | 0.53 | 0.6833 | 0.9 | -1.0 | -1.0 | 0.303 | 0.693 |
| 1.2598 | 25.3846 | 330 | 0.9421 | 0.3034 | 0.4619 | 0.3203 | 0.2175 | 0.3362 | 0.597 | 0.1 | 0.4465 | 0.6884 | 0.53 | 0.675 | 0.9 | -1.0 | -1.0 | 0.3034 | 0.6884 |
| 1.1887 | 25.5385 | 332 | 0.9366 | 0.3107 | 0.4627 | 0.3424 | 0.2174 | 0.3621 | 0.597 | 0.1023 | 0.4512 | 0.6953 | 0.53 | 0.6875 | 0.9 | -1.0 | -1.0 | 0.3107 | 0.6953 |
| 1.1788 | 25.6923 | 334 | 0.9356 | 0.3105 | 0.4635 | 0.3482 | 0.2208 | 0.3553 | 0.5972 | 0.1023 | 0.4488 | 0.6953 | 0.54 | 0.6833 | 0.9 | -1.0 | -1.0 | 0.3105 | 0.6953 |
| 0.8729 | 25.8462 | 336 | 0.9363 | 0.3133 | 0.4678 | 0.3507 | 0.221 | 0.3562 | 0.6016 | 0.1023 | 0.4442 | 0.6953 | 0.54 | 0.6833 | 0.9 | -1.0 | -1.0 | 0.3133 | 0.6953 |
| 1.0313 | 26.0 | 338 | 0.9426 | 0.3208 | 0.4752 | 0.3361 | 0.2235 | 0.3531 | 0.6233 | 0.1047 | 0.4419 | 0.6953 | 0.55 | 0.675 | 0.9111 | -1.0 | -1.0 | 0.3208 | 0.6953 |
| 1.8465 | 26.1538 | 340 | 0.9431 | 0.3215 | 0.4781 | 0.333 | 0.2219 | 0.3547 | 0.6232 | 0.1047 | 0.4419 | 0.6953 | 0.54 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3215 | 0.6953 |
| 1.3775 | 26.3077 | 342 | 0.9430 | 0.3231 | 0.4786 | 0.3331 | 0.2227 | 0.3607 | 0.6274 | 0.1047 | 0.4442 | 0.6977 | 0.54 | 0.6792 | 0.9222 | -1.0 | -1.0 | 0.3231 | 0.6977 |
| 0.9272 | 26.4615 | 344 | 0.9443 | 0.3232 | 0.4789 | 0.334 | 0.224 | 0.3552 | 0.6276 | 0.1047 | 0.4442 | 0.6977 | 0.54 | 0.6792 | 0.9222 | -1.0 | -1.0 | 0.3232 | 0.6977 |
| 0.7306 | 26.6154 | 346 | 0.9425 | 0.3224 | 0.479 | 0.3331 | 0.2212 | 0.3591 | 0.6276 | 0.1047 | 0.4488 | 0.6977 | 0.54 | 0.6792 | 0.9222 | -1.0 | -1.0 | 0.3224 | 0.6977 |
| 1.1656 | 26.7692 | 348 | 0.9401 | 0.3208 | 0.4753 | 0.3313 | 0.2219 | 0.3495 | 0.6274 | 0.1047 | 0.4488 | 0.7 | 0.54 | 0.6833 | 0.9222 | -1.0 | -1.0 | 0.3208 | 0.7 |
| 1.4892 | 26.9231 | 350 | 0.9390 | 0.3218 | 0.4741 | 0.3327 | 0.2236 | 0.3534 | 0.6276 | 0.1047 | 0.4512 | 0.7 | 0.54 | 0.6833 | 0.9222 | -1.0 | -1.0 | 0.3218 | 0.7 |
| 0.6666 | 27.0769 | 352 | 0.9386 | 0.3231 | 0.4779 | 0.3354 | 0.2281 | 0.3536 | 0.6276 | 0.1047 | 0.4512 | 0.6953 | 0.53 | 0.6792 | 0.9222 | -1.0 | -1.0 | 0.3231 | 0.6953 |
| 0.7846 | 27.2308 | 354 | 0.9369 | 0.3232 | 0.479 | 0.338 | 0.2206 | 0.3534 | 0.6277 | 0.1047 | 0.4488 | 0.6977 | 0.54 | 0.6792 | 0.9222 | -1.0 | -1.0 | 0.3232 | 0.6977 |
| 0.9385 | 27.3846 | 356 | 0.9374 | 0.3241 | 0.4783 | 0.3371 | 0.2204 | 0.3507 | 0.6325 | 0.1047 | 0.4488 | 0.6977 | 0.53 | 0.6875 | 0.9111 | -1.0 | -1.0 | 0.3241 | 0.6977 |
| 1.0147 | 27.5385 | 358 | 0.9384 | 0.3273 | 0.4829 | 0.3397 | 0.228 | 0.3589 | 0.6325 | 0.1047 | 0.4512 | 0.6953 | 0.53 | 0.6833 | 0.9111 | -1.0 | -1.0 | 0.3273 | 0.6953 |
| 1.3636 | 27.6923 | 360 | 0.9397 | 0.3269 | 0.4831 | 0.3375 | 0.2283 | 0.3552 | 0.6234 | 0.1047 | 0.4512 | 0.6953 | 0.54 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3269 | 0.6953 |
| 1.8067 | 27.8462 | 362 | 0.9365 | 0.3309 | 0.4873 | 0.3414 | 0.2309 | 0.3591 | 0.6366 | 0.1047 | 0.4535 | 0.6977 | 0.54 | 0.6792 | 0.9222 | -1.0 | -1.0 | 0.3309 | 0.6977 |
| 0.9331 | 28.0 | 364 | 0.9340 | 0.3293 | 0.4862 | 0.342 | 0.2166 | 0.3563 | 0.6368 | 0.1047 | 0.4512 | 0.6953 | 0.53 | 0.6792 | 0.9222 | -1.0 | -1.0 | 0.3293 | 0.6953 |
| 0.7488 | 28.1538 | 366 | 0.9345 | 0.3284 | 0.4837 | 0.3391 | 0.2318 | 0.3506 | 0.6368 | 0.1047 | 0.4535 | 0.6977 | 0.54 | 0.6792 | 0.9222 | -1.0 | -1.0 | 0.3284 | 0.6977 |
| 1.1657 | 28.3077 | 368 | 0.9346 | 0.3332 | 0.4887 | 0.3679 | 0.2179 | 0.3614 | 0.6367 | 0.1047 | 0.4535 | 0.7023 | 0.55 | 0.6833 | 0.9222 | -1.0 | -1.0 | 0.3332 | 0.7023 |
| 0.7517 | 28.4615 | 370 | 0.9342 | 0.3301 | 0.4873 | 0.3404 | 0.2339 | 0.3511 | 0.6363 | 0.1047 | 0.4535 | 0.7023 | 0.56 | 0.6792 | 0.9222 | -1.0 | -1.0 | 0.3301 | 0.7023 |
| 0.6962 | 28.6154 | 372 | 0.9321 | 0.326 | 0.4863 | 0.341 | 0.2196 | 0.3508 | 0.623 | 0.1023 | 0.4488 | 0.6977 | 0.55 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.326 | 0.6977 |
| 0.6036 | 28.7692 | 374 | 0.9318 | 0.3268 | 0.489 | 0.3422 | 0.2143 | 0.3561 | 0.6225 | 0.1023 | 0.4442 | 0.693 | 0.53 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3268 | 0.693 |
| 0.7852 | 28.9231 | 376 | 0.9345 | 0.3294 | 0.4913 | 0.3436 | 0.2323 | 0.3593 | 0.6225 | 0.1023 | 0.4512 | 0.6953 | 0.54 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3294 | 0.6953 |
| 0.9468 | 29.0769 | 378 | 0.9319 | 0.3296 | 0.491 | 0.3433 | 0.2277 | 0.3599 | 0.6226 | 0.1023 | 0.4512 | 0.6977 | 0.55 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3296 | 0.6977 |
| 0.7504 | 29.2308 | 380 | 0.9322 | 0.3283 | 0.4884 | 0.3422 | 0.2297 | 0.3577 | 0.6226 | 0.1023 | 0.4512 | 0.6977 | 0.55 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3283 | 0.6977 |
| 0.7207 | 29.3846 | 382 | 0.9312 | 0.3288 | 0.4928 | 0.343 | 0.2169 | 0.3607 | 0.6224 | 0.1023 | 0.4488 | 0.6977 | 0.55 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3288 | 0.6977 |
| 1.4957 | 29.5385 | 384 | 0.9322 | 0.3295 | 0.4896 | 0.3446 | 0.2292 | 0.3643 | 0.6224 | 0.1023 | 0.4512 | 0.6977 | 0.55 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3295 | 0.6977 |
| 0.8427 | 29.6923 | 386 | 0.9318 | 0.3288 | 0.4915 | 0.3451 | 0.2153 | 0.3652 | 0.6223 | 0.1023 | 0.4488 | 0.693 | 0.53 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3288 | 0.693 |
| 0.9049 | 29.8462 | 388 | 0.9281 | 0.3301 | 0.4916 | 0.3449 | 0.2303 | 0.3661 | 0.6223 | 0.1023 | 0.4512 | 0.6977 | 0.55 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3301 | 0.6977 |
| 0.5599 | 30.0 | 390 | 0.9290 | 0.3296 | 0.4931 | 0.3456 | 0.2295 | 0.3651 | 0.6228 | 0.1023 | 0.4488 | 0.6953 | 0.54 | 0.6792 | 0.9111 | -1.0 | -1.0 | 0.3296 | 0.6953 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.0
- Datasets 3.1.0
- Tokenizers 0.20.0
| [
"object",
"balloon"
] |
initial01/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
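As a rough guide (not the original training script), the settings above map onto a `TrainingArguments` configuration roughly as follows; the `output_dir`, dataset, collator, and model objects are placeholders, and `fp16=True` stands in for the "Native AMP" line:
```python
from transformers import TrainingArguments
# Sketch of the hyperparameters listed above; data/model wiring is omitted.
training_args = TrainingArguments(
output_dir="detr-resnet-50_finetuned_cppe5", # placeholder
learning_rate=1e-5,
per_device_train_batch_size=8,
per_device_eval_batch_size=8,
seed=42,
optim="adamw_torch",
adam_beta1=0.9,
adam_beta2=0.999,
adam_epsilon=1e-8,
lr_scheduler_type="linear",
num_train_epochs=10,
fp16=True, # "Native AMP" mixed precision; needs a CUDA runtime
)
# Trainer(model=model, args=training_args, train_dataset=..., data_collator=...).train()
```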
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
MalyO2/detr_finetune_simplest |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"object",
"balloon"
] |
MalyO2/detr_finetune_lr_scheduler |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"object",
"balloon"
] |
MalyO2/detr_finetune_lr_scheduler_v3 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"object",
"balloon"
] |
MalyO2/detr_finetune_lr_scheduler_v4 |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"object",
"balloon"
] |
pabloOmega/ocr-entities-detection |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10"
] |
MalyO2/working |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# facebook/detr-resnet-50-dc5
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5836
- Map: 0.5257
- Map 50: 0.6508
- Map 75: 0.6241
- Map Small: 0.0
- Map Medium: 0.4752
- Map Large: 0.7513
- Mar 1: 0.1853
- Mar 10: 0.6
- Mar 100: 0.7147
- Mar Small: 0.0
- Mar Medium: 0.6684
- Mar Large: 0.8923
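The card does not include a usage snippet; below is a minimal inference sketch, assuming the checkpoint is published under the repo id `MalyO2/working` shown above and follows the standard 🤗 Transformers object-detection API (the image path and the 0.5 threshold are arbitrary):
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "MalyO2/working" # assumption: the repo id of this card
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("example.jpg") # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)

# Convert raw logits/boxes into thresholded (label, score, box) detections.
target_sizes = torch.tensor([image.size[::-1]]) # (height, width)
detections = processor.post_process_object_detection(
outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```
Given the label list at the end of this card, `id2label` should resolve to `object` and `balloon`.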
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 400
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|
| 4.1002 | 0.7692 | 10 | 4.1741 | 0.0003 | 0.001 | 0.0003 | 0.0 | 0.0062 | 0.0002 | 0.0 | 0.0 | 0.0441 | 0.0 | 0.0474 | 0.0462 |
| 1.772 | 1.5385 | 20 | 1.4577 | 0.0298 | 0.05 | 0.0286 | 0.0 | 0.0185 | 0.0656 | 0.0294 | 0.1206 | 0.4882 | 0.0 | 0.3421 | 0.7769 |
| 1.5665 | 2.3077 | 30 | 1.3869 | 0.0339 | 0.0549 | 0.0351 | 0.0 | 0.0407 | 0.0516 | 0.0029 | 0.0824 | 0.6059 | 0.0 | 0.5158 | 0.8308 |
| 2.0258 | 3.0769 | 40 | 1.2246 | 0.0561 | 0.0797 | 0.0593 | 0.0 | 0.0398 | 0.1166 | 0.0265 | 0.1206 | 0.6441 | 0.0 | 0.5789 | 0.8385 |
| 1.5082 | 3.8462 | 50 | 1.1988 | 0.0477 | 0.0869 | 0.0542 | 0.0 | 0.0927 | 0.063 | 0.0235 | 0.0853 | 0.6471 | 0.0 | 0.6316 | 0.7692 |
| 1.3716 | 4.6154 | 60 | 1.1917 | 0.0549 | 0.1014 | 0.0602 | 0.0 | 0.0902 | 0.0761 | 0.0588 | 0.1618 | 0.5971 | 0.0 | 0.5421 | 0.7692 |
| 1.2398 | 5.3846 | 70 | 1.0554 | 0.1329 | 0.1674 | 0.1485 | 0.0 | 0.1462 | 0.1957 | 0.0765 | 0.1882 | 0.7294 | 0.0 | 0.7474 | 0.8154 |
| 1.401 | 6.1538 | 80 | 0.9179 | 0.1176 | 0.1821 | 0.1315 | 0.0 | 0.0835 | 0.2295 | 0.0529 | 0.1794 | 0.7294 | 0.0 | 0.7211 | 0.8538 |
| 2.0328 | 6.9231 | 90 | 0.9198 | 0.1361 | 0.2109 | 0.1554 | 0.0 | 0.0937 | 0.2424 | 0.0559 | 0.2088 | 0.6882 | 0.0 | 0.6368 | 0.8692 |
| 1.6358 | 7.6923 | 100 | 0.9298 | 0.2252 | 0.2898 | 0.2523 | 0.0 | 0.2279 | 0.3487 | 0.1059 | 0.3176 | 0.6882 | 0.0 | 0.6263 | 0.8846 |
| 0.8849 | 8.4615 | 110 | 0.8894 | 0.1893 | 0.2435 | 0.2248 | 0.0 | 0.1438 | 0.3337 | 0.0971 | 0.2265 | 0.7265 | 0.0 | 0.7263 | 0.8385 |
| 1.1906 | 9.2308 | 120 | 0.8505 | 0.2105 | 0.2704 | 0.2598 | 0.0 | 0.1879 | 0.3317 | 0.1324 | 0.2706 | 0.6853 | 0.0 | 0.6474 | 0.8462 |
| 1.0404 | 10.0 | 130 | 0.7320 | 0.2508 | 0.2998 | 0.29 | 0.0 | 0.2031 | 0.4149 | 0.1588 | 0.2971 | 0.7471 | 0.0 | 0.7421 | 0.8692 |
| 1.1534 | 10.7692 | 140 | 0.7996 | 0.2832 | 0.374 | 0.3479 | 0.0 | 0.2502 | 0.411 | 0.1676 | 0.3647 | 0.6647 | 0.0 | 0.6263 | 0.8231 |
| 1.1725 | 11.5385 | 150 | 0.7990 | 0.3115 | 0.4464 | 0.3745 | 0.0 | 0.2972 | 0.4147 | 0.1294 | 0.3735 | 0.6588 | 0.0 | 0.6158 | 0.8231 |
| 0.891 | 12.3077 | 160 | 0.9007 | 0.2856 | 0.3519 | 0.3449 | 0.0 | 0.2607 | 0.3788 | 0.1029 | 0.3529 | 0.6735 | 0.0 | 0.6263 | 0.8462 |
| 1.1 | 13.0769 | 170 | 0.7376 | 0.2642 | 0.3608 | 0.3377 | 0.0 | 0.2281 | 0.4018 | 0.1176 | 0.3676 | 0.7176 | 0.0 | 0.7 | 0.8538 |
| 1.2631 | 13.8462 | 180 | 0.7162 | 0.306 | 0.4363 | 0.3899 | 0.0 | 0.2997 | 0.3933 | 0.1412 | 0.45 | 0.7059 | 0.0 | 0.7053 | 0.8154 |
| 1.0496 | 14.6154 | 190 | 0.7276 | 0.2811 | 0.3866 | 0.3483 | 0.0 | 0.3061 | 0.3685 | 0.1471 | 0.3882 | 0.7235 | 0.0 | 0.7316 | 0.8231 |
| 0.8883 | 15.3846 | 200 | 0.6855 | 0.3373 | 0.4578 | 0.4385 | 0.0 | 0.3441 | 0.4654 | 0.15 | 0.4824 | 0.7412 | 0.0 | 0.7579 | 0.8308 |
| 0.8471 | 16.1538 | 210 | 0.6733 | 0.4351 | 0.5932 | 0.5367 | 0.0 | 0.3702 | 0.6215 | 0.15 | 0.5412 | 0.7206 | 0.0 | 0.7158 | 0.8385 |
| 0.9084 | 16.9231 | 220 | 0.6526 | 0.4279 | 0.5632 | 0.4848 | 0.0 | 0.4011 | 0.572 | 0.1824 | 0.5647 | 0.7294 | 0.0 | 0.7105 | 0.8692 |
| 0.8872 | 17.6923 | 230 | 0.6218 | 0.4376 | 0.5753 | 0.5274 | 0.0 | 0.3879 | 0.6215 | 0.1559 | 0.5853 | 0.7382 | 0.0 | 0.7263 | 0.8692 |
| 0.9739 | 18.4615 | 240 | 0.6590 | 0.4494 | 0.6293 | 0.505 | 0.0 | 0.3889 | 0.65 | 0.1471 | 0.5853 | 0.7029 | 0.0 | 0.6895 | 0.8308 |
| 0.7596 | 19.2308 | 250 | 0.6367 | 0.4625 | 0.6229 | 0.5322 | 0.0 | 0.4106 | 0.6581 | 0.1529 | 0.5853 | 0.7118 | 0.0 | 0.7053 | 0.8308 |
| 0.7124 | 20.0 | 260 | 0.6601 | 0.4619 | 0.6411 | 0.5327 | 0.0 | 0.39 | 0.6852 | 0.1559 | 0.5765 | 0.6794 | 0.0 | 0.6421 | 0.8385 |
| 0.8369 | 20.7692 | 270 | 0.6363 | 0.4736 | 0.64 | 0.5738 | 0.0 | 0.3993 | 0.737 | 0.1559 | 0.5853 | 0.6853 | 0.0 | 0.6474 | 0.8462 |
| 0.8608 | 21.5385 | 280 | 0.6304 | 0.496 | 0.6406 | 0.5583 | 0.0 | 0.4484 | 0.6973 | 0.1588 | 0.5912 | 0.7 | 0.0 | 0.6579 | 0.8692 |
| 0.6174 | 22.3077 | 290 | 0.6825 | 0.4808 | 0.6714 | 0.5569 | 0.0 | 0.4264 | 0.6738 | 0.1529 | 0.5765 | 0.6735 | 0.0 | 0.6158 | 0.8615 |
| 0.5903 | 23.0769 | 300 | 0.6037 | 0.5187 | 0.6804 | 0.6126 | 0.0 | 0.4604 | 0.709 | 0.1824 | 0.6118 | 0.7206 | 0.0 | 0.6842 | 0.8846 |
| 0.6325 | 23.8462 | 310 | 0.6373 | 0.529 | 0.6819 | 0.6246 | 0.0 | 0.4489 | 0.7601 | 0.1765 | 0.5941 | 0.7088 | 0.0 | 0.6579 | 0.8923 |
| 0.8569 | 24.6154 | 320 | 0.6131 | 0.5382 | 0.6684 | 0.6357 | 0.0 | 0.4862 | 0.7382 | 0.1794 | 0.6147 | 0.7294 | 0.0 | 0.7 | 0.8846 |
| 0.7056 | 25.3846 | 330 | 0.5700 | 0.5244 | 0.6545 | 0.6089 | 0.0 | 0.4891 | 0.6871 | 0.1824 | 0.6176 | 0.75 | 0.0 | 0.7421 | 0.8769 |
| 0.5988 | 26.1538 | 340 | 0.5738 | 0.5437 | 0.7119 | 0.651 | 0.0 | 0.5362 | 0.6823 | 0.1853 | 0.6206 | 0.7529 | 0.0 | 0.7579 | 0.8615 |
| 0.5209 | 26.9231 | 350 | 0.6136 | 0.5153 | 0.6944 | 0.6047 | 0.0 | 0.4772 | 0.7054 | 0.1824 | 0.5882 | 0.7059 | 0.0 | 0.6789 | 0.8538 |
| 0.6547 | 27.6923 | 360 | 0.6338 | 0.5166 | 0.6645 | 0.6224 | 0.0 | 0.4842 | 0.7072 | 0.1882 | 0.5971 | 0.7088 | 0.0 | 0.6842 | 0.8538 |
| 0.6324 | 28.4615 | 370 | 0.6083 | 0.5143 | 0.6543 | 0.6279 | 0.0 | 0.4683 | 0.729 | 0.1853 | 0.6 | 0.7118 | 0.0 | 0.6789 | 0.8692 |
| 0.6323 | 29.2308 | 380 | 0.5748 | 0.529 | 0.6552 | 0.637 | 0.0 | 0.48 | 0.7529 | 0.1853 | 0.6088 | 0.7206 | 0.0 | 0.6842 | 0.8846 |
| 0.4509 | 30.0 | 390 | 0.5758 | 0.5311 | 0.652 | 0.6325 | 0.0 | 0.4923 | 0.7454 | 0.1882 | 0.6206 | 0.7324 | 0.0 | 0.7053 | 0.8846 |
| 0.8259 | 30.7692 | 400 | 0.5836 | 0.5257 | 0.6508 | 0.6241 | 0.0 | 0.4752 | 0.7513 | 0.1853 | 0.6 | 0.7147 | 0.0 | 0.6684 | 0.8923 |
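The Map/Mar columns above are COCO-style detection metrics. The evaluation code is not part of this card, but a small sketch with `torchmetrics`' `MeanAveragePrecision` (one common way to compute them; an assumption about tooling) shows where quantities such as `map_50`, `map_small`, and `mar_100` come from:
```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# Toy example: one image, two predicted boxes vs. one ground-truth box.
preds = [{
"boxes": torch.tensor([[10.0, 10.0, 50.0, 50.0], [60.0, 60.0, 90.0, 90.0]]),
"scores": torch.tensor([0.9, 0.4]),
"labels": torch.tensor([1, 0]), # e.g. 0 = object, 1 = balloon
}]
target = [{
"boxes": torch.tensor([[12.0, 12.0, 48.0, 48.0]]),
"labels": torch.tensor([1]),
}]

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)
metric.update(preds, target)
results = metric.compute()
# results holds map, map_50, map_75, map_small/medium/large, plus
# mar_1, mar_10, mar_100 and mar_small/medium/large: the same quantities
# reported per evaluation step in the table above. Buckets with nothing to
# evaluate come back as -1.0, as in some per-class columns elsewhere in this file.
print(results["map"], results["map_50"], results["mar_100"])
```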
### Framework versions
- Transformers 4.46.3
- Pytorch 2.4.0
- Datasets 3.1.0
- Tokenizers 0.20.0
| [
"object",
"balloon"
] |
Edgar404/Crop_disease_detr |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19",
"label_20",
"label_21",
"label_22"
] |
the-silent-kid/detr-colab |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4"
] |
LovrOP/detraaa_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detraaa_finetuned_cppe5
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8720
- Map: 0.0239
- Map 50: 0.0667
- Map 75: 0.0105
- Map Small: 0.0079
- Map Medium: 0.0505
- Map Large: 0.0493
- Mar 1: 0.067
- Mar 10: 0.1645
- Mar 100: 0.2185
- Mar Small: 0.1159
- Mar Medium: 0.2718
- Mar Large: 0.2179
- Map Bone-fracture: -1.0
- Mar 100 Bone-fracture: -1.0
- Map Angle: 0.0372
- Mar 100 Angle: 0.1583
- Map Fracture: 0.0068
- Mar 100 Fracture: 0.2237
- Map Line: 0.0042
- Mar 100 Line: 0.1829
- Map Messed Up Angle: 0.0473
- Mar 100 Messed Up Angle: 0.3091
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
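The run uses a cosine learning-rate schedule with no warmup listed; purely as an illustration of that setting (not the original training loop), the decay can be reproduced with `get_cosine_schedule_with_warmup`, assuming the 41 optimizer steps per epoch implied by the step column in the results table below:
```python
import torch
from transformers import get_cosine_schedule_with_warmup

# Dummy parameter/optimizer just to trace the LR curve of this configuration.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.AdamW([param], lr=5e-05, betas=(0.9, 0.999), eps=1e-08)

steps_per_epoch, num_epochs = 41, 30 # 41 steps/epoch inferred from the table below
total_steps = steps_per_epoch * num_epochs
scheduler = get_cosine_schedule_with_warmup(
optimizer, num_warmup_steps=0, num_training_steps=total_steps
)

lrs = []
for _ in range(total_steps):
optimizer.step()
scheduler.step()
lrs.append(scheduler.get_last_lr()[0])

print(lrs[0], lrs[total_steps // 2], lrs[-1]) # ~5e-05, ~2.5e-05, ~0
```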
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Bone-fracture | Mar 100 Bone-fracture | Map Angle | Mar 100 Angle | Map Fracture | Mar 100 Fracture | Map Line | Mar 100 Line | Map Messed Up Angle | Mar 100 Messed Up Angle |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:-------------------:|:-----------------------:|
| No log | 1.0 | 41 | 3.8122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 2.0 | 82 | 2.9585 | 0.0001 | 0.0005 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.003 | 0.0051 | 0.0 | 0.0127 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.0 | 0.0029 | 0.0002 | 0.0091 |
| No log | 3.0 | 123 | 2.7378 | 0.0001 | 0.0005 | 0.0 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0045 | 0.0096 | 0.0111 | 0.0117 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0356 | 0.0 | 0.0029 | 0.0 | 0.0 |
| No log | 4.0 | 164 | 2.5667 | 0.0001 | 0.0004 | 0.0 | 0.0001 | 0.0001 | 0.0 | 0.0 | 0.005 | 0.0224 | 0.0152 | 0.0362 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0458 | 0.0002 | 0.0257 | 0.0 | 0.0182 |
| No log | 5.0 | 205 | 2.3972 | 0.0004 | 0.0019 | 0.0 | 0.0002 | 0.001 | 0.0 | 0.0007 | 0.0171 | 0.0465 | 0.0242 | 0.0815 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0898 | 0.0005 | 0.06 | 0.0005 | 0.0364 |
| No log | 6.0 | 246 | 2.4845 | 0.0001 | 0.0004 | 0.0 | 0.0002 | 0.0001 | 0.0 | 0.0 | 0.0051 | 0.0234 | 0.0305 | 0.0283 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0593 | 0.0001 | 0.0343 | 0.0 | 0.0 |
| No log | 7.0 | 287 | 2.2863 | 0.0001 | 0.0005 | 0.0 | 0.0002 | 0.0001 | 0.0003 | 0.0 | 0.0043 | 0.0262 | 0.0354 | 0.025 | 0.025 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0003 | 0.0847 | 0.0001 | 0.02 | 0.0 | 0.0 |
| No log | 8.0 | 328 | 2.2480 | 0.0002 | 0.0011 | 0.0 | 0.0005 | 0.0003 | 0.0 | 0.003 | 0.0077 | 0.0338 | 0.0339 | 0.0408 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0864 | 0.0004 | 0.0486 | 0.0 | 0.0 |
| No log | 9.0 | 369 | 2.1551 | 0.0008 | 0.0037 | 0.0001 | 0.0024 | 0.0006 | 0.0 | 0.0024 | 0.0273 | 0.0544 | 0.0425 | 0.084 | 0.0 | 0.0 | 0.0 | 0.0028 | 0.1424 | 0.0004 | 0.0571 | 0.0001 | 0.0182 |
| No log | 10.0 | 410 | 2.1461 | 0.0005 | 0.0019 | 0.0 | 0.0014 | 0.0005 | 0.0 | 0.0078 | 0.0208 | 0.0513 | 0.0485 | 0.065 | 0.0 | 0.0 | 0.0 | 0.0014 | 0.1424 | 0.0005 | 0.0629 | 0.0 | 0.0 |
| No log | 11.0 | 451 | 2.1594 | 0.0013 | 0.0053 | 0.0 | 0.0032 | 0.0012 | 0.0009 | 0.0114 | 0.0346 | 0.0684 | 0.0577 | 0.0975 | 0.0625 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0042 | 0.1475 | 0.0002 | 0.0714 | 0.0006 | 0.0545 |
| No log | 12.0 | 492 | 2.0050 | 0.0014 | 0.0066 | 0.0 | 0.0037 | 0.0007 | 0.0028 | 0.0045 | 0.0435 | 0.0702 | 0.078 | 0.0742 | 0.0875 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0048 | 0.1661 | 0.0007 | 0.1057 | 0.0001 | 0.0091 |
| 4.8365 | 13.0 | 533 | 2.0575 | 0.0022 | 0.0087 | 0.0002 | 0.002 | 0.0086 | 0.0064 | 0.0083 | 0.054 | 0.1006 | 0.0781 | 0.1375 | 0.1232 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0023 | 0.161 | 0.0012 | 0.1143 | 0.0051 | 0.1273 |
| 4.8365 | 14.0 | 574 | 1.9988 | 0.0071 | 0.0222 | 0.0032 | 0.0025 | 0.0151 | 0.0061 | 0.014 | 0.0919 | 0.1275 | 0.0879 | 0.1797 | 0.0679 | -1.0 | -1.0 | 0.004 | 0.0333 | 0.0023 | 0.1864 | 0.0016 | 0.1629 | 0.0205 | 0.1273 |
| 4.8365 | 15.0 | 615 | 1.9491 | 0.0039 | 0.0146 | 0.0008 | 0.0034 | 0.0095 | 0.0033 | 0.0171 | 0.0772 | 0.1198 | 0.0751 | 0.1762 | 0.0929 | 0.0048 | 0.0417 | 0.0038 | 0.1949 | 0.0008 | 0.0971 | 0.006 | 0.1455 |
| 4.8365 | 16.0 | 656 | 1.9354 | 0.0048 | 0.0245 | 0.0005 | 0.0052 | 0.0106 | 0.0087 | 0.013 | 0.1011 | 0.1515 | 0.102 | 0.1874 | 0.0679 | -1.0 | -1.0 | 0.0087 | 0.0833 | 0.0051 | 0.1966 | 0.0011 | 0.1171 | 0.0042 | 0.2091 |
| 4.8365 | 17.0 | 697 | 1.9822 | 0.0042 | 0.0174 | 0.0013 | 0.0039 | 0.0075 | 0.0195 | 0.0231 | 0.0862 | 0.1299 | 0.1128 | 0.1565 | 0.0786 | -1.0 | -1.0 | 0.0018 | 0.0083 | 0.0048 | 0.1932 | 0.002 | 0.1543 | 0.0084 | 0.1636 |
| 4.8365 | 18.0 | 738 | 1.9518 | 0.0135 | 0.0331 | 0.0024 | 0.0027 | 0.0299 | 0.0263 | 0.0362 | 0.1359 | 0.1954 | 0.0739 | 0.2899 | 0.1839 | -1.0 | -1.0 | 0.0198 | 0.15 | 0.0037 | 0.1932 | 0.0021 | 0.12 | 0.0283 | 0.3182 |
| 4.8365 | 19.0 | 779 | 1.9752 | 0.008 | 0.042 | 0.0019 | 0.0034 | 0.0202 | 0.0085 | 0.0251 | 0.1405 | 0.173 | 0.1069 | 0.2424 | 0.0732 | -1.0 | -1.0 | 0.0098 | 0.15 | 0.0033 | 0.1424 | 0.0022 | 0.1543 | 0.0166 | 0.2455 |
| 4.8365 | 20.0 | 820 | 1.9388 | 0.0155 | 0.0519 | 0.0013 | 0.0047 | 0.034 | 0.0288 | 0.0459 | 0.164 | 0.2209 | 0.135 | 0.2656 | 0.1857 | -1.0 | -1.0 | 0.0176 | 0.1 | 0.0044 | 0.1763 | 0.0034 | 0.18 | 0.0367 | 0.4273 |
| 4.8365 | 21.0 | 861 | 1.9225 | 0.0265 | 0.0666 | 0.0035 | 0.0045 | 0.0545 | 0.0311 | 0.0765 | 0.163 | 0.2146 | 0.1159 | 0.2651 | 0.1625 | -1.0 | -1.0 | 0.0395 | 0.15 | 0.0048 | 0.2 | 0.0028 | 0.1629 | 0.0588 | 0.3455 |
| 4.8365 | 22.0 | 902 | 1.9158 | 0.0151 | 0.0549 | 0.0072 | 0.006 | 0.0398 | 0.0326 | 0.0542 | 0.1634 | 0.1978 | 0.1108 | 0.2455 | 0.1143 | -1.0 | -1.0 | 0.0272 | 0.15 | 0.0044 | 0.1932 | 0.0033 | 0.1571 | 0.0256 | 0.2909 |
| 4.8365 | 23.0 | 943 | 1.8807 | 0.0203 | 0.0553 | 0.0033 | 0.0078 | 0.0464 | 0.0478 | 0.055 | 0.1664 | 0.2098 | 0.1313 | 0.2568 | 0.1768 | -1.0 | -1.0 | 0.0263 | 0.15 | 0.0069 | 0.2 | 0.0034 | 0.18 | 0.0447 | 0.3091 |
| 4.8365 | 24.0 | 984 | 1.8769 | 0.0265 | 0.0645 | 0.0091 | 0.0075 | 0.0546 | 0.0487 | 0.0709 | 0.1737 | 0.221 | 0.1134 | 0.275 | 0.2125 | -1.0 | -1.0 | 0.0379 | 0.1667 | 0.0065 | 0.2068 | 0.0043 | 0.1743 | 0.0574 | 0.3364 |
| 1.6844 | 25.0 | 1025 | 1.8751 | 0.0261 | 0.0661 | 0.0091 | 0.0078 | 0.0532 | 0.0568 | 0.0707 | 0.173 | 0.2203 | 0.1155 | 0.2708 | 0.2429 | -1.0 | -1.0 | 0.0371 | 0.175 | 0.0066 | 0.2169 | 0.0041 | 0.18 | 0.0565 | 0.3091 |
| 1.6844 | 26.0 | 1066 | 1.8840 | 0.0223 | 0.062 | 0.0068 | 0.0078 | 0.0453 | 0.0272 | 0.0578 | 0.1576 | 0.2145 | 0.1277 | 0.2713 | 0.1768 | -1.0 | -1.0 | 0.0332 | 0.1417 | 0.007 | 0.2237 | 0.0037 | 0.1743 | 0.0454 | 0.3182 |
| 1.6844 | 27.0 | 1107 | 1.8819 | 0.0273 | 0.0703 | 0.0115 | 0.0077 | 0.056 | 0.0447 | 0.0707 | 0.1692 | 0.2239 | 0.1149 | 0.2657 | 0.2321 | -1.0 | -1.0 | 0.0419 | 0.1583 | 0.0068 | 0.2203 | 0.0039 | 0.1714 | 0.0565 | 0.3455 |
| 1.6844 | 28.0 | 1148 | 1.8702 | 0.024 | 0.0669 | 0.0105 | 0.0079 | 0.0483 | 0.0487 | 0.067 | 0.1658 | 0.2173 | 0.1169 | 0.2666 | 0.2179 | -1.0 | -1.0 | 0.0381 | 0.1667 | 0.0069 | 0.2254 | 0.0041 | 0.1771 | 0.047 | 0.3 |
| 1.6844 | 29.0 | 1189 | 1.8712 | 0.024 | 0.0667 | 0.0105 | 0.0079 | 0.0485 | 0.0493 | 0.067 | 0.1666 | 0.2206 | 0.1159 | 0.2754 | 0.2179 | -1.0 | -1.0 | 0.0378 | 0.1667 | 0.0068 | 0.2237 | 0.0041 | 0.1829 | 0.0473 | 0.3091 |
| 1.6844 | 30.0 | 1230 | 1.8720 | 0.0239 | 0.0667 | 0.0105 | 0.0079 | 0.0505 | 0.0493 | 0.067 | 0.1645 | 0.2185 | 0.1159 | 0.2718 | 0.2179 | -1.0 | -1.0 | 0.0372 | 0.1583 | 0.0068 | 0.2237 | 0.0042 | 0.1829 | 0.0473 | 0.3091 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cpu
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"bone-fracture",
"angle",
"fracture",
"line",
"messed_up_angle"
] |
Anurag277/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
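No usage example is documented; assuming the uploaded checkpoint follows the standard `transformers` object-detection layout, a fine-tuned DETR model like this one can typically be loaded and run as sketched below (the image path is a placeholder):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "Anurag277/detr-resnet-50_finetuned_cppe5"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")  # placeholder image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (label, score, box) triples above a confidence threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), [round(v, 1) for v in box.tolist()])
```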
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.21.0
| [
"n/a",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"street sign",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"hat",
"backpack",
"umbrella",
"shoe",
"eye glasses",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"plate",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"mirror",
"dining table",
"window",
"desk",
"toilet",
"door",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"blender",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
L0ki2008/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
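Usage is not documented either; assuming a standard `transformers` checkpoint layout, the high-level object-detection pipeline gives the shortest path to inference (a sketch, with a placeholder image path):

```python
from transformers import pipeline

# The pipeline wraps image preprocessing, the forward pass and box post-processing.
detector = pipeline("object-detection", model="L0ki2008/detr-resnet-50_finetuned_cppe5")
predictions = detector("example.jpg", threshold=0.5)
for pred in predictions:
    print(pred["label"], round(pred["score"], 3), pred["box"])
```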
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Tokenizers 0.20.3
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
qubvel-hf/rt-detr-finetuned-cppe-5-3k-steps |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rt-detr-finetuned-cppe-5-3k-steps
This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on the cppe-5 dataset.
It achieves the following results on the evaluation set:
- Loss: 9.1012
- Map: 0.2813
- Map 50: 0.5271
- Map 75: 0.2685
- Map Small: 0.0879
- Map Medium: 0.2399
- Map Large: 0.4613
- Mar 1: 0.3061
- Mar 10: 0.4664
- Mar 100: 0.5014
- Mar Small: 0.2985
- Mar Medium: 0.4465
- Mar Large: 0.6698
- Map Coverall: 0.4438
- Mar 100 Coverall: 0.6815
- Map Face Shield: 0.2983
- Mar 100 Face Shield: 0.4924
- Map Gloves: 0.2305
- Mar 100 Gloves: 0.4817
- Map Goggles: 0.1591
- Mar 100 Goggles: 0.3969
- Map Mask: 0.275
- Mar 100 Mask: 0.4547
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 30.0
- mixed_precision_training: Native AMP
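One detail the auto-generated card does not spell out: fine-tuning PekingU/rtdetr_r50vd_coco_o365 on the five CPPE-5 classes reported above (coverall, face shield, gloves, goggles, mask) requires swapping the pretrained detection head for a 5-class one. A hedged sketch of how the base checkpoint is typically loaded for that purpose:

```python
from transformers import AutoModelForObjectDetection

# Assumed label order; it only matters that it matches the dataset's annotation ids.
classes = ["coverall", "face_shield", "gloves", "goggles", "mask"]
id2label = {i: name for i, name in enumerate(classes)}
label2id = {name: i for i, name in id2label.items()}

model = AutoModelForObjectDetection.from_pretrained(
    "PekingU/rtdetr_r50vd_coco_o365",
    id2label=id2label,
    label2id=label2id,
    ignore_mismatched_sizes=True,  # drop the COCO/Objects365 head, initialise a fresh 5-class head
)
```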
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| 41.0803 | 1.0 | 107 | 14.7670 | 0.0685 | 0.1413 | 0.0589 | 0.0211 | 0.0496 | 0.1045 | 0.0968 | 0.2251 | 0.2568 | 0.127 | 0.201 | 0.3699 | 0.2045 | 0.3964 | 0.0514 | 0.3013 | 0.0134 | 0.1777 | 0.0031 | 0.1938 | 0.0701 | 0.2147 |
| 18.7124 | 2.0 | 214 | 11.5242 | 0.1664 | 0.3287 | 0.148 | 0.0451 | 0.1291 | 0.265 | 0.1963 | 0.3451 | 0.3986 | 0.2461 | 0.3493 | 0.5659 | 0.3441 | 0.5757 | 0.1158 | 0.4367 | 0.1193 | 0.3446 | 0.0676 | 0.2754 | 0.1854 | 0.3604 |
| 16.6913 | 3.0 | 321 | 12.2486 | 0.1399 | 0.3041 | 0.1087 | 0.029 | 0.1187 | 0.2072 | 0.1663 | 0.3153 | 0.3776 | 0.1937 | 0.3363 | 0.5251 | 0.3001 | 0.5545 | 0.1189 | 0.443 | 0.0832 | 0.3326 | 0.0431 | 0.2092 | 0.1543 | 0.3484 |
| 15.7025 | 4.0 | 428 | 11.5488 | 0.1819 | 0.3681 | 0.1619 | 0.0474 | 0.1853 | 0.2567 | 0.2206 | 0.351 | 0.3984 | 0.1976 | 0.3882 | 0.4911 | 0.3217 | 0.5752 | 0.1295 | 0.3709 | 0.1665 | 0.3241 | 0.1069 | 0.3292 | 0.1847 | 0.3924 |
| 14.7758 | 5.0 | 535 | 12.9331 | 0.1605 | 0.3126 | 0.1484 | 0.0402 | 0.1512 | 0.2597 | 0.1985 | 0.3135 | 0.3435 | 0.1853 | 0.3037 | 0.4472 | 0.2383 | 0.4365 | 0.14 | 0.3304 | 0.1715 | 0.3402 | 0.0775 | 0.2569 | 0.1754 | 0.3538 |
| 14.4905 | 6.0 | 642 | 10.1206 | 0.219 | 0.4286 | 0.1979 | 0.0502 | 0.1962 | 0.3665 | 0.2516 | 0.4058 | 0.4494 | 0.216 | 0.4008 | 0.5944 | 0.404 | 0.673 | 0.1247 | 0.3975 | 0.2031 | 0.4402 | 0.1247 | 0.2985 | 0.2387 | 0.4378 |
| 13.4909 | 7.0 | 749 | 9.6988 | 0.245 | 0.4741 | 0.2326 | 0.0588 | 0.2417 | 0.3664 | 0.2647 | 0.4201 | 0.4572 | 0.2248 | 0.4316 | 0.5821 | 0.4571 | 0.6806 | 0.1374 | 0.3405 | 0.1979 | 0.4473 | 0.159 | 0.3831 | 0.2736 | 0.4347 |
| 13.2245 | 8.0 | 856 | 9.7353 | 0.2375 | 0.4622 | 0.227 | 0.0528 | 0.2347 | 0.3748 | 0.2658 | 0.4291 | 0.4815 | 0.2621 | 0.4423 | 0.6434 | 0.4127 | 0.6707 | 0.1807 | 0.4696 | 0.2293 | 0.4821 | 0.1382 | 0.3462 | 0.2265 | 0.4391 |
| 12.7879 | 9.0 | 963 | 9.2718 | 0.2604 | 0.4832 | 0.2502 | 0.0654 | 0.2323 | 0.4055 | 0.2818 | 0.4304 | 0.4788 | 0.2339 | 0.4387 | 0.6353 | 0.4797 | 0.6959 | 0.1558 | 0.4329 | 0.2371 | 0.4884 | 0.1658 | 0.3354 | 0.2633 | 0.4413 |
| 12.2499 | 10.0 | 1070 | 9.5461 | 0.2547 | 0.4956 | 0.2404 | 0.0582 | 0.2249 | 0.418 | 0.2725 | 0.443 | 0.4933 | 0.2713 | 0.4359 | 0.6505 | 0.4224 | 0.7059 | 0.2289 | 0.4544 | 0.2279 | 0.4915 | 0.1501 | 0.3862 | 0.2443 | 0.4284 |
| 12.1284 | 11.0 | 1177 | 9.6199 | 0.2611 | 0.5056 | 0.2322 | 0.0731 | 0.2459 | 0.3958 | 0.2646 | 0.4262 | 0.4704 | 0.2817 | 0.4256 | 0.598 | 0.4442 | 0.6568 | 0.211 | 0.4354 | 0.2404 | 0.475 | 0.1468 | 0.3446 | 0.2633 | 0.44 |
| 11.9831 | 12.0 | 1284 | 9.3471 | 0.2556 | 0.5007 | 0.2397 | 0.0889 | 0.2476 | 0.3971 | 0.2822 | 0.4573 | 0.5045 | 0.3054 | 0.4747 | 0.6499 | 0.4483 | 0.6806 | 0.1954 | 0.4911 | 0.2084 | 0.4786 | 0.1867 | 0.4292 | 0.2391 | 0.4431 |
| 11.8266 | 13.0 | 1391 | 9.5850 | 0.2329 | 0.4647 | 0.2113 | 0.0657 | 0.2009 | 0.3955 | 0.2667 | 0.4193 | 0.4586 | 0.2653 | 0.3996 | 0.6083 | 0.4132 | 0.6329 | 0.1537 | 0.4063 | 0.2098 | 0.4768 | 0.1427 | 0.3569 | 0.2451 | 0.42 |
| 11.6433 | 14.0 | 1498 | 9.7106 | 0.2353 | 0.472 | 0.2148 | 0.0697 | 0.2149 | 0.3916 | 0.2711 | 0.4243 | 0.4627 | 0.2665 | 0.4189 | 0.6111 | 0.3916 | 0.6248 | 0.1831 | 0.419 | 0.2157 | 0.4437 | 0.1391 | 0.3785 | 0.2469 | 0.4476 |
| 11.8852 | 15.0 | 1605 | 10.8775 | 0.2088 | 0.4137 | 0.1993 | 0.0564 | 0.1788 | 0.3643 | 0.2388 | 0.3642 | 0.3879 | 0.2372 | 0.3324 | 0.5194 | 0.3625 | 0.5649 | 0.1555 | 0.3304 | 0.1954 | 0.4138 | 0.1242 | 0.2615 | 0.2062 | 0.3689 |
| 11.4842 | 16.0 | 1712 | 9.3761 | 0.2585 | 0.5013 | 0.2454 | 0.0648 | 0.2309 | 0.4104 | 0.2752 | 0.4329 | 0.4671 | 0.2841 | 0.4057 | 0.6237 | 0.4659 | 0.6676 | 0.2002 | 0.4329 | 0.2453 | 0.4701 | 0.1407 | 0.3308 | 0.2402 | 0.4342 |
| 11.1006 | 17.0 | 1819 | 9.2561 | 0.2683 | 0.5134 | 0.2582 | 0.0777 | 0.234 | 0.435 | 0.2884 | 0.4437 | 0.4887 | 0.2774 | 0.4407 | 0.6443 | 0.4587 | 0.6586 | 0.2317 | 0.4519 | 0.2363 | 0.4647 | 0.1587 | 0.4015 | 0.2559 | 0.4667 |
| 10.9366 | 18.0 | 1926 | 9.3039 | 0.2626 | 0.4996 | 0.251 | 0.0669 | 0.2413 | 0.4333 | 0.2889 | 0.4399 | 0.4722 | 0.2662 | 0.4241 | 0.6188 | 0.4581 | 0.6572 | 0.1891 | 0.4076 | 0.2421 | 0.467 | 0.1489 | 0.3692 | 0.2748 | 0.46 |
| 10.7473 | 19.0 | 2033 | 9.4736 | 0.2649 | 0.5138 | 0.2541 | 0.082 | 0.2386 | 0.4461 | 0.2883 | 0.4318 | 0.4722 | 0.2856 | 0.4252 | 0.6165 | 0.4438 | 0.655 | 0.2526 | 0.4519 | 0.222 | 0.4598 | 0.1568 | 0.3492 | 0.2494 | 0.4453 |
| 10.7605 | 20.0 | 2140 | 9.2816 | 0.269 | 0.5104 | 0.2442 | 0.0765 | 0.2403 | 0.4417 | 0.293 | 0.4501 | 0.4914 | 0.2695 | 0.4398 | 0.6531 | 0.4441 | 0.6523 | 0.2429 | 0.4582 | 0.2428 | 0.4951 | 0.1547 | 0.3862 | 0.2606 | 0.4653 |
| 10.7865 | 21.0 | 2247 | 9.3265 | 0.2757 | 0.5368 | 0.2621 | 0.0731 | 0.2484 | 0.4379 | 0.2912 | 0.443 | 0.4793 | 0.2896 | 0.441 | 0.6178 | 0.4589 | 0.6455 | 0.2726 | 0.481 | 0.2327 | 0.4768 | 0.1692 | 0.3354 | 0.245 | 0.4578 |
| 10.5171 | 22.0 | 2354 | 9.5773 | 0.2554 | 0.506 | 0.2458 | 0.0866 | 0.2131 | 0.435 | 0.2866 | 0.4385 | 0.4752 | 0.3181 | 0.4226 | 0.621 | 0.4187 | 0.6405 | 0.2536 | 0.4633 | 0.2051 | 0.4545 | 0.1539 | 0.3738 | 0.2457 | 0.444 |
| 10.7464 | 23.0 | 2461 | 9.4040 | 0.2544 | 0.5064 | 0.2414 | 0.0697 | 0.2089 | 0.4424 | 0.288 | 0.4357 | 0.4732 | 0.2358 | 0.4181 | 0.641 | 0.4118 | 0.6468 | 0.2571 | 0.4709 | 0.1966 | 0.4504 | 0.1574 | 0.36 | 0.2489 | 0.4378 |
| 10.5963 | 24.0 | 2568 | 9.1140 | 0.272 | 0.5293 | 0.2615 | 0.0838 | 0.2326 | 0.4477 | 0.2921 | 0.4473 | 0.4877 | 0.3056 | 0.4231 | 0.65 | 0.4342 | 0.6572 | 0.2733 | 0.4772 | 0.226 | 0.4808 | 0.1649 | 0.3631 | 0.2618 | 0.46 |
| 10.4877 | 25.0 | 2675 | 9.2811 | 0.2738 | 0.5409 | 0.2496 | 0.0811 | 0.2321 | 0.4614 | 0.2978 | 0.4528 | 0.4859 | 0.2937 | 0.4261 | 0.6431 | 0.4298 | 0.6527 | 0.2947 | 0.5025 | 0.2465 | 0.4714 | 0.1532 | 0.3662 | 0.2445 | 0.4364 |
| 10.5136 | 26.0 | 2782 | 9.2285 | 0.2809 | 0.5362 | 0.2521 | 0.0764 | 0.2341 | 0.4694 | 0.3014 | 0.4547 | 0.4955 | 0.2986 | 0.4321 | 0.6618 | 0.4342 | 0.6725 | 0.2799 | 0.4911 | 0.247 | 0.4701 | 0.1726 | 0.3862 | 0.271 | 0.4578 |
| 10.4462 | 27.0 | 2889 | 9.1017 | 0.2803 | 0.5419 | 0.2617 | 0.0914 | 0.2296 | 0.4515 | 0.2973 | 0.4663 | 0.5034 | 0.3121 | 0.4374 | 0.6604 | 0.4503 | 0.6842 | 0.2965 | 0.4899 | 0.2318 | 0.4853 | 0.1656 | 0.4077 | 0.2574 | 0.4498 |
| 10.3325 | 28.0 | 2996 | 9.0687 | 0.2849 | 0.5344 | 0.256 | 0.0947 | 0.2288 | 0.4799 | 0.3076 | 0.4598 | 0.4947 | 0.3034 | 0.4193 | 0.6639 | 0.4589 | 0.6752 | 0.2833 | 0.4709 | 0.2302 | 0.4754 | 0.1863 | 0.3969 | 0.2658 | 0.4551 |
| 10.3327 | 29.0 | 3103 | 9.1673 | 0.2818 | 0.5364 | 0.264 | 0.0932 | 0.2415 | 0.4556 | 0.3083 | 0.4606 | 0.4995 | 0.3186 | 0.432 | 0.659 | 0.4379 | 0.6784 | 0.2937 | 0.4772 | 0.2241 | 0.4647 | 0.18 | 0.4185 | 0.2731 | 0.4587 |
| 10.296 | 30.0 | 3210 | 9.1012 | 0.2813 | 0.5271 | 0.2685 | 0.0879 | 0.2399 | 0.4613 | 0.3061 | 0.4664 | 0.5014 | 0.2985 | 0.4465 | 0.6698 | 0.4438 | 0.6815 | 0.2983 | 0.4924 | 0.2305 | 0.4817 | 0.1591 | 0.3969 | 0.275 | 0.4547 |
### Framework versions
- Transformers 4.47.0.dev0
- Pytorch 2.5.0+cu118
- Datasets 2.21.0
- Tokenizers 0.20.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
sina99/detr-resnet-50-dc5-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- eval_loss: 4.9164
- eval_map: 0.0135
- eval_map_50: 0.0332
- eval_map_75: 0.0009
- eval_map_small: 0.0421
- eval_map_medium: 0.0229
- eval_map_large: 0.0056
- eval_mar_1: 0.0139
- eval_mar_10: 0.0241
- eval_mar_100: 0.0241
- eval_mar_small: 0.0417
- eval_mar_medium: 0.0341
- eval_mar_large: 0.0185
- eval_map_shirt, blouse: 0.0
- eval_mar_100_shirt, blouse: 0.0
- eval_map_top, t-shirt, sweatshirt: 0.0163
- eval_mar_100_top, t-shirt, sweatshirt: 0.1
- eval_map_cardigan: -1.0
- eval_mar_100_cardigan: -1.0
- eval_map_jacket: 0.0
- eval_mar_100_jacket: 0.0
- eval_map_pants: 0.0
- eval_mar_100_pants: 0.0
- eval_map_shorts: 0.0
- eval_mar_100_shorts: 0.0
- eval_map_skirt: 0.0
- eval_mar_100_skirt: 0.0
- eval_map_coat: 0.0
- eval_mar_100_coat: 0.0
- eval_map_dress: 0.0
- eval_mar_100_dress: 0.0
- eval_map_glasses: 0.0
- eval_mar_100_glasses: 0.0
- eval_map_hat: -1.0
- eval_mar_100_hat: -1.0
- eval_map_glove: -1.0
- eval_mar_100_glove: -1.0
- eval_map_belt: -1.0
- eval_mar_100_belt: -1.0
- eval_map_tights, stockings: 0.0
- eval_mar_100_tights, stockings: 0.0
- eval_map_shoe: 0.2259
- eval_mar_100_shoe: 0.3333
- eval_map_bag, wallet: -1.0
- eval_mar_100_bag, wallet: -1.0
- eval_map_scarf: -1.0
- eval_mar_100_scarf: -1.0
- eval_map_umbrella: 0.0
- eval_mar_100_umbrella: 0.0
- eval_map_hood: -1.0
- eval_mar_100_hood: -1.0
- eval_map_collar: 0.0
- eval_mar_100_collar: 0.0
- eval_map_lapel: 0.0
- eval_mar_100_lapel: 0.0
- eval_map_sleeve: 0.0
- eval_mar_100_sleeve: 0.0
- eval_map_pocket: 0.0
- eval_mar_100_pocket: 0.0
- eval_map_neckline: 0.0
- eval_mar_100_neckline: 0.0
- eval_map_buckle: -1.0
- eval_mar_100_buckle: -1.0
- eval_map_zipper: 0.0
- eval_mar_100_zipper: 0.0
- eval_map_fringe: -1.0
- eval_mar_100_fringe: -1.0
- eval_map_ribbon: -1.0
- eval_mar_100_ribbon: -1.0
- eval_map_rivet: -1.0
- eval_mar_100_rivet: -1.0
- eval_map_tassel: -1.0
- eval_mar_100_tassel: -1.0
- eval_runtime: 1.7311
- eval_samples_per_second: 6.354
- eval_steps_per_second: 1.733
- epoch: 1.3158
- step: 150
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
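Note that this run is bounded by an optimisation-step budget rather than an epoch count; in `TrainingArguments` terms, `training_steps: 10000` corresponds to `max_steps` and the Native AMP line to `fp16=True`. A minimal sketch with a placeholder `output_dir`:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-resnet-50-dc5-fashionpedia-finetuned",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=10_000,   # "training_steps: 10000" above
    fp16=True,          # "mixed_precision_training: Native AMP"
)
```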
### Framework versions
- Transformers 4.46.3
- Pytorch 1.12.1+cu116
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
kvbiii/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
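One piece the auto-generated card omits is how batches of size 8 are assembled: DETR-style models accept a variable number of annotated boxes per image, so the targets cannot be stacked into a single tensor. A typical data collator for such a setup (an assumption about this training run, not documented here) looks like:

```python
import torch

def collate_fn(batch):
    # Each item is assumed to be the output of the DETR image processor:
    # an already resized/padded "pixel_values" tensor plus a dict of target boxes/classes in "labels".
    pixel_values = torch.stack([item["pixel_values"] for item in batch])
    labels = [item["labels"] for item in batch]  # variable-length, so kept as a list of dicts
    return {"pixel_values": pixel_values, "labels": labels}
```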
### Training results
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
osmanh/detr-resnet_finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet_finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.6712
- Model Preparation Time: 0.0124
- Map: 0.0058
- Map 50: 0.0128
- Map 75: 0.0046
- Map Small: 0.0042
- Map Medium: 0.0081
- Map Large: 0.0055
- Mar 1: 0.0128
- Mar 10: 0.0308
- Mar 100: 0.0362
- Mar Small: 0.0239
- Mar Medium: 0.0494
- Mar Large: 0.0448
- Map Shirt, blouse: 0.0
- Mar 100 Shirt, blouse: 0.0
- Map Top, t-shirt, sweatshirt: 0.0
- Mar 100 Top, t-shirt, sweatshirt: 0.0
- Map Cardigan: 0.0
- Mar 100 Cardigan: 0.0
- Map Jacket: 0.0
- Mar 100 Jacket: 0.0
- Map Vest: 0.0
- Mar 100 Vest: 0.0
- Map Pants: 0.0
- Mar 100 Pants: 0.0
- Map Shorts: 0.0
- Mar 100 Shorts: 0.0
- Map Skirt: 0.0
- Mar 100 Skirt: 0.0
- Map Coat: 0.0
- Mar 100 Coat: 0.0
- Map Dress: 0.0681
- Mar 100 Dress: 0.2918
- Map Jumpsuit: 0.0
- Mar 100 Jumpsuit: 0.0
- Map Cape: 0.0
- Mar 100 Cape: 0.0
- Map Glasses: 0.0
- Mar 100 Glasses: 0.0
- Map Hat: 0.0
- Mar 100 Hat: 0.0
- Map Headband, head covering, hair accessory: 0.0
- Mar 100 Headband, head covering, hair accessory: 0.0
- Map Tie: 0.0
- Mar 100 Tie: 0.0
- Map Glove: 0.0
- Mar 100 Glove: 0.0
- Map Watch: 0.0
- Mar 100 Watch: 0.0
- Map Belt: 0.0
- Mar 100 Belt: 0.0
- Map Tights, stockings: 0.0
- Mar 100 Tights, stockings: 0.0
- Map Sock: 0.0
- Mar 100 Sock: 0.0
- Map Shoe: 0.1226
- Mar 100 Shoe: 0.5358
- Map Bag, wallet: 0.0
- Mar 100 Bag, wallet: 0.0
- Map Scarf: 0.0
- Mar 100 Scarf: 0.0
- Map Umbrella: 0.0
- Mar 100 Umbrella: 0.0
- Map Hood: 0.0
- Mar 100 Hood: 0.0
- Map Collar: 0.0
- Mar 100 Collar: 0.0
- Map Lapel: 0.0
- Mar 100 Lapel: 0.0
- Map Epaulette: 0.0
- Mar 100 Epaulette: 0.0
- Map Sleeve: 0.0368
- Mar 100 Sleeve: 0.4661
- Map Pocket: 0.0
- Mar 100 Pocket: 0.0
- Map Neckline: 0.0106
- Mar 100 Neckline: 0.1905
- Map Buckle: 0.0
- Mar 100 Buckle: 0.0
- Map Zipper: 0.0
- Mar 100 Zipper: 0.0
- Map Applique: 0.0
- Mar 100 Applique: 0.0
- Map Bow: 0.0
- Mar 100 Bow: 0.0
- Map Flower: 0.0
- Mar 100 Flower: 0.0
- Map Ribbon: 0.0
- Mar 100 Ribbon: 0.0
- Map Rivet: 0.0
- Mar 100 Rivet: 0.0
- Map Ruffle: 0.0
- Mar 100 Ruffle: 0.0
- Map Sequin: 0.0
- Mar 100 Sequin: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 3000
- mixed_precision_training: Native AMP
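The results table below reports metrics every 50 optimisation steps (every half epoch at batch size 4); in `TrainingArguments` terms that corresponds roughly to `eval_strategy="steps"` with `eval_steps=50` alongside the options listed above. A sketch with an assumed `output_dir`:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-resnet_finetuned",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=3000,            # "training_steps: 3000"
    fp16=True,                 # "mixed_precision_training: Native AMP"
    eval_strategy="steps",
    eval_steps=50,             # the results table logs metrics every 50 steps
)
```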
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Shirt, blouse | Mar 100 Shirt, blouse | Map Top, t-shirt, sweatshirt | Mar 100 Top, t-shirt, sweatshirt | Map Cardigan | Mar 100 Cardigan | Map Jacket | Mar 100 Jacket | Map Vest | Mar 100 Vest | Map Pants | Mar 100 Pants | Map Shorts | Mar 100 Shorts | Map Skirt | Mar 100 Skirt | Map Coat | Mar 100 Coat | Map Dress | Mar 100 Dress | Map Jumpsuit | Mar 100 Jumpsuit | Map Cape | Mar 100 Cape | Map Glasses | Mar 100 Glasses | Map Hat | Mar 100 Hat | Map Headband, head covering, hair accessory | Mar 100 Headband, head covering, hair accessory | Map Tie | Mar 100 Tie | Map Glove | Mar 100 Glove | Map Watch | Mar 100 Watch | Map Belt | Mar 100 Belt | Map Leg warmer | Mar 100 Leg warmer | Map Tights, stockings | Mar 100 Tights, stockings | Map Sock | Mar 100 Sock | Map Shoe | Mar 100 Shoe | Map Bag, wallet | Mar 100 Bag, wallet | Map Scarf | Mar 100 Scarf | Map Umbrella | Mar 100 Umbrella | Map Hood | Mar 100 Hood | Map Collar | Mar 100 Collar | Map Lapel | Mar 100 Lapel | Map Epaulette | Mar 100 Epaulette | Map Sleeve | Mar 100 Sleeve | Map Pocket | Mar 100 Pocket | Map Neckline | Mar 100 Neckline | Map Buckle | Mar 100 Buckle | Map Zipper | Mar 100 Zipper | Map Applique | Mar 100 Applique | Map Bead | Mar 100 Bead | Map Bow | Mar 100 Bow | Map Flower | Mar 100 Flower | Map Ribbon | Mar 100 Ribbon | Map Rivet | Mar 100 Rivet | Map Ruffle | Mar 100 Ruffle | Map Sequin | Mar 100 Sequin | Map Tassel | Mar 100 Tassel |
|:-------------:|:-----:|:----:|:---------------:|:----------------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:----------------------------:|:--------------------------------:|:------------:|:----------------:|:----------:|:--------------:|:--------:|:------------:|:---------:|:-------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:-----------:|:---------------:|:-------:|:-----------:|:-------------------------------------------:|:-----------------------------------------------:|:-------:|:-----------:|:---------:|:-------------:|:---------:|:-------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:--------:|:------------:|:---------------:|:-------------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:----------:|:--------------:|:---------:|:-------------:|:-------------:|:-----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:--------:|:------------:|:-------:|:-----------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|
| 6.151 | 0.5 | 50 | 6.2529 | 0.0124 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0002 | 0.0025 | 0.0064 | 0.0002 | 0.0039 | 0.0148 | 0.0 | 0.0 | 0.0004 | 0.0467 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0571 | 0.0003 | 0.0612 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0157 | 0.0001 | 0.0826 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| 6.5354 | 1.0 | 100 | 5.5728 | 0.0124 | 0.0001 | 0.0003 | 0.0001 | 0.0001 | 0.0003 | 0.0001 | 0.0006 | 0.0021 | 0.004 | 0.0023 | 0.0064 | 0.0037 | 0.0 | 0.0 | 0.0007 | 0.0667 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0043 | 0.0955 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| 4.9315 | 1.5 | 150 | 4.9862 | 0.0124 | 0.0002 | 0.0007 | 0.0001 | 0.0005 | 0.0005 | 0.0001 | 0.001 | 0.007 | 0.0114 | 0.0069 | 0.0204 | 0.0064 | 0.0004 | 0.02 | 0.0014 | 0.1167 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0078 | 0.3052 | 0.0 | 0.0174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0096 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 |
| 4.4734 | 2.0 | 200 | 4.3909 | 0.0124 | 0.0005 | 0.0013 | 0.0003 | 0.0013 | 0.0009 | 0.0001 | 0.0014 | 0.0069 | 0.0102 | 0.0101 | 0.0182 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0196 | 0.3746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0443 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8903 | 2.5 | 250 | 4.0256 | 0.0124 | 0.0006 | 0.0017 | 0.0003 | 0.0011 | 0.0011 | 0.0001 | 0.0018 | 0.0073 | 0.0115 | 0.0095 | 0.0209 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0239 | 0.394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0765 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.134 | 3.0 | 300 | 3.7973 | 0.0124 | 0.0009 | 0.0026 | 0.0004 | 0.0016 | 0.0015 | 0.0003 | 0.0025 | 0.008 | 0.0107 | 0.0094 | 0.0191 | 0.0025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0356 | 0.3657 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0748 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3223 | 3.5 | 350 | 3.6230 | 0.0124 | 0.0011 | 0.0025 | 0.0009 | 0.0024 | 0.0017 | 0.0002 | 0.0024 | 0.0084 | 0.0115 | 0.0099 | 0.0186 | 0.0087 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0428 | 0.3276 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0022 | 0.1452 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8292 | 4.0 | 400 | 3.5537 | 0.0124 | 0.0012 | 0.0028 | 0.0009 | 0.0018 | 0.0018 | 0.0001 | 0.0024 | 0.0083 | 0.0109 | 0.0118 | 0.0175 | 0.0033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.047 | 0.3754 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0008 | 0.0713 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.3096 | 4.5 | 450 | 3.4566 | 0.0124 | 0.0012 | 0.0033 | 0.0006 | 0.0027 | 0.0018 | 0.0002 | 0.0027 | 0.0085 | 0.0111 | 0.0098 | 0.0179 | 0.007 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0488 | 0.3209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0017 | 0.133 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7201 | 5.0 | 500 | 3.3342 | 0.0124 | 0.0013 | 0.0035 | 0.0007 | 0.0019 | 0.0021 | 0.0004 | 0.0024 | 0.0102 | 0.0141 | 0.0121 | 0.0231 | 0.0107 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0528 | 0.4284 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0023 | 0.1513 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8089 | 5.5 | 550 | 3.2387 | 0.0124 | 0.0019 | 0.0047 | 0.0014 | 0.0023 | 0.0032 | 0.0005 | 0.0029 | 0.0119 | 0.0159 | 0.0124 | 0.0255 | 0.0149 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0744 | 0.4269 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0041 | 0.2252 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4317 | 6.0 | 600 | 3.2248 | 0.0124 | 0.0017 | 0.0043 | 0.001 | 0.0022 | 0.0027 | 0.0003 | 0.0021 | 0.0121 | 0.0162 | 0.0152 | 0.0265 | 0.0149 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.063 | 0.4187 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0048 | 0.2461 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5663 | 6.5 | 650 | 3.2108 | 0.0124 | 0.002 | 0.0052 | 0.001 | 0.0021 | 0.0037 | 0.0003 | 0.0027 | 0.0117 | 0.0169 | 0.0143 | 0.0278 | 0.0165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0756 | 0.4112 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0053 | 0.2826 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3771 | 7.0 | 700 | 3.1788 | 0.0124 | 0.0024 | 0.0061 | 0.0013 | 0.0025 | 0.0048 | 0.0004 | 0.0037 | 0.0117 | 0.017 | 0.0134 | 0.0282 | 0.0128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0937 | 0.4239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0051 | 0.2713 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0818 | 7.5 | 750 | 3.1503 | 0.0124 | 0.0022 | 0.0051 | 0.0016 | 0.002 | 0.005 | 0.0002 | 0.0033 | 0.013 | 0.0183 | 0.0176 | 0.0293 | 0.0153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0838 | 0.4552 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0071 | 0.2957 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.246 | 8.0 | 800 | 3.1266 | 0.0124 | 0.0027 | 0.0064 | 0.0017 | 0.0026 | 0.0055 | 0.0002 | 0.0032 | 0.0132 | 0.0188 | 0.0169 | 0.0302 | 0.0165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1021 | 0.4448 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0087 | 0.3278 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1945 | 8.5 | 850 | 3.0969 | 0.0124 | 0.0029 | 0.0067 | 0.0019 | 0.0029 | 0.0059 | 0.0002 | 0.0036 | 0.015 | 0.0203 | 0.0176 | 0.033 | 0.0182 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1093 | 0.4709 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0114 | 0.3617 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9341 | 9.0 | 900 | 3.0639 | 0.0124 | 0.003 | 0.0068 | 0.0023 | 0.0036 | 0.0056 | 0.0004 | 0.0031 | 0.0137 | 0.0191 | 0.0149 | 0.0313 | 0.0128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1145 | 0.4388 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0083 | 0.3435 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9812 | 9.5 | 950 | 3.0263 | 0.0124 | 0.0028 | 0.0066 | 0.0017 | 0.003 | 0.0057 | 0.0004 | 0.0039 | 0.014 | 0.0197 | 0.0156 | 0.0332 | 0.0165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1028 | 0.4425 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0091 | 0.3504 | 0.0 | 0.0 | 0.0049 | 0.0159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8798 | 10.0 | 1000 | 2.9767 | 0.0124 | 0.003 | 0.0068 | 0.0022 | 0.003 | 0.0066 | 0.0003 | 0.0034 | 0.0145 | 0.021 | 0.0159 | 0.0343 | 0.0186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1096 | 0.4619 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0119 | 0.3965 | 0.0 | 0.0 | 0.0003 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8805 | 10.5 | 1050 | 2.9619 | 0.0124 | 0.0032 | 0.0075 | 0.0026 | 0.0042 | 0.0065 | 0.0002 | 0.0038 | 0.0157 | 0.0227 | 0.0201 | 0.0366 | 0.0178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1099 | 0.4866 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0145 | 0.4183 | 0.0 | 0.0 | 0.0085 | 0.0254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0194 | 11.0 | 1100 | 2.9846 | 0.0124 | 0.0028 | 0.007 | 0.0022 | 0.0027 | 0.0059 | 0.0002 | 0.0042 | 0.0147 | 0.019 | 0.0131 | 0.0325 | 0.0174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0963 | 0.4299 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0137 | 0.34 | 0.0 | 0.0 | 0.0059 | 0.0095 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6797 | 11.5 | 1150 | 2.9331 | 0.0124 | 0.0034 | 0.0079 | 0.0025 | 0.0033 | 0.0082 | 0.0003 | 0.005 | 0.017 | 0.0224 | 0.0182 | 0.0385 | 0.0169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1041 | 0.4642 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0174 | 0.3957 | 0.0 | 0.0 | 0.0187 | 0.0571 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6285 | 12.0 | 1200 | 2.9318 | 0.0124 | 0.0034 | 0.008 | 0.0029 | 0.0036 | 0.0082 | 0.0002 | 0.0051 | 0.018 | 0.0233 | 0.0185 | 0.0396 | 0.0165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1084 | 0.5119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0161 | 0.3678 | 0.0 | 0.0 | 0.0165 | 0.0746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1858 | 12.5 | 1250 | 2.9115 | 0.0124 | 0.0035 | 0.0077 | 0.0028 | 0.0039 | 0.0073 | 0.0002 | 0.005 | 0.0189 | 0.024 | 0.0191 | 0.0404 | 0.0202 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1128 | 0.4896 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0193 | 0.4043 | 0.0 | 0.0 | 0.0106 | 0.0889 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5477 | 13.0 | 1300 | 2.9734 | 0.0124 | 0.0043 | 0.0093 | 0.0037 | 0.0043 | 0.0088 | 0.0002 | 0.0059 | 0.0175 | 0.0219 | 0.02 | 0.0361 | 0.0227 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1287 | 0.4813 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0297 | 0.3661 | 0.0 | 0.0 | 0.0166 | 0.0508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6609 | 13.5 | 1350 | 2.8948 | 0.0124 | 0.0038 | 0.009 | 0.0029 | 0.0039 | 0.0076 | 0.0003 | 0.0057 | 0.0191 | 0.0242 | 0.0204 | 0.0397 | 0.0236 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1142 | 0.4993 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0259 | 0.407 | 0.0 | 0.0 | 0.0141 | 0.0857 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1238 | 14.0 | 1400 | 2.8598 | 0.0124 | 0.004 | 0.0097 | 0.003 | 0.0039 | 0.0069 | 0.0009 | 0.006 | 0.0198 | 0.0245 | 0.021 | 0.0401 | 0.0224 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0153 | 0.0184 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1103 | 0.5164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0249 | 0.3843 | 0.0 | 0.0 | 0.0135 | 0.0841 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5572 | 14.5 | 1450 | 2.8251 | 0.0124 | 0.0039 | 0.0093 | 0.0026 | 0.0039 | 0.007 | 0.0008 | 0.0054 | 0.0201 | 0.025 | 0.0225 | 0.0402 | 0.0198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0119 | 0.0082 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1148 | 0.5067 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0239 | 0.3957 | 0.0 | 0.0 | 0.0099 | 0.1127 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6484 | 15.0 | 1500 | 2.8246 | 0.0124 | 0.0039 | 0.0093 | 0.003 | 0.004 | 0.0074 | 0.0005 | 0.0057 | 0.0196 | 0.0244 | 0.022 | 0.0387 | 0.0235 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0059 | 0.0082 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1223 | 0.5157 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0249 | 0.3974 | 0.0 | 0.0 | 0.0055 | 0.0778 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6171 | 15.5 | 1550 | 2.8018 | 0.0124 | 0.004 | 0.01 | 0.0032 | 0.0037 | 0.0074 | 0.0008 | 0.0066 | 0.0211 | 0.0263 | 0.0197 | 0.0443 | 0.0258 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0104 | 0.0306 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1122 | 0.5052 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0266 | 0.4165 | 0.0 | 0.0 | 0.0145 | 0.127 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7484 | 16.0 | 1600 | 2.8278 | 0.0124 | 0.004 | 0.009 | 0.0031 | 0.0037 | 0.0086 | 0.0005 | 0.0064 | 0.0212 | 0.0256 | 0.0206 | 0.0435 | 0.0224 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0048 | 0.0184 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1153 | 0.4978 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.029 | 0.3957 | 0.0 | 0.0 | 0.0134 | 0.1397 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2523 | 16.5 | 1650 | 2.7804 | 0.0124 | 0.0039 | 0.009 | 0.0031 | 0.0037 | 0.0087 | 0.0027 | 0.0063 | 0.023 | 0.0281 | 0.0201 | 0.0489 | 0.025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.005 | 0.0224 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1118 | 0.5127 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0287 | 0.4539 | 0.0 | 0.0 | 0.0159 | 0.1651 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.792 | 17.0 | 1700 | 2.7744 | 0.0124 | 0.0039 | 0.0096 | 0.0028 | 0.0037 | 0.0074 | 0.0031 | 0.0059 | 0.0223 | 0.0279 | 0.0203 | 0.0474 | 0.0291 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0112 | 0.0469 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.109 | 0.4731 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0286 | 0.4713 | 0.0 | 0.0 | 0.0129 | 0.1524 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9321 | 17.5 | 1750 | 2.7581 | 0.0124 | 0.004 | 0.0099 | 0.0031 | 0.0039 | 0.0074 | 0.0009 | 0.0065 | 0.0219 | 0.0274 | 0.0213 | 0.0451 | 0.0285 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0106 | 0.0531 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1172 | 0.4761 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0244 | 0.4496 | 0.0 | 0.0 | 0.0114 | 0.1429 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6387 | 18.0 | 1800 | 2.8029 | 0.0124 | 0.0042 | 0.0099 | 0.0027 | 0.0042 | 0.0075 | 0.001 | 0.0062 | 0.0203 | 0.0258 | 0.0206 | 0.0419 | 0.0306 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0058 | 0.0347 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1273 | 0.4858 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0265 | 0.4296 | 0.0 | 0.0 | 0.0117 | 0.1063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8495 | 18.5 | 1850 | 2.7667 | 0.0124 | 0.0042 | 0.0104 | 0.0031 | 0.0039 | 0.0081 | 0.0012 | 0.0066 | 0.0223 | 0.0277 | 0.0223 | 0.044 | 0.0311 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0111 | 0.0551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1227 | 0.5104 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0262 | 0.4513 | 0.0 | 0.0 | 0.0126 | 0.1206 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0583 | 19.0 | 1900 | 2.7503 | 0.0124 | 0.0043 | 0.0097 | 0.0035 | 0.0042 | 0.0076 | 0.0012 | 0.0076 | 0.0233 | 0.0291 | 0.0215 | 0.0467 | 0.0321 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0165 | 0.0837 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1205 | 0.5097 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0275 | 0.4513 | 0.0 | 0.0 | 0.0136 | 0.1476 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8411 | 19.5 | 1950 | 2.7440 | 0.0124 | 0.0043 | 0.0095 | 0.0036 | 0.0039 | 0.0078 | 0.0012 | 0.0079 | 0.0232 | 0.0288 | 0.0227 | 0.0454 | 0.0345 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0144 | 0.0653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1194 | 0.5209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0298 | 0.4661 | 0.0 | 0.0 | 0.011 | 0.1302 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0183 | 20.0 | 2000 | 2.7458 | 0.0124 | 0.0042 | 0.0098 | 0.0032 | 0.0035 | 0.0081 | 0.0018 | 0.0089 | 0.0241 | 0.0296 | 0.0218 | 0.046 | 0.0375 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0227 | 0.102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1074 | 0.4955 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0286 | 0.4539 | 0.0 | 0.0 | 0.0152 | 0.1619 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8988 | 20.5 | 2050 | 2.7382 | 0.0124 | 0.0048 | 0.0111 | 0.0038 | 0.0038 | 0.0085 | 0.0022 | 0.0089 | 0.0255 | 0.0308 | 0.0241 | 0.0451 | 0.0365 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0345 | 0.1245 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1163 | 0.5336 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0296 | 0.4557 | 0.0 | 0.0 | 0.0158 | 0.1476 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5286 | 21.0 | 2100 | 2.7383 | 0.0124 | 0.0048 | 0.0108 | 0.0038 | 0.0042 | 0.0081 | 0.0036 | 0.0083 | 0.0257 | 0.0309 | 0.0248 | 0.0483 | 0.0339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0298 | 0.0959 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1206 | 0.5366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0294 | 0.4461 | 0.0 | 0.0 | 0.0153 | 0.1873 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2692 | 21.5 | 2150 | 2.7388 | 0.0124 | 0.0051 | 0.0113 | 0.0041 | 0.0042 | 0.0083 | 0.0025 | 0.0103 | 0.0264 | 0.0315 | 0.0236 | 0.0454 | 0.0355 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0421 | 0.1653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1209 | 0.5172 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0335 | 0.4443 | 0.0 | 0.0 | 0.0111 | 0.1635 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7976 | 22.0 | 2200 | 2.7294 | 0.0124 | 0.0048 | 0.0111 | 0.0035 | 0.0037 | 0.0082 | 0.0046 | 0.0105 | 0.0253 | 0.0308 | 0.0237 | 0.0457 | 0.0347 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0372 | 0.1571 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1153 | 0.509 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0318 | 0.4304 | 0.0 | 0.0 | 0.0119 | 0.1667 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3505 | 22.5 | 2250 | 2.7265 | 0.0124 | 0.005 | 0.0114 | 0.0042 | 0.0044 | 0.008 | 0.0023 | 0.011 | 0.0275 | 0.0329 | 0.0236 | 0.048 | 0.0391 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0377 | 0.198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1208 | 0.5209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0331 | 0.4565 | 0.0 | 0.0 | 0.0135 | 0.173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4853 | 23.0 | 2300 | 2.7120 | 0.0124 | 0.0052 | 0.0119 | 0.0041 | 0.0042 | 0.008 | 0.0031 | 0.0109 | 0.0285 | 0.0339 | 0.0233 | 0.0476 | 0.0432 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0453 | 0.2306 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1233 | 0.5194 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0341 | 0.4704 | 0.0 | 0.0 | 0.0124 | 0.1714 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3593 | 23.5 | 2350 | 2.7258 | 0.0124 | 0.0055 | 0.0128 | 0.004 | 0.0049 | 0.0072 | 0.0034 | 0.0116 | 0.0283 | 0.0329 | 0.0225 | 0.0455 | 0.0406 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0549 | 0.2469 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1235 | 0.4985 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.033 | 0.4357 | 0.0 | 0.0 | 0.0123 | 0.1667 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4246 | 24.0 | 2400 | 2.7011 | 0.0124 | 0.0054 | 0.0125 | 0.0041 | 0.0044 | 0.0081 | 0.0028 | 0.0112 | 0.0297 | 0.0346 | 0.0245 | 0.0468 | 0.0412 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0476 | 0.2592 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1268 | 0.5269 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0328 | 0.4609 | 0.0 | 0.0 | 0.0125 | 0.1698 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 6.0358 | 24.5 | 2450 | 2.7073 | 0.0124 | 0.0053 | 0.0123 | 0.0039 | 0.0041 | 0.0079 | 0.0029 | 0.0125 | 0.0293 | 0.034 | 0.0228 | 0.0462 | 0.042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0508 | 0.2755 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1212 | 0.5037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0327 | 0.4478 | 0.0 | 0.0 | 0.0119 | 0.1667 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7728 | 25.0 | 2500 | 2.7010 | 0.0124 | 0.0053 | 0.0122 | 0.0039 | 0.0047 | 0.0071 | 0.0042 | 0.0122 | 0.0294 | 0.0339 | 0.0221 | 0.048 | 0.0418 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0506 | 0.2551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1207 | 0.5134 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.034 | 0.4565 | 0.0 | 0.0 | 0.0111 | 0.1667 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5169 | 25.5 | 2550 | 2.6968 | 0.0124 | 0.0054 | 0.012 | 0.0045 | 0.004 | 0.0081 | 0.0032 | 0.0116 | 0.0289 | 0.0339 | 0.0235 | 0.0473 | 0.0402 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.058 | 0.2469 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1204 | 0.5328 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0333 | 0.4478 | 0.0 | 0.0 | 0.0112 | 0.1603 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9971 | 26.0 | 2600 | 2.6801 | 0.0124 | 0.0057 | 0.0124 | 0.0046 | 0.004 | 0.0084 | 0.0045 | 0.0123 | 0.0303 | 0.0354 | 0.0234 | 0.0486 | 0.0439 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0622 | 0.2816 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1221 | 0.5381 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0369 | 0.4548 | 0.0 | 0.0 | 0.0115 | 0.1762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3748 | 26.5 | 2650 | 2.6809 | 0.0124 | 0.0056 | 0.0123 | 0.0046 | 0.004 | 0.0083 | 0.0038 | 0.0121 | 0.0305 | 0.0354 | 0.0233 | 0.0487 | 0.0427 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0597 | 0.2816 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1236 | 0.5381 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0362 | 0.4539 | 0.0 | 0.0 | 0.0106 | 0.1762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8506 | 27.0 | 2700 | 2.6779 | 0.0124 | 0.0057 | 0.012 | 0.0047 | 0.0042 | 0.0084 | 0.0038 | 0.0119 | 0.0302 | 0.0354 | 0.0235 | 0.0487 | 0.0419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0609 | 0.2653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.5463 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.035 | 0.4635 | 0.0 | 0.0 | 0.0113 | 0.1746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.53 | 27.5 | 2750 | 2.6831 | 0.0124 | 0.0057 | 0.0128 | 0.0047 | 0.0043 | 0.0085 | 0.0036 | 0.012 | 0.0304 | 0.0356 | 0.0234 | 0.0492 | 0.0433 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0606 | 0.2776 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1262 | 0.5455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0359 | 0.4609 | 0.0 | 0.0 | 0.0108 | 0.1762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1641 | 28.0 | 2800 | 2.6716 | 0.0124 | 0.006 | 0.013 | 0.0048 | 0.0042 | 0.0085 | 0.0052 | 0.0127 | 0.0309 | 0.0362 | 0.0243 | 0.0485 | 0.0445 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0718 | 0.2939 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1261 | 0.5463 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.036 | 0.4704 | 0.0 | 0.0 | 0.0111 | 0.1746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.9674 | 28.5 | 2850 | 2.6713 | 0.0124 | 0.0058 | 0.0127 | 0.0045 | 0.0044 | 0.0079 | 0.0045 | 0.0125 | 0.0305 | 0.036 | 0.0251 | 0.0477 | 0.0445 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0646 | 0.2857 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1274 | 0.5507 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0346 | 0.4643 | 0.0 | 0.0 | 0.0106 | 0.1762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6268 | 29.0 | 2900 | 2.6704 | 0.0124 | 0.0056 | 0.0124 | 0.0045 | 0.0041 | 0.0079 | 0.0043 | 0.0128 | 0.0302 | 0.0356 | 0.0238 | 0.0474 | 0.045 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0621 | 0.2878 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1227 | 0.5366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.035 | 0.4609 | 0.0 | 0.0 | 0.0104 | 0.1746 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5974 | 29.5 | 2950 | 2.6705 | 0.0124 | 0.0057 | 0.0124 | 0.0046 | 0.0041 | 0.0081 | 0.0054 | 0.0129 | 0.0309 | 0.0362 | 0.0238 | 0.0494 | 0.0449 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0651 | 0.2939 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1224 | 0.5366 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0362 | 0.4661 | 0.0 | 0.0 | 0.0106 | 0.1889 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4689 | 30.0 | 3000 | 2.6712 | 0.0124 | 0.0058 | 0.0128 | 0.0046 | 0.0042 | 0.0081 | 0.0055 | 0.0128 | 0.0308 | 0.0362 | 0.0239 | 0.0494 | 0.0448 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0681 | 0.2918 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1226 | 0.5358 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0368 | 0.4661 | 0.0 | 0.0 | 0.0106 | 0.1905 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.46.2
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
DatSplit/detr-resnet-50-fashionpedia |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
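No official snippet is provided yet; below is a minimal inference sketch. It assumes the checkpoint is published under `DatSplit/detr-resnet-50-fashionpedia` (the repository this card belongs to) and that `transformers`, `torch`, and `Pillow` are installed; the image path is a placeholder.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "DatSplit/detr-resnet-50-fashionpedia"  # this repository

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")  # placeholder test image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into thresholded detections in (x_min, y_min, x_max, y_max) pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), [round(v, 1) for v in box.tolist()])
```

Adjust `threshold` to trade precision against recall; the printed label strings come from the checkpoint's `id2label` mapping.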
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19",
"label_20",
"label_21",
"label_22",
"label_23",
"label_24",
"label_25",
"label_26",
"label_27",
"label_28",
"label_29",
"label_30",
"label_31",
"label_32",
"label_33",
"label_34",
"label_35",
"label_36",
"label_37",
"label_38",
"label_39",
"label_40",
"label_41",
"label_42",
"label_43",
"label_44",
"label_45"
] |
clee9/detr_finetuned_30 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_30
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1249
- Map: 0.2684
- Map 50: 0.5212
- Map 75: 0.2578
- Map Small: 0.22
- Map Medium: 0.377
- Map Large: 0.395
- Mar 1: 0.1497
- Mar 10: 0.3903
- Mar 100: 0.4383
- Mar Small: 0.398
- Mar Medium: 0.5486
- Mar Large: 0.6314
- Map Basketball: 0.0431
- Mar 100 Basketball: 0.147
- Map Player: 0.3203
- Mar 100 Player: 0.5743
- Map Referee: 0.4419
- Mar 100 Referee: 0.5936
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
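As a rough illustration rather than the authors' actual training script, these settings map onto `transformers.TrainingArguments` roughly as follows; the output directory is a placeholder, and data loading/collation is omitted.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="detr_finetuned_30",   # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```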
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Basketball | Mar 100 Basketball | Map Player | Mar 100 Player | Map Referee | Mar 100 Referee |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------:|:------------------:|:----------:|:--------------:|:-----------:|:---------------:|
| No log | 1.0 | 461 | 1.7649 | 0.0495 | 0.1443 | 0.0205 | 0.0386 | 0.0955 | 0.1299 | 0.0262 | 0.1028 | 0.1539 | 0.1341 | 0.2707 | 0.5936 | 0.0048 | 0.0142 | 0.1345 | 0.3828 | 0.0093 | 0.0646 |
| 1.9775 | 2.0 | 922 | 1.5434 | 0.1063 | 0.2515 | 0.0673 | 0.0772 | 0.1801 | 0.4147 | 0.0497 | 0.2231 | 0.2853 | 0.2337 | 0.3809 | 0.6431 | 0.0004 | 0.0211 | 0.2031 | 0.4458 | 0.1154 | 0.389 |
| 1.7156 | 3.0 | 1383 | 1.4497 | 0.139 | 0.3385 | 0.0796 | 0.1042 | 0.3513 | 0.4331 | 0.0722 | 0.2634 | 0.3277 | 0.2719 | 0.519 | 0.6221 | 0.0019 | 0.042 | 0.21 | 0.4515 | 0.2049 | 0.4895 |
| 1.5817 | 4.0 | 1844 | 1.3623 | 0.191 | 0.4015 | 0.1554 | 0.1364 | 0.2989 | 0.4691 | 0.1062 | 0.2979 | 0.3572 | 0.3144 | 0.4426 | 0.6789 | 0.002 | 0.0515 | 0.2474 | 0.4896 | 0.3236 | 0.5305 |
| 1.4789 | 5.0 | 2305 | 1.3544 | 0.1874 | 0.4189 | 0.1338 | 0.1415 | 0.357 | 0.4253 | 0.1031 | 0.2946 | 0.36 | 0.3174 | 0.5002 | 0.6838 | 0.0032 | 0.0591 | 0.2486 | 0.4917 | 0.3104 | 0.5292 |
| 1.4235 | 6.0 | 2766 | 1.2867 | 0.211 | 0.4444 | 0.1715 | 0.154 | 0.3671 | 0.298 | 0.118 | 0.3084 | 0.3629 | 0.3113 | 0.5314 | 0.6814 | 0.0032 | 0.0494 | 0.2607 | 0.4919 | 0.3692 | 0.5474 |
| 1.3454 | 7.0 | 3227 | 1.3163 | 0.2024 | 0.4324 | 0.1654 | 0.1381 | 0.362 | 0.4507 | 0.114 | 0.3076 | 0.3557 | 0.3114 | 0.4828 | 0.651 | 0.0052 | 0.0581 | 0.2397 | 0.4762 | 0.3623 | 0.5329 |
| 1.3253 | 8.0 | 3688 | 1.2467 | 0.2169 | 0.4386 | 0.1888 | 0.1576 | 0.3525 | 0.4086 | 0.1128 | 0.327 | 0.3884 | 0.344 | 0.5275 | 0.6926 | 0.0105 | 0.0693 | 0.2758 | 0.525 | 0.3643 | 0.5708 |
| 1.276 | 9.0 | 4149 | 1.2343 | 0.2242 | 0.4614 | 0.1877 | 0.1659 | 0.3289 | 0.3643 | 0.1273 | 0.3341 | 0.3883 | 0.3438 | 0.4961 | 0.6814 | 0.0094 | 0.0896 | 0.2701 | 0.5176 | 0.3932 | 0.5579 |
| 1.2626 | 10.0 | 4610 | 1.2377 | 0.2324 | 0.4659 | 0.2072 | 0.1803 | 0.3146 | 0.3717 | 0.1261 | 0.3469 | 0.3967 | 0.3548 | 0.4797 | 0.624 | 0.0101 | 0.0849 | 0.2777 | 0.5198 | 0.4094 | 0.5855 |
| 1.2274 | 11.0 | 5071 | 1.2398 | 0.2358 | 0.4759 | 0.2075 | 0.1795 | 0.3235 | 0.4632 | 0.1338 | 0.3479 | 0.3952 | 0.3457 | 0.4835 | 0.6647 | 0.0128 | 0.0864 | 0.2831 | 0.5323 | 0.4114 | 0.5669 |
| 1.2026 | 12.0 | 5532 | 1.1964 | 0.2407 | 0.4828 | 0.2133 | 0.1808 | 0.3974 | 0.4632 | 0.1329 | 0.3571 | 0.4064 | 0.3481 | 0.5671 | 0.6966 | 0.0151 | 0.0897 | 0.2866 | 0.5305 | 0.4203 | 0.5991 |
| 1.2026 | 13.0 | 5993 | 1.2058 | 0.2367 | 0.4879 | 0.201 | 0.1919 | 0.3161 | 0.4254 | 0.1314 | 0.3515 | 0.398 | 0.3516 | 0.4787 | 0.6789 | 0.019 | 0.0999 | 0.287 | 0.531 | 0.404 | 0.5632 |
| 1.1779 | 14.0 | 6454 | 1.1949 | 0.2365 | 0.4716 | 0.2179 | 0.1759 | 0.3254 | 0.4736 | 0.1268 | 0.359 | 0.4064 | 0.3592 | 0.5701 | 0.6495 | 0.0203 | 0.1028 | 0.2945 | 0.542 | 0.3946 | 0.5743 |
| 1.1578 | 15.0 | 6915 | 1.2130 | 0.2291 | 0.468 | 0.2024 | 0.1673 | 0.3503 | 0.4148 | 0.1238 | 0.3551 | 0.4049 | 0.3668 | 0.485 | 0.6059 | 0.02 | 0.1102 | 0.2821 | 0.5349 | 0.3851 | 0.5697 |
| 1.131 | 16.0 | 7376 | 1.2012 | 0.2384 | 0.478 | 0.2139 | 0.1848 | 0.2908 | 0.4595 | 0.1314 | 0.3597 | 0.4048 | 0.3616 | 0.5554 | 0.6382 | 0.0174 | 0.1082 | 0.291 | 0.5388 | 0.4068 | 0.5672 |
| 1.1208 | 17.0 | 7837 | 1.1768 | 0.2517 | 0.4928 | 0.2365 | 0.2057 | 0.3507 | 0.4399 | 0.1352 | 0.3697 | 0.4208 | 0.38 | 0.5615 | 0.6201 | 0.0222 | 0.1082 | 0.3005 | 0.5574 | 0.4324 | 0.5969 |
| 1.0992 | 18.0 | 8298 | 1.1605 | 0.2403 | 0.4776 | 0.2194 | 0.1967 | 0.2987 | 0.4002 | 0.1315 | 0.3655 | 0.4148 | 0.3761 | 0.5148 | 0.5838 | 0.0216 | 0.1004 | 0.2958 | 0.5538 | 0.4034 | 0.5901 |
| 1.0793 | 19.0 | 8759 | 1.1529 | 0.2496 | 0.4954 | 0.2307 | 0.2071 | 0.3352 | 0.4166 | 0.133 | 0.3752 | 0.4255 | 0.3868 | 0.5096 | 0.6431 | 0.0278 | 0.1219 | 0.3072 | 0.567 | 0.4139 | 0.5877 |
| 1.0648 | 20.0 | 9220 | 1.1573 | 0.2539 | 0.505 | 0.2326 | 0.2033 | 0.3472 | 0.4315 | 0.1389 | 0.3818 | 0.4308 | 0.3896 | 0.5701 | 0.6275 | 0.0327 | 0.1387 | 0.3021 | 0.5606 | 0.427 | 0.5931 |
| 1.05 | 21.0 | 9681 | 1.1417 | 0.257 | 0.505 | 0.2463 | 0.2135 | 0.3753 | 0.4454 | 0.1392 | 0.3862 | 0.4331 | 0.3949 | 0.5359 | 0.6696 | 0.0339 | 0.1388 | 0.3103 | 0.5656 | 0.4267 | 0.5948 |
| 1.0362 | 22.0 | 10142 | 1.1439 | 0.259 | 0.5124 | 0.2466 | 0.2112 | 0.3458 | 0.3431 | 0.1406 | 0.3832 | 0.4307 | 0.3895 | 0.4829 | 0.6402 | 0.0326 | 0.1252 | 0.312 | 0.5706 | 0.4324 | 0.5963 |
| 1.0248 | 23.0 | 10603 | 1.1317 | 0.2641 | 0.5182 | 0.2514 | 0.215 | 0.3594 | 0.3094 | 0.1445 | 0.3838 | 0.4319 | 0.3942 | 0.5376 | 0.6137 | 0.0334 | 0.1296 | 0.3123 | 0.5687 | 0.4467 | 0.5972 |
| 1.0173 | 24.0 | 11064 | 1.1485 | 0.2581 | 0.5057 | 0.247 | 0.2102 | 0.3723 | 0.4356 | 0.1414 | 0.3819 | 0.4295 | 0.3906 | 0.5416 | 0.6681 | 0.0334 | 0.1372 | 0.3158 | 0.5696 | 0.4251 | 0.5817 |
| 1.0082 | 25.0 | 11525 | 1.1344 | 0.2642 | 0.5158 | 0.2495 | 0.2176 | 0.3517 | 0.4473 | 0.1467 | 0.3843 | 0.4322 | 0.3915 | 0.554 | 0.6377 | 0.0354 | 0.1386 | 0.3158 | 0.5685 | 0.4414 | 0.5894 |
| 1.0082 | 26.0 | 11986 | 1.1267 | 0.2648 | 0.5175 | 0.2514 | 0.2147 | 0.3598 | 0.4399 | 0.1489 | 0.3868 | 0.4341 | 0.3942 | 0.5624 | 0.6294 | 0.0381 | 0.1422 | 0.3181 | 0.5706 | 0.4381 | 0.5894 |
| 1.0006 | 27.0 | 12447 | 1.1296 | 0.2687 | 0.5208 | 0.2581 | 0.2198 | 0.3694 | 0.4415 | 0.1506 | 0.3887 | 0.4359 | 0.3967 | 0.5455 | 0.6333 | 0.0439 | 0.1464 | 0.3188 | 0.5727 | 0.4434 | 0.5885 |
| 0.9989 | 28.0 | 12908 | 1.1237 | 0.2675 | 0.5191 | 0.2562 | 0.2202 | 0.3769 | 0.3991 | 0.1484 | 0.3897 | 0.4374 | 0.397 | 0.5484 | 0.6333 | 0.0411 | 0.1454 | 0.3204 | 0.5735 | 0.441 | 0.5932 |
| 0.9952 | 29.0 | 13369 | 1.1251 | 0.2687 | 0.5204 | 0.2582 | 0.221 | 0.3689 | 0.3963 | 0.15 | 0.3904 | 0.4385 | 0.3984 | 0.5485 | 0.6314 | 0.0426 | 0.1474 | 0.3207 | 0.5744 | 0.4427 | 0.5936 |
| 0.9909 | 30.0 | 13830 | 1.1249 | 0.2684 | 0.5212 | 0.2578 | 0.22 | 0.377 | 0.395 | 0.1497 | 0.3903 | 0.4383 | 0.398 | 0.5486 | 0.6314 | 0.0431 | 0.147 | 0.3203 | 0.5743 | 0.4419 | 0.5936 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1
- Datasets 3.1.0
- Tokenizers 0.20.3
| [
"basketball",
"player",
"referee"
] |
DatSplit/detr-resnet-101-fashionpedia |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
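In the absence of an official snippet, here is a minimal sketch using the high-level `pipeline` API; it assumes this checkpoint is published under `DatSplit/detr-resnet-101-fashionpedia` and that `transformers` and `torch` are installed.

```python
from transformers import pipeline

# Hypothetical repository id for this card; replace if the checkpoint lives elsewhere.
detector = pipeline("object-detection", model="DatSplit/detr-resnet-101-fashionpedia")

# Any local or remote test image path/URL works here.
for prediction in detector("example.jpg", threshold=0.5):
    print(prediction["label"], round(prediction["score"], 3), prediction["box"])
```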
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19",
"label_20",
"label_21",
"label_22",
"label_23",
"label_24",
"label_25",
"label_26",
"label_27",
"label_28",
"label_29",
"label_30",
"label_31",
"label_32",
"label_33",
"label_34",
"label_35",
"label_36",
"label_37",
"label_38",
"label_39",
"label_40",
"label_41",
"label_42",
"label_43",
"label_44",
"label_45"
] |
DatSplit/yolos-small-fashionpedia |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
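No usage code is given yet; the following is a minimal inference sketch, assuming the checkpoint is published under `DatSplit/yolos-small-fashionpedia` and that `transformers`, `torch`, and `Pillow` are installed.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "DatSplit/yolos-small-fashionpedia"  # this repository

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")  # placeholder test image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Keep detections above the score threshold, rescaled to the original image size.
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=torch.tensor([image.size[::-1]])
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```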
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19",
"label_20",
"label_21",
"label_22",
"label_23",
"label_24",
"label_25",
"label_26",
"label_27",
"label_28",
"label_29",
"label_30",
"label_31",
"label_32",
"label_33",
"label_34",
"label_35",
"label_36",
"label_37",
"label_38",
"label_39",
"label_40",
"label_41",
"label_42",
"label_43",
"label_44",
"label_45"
] |
DatSplit/yolos-base-fashionpedia |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
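As a stopgap until official code is added, a minimal sketch with the `pipeline` API, assuming this checkpoint is published under `DatSplit/yolos-base-fashionpedia`:

```python
from transformers import pipeline

# Hypothetical repository id for this card; adjust if the checkpoint lives elsewhere.
detector = pipeline("object-detection", model="DatSplit/yolos-base-fashionpedia")

for prediction in detector("example.jpg", threshold=0.5):  # placeholder image path
    print(prediction["label"], round(prediction["score"], 3), prediction["box"])
```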
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19",
"label_20",
"label_21",
"label_22",
"label_23",
"label_24",
"label_25",
"label_26",
"label_27",
"label_28",
"label_29",
"label_30",
"label_31",
"label_32",
"label_33",
"label_34",
"label_35",
"label_36",
"label_37",
"label_38",
"label_39",
"label_40",
"label_41",
"label_42",
"label_43",
"label_44",
"label_45"
] |
emvisee/suas-2025-rtdetr-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# suas-2025-rtdetr-finetuned
This model is a fine-tuned version of [PekingU/rtdetr_r50vd](https://huggingface.co/PekingU/rtdetr_r50vd) on the mfly-auton/suas-2025-synthetic-data dataset.
It achieves the following results on the evaluation set:
- Loss: 3.8868
- Map: 0.8924
- Map 50: 0.9568
- Map 75: 0.9554
- Map Small: 0.8115
- Map Medium: 0.9048
- Map Large: 0.9511
- Mar 1: 0.8296
- Mar 10: 0.938
- Mar 100: 0.9407
- Mar Small: 0.8752
- Mar Medium: 0.9535
- Mar Large: 0.9911
- Map Baseball-bat: 0.8962
- Mar 100 Baseball-bat: 0.9382
- Map Basketball: 0.8344
- Mar 100 Basketball: 0.9077
- Map Car: -1.0
- Mar 100 Car: -1.0
- Map Football: 0.814
- Mar 100 Football: 0.8709
- Map Human: 0.9172
- Mar 100 Human: 0.9686
- Map Luggage: 0.8623
- Mar 100 Luggage: 0.9241
- Map Mattress: 0.9809
- Mar 100 Mattress: 0.9971
- Map Motorcycle: 0.9393
- Mar 100 Motorcycle: 0.9751
- Map Skis: 0.8496
- Mar 100 Skis: 0.9692
- Map Snowboard: 0.9857
- Mar 100 Snowboard: 0.9947
- Map Soccer-ball: 0.8382
- Mar 100 Soccer-ball: 0.8755
- Map Stop-sign: 0.9735
- Mar 100 Stop-sign: 0.9957
- Map Tennis-racket: 0.9007
- Mar 100 Tennis-racket: 0.9245
- Map Umbrella: 0.8994
- Mar 100 Umbrella: 0.9775
- Map Volleyball: 0.8024
- Mar 100 Volleyball: 0.8507
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5.0
- mixed_precision_training: Native AMP
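As a rough illustration rather than the actual training script, the listed settings correspond approximately to the following `transformers.TrainingArguments`; the output directory is a placeholder, and "Native AMP" mixed precision is expressed via `fp16=True`.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="suas-2025-rtdetr-finetuned",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    num_train_epochs=5.0,
    fp16=True,  # "Native AMP" mixed precision
)
```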
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Baseball-bat | Mar 100 Baseball-bat | Map Basketball | Mar 100 Basketball | Map Car | Mar 100 Car | Map Football | Mar 100 Football | Map Human | Mar 100 Human | Map Luggage | Mar 100 Luggage | Map Mattress | Mar 100 Mattress | Map Motorcycle | Mar 100 Motorcycle | Map Skis | Mar 100 Skis | Map Snowboard | Mar 100 Snowboard | Map Soccer-ball | Mar 100 Soccer-ball | Map Stop-sign | Mar 100 Stop-sign | Map Tennis-racket | Mar 100 Tennis-racket | Map Umbrella | Mar 100 Umbrella | Map Volleyball | Mar 100 Volleyball |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------:|:--------------------:|:--------------:|:------------------:|:-------:|:-----------:|:------------:|:----------------:|:---------:|:-------------:|:-----------:|:---------------:|:------------:|:----------------:|:--------------:|:------------------:|:--------:|:------------:|:-------------:|:-----------------:|:---------------:|:-------------------:|:-------------:|:-----------------:|:-----------------:|:---------------------:|:------------:|:----------------:|:--------------:|:------------------:|
| 12.0474 | 1.0 | 438 | 5.5119 | 0.8067 | 0.8868 | 0.8817 | 0.7597 | 0.7886 | 0.8834 | 0.7817 | 0.9168 | 0.924 | 0.8348 | 0.9511 | 0.9848 | 0.7992 | 0.9114 | 0.819 | 0.901 | -1.0 | -1.0 | 0.7627 | 0.8381 | 0.6589 | 0.9657 | 0.8118 | 0.9173 | 0.9643 | 0.9961 | 0.8513 | 0.9658 | 0.5388 | 0.9767 | 0.9801 | 0.9925 | 0.7934 | 0.85 | 0.9652 | 0.988 | 0.8307 | 0.9057 | 0.8198 | 0.9603 | 0.6983 | 0.7671 |
| 6.1205 | 2.0 | 876 | 5.0519 | 0.8186 | 0.8885 | 0.8836 | 0.746 | 0.7754 | 0.902 | 0.7791 | 0.9033 | 0.9094 | 0.8072 | 0.9373 | 0.9872 | 0.8735 | 0.9189 | 0.8175 | 0.9172 | -1.0 | -1.0 | 0.7607 | 0.8174 | 0.798 | 0.9653 | 0.8087 | 0.8941 | 0.9531 | 0.9942 | 0.7115 | 0.9665 | 0.7689 | 0.9508 | 0.9767 | 0.9957 | 0.7882 | 0.846 | 0.92 | 0.9799 | 0.8979 | 0.9208 | 0.7901 | 0.9309 | 0.5955 | 0.6338 |
| 5.3217 | 3.0 | 1314 | 4.0739 | 0.8811 | 0.9513 | 0.9499 | 0.7797 | 0.8841 | 0.9549 | 0.8259 | 0.9337 | 0.9374 | 0.8626 | 0.9518 | 0.9937 | 0.8664 | 0.9269 | 0.8426 | 0.9086 | -1.0 | -1.0 | 0.7995 | 0.8595 | 0.891 | 0.9688 | 0.8574 | 0.9292 | 0.9881 | 0.9998 | 0.9232 | 0.9719 | 0.8199 | 0.9958 | 0.9816 | 0.9932 | 0.82 | 0.862 | 0.979 | 0.994 | 0.8971 | 0.9198 | 0.8925 | 0.9689 | 0.777 | 0.8258 |
| 4.7931 | 4.0 | 1752 | 3.8984 | 0.8943 | 0.9605 | 0.9589 | 0.8046 | 0.9139 | 0.9535 | 0.8289 | 0.9363 | 0.9404 | 0.8726 | 0.9571 | 0.9849 | 0.8889 | 0.9341 | 0.8321 | 0.9053 | -1.0 | -1.0 | 0.809 | 0.8622 | 0.9072 | 0.9685 | 0.8876 | 0.9389 | 0.9796 | 0.9993 | 0.9379 | 0.9757 | 0.8801 | 0.96 | 0.9855 | 0.9972 | 0.8379 | 0.8745 | 0.9811 | 0.994 | 0.9046 | 0.9302 | 0.8907 | 0.9722 | 0.7979 | 0.8531 |
| 4.634 | 5.0 | 2190 | 3.8868 | 0.8924 | 0.9568 | 0.9554 | 0.8115 | 0.9048 | 0.9511 | 0.8296 | 0.938 | 0.9407 | 0.8752 | 0.9535 | 0.9911 | 0.8962 | 0.9382 | 0.8344 | 0.9077 | -1.0 | -1.0 | 0.814 | 0.8709 | 0.9172 | 0.9686 | 0.8623 | 0.9241 | 0.9809 | 0.9971 | 0.9393 | 0.9751 | 0.8496 | 0.9692 | 0.9857 | 0.9947 | 0.8382 | 0.8755 | 0.9735 | 0.9957 | 0.9007 | 0.9245 | 0.8994 | 0.9775 | 0.8024 | 0.8507 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
| [
"baseball-bat",
"basketball",
"car",
"football",
"human",
"luggage",
"mattress",
"motorcycle",
"skis",
"snowboard",
"soccer-ball",
"stop-sign",
"tennis-racket",
"umbrella",
"volleyball"
] |
mfly-auton/suas-2025-rtdetr-finetuned-b16-lr1e-4 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# suas-2025-rtdetr-finetuned-b16-lr1e-4
This model is a fine-tuned version of [PekingU/rtdetr_r50vd](https://huggingface.co/PekingU/rtdetr_r50vd) on the mfly-auton/suas-2025-synthetic-data dataset.
It achieves the following results on the evaluation set:
- Loss: 7.4789
- Map: 0.636
- Map 50: 0.7833
- Map 75: 0.6825
- Map Small: 0.5292
- Map Medium: 0.6761
- Map Large: 0.6243
- Mar 1: 0.6914
- Mar 10: 0.8319
- Mar 100: 0.8459
- Mar Small: 0.6526
- Mar Medium: 0.861
- Mar Large: 0.9511
- Map Baseball-bat: 0.7327
- Mar 100 Baseball-bat: 0.8479
- Map Basketball: 0.7378
- Mar 100 Basketball: 0.8453
- Map Car: -1.0
- Mar 100 Car: -1.0
- Map Football: 0.515
- Mar 100 Football: 0.5874
- Map Human: 0.2845
- Mar 100 Human: 0.6493
- Map Luggage: 0.3927
- Mar 100 Luggage: 0.9199
- Map Mattress: 0.4605
- Mar 100 Mattress: 0.9731
- Map Motorcycle: 0.8921
- Mar 100 Motorcycle: 0.9379
- Map Skis: 0.9356
- Mar 100 Skis: 0.9772
- Map Snowboard: 0.4858
- Mar 100 Snowboard: 0.9648
- Map Soccer-ball: 0.291
- Mar 100 Soccer-ball: 0.5365
- Map Stop-sign: 0.8721
- Mar 100 Stop-sign: 0.9777
- Map Tennis-racket: 0.8872
- Mar 100 Tennis-racket: 0.9275
- Map Umbrella: 0.7926
- Mar 100 Umbrella: 0.945
- Map Volleyball: 0.6243
- Mar 100 Volleyball: 0.7525
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 1337
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Baseball-bat | Mar 100 Baseball-bat | Map Basketball | Mar 100 Basketball | Map Car | Mar 100 Car | Map Football | Mar 100 Football | Map Human | Mar 100 Human | Map Luggage | Mar 100 Luggage | Map Mattress | Mar 100 Mattress | Map Motorcycle | Mar 100 Motorcycle | Map Skis | Mar 100 Skis | Map Snowboard | Mar 100 Snowboard | Map Soccer-ball | Mar 100 Soccer-ball | Map Stop-sign | Mar 100 Stop-sign | Map Tennis-racket | Mar 100 Tennis-racket | Map Umbrella | Mar 100 Umbrella | Map Volleyball | Mar 100 Volleyball |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------:|:--------------------:|:--------------:|:------------------:|:-------:|:-----------:|:------------:|:----------------:|:---------:|:-------------:|:-----------:|:---------------:|:------------:|:----------------:|:--------------:|:------------------:|:--------:|:------------:|:-------------:|:-----------------:|:---------------:|:-------------------:|:-------------:|:-----------------:|:-----------------:|:---------------------:|:------------:|:----------------:|:--------------:|:------------------:|
| 14.9473 | 1.0 | 438 | 6.0393 | 0.6929 | 0.8006 | 0.7826 | 0.6924 | 0.6693 | 0.7428 | 0.7205 | 0.8507 | 0.8584 | 0.7888 | 0.8574 | 0.8979 | 0.7927 | 0.8492 | 0.8387 | 0.9263 | -1.0 | -1.0 | 0.5419 | 0.8203 | 0.2032 | 0.5136 | 0.4715 | 0.931 | 0.6218 | 0.9887 | 0.866 | 0.9199 | 0.8182 | 0.8782 | 0.5796 | 0.9515 | 0.6169 | 0.6753 | 0.9283 | 0.9732 | 0.814 | 0.8668 | 0.8116 | 0.8802 | 0.7961 | 0.8431 |
| 6.8045 | 2.0 | 876 | 5.2885 | 0.7468 | 0.8307 | 0.8234 | 0.7611 | 0.7249 | 0.6791 | 0.7565 | 0.8986 | 0.9052 | 0.8475 | 0.877 | 0.9424 | 0.8403 | 0.8959 | 0.8533 | 0.9358 | -1.0 | -1.0 | 0.7497 | 0.8638 | 0.1319 | 0.532 | 0.4374 | 0.9456 | 0.5892 | 0.9958 | 0.9306 | 0.9599 | 0.9652 | 0.9802 | 0.6246 | 0.995 | 0.7541 | 0.8175 | 0.9788 | 0.9958 | 0.8972 | 0.9316 | 0.9091 | 0.9701 | 0.7936 | 0.8532 |
| 6.122 | 3.0 | 1314 | 6.0592 | 0.7128 | 0.8135 | 0.7987 | 0.6996 | 0.7187 | 0.665 | 0.739 | 0.8748 | 0.8891 | 0.8104 | 0.8731 | 0.9347 | 0.8141 | 0.8864 | 0.8214 | 0.9197 | -1.0 | -1.0 | 0.7119 | 0.8346 | 0.1712 | 0.5716 | 0.3367 | 0.9129 | 0.4596 | 0.9784 | 0.9166 | 0.9472 | 0.9445 | 0.9718 | 0.6822 | 0.9922 | 0.6055 | 0.7355 | 0.9341 | 0.9659 | 0.8819 | 0.9207 | 0.9083 | 0.9698 | 0.7906 | 0.8409 |
| 5.7542 | 4.0 | 1752 | 6.7229 | 0.7089 | 0.8239 | 0.7984 | 0.6721 | 0.7153 | 0.6452 | 0.752 | 0.8732 | 0.8817 | 0.7591 | 0.8769 | 0.9587 | 0.7857 | 0.8574 | 0.835 | 0.9052 | -1.0 | -1.0 | 0.6519 | 0.7566 | 0.3228 | 0.6242 | 0.5068 | 0.9147 | 0.4086 | 0.9842 | 0.9254 | 0.9588 | 0.9363 | 0.9639 | 0.6089 | 0.9904 | 0.6176 | 0.7185 | 0.8906 | 0.9765 | 0.8719 | 0.9197 | 0.8104 | 0.9656 | 0.7532 | 0.8083 |
| 5.4197 | 5.0 | 2190 | 6.9705 | 0.6705 | 0.7896 | 0.7393 | 0.6309 | 0.6775 | 0.6347 | 0.7021 | 0.8427 | 0.8567 | 0.7156 | 0.8715 | 0.9375 | 0.8044 | 0.874 | 0.8447 | 0.8896 | -1.0 | -1.0 | 0.6092 | 0.6991 | 0.1786 | 0.5444 | 0.3152 | 0.8589 | 0.3916 | 0.946 | 0.943 | 0.9708 | 0.957 | 0.9782 | 0.521 | 0.9759 | 0.4336 | 0.5923 | 0.8949 | 0.9757 | 0.8747 | 0.9171 | 0.8842 | 0.9704 | 0.7354 | 0.8017 |
| 5.1598 | 6.0 | 2628 | 6.7201 | 0.6509 | 0.7806 | 0.7067 | 0.5621 | 0.6931 | 0.6332 | 0.6927 | 0.826 | 0.839 | 0.6904 | 0.8633 | 0.8993 | 0.7365 | 0.8412 | 0.8438 | 0.9021 | -1.0 | -1.0 | 0.5378 | 0.639 | 0.1533 | 0.4137 | 0.412 | 0.91 | 0.3546 | 0.9008 | 0.9302 | 0.967 | 0.9724 | 0.9881 | 0.5606 | 0.9739 | 0.3666 | 0.5918 | 0.8779 | 0.9642 | 0.843 | 0.8969 | 0.8855 | 0.9633 | 0.6379 | 0.7946 |
| 5.1626 | 7.0 | 3066 | 7.2974 | 0.644 | 0.7767 | 0.6936 | 0.5262 | 0.6787 | 0.6513 | 0.7043 | 0.8445 | 0.8587 | 0.6746 | 0.8942 | 0.9601 | 0.7639 | 0.8663 | 0.7068 | 0.8149 | -1.0 | -1.0 | 0.5452 | 0.6602 | 0.3086 | 0.6825 | 0.41 | 0.9299 | 0.4634 | 0.9747 | 0.9327 | 0.9679 | 0.9505 | 0.9797 | 0.4214 | 0.9677 | 0.2936 | 0.5483 | 0.852 | 0.9818 | 0.8823 | 0.9181 | 0.8924 | 0.9678 | 0.5933 | 0.7615 |
| 5.15 | 8.0 | 3504 | 7.2201 | 0.6572 | 0.7908 | 0.7113 | 0.5559 | 0.6962 | 0.6276 | 0.7048 | 0.8396 | 0.8534 | 0.6795 | 0.8702 | 0.9327 | 0.7574 | 0.8526 | 0.7803 | 0.8718 | -1.0 | -1.0 | 0.5741 | 0.6547 | 0.2742 | 0.5961 | 0.418 | 0.9233 | 0.4893 | 0.9697 | 0.91 | 0.9479 | 0.9345 | 0.9807 | 0.5642 | 0.9692 | 0.3294 | 0.5602 | 0.8803 | 0.9821 | 0.8794 | 0.9264 | 0.7936 | 0.9476 | 0.616 | 0.7659 |
| 4.9521 | 9.0 | 3942 | 7.5060 | 0.6374 | 0.7847 | 0.6838 | 0.5132 | 0.6921 | 0.6331 | 0.6868 | 0.8276 | 0.8419 | 0.6487 | 0.8619 | 0.9341 | 0.7388 | 0.8431 | 0.715 | 0.8273 | -1.0 | -1.0 | 0.5194 | 0.592 | 0.2577 | 0.624 | 0.3945 | 0.9234 | 0.4599 | 0.966 | 0.9031 | 0.942 | 0.9235 | 0.9752 | 0.5532 | 0.9694 | 0.2822 | 0.5368 | 0.9131 | 0.9855 | 0.8769 | 0.9244 | 0.8091 | 0.9438 | 0.5766 | 0.7336 |
| 4.8252 | 10.0 | 4380 | 7.4789 | 0.636 | 0.7833 | 0.6825 | 0.5292 | 0.6761 | 0.6243 | 0.6914 | 0.8319 | 0.8459 | 0.6526 | 0.861 | 0.9511 | 0.7327 | 0.8479 | 0.7378 | 0.8453 | -1.0 | -1.0 | 0.515 | 0.5874 | 0.2845 | 0.6493 | 0.3927 | 0.9199 | 0.4605 | 0.9731 | 0.8921 | 0.9379 | 0.9356 | 0.9772 | 0.4858 | 0.9648 | 0.291 | 0.5365 | 0.8721 | 0.9777 | 0.8872 | 0.9275 | 0.7926 | 0.945 | 0.6243 | 0.7525 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
| [
"baseball-bat",
"basketball",
"car",
"football",
"human",
"luggage",
"mattress",
"motorcycle",
"skis",
"snowboard",
"soccer-ball",
"stop-sign",
"tennis-racket",
"umbrella",
"volleyball"
] |
mfly-auton/suas-2025-rtdetr-finetuned-b16-lr3e-5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# suas-2025-rtdetr-finetuned-b16-lr3e-5
This model is a fine-tuned version of [PekingU/rtdetr_r50vd](https://huggingface.co/PekingU/rtdetr_r50vd) on the mfly-auton/suas-2025-synthetic-data dataset.
It achieves the following results on the evaluation set:
- Loss: 9.1849
- Map: 0.4811
- Map 50: 0.6742
- Map 75: 0.5153
- Map Small: 0.385
- Map Medium: 0.5129
- Map Large: 0.5739
- Mar 1: 0.5409
- Mar 10: 0.7299
- Mar 100: 0.7598
- Mar Small: 0.6142
- Mar Medium: 0.7825
- Mar Large: 0.8269
- Map Baseball-bat: 0.49
- Mar 100 Baseball-bat: 0.6707
- Map Basketball: 0.561
- Mar 100 Basketball: 0.6806
- Map Car: -1.0
- Mar 100 Car: -1.0
- Map Football: 0.3393
- Mar 100 Football: 0.6131
- Map Human: 0.7301
- Mar 100 Human: 0.9417
- Map Luggage: 0.6005
- Mar 100 Luggage: 0.8216
- Map Mattress: 0.048
- Mar 100 Mattress: 0.6785
- Map Motorcycle: 0.5833
- Mar 100 Motorcycle: 0.6382
- Map Skis: 0.8396
- Mar 100 Skis: 0.9198
- Map Snowboard: 0.6757
- Mar 100 Snowboard: 0.8016
- Map Soccer-ball: 0.3701
- Mar 100 Soccer-ball: 0.7694
- Map Stop-sign: 0.4419
- Mar 100 Stop-sign: 0.9539
- Map Tennis-racket: 0.4382
- Mar 100 Tennis-racket: 0.7518
- Map Umbrella: 0.2355
- Mar 100 Umbrella: 0.7241
- Map Volleyball: 0.382
- Mar 100 Volleyball: 0.6728
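To try one of these checkpoints on an image, the standard `transformers` object-detection API is enough. The sketch below is a minimal example, not part of the original card: the repo id is assumed to be published on the Hub (substitute a local checkpoint path otherwise), and the image file name and score threshold are placeholders.
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Assumed repo id; use a local checkpoint directory if the model is not on the Hub.
checkpoint = "mfly-auton/suas-2025-rtdetr-finetuned-b16-lr3e-5"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # hypothetical test image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to (score, label, box) triples in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, target_sizes=target_sizes, threshold=0.5
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```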
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 1337
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10.0
- mixed_precision_training: Native AMP
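The hyperparameters above map roughly onto the `TrainingArguments` below. This is a sketch under the assumption that the standard Hugging Face `Trainer` was used; the output directory name and the per-epoch evaluation/save strategy are inferred from the card, and the dataset, image processor, and collate function are omitted.
```python
from transformers import AutoModelForObjectDetection, Trainer, TrainingArguments

# Label remapping (id2label/num_labels) and data wiring are omitted for brevity.
model = AutoModelForObjectDetection.from_pretrained("PekingU/rtdetr_r50vd")

args = TrainingArguments(
    output_dir="suas-2025-rtdetr-finetuned-b16-lr3e-5",  # assumed name
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=10.0,
    lr_scheduler_type="linear",
    seed=1337,
    fp16=True,                    # "Native AMP" mixed precision
    eval_strategy="epoch",        # the card reports validation metrics once per epoch
    save_strategy="epoch",
    remove_unused_columns=False,  # keep image/annotation columns for the collator
)

trainer = Trainer(
    model=model,
    args=args,
    # train_dataset=..., eval_dataset=..., data_collator=..., compute_metrics=...,
)
# trainer.train()
```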
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Baseball-bat | Mar 100 Baseball-bat | Map Basketball | Mar 100 Basketball | Map Car | Mar 100 Car | Map Football | Mar 100 Football | Map Human | Mar 100 Human | Map Luggage | Mar 100 Luggage | Map Mattress | Mar 100 Mattress | Map Motorcycle | Mar 100 Motorcycle | Map Skis | Mar 100 Skis | Map Snowboard | Mar 100 Snowboard | Map Soccer-ball | Mar 100 Soccer-ball | Map Stop-sign | Mar 100 Stop-sign | Map Tennis-racket | Mar 100 Tennis-racket | Map Umbrella | Mar 100 Umbrella | Map Volleyball | Mar 100 Volleyball |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------:|:--------------------:|:--------------:|:------------------:|:-------:|:-----------:|:------------:|:----------------:|:---------:|:-------------:|:-----------:|:---------------:|:------------:|:----------------:|:--------------:|:------------------:|:--------:|:------------:|:-------------:|:-----------------:|:---------------:|:-------------------:|:-------------:|:-----------------:|:-----------------:|:---------------------:|:------------:|:----------------:|:--------------:|:------------------:|
| 23.0537 | 1.0 | 438 | 9.2431 | 0.6183 | 0.7399 | 0.7053 | 0.4862 | 0.6975 | 0.6277 | 0.698 | 0.8445 | 0.863 | 0.727 | 0.8914 | 0.9333 | 0.6135 | 0.8496 | 0.4784 | 0.7922 | -1.0 | -1.0 | 0.5092 | 0.7406 | 0.7931 | 0.9349 | 0.7264 | 0.8453 | 0.4151 | 0.9649 | 0.7746 | 0.8995 | 0.7873 | 0.9153 | 0.7978 | 0.9122 | 0.3779 | 0.6828 | 0.8549 | 0.957 | 0.6769 | 0.885 | 0.377 | 0.9438 | 0.4746 | 0.7593 |
| 9.6444 | 2.0 | 876 | 8.4354 | 0.5654 | 0.6543 | 0.6454 | 0.4377 | 0.6759 | 0.6517 | 0.6178 | 0.8051 | 0.8567 | 0.7616 | 0.8748 | 0.9001 | 0.6536 | 0.8175 | 0.4753 | 0.7836 | -1.0 | -1.0 | 0.4716 | 0.7781 | 0.7632 | 0.925 | 0.7675 | 0.8975 | 0.1446 | 0.8849 | 0.7513 | 0.884 | 0.8092 | 0.8936 | 0.796 | 0.9045 | 0.1811 | 0.8046 | 0.6988 | 0.9162 | 0.734 | 0.8762 | 0.2025 | 0.8361 | 0.4669 | 0.7924 |
| 7.9736 | 3.0 | 1314 | 8.1233 | 0.6032 | 0.7082 | 0.6974 | 0.4632 | 0.7302 | 0.7013 | 0.6284 | 0.8059 | 0.8518 | 0.7726 | 0.8713 | 0.8919 | 0.6745 | 0.7985 | 0.5687 | 0.8135 | -1.0 | -1.0 | 0.5098 | 0.7643 | 0.8075 | 0.917 | 0.7803 | 0.8888 | 0.1936 | 0.8645 | 0.8528 | 0.9092 | 0.8126 | 0.903 | 0.8344 | 0.8903 | 0.2384 | 0.8224 | 0.8343 | 0.9179 | 0.7266 | 0.8549 | 0.2001 | 0.7809 | 0.4116 | 0.8002 |
| 7.5623 | 4.0 | 1752 | 8.0717 | 0.6264 | 0.7342 | 0.7215 | 0.5182 | 0.734 | 0.7239 | 0.651 | 0.8219 | 0.8665 | 0.7837 | 0.8706 | 0.9228 | 0.7056 | 0.8205 | 0.6522 | 0.8386 | -1.0 | -1.0 | 0.5832 | 0.8142 | 0.7739 | 0.9234 | 0.808 | 0.8942 | 0.0914 | 0.8634 | 0.8597 | 0.9093 | 0.8828 | 0.9376 | 0.8274 | 0.8924 | 0.2734 | 0.8501 | 0.8612 | 0.9209 | 0.7139 | 0.8124 | 0.2524 | 0.8567 | 0.4843 | 0.7968 |
| 7.2751 | 5.0 | 2190 | 8.7630 | 0.5761 | 0.6678 | 0.6549 | 0.4548 | 0.6658 | 0.6059 | 0.6224 | 0.7803 | 0.8305 | 0.7163 | 0.8521 | 0.9175 | 0.5765 | 0.7835 | 0.5702 | 0.7898 | -1.0 | -1.0 | 0.4783 | 0.7281 | 0.7352 | 0.9097 | 0.7497 | 0.8839 | 0.0284 | 0.8625 | 0.7861 | 0.8506 | 0.8352 | 0.9277 | 0.833 | 0.8627 | 0.4065 | 0.7625 | 0.7123 | 0.9137 | 0.655 | 0.8026 | 0.2883 | 0.8388 | 0.4105 | 0.7105 |
| 7.2919 | 6.0 | 2628 | 8.3399 | 0.6044 | 0.7342 | 0.6966 | 0.4756 | 0.6987 | 0.7245 | 0.6285 | 0.7863 | 0.8232 | 0.6783 | 0.8579 | 0.8962 | 0.6153 | 0.7723 | 0.5137 | 0.7211 | -1.0 | -1.0 | 0.5038 | 0.7329 | 0.8326 | 0.9302 | 0.7848 | 0.8806 | 0.0252 | 0.7449 | 0.803 | 0.8632 | 0.8442 | 0.9158 | 0.8247 | 0.8912 | 0.4423 | 0.7519 | 0.8512 | 0.9458 | 0.6717 | 0.813 | 0.3123 | 0.8643 | 0.4365 | 0.6975 |
| 6.9699 | 7.0 | 3066 | 8.5778 | 0.5806 | 0.7101 | 0.6569 | 0.4371 | 0.6901 | 0.7185 | 0.6187 | 0.7898 | 0.8279 | 0.6748 | 0.865 | 0.9156 | 0.6135 | 0.7665 | 0.5259 | 0.7135 | -1.0 | -1.0 | 0.3524 | 0.6591 | 0.8118 | 0.9329 | 0.7602 | 0.8796 | 0.1264 | 0.8605 | 0.7603 | 0.8343 | 0.8279 | 0.9312 | 0.8645 | 0.9105 | 0.3614 | 0.7684 | 0.711 | 0.9377 | 0.663 | 0.8212 | 0.3093 | 0.8638 | 0.4402 | 0.7118 |
| 6.7247 | 8.0 | 3504 | 8.5073 | 0.5993 | 0.7448 | 0.6827 | 0.4588 | 0.6121 | 0.7777 | 0.6438 | 0.8042 | 0.8321 | 0.6614 | 0.8377 | 0.9399 | 0.6037 | 0.7583 | 0.6004 | 0.719 | -1.0 | -1.0 | 0.4472 | 0.7137 | 0.8139 | 0.9333 | 0.7039 | 0.8694 | 0.2481 | 0.9102 | 0.8006 | 0.845 | 0.885 | 0.9599 | 0.8342 | 0.8815 | 0.398 | 0.7242 | 0.5666 | 0.9377 | 0.5825 | 0.8135 | 0.4441 | 0.8946 | 0.4621 | 0.689 |
| 6.5664 | 9.0 | 3942 | 9.0973 | 0.4861 | 0.6532 | 0.5236 | 0.3138 | 0.536 | 0.6237 | 0.5459 | 0.7169 | 0.7528 | 0.5389 | 0.7868 | 0.8686 | 0.442 | 0.6763 | 0.4655 | 0.6182 | -1.0 | -1.0 | 0.3113 | 0.5423 | 0.7863 | 0.9349 | 0.6849 | 0.8274 | 0.0783 | 0.7612 | 0.6747 | 0.7367 | 0.8484 | 0.9386 | 0.7503 | 0.8272 | 0.2637 | 0.644 | 0.4794 | 0.9464 | 0.4634 | 0.7295 | 0.2248 | 0.744 | 0.3317 | 0.6123 |
| 6.2978 | 10.0 | 4380 | 9.1849 | 0.4811 | 0.6742 | 0.5153 | 0.385 | 0.5129 | 0.5739 | 0.5409 | 0.7299 | 0.7598 | 0.6142 | 0.7825 | 0.8269 | 0.49 | 0.6707 | 0.561 | 0.6806 | -1.0 | -1.0 | 0.3393 | 0.6131 | 0.7301 | 0.9417 | 0.6005 | 0.8216 | 0.048 | 0.6785 | 0.5833 | 0.6382 | 0.8396 | 0.9198 | 0.6757 | 0.8016 | 0.3701 | 0.7694 | 0.4419 | 0.9539 | 0.4382 | 0.7518 | 0.2355 | 0.7241 | 0.382 | 0.6728 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
| [
"baseball-bat",
"basketball",
"car",
"football",
"human",
"luggage",
"mattress",
"motorcycle",
"skis",
"snowboard",
"soccer-ball",
"stop-sign",
"tennis-racket",
"umbrella",
"volleyball"
] |
mfly-auton/suas-2025-rtdetr-finetuned-b8-lr3e-5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# suas-2025-rtdetr-finetuned-b8-lr3e-5
This model is a fine-tuned version of [PekingU/rtdetr_r50vd](https://huggingface.co/PekingU/rtdetr_r50vd) on the mfly-auton/suas-2025-synthetic-data dataset.
It achieves the following results on the evaluation set:
- Loss: 6.4174
- Map: 0.6051
- Map 50: 0.7291
- Map 75: 0.6597
- Map Small: 0.5252
- Map Medium: 0.6131
- Map Large: 0.6723
- Mar 1: 0.6006
- Mar 10: 0.7415
- Mar 100: 0.7959
- Mar Small: 0.5831
- Mar Medium: 0.8299
- Mar Large: 0.948
- Map Baseball-bat: 0.5841
- Mar 100 Baseball-bat: 0.7219
- Map Basketball: 0.4357
- Mar 100 Basketball: 0.4955
- Map Car: -1.0
- Mar 100 Car: -1.0
- Map Football: 0.4363
- Mar 100 Football: 0.5014
- Map Human: 0.9189
- Mar 100 Human: 0.9512
- Map Luggage: 0.5995
- Mar 100 Luggage: 0.8773
- Map Mattress: 0.7881
- Mar 100 Mattress: 0.9906
- Map Motorcycle: 0.9485
- Mar 100 Motorcycle: 0.9752
- Map Skis: 0.9582
- Mar 100 Skis: 0.9782
- Map Snowboard: 0.0257
- Mar 100 Snowboard: 0.777
- Map Soccer-ball: 0.6432
- Mar 100 Soccer-ball: 0.7154
- Map Stop-sign: 0.5994
- Mar 100 Stop-sign: 0.9274
- Map Tennis-racket: 0.0019
- Mar 100 Tennis-racket: 0.5736
- Map Umbrella: 0.8665
- Mar 100 Umbrella: 0.9243
- Map Volleyball: 0.6647
- Mar 100 Volleyball: 0.7338
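The Map/Mar rows above follow COCO conventions: `Map` is mean average precision averaged over IoU thresholds 0.50:0.95, `Map 50`/`Map 75` fix the IoU threshold, and `Mar 1/10/100` are recall at 1, 10, and 100 detections per image. A per-class value of -1.0 (the `car` rows here) means that class has no instances in the validation split. The card does not say which tool computed these numbers; the sketch below uses `torchmetrics`' `MeanAveragePrecision`, one common choice that emits exactly these keys, with toy boxes and a label index that assumes the zero-based order of the class list at the end of this card.
```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# class_metrics=True adds per-class AP/AR, matching the per-category rows above.
metric = MeanAveragePrecision(box_format="xyxy", iou_type="bbox", class_metrics=True)

# One dict per image; boxes are (xmin, ymin, xmax, ymax) in pixels.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([4]),  # 4 = "human" if labels follow the list below
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 215.0]]),
    "labels": torch.tensor([4]),
}]
metric.update(preds, targets)
results = metric.compute()
# results holds map, map_50, map_75, map_small/medium/large, mar_1, mar_10,
# mar_100, plus map_per_class / mar_100_per_class; classes absent from the
# ground truth come back as -1, which is why "car" reads -1.0 above.
print(results["map"], results["map_50"], results["mar_100"])
```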
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 32
- seed: 1337
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10.0
- mixed_precision_training: Native AMP
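`lr_scheduler_type: linear` means the learning rate decays linearly from 3e-05 to zero over the whole run; no warmup is listed. Below is a small sketch of the schedule, assuming `transformers.get_scheduler` and the step counts visible in the results table that follows (875 steps per epoch at batch size 8, so 8750 steps for 10 epochs); the single placeholder parameter is only there to make the snippet self-contained.
```python
import torch
from transformers import get_scheduler

steps_per_epoch = 875                 # see the Step column in the table below
num_training_steps = steps_per_epoch * 10

optimizer = torch.optim.AdamW(
    [torch.nn.Parameter(torch.zeros(1))],  # placeholder parameter for illustration
    lr=3e-5, betas=(0.9, 0.999), eps=1e-8,
)
scheduler = get_scheduler(
    "linear", optimizer=optimizer,
    num_warmup_steps=0, num_training_steps=num_training_steps,
)

# The learning rate drops by 3e-5 / 8750 per step and reaches 0 at step 8750.
for _ in range(num_training_steps):
    optimizer.step()
    scheduler.step()
print(scheduler.get_last_lr())  # [0.0]
```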
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Baseball-bat | Mar 100 Baseball-bat | Map Basketball | Mar 100 Basketball | Map Car | Mar 100 Car | Map Football | Mar 100 Football | Map Human | Mar 100 Human | Map Luggage | Mar 100 Luggage | Map Mattress | Mar 100 Mattress | Map Motorcycle | Mar 100 Motorcycle | Map Skis | Mar 100 Skis | Map Snowboard | Mar 100 Snowboard | Map Soccer-ball | Mar 100 Soccer-ball | Map Stop-sign | Mar 100 Stop-sign | Map Tennis-racket | Mar 100 Tennis-racket | Map Umbrella | Mar 100 Umbrella | Map Volleyball | Mar 100 Volleyball |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------:|:--------------------:|:--------------:|:------------------:|:-------:|:-----------:|:------------:|:----------------:|:---------:|:-------------:|:-----------:|:---------------:|:------------:|:----------------:|:--------------:|:------------------:|:--------:|:------------:|:-------------:|:-----------------:|:---------------:|:-------------------:|:-------------:|:-----------------:|:-----------------:|:---------------------:|:------------:|:----------------:|:--------------:|:------------------:|
| 17.7551 | 1.0 | 875 | 5.2510 | 0.6942 | 0.7838 | 0.7782 | 0.7254 | 0.6405 | 0.7363 | 0.7413 | 0.8942 | 0.9051 | 0.8089 | 0.8996 | 0.9496 | 0.6964 | 0.8173 | 0.7915 | 0.8415 | -1.0 | -1.0 | 0.7303 | 0.8192 | 0.8746 | 0.9281 | 0.6526 | 0.9372 | 0.9142 | 0.9711 | 0.8867 | 0.9311 | 0.8376 | 0.9594 | 0.3429 | 0.9742 | 0.7454 | 0.8686 | 0.581 | 0.9542 | 0.3918 | 0.944 | 0.4713 | 0.8414 | 0.8027 | 0.8846 |
| 6.6319 | 2.0 | 1750 | 5.6671 | 0.6915 | 0.7796 | 0.7715 | 0.6499 | 0.6468 | 0.8104 | 0.7323 | 0.879 | 0.892 | 0.726 | 0.9168 | 0.9841 | 0.6895 | 0.7787 | 0.5032 | 0.6076 | -1.0 | -1.0 | 0.7195 | 0.7926 | 0.9186 | 0.9593 | 0.6184 | 0.9288 | 0.895 | 0.9984 | 0.901 | 0.9643 | 0.9222 | 0.9634 | 0.2897 | 0.9699 | 0.7717 | 0.8578 | 0.6219 | 0.9584 | 0.2013 | 0.8995 | 0.8724 | 0.9458 | 0.7562 | 0.864 |
| 6.3092 | 3.0 | 2625 | 5.3786 | 0.6861 | 0.7797 | 0.7703 | 0.6794 | 0.6505 | 0.8039 | 0.7008 | 0.8593 | 0.8895 | 0.738 | 0.9294 | 0.9851 | 0.7854 | 0.863 | 0.5392 | 0.5953 | -1.0 | -1.0 | 0.6451 | 0.7011 | 0.9165 | 0.9458 | 0.6485 | 0.9354 | 0.9011 | 0.9986 | 0.9194 | 0.9592 | 0.9354 | 0.9827 | 0.147 | 0.9815 | 0.8089 | 0.8686 | 0.7102 | 0.9732 | 0.0527 | 0.8694 | 0.8651 | 0.9418 | 0.7312 | 0.838 |
| 5.8652 | 4.0 | 3500 | 5.8538 | 0.6465 | 0.749 | 0.7306 | 0.6014 | 0.6486 | 0.6801 | 0.6644 | 0.8092 | 0.8383 | 0.6448 | 0.8784 | 0.9506 | 0.6314 | 0.7432 | 0.4033 | 0.446 | -1.0 | -1.0 | 0.6013 | 0.6536 | 0.9291 | 0.9609 | 0.729 | 0.9151 | 0.64 | 0.9942 | 0.9361 | 0.9592 | 0.8382 | 0.9396 | 0.1472 | 0.893 | 0.7465 | 0.8059 | 0.803 | 0.952 | 0.0352 | 0.7725 | 0.8569 | 0.892 | 0.7539 | 0.8091 |
| 5.4748 | 5.0 | 4375 | 5.8922 | 0.6416 | 0.7467 | 0.7203 | 0.5884 | 0.6359 | 0.7155 | 0.6543 | 0.8104 | 0.8498 | 0.6337 | 0.895 | 0.965 | 0.6726 | 0.7857 | 0.3624 | 0.4249 | -1.0 | -1.0 | 0.5819 | 0.641 | 0.9266 | 0.9518 | 0.6374 | 0.9073 | 0.7673 | 0.9959 | 0.9372 | 0.9692 | 0.9629 | 0.9861 | 0.0824 | 0.9316 | 0.7745 | 0.8198 | 0.7208 | 0.9508 | 0.015 | 0.8404 | 0.7927 | 0.8863 | 0.7489 | 0.8061 |
| 5.352 | 6.0 | 5250 | 6.3142 | 0.6409 | 0.741 | 0.7178 | 0.6313 | 0.5958 | 0.6861 | 0.6395 | 0.7987 | 0.8411 | 0.6741 | 0.8595 | 0.9544 | 0.7148 | 0.8051 | 0.4792 | 0.5386 | -1.0 | -1.0 | 0.5879 | 0.6392 | 0.9101 | 0.9478 | 0.5811 | 0.8811 | 0.8213 | 0.9987 | 0.9139 | 0.96 | 0.9479 | 0.9767 | 0.052 | 0.9105 | 0.7854 | 0.8321 | 0.6801 | 0.9332 | 0.0209 | 0.6995 | 0.7664 | 0.8798 | 0.7112 | 0.7733 |
| 5.1405 | 7.0 | 6125 | 6.3704 | 0.6464 | 0.7579 | 0.7185 | 0.5862 | 0.6209 | 0.6971 | 0.6346 | 0.7737 | 0.8247 | 0.6346 | 0.865 | 0.9372 | 0.6828 | 0.7855 | 0.5464 | 0.5882 | -1.0 | -1.0 | 0.5058 | 0.5697 | 0.9172 | 0.9526 | 0.6302 | 0.9021 | 0.8037 | 0.9908 | 0.9339 | 0.9643 | 0.9615 | 0.9767 | 0.0249 | 0.6942 | 0.7298 | 0.7859 | 0.7449 | 0.9567 | 0.0025 | 0.713 | 0.8942 | 0.9397 | 0.6718 | 0.7272 |
| 4.9059 | 8.0 | 7000 | 6.4879 | 0.6238 | 0.7487 | 0.6799 | 0.5369 | 0.6248 | 0.6895 | 0.6205 | 0.7655 | 0.8153 | 0.5983 | 0.8634 | 0.9561 | 0.6155 | 0.7369 | 0.4736 | 0.5339 | -1.0 | -1.0 | 0.4614 | 0.5228 | 0.9159 | 0.9516 | 0.6436 | 0.8783 | 0.867 | 0.9951 | 0.9482 | 0.9714 | 0.9574 | 0.9767 | 0.0317 | 0.806 | 0.6742 | 0.7409 | 0.6387 | 0.9321 | 0.0037 | 0.7368 | 0.8625 | 0.9115 | 0.6398 | 0.7199 |
| 4.8446 | 9.0 | 7875 | 6.5165 | 0.6064 | 0.734 | 0.6583 | 0.5305 | 0.6055 | 0.6766 | 0.6057 | 0.7479 | 0.8021 | 0.5908 | 0.8441 | 0.9489 | 0.598 | 0.7249 | 0.4926 | 0.5547 | -1.0 | -1.0 | 0.4304 | 0.4976 | 0.9153 | 0.9478 | 0.5701 | 0.8526 | 0.7636 | 0.9867 | 0.9459 | 0.9684 | 0.9583 | 0.9762 | 0.0271 | 0.7862 | 0.6457 | 0.7131 | 0.6268 | 0.9338 | 0.0014 | 0.6518 | 0.8662 | 0.914 | 0.6484 | 0.7213 |
| 4.7656 | 10.0 | 8750 | 6.4174 | 0.6051 | 0.7291 | 0.6597 | 0.5252 | 0.6131 | 0.6723 | 0.6006 | 0.7415 | 0.7959 | 0.5831 | 0.8299 | 0.948 | 0.5841 | 0.7219 | 0.4357 | 0.4955 | -1.0 | -1.0 | 0.4363 | 0.5014 | 0.9189 | 0.9512 | 0.5995 | 0.8773 | 0.7881 | 0.9906 | 0.9485 | 0.9752 | 0.9582 | 0.9782 | 0.0257 | 0.777 | 0.6432 | 0.7154 | 0.5994 | 0.9274 | 0.0019 | 0.5736 | 0.8665 | 0.9243 | 0.6647 | 0.7338 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
| [
"baseball-bat",
"basketball",
"car",
"football",
"human",
"luggage",
"mattress",
"motorcycle",
"skis",
"snowboard",
"soccer-ball",
"stop-sign",
"tennis-racket",
"umbrella",
"volleyball"
] |
mfly-auton/suas-2025-rtdetr-finetuned-b16-lr1e-5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# suas-2025-rtdetr-finetuned-b16-lr1e-5
This model is a fine-tuned version of [PekingU/rtdetr_r50vd](https://huggingface.co/PekingU/rtdetr_r50vd) on the mfly-auton/suas-2025-synthetic-data dataset.
It achieves the following results on the evaluation set:
- Loss: 3.6523
- Map: 0.8465
- Map 50: 0.9229
- Map 75: 0.9193
- Map Small: 0.7682
- Map Medium: 0.8561
- Map Large: 0.9144
- Mar 1: 0.7945
- Mar 10: 0.9245
- Mar 100: 0.9271
- Mar Small: 0.8316
- Mar Medium: 0.9357
- Mar Large: 0.9779
- Map Baseball-bat: 0.8128
- Mar 100 Baseball-bat: 0.892
- Map Basketball: 0.8105
- Mar 100 Basketball: 0.8993
- Map Car: -1.0
- Mar 100 Car: -1.0
- Map Football: 0.7611
- Mar 100 Football: 0.8113
- Map Human: 0.9382
- Mar 100 Human: 0.9641
- Map Luggage: 0.8579
- Mar 100 Luggage: 0.9191
- Map Mattress: 0.9384
- Mar 100 Mattress: 0.977
- Map Motorcycle: 0.9309
- Mar 100 Motorcycle: 0.9773
- Map Skis: 0.7044
- Mar 100 Skis: 0.995
- Map Snowboard: 0.9834
- Mar 100 Snowboard: 0.9932
- Map Soccer-ball: 0.8245
- Mar 100 Soccer-ball: 0.8733
- Map Stop-sign: 0.9671
- Mar 100 Stop-sign: 0.9925
- Map Tennis-racket: 0.7064
- Mar 100 Tennis-racket: 0.8539
- Map Umbrella: 0.8285
- Mar 100 Umbrella: 0.9803
- Map Volleyball: 0.787
- Mar 100 Volleyball: 0.8517
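Every card in this series uses the same 15-class vocabulary (the list reproduced after the results table). Because `PekingU/rtdetr_r50vd` ships with an 80-class COCO head, fine-tuning starts by re-sizing the detection head to these classes. The sketch below shows the usual way the label mapping is passed; the zero-based class order is an assumption taken from the list at the end of this card.
```python
from transformers import AutoModelForObjectDetection

classes = [
    "baseball-bat", "basketball", "car", "football", "human",
    "luggage", "mattress", "motorcycle", "skis", "snowboard",
    "soccer-ball", "stop-sign", "tennis-racket", "umbrella", "volleyball",
]
id2label = {i: name for i, name in enumerate(classes)}
label2id = {name: i for i, name in id2label.items()}

model = AutoModelForObjectDetection.from_pretrained(
    "PekingU/rtdetr_r50vd",
    id2label=id2label,
    label2id=label2id,
    ignore_mismatched_sizes=True,  # the COCO classification head is re-initialised
)
print(model.config.num_labels)  # 15
```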
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 1337
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10.0
- mixed_precision_training: Native AMP
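`Native AMP` refers to PyTorch's built-in automatic mixed precision (as opposed to Apex): the forward pass runs under `autocast` and gradients are scaled to avoid fp16 underflow. A stripped-down sketch of the equivalent manual loop, using a toy model and dataloader as stand-ins so the snippet is self-contained (a CUDA device is assumed, as it is for AMP in general):
```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-ins; the real run uses the RT-DETR model and the suas-2025 loader.
model = nn.Linear(16, 1).cuda()
data = TensorDataset(torch.randn(64, 16), torch.randn(64, 1))
loader = DataLoader(data, batch_size=16)

scaler = torch.amp.GradScaler("cuda")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5, betas=(0.9, 0.999), eps=1e-8)

for inputs, targets in loader:
    inputs, targets = inputs.cuda(), targets.cuda()
    optimizer.zero_grad()
    with torch.autocast(device_type="cuda", dtype=torch.float16):
        loss = nn.functional.mse_loss(model(inputs), targets)  # fp16 forward pass
    scaler.scale(loss).backward()  # scale the loss so fp16 gradients do not underflow
    scaler.step(optimizer)         # unscales gradients, skips the step on inf/nan
    scaler.update()
```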
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Baseball-bat | Mar 100 Baseball-bat | Map Basketball | Mar 100 Basketball | Map Car | Mar 100 Car | Map Football | Mar 100 Football | Map Human | Mar 100 Human | Map Luggage | Mar 100 Luggage | Map Mattress | Mar 100 Mattress | Map Motorcycle | Mar 100 Motorcycle | Map Skis | Mar 100 Skis | Map Snowboard | Mar 100 Snowboard | Map Soccer-ball | Mar 100 Soccer-ball | Map Stop-sign | Mar 100 Stop-sign | Map Tennis-racket | Mar 100 Tennis-racket | Map Umbrella | Mar 100 Umbrella | Map Volleyball | Mar 100 Volleyball |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------:|:--------------------:|:--------------:|:------------------:|:-------:|:-----------:|:------------:|:----------------:|:---------:|:-------------:|:-----------:|:---------------:|:------------:|:----------------:|:--------------:|:------------------:|:--------:|:------------:|:-------------:|:-----------------:|:---------------:|:-------------------:|:-------------:|:-----------------:|:-----------------:|:---------------------:|:------------:|:----------------:|:--------------:|:------------------:|
| 26.4817 | 1.0 | 438 | 7.1796 | 0.5246 | 0.6136 | 0.5998 | 0.4998 | 0.4541 | 0.685 | 0.6337 | 0.8316 | 0.8375 | 0.7077 | 0.8446 | 0.9453 | 0.3868 | 0.8129 | 0.1579 | 0.8832 | -1.0 | -1.0 | 0.5107 | 0.6099 | 0.8625 | 0.9246 | 0.6942 | 0.792 | 0.568 | 0.8388 | 0.7128 | 0.9384 | 0.085 | 0.9099 | 0.9532 | 0.9784 | 0.5658 | 0.7478 | 0.3822 | 0.9553 | 0.2726 | 0.7264 | 0.6033 | 0.9515 | 0.5895 | 0.6554 |
| 8.2849 | 2.0 | 876 | 5.1650 | 0.6623 | 0.7385 | 0.7311 | 0.6108 | 0.674 | 0.8009 | 0.6914 | 0.87 | 0.8726 | 0.7227 | 0.9081 | 0.9696 | 0.7515 | 0.8723 | 0.2886 | 0.8671 | -1.0 | -1.0 | 0.6431 | 0.7019 | 0.9057 | 0.9462 | 0.8014 | 0.8587 | 0.8949 | 0.9605 | 0.8986 | 0.9603 | 0.1247 | 0.9936 | 0.9665 | 0.9862 | 0.7817 | 0.8419 | 0.8546 | 0.9869 | 0.0862 | 0.7197 | 0.7593 | 0.9635 | 0.5151 | 0.5581 |
| 6.559 | 3.0 | 1314 | 4.4005 | 0.7504 | 0.8276 | 0.8215 | 0.7114 | 0.7272 | 0.8594 | 0.7435 | 0.8949 | 0.8966 | 0.7711 | 0.9093 | 0.975 | 0.7706 | 0.8428 | 0.6714 | 0.8801 | -1.0 | -1.0 | 0.6961 | 0.7498 | 0.9191 | 0.9553 | 0.8097 | 0.8663 | 0.9315 | 0.981 | 0.9122 | 0.9665 | 0.356 | 0.9926 | 0.9799 | 0.9927 | 0.8045 | 0.8594 | 0.9288 | 0.9846 | 0.2706 | 0.7788 | 0.7865 | 0.9736 | 0.6683 | 0.7284 |
| 6.1007 | 4.0 | 1752 | 4.3050 | 0.7733 | 0.8501 | 0.8445 | 0.672 | 0.8036 | 0.8698 | 0.75 | 0.8933 | 0.8959 | 0.7471 | 0.93 | 0.977 | 0.7927 | 0.8788 | 0.5734 | 0.7517 | -1.0 | -1.0 | 0.7111 | 0.7712 | 0.9262 | 0.959 | 0.8357 | 0.8953 | 0.9221 | 0.966 | 0.9326 | 0.9731 | 0.4091 | 0.9936 | 0.9788 | 0.9937 | 0.7975 | 0.8563 | 0.9535 | 0.9877 | 0.5879 | 0.8409 | 0.798 | 0.9753 | 0.6073 | 0.6998 |
| 5.7812 | 5.0 | 2190 | 3.9205 | 0.8281 | 0.91 | 0.9046 | 0.741 | 0.8373 | 0.9087 | 0.7819 | 0.9162 | 0.9196 | 0.8073 | 0.9294 | 0.9776 | 0.8105 | 0.8866 | 0.7005 | 0.8344 | -1.0 | -1.0 | 0.7656 | 0.8203 | 0.9326 | 0.9609 | 0.8291 | 0.9 | 0.9399 | 0.9791 | 0.9223 | 0.9695 | 0.6417 | 0.9906 | 0.977 | 0.9935 | 0.8194 | 0.8738 | 0.9422 | 0.9858 | 0.7258 | 0.8819 | 0.8433 | 0.9767 | 0.744 | 0.8208 |
| 5.4833 | 6.0 | 2628 | 3.7494 | 0.8402 | 0.9191 | 0.915 | 0.7567 | 0.861 | 0.9247 | 0.7947 | 0.9219 | 0.9246 | 0.8235 | 0.9378 | 0.9902 | 0.8153 | 0.8907 | 0.8096 | 0.9043 | -1.0 | -1.0 | 0.7586 | 0.8083 | 0.9378 | 0.9627 | 0.8322 | 0.9015 | 0.9545 | 0.989 | 0.9241 | 0.9725 | 0.7303 | 0.9926 | 0.9832 | 0.9947 | 0.8225 | 0.8707 | 0.9542 | 0.988 | 0.6731 | 0.871 | 0.8273 | 0.9776 | 0.7398 | 0.8213 |
| 5.3134 | 7.0 | 3066 | 3.7250 | 0.8437 | 0.9213 | 0.9165 | 0.7597 | 0.854 | 0.919 | 0.7917 | 0.9213 | 0.9241 | 0.8256 | 0.9419 | 0.9775 | 0.8142 | 0.8876 | 0.8066 | 0.9014 | -1.0 | -1.0 | 0.7341 | 0.7887 | 0.9359 | 0.9624 | 0.8417 | 0.9133 | 0.9412 | 0.9782 | 0.9292 | 0.9777 | 0.713 | 0.9931 | 0.9828 | 0.9934 | 0.828 | 0.8761 | 0.963 | 0.9922 | 0.7265 | 0.8554 | 0.8307 | 0.9791 | 0.7646 | 0.8392 |
| 5.2982 | 8.0 | 3504 | 3.7825 | 0.8419 | 0.9181 | 0.9133 | 0.7599 | 0.8549 | 0.9154 | 0.7909 | 0.9198 | 0.922 | 0.8277 | 0.9355 | 0.9768 | 0.8151 | 0.8931 | 0.8242 | 0.9066 | -1.0 | -1.0 | 0.7225 | 0.772 | 0.9375 | 0.9627 | 0.8383 | 0.9052 | 0.9338 | 0.9744 | 0.9288 | 0.9755 | 0.7353 | 0.996 | 0.9818 | 0.9925 | 0.8221 | 0.873 | 0.9675 | 0.9891 | 0.6801 | 0.8466 | 0.8273 | 0.9818 | 0.7726 | 0.8397 |
| 5.1855 | 9.0 | 3942 | 3.7947 | 0.8365 | 0.913 | 0.9085 | 0.7528 | 0.8439 | 0.9104 | 0.7884 | 0.9184 | 0.9219 | 0.8189 | 0.9344 | 0.979 | 0.8041 | 0.8864 | 0.8009 | 0.8955 | -1.0 | -1.0 | 0.7239 | 0.7755 | 0.9351 | 0.9622 | 0.8444 | 0.9119 | 0.9314 | 0.9771 | 0.9241 | 0.9735 | 0.7034 | 0.995 | 0.984 | 0.995 | 0.8168 | 0.8704 | 0.9639 | 0.9922 | 0.6888 | 0.8513 | 0.8203 | 0.9808 | 0.7704 | 0.8402 |
| 5.1499 | 10.0 | 4380 | 3.6523 | 0.8465 | 0.9229 | 0.9193 | 0.7682 | 0.8561 | 0.9144 | 0.7945 | 0.9245 | 0.9271 | 0.8316 | 0.9357 | 0.9779 | 0.8128 | 0.892 | 0.8105 | 0.8993 | -1.0 | -1.0 | 0.7611 | 0.8113 | 0.9382 | 0.9641 | 0.8579 | 0.9191 | 0.9384 | 0.977 | 0.9309 | 0.9773 | 0.7044 | 0.995 | 0.9834 | 0.9932 | 0.8245 | 0.8733 | 0.9671 | 0.9925 | 0.7064 | 0.8539 | 0.8285 | 0.9803 | 0.787 | 0.8517 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
| [
"baseball-bat",
"basketball",
"car",
"football",
"human",
"luggage",
"mattress",
"motorcycle",
"skis",
"snowboard",
"soccer-ball",
"stop-sign",
"tennis-racket",
"umbrella",
"volleyball"
] |
mfly-auton/suas-2025-rtdetr-finetuned-e20-b16-lr1e-5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# suas-2025-rtdetr-finetuned-e20-b16-lr1e-5
This model is a fine-tuned version of [PekingU/rtdetr_r50vd](https://huggingface.co/PekingU/rtdetr_r50vd) on the mfly-auton/suas-2025-synthetic-data dataset.
It achieves the following results on the evaluation set:
- Loss: 5.3994
- Map: 0.7432
- Map 50: 0.826
- Map 75: 0.8194
- Map Small: 0.6733
- Map Medium: 0.7514
- Map Large: 0.8478
- Mar 1: 0.7418
- Mar 10: 0.8747
- Mar 100: 0.8873
- Mar Small: 0.7457
- Mar Medium: 0.9276
- Mar Large: 0.9725
- Map Baseball-bat: 0.8321
- Mar 100 Baseball-bat: 0.8874
- Map Basketball: 0.5876
- Mar 100 Basketball: 0.6948
- Map Car: -1.0
- Mar 100 Car: -1.0
- Map Football: 0.7601
- Mar 100 Football: 0.8239
- Map Human: 0.3724
- Mar 100 Human: 0.8567
- Map Luggage: 0.3313
- Mar 100 Luggage: 0.933
- Map Mattress: 0.9551
- Mar 100 Mattress: 0.9933
- Map Motorcycle: 0.9352
- Mar 100 Motorcycle: 0.9758
- Map Skis: 0.9499
- Mar 100 Skis: 0.9861
- Map Snowboard: 0.7494
- Mar 100 Snowboard: 0.9924
- Map Soccer-ball: 0.6994
- Mar 100 Soccer-ball: 0.7859
- Map Stop-sign: 0.9566
- Mar 100 Stop-sign: 0.9908
- Map Tennis-racket: 0.8019
- Mar 100 Tennis-racket: 0.8663
- Map Umbrella: 0.8881
- Mar 100 Umbrella: 0.926
- Map Volleyball: 0.5851
- Mar 100 Volleyball: 0.71
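For quick qualitative checks, the high-level `pipeline` API is usually enough. A short sketch, assuming the checkpoint is available under this repo id (a local output directory works the same way) and using a placeholder image name:
```python
from transformers import pipeline

detector = pipeline(
    "object-detection",
    model="mfly-auton/suas-2025-rtdetr-finetuned-e20-b16-lr1e-5",  # assumed repo id
)
detections = detector("example.jpg", threshold=0.5)  # hypothetical test image
for det in detections:
    print(det["label"], round(det["score"], 2), det["box"])
```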
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 1337
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Baseball-bat | Mar 100 Baseball-bat | Map Basketball | Mar 100 Basketball | Map Car | Mar 100 Car | Map Football | Mar 100 Football | Map Human | Mar 100 Human | Map Luggage | Mar 100 Luggage | Map Mattress | Mar 100 Mattress | Map Motorcycle | Mar 100 Motorcycle | Map Skis | Mar 100 Skis | Map Snowboard | Mar 100 Snowboard | Map Soccer-ball | Mar 100 Soccer-ball | Map Stop-sign | Mar 100 Stop-sign | Map Tennis-racket | Mar 100 Tennis-racket | Map Umbrella | Mar 100 Umbrella | Map Volleyball | Mar 100 Volleyball |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------:|:--------------------:|:--------------:|:------------------:|:-------:|:-----------:|:------------:|:----------------:|:---------:|:-------------:|:-----------:|:---------------:|:------------:|:----------------:|:--------------:|:------------------:|:--------:|:------------:|:-------------:|:-----------------:|:---------------:|:-------------------:|:-------------:|:-----------------:|:-----------------:|:---------------------:|:------------:|:----------------:|:--------------:|:------------------:|
| 31.215 | 1.0 | 438 | 11.5331 | 0.361 | 0.4429 | 0.3939 | 0.2412 | 0.3572 | 0.4631 | 0.5413 | 0.7216 | 0.7456 | 0.4705 | 0.8339 | 0.9133 | 0.5278 | 0.7534 | 0.201 | 0.5637 | -1.0 | -1.0 | 0.0954 | 0.3392 | 0.2978 | 0.8908 | 0.2173 | 0.8347 | 0.9008 | 0.9776 | 0.2558 | 0.8724 | 0.3715 | 0.9332 | 0.476 | 0.9786 | 0.1347 | 0.3499 | 0.4095 | 0.8757 | 0.2533 | 0.7259 | 0.5596 | 0.7477 | 0.3531 | 0.5963 |
| 11.6728 | 2.0 | 876 | 8.1149 | 0.6758 | 0.7641 | 0.7545 | 0.57 | 0.7049 | 0.7078 | 0.7463 | 0.8668 | 0.8748 | 0.7153 | 0.9205 | 0.9633 | 0.7816 | 0.8531 | 0.5697 | 0.7246 | -1.0 | -1.0 | 0.345 | 0.6959 | 0.4186 | 0.9573 | 0.4077 | 0.9314 | 0.9689 | 0.994 | 0.8579 | 0.9524 | 0.8735 | 0.9856 | 0.532 | 0.9963 | 0.6632 | 0.7347 | 0.8767 | 0.9606 | 0.7409 | 0.8627 | 0.7848 | 0.8651 | 0.641 | 0.7331 |
| 8.5279 | 3.0 | 1314 | 7.1741 | 0.7245 | 0.8123 | 0.8048 | 0.6062 | 0.7523 | 0.777 | 0.7631 | 0.8836 | 0.89 | 0.7351 | 0.9328 | 0.9734 | 0.8031 | 0.8681 | 0.6326 | 0.7732 | -1.0 | -1.0 | 0.4741 | 0.7704 | 0.4663 | 0.9645 | 0.575 | 0.9431 | 0.9754 | 0.999 | 0.8934 | 0.9659 | 0.9026 | 0.9886 | 0.6149 | 0.9966 | 0.6406 | 0.7162 | 0.9225 | 0.9712 | 0.7727 | 0.871 | 0.8448 | 0.9059 | 0.6252 | 0.7262 |
| 7.6535 | 4.0 | 1752 | 6.6320 | 0.7374 | 0.8206 | 0.8145 | 0.6387 | 0.7807 | 0.8649 | 0.7702 | 0.8941 | 0.9 | 0.7649 | 0.9395 | 0.9869 | 0.7844 | 0.8691 | 0.6492 | 0.7808 | -1.0 | -1.0 | 0.5894 | 0.7994 | 0.5001 | 0.9605 | 0.6277 | 0.9445 | 0.9591 | 0.9887 | 0.8951 | 0.9695 | 0.9208 | 0.9916 | 0.6658 | 0.9969 | 0.6457 | 0.753 | 0.9478 | 0.9911 | 0.6932 | 0.8881 | 0.8172 | 0.9059 | 0.6283 | 0.7608 |
| 7.2488 | 5.0 | 2190 | 6.2374 | 0.7287 | 0.8175 | 0.8097 | 0.6049 | 0.7712 | 0.8579 | 0.7514 | 0.88 | 0.8863 | 0.7218 | 0.936 | 0.9754 | 0.7896 | 0.8726 | 0.5603 | 0.7192 | -1.0 | -1.0 | 0.5343 | 0.7579 | 0.4718 | 0.9494 | 0.587 | 0.9343 | 0.9709 | 0.9983 | 0.9089 | 0.9678 | 0.9366 | 0.9871 | 0.6894 | 0.994 | 0.6338 | 0.7283 | 0.9115 | 0.9824 | 0.7501 | 0.8819 | 0.8789 | 0.9342 | 0.5784 | 0.7002 |
| 6.8555 | 6.0 | 2628 | 5.8678 | 0.7433 | 0.827 | 0.8208 | 0.6423 | 0.7801 | 0.8658 | 0.7602 | 0.8885 | 0.8951 | 0.7482 | 0.9368 | 0.9758 | 0.7919 | 0.8791 | 0.5608 | 0.7367 | -1.0 | -1.0 | 0.615 | 0.7904 | 0.4469 | 0.9424 | 0.5827 | 0.9428 | 0.9483 | 0.9862 | 0.9122 | 0.9755 | 0.9423 | 0.9871 | 0.7175 | 0.9961 | 0.6653 | 0.7622 | 0.9489 | 0.9858 | 0.7588 | 0.8912 | 0.8966 | 0.9336 | 0.6192 | 0.7221 |
| 6.5361 | 7.0 | 3066 | 5.9374 | 0.734 | 0.8181 | 0.8106 | 0.6512 | 0.7467 | 0.8574 | 0.7529 | 0.8832 | 0.8931 | 0.7511 | 0.9359 | 0.9841 | 0.8065 | 0.872 | 0.6151 | 0.7562 | -1.0 | -1.0 | 0.6614 | 0.7874 | 0.4174 | 0.9248 | 0.4939 | 0.9356 | 0.942 | 0.9844 | 0.8818 | 0.9698 | 0.9076 | 0.9861 | 0.671 | 0.9917 | 0.6563 | 0.7728 | 0.9323 | 0.983 | 0.7992 | 0.8741 | 0.8936 | 0.9344 | 0.5979 | 0.7309 |
| 6.4654 | 8.0 | 3504 | 5.7246 | 0.7424 | 0.8246 | 0.8179 | 0.6646 | 0.7573 | 0.8599 | 0.7499 | 0.8842 | 0.8929 | 0.7598 | 0.9351 | 0.978 | 0.8053 | 0.8782 | 0.6157 | 0.7374 | -1.0 | -1.0 | 0.7003 | 0.806 | 0.4102 | 0.8962 | 0.4673 | 0.9298 | 0.9592 | 0.996 | 0.9053 | 0.9757 | 0.9176 | 0.9832 | 0.6832 | 0.9929 | 0.6794 | 0.7789 | 0.9531 | 0.9841 | 0.7883 | 0.8767 | 0.9071 | 0.9426 | 0.6009 | 0.7235 |
| 6.2384 | 9.0 | 3942 | 5.6544 | 0.7571 | 0.8377 | 0.8316 | 0.6868 | 0.7743 | 0.8518 | 0.7582 | 0.8902 | 0.8996 | 0.768 | 0.938 | 0.9823 | 0.8088 | 0.8861 | 0.6286 | 0.7505 | -1.0 | -1.0 | 0.7342 | 0.8303 | 0.3974 | 0.9053 | 0.5417 | 0.9339 | 0.9387 | 0.9891 | 0.9022 | 0.9776 | 0.9246 | 0.9856 | 0.7247 | 0.9971 | 0.6929 | 0.7964 | 0.9621 | 0.9894 | 0.8007 | 0.8756 | 0.9151 | 0.9437 | 0.6276 | 0.7343 |
| 6.1353 | 10.0 | 4380 | 5.4540 | 0.7621 | 0.8461 | 0.8394 | 0.6938 | 0.7761 | 0.7476 | 0.7578 | 0.8899 | 0.8995 | 0.7695 | 0.937 | 0.9809 | 0.8248 | 0.8844 | 0.6711 | 0.7713 | -1.0 | -1.0 | 0.7385 | 0.8195 | 0.3992 | 0.9066 | 0.4718 | 0.9348 | 0.9559 | 0.9962 | 0.9208 | 0.9766 | 0.9513 | 0.9837 | 0.7604 | 0.994 | 0.7046 | 0.8044 | 0.9673 | 0.9863 | 0.7906 | 0.8793 | 0.9127 | 0.9436 | 0.6007 | 0.7125 |
| 5.974 | 11.0 | 4818 | 5.4390 | 0.7477 | 0.8349 | 0.8284 | 0.6753 | 0.7641 | 0.7638 | 0.7469 | 0.8857 | 0.8944 | 0.7475 | 0.93 | 0.9817 | 0.8147 | 0.8746 | 0.5466 | 0.6886 | -1.0 | -1.0 | 0.7436 | 0.8223 | 0.404 | 0.9014 | 0.3508 | 0.9336 | 0.9558 | 0.9969 | 0.9219 | 0.9739 | 0.9454 | 0.9847 | 0.7481 | 0.9943 | 0.7203 | 0.8062 | 0.9647 | 0.9897 | 0.8395 | 0.899 | 0.9016 | 0.9352 | 0.6106 | 0.7216 |
| 5.8663 | 12.0 | 5256 | 5.5215 | 0.7399 | 0.8216 | 0.8156 | 0.6604 | 0.7621 | 0.7835 | 0.7392 | 0.8798 | 0.8907 | 0.7415 | 0.9304 | 0.9812 | 0.8193 | 0.8861 | 0.5166 | 0.6768 | -1.0 | -1.0 | 0.743 | 0.8184 | 0.3875 | 0.8986 | 0.3648 | 0.9283 | 0.9531 | 0.9957 | 0.9295 | 0.9768 | 0.9421 | 0.9851 | 0.737 | 0.9935 | 0.6957 | 0.7817 | 0.9635 | 0.9919 | 0.8015 | 0.8731 | 0.894 | 0.9316 | 0.6109 | 0.7319 |
| 5.7918 | 13.0 | 5694 | 5.3514 | 0.7531 | 0.8339 | 0.8294 | 0.6808 | 0.7732 | 0.7485 | 0.7501 | 0.8852 | 0.8945 | 0.7554 | 0.9306 | 0.9809 | 0.8172 | 0.8871 | 0.5405 | 0.6775 | -1.0 | -1.0 | 0.7592 | 0.8259 | 0.4135 | 0.8961 | 0.3898 | 0.9308 | 0.9633 | 0.9972 | 0.9278 | 0.9757 | 0.9412 | 0.9842 | 0.7532 | 0.9964 | 0.7259 | 0.8098 | 0.9708 | 0.993 | 0.8265 | 0.8881 | 0.8986 | 0.9342 | 0.6158 | 0.7272 |
| 5.6819 | 14.0 | 6132 | 5.3907 | 0.7409 | 0.8234 | 0.8192 | 0.6658 | 0.7677 | 0.8637 | 0.7392 | 0.8793 | 0.8897 | 0.7427 | 0.9314 | 0.9769 | 0.832 | 0.887 | 0.543 | 0.6746 | -1.0 | -1.0 | 0.7549 | 0.827 | 0.3854 | 0.8872 | 0.3596 | 0.9397 | 0.9524 | 0.9946 | 0.9291 | 0.9747 | 0.9396 | 0.9817 | 0.7553 | 0.9929 | 0.7004 | 0.7931 | 0.9573 | 0.9941 | 0.7972 | 0.8793 | 0.8857 | 0.917 | 0.5806 | 0.7127 |
| 5.6176 | 15.0 | 6570 | 5.3644 | 0.7503 | 0.8328 | 0.8268 | 0.687 | 0.7528 | 0.7741 | 0.7479 | 0.8838 | 0.8956 | 0.7622 | 0.9313 | 0.9754 | 0.8319 | 0.8887 | 0.5971 | 0.7175 | -1.0 | -1.0 | 0.7511 | 0.8231 | 0.3696 | 0.8842 | 0.3623 | 0.9325 | 0.9597 | 0.9982 | 0.933 | 0.9773 | 0.9431 | 0.9822 | 0.749 | 0.99 | 0.7222 | 0.8087 | 0.951 | 0.9891 | 0.8169 | 0.8772 | 0.8949 | 0.9306 | 0.6218 | 0.7392 |
| 5.5966 | 16.0 | 7008 | 5.3310 | 0.7507 | 0.8346 | 0.8279 | 0.6767 | 0.7694 | 0.8025 | 0.7483 | 0.8811 | 0.8923 | 0.7499 | 0.9334 | 0.977 | 0.8349 | 0.8921 | 0.5875 | 0.704 | -1.0 | -1.0 | 0.7511 | 0.8178 | 0.4005 | 0.8866 | 0.3754 | 0.9315 | 0.9489 | 0.995 | 0.929 | 0.9784 | 0.9487 | 0.9832 | 0.7675 | 0.9956 | 0.7136 | 0.799 | 0.9429 | 0.9905 | 0.8089 | 0.8668 | 0.903 | 0.936 | 0.5981 | 0.7159 |
| 5.6351 | 17.0 | 7446 | 5.4178 | 0.7425 | 0.8275 | 0.8194 | 0.6692 | 0.7577 | 0.8672 | 0.7429 | 0.874 | 0.8848 | 0.7397 | 0.9236 | 0.9722 | 0.8245 | 0.8846 | 0.5847 | 0.6853 | -1.0 | -1.0 | 0.7495 | 0.8156 | 0.3781 | 0.8625 | 0.3391 | 0.929 | 0.953 | 0.9962 | 0.9326 | 0.9754 | 0.9497 | 0.9847 | 0.7474 | 0.9914 | 0.7129 | 0.792 | 0.953 | 0.9894 | 0.8065 | 0.8674 | 0.8822 | 0.9149 | 0.5822 | 0.6988 |
| 5.5953 | 18.0 | 7884 | 5.4132 | 0.7431 | 0.8293 | 0.8213 | 0.6645 | 0.7644 | 0.854 | 0.7411 | 0.8735 | 0.8843 | 0.7377 | 0.9244 | 0.9736 | 0.8237 | 0.8832 | 0.5741 | 0.6656 | -1.0 | -1.0 | 0.7456 | 0.8154 | 0.374 | 0.8685 | 0.353 | 0.9315 | 0.9543 | 0.9966 | 0.9309 | 0.9768 | 0.9513 | 0.9847 | 0.7614 | 0.9916 | 0.6987 | 0.7805 | 0.9589 | 0.9888 | 0.8047 | 0.8596 | 0.8938 | 0.9276 | 0.5794 | 0.7096 |
| 5.5718 | 19.0 | 8322 | 5.3722 | 0.7427 | 0.8272 | 0.8194 | 0.6725 | 0.762 | 0.853 | 0.7405 | 0.8758 | 0.887 | 0.7438 | 0.9248 | 0.9745 | 0.836 | 0.8904 | 0.5708 | 0.6801 | -1.0 | -1.0 | 0.7528 | 0.8175 | 0.3645 | 0.8559 | 0.3216 | 0.9353 | 0.956 | 0.9962 | 0.9357 | 0.9766 | 0.9496 | 0.9866 | 0.7459 | 0.9934 | 0.7137 | 0.7949 | 0.9589 | 0.9919 | 0.8124 | 0.8684 | 0.8955 | 0.9267 | 0.5841 | 0.7042 |
| 5.5669 | 20.0 | 8760 | 5.3994 | 0.7432 | 0.826 | 0.8194 | 0.6733 | 0.7514 | 0.8478 | 0.7418 | 0.8747 | 0.8873 | 0.7457 | 0.9276 | 0.9725 | 0.8321 | 0.8874 | 0.5876 | 0.6948 | -1.0 | -1.0 | 0.7601 | 0.8239 | 0.3724 | 0.8567 | 0.3313 | 0.933 | 0.9551 | 0.9933 | 0.9352 | 0.9758 | 0.9499 | 0.9861 | 0.7494 | 0.9924 | 0.6994 | 0.7859 | 0.9566 | 0.9908 | 0.8019 | 0.8663 | 0.8881 | 0.926 | 0.5851 | 0.71 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
| [
"baseball-bat",
"basketball",
"car",
"football",
"human",
"luggage",
"mattress",
"motorcycle",
"skis",
"snowboard",
"soccer-ball",
"stop-sign",
"tennis-racket",
"umbrella",
"volleyball"
] |
mfly-auton/suas-2025-rtdetr-finetuned-e10-b32-lr1e-5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# suas-2025-rtdetr-finetuned-e10-b32-lr1e-5
This model is a fine-tuned version of [PekingU/rtdetr_r50vd](https://huggingface.co/PekingU/rtdetr_r50vd) on the mfly-auton/suas-2025-synthetic-data dataset.
It achieves the following results on the evaluation set:
- Loss: 5.7755
- Map: 0.8259
- Map 50: 0.9053
- Map 75: 0.8962
- Map Small: 0.7396
- Map Medium: 0.8978
- Map Large: 0.8904
- Mar 1: 0.7897
- Mar 10: 0.9079
- Mar 100: 0.911
- Mar Small: 0.8045
- Mar Medium: 0.9449
- Mar Large: 0.974
- Map Baseball-bat: 0.8535
- Mar 100 Baseball-bat: 0.9223
- Map Basketball: 0.8181
- Mar 100 Basketball: 0.9213
- Map Car: -1.0
- Mar 100 Car: -1.0
- Map Football: 0.4419
- Mar 100 Football: 0.516
- Map Human: 0.8513
- Mar 100 Human: 0.9473
- Map Luggage: 0.8768
- Mar 100 Luggage: 0.9398
- Map Mattress: 0.5281
- Mar 100 Mattress: 0.9959
- Map Motorcycle: 0.9311
- Mar 100 Motorcycle: 0.9635
- Map Skis: 0.9687
- Mar 100 Skis: 0.9975
- Map Snowboard: 0.9779
- Mar 100 Snowboard: 0.9953
- Map Soccer-ball: 0.8253
- Mar 100 Soccer-ball: 0.8679
- Map Stop-sign: 0.9582
- Mar 100 Stop-sign: 0.9838
- Map Tennis-racket: 0.8887
- Mar 100 Tennis-racket: 0.9275
- Map Umbrella: 0.8417
- Mar 100 Umbrella: 0.9128
- Map Volleyball: 0.8015
- Mar 100 Volleyball: 0.8632
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 1337
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Baseball-bat | Mar 100 Baseball-bat | Map Basketball | Mar 100 Basketball | Map Car | Mar 100 Car | Map Football | Mar 100 Football | Map Human | Mar 100 Human | Map Luggage | Mar 100 Luggage | Map Mattress | Mar 100 Mattress | Map Motorcycle | Mar 100 Motorcycle | Map Skis | Mar 100 Skis | Map Snowboard | Mar 100 Snowboard | Map Soccer-ball | Mar 100 Soccer-ball | Map Stop-sign | Mar 100 Stop-sign | Map Tennis-racket | Mar 100 Tennis-racket | Map Umbrella | Mar 100 Umbrella | Map Volleyball | Mar 100 Volleyball |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------:|:--------------------:|:--------------:|:------------------:|:-------:|:-----------:|:------------:|:----------------:|:---------:|:-------------:|:-----------:|:---------------:|:------------:|:----------------:|:--------------:|:------------------:|:--------:|:------------:|:-------------:|:-----------------:|:---------------:|:-------------------:|:-------------:|:-----------------:|:-----------------:|:---------------------:|:------------:|:----------------:|:--------------:|:------------------:|
| 31.6046 | 1.0 | 219 | 12.5434 | 0.3553 | 0.4457 | 0.3805 | 0.2717 | 0.4559 | 0.4479 | 0.5615 | 0.7432 | 0.7718 | 0.5413 | 0.873 | 0.9496 | 0.6878 | 0.9023 | 0.0337 | 0.5289 | -1.0 | -1.0 | 0.1746 | 0.4829 | 0.4718 | 0.8344 | 0.1939 | 0.8512 | 0.139 | 0.9643 | 0.5451 | 0.9288 | 0.9133 | 0.9891 | 0.4051 | 0.9817 | 0.3034 | 0.4481 | 0.3931 | 0.9455 | 0.3464 | 0.615 | 0.2395 | 0.8765 | 0.1274 | 0.4561 |
| 11.2376 | 2.0 | 438 | 8.7086 | 0.7492 | 0.8518 | 0.8382 | 0.6809 | 0.8229 | 0.7802 | 0.7521 | 0.8824 | 0.8898 | 0.7682 | 0.9257 | 0.9652 | 0.8225 | 0.9069 | 0.7335 | 0.8417 | -1.0 | -1.0 | 0.5246 | 0.6187 | 0.8272 | 0.9075 | 0.8351 | 0.91 | 0.2917 | 0.9978 | 0.8814 | 0.9449 | 0.9763 | 0.9916 | 0.9795 | 0.9963 | 0.7605 | 0.8211 | 0.9176 | 0.9746 | 0.8631 | 0.9088 | 0.4335 | 0.9109 | 0.6428 | 0.7265 |
| 7.5378 | 3.0 | 657 | 7.5895 | 0.7926 | 0.8861 | 0.874 | 0.7165 | 0.8732 | 0.8452 | 0.7772 | 0.8976 | 0.9033 | 0.7944 | 0.9368 | 0.9808 | 0.8325 | 0.9096 | 0.7675 | 0.8908 | -1.0 | -1.0 | 0.5341 | 0.6171 | 0.8536 | 0.9323 | 0.8621 | 0.9288 | 0.3232 | 0.9995 | 0.9099 | 0.9534 | 0.9589 | 0.9851 | 0.9779 | 0.9968 | 0.7949 | 0.8422 | 0.9557 | 0.9799 | 0.874 | 0.9187 | 0.7466 | 0.908 | 0.7061 | 0.7833 |
| 6.7821 | 4.0 | 876 | 6.9548 | 0.7906 | 0.8773 | 0.8662 | 0.6944 | 0.8658 | 0.8692 | 0.7707 | 0.8882 | 0.8931 | 0.763 | 0.9347 | 0.9677 | 0.8386 | 0.9153 | 0.738 | 0.8742 | -1.0 | -1.0 | 0.4264 | 0.5058 | 0.83 | 0.9254 | 0.8657 | 0.9297 | 0.4204 | 0.9995 | 0.9153 | 0.9537 | 0.955 | 0.9891 | 0.9727 | 0.9924 | 0.8047 | 0.8542 | 0.9495 | 0.9869 | 0.8817 | 0.9254 | 0.7954 | 0.8976 | 0.6753 | 0.7549 |
| 6.4517 | 5.0 | 1095 | 6.2876 | 0.8217 | 0.9052 | 0.8965 | 0.7345 | 0.8867 | 0.884 | 0.7885 | 0.9066 | 0.9107 | 0.8055 | 0.9448 | 0.9705 | 0.8384 | 0.92 | 0.7994 | 0.9156 | -1.0 | -1.0 | 0.4948 | 0.5703 | 0.8404 | 0.9363 | 0.8765 | 0.9391 | 0.5335 | 0.9946 | 0.927 | 0.9651 | 0.9504 | 0.995 | 0.9766 | 0.9947 | 0.8183 | 0.8602 | 0.946 | 0.9813 | 0.8881 | 0.9259 | 0.8387 | 0.9205 | 0.7759 | 0.8306 |
| 6.1454 | 6.0 | 1314 | 5.8688 | 0.8355 | 0.9186 | 0.9086 | 0.7416 | 0.8939 | 0.9026 | 0.7975 | 0.91 | 0.9135 | 0.8094 | 0.9477 | 0.973 | 0.8577 | 0.9237 | 0.7904 | 0.9206 | -1.0 | -1.0 | 0.4838 | 0.5638 | 0.8723 | 0.9423 | 0.8782 | 0.9378 | 0.6114 | 0.9982 | 0.9296 | 0.9638 | 0.9826 | 0.9965 | 0.9772 | 0.9953 | 0.8218 | 0.8638 | 0.9541 | 0.9866 | 0.887 | 0.9285 | 0.851 | 0.9228 | 0.7993 | 0.8458 |
| 6.0567 | 7.0 | 1533 | 5.9641 | 0.8278 | 0.9084 | 0.9005 | 0.7434 | 0.8944 | 0.89 | 0.7917 | 0.9098 | 0.9135 | 0.8091 | 0.9467 | 0.9854 | 0.8531 | 0.9224 | 0.7819 | 0.9187 | -1.0 | -1.0 | 0.4727 | 0.5437 | 0.8434 | 0.9498 | 0.8774 | 0.9406 | 0.5438 | 0.9946 | 0.9232 | 0.963 | 0.9791 | 0.998 | 0.9771 | 0.994 | 0.8241 | 0.8707 | 0.9609 | 0.9785 | 0.8909 | 0.9264 | 0.8533 | 0.9254 | 0.8084 | 0.863 |
| 5.9579 | 8.0 | 1752 | 5.7184 | 0.8339 | 0.9141 | 0.9056 | 0.7448 | 0.8939 | 0.9004 | 0.7947 | 0.9099 | 0.9136 | 0.8088 | 0.9453 | 0.9871 | 0.8529 | 0.9222 | 0.8083 | 0.9175 | -1.0 | -1.0 | 0.4795 | 0.5478 | 0.8608 | 0.9444 | 0.8795 | 0.9406 | 0.5759 | 0.998 | 0.9325 | 0.9641 | 0.9711 | 0.9975 | 0.9816 | 0.9964 | 0.8218 | 0.8661 | 0.9655 | 0.9872 | 0.8942 | 0.928 | 0.8474 | 0.9251 | 0.8034 | 0.8551 |
| 5.9419 | 9.0 | 1971 | 5.7328 | 0.8305 | 0.9091 | 0.9018 | 0.7434 | 0.8955 | 0.8956 | 0.7924 | 0.9098 | 0.9136 | 0.8101 | 0.9462 | 0.9868 | 0.858 | 0.9253 | 0.8223 | 0.9197 | -1.0 | -1.0 | 0.4625 | 0.5377 | 0.8502 | 0.9471 | 0.8804 | 0.9409 | 0.5395 | 0.9939 | 0.93 | 0.9619 | 0.9787 | 0.9975 | 0.979 | 0.9959 | 0.8296 | 0.8704 | 0.9624 | 0.988 | 0.8852 | 0.9269 | 0.8393 | 0.9164 | 0.8101 | 0.8684 |
| 5.9176 | 10.0 | 2190 | 5.7755 | 0.8259 | 0.9053 | 0.8962 | 0.7396 | 0.8978 | 0.8904 | 0.7897 | 0.9079 | 0.911 | 0.8045 | 0.9449 | 0.974 | 0.8535 | 0.9223 | 0.8181 | 0.9213 | -1.0 | -1.0 | 0.4419 | 0.516 | 0.8513 | 0.9473 | 0.8768 | 0.9398 | 0.5281 | 0.9959 | 0.9311 | 0.9635 | 0.9687 | 0.9975 | 0.9779 | 0.9953 | 0.8253 | 0.8679 | 0.9582 | 0.9838 | 0.8887 | 0.9275 | 0.8417 | 0.9128 | 0.8015 | 0.8632 |
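As a consistency check on the step counts across this series: this card logs 219 optimizer steps per epoch at batch size 32, the batch-16 runs log 438, and the batch-8 run logs 875, which all point to a training split of roughly 7,000 images. A sketch of the arithmetic, assuming the default dataloader behaviour of rounding the last partial batch up (no drop_last); the exact figure of 7,000 is an inferred value, not stated in the cards.
```python
import math

observed_steps_per_epoch = {32: 219, 16: 438, 8: 875}  # from the results tables

num_train_examples = 7000  # assumption inferred from the step counts
for batch_size, steps in observed_steps_per_epoch.items():
    assert math.ceil(num_train_examples / batch_size) == steps, (batch_size, steps)
print("step counts are consistent with ~7,000 training images")
```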
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
| [
"baseball-bat",
"basketball",
"car",
"football",
"human",
"luggage",
"mattress",
"motorcycle",
"skis",
"snowboard",
"soccer-ball",
"stop-sign",
"tennis-racket",
"umbrella",
"volleyball"
] |
mfly-auton/suas-2025-rtdetr-finetuned-e50-b16-lr1e-5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# suas-2025-rtdetr-finetuned-e50-b16-lr1e-5
This model is a fine-tuned version of [PekingU/rtdetr_r50vd](https://huggingface.co/PekingU/rtdetr_r50vd) on the mfly-auton/suas-2025-synthetic-data dataset.
It achieves the following results on the evaluation set:
- Loss: 5.1922
- Map: 0.848
- Map 50: 0.9654
- Map 75: 0.9497
- Map Small: 0.7323
- Map Medium: 0.8702
- Map Large: 0.9603
- Mar 1: 0.7848
- Mar 10: 0.8983
- Mar 100: 0.9079
- Mar Small: 0.8094
- Mar Medium: 0.9289
- Mar Large: 0.9849
- Map Baseball-bat: 0.7866
- Mar 100 Baseball-bat: 0.906
- Map Basketball: 0.8183
- Mar 100 Basketball: 0.8576
- Map Car: -1.0
- Mar 100 Car: -1.0
- Map Football: 0.623
- Mar 100 Football: 0.7297
- Map Human: 0.9109
- Mar 100 Human: 0.9522
- Map Luggage: 0.826
- Mar 100 Luggage: 0.9096
- Map Mattress: 0.9878
- Mar 100 Mattress: 0.9977
- Map Motorcycle: 0.9319
- Mar 100 Motorcycle: 0.9564
- Map Skis: 0.9199
- Mar 100 Skis: 0.9703
- Map Snowboard: 0.9883
- Mar 100 Snowboard: 0.9951
- Map Soccer-ball: 0.7476
- Mar 100 Soccer-ball: 0.8144
- Map Stop-sign: 0.9339
- Mar 100 Stop-sign: 0.9612
- Map Tennis-racket: 0.7812
- Mar 100 Tennis-racket: 0.8845
- Map Umbrella: 0.9107
- Mar 100 Umbrella: 0.9553
- Map Volleyball: 0.7061
- Mar 100 Volleyball: 0.8208
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 1337
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50.0
- mixed_precision_training: Native AMP
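The fixed `seed: 1337` shared by these runs corresponds to seeding Python, NumPy, and PyTorch together before the dataloaders are built; a minimal sketch of what that amounts to (the expansion below mirrors what `transformers.set_seed` does internally):
```python
import random
import numpy as np
import torch
from transformers import set_seed

set_seed(1337)  # seeds random, numpy, torch, and all CUDA devices in one call

# Roughly equivalent to:
random.seed(1337)
np.random.seed(1337)
torch.manual_seed(1337)
torch.cuda.manual_seed_all(1337)
```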
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Baseball-bat | Mar 100 Baseball-bat | Map Basketball | Mar 100 Basketball | Map Car | Mar 100 Car | Map Football | Mar 100 Football | Map Human | Mar 100 Human | Map Luggage | Mar 100 Luggage | Map Mattress | Mar 100 Mattress | Map Motorcycle | Mar 100 Motorcycle | Map Skis | Mar 100 Skis | Map Snowboard | Mar 100 Snowboard | Map Soccer-ball | Mar 100 Soccer-ball | Map Stop-sign | Mar 100 Stop-sign | Map Tennis-racket | Mar 100 Tennis-racket | Map Umbrella | Mar 100 Umbrella | Map Volleyball | Mar 100 Volleyball |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------:|:--------------------:|:--------------:|:------------------:|:-------:|:-----------:|:------------:|:----------------:|:---------:|:-------------:|:-----------:|:---------------:|:------------:|:----------------:|:--------------:|:------------------:|:--------:|:------------:|:-------------:|:-----------------:|:---------------:|:-------------------:|:-------------:|:-----------------:|:-----------------:|:---------------------:|:------------:|:----------------:|:--------------:|:------------------:|
| 35.8209 | 1.0 | 438 | 15.8634 | 0.0876 | 0.1207 | 0.0954 | 0.0341 | 0.0699 | 0.1699 | 0.1692 | 0.3326 | 0.351 | 0.1466 | 0.3498 | 0.5102 | 0.008 | 0.2188 | 0.0904 | 0.4045 | -1.0 | -1.0 | 0.0035 | 0.173 | 0.2131 | 0.5707 | 0.0186 | 0.3735 | 0.5333 | 0.8459 | 0.1433 | 0.563 | 0.0 | 0.0 | 0.1018 | 0.6519 | 0.0005 | 0.0609 | 0.0231 | 0.4522 | 0.0001 | 0.0642 | 0.0902 | 0.5279 | 0.0 | 0.0076 |
| 18.2866 | 2.0 | 876 | 9.8698 | 0.5172 | 0.6033 | 0.5814 | 0.4216 | 0.4913 | 0.736 | 0.5753 | 0.7757 | 0.788 | 0.579 | 0.8343 | 0.9339 | 0.1171 | 0.7353 | 0.6 | 0.7434 | -1.0 | -1.0 | 0.3336 | 0.5407 | 0.7746 | 0.8705 | 0.7531 | 0.8718 | 0.9033 | 0.9978 | 0.7636 | 0.8722 | 0.0416 | 0.9446 | 0.9059 | 0.9818 | 0.2723 | 0.491 | 0.6344 | 0.8425 | 0.0891 | 0.6202 | 0.8002 | 0.9216 | 0.2522 | 0.5985 |
| 11.1872 | 3.0 | 1314 | 9.9933 | 0.7022 | 0.7977 | 0.7831 | 0.5061 | 0.7248 | 0.7949 | 0.6809 | 0.8179 | 0.8252 | 0.622 | 0.8734 | 0.9592 | 0.5039 | 0.8168 | 0.6697 | 0.7486 | -1.0 | -1.0 | 0.4155 | 0.5517 | 0.7932 | 0.8865 | 0.8129 | 0.9082 | 0.9436 | 0.9951 | 0.8345 | 0.919 | 0.8335 | 0.9658 | 0.9638 | 0.9864 | 0.4472 | 0.5874 | 0.795 | 0.931 | 0.4851 | 0.6689 | 0.8576 | 0.9445 | 0.4748 | 0.6424 |
| 9.4801 | 4.0 | 1752 | 9.8496 | 0.7498 | 0.8473 | 0.8302 | 0.5563 | 0.7902 | 0.9327 | 0.712 | 0.8431 | 0.8502 | 0.6738 | 0.8928 | 0.9778 | 0.6015 | 0.8549 | 0.7093 | 0.7945 | -1.0 | -1.0 | 0.4907 | 0.6149 | 0.8317 | 0.9066 | 0.8325 | 0.9251 | 0.9407 | 0.9977 | 0.8679 | 0.928 | 0.9228 | 0.9802 | 0.9647 | 0.9887 | 0.5159 | 0.6224 | 0.88 | 0.9525 | 0.5506 | 0.713 | 0.8407 | 0.9433 | 0.5486 | 0.6809 |
| 8.6047 | 5.0 | 2190 | 9.9758 | 0.7961 | 0.8913 | 0.8761 | 0.6228 | 0.822 | 0.9407 | 0.7493 | 0.8766 | 0.8843 | 0.7328 | 0.9233 | 0.9855 | 0.6418 | 0.8738 | 0.7478 | 0.8289 | -1.0 | -1.0 | 0.5773 | 0.6964 | 0.8557 | 0.9283 | 0.8459 | 0.9353 | 0.9628 | 0.9994 | 0.9032 | 0.9506 | 0.9241 | 0.9866 | 0.9692 | 0.9869 | 0.6213 | 0.7185 | 0.9227 | 0.9701 | 0.6941 | 0.8145 | 0.8709 | 0.9505 | 0.6086 | 0.74 |
| 8.0733 | 6.0 | 2628 | 8.9735 | 0.8271 | 0.9212 | 0.9062 | 0.6807 | 0.8423 | 0.955 | 0.7723 | 0.8966 | 0.903 | 0.7891 | 0.9315 | 0.9876 | 0.7286 | 0.9073 | 0.7845 | 0.8616 | -1.0 | -1.0 | 0.6145 | 0.7379 | 0.8907 | 0.9397 | 0.8647 | 0.9403 | 0.9649 | 0.9984 | 0.911 | 0.9537 | 0.9595 | 0.9866 | 0.9689 | 0.9877 | 0.6932 | 0.7815 | 0.9222 | 0.9779 | 0.7176 | 0.8187 | 0.8996 | 0.9654 | 0.6592 | 0.7858 |
| 7.5839 | 7.0 | 3066 | 7.9987 | 0.8223 | 0.9186 | 0.9028 | 0.6633 | 0.8496 | 0.9498 | 0.7742 | 0.8987 | 0.9059 | 0.7739 | 0.9366 | 0.9893 | 0.7143 | 0.9003 | 0.7687 | 0.8564 | -1.0 | -1.0 | 0.6117 | 0.7344 | 0.8951 | 0.9527 | 0.8634 | 0.9432 | 0.9694 | 0.9992 | 0.9208 | 0.9569 | 0.9101 | 0.9782 | 0.9739 | 0.9943 | 0.661 | 0.7853 | 0.9409 | 0.9818 | 0.7701 | 0.8637 | 0.8874 | 0.9668 | 0.625 | 0.7696 |
| 7.1392 | 8.0 | 3504 | 7.1608 | 0.8355 | 0.9256 | 0.9125 | 0.6879 | 0.8766 | 0.9497 | 0.7827 | 0.9047 | 0.9115 | 0.8008 | 0.9413 | 0.9748 | 0.7849 | 0.9196 | 0.7918 | 0.8806 | -1.0 | -1.0 | 0.6121 | 0.745 | 0.8968 | 0.9497 | 0.8698 | 0.9442 | 0.9603 | 0.9963 | 0.9181 | 0.9547 | 0.9387 | 0.9861 | 0.9729 | 0.9893 | 0.664 | 0.7835 | 0.9507 | 0.9751 | 0.8094 | 0.8886 | 0.8947 | 0.9652 | 0.6334 | 0.7833 |
| 6.7601 | 9.0 | 3942 | 6.2561 | 0.8299 | 0.9249 | 0.9129 | 0.6819 | 0.8402 | 0.8921 | 0.774 | 0.8971 | 0.9048 | 0.7899 | 0.9314 | 0.9713 | 0.7324 | 0.917 | 0.7621 | 0.8533 | -1.0 | -1.0 | 0.6202 | 0.7347 | 0.8961 | 0.9481 | 0.8555 | 0.934 | 0.9717 | 0.9966 | 0.9125 | 0.9493 | 0.9039 | 0.9584 | 0.9783 | 0.9921 | 0.6866 | 0.7956 | 0.9331 | 0.9723 | 0.7984 | 0.8725 | 0.9043 | 0.9621 | 0.6631 | 0.7811 |
| 6.4006 | 10.0 | 4380 | 5.8412 | 0.8448 | 0.9414 | 0.9317 | 0.6998 | 0.8692 | 0.9526 | 0.7849 | 0.9029 | 0.9103 | 0.8009 | 0.9361 | 0.9735 | 0.7928 | 0.9195 | 0.7756 | 0.8569 | -1.0 | -1.0 | 0.6622 | 0.7649 | 0.8988 | 0.9478 | 0.8512 | 0.9228 | 0.9712 | 0.9956 | 0.9167 | 0.9501 | 0.9128 | 0.9713 | 0.9824 | 0.9922 | 0.6962 | 0.7799 | 0.9632 | 0.9885 | 0.806 | 0.8865 | 0.9035 | 0.9634 | 0.6953 | 0.8049 |
| 6.1178 | 11.0 | 4818 | 5.5940 | 0.845 | 0.9454 | 0.9347 | 0.71 | 0.8808 | 0.9531 | 0.7831 | 0.9049 | 0.9122 | 0.7997 | 0.9421 | 0.9756 | 0.7937 | 0.9242 | 0.7786 | 0.8569 | -1.0 | -1.0 | 0.6473 | 0.7514 | 0.9098 | 0.9538 | 0.8691 | 0.931 | 0.9736 | 0.9981 | 0.9145 | 0.9512 | 0.9076 | 0.9797 | 0.9805 | 0.9925 | 0.7181 | 0.8018 | 0.9571 | 0.9802 | 0.7853 | 0.8865 | 0.9027 | 0.9635 | 0.692 | 0.8 |
| 5.9386 | 12.0 | 5256 | 5.3809 | 0.8534 | 0.9525 | 0.9444 | 0.7349 | 0.8539 | 0.9484 | 0.7875 | 0.9079 | 0.9155 | 0.8192 | 0.9334 | 0.9716 | 0.8116 | 0.9208 | 0.8033 | 0.8765 | -1.0 | -1.0 | 0.6798 | 0.7668 | 0.9081 | 0.9556 | 0.8555 | 0.9195 | 0.9766 | 0.9977 | 0.9123 | 0.9477 | 0.8838 | 0.9708 | 0.9792 | 0.9911 | 0.7488 | 0.8257 | 0.9397 | 0.9718 | 0.8181 | 0.885 | 0.9005 | 0.9612 | 0.7304 | 0.8267 |
| 5.8369 | 13.0 | 5694 | 5.3334 | 0.853 | 0.9537 | 0.9441 | 0.7272 | 0.875 | 0.9533 | 0.7879 | 0.906 | 0.9115 | 0.806 | 0.9409 | 0.9737 | 0.8282 | 0.9229 | 0.7937 | 0.86 | -1.0 | -1.0 | 0.6648 | 0.75 | 0.9043 | 0.9555 | 0.8547 | 0.9184 | 0.9863 | 0.9983 | 0.92 | 0.9466 | 0.8883 | 0.9782 | 0.9794 | 0.9917 | 0.7238 | 0.7933 | 0.9617 | 0.9737 | 0.8238 | 0.8984 | 0.8977 | 0.9577 | 0.7156 | 0.8159 |
| 5.7218 | 14.0 | 6132 | 5.4029 | 0.8396 | 0.9499 | 0.9341 | 0.6993 | 0.8734 | 0.9435 | 0.779 | 0.8927 | 0.9009 | 0.7897 | 0.9332 | 0.965 | 0.8068 | 0.9229 | 0.7721 | 0.8365 | -1.0 | -1.0 | 0.6141 | 0.722 | 0.8993 | 0.9558 | 0.8525 | 0.9165 | 0.9811 | 0.9978 | 0.9196 | 0.9482 | 0.9045 | 0.9431 | 0.9772 | 0.99 | 0.6888 | 0.7807 | 0.9418 | 0.967 | 0.8264 | 0.887 | 0.8873 | 0.9569 | 0.6837 | 0.788 |
| 5.5868 | 15.0 | 6570 | 5.1404 | 0.8504 | 0.961 | 0.9514 | 0.7327 | 0.8765 | 0.9479 | 0.7852 | 0.8991 | 0.9073 | 0.8121 | 0.9349 | 0.9665 | 0.8254 | 0.9228 | 0.8061 | 0.8569 | -1.0 | -1.0 | 0.6472 | 0.7285 | 0.9169 | 0.9593 | 0.852 | 0.9188 | 0.978 | 0.9986 | 0.9107 | 0.946 | 0.9011 | 0.9515 | 0.978 | 0.9898 | 0.726 | 0.8134 | 0.9524 | 0.9723 | 0.7993 | 0.8829 | 0.8936 | 0.9491 | 0.7187 | 0.8125 |
| 5.5623 | 16.0 | 7008 | 5.4438 | 0.855 | 0.96 | 0.952 | 0.7403 | 0.8612 | 0.9556 | 0.7904 | 0.9026 | 0.9087 | 0.8117 | 0.9247 | 0.9737 | 0.8242 | 0.918 | 0.8126 | 0.8562 | -1.0 | -1.0 | 0.6598 | 0.7473 | 0.8932 | 0.95 | 0.8311 | 0.8984 | 0.9843 | 0.9969 | 0.9204 | 0.9477 | 0.9329 | 0.9822 | 0.9848 | 0.9953 | 0.7209 | 0.7913 | 0.95 | 0.9704 | 0.8114 | 0.8834 | 0.9004 | 0.9622 | 0.744 | 0.8225 |
| 5.4745 | 17.0 | 7446 | 5.1975 | 0.857 | 0.9606 | 0.9498 | 0.739 | 0.8749 | 0.8957 | 0.7922 | 0.9028 | 0.9088 | 0.8031 | 0.9328 | 0.9737 | 0.8173 | 0.905 | 0.8466 | 0.8839 | -1.0 | -1.0 | 0.6402 | 0.7255 | 0.9153 | 0.9556 | 0.8524 | 0.9213 | 0.9809 | 0.9976 | 0.9189 | 0.9504 | 0.9323 | 0.9723 | 0.9833 | 0.9937 | 0.735 | 0.8 | 0.9525 | 0.9782 | 0.7967 | 0.8746 | 0.9147 | 0.9653 | 0.7123 | 0.8002 |
| 5.3515 | 18.0 | 7884 | 5.2381 | 0.8521 | 0.9594 | 0.9488 | 0.7312 | 0.88 | 0.9528 | 0.7895 | 0.901 | 0.9099 | 0.8097 | 0.9362 | 0.9698 | 0.7994 | 0.914 | 0.8113 | 0.8607 | -1.0 | -1.0 | 0.6394 | 0.7319 | 0.8967 | 0.9562 | 0.8352 | 0.9114 | 0.9828 | 0.9989 | 0.9102 | 0.9441 | 0.9371 | 0.9728 | 0.9781 | 0.9912 | 0.7338 | 0.8242 | 0.9528 | 0.9693 | 0.8038 | 0.8782 | 0.9258 | 0.9679 | 0.7228 | 0.8176 |
| 5.3462 | 19.0 | 8322 | 5.3286 | 0.851 | 0.9598 | 0.9462 | 0.7365 | 0.8681 | 0.9496 | 0.7866 | 0.9 | 0.9078 | 0.8059 | 0.9304 | 0.9705 | 0.799 | 0.907 | 0.8204 | 0.8604 | -1.0 | -1.0 | 0.6415 | 0.7333 | 0.9 | 0.9558 | 0.8441 | 0.9066 | 0.9684 | 0.9965 | 0.91 | 0.9469 | 0.9073 | 0.9663 | 0.9839 | 0.9947 | 0.7415 | 0.8149 | 0.9418 | 0.9673 | 0.8066 | 0.8793 | 0.9161 | 0.9706 | 0.7334 | 0.8091 |
| 5.2452 | 20.0 | 8760 | 5.0850 | 0.857 | 0.9632 | 0.9546 | 0.7492 | 0.8794 | 0.9493 | 0.7924 | 0.9039 | 0.9097 | 0.8145 | 0.933 | 0.9806 | 0.8174 | 0.909 | 0.8249 | 0.8685 | -1.0 | -1.0 | 0.6645 | 0.7505 | 0.9029 | 0.956 | 0.8472 | 0.9138 | 0.9749 | 0.9933 | 0.9102 | 0.9395 | 0.8949 | 0.9465 | 0.9772 | 0.9903 | 0.7612 | 0.8234 | 0.947 | 0.9763 | 0.8026 | 0.8741 | 0.9255 | 0.9682 | 0.7482 | 0.8265 |
| 5.1599 | 21.0 | 9198 | 5.0505 | 0.8568 | 0.9668 | 0.9558 | 0.7365 | 0.8822 | 0.9564 | 0.7918 | 0.9023 | 0.9105 | 0.8092 | 0.9342 | 0.9733 | 0.8247 | 0.9204 | 0.8211 | 0.8649 | -1.0 | -1.0 | 0.6429 | 0.7373 | 0.914 | 0.9571 | 0.8436 | 0.9058 | 0.986 | 0.9988 | 0.9268 | 0.9521 | 0.9171 | 0.9698 | 0.9846 | 0.995 | 0.7372 | 0.8054 | 0.9478 | 0.9751 | 0.7906 | 0.8751 | 0.9252 | 0.9684 | 0.733 | 0.8218 |
| 5.0265 | 22.0 | 9636 | 4.8772 | 0.8581 | 0.9631 | 0.9547 | 0.7454 | 0.87 | 0.9563 | 0.7918 | 0.9073 | 0.9153 | 0.8257 | 0.9336 | 0.973 | 0.8196 | 0.9196 | 0.8048 | 0.8519 | -1.0 | -1.0 | 0.66 | 0.7741 | 0.9118 | 0.9569 | 0.8538 | 0.9196 | 0.9842 | 0.9983 | 0.9285 | 0.9524 | 0.9115 | 0.9708 | 0.9812 | 0.9916 | 0.7524 | 0.8267 | 0.9298 | 0.9626 | 0.8163 | 0.8845 | 0.9245 | 0.9708 | 0.7354 | 0.8348 |
| 5.0769 | 23.0 | 10074 | 5.1876 | 0.8503 | 0.9642 | 0.9532 | 0.7342 | 0.8729 | 0.9541 | 0.7829 | 0.8973 | 0.9037 | 0.8034 | 0.9289 | 0.9696 | 0.8139 | 0.9096 | 0.8207 | 0.8628 | -1.0 | -1.0 | 0.6232 | 0.7274 | 0.9103 | 0.9549 | 0.8375 | 0.9062 | 0.9795 | 0.9978 | 0.925 | 0.9507 | 0.9009 | 0.951 | 0.9882 | 0.9959 | 0.7227 | 0.7841 | 0.9521 | 0.976 | 0.7892 | 0.8596 | 0.9103 | 0.9615 | 0.7304 | 0.8145 |
| 5.067 | 24.0 | 10512 | 5.0899 | 0.8576 | 0.9671 | 0.952 | 0.7386 | 0.8771 | 0.9618 | 0.793 | 0.9039 | 0.9131 | 0.8186 | 0.9318 | 0.9759 | 0.818 | 0.9098 | 0.8425 | 0.882 | -1.0 | -1.0 | 0.6195 | 0.7314 | 0.9078 | 0.9571 | 0.8446 | 0.9191 | 0.987 | 0.9974 | 0.9256 | 0.9491 | 0.9503 | 0.9812 | 0.9759 | 0.9895 | 0.7502 | 0.8265 | 0.9583 | 0.9835 | 0.8203 | 0.8829 | 0.9017 | 0.9588 | 0.705 | 0.8152 |
| 5.0013 | 25.0 | 10950 | 4.9919 | 0.859 | 0.9682 | 0.9561 | 0.7454 | 0.8821 | 0.9619 | 0.7938 | 0.9056 | 0.9151 | 0.823 | 0.9392 | 0.9863 | 0.8247 | 0.9175 | 0.8513 | 0.8848 | -1.0 | -1.0 | 0.646 | 0.7475 | 0.9076 | 0.9545 | 0.855 | 0.9236 | 0.9822 | 0.9971 | 0.9112 | 0.946 | 0.9303 | 0.9683 | 0.9836 | 0.9912 | 0.7401 | 0.8147 | 0.9592 | 0.9832 | 0.8187 | 0.9 | 0.9109 | 0.9599 | 0.7058 | 0.8233 |
| 4.9435 | 26.0 | 11388 | 5.1065 | 0.8611 | 0.9673 | 0.9552 | 0.7485 | 0.8803 | 0.9616 | 0.7957 | 0.9072 | 0.9151 | 0.8231 | 0.9334 | 0.9765 | 0.821 | 0.9233 | 0.8611 | 0.8912 | -1.0 | -1.0 | 0.624 | 0.7335 | 0.9133 | 0.9618 | 0.8513 | 0.9212 | 0.982 | 0.9972 | 0.9217 | 0.9488 | 0.9462 | 0.9792 | 0.983 | 0.9906 | 0.7556 | 0.8165 | 0.9506 | 0.9821 | 0.8198 | 0.886 | 0.9061 | 0.9588 | 0.7195 | 0.8216 |
| 4.9063 | 27.0 | 11826 | 5.0362 | 0.8585 | 0.9699 | 0.9562 | 0.739 | 0.8799 | 0.9644 | 0.7938 | 0.9055 | 0.9147 | 0.8204 | 0.9375 | 0.9875 | 0.8074 | 0.9175 | 0.8362 | 0.8709 | -1.0 | -1.0 | 0.6333 | 0.7354 | 0.9101 | 0.9594 | 0.8425 | 0.9206 | 0.9866 | 0.9971 | 0.9228 | 0.9513 | 0.9549 | 0.9807 | 0.9849 | 0.994 | 0.7379 | 0.8231 | 0.9532 | 0.9765 | 0.824 | 0.8891 | 0.9139 | 0.9626 | 0.7116 | 0.8279 |
| 4.8789 | 28.0 | 12264 | 5.0817 | 0.8545 | 0.9657 | 0.9503 | 0.734 | 0.8798 | 0.9582 | 0.7904 | 0.9021 | 0.9109 | 0.8139 | 0.9313 | 0.9728 | 0.7879 | 0.9033 | 0.8316 | 0.8709 | -1.0 | -1.0 | 0.6431 | 0.7484 | 0.917 | 0.9573 | 0.8387 | 0.9189 | 0.9877 | 0.9989 | 0.9271 | 0.9529 | 0.9354 | 0.9713 | 0.9763 | 0.9891 | 0.7376 | 0.8085 | 0.9447 | 0.9668 | 0.804 | 0.8741 | 0.9158 | 0.9634 | 0.7161 | 0.8289 |
| 4.9181 | 29.0 | 12702 | 5.0919 | 0.8488 | 0.9673 | 0.948 | 0.7178 | 0.873 | 0.9605 | 0.7847 | 0.8949 | 0.9034 | 0.7972 | 0.9323 | 0.9847 | 0.7899 | 0.8981 | 0.8159 | 0.8578 | -1.0 | -1.0 | 0.6114 | 0.7131 | 0.9059 | 0.9516 | 0.8471 | 0.9168 | 0.9877 | 0.9977 | 0.9204 | 0.948 | 0.9492 | 0.9723 | 0.9855 | 0.9953 | 0.7195 | 0.7884 | 0.95 | 0.976 | 0.7906 | 0.8679 | 0.9115 | 0.9573 | 0.6994 | 0.8069 |
| 4.8989 | 30.0 | 13140 | 5.1016 | 0.8507 | 0.9666 | 0.9511 | 0.7255 | 0.8813 | 0.9506 | 0.7864 | 0.8986 | 0.908 | 0.8065 | 0.9344 | 0.9873 | 0.7901 | 0.9063 | 0.8097 | 0.8536 | -1.0 | -1.0 | 0.6229 | 0.7297 | 0.9173 | 0.9586 | 0.8372 | 0.9172 | 0.9879 | 0.9983 | 0.9366 | 0.9561 | 0.9276 | 0.9723 | 0.984 | 0.9937 | 0.7253 | 0.7979 | 0.9454 | 0.9701 | 0.8037 | 0.8772 | 0.9131 | 0.9588 | 0.7095 | 0.8218 |
| 4.8752 | 31.0 | 13578 | 5.1597 | 0.8528 | 0.9665 | 0.951 | 0.7308 | 0.8796 | 0.9064 | 0.7889 | 0.9006 | 0.9094 | 0.8122 | 0.9348 | 0.9864 | 0.8109 | 0.9149 | 0.8143 | 0.8571 | -1.0 | -1.0 | 0.6165 | 0.7272 | 0.912 | 0.9566 | 0.8388 | 0.9117 | 0.988 | 0.9975 | 0.9322 | 0.9537 | 0.9131 | 0.9693 | 0.9795 | 0.9885 | 0.7523 | 0.8249 | 0.9591 | 0.9791 | 0.8016 | 0.8736 | 0.9114 | 0.9608 | 0.7094 | 0.8174 |
| 4.88 | 32.0 | 14016 | 5.1928 | 0.8506 | 0.9651 | 0.9484 | 0.7205 | 0.879 | 0.9698 | 0.787 | 0.8998 | 0.9082 | 0.802 | 0.9319 | 0.9859 | 0.8161 | 0.9115 | 0.7855 | 0.8301 | -1.0 | -1.0 | 0.6275 | 0.734 | 0.9035 | 0.9571 | 0.8431 | 0.9229 | 0.9863 | 0.9981 | 0.9287 | 0.9518 | 0.9446 | 0.9762 | 0.9857 | 0.9935 | 0.7377 | 0.8159 | 0.9426 | 0.974 | 0.8024 | 0.885 | 0.9096 | 0.9579 | 0.6951 | 0.8066 |
| 4.8312 | 33.0 | 14454 | 5.1171 | 0.8519 | 0.9649 | 0.9485 | 0.7236 | 0.8715 | 0.9092 | 0.7882 | 0.9002 | 0.9094 | 0.8051 | 0.9355 | 0.9867 | 0.8064 | 0.915 | 0.8127 | 0.8545 | -1.0 | -1.0 | 0.6205 | 0.7302 | 0.916 | 0.9617 | 0.8272 | 0.9138 | 0.9875 | 0.9975 | 0.9385 | 0.964 | 0.93 | 0.9733 | 0.9806 | 0.9887 | 0.7479 | 0.8203 | 0.9445 | 0.9715 | 0.8113 | 0.886 | 0.9069 | 0.9543 | 0.6967 | 0.8005 |
| 4.8807 | 34.0 | 14892 | 5.3542 | 0.8404 | 0.9617 | 0.943 | 0.707 | 0.8752 | 0.9566 | 0.7792 | 0.8921 | 0.9002 | 0.7871 | 0.9313 | 0.9836 | 0.7855 | 0.8946 | 0.7834 | 0.827 | -1.0 | -1.0 | 0.606 | 0.7162 | 0.911 | 0.9558 | 0.8249 | 0.9062 | 0.9872 | 0.9975 | 0.9283 | 0.9561 | 0.9189 | 0.9738 | 0.9808 | 0.9888 | 0.7295 | 0.8 | 0.9428 | 0.9665 | 0.7857 | 0.8777 | 0.9002 | 0.9505 | 0.6814 | 0.7924 |
| 4.8605 | 35.0 | 15330 | 5.2119 | 0.8419 | 0.9631 | 0.9435 | 0.7118 | 0.8611 | 0.9584 | 0.7801 | 0.8937 | 0.9024 | 0.7951 | 0.9301 | 0.984 | 0.7704 | 0.899 | 0.7955 | 0.8391 | -1.0 | -1.0 | 0.6075 | 0.7112 | 0.9101 | 0.9542 | 0.8315 | 0.9119 | 0.9883 | 0.9978 | 0.927 | 0.9591 | 0.9161 | 0.9624 | 0.9831 | 0.9912 | 0.7368 | 0.8162 | 0.9422 | 0.969 | 0.7839 | 0.8715 | 0.912 | 0.957 | 0.6815 | 0.7939 |
| 4.8117 | 36.0 | 15768 | 5.1740 | 0.8494 | 0.966 | 0.9498 | 0.7291 | 0.8765 | 0.9572 | 0.7855 | 0.8996 | 0.9089 | 0.8108 | 0.9322 | 0.9845 | 0.8005 | 0.9149 | 0.805 | 0.8481 | -1.0 | -1.0 | 0.618 | 0.7256 | 0.9143 | 0.9551 | 0.8306 | 0.9109 | 0.9893 | 0.9992 | 0.9307 | 0.9599 | 0.9281 | 0.9713 | 0.9845 | 0.9938 | 0.7541 | 0.8267 | 0.9342 | 0.9603 | 0.7892 | 0.8793 | 0.9081 | 0.9577 | 0.7051 | 0.8216 |
| 4.7775 | 37.0 | 16206 | 5.1906 | 0.8508 | 0.9634 | 0.9476 | 0.7353 | 0.8685 | 0.9041 | 0.7879 | 0.9007 | 0.9088 | 0.8116 | 0.9269 | 0.9865 | 0.7837 | 0.9074 | 0.8018 | 0.8429 | -1.0 | -1.0 | 0.6357 | 0.7352 | 0.9186 | 0.958 | 0.8262 | 0.9122 | 0.9901 | 0.9993 | 0.9306 | 0.9545 | 0.9192 | 0.9683 | 0.9832 | 0.9934 | 0.7565 | 0.8262 | 0.9417 | 0.9687 | 0.7953 | 0.8777 | 0.9143 | 0.9634 | 0.7136 | 0.8162 |
| 4.7134 | 38.0 | 16644 | 5.1609 | 0.8506 | 0.9624 | 0.9466 | 0.7367 | 0.8744 | 0.9591 | 0.7873 | 0.9012 | 0.9094 | 0.8137 | 0.9309 | 0.986 | 0.7943 | 0.9079 | 0.8112 | 0.8538 | -1.0 | -1.0 | 0.6252 | 0.7275 | 0.9138 | 0.9549 | 0.8253 | 0.9112 | 0.9886 | 0.9982 | 0.9295 | 0.9547 | 0.911 | 0.9668 | 0.9846 | 0.9927 | 0.7582 | 0.8278 | 0.9478 | 0.9687 | 0.7875 | 0.8777 | 0.9159 | 0.9628 | 0.7157 | 0.8265 |
| 4.7257 | 39.0 | 17082 | 5.1242 | 0.8509 | 0.9641 | 0.9479 | 0.7398 | 0.8643 | 0.961 | 0.7861 | 0.8988 | 0.908 | 0.8145 | 0.9255 | 0.9822 | 0.7946 | 0.9008 | 0.8351 | 0.8694 | -1.0 | -1.0 | 0.6304 | 0.7366 | 0.919 | 0.9576 | 0.8292 | 0.9105 | 0.9872 | 0.9976 | 0.9327 | 0.9553 | 0.9121 | 0.9589 | 0.991 | 0.9964 | 0.7371 | 0.8134 | 0.938 | 0.9662 | 0.785 | 0.8751 | 0.9083 | 0.9523 | 0.713 | 0.8211 |
| 4.7352 | 40.0 | 17520 | 5.2189 | 0.8524 | 0.9633 | 0.949 | 0.7443 | 0.8621 | 0.9017 | 0.7895 | 0.9018 | 0.911 | 0.82 | 0.9222 | 0.9721 | 0.7816 | 0.9083 | 0.8336 | 0.8656 | -1.0 | -1.0 | 0.629 | 0.7379 | 0.9119 | 0.9545 | 0.8305 | 0.9089 | 0.9902 | 0.9993 | 0.9302 | 0.9569 | 0.9272 | 0.9619 | 0.9844 | 0.9938 | 0.7543 | 0.8301 | 0.9357 | 0.9637 | 0.7949 | 0.8917 | 0.914 | 0.959 | 0.7165 | 0.823 |
| 4.7245 | 41.0 | 17958 | 5.2015 | 0.8541 | 0.9663 | 0.9529 | 0.7408 | 0.8707 | 0.9638 | 0.7898 | 0.9023 | 0.9122 | 0.8165 | 0.9316 | 0.9863 | 0.8029 | 0.9049 | 0.8163 | 0.854 | -1.0 | -1.0 | 0.6416 | 0.748 | 0.9145 | 0.9586 | 0.8303 | 0.9136 | 0.9882 | 0.9977 | 0.9319 | 0.9564 | 0.9178 | 0.9713 | 0.9907 | 0.9956 | 0.7605 | 0.827 | 0.9392 | 0.9707 | 0.7957 | 0.8886 | 0.9085 | 0.9521 | 0.7191 | 0.8321 |
| 4.7187 | 42.0 | 18396 | 5.2757 | 0.8493 | 0.9614 | 0.9469 | 0.7365 | 0.8665 | 0.9033 | 0.7869 | 0.8994 | 0.9085 | 0.8121 | 0.9261 | 0.9846 | 0.7967 | 0.9073 | 0.8228 | 0.8555 | -1.0 | -1.0 | 0.6198 | 0.733 | 0.9133 | 0.9558 | 0.8187 | 0.9087 | 0.988 | 0.9978 | 0.933 | 0.9547 | 0.9032 | 0.9619 | 0.9842 | 0.9921 | 0.7558 | 0.8213 | 0.9339 | 0.9631 | 0.7886 | 0.8855 | 0.9112 | 0.9562 | 0.7203 | 0.826 |
| 4.7428 | 43.0 | 18834 | 5.2510 | 0.8499 | 0.964 | 0.9511 | 0.7396 | 0.8651 | 0.9026 | 0.788 | 0.9016 | 0.911 | 0.8192 | 0.9295 | 0.9837 | 0.7987 | 0.9076 | 0.8228 | 0.86 | -1.0 | -1.0 | 0.632 | 0.7421 | 0.9109 | 0.9533 | 0.8211 | 0.9086 | 0.9865 | 0.9974 | 0.9303 | 0.9536 | 0.9136 | 0.9743 | 0.984 | 0.9935 | 0.7499 | 0.8239 | 0.927 | 0.9575 | 0.7854 | 0.886 | 0.9116 | 0.9582 | 0.7244 | 0.838 |
| 4.7277 | 44.0 | 19272 | 5.2467 | 0.8471 | 0.9653 | 0.9483 | 0.7293 | 0.8562 | 0.9586 | 0.7838 | 0.8977 | 0.9062 | 0.8063 | 0.9244 | 0.9847 | 0.7899 | 0.9018 | 0.8167 | 0.8566 | -1.0 | -1.0 | 0.6188 | 0.7297 | 0.9155 | 0.958 | 0.8224 | 0.9049 | 0.9868 | 0.9972 | 0.9337 | 0.9583 | 0.9203 | 0.9649 | 0.9842 | 0.9934 | 0.7376 | 0.808 | 0.928 | 0.9628 | 0.7852 | 0.8793 | 0.9106 | 0.9544 | 0.7091 | 0.8181 |
| 4.73 | 45.0 | 19710 | 5.2441 | 0.8506 | 0.9648 | 0.9507 | 0.7359 | 0.8696 | 0.9583 | 0.7871 | 0.9003 | 0.9095 | 0.8127 | 0.933 | 0.9818 | 0.8011 | 0.9055 | 0.8126 | 0.8543 | -1.0 | -1.0 | 0.635 | 0.7429 | 0.9139 | 0.9555 | 0.8192 | 0.9112 | 0.9862 | 0.997 | 0.9262 | 0.9499 | 0.9094 | 0.9639 | 0.9839 | 0.9932 | 0.7511 | 0.8188 | 0.9321 | 0.9654 | 0.8072 | 0.8907 | 0.9146 | 0.9606 | 0.7157 | 0.8238 |
| 4.736 | 46.0 | 20148 | 5.2361 | 0.8496 | 0.9639 | 0.9489 | 0.7344 | 0.8729 | 0.9005 | 0.7859 | 0.9 | 0.9087 | 0.8107 | 0.9323 | 0.9818 | 0.7925 | 0.9095 | 0.8179 | 0.8559 | -1.0 | -1.0 | 0.6265 | 0.7349 | 0.9139 | 0.9565 | 0.8268 | 0.9143 | 0.9882 | 0.9976 | 0.929 | 0.9499 | 0.9153 | 0.9678 | 0.9878 | 0.9947 | 0.7492 | 0.817 | 0.9266 | 0.9531 | 0.7903 | 0.8881 | 0.9167 | 0.9594 | 0.7142 | 0.8235 |
| 4.728 | 47.0 | 20586 | 5.1929 | 0.8511 | 0.9652 | 0.9517 | 0.7374 | 0.8757 | 0.9602 | 0.7876 | 0.8998 | 0.9096 | 0.813 | 0.9325 | 0.9822 | 0.7964 | 0.9063 | 0.8261 | 0.8626 | -1.0 | -1.0 | 0.6323 | 0.7376 | 0.9123 | 0.9571 | 0.827 | 0.9141 | 0.9889 | 0.999 | 0.9309 | 0.9547 | 0.9144 | 0.9629 | 0.9891 | 0.9956 | 0.7397 | 0.8085 | 0.9329 | 0.962 | 0.7905 | 0.8876 | 0.917 | 0.9569 | 0.7173 | 0.8294 |
| 4.711 | 48.0 | 21024 | 5.2107 | 0.8484 | 0.9642 | 0.9488 | 0.7342 | 0.8638 | 0.9613 | 0.787 | 0.8984 | 0.9081 | 0.8132 | 0.9274 | 0.9818 | 0.7948 | 0.9064 | 0.8215 | 0.8597 | -1.0 | -1.0 | 0.6242 | 0.7349 | 0.9094 | 0.9522 | 0.8193 | 0.9089 | 0.9885 | 0.999 | 0.9285 | 0.9491 | 0.9131 | 0.9604 | 0.9886 | 0.9953 | 0.7442 | 0.8149 | 0.932 | 0.9628 | 0.7869 | 0.8876 | 0.9117 | 0.9582 | 0.7148 | 0.8233 |
| 4.7041 | 49.0 | 21462 | 5.1916 | 0.8502 | 0.9651 | 0.9509 | 0.7392 | 0.8734 | 0.9006 | 0.7877 | 0.9004 | 0.9093 | 0.8174 | 0.9323 | 0.9817 | 0.7997 | 0.9038 | 0.8277 | 0.8645 | -1.0 | -1.0 | 0.6368 | 0.7456 | 0.9105 | 0.9549 | 0.8167 | 0.9114 | 0.987 | 0.9976 | 0.9316 | 0.9506 | 0.9186 | 0.9624 | 0.9849 | 0.9937 | 0.7454 | 0.8203 | 0.9342 | 0.9573 | 0.7834 | 0.8845 | 0.9118 | 0.9548 | 0.7152 | 0.8294 |
| 4.7133 | 50.0 | 21900 | 5.1922 | 0.848 | 0.9654 | 0.9497 | 0.7323 | 0.8702 | 0.9603 | 0.7848 | 0.8983 | 0.9079 | 0.8094 | 0.9289 | 0.9849 | 0.7866 | 0.906 | 0.8183 | 0.8576 | -1.0 | -1.0 | 0.623 | 0.7297 | 0.9109 | 0.9522 | 0.826 | 0.9096 | 0.9878 | 0.9977 | 0.9319 | 0.9564 | 0.9199 | 0.9703 | 0.9883 | 0.9951 | 0.7476 | 0.8144 | 0.9339 | 0.9612 | 0.7812 | 0.8845 | 0.9107 | 0.9553 | 0.7061 | 0.8208 |
### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
| [
"baseball-bat",
"basketball",
"car",
"football",
"human",
"luggage",
"mattress",
"motorcycle",
"skis",
"snowboard",
"soccer-ball",
"stop-sign",
"tennis-racket",
"umbrella",
"volleyball"
] |
fukatani/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a minimal `TrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
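As a rough, non-authoritative sketch, the settings above correspond to a `transformers.TrainingArguments` configuration along these lines; the output directory name is an assumption, and the actual training script is not part of this card.
```python
from transformers import TrainingArguments

# Minimal sketch of the hyperparameters listed above; output_dir is assumed.
training_args = TrainingArguments(
    output_dir="detr-resnet-50_finetuned_cppe5",  # assumed, not taken from the card
    learning_rate=1e-5,              # learning_rate: 1e-05
    per_device_train_batch_size=8,   # train_batch_size: 8
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    seed=42,                         # seed: 42
    optim="adamw_torch",             # adamw_torch optimizer
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",      # lr_scheduler_type: linear
    num_train_epochs=10,             # num_epochs: 10
    fp16=True,                       # mixed_precision_training: Native AMP
)
```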
### Training results
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.20.3
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
albertklorer/detr-finetuned-layout |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
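In the absence of an official snippet, here is a minimal sketch assuming the checkpoint is a standard DETR object-detection model hosted under the repository id `albertklorer/detr-finetuned-layout` with the usual processor config; the image path is a placeholder.
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "albertklorer/detr-finetuned-layout"  # repository id from this card's title
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("page.png").convert("RGB")  # placeholder document-page image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to per-image detections in (xmin, ymin, xmax, ymax) pixels.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```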
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"text",
"title",
"table",
"figure",
"list"
] |
Mgeong/detr-resnet-50-dc5-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
It achieves the following results on the evaluation set (a sketch of how metrics of this form can be computed follows the list):
- Loss: 2.3591
- Map: 0.0106
- Map 50: 0.0209
- Map 75: 0.0096
- Map Small: 0.006
- Map Medium: 0.015
- Map Large: 0.0083
- Mar 1: 0.0303
- Mar 10: 0.0597
- Mar 100: 0.0624
- Mar Small: 0.0227
- Mar Medium: 0.0615
- Mar Large: 0.0814
- Map Shirt, blouse: 0.0
- Mar 100 Shirt, blouse: 0.0
- Map Top, t-shirt, sweatshirt: 0.024
- Mar 100 Top, t-shirt, sweatshirt: 0.3141
- Map Sweater: 0.0
- Mar 100 Sweater: 0.0
- Map Cardigan: 0.0
- Mar 100 Cardigan: 0.0
- Map Jacket: 0.0
- Mar 100 Jacket: 0.0
- Map Vest: 0.0
- Mar 100 Vest: 0.0
- Map Pants: 0.0557
- Mar 100 Pants: 0.5831
- Map Shorts: 0.0
- Mar 100 Shorts: 0.0
- Map Skirt: 0.0
- Mar 100 Skirt: 0.0
- Map Coat: 0.0
- Mar 100 Coat: 0.0
- Map Dress: 0.0957
- Mar 100 Dress: 0.7406
- Map Jumpsuit: 0.0
- Mar 100 Jumpsuit: 0.0
- Map Cape: 0.0
- Mar 100 Cape: 0.0
- Map Glasses: 0.0
- Mar 100 Glasses: 0.0
- Map Hat: 0.0
- Mar 100 Hat: 0.0
- Map Headband, head covering, hair accessory: 0.0
- Mar 100 Headband, head covering, hair accessory: 0.0
- Map Tie: 0.0
- Mar 100 Tie: 0.0
- Map Glove: 0.0
- Mar 100 Glove: 0.0
- Map Watch: 0.0
- Mar 100 Watch: 0.0
- Map Belt: 0.0
- Mar 100 Belt: 0.0
- Map Leg warmer: 0.0
- Mar 100 Leg warmer: 0.0
- Map Tights, stockings: 0.0
- Mar 100 Tights, stockings: 0.0
- Map Sock: 0.0
- Mar 100 Sock: 0.0
- Map Shoe: 0.2154
- Mar 100 Shoe: 0.5153
- Map Bag, wallet: 0.0
- Mar 100 Bag, wallet: 0.0
- Map Scarf: 0.0
- Mar 100 Scarf: 0.0
- Map Umbrella: 0.0
- Mar 100 Umbrella: 0.0
- Map Hood: 0.0
- Mar 100 Hood: 0.0
- Map Collar: 0.0
- Mar 100 Collar: 0.0
- Map Lapel: 0.0
- Mar 100 Lapel: 0.0
- Map Epaulette: 0.0
- Mar 100 Epaulette: 0.0
- Map Sleeve: 0.0683
- Mar 100 Sleeve: 0.4312
- Map Pocket: 0.0001
- Mar 100 Pocket: 0.0407
- Map Neckline: 0.0288
- Mar 100 Neckline: 0.2476
- Map Buckle: 0.0
- Mar 100 Buckle: 0.0
- Map Zipper: 0.0
- Mar 100 Zipper: 0.0
- Map Applique: 0.0
- Mar 100 Applique: 0.0
- Map Bead: 0.0
- Mar 100 Bead: 0.0
- Map Bow: 0.0
- Mar 100 Bow: 0.0
- Map Flower: 0.0
- Mar 100 Flower: 0.0
- Map Fringe: 0.0
- Mar 100 Fringe: 0.0
- Map Ribbon: 0.0
- Mar 100 Ribbon: 0.0
- Map Rivet: 0.0
- Mar 100 Rivet: 0.0
- Map Ruffle: 0.0
- Mar 100 Ruffle: 0.0
- Map Sequin: 0.0
- Mar 100 Sequin: 0.0
- Map Tassel: 0.0
- Mar 100 Tassel: 0.0
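These are COCO-style detection metrics. As an illustration only (the evaluation code itself is not included in this card), per-class values of this shape can be produced with `torchmetrics`' `MeanAveragePrecision` when per-class reporting is enabled; the boxes and the class index below are toy values, not data from this model.
```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# class_metrics=True adds map_per_class / mar_100_per_class, which is where
# per-category rows such as "Map Shoe" / "Mar 100 Shoe" come from.
metric = MeanAveragePrecision(iou_type="bbox", class_metrics=True)

# Toy prediction/target for one image (boxes in xyxy pixel coordinates).
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 50.0, 80.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([23]),  # assumed index of the "Shoe" category
}]
targets = [{
    "boxes": torch.tensor([[12.0, 11.0, 48.0, 78.0]]),
    "labels": torch.tensor([23]),
}]

metric.update(preds, targets)
scores = metric.compute()
print(scores["map"], scores["map_50"], scores["mar_100"])
print(scores["map_per_class"], scores["mar_100_per_class"])
```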
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a minimal optimizer/scheduler sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
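Note that this run is budgeted in optimizer steps (`training_steps: 10000`) rather than epochs. As a minimal sketch under the assumption of a plain PyTorch loop (the exact `Trainer` internals are not reproduced here), the optimizer, step-based linear decay, and AMP scaler look roughly like this:
```python
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(4, 4)  # stand-in for the DETR model being fine-tuned

# adamw_torch with betas=(0.9, 0.999), epsilon=1e-08 and learning_rate: 1e-05
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5, betas=(0.9, 0.999), eps=1e-8)

# lr_scheduler_type: linear over training_steps: 10000 (warmup assumed to be zero)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=10_000
)

# mixed_precision_training: Native AMP
scaler = torch.cuda.amp.GradScaler(enabled=torch.cuda.is_available())
```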
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Shirt, blouse | Mar 100 Shirt, blouse | Map Top, t-shirt, sweatshirt | Mar 100 Top, t-shirt, sweatshirt | Map Sweater | Mar 100 Sweater | Map Cardigan | Mar 100 Cardigan | Map Jacket | Mar 100 Jacket | Map Vest | Mar 100 Vest | Map Pants | Mar 100 Pants | Map Shorts | Mar 100 Shorts | Map Skirt | Mar 100 Skirt | Map Coat | Mar 100 Coat | Map Dress | Mar 100 Dress | Map Jumpsuit | Mar 100 Jumpsuit | Map Cape | Mar 100 Cape | Map Glasses | Mar 100 Glasses | Map Hat | Mar 100 Hat | Map Headband, head covering, hair accessory | Mar 100 Headband, head covering, hair accessory | Map Tie | Mar 100 Tie | Map Glove | Mar 100 Glove | Map Watch | Mar 100 Watch | Map Belt | Mar 100 Belt | Map Leg warmer | Mar 100 Leg warmer | Map Tights, stockings | Mar 100 Tights, stockings | Map Sock | Mar 100 Sock | Map Shoe | Mar 100 Shoe | Map Bag, wallet | Mar 100 Bag, wallet | Map Scarf | Mar 100 Scarf | Map Umbrella | Mar 100 Umbrella | Map Hood | Mar 100 Hood | Map Collar | Mar 100 Collar | Map Lapel | Mar 100 Lapel | Map Epaulette | Mar 100 Epaulette | Map Sleeve | Mar 100 Sleeve | Map Pocket | Mar 100 Pocket | Map Neckline | Mar 100 Neckline | Map Buckle | Mar 100 Buckle | Map Zipper | Mar 100 Zipper | Map Applique | Mar 100 Applique | Map Bead | Mar 100 Bead | Map Bow | Mar 100 Bow | Map Flower | Mar 100 Flower | Map Fringe | Mar 100 Fringe | Map Ribbon | Mar 100 Ribbon | Map Rivet | Mar 100 Rivet | Map Ruffle | Mar 100 Ruffle | Map Sequin | Mar 100 Sequin | Map Tassel | Mar 100 Tassel |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:----------------------------:|:--------------------------------:|:-----------:|:---------------:|:------------:|:----------------:|:----------:|:--------------:|:--------:|:------------:|:---------:|:-------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:-----------:|:---------------:|:-------:|:-----------:|:-------------------------------------------:|:-----------------------------------------------:|:-------:|:-----------:|:---------:|:-------------:|:---------:|:-------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:--------:|:------------:|:---------------:|:-------------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:----------:|:--------------:|:---------:|:-------------:|:-------------:|:-----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:----------:|:--------------:|:----------:|:--------------:|:------------:|:----------------:|:--------:|:------------:|:-------:|:-----------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|
| 7.6247 | 0.0044 | 50 | 6.3328 | 0.0 | 0.0001 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0014 | 0.0055 | 0.0067 | 0.002 | 0.0087 | 0.0226 | 0.0 | 0.0 | 0.001 | 0.1173 | 0.0 | 0.0238 | 0.0 | 0.0 | 0.0003 | 0.0175 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0437 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0119 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0015 | 0.0001 | 0.0458 | 0.0 | 0.0229 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0003 | 0.0174 | 0.0 | 0.0004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.3226 | 0.0088 | 100 | 5.7221 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0001 | 0.0003 | 0.0015 | 0.0047 | 0.0066 | 0.0048 | 0.0082 | 0.0161 | 0.0 | 0.0 | 0.0006 | 0.0537 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0191 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0941 | 0.0 | 0.0 | 0.0 | 0.02 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0055 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0041 | 0.0001 | 0.007 | 0.0 | 0.0188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.001 | 0.0813 | 0.0 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.7036 | 0.0132 | 150 | 5.0666 | 0.0001 | 0.0004 | 0.0 | 0.0 | 0.0002 | 0.0003 | 0.0007 | 0.0027 | 0.0059 | 0.0012 | 0.009 | 0.0125 | 0.0 | 0.0 | 0.0001 | 0.0059 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.1059 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0044 | 0.134 | 0.0 | 0.0004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 5.473 | 0.0175 | 200 | 4.6885 | 0.0002 | 0.0007 | 0.0001 | 0.0 | 0.0004 | 0.0002 | 0.0011 | 0.0032 | 0.0055 | 0.0011 | 0.0092 | 0.0133 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0023 | 0.034 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0076 | 0.1929 | 0.0003 | 0.0026 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.972 | 0.0219 | 250 | 4.1105 | 0.0004 | 0.0015 | 0.0002 | 0.0002 | 0.0007 | 0.0004 | 0.0012 | 0.0032 | 0.0061 | 0.002 | 0.0077 | 0.0142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0106 | 0.0446 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0087 | 0.2326 | 0.0002 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.4022 | 0.0263 | 300 | 3.9396 | 0.0005 | 0.002 | 0.0001 | 0.0005 | 0.0007 | 0.0002 | 0.0015 | 0.0036 | 0.0063 | 0.0026 | 0.008 | 0.0128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0193 | 0.077 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0059 | 0.2128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8062 | 0.0307 | 350 | 3.7220 | 0.0011 | 0.0035 | 0.0004 | 0.001 | 0.0015 | 0.0002 | 0.0024 | 0.0059 | 0.0094 | 0.0051 | 0.0119 | 0.0142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.04 | 0.152 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0121 | 0.2785 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0497 | 0.0351 | 400 | 3.6161 | 0.0015 | 0.0045 | 0.0005 | 0.0012 | 0.0021 | 0.0002 | 0.0029 | 0.008 | 0.0115 | 0.0062 | 0.0153 | 0.0148 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0543 | 0.2368 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0141 | 0.293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.2079 | 0.0395 | 450 | 3.5568 | 0.0018 | 0.0053 | 0.0007 | 0.0018 | 0.0026 | 0.0001 | 0.0031 | 0.0092 | 0.0127 | 0.008 | 0.0165 | 0.0139 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0728 | 0.301 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0115 | 0.2809 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.9025 | 0.0438 | 500 | 3.4765 | 0.0017 | 0.0049 | 0.0008 | 0.0015 | 0.0025 | 0.0001 | 0.0028 | 0.01 | 0.0138 | 0.01 | 0.0174 | 0.0142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0649 | 0.3296 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0142 | 0.3044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8511 | 0.0482 | 550 | 3.4431 | 0.0016 | 0.0049 | 0.0005 | 0.0015 | 0.0021 | 0.0002 | 0.003 | 0.0097 | 0.0136 | 0.0088 | 0.0181 | 0.0138 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.063 | 0.3378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0123 | 0.2887 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7037 | 0.0526 | 600 | 3.4005 | 0.002 | 0.0055 | 0.001 | 0.0017 | 0.0028 | 0.0002 | 0.0033 | 0.0109 | 0.0152 | 0.0104 | 0.0197 | 0.0153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0758 | 0.3637 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0178 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4975 | 0.0570 | 650 | 3.3737 | 0.0019 | 0.0052 | 0.001 | 0.002 | 0.0025 | 0.0003 | 0.0035 | 0.0112 | 0.0156 | 0.0094 | 0.0211 | 0.0184 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0737 | 0.3798 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0151 | 0.338 | 0.0 | 0.0 | 0.0002 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4822 | 0.0614 | 700 | 3.3744 | 0.0021 | 0.0057 | 0.0012 | 0.002 | 0.0027 | 0.0003 | 0.0036 | 0.0117 | 0.0152 | 0.0089 | 0.0206 | 0.0241 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.079 | 0.3811 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.018 | 0.3163 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7697 | 0.0658 | 750 | 3.3134 | 0.002 | 0.0056 | 0.0009 | 0.0019 | 0.0026 | 0.0003 | 0.0037 | 0.0116 | 0.0152 | 0.0096 | 0.0205 | 0.0271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0733 | 0.3605 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0162 | 0.3398 | 0.0 | 0.0 | 0.0006 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1864 | 0.0701 | 800 | 3.3496 | 0.0019 | 0.0055 | 0.0008 | 0.0014 | 0.0029 | 0.0015 | 0.0032 | 0.0114 | 0.0148 | 0.0104 | 0.0188 | 0.0301 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0662 | 0.3446 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0192 | 0.3201 | 0.0 | 0.0 | 0.0013 | 0.0145 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.219 | 0.0745 | 850 | 3.2743 | 0.0019 | 0.0051 | 0.0011 | 0.0015 | 0.0026 | 0.0004 | 0.0035 | 0.0121 | 0.0157 | 0.0098 | 0.021 | 0.0311 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0695 | 0.4049 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0178 | 0.3135 | 0.0 | 0.0 | 0.0008 | 0.0052 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6004 | 0.0789 | 900 | 3.2804 | 0.002 | 0.0052 | 0.0013 | 0.0018 | 0.0028 | 0.0005 | 0.0038 | 0.0127 | 0.0166 | 0.0126 | 0.0212 | 0.0313 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0755 | 0.4407 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0156 | 0.2931 | 0.0 | 0.0 | 0.0022 | 0.0282 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7969 | 0.0833 | 950 | 3.2312 | 0.0019 | 0.005 | 0.0011 | 0.0016 | 0.0029 | 0.0004 | 0.0035 | 0.0129 | 0.0169 | 0.0125 | 0.0217 | 0.0296 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0655 | 0.4384 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0193 | 0.3101 | 0.0 | 0.0 | 0.0029 | 0.0273 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8932 | 0.0877 | 1000 | 3.1790 | 0.0019 | 0.0053 | 0.001 | 0.0017 | 0.0029 | 0.0005 | 0.0036 | 0.0132 | 0.0176 | 0.0129 | 0.0229 | 0.0265 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0664 | 0.4546 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0178 | 0.3185 | 0.0 | 0.0 | 0.0041 | 0.036 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8747 | 0.0921 | 1050 | 3.1859 | 0.0021 | 0.0059 | 0.0012 | 0.0017 | 0.0034 | 0.0004 | 0.0043 | 0.014 | 0.0172 | 0.0134 | 0.0221 | 0.0304 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0738 | 0.4226 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0198 | 0.2996 | 0.0 | 0.0 | 0.0053 | 0.07 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7667 | 0.0964 | 1100 | 3.1599 | 0.0019 | 0.0053 | 0.0009 | 0.0016 | 0.0031 | 0.0006 | 0.0039 | 0.0141 | 0.0186 | 0.0147 | 0.0241 | 0.0298 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0612 | 0.4316 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0176 | 0.3257 | 0.0 | 0.0 | 0.007 | 0.0997 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6618 | 0.1008 | 1150 | 3.1424 | 0.0018 | 0.0053 | 0.0009 | 0.0015 | 0.0032 | 0.0004 | 0.0038 | 0.0139 | 0.0179 | 0.014 | 0.0233 | 0.0308 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0621 | 0.4129 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0157 | 0.3077 | 0.0 | 0.0 | 0.0056 | 0.1045 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.0532 | 0.1052 | 1200 | 3.0834 | 0.0022 | 0.0061 | 0.001 | 0.0019 | 0.003 | 0.0006 | 0.0044 | 0.0147 | 0.0191 | 0.0129 | 0.0258 | 0.0316 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0026 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0702 | 0.4043 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0205 | 0.3745 | 0.0 | 0.0 | 0.0062 | 0.099 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7869 | 0.1096 | 1250 | 3.0666 | 0.0023 | 0.0065 | 0.0012 | 0.0019 | 0.0032 | 0.0006 | 0.0045 | 0.0155 | 0.0195 | 0.0139 | 0.0261 | 0.0307 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0026 | 0.0016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0773 | 0.4126 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0206 | 0.3688 | 0.0 | 0.0 | 0.007 | 0.1159 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6145 | 0.1140 | 1300 | 3.0875 | 0.0022 | 0.0063 | 0.0012 | 0.0018 | 0.0033 | 0.0006 | 0.0041 | 0.0152 | 0.0197 | 0.0142 | 0.0263 | 0.0302 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0051 | 0.0045 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0732 | 0.419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0164 | 0.3562 | 0.0 | 0.0 | 0.0072 | 0.1247 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7354 | 0.1184 | 1350 | 3.0702 | 0.0024 | 0.0065 | 0.0015 | 0.0019 | 0.0037 | 0.0008 | 0.0044 | 0.015 | 0.0197 | 0.0145 | 0.0258 | 0.0273 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0069 | 0.0094 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0795 | 0.42 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0181 | 0.3717 | 0.0 | 0.0 | 0.0082 | 0.1058 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6307 | 0.1227 | 1400 | 3.0500 | 0.0024 | 0.0066 | 0.0013 | 0.0017 | 0.0039 | 0.0007 | 0.0041 | 0.0155 | 0.0201 | 0.0143 | 0.0264 | 0.0293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0069 | 0.0087 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0729 | 0.435 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0234 | 0.3676 | 0.0 | 0.0 | 0.009 | 0.1121 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1505 | 0.1271 | 1450 | 3.0474 | 0.0025 | 0.0069 | 0.0012 | 0.0017 | 0.0037 | 0.0007 | 0.0043 | 0.0153 | 0.0196 | 0.0141 | 0.0259 | 0.0295 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0087 | 0.0059 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0718 | 0.4156 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.021 | 0.3531 | 0.0 | 0.0 | 0.0115 | 0.1265 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3503 | 0.1315 | 1500 | 3.0609 | 0.0026 | 0.0071 | 0.0014 | 0.0022 | 0.0039 | 0.0006 | 0.0046 | 0.0154 | 0.0192 | 0.0135 | 0.0258 | 0.0281 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0035 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0829 | 0.4292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0231 | 0.347 | 0.0 | 0.0 | 0.0106 | 0.103 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.472 | 0.1359 | 1550 | 3.0172 | 0.0025 | 0.0066 | 0.0015 | 0.0017 | 0.004 | 0.0014 | 0.0046 | 0.0166 | 0.0212 | 0.016 | 0.0271 | 0.0299 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0151 | 0.0266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0662 | 0.4387 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0221 | 0.3755 | 0.0 | 0.0 | 0.0099 | 0.1365 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7421 | 0.1403 | 1600 | 3.0321 | 0.0024 | 0.0067 | 0.0012 | 0.0017 | 0.0036 | 0.001 | 0.0045 | 0.0156 | 0.02 | 0.0134 | 0.0263 | 0.0258 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0121 | 0.0311 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0716 | 0.419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0183 | 0.3397 | 0.0 | 0.0 | 0.0079 | 0.1282 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1353 | 0.1447 | 1650 | 3.0019 | 0.0025 | 0.0068 | 0.0014 | 0.0018 | 0.0037 | 0.0013 | 0.0054 | 0.017 | 0.022 | 0.0166 | 0.0264 | 0.0299 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0197 | 0.0671 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0645 | 0.4283 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0197 | 0.3503 | 0.0 | 0.0002 | 0.0113 | 0.1644 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5123 | 0.1490 | 1700 | 2.9785 | 0.0026 | 0.0071 | 0.0014 | 0.0017 | 0.0038 | 0.0011 | 0.0052 | 0.0175 | 0.022 | 0.0159 | 0.0277 | 0.0272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.017 | 0.0587 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.068 | 0.4203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0247 | 0.374 | 0.0 | 0.0 | 0.0104 | 0.1599 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5223 | 0.1534 | 1750 | 2.9857 | 0.0026 | 0.0069 | 0.0017 | 0.0017 | 0.0045 | 0.0013 | 0.0058 | 0.018 | 0.0227 | 0.016 | 0.0273 | 0.0344 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0197 | 0.0906 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0664 | 0.4399 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0247 | 0.3731 | 0.0 | 0.0 | 0.0105 | 0.1423 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 4.1934 | 0.1578 | 1800 | 2.9619 | 0.0031 | 0.008 | 0.0021 | 0.002 | 0.0042 | 0.0016 | 0.0062 | 0.019 | 0.0233 | 0.0167 | 0.0277 | 0.0316 | 0.0 | 0.0 | 0.0002 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0272 | 0.1098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0737 | 0.4463 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0241 | 0.3472 | 0.0 | 0.0 | 0.0169 | 0.1683 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8922 | 0.1622 | 1850 | 2.9526 | 0.0028 | 0.0076 | 0.0016 | 0.0018 | 0.0038 | 0.0017 | 0.0059 | 0.0187 | 0.0224 | 0.0159 | 0.0265 | 0.0315 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0234 | 0.1165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0649 | 0.4185 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0274 | 0.3081 | 0.0 | 0.0015 | 0.0143 | 0.1838 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3634 | 0.1666 | 1900 | 2.9238 | 0.0032 | 0.0083 | 0.002 | 0.0021 | 0.0041 | 0.0017 | 0.0073 | 0.0205 | 0.0244 | 0.0158 | 0.0282 | 0.0294 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.028 | 0.1772 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0789 | 0.4375 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0241 | 0.3398 | 0.0 | 0.0007 | 0.0144 | 0.1693 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0369 | 0.1710 | 1950 | 2.8895 | 0.0036 | 0.0091 | 0.0025 | 0.0023 | 0.0041 | 0.0022 | 0.0078 | 0.0213 | 0.0255 | 0.0164 | 0.0288 | 0.0363 | 0.0 | 0.0 | 0.001 | 0.0002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0346 | 0.1909 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0878 | 0.436 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.031 | 0.3729 | 0.0 | 0.0009 | 0.0113 | 0.1711 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4783 | 0.1753 | 2000 | 2.8851 | 0.0033 | 0.0084 | 0.0021 | 0.002 | 0.0037 | 0.0023 | 0.0081 | 0.0217 | 0.0262 | 0.0161 | 0.0284 | 0.0353 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.04 | 0.2435 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0747 | 0.4381 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0235 | 0.3479 | 0.0 | 0.0011 | 0.0123 | 0.1768 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1163 | 0.1797 | 2050 | 2.9083 | 0.0031 | 0.0081 | 0.0017 | 0.0019 | 0.0035 | 0.0023 | 0.0084 | 0.0213 | 0.0252 | 0.0164 | 0.0269 | 0.0345 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0358 | 0.2321 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0645 | 0.4248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0289 | 0.3073 | 0.0 | 0.0019 | 0.0111 | 0.1928 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1269 | 0.1841 | 2100 | 2.9097 | 0.0031 | 0.0081 | 0.0019 | 0.0018 | 0.0034 | 0.0023 | 0.0078 | 0.0203 | 0.0239 | 0.0148 | 0.0263 | 0.0278 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0399 | 0.2047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0666 | 0.4306 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0226 | 0.2821 | 0.0 | 0.0019 | 0.0111 | 0.1708 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5906 | 0.1885 | 2150 | 2.8842 | 0.0035 | 0.0092 | 0.0021 | 0.002 | 0.0039 | 0.0027 | 0.0093 | 0.0225 | 0.0262 | 0.0148 | 0.0278 | 0.0318 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.003 | 0.0092 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0442 | 0.2705 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0751 | 0.4192 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0272 | 0.317 | 0.0 | 0.0033 | 0.0135 | 0.1876 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2339 | 0.1929 | 2200 | 2.8868 | 0.0035 | 0.0086 | 0.0024 | 0.0021 | 0.0042 | 0.0023 | 0.009 | 0.0228 | 0.0263 | 0.0158 | 0.0276 | 0.0325 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0086 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0437 | 0.2618 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0797 | 0.4624 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0235 | 0.2946 | 0.0001 | 0.0061 | 0.0134 | 0.1744 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.721 | 0.1973 | 2250 | 2.8685 | 0.0033 | 0.0086 | 0.0022 | 0.0019 | 0.005 | 0.0022 | 0.0089 | 0.024 | 0.0283 | 0.0176 | 0.0286 | 0.0422 | 0.0 | 0.0 | 0.0012 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0359 | 0.2992 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0742 | 0.4749 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0271 | 0.3494 | 0.0 | 0.0035 | 0.015 | 0.1677 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.712 | 0.2016 | 2300 | 2.8713 | 0.0036 | 0.0089 | 0.0025 | 0.0016 | 0.0045 | 0.0032 | 0.0097 | 0.0235 | 0.028 | 0.0168 | 0.0271 | 0.0383 | 0.0 | 0.0 | 0.0069 | 0.0015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0172 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0516 | 0.3219 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0652 | 0.4548 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0273 | 0.321 | 0.0001 | 0.0056 | 0.0145 | 0.1663 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8365 | 0.2060 | 2350 | 2.8391 | 0.004 | 0.0096 | 0.0028 | 0.0021 | 0.0049 | 0.0025 | 0.0112 | 0.0265 | 0.0304 | 0.0169 | 0.0294 | 0.0358 | 0.0 | 0.0 | 0.0004 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0053 | 0.0551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0451 | 0.3445 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0805 | 0.4753 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0359 | 0.3398 | 0.0 | 0.005 | 0.0174 | 0.1779 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7271 | 0.2104 | 2400 | 2.8285 | 0.004 | 0.0098 | 0.0028 | 0.0023 | 0.0047 | 0.0024 | 0.0109 | 0.0267 | 0.0304 | 0.0166 | 0.0297 | 0.0415 | 0.0 | 0.0 | 0.0008 | 0.0008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0048 | 0.0551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0425 | 0.3441 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0918 | 0.4682 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0294 | 0.3369 | 0.0 | 0.0048 | 0.016 | 0.1879 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0462 | 0.2148 | 2450 | 2.8274 | 0.0039 | 0.0094 | 0.0029 | 0.0019 | 0.0053 | 0.003 | 0.011 | 0.0272 | 0.0311 | 0.0173 | 0.029 | 0.0455 | 0.0 | 0.0 | 0.0015 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0046 | 0.0592 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0543 | 0.3844 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0752 | 0.4687 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0279 | 0.3246 | 0.0 | 0.0054 | 0.0161 | 0.1854 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3516 | 0.2192 | 2500 | 2.7999 | 0.0038 | 0.0093 | 0.0027 | 0.002 | 0.0046 | 0.0024 | 0.0104 | 0.0266 | 0.0302 | 0.0165 | 0.029 | 0.037 | 0.0 | 0.0 | 0.0004 | 0.0017 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0141 | 0.0685 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0348 | 0.339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0842 | 0.474 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0257 | 0.334 | 0.0 | 0.0046 | 0.0134 | 0.1665 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.461 | 0.2236 | 2550 | 2.7866 | 0.004 | 0.0096 | 0.0028 | 0.002 | 0.0047 | 0.0029 | 0.0115 | 0.0282 | 0.0324 | 0.0164 | 0.0302 | 0.053 | 0.0 | 0.0 | 0.0002 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0062 | 0.0697 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0516 | 0.411 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0841 | 0.4616 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0272 | 0.3715 | 0.0 | 0.0061 | 0.0144 | 0.1688 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4791 | 0.2280 | 2600 | 2.7960 | 0.004 | 0.0099 | 0.0027 | 0.002 | 0.0041 | 0.0029 | 0.0115 | 0.0278 | 0.0316 | 0.0143 | 0.0288 | 0.0562 | 0.0 | 0.0 | 0.0013 | 0.0025 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0166 | 0.092 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0435 | 0.4106 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0859 | 0.4286 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0261 | 0.3515 | 0.0 | 0.0052 | 0.0123 | 0.1618 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4903 | 0.2323 | 2650 | 2.7675 | 0.0045 | 0.0103 | 0.0035 | 0.0023 | 0.0055 | 0.0028 | 0.0115 | 0.0297 | 0.034 | 0.0176 | 0.0314 | 0.0456 | 0.0 | 0.0 | 0.0003 | 0.0013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0093 | 0.0952 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0519 | 0.4148 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0972 | 0.4849 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0342 | 0.3806 | 0.0 | 0.0059 | 0.015 | 0.1835 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6516 | 0.2367 | 2700 | 2.7690 | 0.0044 | 0.0102 | 0.0033 | 0.0026 | 0.0054 | 0.0024 | 0.011 | 0.03 | 0.0342 | 0.0174 | 0.0309 | 0.054 | 0.0 | 0.0 | 0.0005 | 0.0006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0081 | 0.0844 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0398 | 0.4374 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1017 | 0.4848 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0338 | 0.3878 | 0.0 | 0.0032 | 0.0187 | 0.1738 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7815 | 0.2411 | 2750 | 2.7602 | 0.0044 | 0.0103 | 0.0033 | 0.0024 | 0.0054 | 0.0027 | 0.0112 | 0.0303 | 0.0342 | 0.0169 | 0.0306 | 0.0542 | 0.0 | 0.0 | 0.0002 | 0.0015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0073 | 0.0892 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0469 | 0.4608 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0996 | 0.4797 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0322 | 0.3638 | 0.0 | 0.0082 | 0.0159 | 0.1706 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1036 | 0.2455 | 2800 | 2.7317 | 0.0044 | 0.0101 | 0.0035 | 0.0024 | 0.0057 | 0.0029 | 0.013 | 0.0326 | 0.0366 | 0.0179 | 0.0323 | 0.0555 | 0.0 | 0.0 | 0.0005 | 0.0029 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0105 | 0.1048 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0453 | 0.5142 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0982 | 0.4948 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.033 | 0.3675 | 0.0 | 0.0045 | 0.0155 | 0.1961 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9735 | 0.2499 | 2850 | 2.7321 | 0.0046 | 0.0106 | 0.0034 | 0.0024 | 0.0058 | 0.0031 | 0.0119 | 0.0319 | 0.0363 | 0.0176 | 0.032 | 0.0565 | 0.0 | 0.0 | 0.0036 | 0.0078 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0124 | 0.0494 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0462 | 0.5577 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0924 | 0.4604 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0357 | 0.3864 | 0.0 | 0.0045 | 0.0191 | 0.2018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6733 | 0.2543 | 2900 | 2.7303 | 0.0042 | 0.0099 | 0.003 | 0.0022 | 0.0054 | 0.0028 | 0.0119 | 0.0316 | 0.0362 | 0.0171 | 0.0316 | 0.0547 | 0.0 | 0.0 | 0.0023 | 0.0027 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0065 | 0.058 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0505 | 0.5685 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.091 | 0.4715 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0266 | 0.3813 | 0.0 | 0.0056 | 0.0154 | 0.1789 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3088 | 0.2586 | 2950 | 2.7187 | 0.0045 | 0.0104 | 0.0035 | 0.0022 | 0.0056 | 0.0034 | 0.0124 | 0.0322 | 0.0371 | 0.018 | 0.0317 | 0.0544 | 0.0 | 0.0 | 0.0007 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0065 | 0.0745 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0619 | 0.5695 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0903 | 0.4889 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0307 | 0.3858 | 0.0001 | 0.0074 | 0.0161 | 0.1744 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0889 | 0.2630 | 3000 | 2.7067 | 0.005 | 0.0114 | 0.0034 | 0.0027 | 0.0074 | 0.0034 | 0.0137 | 0.0349 | 0.0395 | 0.0187 | 0.0336 | 0.0601 | 0.0 | 0.0 | 0.0018 | 0.0128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0102 | 0.1105 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0575 | 0.6039 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1069 | 0.4847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0318 | 0.3932 | 0.0 | 0.0084 | 0.0218 | 0.2037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.711 | 0.2674 | 3050 | 2.7195 | 0.0047 | 0.0107 | 0.0033 | 0.0026 | 0.0057 | 0.0033 | 0.0135 | 0.0343 | 0.0385 | 0.0175 | 0.0329 | 0.0606 | 0.0 | 0.0 | 0.0024 | 0.0139 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0123 | 0.143 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0507 | 0.5522 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1005 | 0.4708 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0308 | 0.3773 | 0.0 | 0.0082 | 0.0174 | 0.2044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.3009 | 0.2718 | 3100 | 2.7115 | 0.0045 | 0.0104 | 0.0035 | 0.0024 | 0.0056 | 0.0032 | 0.0132 | 0.0346 | 0.0393 | 0.0187 | 0.0322 | 0.0517 | 0.0 | 0.0 | 0.0092 | 0.0149 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0138 | 0.1573 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0447 | 0.5648 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0906 | 0.4767 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0306 | 0.3753 | 0.0 | 0.0067 | 0.0188 | 0.2125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8116 | 0.2762 | 3150 | 2.7435 | 0.004 | 0.0097 | 0.0028 | 0.0019 | 0.0057 | 0.0032 | 0.013 | 0.0342 | 0.0389 | 0.0169 | 0.0321 | 0.0557 | 0.0 | 0.0 | 0.0049 | 0.0356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0123 | 0.1611 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0511 | 0.588 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0725 | 0.4503 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0255 | 0.3599 | 0.0 | 0.0104 | 0.0186 | 0.1863 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0954 | 0.2806 | 3200 | 2.7445 | 0.004 | 0.0097 | 0.0029 | 0.0021 | 0.0059 | 0.0028 | 0.0121 | 0.0325 | 0.037 | 0.0167 | 0.0314 | 0.0534 | 0.0 | 0.0 | 0.0026 | 0.0118 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0128 | 0.1401 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0457 | 0.5624 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0779 | 0.4503 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0266 | 0.3494 | 0.0 | 0.0117 | 0.0181 | 0.1784 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0561 | 0.2849 | 3250 | 2.7112 | 0.0043 | 0.01 | 0.0031 | 0.0022 | 0.0063 | 0.0032 | 0.0122 | 0.0326 | 0.0375 | 0.018 | 0.0331 | 0.0521 | 0.0 | 0.0 | 0.001 | 0.0093 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.009 | 0.1182 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0581 | 0.5736 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0815 | 0.4714 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0299 | 0.365 | 0.0 | 0.0108 | 0.0193 | 0.1759 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4277 | 0.2893 | 3300 | 2.7048 | 0.0044 | 0.0102 | 0.0034 | 0.0022 | 0.0055 | 0.0033 | 0.013 | 0.0333 | 0.0378 | 0.0181 | 0.0328 | 0.0514 | 0.0 | 0.0 | 0.0008 | 0.0067 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0083 | 0.1252 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0636 | 0.5874 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0834 | 0.4812 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0288 | 0.3515 | 0.0 | 0.0112 | 0.0197 | 0.1773 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8427 | 0.2937 | 3350 | 2.7143 | 0.0048 | 0.0108 | 0.0038 | 0.0025 | 0.0058 | 0.0035 | 0.0141 | 0.0352 | 0.0396 | 0.0179 | 0.0339 | 0.0534 | 0.0 | 0.0 | 0.0027 | 0.0074 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0108 | 0.1627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0596 | 0.6002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0872 | 0.4834 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0336 | 0.3676 | 0.0 | 0.0123 | 0.0262 | 0.186 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7183 | 0.2981 | 3400 | 2.6873 | 0.0044 | 0.0102 | 0.0033 | 0.0021 | 0.0055 | 0.0032 | 0.0132 | 0.0356 | 0.0402 | 0.0185 | 0.033 | 0.0551 | 0.0 | 0.0 | 0.0026 | 0.0152 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0104 | 0.1433 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0518 | 0.6199 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0796 | 0.4688 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.034 | 0.3977 | 0.0 | 0.0104 | 0.0218 | 0.1949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4558 | 0.3025 | 3450 | 2.6649 | 0.0047 | 0.0108 | 0.0036 | 0.0026 | 0.0052 | 0.0033 | 0.0146 | 0.0369 | 0.0417 | 0.0191 | 0.035 | 0.0539 | 0.0 | 0.0 | 0.0033 | 0.0272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0107 | 0.1608 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0558 | 0.6236 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0967 | 0.4799 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0327 | 0.4058 | 0.0 | 0.0167 | 0.0162 | 0.206 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6678 | 0.3069 | 3500 | 2.6434 | 0.0051 | 0.0115 | 0.0039 | 0.0028 | 0.0071 | 0.0036 | 0.0161 | 0.039 | 0.0433 | 0.018 | 0.038 | 0.0649 | 0.0 | 0.0 | 0.0031 | 0.0227 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.019 | 0.2389 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0546 | 0.6406 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1116 | 0.4765 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0334 | 0.4003 | 0.0 | 0.0143 | 0.0134 | 0.1991 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.612 | 0.3112 | 3550 | 2.6392 | 0.0054 | 0.0125 | 0.0042 | 0.0029 | 0.0082 | 0.0038 | 0.0163 | 0.0396 | 0.0436 | 0.0183 | 0.0386 | 0.067 | 0.0 | 0.0 | 0.0033 | 0.0272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0206 | 0.251 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.06 | 0.6465 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1109 | 0.464 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0366 | 0.3908 | 0.0 | 0.0128 | 0.0181 | 0.2125 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7356 | 0.3156 | 3600 | 2.6317 | 0.0053 | 0.0123 | 0.004 | 0.0026 | 0.0092 | 0.0039 | 0.0161 | 0.0395 | 0.0435 | 0.0178 | 0.0392 | 0.0629 | 0.0 | 0.0 | 0.0038 | 0.0291 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.02 | 0.2487 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0597 | 0.6551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1035 | 0.4593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0372 | 0.379 | 0.0 | 0.0164 | 0.0183 | 0.2114 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4591 | 0.3200 | 3650 | 2.6353 | 0.0049 | 0.0111 | 0.004 | 0.0025 | 0.0097 | 0.0039 | 0.0148 | 0.0382 | 0.0429 | 0.02 | 0.0384 | 0.0605 | 0.0 | 0.0 | 0.0038 | 0.0194 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0111 | 0.1799 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0618 | 0.6604 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0965 | 0.4919 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0351 | 0.3947 | 0.0 | 0.0145 | 0.019 | 0.2121 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5455 | 0.3244 | 3700 | 2.6355 | 0.0047 | 0.0107 | 0.0036 | 0.0024 | 0.0076 | 0.004 | 0.0152 | 0.0379 | 0.0425 | 0.0191 | 0.0369 | 0.067 | 0.0 | 0.0 | 0.0054 | 0.0312 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0116 | 0.1847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0522 | 0.6392 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0965 | 0.4914 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0312 | 0.3792 | 0.0 | 0.0182 | 0.0174 | 0.2133 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3184 | 0.3288 | 3750 | 2.6308 | 0.005 | 0.0118 | 0.0034 | 0.003 | 0.0075 | 0.0035 | 0.016 | 0.0398 | 0.0436 | 0.0169 | 0.0369 | 0.0684 | 0.0 | 0.0 | 0.0036 | 0.0328 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0197 | 0.2783 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0452 | 0.6394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1073 | 0.4639 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0325 | 0.3811 | 0.0 | 0.0143 | 0.021 | 0.1942 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7731 | 0.3332 | 3800 | 2.6112 | 0.0053 | 0.0121 | 0.004 | 0.0028 | 0.0088 | 0.0039 | 0.0171 | 0.0416 | 0.0456 | 0.0182 | 0.0398 | 0.0699 | 0.0 | 0.0 | 0.0055 | 0.0455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0231 | 0.3175 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0488 | 0.6569 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1136 | 0.4773 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0344 | 0.3844 | 0.0 | 0.0162 | 0.0175 | 0.198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9583 | 0.3375 | 3850 | 2.5992 | 0.0056 | 0.0123 | 0.0045 | 0.0027 | 0.0087 | 0.005 | 0.0177 | 0.0417 | 0.0459 | 0.0194 | 0.0407 | 0.069 | 0.0 | 0.0 | 0.0064 | 0.0408 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0304 | 0.3127 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0631 | 0.6553 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1042 | 0.4894 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0361 | 0.3978 | 0.0001 | 0.0199 | 0.0179 | 0.1954 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.573 | 0.3419 | 3900 | 2.5935 | 0.0056 | 0.0121 | 0.0045 | 0.0028 | 0.0089 | 0.0044 | 0.0185 | 0.0435 | 0.0476 | 0.0182 | 0.0416 | 0.0752 | 0.0 | 0.0 | 0.0077 | 0.0716 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.025 | 0.3525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.056 | 0.6596 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1147 | 0.4926 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0378 | 0.4056 | 0.0 | 0.0152 | 0.015 | 0.1915 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5327 | 0.3463 | 3950 | 2.5889 | 0.0057 | 0.0126 | 0.0045 | 0.0027 | 0.0098 | 0.005 | 0.0197 | 0.0444 | 0.0487 | 0.0196 | 0.0445 | 0.0739 | 0.0 | 0.0 | 0.0071 | 0.0716 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0257 | 0.3761 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0689 | 0.6663 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1049 | 0.4866 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0375 | 0.4073 | 0.0001 | 0.0219 | 0.0168 | 0.2122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0574 | 0.3507 | 4000 | 2.5890 | 0.0059 | 0.013 | 0.0047 | 0.0032 | 0.0094 | 0.0043 | 0.0189 | 0.0431 | 0.0471 | 0.0184 | 0.0407 | 0.0721 | 0.0 | 0.0 | 0.005 | 0.0552 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0252 | 0.3627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0588 | 0.6419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1252 | 0.4864 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0372 | 0.4081 | 0.0 | 0.0152 | 0.0177 | 0.198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8422 | 0.3551 | 4050 | 2.5875 | 0.0055 | 0.0122 | 0.0044 | 0.003 | 0.0087 | 0.004 | 0.0191 | 0.0439 | 0.048 | 0.0184 | 0.0419 | 0.0731 | 0.0 | 0.0 | 0.0066 | 0.0762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0236 | 0.3812 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0552 | 0.6531 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1151 | 0.4732 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0362 | 0.4032 | 0.0 | 0.0141 | 0.0166 | 0.2056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3636 | 0.3595 | 4100 | 2.5911 | 0.0057 | 0.0126 | 0.0045 | 0.003 | 0.0079 | 0.0042 | 0.0198 | 0.0445 | 0.0486 | 0.0179 | 0.0406 | 0.0754 | 0.0 | 0.0 | 0.011 | 0.1095 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0274 | 0.3889 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.053 | 0.6528 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.117 | 0.4772 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0345 | 0.3974 | 0.0 | 0.0177 | 0.0173 | 0.1926 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2576 | 0.3638 | 4150 | 2.5810 | 0.0057 | 0.0128 | 0.0047 | 0.003 | 0.0082 | 0.0048 | 0.0193 | 0.0426 | 0.0471 | 0.0194 | 0.0422 | 0.0707 | 0.0 | 0.0 | 0.0064 | 0.0808 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0244 | 0.3411 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0721 | 0.6392 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.108 | 0.4726 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0341 | 0.411 | 0.0 | 0.0171 | 0.0188 | 0.2052 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0334 | 0.3682 | 4200 | 2.5793 | 0.0058 | 0.0132 | 0.0044 | 0.0029 | 0.0086 | 0.005 | 0.019 | 0.043 | 0.0472 | 0.0187 | 0.0416 | 0.0722 | 0.0 | 0.0 | 0.0055 | 0.0648 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0276 | 0.3866 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0721 | 0.636 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1039 | 0.462 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0386 | 0.3946 | 0.0 | 0.0154 | 0.0181 | 0.2106 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1475 | 0.3726 | 4250 | 2.5652 | 0.0062 | 0.0137 | 0.0049 | 0.0032 | 0.0093 | 0.0052 | 0.0208 | 0.045 | 0.0491 | 0.0195 | 0.044 | 0.0724 | 0.0 | 0.0 | 0.006 | 0.0747 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0311 | 0.4051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0745 | 0.6583 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.115 | 0.4727 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0389 | 0.4109 | 0.0 | 0.0193 | 0.0185 | 0.2178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0456 | 0.3770 | 4300 | 2.5670 | 0.0059 | 0.0134 | 0.0043 | 0.0032 | 0.0091 | 0.0045 | 0.0198 | 0.0441 | 0.0481 | 0.0188 | 0.0446 | 0.071 | 0.0 | 0.0 | 0.0038 | 0.0554 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0267 | 0.3739 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0644 | 0.6791 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1148 | 0.4621 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0394 | 0.406 | 0.0 | 0.016 | 0.0217 | 0.2209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6272 | 0.3814 | 4350 | 2.5713 | 0.0063 | 0.0135 | 0.0054 | 0.0037 | 0.0102 | 0.0045 | 0.0206 | 0.0463 | 0.0505 | 0.0205 | 0.0457 | 0.0755 | 0.0 | 0.0 | 0.0037 | 0.0564 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0271 | 0.3841 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0578 | 0.7024 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1339 | 0.4939 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0439 | 0.4329 | 0.0001 | 0.018 | 0.0222 | 0.2367 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6363 | 0.3858 | 4400 | 2.5674 | 0.0061 | 0.0133 | 0.0051 | 0.0036 | 0.0099 | 0.0049 | 0.0216 | 0.046 | 0.0504 | 0.0208 | 0.0438 | 0.0767 | 0.0 | 0.0 | 0.0046 | 0.0632 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0294 | 0.3927 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0604 | 0.6986 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1243 | 0.4937 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0409 | 0.4233 | 0.0 | 0.0125 | 0.0228 | 0.2352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.839 | 0.3901 | 4450 | 2.5555 | 0.0062 | 0.0137 | 0.0052 | 0.0037 | 0.0107 | 0.005 | 0.022 | 0.0472 | 0.0514 | 0.0208 | 0.046 | 0.0756 | 0.0 | 0.0 | 0.0055 | 0.0735 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0291 | 0.4153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0618 | 0.6961 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.4937 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0412 | 0.4362 | 0.0 | 0.0147 | 0.024 | 0.2343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.481 | 0.3945 | 4500 | 2.5526 | 0.0063 | 0.0139 | 0.0054 | 0.0035 | 0.009 | 0.0048 | 0.0211 | 0.0468 | 0.051 | 0.0194 | 0.0461 | 0.0756 | 0.0 | 0.0 | 0.0056 | 0.0768 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0291 | 0.4417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0688 | 0.6854 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.125 | 0.4797 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0407 | 0.4237 | 0.0 | 0.0182 | 0.0214 | 0.2203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5126 | 0.3989 | 4550 | 2.5478 | 0.0062 | 0.0134 | 0.0053 | 0.0033 | 0.0102 | 0.005 | 0.0215 | 0.0475 | 0.0515 | 0.0216 | 0.0459 | 0.0721 | 0.0 | 0.0 | 0.0058 | 0.0792 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0314 | 0.442 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0681 | 0.6758 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1174 | 0.4905 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0432 | 0.4269 | 0.0001 | 0.0227 | 0.0215 | 0.2298 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3571 | 0.4033 | 4600 | 2.5550 | 0.0065 | 0.0139 | 0.0055 | 0.0032 | 0.0092 | 0.0055 | 0.0221 | 0.0487 | 0.0527 | 0.0213 | 0.0467 | 0.0775 | 0.0 | 0.0 | 0.0085 | 0.1006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0307 | 0.4908 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0778 | 0.6604 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1161 | 0.4873 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0425 | 0.4269 | 0.0001 | 0.0243 | 0.0227 | 0.2335 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5163 | 0.4077 | 4650 | 2.5600 | 0.0065 | 0.014 | 0.0053 | 0.0033 | 0.0088 | 0.0055 | 0.0214 | 0.0469 | 0.0506 | 0.0201 | 0.046 | 0.0774 | 0.0 | 0.0 | 0.0071 | 0.0865 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0285 | 0.4691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0836 | 0.636 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.115 | 0.4763 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0407 | 0.4194 | 0.0001 | 0.0249 | 0.0231 | 0.2149 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7444 | 0.4121 | 4700 | 2.5559 | 0.0066 | 0.0142 | 0.0054 | 0.0037 | 0.0079 | 0.0052 | 0.0219 | 0.0481 | 0.0519 | 0.019 | 0.0456 | 0.0786 | 0.0 | 0.0 | 0.0087 | 0.1213 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0327 | 0.4898 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0777 | 0.6506 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1208 | 0.4753 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0417 | 0.4177 | 0.0001 | 0.0199 | 0.0202 | 0.2107 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1418 | 0.4164 | 4750 | 2.5343 | 0.0067 | 0.0146 | 0.0057 | 0.0035 | 0.0097 | 0.0054 | 0.0218 | 0.0485 | 0.0525 | 0.0204 | 0.047 | 0.0785 | 0.0 | 0.0 | 0.007 | 0.0981 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0345 | 0.4873 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0772 | 0.6817 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1281 | 0.4799 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0429 | 0.4312 | 0.0 | 0.0188 | 0.0197 | 0.2168 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.715 | 0.4208 | 4800 | 2.5333 | 0.0065 | 0.0141 | 0.0053 | 0.0034 | 0.0104 | 0.0054 | 0.0218 | 0.0483 | 0.0527 | 0.0216 | 0.0496 | 0.0739 | 0.0 | 0.0 | 0.0073 | 0.1133 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0336 | 0.4322 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0694 | 0.7059 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1261 | 0.4794 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0453 | 0.4393 | 0.0001 | 0.023 | 0.0182 | 0.2314 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0793 | 0.4252 | 4850 | 2.5575 | 0.0066 | 0.015 | 0.005 | 0.0034 | 0.0093 | 0.0059 | 0.0224 | 0.0487 | 0.0526 | 0.0203 | 0.0488 | 0.076 | 0.0 | 0.0 | 0.0094 | 0.1501 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0402 | 0.4618 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0721 | 0.6734 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1175 | 0.4542 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.047 | 0.4216 | 0.0001 | 0.0247 | 0.0187 | 0.2328 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.202 | 0.4296 | 4900 | 2.5455 | 0.0065 | 0.0143 | 0.0052 | 0.0037 | 0.0092 | 0.0051 | 0.0217 | 0.049 | 0.0529 | 0.0189 | 0.0466 | 0.0784 | 0.0 | 0.0 | 0.0085 | 0.1579 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0312 | 0.4736 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0619 | 0.6819 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1319 | 0.4611 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.051 | 0.4327 | 0.0001 | 0.0234 | 0.0137 | 0.2015 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3629 | 0.4340 | 4950 | 2.5292 | 0.0069 | 0.0156 | 0.0052 | 0.0039 | 0.0106 | 0.0049 | 0.0216 | 0.0498 | 0.0538 | 0.019 | 0.0481 | 0.0826 | 0.0 | 0.0 | 0.0101 | 0.1629 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0322 | 0.4901 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0629 | 0.6902 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.4516 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0473 | 0.4472 | 0.0001 | 0.021 | 0.0213 | 0.2132 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5327 | 0.4384 | 5000 | 2.5154 | 0.0075 | 0.0165 | 0.0061 | 0.0043 | 0.0108 | 0.0056 | 0.0234 | 0.0509 | 0.0549 | 0.0198 | 0.05 | 0.0819 | 0.0 | 0.0 | 0.0109 | 0.1712 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0408 | 0.5264 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0696 | 0.6894 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1562 | 0.462 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0463 | 0.4408 | 0.0001 | 0.0257 | 0.0208 | 0.2106 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.13 | 0.4427 | 5050 | 2.5230 | 0.0073 | 0.0163 | 0.0056 | 0.0043 | 0.0117 | 0.0051 | 0.0224 | 0.0498 | 0.0536 | 0.0186 | 0.0479 | 0.075 | 0.0 | 0.0 | 0.0096 | 0.1312 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0451 | 0.5271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0601 | 0.6827 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1525 | 0.4508 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0472 | 0.4399 | 0.0001 | 0.0243 | 0.0218 | 0.2077 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0245 | 0.4471 | 5100 | 2.5297 | 0.0073 | 0.0162 | 0.0057 | 0.0044 | 0.0114 | 0.0048 | 0.0221 | 0.0503 | 0.0537 | 0.0187 | 0.0473 | 0.0767 | 0.0 | 0.0 | 0.0107 | 0.1503 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0349 | 0.5414 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.053 | 0.6671 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1681 | 0.4691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0486 | 0.419 | 0.0001 | 0.0193 | 0.021 | 0.203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6361 | 0.4515 | 5150 | 2.5198 | 0.0076 | 0.0162 | 0.0063 | 0.0048 | 0.0121 | 0.0049 | 0.0229 | 0.0515 | 0.0551 | 0.0198 | 0.0499 | 0.0763 | 0.0 | 0.0 | 0.0096 | 0.164 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0391 | 0.5535 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.056 | 0.685 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1729 | 0.4771 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0506 | 0.4245 | 0.0001 | 0.0212 | 0.0213 | 0.2109 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3765 | 0.4559 | 5200 | 2.5120 | 0.0075 | 0.0166 | 0.0057 | 0.0047 | 0.0109 | 0.005 | 0.0228 | 0.0492 | 0.0524 | 0.0193 | 0.0452 | 0.0774 | 0.0 | 0.0 | 0.0068 | 0.1143 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0419 | 0.5166 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0586 | 0.6758 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1738 | 0.4566 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.047 | 0.4123 | 0.0001 | 0.0268 | 0.0173 | 0.2095 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8775 | 0.4603 | 5250 | 2.5083 | 0.0078 | 0.0169 | 0.0063 | 0.0048 | 0.0118 | 0.0053 | 0.0235 | 0.0503 | 0.0536 | 0.0203 | 0.0489 | 0.0781 | 0.0 | 0.0 | 0.0085 | 0.1347 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0422 | 0.5236 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0642 | 0.6951 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1783 | 0.4646 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0463 | 0.4095 | 0.0001 | 0.0279 | 0.0202 | 0.2122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4782 | 0.4647 | 5300 | 2.4925 | 0.0082 | 0.0175 | 0.0069 | 0.0048 | 0.0112 | 0.0063 | 0.0249 | 0.0507 | 0.0543 | 0.0211 | 0.0495 | 0.0797 | 0.0 | 0.0 | 0.011 | 0.1284 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0442 | 0.5041 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0759 | 0.7069 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1752 | 0.4688 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0488 | 0.422 | 0.0001 | 0.034 | 0.0223 | 0.2319 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6859 | 0.4691 | 5350 | 2.4876 | 0.0084 | 0.0175 | 0.0073 | 0.0051 | 0.0116 | 0.0061 | 0.0242 | 0.0513 | 0.0547 | 0.0213 | 0.0484 | 0.0821 | 0.0 | 0.0 | 0.0096 | 0.1377 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0431 | 0.5118 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0765 | 0.7093 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1842 | 0.4778 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0508 | 0.4239 | 0.0001 | 0.0275 | 0.0215 | 0.2293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8552 | 0.4734 | 5400 | 2.4936 | 0.0081 | 0.0173 | 0.0067 | 0.0051 | 0.0117 | 0.0058 | 0.0248 | 0.052 | 0.0552 | 0.0211 | 0.049 | 0.0839 | 0.0 | 0.0 | 0.0096 | 0.1491 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0435 | 0.5312 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.066 | 0.7098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1778 | 0.4756 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.051 | 0.4156 | 0.0001 | 0.0286 | 0.0243 | 0.2302 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3906 | 0.4778 | 5450 | 2.4950 | 0.008 | 0.0168 | 0.0068 | 0.0049 | 0.0123 | 0.0057 | 0.025 | 0.0518 | 0.0552 | 0.0207 | 0.0503 | 0.0826 | 0.0 | 0.0 | 0.011 | 0.1688 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0421 | 0.5131 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0659 | 0.7096 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1768 | 0.4836 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0491 | 0.4151 | 0.0001 | 0.0221 | 0.0224 | 0.2254 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1611 | 0.4822 | 5500 | 2.4878 | 0.0078 | 0.0163 | 0.0065 | 0.0043 | 0.012 | 0.0059 | 0.0246 | 0.0525 | 0.0562 | 0.0206 | 0.0519 | 0.0769 | 0.0 | 0.0 | 0.0127 | 0.1979 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0429 | 0.5178 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0717 | 0.7232 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1587 | 0.4752 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0493 | 0.4266 | 0.0001 | 0.0221 | 0.0217 | 0.2201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2529 | 0.4866 | 5550 | 2.4851 | 0.0079 | 0.0161 | 0.0069 | 0.0043 | 0.0118 | 0.0062 | 0.0247 | 0.0511 | 0.0548 | 0.0201 | 0.0506 | 0.0713 | 0.0 | 0.0 | 0.0122 | 0.1598 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0445 | 0.5073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0769 | 0.7246 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1634 | 0.4775 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0494 | 0.428 | 0.0001 | 0.026 | 0.0162 | 0.1967 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2671 | 0.4910 | 5600 | 2.4786 | 0.0081 | 0.0166 | 0.0071 | 0.004 | 0.012 | 0.0068 | 0.0248 | 0.0522 | 0.0559 | 0.0217 | 0.0511 | 0.0675 | 0.0 | 0.0 | 0.0161 | 0.188 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0417 | 0.4959 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0893 | 0.7287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1547 | 0.474 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0481 | 0.4196 | 0.0001 | 0.0294 | 0.0238 | 0.2341 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1639 | 0.4954 | 5650 | 2.4929 | 0.0078 | 0.0172 | 0.0061 | 0.0041 | 0.0111 | 0.0063 | 0.0243 | 0.0502 | 0.0538 | 0.0197 | 0.0502 | 0.0652 | 0.0 | 0.0 | 0.0126 | 0.1863 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0403 | 0.4745 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0861 | 0.7026 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1483 | 0.4455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0487 | 0.4125 | 0.0001 | 0.0316 | 0.0229 | 0.2216 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0184 | 0.4997 | 5700 | 2.4787 | 0.0081 | 0.0173 | 0.0067 | 0.0041 | 0.011 | 0.0066 | 0.0251 | 0.0517 | 0.0553 | 0.0208 | 0.0513 | 0.0683 | 0.0 | 0.0 | 0.0141 | 0.2029 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0408 | 0.4889 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0854 | 0.7073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1559 | 0.463 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.053 | 0.4178 | 0.0001 | 0.0348 | 0.0211 | 0.2301 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5792 | 0.5041 | 5750 | 2.4740 | 0.0084 | 0.0181 | 0.0069 | 0.0042 | 0.0114 | 0.0069 | 0.0253 | 0.053 | 0.0566 | 0.0203 | 0.0526 | 0.0791 | 0.0 | 0.0 | 0.0161 | 0.2389 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0412 | 0.501 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.094 | 0.7089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1633 | 0.4629 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.052 | 0.4288 | 0.0001 | 0.0338 | 0.0216 | 0.228 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1179 | 0.5085 | 5800 | 2.4682 | 0.0083 | 0.0182 | 0.0063 | 0.0046 | 0.0122 | 0.0062 | 0.0247 | 0.0533 | 0.0567 | 0.0198 | 0.0502 | 0.0716 | 0.0 | 0.0 | 0.0156 | 0.2364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0406 | 0.5143 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0801 | 0.7118 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1706 | 0.4616 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.052 | 0.4305 | 0.0001 | 0.0286 | 0.0221 | 0.223 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4085 | 0.5129 | 5850 | 2.4737 | 0.0086 | 0.0181 | 0.0075 | 0.0048 | 0.0126 | 0.0065 | 0.0258 | 0.0547 | 0.0583 | 0.0207 | 0.0524 | 0.0815 | 0.0 | 0.0 | 0.0173 | 0.2383 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.046 | 0.542 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0774 | 0.7222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1776 | 0.4858 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0526 | 0.4395 | 0.0001 | 0.0255 | 0.0235 | 0.2262 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4216 | 0.5173 | 5900 | 2.4760 | 0.0091 | 0.0187 | 0.0079 | 0.005 | 0.012 | 0.007 | 0.0264 | 0.0552 | 0.0583 | 0.0206 | 0.052 | 0.089 | 0.0 | 0.0 | 0.0172 | 0.2272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0618 | 0.5682 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0805 | 0.7171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1835 | 0.4871 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0505 | 0.4236 | 0.0001 | 0.0255 | 0.0238 | 0.2353 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9483 | 0.5217 | 5950 | 2.4605 | 0.0091 | 0.0185 | 0.008 | 0.005 | 0.0125 | 0.007 | 0.0263 | 0.055 | 0.0583 | 0.0208 | 0.0511 | 0.0895 | 0.0 | 0.0 | 0.0175 | 0.2434 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0521 | 0.5605 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.086 | 0.7037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1852 | 0.4943 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0542 | 0.4248 | 0.0001 | 0.0251 | 0.0224 | 0.2307 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.7416 | 0.5260 | 6000 | 2.4768 | 0.0087 | 0.0184 | 0.0073 | 0.0049 | 0.0115 | 0.0065 | 0.026 | 0.0549 | 0.0581 | 0.0196 | 0.0519 | 0.0869 | 0.0 | 0.0 | 0.0179 | 0.2604 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0446 | 0.5653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0805 | 0.7022 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1816 | 0.4871 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0507 | 0.4182 | 0.0001 | 0.0216 | 0.0226 | 0.2194 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4621 | 0.5304 | 6050 | 2.4653 | 0.0087 | 0.0186 | 0.0069 | 0.0049 | 0.0121 | 0.0068 | 0.0253 | 0.0535 | 0.0566 | 0.0199 | 0.0504 | 0.0863 | 0.0 | 0.0 | 0.0205 | 0.2381 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0449 | 0.5401 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0792 | 0.6886 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1816 | 0.4816 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0506 | 0.4036 | 0.0002 | 0.0255 | 0.0232 | 0.2276 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.8625 | 0.5348 | 6100 | 2.4758 | 0.0088 | 0.0186 | 0.0074 | 0.0051 | 0.0129 | 0.0064 | 0.0251 | 0.0536 | 0.0566 | 0.0198 | 0.0501 | 0.0879 | 0.0 | 0.0 | 0.0195 | 0.2404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0414 | 0.5217 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0757 | 0.6988 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.193 | 0.4874 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0508 | 0.4043 | 0.0001 | 0.0208 | 0.0242 | 0.2314 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4473 | 0.5392 | 6150 | 2.4760 | 0.0083 | 0.0183 | 0.0063 | 0.0048 | 0.012 | 0.0063 | 0.0247 | 0.0527 | 0.0557 | 0.0186 | 0.0499 | 0.0848 | 0.0 | 0.0 | 0.0177 | 0.2442 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0463 | 0.5309 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0706 | 0.6817 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.178 | 0.4694 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0489 | 0.3955 | 0.0002 | 0.0251 | 0.0218 | 0.216 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2176 | 0.5436 | 6200 | 2.4557 | 0.0088 | 0.0185 | 0.0074 | 0.005 | 0.013 | 0.0066 | 0.0254 | 0.0541 | 0.0574 | 0.02 | 0.0502 | 0.0872 | 0.0 | 0.0 | 0.0205 | 0.248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0427 | 0.5376 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0771 | 0.703 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1883 | 0.4852 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0519 | 0.4147 | 0.0001 | 0.0243 | 0.0237 | 0.2264 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2992 | 0.5480 | 6250 | 2.4544 | 0.0083 | 0.0178 | 0.0066 | 0.0046 | 0.0118 | 0.0063 | 0.0247 | 0.0536 | 0.057 | 0.02 | 0.0502 | 0.0771 | 0.0 | 0.0 | 0.0205 | 0.2448 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0425 | 0.5392 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0741 | 0.7006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.171 | 0.4762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0493 | 0.4131 | 0.0001 | 0.027 | 0.0225 | 0.2203 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0539 | 0.5523 | 6300 | 2.4463 | 0.0086 | 0.018 | 0.0073 | 0.0048 | 0.0123 | 0.0065 | 0.025 | 0.054 | 0.0574 | 0.0206 | 0.0515 | 0.0765 | 0.0 | 0.0 | 0.0258 | 0.2404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0439 | 0.5404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0752 | 0.7012 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1815 | 0.4844 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0469 | 0.4098 | 0.0001 | 0.0312 | 0.0218 | 0.2312 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3136 | 0.5567 | 6350 | 2.4534 | 0.0086 | 0.0184 | 0.0073 | 0.0052 | 0.0135 | 0.0063 | 0.0253 | 0.0537 | 0.0571 | 0.0196 | 0.0501 | 0.0771 | 0.0 | 0.0 | 0.0184 | 0.2389 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0445 | 0.5554 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0701 | 0.6957 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1922 | 0.4878 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0479 | 0.4084 | 0.0001 | 0.0314 | 0.0232 | 0.2076 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.488 | 0.5611 | 6400 | 2.4559 | 0.0086 | 0.018 | 0.0075 | 0.0049 | 0.0134 | 0.0062 | 0.0247 | 0.0541 | 0.0577 | 0.0204 | 0.0512 | 0.0803 | 0.0 | 0.0 | 0.0274 | 0.2455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.04 | 0.5481 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0721 | 0.713 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1884 | 0.4841 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0472 | 0.4214 | 0.0001 | 0.0299 | 0.0211 | 0.212 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0718 | 0.5655 | 6450 | 2.4648 | 0.0083 | 0.0179 | 0.0066 | 0.0047 | 0.0133 | 0.0062 | 0.0246 | 0.0531 | 0.0562 | 0.0184 | 0.049 | 0.0773 | 0.0 | 0.0 | 0.0228 | 0.2528 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0387 | 0.5522 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0668 | 0.6856 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1798 | 0.4641 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0479 | 0.4003 | 0.0001 | 0.0301 | 0.0248 | 0.201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5001 | 0.5699 | 6500 | 2.4589 | 0.0083 | 0.018 | 0.0067 | 0.0047 | 0.0127 | 0.0063 | 0.0246 | 0.0531 | 0.0562 | 0.0189 | 0.0501 | 0.0741 | 0.0 | 0.0 | 0.023 | 0.2589 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0402 | 0.5481 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0659 | 0.689 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1821 | 0.4627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0502 | 0.3939 | 0.0001 | 0.0294 | 0.0204 | 0.2033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1015 | 0.5743 | 6550 | 2.4510 | 0.0085 | 0.018 | 0.007 | 0.0048 | 0.0131 | 0.0069 | 0.0251 | 0.0531 | 0.0563 | 0.0192 | 0.0513 | 0.0756 | 0.0 | 0.0 | 0.0233 | 0.2476 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0395 | 0.5408 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0731 | 0.6894 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1814 | 0.4765 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0503 | 0.3953 | 0.0001 | 0.0348 | 0.0217 | 0.205 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8605 | 0.5786 | 6600 | 2.4448 | 0.0086 | 0.0183 | 0.0072 | 0.005 | 0.0131 | 0.0067 | 0.0254 | 0.0535 | 0.0567 | 0.0201 | 0.0522 | 0.0765 | 0.0 | 0.0 | 0.0206 | 0.2318 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0415 | 0.5245 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.074 | 0.7033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1822 | 0.4872 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0528 | 0.4081 | 0.0001 | 0.0318 | 0.0244 | 0.2202 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3379 | 0.5830 | 6650 | 2.4322 | 0.0088 | 0.0186 | 0.0074 | 0.0051 | 0.0134 | 0.0072 | 0.026 | 0.054 | 0.0572 | 0.02 | 0.0529 | 0.0829 | 0.0 | 0.0 | 0.019 | 0.2171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0433 | 0.565 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0773 | 0.7037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1847 | 0.484 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0557 | 0.4108 | 0.0001 | 0.0374 | 0.0268 | 0.2127 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3859 | 0.5874 | 6700 | 2.4333 | 0.0085 | 0.0183 | 0.0067 | 0.0049 | 0.013 | 0.0069 | 0.0258 | 0.0543 | 0.057 | 0.0197 | 0.0517 | 0.0791 | 0.0 | 0.0 | 0.0172 | 0.2173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0451 | 0.5653 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0733 | 0.7079 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1768 | 0.4665 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0523 | 0.3999 | 0.0001 | 0.0344 | 0.0256 | 0.2327 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8406 | 0.5918 | 6750 | 2.4372 | 0.0086 | 0.0182 | 0.0069 | 0.0051 | 0.013 | 0.0067 | 0.0256 | 0.0546 | 0.0574 | 0.0197 | 0.0521 | 0.0732 | 0.0 | 0.0 | 0.017 | 0.2272 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0434 | 0.5704 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0724 | 0.7114 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1844 | 0.4767 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.051 | 0.3959 | 0.0001 | 0.0349 | 0.0255 | 0.2262 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8826 | 0.5962 | 6800 | 2.4256 | 0.0088 | 0.0184 | 0.0073 | 0.0054 | 0.0142 | 0.0068 | 0.0262 | 0.0549 | 0.0578 | 0.0204 | 0.0536 | 0.0791 | 0.0 | 0.0 | 0.0166 | 0.2145 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.046 | 0.5615 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0728 | 0.7274 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1926 | 0.4803 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0504 | 0.4125 | 0.0001 | 0.0349 | 0.0268 | 0.2297 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.545 | 0.6006 | 6850 | 2.4296 | 0.0088 | 0.0182 | 0.0075 | 0.0051 | 0.0142 | 0.0069 | 0.0261 | 0.0549 | 0.058 | 0.0212 | 0.0545 | 0.0785 | 0.0 | 0.0 | 0.0171 | 0.2215 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0477 | 0.5548 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0737 | 0.7201 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1868 | 0.4847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0545 | 0.4143 | 0.0001 | 0.0351 | 0.0253 | 0.236 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4243 | 0.6049 | 6900 | 2.4309 | 0.0087 | 0.0183 | 0.0072 | 0.0049 | 0.0134 | 0.0072 | 0.026 | 0.055 | 0.0581 | 0.0208 | 0.0553 | 0.0795 | 0.0 | 0.0 | 0.0165 | 0.2286 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0457 | 0.5573 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0792 | 0.7276 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1824 | 0.4784 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0517 | 0.411 | 0.0001 | 0.0346 | 0.0245 | 0.237 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5666 | 0.6093 | 6950 | 2.4256 | 0.0088 | 0.0185 | 0.0073 | 0.0049 | 0.0136 | 0.0072 | 0.0257 | 0.0543 | 0.0573 | 0.0212 | 0.0549 | 0.0783 | 0.0 | 0.0 | 0.0147 | 0.2101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0494 | 0.5551 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0795 | 0.7132 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1849 | 0.4863 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0504 | 0.4031 | 0.0001 | 0.0322 | 0.0256 | 0.2378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1566 | 0.6137 | 7000 | 2.4189 | 0.0092 | 0.0189 | 0.0079 | 0.0052 | 0.0139 | 0.0072 | 0.0263 | 0.0552 | 0.0582 | 0.0216 | 0.0562 | 0.0784 | 0.0 | 0.0 | 0.0155 | 0.2152 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0501 | 0.5611 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0832 | 0.7213 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1936 | 0.4863 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0565 | 0.4149 | 0.0001 | 0.034 | 0.0256 | 0.2436 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1425 | 0.6181 | 7050 | 2.4154 | 0.0092 | 0.0192 | 0.0077 | 0.0052 | 0.0139 | 0.0071 | 0.0264 | 0.0552 | 0.0582 | 0.0212 | 0.0567 | 0.0771 | 0.0 | 0.0 | 0.0146 | 0.2149 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0521 | 0.5646 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.08 | 0.7211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1964 | 0.4796 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0556 | 0.4235 | 0.0001 | 0.0305 | 0.0252 | 0.2424 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4646 | 0.6225 | 7100 | 2.4164 | 0.0093 | 0.0194 | 0.0078 | 0.0054 | 0.0136 | 0.0071 | 0.0268 | 0.0554 | 0.0586 | 0.0209 | 0.0564 | 0.081 | 0.0 | 0.0 | 0.0158 | 0.2215 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0505 | 0.5573 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0791 | 0.727 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2003 | 0.4798 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0564 | 0.4357 | 0.0001 | 0.0327 | 0.0264 | 0.2433 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3303 | 0.6269 | 7150 | 2.4104 | 0.0095 | 0.0196 | 0.0081 | 0.0054 | 0.0135 | 0.0074 | 0.0274 | 0.056 | 0.0593 | 0.0217 | 0.0576 | 0.0776 | 0.0 | 0.0 | 0.0178 | 0.2337 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0512 | 0.5634 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0824 | 0.7287 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2022 | 0.4886 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0579 | 0.4328 | 0.0001 | 0.0349 | 0.0266 | 0.2439 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4143 | 0.6312 | 7200 | 2.4092 | 0.0095 | 0.0196 | 0.0081 | 0.0053 | 0.0139 | 0.0076 | 0.027 | 0.0549 | 0.0578 | 0.021 | 0.0563 | 0.0773 | 0.0 | 0.0 | 0.0165 | 0.2135 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0517 | 0.5627 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0823 | 0.71 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2006 | 0.4838 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0586 | 0.4223 | 0.0001 | 0.0338 | 0.0272 | 0.2332 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4701 | 0.6356 | 7250 | 2.4008 | 0.0095 | 0.0195 | 0.0083 | 0.0058 | 0.0144 | 0.0071 | 0.0268 | 0.0559 | 0.0591 | 0.0219 | 0.058 | 0.0791 | 0.0 | 0.0 | 0.0176 | 0.2381 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0484 | 0.5459 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0756 | 0.7333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2097 | 0.4923 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0596 | 0.4281 | 0.0001 | 0.0379 | 0.0281 | 0.2413 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4716 | 0.6400 | 7300 | 2.4012 | 0.0097 | 0.0195 | 0.0083 | 0.0058 | 0.0133 | 0.0072 | 0.0275 | 0.0561 | 0.0591 | 0.022 | 0.0575 | 0.0804 | 0.0 | 0.0 | 0.0173 | 0.2331 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.054 | 0.5455 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0711 | 0.7394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2112 | 0.4845 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0637 | 0.4331 | 0.0001 | 0.0368 | 0.0285 | 0.2465 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7446 | 0.6444 | 7350 | 2.4025 | 0.0096 | 0.0194 | 0.0082 | 0.0058 | 0.0145 | 0.0069 | 0.0271 | 0.0564 | 0.0595 | 0.0214 | 0.0584 | 0.0817 | 0.0 | 0.0 | 0.0167 | 0.2404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0496 | 0.5624 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0718 | 0.7378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2119 | 0.4873 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0629 | 0.4292 | 0.0001 | 0.04 | 0.028 | 0.2415 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.938 | 0.6488 | 7400 | 2.4081 | 0.0096 | 0.0197 | 0.0079 | 0.0054 | 0.0136 | 0.0072 | 0.027 | 0.0563 | 0.0592 | 0.0205 | 0.0572 | 0.0812 | 0.0 | 0.0 | 0.0189 | 0.2606 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.058 | 0.5662 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0719 | 0.7315 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2043 | 0.4749 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0611 | 0.4198 | 0.0001 | 0.0353 | 0.0263 | 0.236 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.5611 | 0.6532 | 7450 | 2.3955 | 0.01 | 0.0201 | 0.0086 | 0.0058 | 0.014 | 0.0075 | 0.0277 | 0.0574 | 0.0604 | 0.022 | 0.0575 | 0.0815 | 0.0 | 0.0 | 0.019 | 0.2465 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0603 | 0.5844 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0767 | 0.736 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.215 | 0.4969 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0617 | 0.4306 | 0.0001 | 0.037 | 0.0266 | 0.2469 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1077 | 0.6575 | 7500 | 2.3949 | 0.0099 | 0.0202 | 0.0086 | 0.0057 | 0.0147 | 0.0073 | 0.0276 | 0.0575 | 0.0604 | 0.0218 | 0.0581 | 0.0872 | 0.0 | 0.0 | 0.0203 | 0.256 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0605 | 0.5713 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0717 | 0.7435 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2129 | 0.4898 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0647 | 0.4361 | 0.0001 | 0.0385 | 0.0274 | 0.2444 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8004 | 0.6619 | 7550 | 2.3907 | 0.0099 | 0.0204 | 0.0085 | 0.0058 | 0.0145 | 0.0073 | 0.0275 | 0.0572 | 0.0603 | 0.0213 | 0.058 | 0.0844 | 0.0 | 0.0 | 0.0223 | 0.2577 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0554 | 0.5761 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.074 | 0.7415 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.212 | 0.4936 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.063 | 0.4315 | 0.0001 | 0.0385 | 0.0282 | 0.2362 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2583 | 0.6663 | 7600 | 2.3890 | 0.0102 | 0.0204 | 0.009 | 0.006 | 0.0148 | 0.0075 | 0.0283 | 0.0577 | 0.0609 | 0.0217 | 0.0607 | 0.0842 | 0.0 | 0.0 | 0.0229 | 0.2634 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0557 | 0.5631 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0768 | 0.752 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2175 | 0.5038 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0657 | 0.4409 | 0.0001 | 0.0409 | 0.0303 | 0.2352 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3283 | 0.6707 | 7650 | 2.3920 | 0.0101 | 0.0202 | 0.0092 | 0.006 | 0.0143 | 0.0072 | 0.0279 | 0.0575 | 0.0606 | 0.0216 | 0.0594 | 0.0834 | 0.0 | 0.0 | 0.0199 | 0.2663 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0555 | 0.5675 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0752 | 0.7419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2194 | 0.5058 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0641 | 0.4353 | 0.0001 | 0.0398 | 0.0286 | 0.231 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.466 | 0.6751 | 7700 | 2.3923 | 0.0098 | 0.0199 | 0.0087 | 0.0059 | 0.0149 | 0.0071 | 0.0283 | 0.0575 | 0.0606 | 0.0215 | 0.0588 | 0.0828 | 0.0 | 0.0 | 0.0213 | 0.2691 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.053 | 0.5704 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0695 | 0.7459 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2166 | 0.5065 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0621 | 0.4335 | 0.0001 | 0.0353 | 0.03 | 0.2271 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.1811 | 0.6795 | 7750 | 2.4057 | 0.0096 | 0.0199 | 0.0079 | 0.0058 | 0.0148 | 0.0065 | 0.0275 | 0.0565 | 0.0596 | 0.0206 | 0.0576 | 0.082 | 0.0 | 0.0 | 0.0209 | 0.2739 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0532 | 0.5659 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.065 | 0.7354 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2122 | 0.4938 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.06 | 0.4198 | 0.0001 | 0.0349 | 0.0282 | 0.2193 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.7275 | 0.6839 | 7800 | 2.3935 | 0.0099 | 0.02 | 0.0088 | 0.0062 | 0.0151 | 0.0068 | 0.0278 | 0.0573 | 0.0603 | 0.0214 | 0.0583 | 0.0832 | 0.0 | 0.0 | 0.0202 | 0.2693 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0549 | 0.5608 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0678 | 0.7396 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2233 | 0.5067 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0614 | 0.428 | 0.0001 | 0.037 | 0.0298 | 0.2302 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4082 | 0.6882 | 7850 | 2.3998 | 0.0095 | 0.0197 | 0.008 | 0.0057 | 0.0157 | 0.0069 | 0.0279 | 0.0568 | 0.0597 | 0.0215 | 0.0581 | 0.0827 | 0.0 | 0.0 | 0.0203 | 0.2408 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0499 | 0.5806 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0711 | 0.7396 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2023 | 0.4901 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0609 | 0.4194 | 0.0001 | 0.0409 | 0.0308 | 0.2367 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6581 | 0.6926 | 7900 | 2.3905 | 0.0098 | 0.0198 | 0.0085 | 0.0058 | 0.0161 | 0.0074 | 0.0283 | 0.0578 | 0.0607 | 0.0222 | 0.0586 | 0.084 | 0.0 | 0.0 | 0.0218 | 0.2512 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0529 | 0.5904 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0732 | 0.7429 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.208 | 0.4992 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0633 | 0.4225 | 0.0001 | 0.0433 | 0.0309 | 0.243 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6352 | 0.6970 | 7950 | 2.3886 | 0.0098 | 0.0196 | 0.0088 | 0.0057 | 0.0154 | 0.0079 | 0.0284 | 0.0583 | 0.0611 | 0.0222 | 0.0587 | 0.0867 | 0.0 | 0.0 | 0.0227 | 0.2766 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0545 | 0.5831 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0754 | 0.7407 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2069 | 0.503 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0638 | 0.4249 | 0.0001 | 0.0413 | 0.0281 | 0.2421 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.9629 | 0.7014 | 8000 | 2.3914 | 0.0099 | 0.0198 | 0.0088 | 0.0056 | 0.0151 | 0.0077 | 0.0283 | 0.0576 | 0.0605 | 0.0214 | 0.0577 | 0.0867 | 0.0 | 0.0 | 0.023 | 0.2762 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0571 | 0.5748 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0746 | 0.7339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2082 | 0.499 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.065 | 0.4234 | 0.0001 | 0.0405 | 0.0284 | 0.2341 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6062 | 0.7058 | 8050 | 2.3938 | 0.0098 | 0.02 | 0.0084 | 0.0058 | 0.0149 | 0.0074 | 0.0285 | 0.0571 | 0.06 | 0.0213 | 0.0569 | 0.0824 | 0.0 | 0.0 | 0.0218 | 0.2731 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0514 | 0.5774 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0726 | 0.7169 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2075 | 0.4962 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0675 | 0.4282 | 0.0001 | 0.0381 | 0.0303 | 0.2279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.6705 | 0.7102 | 8100 | 2.3986 | 0.0098 | 0.02 | 0.0083 | 0.0056 | 0.0146 | 0.0077 | 0.0279 | 0.0575 | 0.0604 | 0.0213 | 0.0572 | 0.0824 | 0.0 | 0.0 | 0.0212 | 0.2806 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0571 | 0.5857 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0758 | 0.7215 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2018 | 0.4932 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0647 | 0.4232 | 0.0001 | 0.0398 | 0.0303 | 0.2325 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.0576 | 0.7145 | 8150 | 2.3894 | 0.0104 | 0.0205 | 0.0093 | 0.0061 | 0.0157 | 0.008 | 0.0289 | 0.0584 | 0.0613 | 0.0223 | 0.0587 | 0.0856 | 0.0 | 0.0 | 0.0232 | 0.2829 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0577 | 0.5876 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0802 | 0.7291 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2131 | 0.5059 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0677 | 0.4301 | 0.0001 | 0.0411 | 0.0356 | 0.2412 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.536 | 0.7189 | 8200 | 2.3835 | 0.0104 | 0.0205 | 0.0095 | 0.006 | 0.0161 | 0.0079 | 0.0288 | 0.0584 | 0.0613 | 0.0226 | 0.0591 | 0.0838 | 0.0 | 0.0 | 0.022 | 0.2796 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.059 | 0.5882 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0832 | 0.7333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.214 | 0.5082 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0668 | 0.4265 | 0.0001 | 0.0407 | 0.0327 | 0.2412 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2936 | 0.7233 | 8250 | 2.3809 | 0.0104 | 0.0204 | 0.0094 | 0.0059 | 0.016 | 0.0081 | 0.0289 | 0.059 | 0.0619 | 0.0228 | 0.0596 | 0.0851 | 0.0 | 0.0 | 0.0221 | 0.2924 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0603 | 0.5914 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0821 | 0.7394 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2117 | 0.5136 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0682 | 0.4237 | 0.0001 | 0.0392 | 0.0328 | 0.2466 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0603 | 0.7277 | 8300 | 2.3761 | 0.0103 | 0.0205 | 0.0094 | 0.0058 | 0.0157 | 0.0087 | 0.0296 | 0.0594 | 0.0623 | 0.0229 | 0.0601 | 0.0877 | 0.0 | 0.0 | 0.0219 | 0.2882 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0553 | 0.5917 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0876 | 0.7447 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2138 | 0.5137 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0682 | 0.4315 | 0.0001 | 0.0414 | 0.0288 | 0.253 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7142 | 0.7321 | 8350 | 2.3751 | 0.0104 | 0.0206 | 0.0092 | 0.0058 | 0.0152 | 0.0088 | 0.03 | 0.0593 | 0.0621 | 0.0226 | 0.0603 | 0.0869 | 0.0 | 0.0 | 0.0228 | 0.2882 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0567 | 0.5987 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0886 | 0.7404 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2109 | 0.5107 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0685 | 0.4282 | 0.0001 | 0.0385 | 0.0289 | 0.251 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.8682 | 0.7365 | 8400 | 2.3761 | 0.0105 | 0.0207 | 0.0094 | 0.006 | 0.015 | 0.0089 | 0.0301 | 0.0591 | 0.0617 | 0.0224 | 0.0597 | 0.086 | 0.0 | 0.0 | 0.0231 | 0.2848 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.059 | 0.599 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.088 | 0.7319 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.214 | 0.5115 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0688 | 0.4274 | 0.0001 | 0.0424 | 0.0285 | 0.2433 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4779 | 0.7408 | 8450 | 2.3748 | 0.0104 | 0.0207 | 0.0091 | 0.0059 | 0.0147 | 0.009 | 0.0299 | 0.0586 | 0.0613 | 0.0224 | 0.0591 | 0.0854 | 0.0 | 0.0 | 0.0218 | 0.2773 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0579 | 0.5873 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0912 | 0.7325 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2087 | 0.5043 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0676 | 0.4282 | 0.0001 | 0.0426 | 0.029 | 0.2467 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6531 | 0.7452 | 8500 | 2.3725 | 0.0105 | 0.0208 | 0.0093 | 0.0059 | 0.0148 | 0.009 | 0.0301 | 0.0588 | 0.0617 | 0.0227 | 0.0601 | 0.0855 | 0.0 | 0.0 | 0.0235 | 0.2865 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0569 | 0.5752 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0955 | 0.7382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2097 | 0.5137 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0683 | 0.428 | 0.0001 | 0.0463 | 0.0288 | 0.2482 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2503 | 0.7496 | 8550 | 2.3778 | 0.0103 | 0.0205 | 0.0092 | 0.0058 | 0.0146 | 0.0087 | 0.0299 | 0.0587 | 0.0614 | 0.0223 | 0.0585 | 0.0863 | 0.0 | 0.0 | 0.0236 | 0.2907 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0581 | 0.5857 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0897 | 0.7317 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.208 | 0.5046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0674 | 0.429 | 0.0001 | 0.0416 | 0.0283 | 0.2407 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.1245 | 0.7540 | 8600 | 2.3778 | 0.0103 | 0.0208 | 0.0089 | 0.0058 | 0.0146 | 0.0086 | 0.0298 | 0.0589 | 0.0615 | 0.0222 | 0.0582 | 0.0865 | 0.0 | 0.0 | 0.0221 | 0.2918 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0571 | 0.5949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0907 | 0.7321 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.004 | 0.0037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2061 | 0.5026 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0665 | 0.4267 | 0.0001 | 0.0398 | 0.0282 | 0.2397 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.857 | 0.7584 | 8650 | 2.3844 | 0.0099 | 0.0205 | 0.0084 | 0.0055 | 0.0139 | 0.0084 | 0.0296 | 0.0584 | 0.0611 | 0.0214 | 0.0581 | 0.0841 | 0.0 | 0.0 | 0.0226 | 0.296 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0543 | 0.5939 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0912 | 0.7343 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1986 | 0.4918 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0627 | 0.4164 | 0.0001 | 0.0383 | 0.0273 | 0.2397 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6832 | 0.7628 | 8700 | 2.3833 | 0.01 | 0.0205 | 0.0086 | 0.0056 | 0.014 | 0.0085 | 0.0297 | 0.0587 | 0.0613 | 0.0216 | 0.0585 | 0.0833 | 0.0 | 0.0 | 0.0224 | 0.2962 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0547 | 0.5949 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0917 | 0.736 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.201 | 0.4986 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0636 | 0.4144 | 0.0001 | 0.0383 | 0.0274 | 0.2408 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2598 | 0.7671 | 8750 | 2.3813 | 0.0102 | 0.0206 | 0.009 | 0.0057 | 0.0143 | 0.0085 | 0.0299 | 0.0589 | 0.0616 | 0.0218 | 0.0587 | 0.0834 | 0.0 | 0.0 | 0.0223 | 0.2947 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0567 | 0.5968 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0927 | 0.7398 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2059 | 0.5012 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0666 | 0.4221 | 0.0001 | 0.0394 | 0.0266 | 0.238 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5541 | 0.7715 | 8800 | 2.3770 | 0.0104 | 0.0206 | 0.0093 | 0.0059 | 0.0151 | 0.0086 | 0.0299 | 0.0593 | 0.062 | 0.0221 | 0.0598 | 0.0841 | 0.0 | 0.0 | 0.0232 | 0.2979 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0553 | 0.5962 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0926 | 0.7467 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2125 | 0.5093 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0685 | 0.4292 | 0.0001 | 0.0372 | 0.0267 | 0.2378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7907 | 0.7759 | 8850 | 2.3766 | 0.0104 | 0.0207 | 0.0093 | 0.0059 | 0.0145 | 0.0085 | 0.03 | 0.0592 | 0.0618 | 0.022 | 0.059 | 0.084 | 0.0 | 0.0 | 0.0233 | 0.3002 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0566 | 0.5994 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0903 | 0.7356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2133 | 0.5076 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0686 | 0.4264 | 0.0001 | 0.0372 | 0.0267 | 0.2365 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.5793 | 0.7803 | 8900 | 2.3740 | 0.0105 | 0.021 | 0.0094 | 0.0059 | 0.0139 | 0.0087 | 0.0294 | 0.0588 | 0.0615 | 0.0219 | 0.0587 | 0.0829 | 0.0 | 0.0 | 0.0233 | 0.2947 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0584 | 0.6006 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0934 | 0.7339 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2129 | 0.5049 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0679 | 0.4187 | 0.0001 | 0.0372 | 0.0258 | 0.237 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.736 | 0.7847 | 8950 | 2.3708 | 0.0106 | 0.0209 | 0.0095 | 0.006 | 0.0142 | 0.0086 | 0.0295 | 0.0594 | 0.0621 | 0.0222 | 0.0597 | 0.0828 | 0.0 | 0.0 | 0.0231 | 0.3046 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0592 | 0.6041 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0922 | 0.7376 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2164 | 0.5072 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0683 | 0.4246 | 0.0001 | 0.0396 | 0.0263 | 0.2411 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.4449 | 0.7891 | 9000 | 2.3734 | 0.0103 | 0.0207 | 0.009 | 0.0058 | 0.0147 | 0.0084 | 0.0292 | 0.0588 | 0.0615 | 0.0217 | 0.0596 | 0.0826 | 0.0 | 0.0 | 0.023 | 0.3088 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0567 | 0.5815 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0916 | 0.7358 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2099 | 0.5037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0661 | 0.4274 | 0.0001 | 0.0375 | 0.0263 | 0.2364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8655 | 0.7934 | 9050 | 2.3725 | 0.0103 | 0.0207 | 0.0089 | 0.0058 | 0.0134 | 0.0085 | 0.0296 | 0.0587 | 0.0615 | 0.0218 | 0.0585 | 0.0833 | 0.0 | 0.0 | 0.024 | 0.3065 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0566 | 0.5847 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0922 | 0.738 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2075 | 0.499 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0653 | 0.4242 | 0.0001 | 0.0405 | 0.0273 | 0.2382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.385 | 0.7978 | 9100 | 2.3669 | 0.0105 | 0.0208 | 0.0093 | 0.0059 | 0.0137 | 0.0086 | 0.0302 | 0.0589 | 0.0618 | 0.0223 | 0.0594 | 0.0831 | 0.0 | 0.0 | 0.0241 | 0.3013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0575 | 0.5879 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0935 | 0.7356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2131 | 0.5042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0646 | 0.4293 | 0.0001 | 0.04 | 0.0285 | 0.2426 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3199 | 0.8022 | 9150 | 2.3654 | 0.0106 | 0.0208 | 0.0095 | 0.006 | 0.0137 | 0.0086 | 0.03 | 0.0589 | 0.0618 | 0.0228 | 0.0588 | 0.0827 | 0.0 | 0.0 | 0.0234 | 0.2947 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0569 | 0.5752 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0951 | 0.7417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2157 | 0.5102 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0649 | 0.4307 | 0.0001 | 0.0414 | 0.03 | 0.2483 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2667 | 0.8066 | 9200 | 2.3659 | 0.0106 | 0.0207 | 0.0097 | 0.006 | 0.0136 | 0.0085 | 0.03 | 0.0595 | 0.0623 | 0.0228 | 0.0598 | 0.0832 | 0.0 | 0.0 | 0.0244 | 0.3051 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0571 | 0.5869 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0935 | 0.7419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2159 | 0.5113 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0678 | 0.4308 | 0.0001 | 0.0439 | 0.0297 | 0.2482 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3728 | 0.8110 | 9250 | 2.3677 | 0.0105 | 0.0208 | 0.0092 | 0.0059 | 0.0138 | 0.0086 | 0.0304 | 0.0592 | 0.062 | 0.0227 | 0.0598 | 0.0822 | 0.0 | 0.0 | 0.0246 | 0.3061 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.056 | 0.578 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0956 | 0.7364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2094 | 0.5089 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0654 | 0.4294 | 0.0001 | 0.0418 | 0.0305 | 0.2501 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4379 | 0.8154 | 9300 | 2.3649 | 0.0104 | 0.0206 | 0.0092 | 0.006 | 0.0144 | 0.0083 | 0.0301 | 0.059 | 0.0618 | 0.0229 | 0.0591 | 0.081 | 0.0 | 0.0 | 0.0238 | 0.3013 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0558 | 0.5764 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0906 | 0.7327 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2113 | 0.513 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0677 | 0.4288 | 0.0001 | 0.0422 | 0.03 | 0.2482 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.6357 | 0.8197 | 9350 | 2.3659 | 0.0107 | 0.021 | 0.0097 | 0.0062 | 0.014 | 0.0083 | 0.0303 | 0.0593 | 0.062 | 0.0226 | 0.0601 | 0.0808 | 0.0 | 0.0 | 0.0232 | 0.3 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0584 | 0.585 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0947 | 0.7392 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2167 | 0.5098 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0674 | 0.4263 | 0.0001 | 0.0392 | 0.03 | 0.2516 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.3531 | 0.8241 | 9400 | 2.3646 | 0.0106 | 0.021 | 0.0096 | 0.006 | 0.0141 | 0.0084 | 0.0302 | 0.0593 | 0.0621 | 0.0227 | 0.0605 | 0.0806 | 0.0 | 0.0 | 0.0236 | 0.3032 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0575 | 0.5901 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0948 | 0.7348 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2166 | 0.511 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0675 | 0.4258 | 0.0001 | 0.0405 | 0.0285 | 0.2496 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7214 | 0.8285 | 9450 | 2.3623 | 0.0107 | 0.021 | 0.0097 | 0.0061 | 0.0142 | 0.0083 | 0.0302 | 0.0595 | 0.0622 | 0.0228 | 0.0609 | 0.0807 | 0.0 | 0.0 | 0.0241 | 0.3065 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.057 | 0.5879 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0951 | 0.7354 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.218 | 0.5148 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0682 | 0.4278 | 0.0001 | 0.0409 | 0.0283 | 0.2487 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7198 | 0.8329 | 9500 | 2.3630 | 0.0106 | 0.0209 | 0.0094 | 0.0059 | 0.0138 | 0.0085 | 0.0301 | 0.0592 | 0.0619 | 0.0225 | 0.0601 | 0.081 | 0.0 | 0.0 | 0.0242 | 0.3034 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0569 | 0.5841 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0956 | 0.7358 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2142 | 0.5113 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0678 | 0.4284 | 0.0001 | 0.0398 | 0.0282 | 0.246 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.9512 | 0.8373 | 9550 | 2.3623 | 0.0106 | 0.0209 | 0.0094 | 0.006 | 0.0141 | 0.0085 | 0.0303 | 0.0593 | 0.0621 | 0.0225 | 0.0602 | 0.0819 | 0.0 | 0.0 | 0.0238 | 0.3063 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0568 | 0.585 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.095 | 0.7356 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2155 | 0.5114 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0677 | 0.4287 | 0.0001 | 0.0424 | 0.0279 | 0.2461 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0937 | 0.8417 | 9600 | 2.3605 | 0.0106 | 0.0209 | 0.0095 | 0.006 | 0.0147 | 0.0085 | 0.0303 | 0.0593 | 0.062 | 0.0227 | 0.0604 | 0.0812 | 0.0 | 0.0 | 0.0241 | 0.3065 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0567 | 0.5803 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0959 | 0.736 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2162 | 0.5134 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0681 | 0.4298 | 0.0001 | 0.0398 | 0.0286 | 0.2483 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2102 | 0.8460 | 9650 | 2.3591 | 0.0107 | 0.0209 | 0.0096 | 0.006 | 0.0148 | 0.0085 | 0.0303 | 0.0596 | 0.0624 | 0.0228 | 0.061 | 0.0815 | 0.0 | 0.0 | 0.0243 | 0.3091 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0567 | 0.5866 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0953 | 0.7364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2167 | 0.5153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0684 | 0.4324 | 0.0001 | 0.0403 | 0.0288 | 0.2491 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.2741 | 0.8504 | 9700 | 2.3593 | 0.0106 | 0.021 | 0.0095 | 0.0061 | 0.0148 | 0.0084 | 0.0304 | 0.0595 | 0.0623 | 0.0228 | 0.0611 | 0.0811 | 0.0 | 0.0 | 0.0237 | 0.3074 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0566 | 0.5844 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0954 | 0.7374 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2163 | 0.5172 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0683 | 0.4313 | 0.0001 | 0.0409 | 0.0289 | 0.2467 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.8021 | 0.8548 | 9750 | 2.3598 | 0.0107 | 0.021 | 0.0096 | 0.006 | 0.0148 | 0.0084 | 0.0306 | 0.0596 | 0.0624 | 0.0228 | 0.0612 | 0.0815 | 0.0 | 0.0 | 0.0246 | 0.3112 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0564 | 0.5841 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0959 | 0.7392 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.216 | 0.5171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0682 | 0.4304 | 0.0001 | 0.0416 | 0.0289 | 0.2481 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.2527 | 0.8592 | 9800 | 2.3601 | 0.0106 | 0.0209 | 0.0096 | 0.006 | 0.0152 | 0.0083 | 0.0303 | 0.0595 | 0.0623 | 0.0229 | 0.0611 | 0.0811 | 0.0 | 0.0 | 0.0242 | 0.3109 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0558 | 0.5806 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0952 | 0.7382 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2154 | 0.5157 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0685 | 0.4331 | 0.0001 | 0.0413 | 0.0285 | 0.2474 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.7854 | 0.8636 | 9850 | 2.3590 | 0.0106 | 0.0208 | 0.0096 | 0.006 | 0.0152 | 0.0083 | 0.0304 | 0.0596 | 0.0624 | 0.0227 | 0.0611 | 0.0812 | 0.0 | 0.0 | 0.0243 | 0.3133 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0555 | 0.5822 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0954 | 0.7396 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2152 | 0.5153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0684 | 0.4318 | 0.0001 | 0.0392 | 0.0284 | 0.2472 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.4886 | 0.8680 | 9900 | 2.3590 | 0.0106 | 0.0209 | 0.0095 | 0.006 | 0.0151 | 0.0083 | 0.0302 | 0.0596 | 0.0623 | 0.0227 | 0.061 | 0.0813 | 0.0 | 0.0 | 0.0243 | 0.3137 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0554 | 0.5815 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0955 | 0.7388 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2151 | 0.5149 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0684 | 0.4316 | 0.0001 | 0.0401 | 0.0291 | 0.2464 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 2.5477 | 0.8723 | 9950 | 2.3587 | 0.0106 | 0.0209 | 0.0096 | 0.0061 | 0.0151 | 0.0083 | 0.0301 | 0.0597 | 0.0624 | 0.0227 | 0.0613 | 0.0815 | 0.0 | 0.0 | 0.0244 | 0.3137 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0557 | 0.5834 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0959 | 0.74 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2156 | 0.5153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0684 | 0.4313 | 0.0001 | 0.0388 | 0.0287 | 0.2471 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 3.0935 | 0.8767 | 10000 | 2.3591 | 0.0106 | 0.0209 | 0.0096 | 0.006 | 0.015 | 0.0083 | 0.0303 | 0.0597 | 0.0624 | 0.0227 | 0.0615 | 0.0814 | 0.0 | 0.0 | 0.024 | 0.3141 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0557 | 0.5831 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0957 | 0.7406 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2154 | 0.5153 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0683 | 0.4312 | 0.0001 | 0.0407 | 0.0288 | 0.2476 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
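The versions above presumably describe the training environment. A minimal sketch for checking a local setup against them, assuming the standard PyPI package names; nothing here is specific to this checkpoint:

```python
# Compare installed package versions against the versions listed in this card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.47.1",
    "torch": "2.5.1+cu124",
    "datasets": "3.2.0",
    "tokenizers": "0.21.0",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    print(f"{name}: installed {installed[name]}, card lists {want}")
```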
[
"n/a",
"person",
"traffic light",
"fire hydrant",
"n/a",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"bicycle",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"n/a",
"backpack",
"umbrella",
"n/a",
"car",
"n/a",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"motorcycle",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"n/a",
"wine glass",
"cup",
"fork",
"knife",
"airplane",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"bus",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"n/a",
"dining table",
"n/a",
"n/a",
"train",
"toilet",
"n/a",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"truck",
"toaster",
"sink",
"refrigerator",
"n/a",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"boat",
"toothbrush"
]
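If the array above is the checkpoint's ordered label map (an assumption based on where it appears in this card), it can be turned into the `id2label`/`label2id` dictionaries that 🤗 Transformers configs expect. A short illustrative sketch, using only a stand-in slice of the full list:

```python
# Illustrative only: build id2label / label2id from an ordered label list.
# The short list here is a stand-in for the full array above; "n/a" entries
# are unused placeholder indices.
labels = ["n/a", "person", "traffic light", "fire hydrant", "n/a", "stop sign"]

id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels) if name != "n/a"}

print(id2label[1])         # person
print(label2id["person"])  # 1
```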
# kunilata09/Traffic_try_v2_XYZ
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10"
] |
01FE22BEC307/my-detr-pascal-voc |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"n/a",
"person",
"bicycle",
"car",
"motorcycle",
"airplane",
"bus",
"train",
"truck",
"boat",
"traffic light",
"fire hydrant",
"street sign",
"stop sign",
"parking meter",
"bench",
"bird",
"cat",
"dog",
"horse",
"sheep",
"cow",
"elephant",
"bear",
"zebra",
"giraffe",
"hat",
"backpack",
"umbrella",
"shoe",
"eye glasses",
"handbag",
"tie",
"suitcase",
"frisbee",
"skis",
"snowboard",
"sports ball",
"kite",
"baseball bat",
"baseball glove",
"skateboard",
"surfboard",
"tennis racket",
"bottle",
"plate",
"wine glass",
"cup",
"fork",
"knife",
"spoon",
"bowl",
"banana",
"apple",
"sandwich",
"orange",
"broccoli",
"carrot",
"hot dog",
"pizza",
"donut",
"cake",
"chair",
"couch",
"potted plant",
"bed",
"mirror",
"dining table",
"window",
"desk",
"toilet",
"door",
"tv",
"laptop",
"mouse",
"remote",
"keyboard",
"cell phone",
"microwave",
"oven",
"toaster",
"sink",
"refrigerator",
"blender",
"book",
"clock",
"vase",
"scissors",
"teddy bear",
"hair drier",
"toothbrush"
] |
kvbiii/test |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# test
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set (a sketch of how these COCO-style metrics can be computed follows the list):
- Loss: 1.1603
- Map: 0.2667
- Map 50: 0.5044
- Map 75: 0.2456
- Map Small: 0.0909
- Map Medium: 0.2038
- Map Large: 0.3504
- Mar 1: 0.2709
- Mar 10: 0.4322
- Mar 100: 0.4537
- Mar Small: 0.1705
- Mar Medium: 0.3988
- Mar Large: 0.5803
- Map Coverall: 0.5892
- Mar 100 Coverall: 0.7071
- Map Face Shield: 0.1292
- Mar 100 Face Shield: 0.4657
- Map Gloves: 0.1967
- Mar 100 Gloves: 0.3534
- Map Goggles: 0.1026
- Mar 100 Goggles: 0.2918
- Map Mask: 0.316
- Mar 100 Mask: 0.4506
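The Map/Mar values above are standard COCO-style detection metrics (mean average precision and mean average recall at several IoU thresholds, object sizes, and detection limits). As a hedged illustration of how such numbers can be produced, the sketch below uses `torchmetrics` with invented toy boxes and labels; it is not the evaluation pipeline used for this card.

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# Toy prediction/target pair in xyxy pixel coordinates, purely for illustration.
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 60.0, 80.0]]),
    "scores": torch.tensor([0.9]),
    "labels": torch.tensor([0]),   # e.g. "coverall"
}]
targets = [{
    "boxes": torch.tensor([[12.0, 8.0, 58.0, 82.0]]),
    "labels": torch.tensor([0]),
}]

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)
metric.update(preds, targets)
# Returns map, map_50, map_75, map_small/medium/large, mar_1, mar_10, mar_100,
# plus per-class values when class_metrics=True.
print(metric.compute())
```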
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
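To make the list above concrete, here is a minimal, hedged sketch of an equivalent 🤗 `TrainingArguments` setup; the output directory name and any settings not listed above are assumptions rather than values taken from this run.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="conditional-detr-test",  # assumed name, not from this card
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```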
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 91 | 1.8372 | 0.007 | 0.0258 | 0.0011 | 0.0141 | 0.0074 | 0.0074 | 0.0155 | 0.0699 | 0.1041 | 0.0717 | 0.1388 | 0.1297 | 0.0 | 0.0161 | 0.0011 | 0.0039 | 0.004 | 0.1531 | 0.0005 | 0.0682 | 0.0291 | 0.2794 |
| No log | 2.0 | 182 | 1.6868 | 0.0122 | 0.0409 | 0.0051 | 0.0178 | 0.0127 | 0.0159 | 0.0234 | 0.1069 | 0.1472 | 0.1082 | 0.1665 | 0.1779 | 0.003 | 0.1219 | 0.0001 | 0.001 | 0.0101 | 0.2014 | 0.0016 | 0.0518 | 0.0464 | 0.3601 |
| No log | 3.0 | 273 | 1.5610 | 0.027 | 0.0704 | 0.0145 | 0.0209 | 0.0215 | 0.0511 | 0.0768 | 0.1903 | 0.2325 | 0.1097 | 0.2337 | 0.2895 | 0.0339 | 0.3714 | 0.0019 | 0.0471 | 0.0181 | 0.2523 | 0.0026 | 0.0965 | 0.0784 | 0.3951 |
| No log | 4.0 | 364 | 1.5762 | 0.0407 | 0.0899 | 0.0339 | 0.011 | 0.0217 | 0.0606 | 0.0826 | 0.1865 | 0.2398 | 0.104 | 0.2439 | 0.2762 | 0.1194 | 0.5415 | 0.0011 | 0.0441 | 0.0049 | 0.183 | 0.0008 | 0.0576 | 0.0772 | 0.3728 |
| No log | 5.0 | 455 | 1.4866 | 0.0584 | 0.1347 | 0.0434 | 0.0265 | 0.0517 | 0.0705 | 0.1131 | 0.2302 | 0.2768 | 0.1274 | 0.2489 | 0.3144 | 0.1524 | 0.546 | 0.0068 | 0.0755 | 0.0229 | 0.274 | 0.017 | 0.0965 | 0.0932 | 0.3922 |
| 2.2578 | 6.0 | 546 | 1.4517 | 0.0782 | 0.1742 | 0.0608 | 0.0352 | 0.0653 | 0.0921 | 0.1191 | 0.2638 | 0.3064 | 0.1173 | 0.2795 | 0.3505 | 0.2028 | 0.6438 | 0.0052 | 0.1196 | 0.0309 | 0.2671 | 0.0075 | 0.1118 | 0.1448 | 0.3897 |
| 2.2578 | 7.0 | 637 | 1.4567 | 0.1202 | 0.2463 | 0.1019 | 0.0548 | 0.1205 | 0.1364 | 0.163 | 0.3228 | 0.356 | 0.1594 | 0.3255 | 0.4543 | 0.3099 | 0.7063 | 0.0102 | 0.2412 | 0.0491 | 0.2744 | 0.0557 | 0.1753 | 0.176 | 0.3831 |
| 2.2578 | 8.0 | 728 | 1.4039 | 0.1446 | 0.2979 | 0.1181 | 0.0584 | 0.1175 | 0.1695 | 0.1537 | 0.3259 | 0.3556 | 0.128 | 0.3103 | 0.4379 | 0.4289 | 0.6817 | 0.0142 | 0.2598 | 0.0604 | 0.2816 | 0.0327 | 0.1612 | 0.1867 | 0.3938 |
| 2.2578 | 9.0 | 819 | 1.3552 | 0.1623 | 0.3277 | 0.1455 | 0.0586 | 0.1179 | 0.2124 | 0.1759 | 0.3576 | 0.3833 | 0.1318 | 0.3387 | 0.4908 | 0.4853 | 0.6973 | 0.0153 | 0.2853 | 0.0861 | 0.3155 | 0.0257 | 0.2071 | 0.1988 | 0.4111 |
| 2.2578 | 10.0 | 910 | 1.3194 | 0.1819 | 0.3602 | 0.165 | 0.0737 | 0.126 | 0.2357 | 0.1881 | 0.3606 | 0.3881 | 0.1939 | 0.3337 | 0.4953 | 0.4961 | 0.6723 | 0.0203 | 0.3284 | 0.0938 | 0.3072 | 0.0449 | 0.2094 | 0.2543 | 0.423 |
| 1.2495 | 11.0 | 1001 | 1.3209 | 0.1814 | 0.3609 | 0.1521 | 0.067 | 0.1267 | 0.2353 | 0.188 | 0.3616 | 0.3892 | 0.2079 | 0.3159 | 0.5058 | 0.495 | 0.6643 | 0.025 | 0.3284 | 0.1066 | 0.3029 | 0.0508 | 0.2494 | 0.2297 | 0.4008 |
| 1.2495 | 12.0 | 1092 | 1.2817 | 0.1922 | 0.3946 | 0.154 | 0.0681 | 0.1501 | 0.2335 | 0.2042 | 0.3822 | 0.4057 | 0.1483 | 0.3511 | 0.5242 | 0.5286 | 0.6808 | 0.0544 | 0.3961 | 0.1025 | 0.3076 | 0.0462 | 0.2365 | 0.2295 | 0.4074 |
| 1.2495 | 13.0 | 1183 | 1.2797 | 0.207 | 0.4039 | 0.1838 | 0.0767 | 0.1446 | 0.2743 | 0.2117 | 0.3881 | 0.4101 | 0.1362 | 0.356 | 0.5387 | 0.5433 | 0.6893 | 0.0566 | 0.3745 | 0.1286 | 0.3061 | 0.0427 | 0.2671 | 0.2636 | 0.4136 |
| 1.2495 | 14.0 | 1274 | 1.2330 | 0.2165 | 0.416 | 0.1999 | 0.081 | 0.1565 | 0.2767 | 0.2247 | 0.3935 | 0.4251 | 0.1558 | 0.3763 | 0.5399 | 0.5671 | 0.6946 | 0.0579 | 0.3725 | 0.1453 | 0.331 | 0.0371 | 0.3071 | 0.2751 | 0.4202 |
| 1.2495 | 15.0 | 1365 | 1.2150 | 0.2214 | 0.4317 | 0.1966 | 0.0766 | 0.1686 | 0.287 | 0.2243 | 0.3993 | 0.4316 | 0.1896 | 0.3783 | 0.5457 | 0.5626 | 0.6866 | 0.075 | 0.4186 | 0.1465 | 0.339 | 0.0527 | 0.2812 | 0.2701 | 0.4325 |
| 1.2495 | 16.0 | 1456 | 1.1971 | 0.2229 | 0.4395 | 0.1913 | 0.0832 | 0.1531 | 0.2993 | 0.2335 | 0.4103 | 0.4389 | 0.1593 | 0.3698 | 0.5667 | 0.5625 | 0.6888 | 0.0567 | 0.4461 | 0.1539 | 0.3466 | 0.0716 | 0.2882 | 0.2698 | 0.4247 |
| 1.0777 | 17.0 | 1547 | 1.1886 | 0.2435 | 0.4578 | 0.228 | 0.0881 | 0.174 | 0.312 | 0.2487 | 0.4166 | 0.4476 | 0.1758 | 0.3803 | 0.581 | 0.5767 | 0.6978 | 0.1008 | 0.449 | 0.1588 | 0.3538 | 0.0615 | 0.2824 | 0.3194 | 0.4551 |
| 1.0777 | 18.0 | 1638 | 1.1980 | 0.2414 | 0.4659 | 0.2154 | 0.0895 | 0.1714 | 0.3089 | 0.2464 | 0.4181 | 0.4423 | 0.1624 | 0.3736 | 0.5643 | 0.5718 | 0.6875 | 0.1057 | 0.449 | 0.1619 | 0.343 | 0.0644 | 0.2894 | 0.3033 | 0.4424 |
| 1.0777 | 19.0 | 1729 | 1.1748 | 0.2448 | 0.4786 | 0.2174 | 0.0917 | 0.1801 | 0.3185 | 0.2488 | 0.424 | 0.4502 | 0.1551 | 0.3875 | 0.5786 | 0.5706 | 0.6902 | 0.1289 | 0.4745 | 0.1626 | 0.3473 | 0.0568 | 0.2918 | 0.3051 | 0.4473 |
| 1.0777 | 20.0 | 1820 | 1.1770 | 0.2544 | 0.4702 | 0.2366 | 0.0924 | 0.189 | 0.3271 | 0.2632 | 0.4292 | 0.4507 | 0.148 | 0.3992 | 0.5778 | 0.5753 | 0.7085 | 0.1107 | 0.4696 | 0.1718 | 0.339 | 0.0916 | 0.2906 | 0.3225 | 0.4457 |
| 1.0777 | 21.0 | 1911 | 1.1731 | 0.2539 | 0.4917 | 0.2379 | 0.0914 | 0.1924 | 0.3282 | 0.2559 | 0.4247 | 0.4493 | 0.1665 | 0.3907 | 0.5753 | 0.5741 | 0.6991 | 0.113 | 0.4471 | 0.1814 | 0.3444 | 0.0832 | 0.2976 | 0.3177 | 0.4584 |
| 0.9577 | 22.0 | 2002 | 1.1567 | 0.2622 | 0.4932 | 0.2434 | 0.0956 | 0.2006 | 0.3363 | 0.2639 | 0.4339 | 0.4564 | 0.1785 | 0.4013 | 0.5797 | 0.5848 | 0.7018 | 0.1226 | 0.4539 | 0.1924 | 0.357 | 0.0861 | 0.3082 | 0.325 | 0.4609 |
| 0.9577 | 23.0 | 2093 | 1.1649 | 0.2666 | 0.4975 | 0.2456 | 0.091 | 0.2019 | 0.35 | 0.2678 | 0.433 | 0.4573 | 0.1587 | 0.4036 | 0.5832 | 0.5831 | 0.7009 | 0.1563 | 0.4882 | 0.1947 | 0.3477 | 0.0803 | 0.3035 | 0.3186 | 0.4461 |
| 0.9577 | 24.0 | 2184 | 1.1525 | 0.2658 | 0.4949 | 0.2438 | 0.095 | 0.1972 | 0.3465 | 0.2677 | 0.4357 | 0.4585 | 0.1666 | 0.3978 | 0.5878 | 0.5886 | 0.704 | 0.1243 | 0.4588 | 0.1972 | 0.3617 | 0.0938 | 0.3082 | 0.3253 | 0.4597 |
| 0.9577 | 25.0 | 2275 | 1.1496 | 0.2665 | 0.4958 | 0.251 | 0.0927 | 0.1984 | 0.3513 | 0.2733 | 0.4334 | 0.4561 | 0.1568 | 0.3959 | 0.5837 | 0.5879 | 0.7071 | 0.1312 | 0.4696 | 0.1969 | 0.3599 | 0.0947 | 0.2918 | 0.3215 | 0.4519 |
| 0.9577 | 26.0 | 2366 | 1.1596 | 0.2667 | 0.5005 | 0.2434 | 0.092 | 0.1975 | 0.3517 | 0.2691 | 0.4346 | 0.4566 | 0.1714 | 0.3994 | 0.582 | 0.5894 | 0.7089 | 0.1298 | 0.4686 | 0.1964 | 0.357 | 0.0981 | 0.2988 | 0.3197 | 0.4494 |
| 0.9577 | 27.0 | 2457 | 1.1595 | 0.2679 | 0.5033 | 0.2455 | 0.0918 | 0.2041 | 0.3531 | 0.2706 | 0.4341 | 0.4549 | 0.1695 | 0.4039 | 0.5765 | 0.5868 | 0.7067 | 0.1368 | 0.4745 | 0.1972 | 0.3545 | 0.1001 | 0.2894 | 0.3186 | 0.4494 |
| 0.8804 | 28.0 | 2548 | 1.1584 | 0.2673 | 0.5038 | 0.2465 | 0.0916 | 0.2018 | 0.3518 | 0.2703 | 0.4332 | 0.4542 | 0.1698 | 0.3978 | 0.5809 | 0.5888 | 0.7067 | 0.1321 | 0.4676 | 0.1964 | 0.3542 | 0.1032 | 0.2929 | 0.3163 | 0.4498 |
| 0.8804 | 29.0 | 2639 | 1.1602 | 0.2666 | 0.5041 | 0.2454 | 0.0909 | 0.2018 | 0.349 | 0.2707 | 0.4317 | 0.4538 | 0.1701 | 0.397 | 0.58 | 0.5885 | 0.7071 | 0.129 | 0.4647 | 0.1968 | 0.3531 | 0.1016 | 0.2929 | 0.3171 | 0.451 |
| 0.8804 | 30.0 | 2730 | 1.1603 | 0.2667 | 0.5044 | 0.2456 | 0.0909 | 0.2038 | 0.3504 | 0.2709 | 0.4322 | 0.4537 | 0.1705 | 0.3988 | 0.5803 | 0.5892 | 0.7071 | 0.1292 | 0.4657 | 0.1967 | 0.3534 | 0.1026 | 0.2918 | 0.316 | 0.4506 |
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
alvinmrrry837/detr-resnet-50-dc5-fashionpedia-finetuned |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50-dc5-fashionpedia-finetuned
This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 10000
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"shirt, blouse",
"top, t-shirt, sweatshirt",
"sweater",
"cardigan",
"jacket",
"vest",
"pants",
"shorts",
"skirt",
"coat",
"dress",
"jumpsuit",
"cape",
"glasses",
"hat",
"headband, head covering, hair accessory",
"tie",
"glove",
"watch",
"belt",
"leg warmer",
"tights, stockings",
"sock",
"shoe",
"bag, wallet",
"scarf",
"umbrella",
"hood",
"collar",
"lapel",
"epaulette",
"sleeve",
"pocket",
"neckline",
"buckle",
"zipper",
"applique",
"bead",
"bow",
"flower",
"fringe",
"ribbon",
"rivet",
"ruffle",
"sequin",
"tassel"
] |
GAVLM/detr-arrows |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-arrows
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0051
- Map: 0.0157
- Map 50: 0.0235
- Map 75: 0.017
- Map Small: 0.0157
- Map Medium: -1.0
- Map Large: -1.0
- Mar 1: 0.0854
- Mar 10: 0.0917
- Mar 100: 0.2917
- Mar Small: 0.2917
- Mar Medium: -1.0
- Mar Large: -1.0
- Map Left: 0.0
- Mar 100 Left: 0.0
- Map Right: -1.0
- Mar 100 Right: -1.0
- Map Up: 0.0385
- Mar 100 Up: 0.2667
- Map Down: 0.0146
- Mar 100 Down: 0.8
- Map ?: 0.0099
- Mar 100 ?: 0.1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Left | Mar 100 Left | Map Right | Mar 100 Right | Map Up | Mar 100 Up | Map Down | Mar 100 Down | Map ? | Mar 100 ? |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------:|:------------:|:---------:|:-------------:|:------:|:----------:|:--------:|:------------:|:------:|:---------:|
| No log | 1.0 | 8 | 2.9834 | 0.0001 | 0.0014 | 0.0 | 0.0002 | -1.0 | -1.0 | 0.0 | 0.0 | 0.025 | 0.025 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0006 | 0.1 | 0.0 | 0.0 |
| No log | 2.0 | 16 | 2.3564 | 0.0005 | 0.0023 | 0.0 | 0.0007 | -1.0 | -1.0 | 0.0 | 0.0 | 0.05 | 0.05 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0019 | 0.2 | 0.0 | 0.0 |
| No log | 3.0 | 24 | 2.1618 | 0.001 | 0.0025 | 0.0 | 0.0012 | -1.0 | -1.0 | 0.0 | 0.0 | 0.1 | 0.1 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.004 | 0.4 | 0.0 | 0.0 |
| No log | 4.0 | 32 | 2.0074 | 0.0005 | 0.0027 | 0.0 | 0.0007 | -1.0 | -1.0 | 0.0 | 0.0 | 0.05 | 0.05 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0022 | 0.2 | 0.0 | 0.0 |
| No log | 5.0 | 40 | 1.7452 | 0.0008 | 0.003 | 0.0 | 0.0009 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0812 | 0.0812 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0028 | 0.3 | 0.0003 | 0.025 |
| No log | 6.0 | 48 | 1.6746 | 0.0009 | 0.0024 | 0.0 | 0.0011 | -1.0 | -1.0 | 0.0 | 0.0 | 0.1 | 0.1 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0038 | 0.4 | 0.0 | 0.0 |
| No log | 7.0 | 56 | 1.5161 | 0.0014 | 0.0023 | 0.0023 | 0.0016 | -1.0 | -1.0 | 0.0 | 0.0 | 0.15 | 0.15 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0054 | 0.6 | 0.0 | 0.0 |
| No log | 8.0 | 64 | 1.5721 | 0.0009 | 0.0017 | 0.0 | 0.001 | -1.0 | -1.0 | 0.0 | 0.0 | 0.125 | 0.125 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0034 | 0.5 | 0.0 | 0.0 |
| No log | 9.0 | 72 | 1.4536 | 0.0014 | 0.0021 | 0.0021 | 0.0021 | -1.0 | -1.0 | 0.0 | 0.0 | 0.175 | 0.175 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0056 | 0.7 | 0.0 | 0.0 |
| No log | 10.0 | 80 | 1.2638 | 0.0013 | 0.0023 | 0.0014 | 0.0019 | -1.0 | -1.0 | 0.0 | 0.0 | 0.175 | 0.175 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0052 | 0.7 | 0.0 | 0.0 |
| No log | 11.0 | 88 | 1.2646 | 0.0015 | 0.0025 | 0.0014 | 0.0025 | -1.0 | -1.0 | 0.0 | 0.0 | 0.175 | 0.175 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0062 | 0.7 | 0.0 | 0.0 |
| No log | 12.0 | 96 | 1.2239 | 0.0023 | 0.003 | 0.003 | 0.0038 | -1.0 | -1.0 | 0.0 | 0.0 | 0.2 | 0.2 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0092 | 0.8 | 0.0 | 0.0 |
| No log | 13.0 | 104 | 1.1191 | 0.0023 | 0.0057 | 0.0025 | 0.0035 | -1.0 | -1.0 | 0.0 | 0.0125 | 0.1875 | 0.1875 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0068 | 0.7 | 0.0026 | 0.05 |
| No log | 14.0 | 112 | 1.1491 | 0.0029 | 0.0054 | 0.002 | 0.0038 | -1.0 | -1.0 | 0.0312 | 0.0312 | 0.1813 | 0.1813 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0047 | 0.6 | 0.0068 | 0.125 |
| No log | 15.0 | 120 | 1.1062 | 0.02 | 0.0349 | 0.0297 | 0.0287 | -1.0 | -1.0 | 0.05 | 0.075 | 0.25 | 0.25 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0673 | 0.2 | 0.0053 | 0.7 | 0.0074 | 0.1 |
| No log | 16.0 | 128 | 1.0963 | 0.0705 | 0.0925 | 0.086 | 0.0705 | -1.0 | -1.0 | 0.0854 | 0.0854 | 0.2354 | 0.2354 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.2693 | 0.2667 | 0.0058 | 0.6 | 0.007 | 0.075 |
| No log | 17.0 | 136 | 1.0369 | 0.0349 | 0.051 | 0.0434 | 0.0349 | -1.0 | -1.0 | 0.075 | 0.0917 | 0.2917 | 0.2917 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.1234 | 0.2667 | 0.0049 | 0.8 | 0.0114 | 0.1 |
| No log | 18.0 | 144 | 1.0721 | 0.0324 | 0.05 | 0.0435 | 0.0324 | -1.0 | -1.0 | 0.0771 | 0.0771 | 0.2521 | 0.2521 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.1178 | 0.2333 | 0.004 | 0.7 | 0.0077 | 0.075 |
| No log | 19.0 | 152 | 1.0081 | 0.0623 | 0.0922 | 0.0858 | 0.0625 | -1.0 | -1.0 | 0.0646 | 0.0833 | 0.2833 | 0.2833 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.2356 | 0.2333 | 0.0051 | 0.8 | 0.0085 | 0.1 |
| No log | 20.0 | 160 | 1.0390 | 0.0367 | 0.0504 | 0.044 | 0.0367 | -1.0 | -1.0 | 0.0729 | 0.0854 | 0.2604 | 0.2604 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.1347 | 0.2667 | 0.0053 | 0.7 | 0.0071 | 0.075 |
| No log | 21.0 | 168 | 1.0262 | 0.0255 | 0.0371 | 0.0299 | 0.0255 | -1.0 | -1.0 | 0.0729 | 0.0854 | 0.2604 | 0.2604 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0898 | 0.2667 | 0.0052 | 0.7 | 0.0068 | 0.075 |
| No log | 22.0 | 176 | 0.9972 | 0.0134 | 0.0223 | 0.0159 | 0.0134 | -1.0 | -1.0 | 0.0646 | 0.0833 | 0.2833 | 0.2833 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0393 | 0.2333 | 0.0058 | 0.8 | 0.0086 | 0.1 |
| No log | 23.0 | 184 | 1.0195 | 0.0135 | 0.0204 | 0.014 | 0.0135 | -1.0 | -1.0 | 0.0729 | 0.0917 | 0.2917 | 0.2917 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0385 | 0.2667 | 0.0061 | 0.8 | 0.0092 | 0.1 |
| No log | 24.0 | 192 | 1.0240 | 0.0145 | 0.023 | 0.0139 | 0.0145 | -1.0 | -1.0 | 0.0792 | 0.0917 | 0.2917 | 0.2917 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0385 | 0.2667 | 0.0101 | 0.8 | 0.0095 | 0.1 |
| No log | 25.0 | 200 | 1.0151 | 0.0157 | 0.024 | 0.0176 | 0.0157 | -1.0 | -1.0 | 0.0854 | 0.0917 | 0.2917 | 0.2917 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0385 | 0.2667 | 0.0147 | 0.8 | 0.0096 | 0.1 |
| No log | 26.0 | 208 | 1.0143 | 0.0153 | 0.0233 | 0.0168 | 0.0153 | -1.0 | -1.0 | 0.0854 | 0.0917 | 0.2917 | 0.2917 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0385 | 0.2667 | 0.013 | 0.8 | 0.0099 | 0.1 |
| No log | 27.0 | 216 | 1.0092 | 0.0157 | 0.0235 | 0.017 | 0.0157 | -1.0 | -1.0 | 0.0854 | 0.0917 | 0.2917 | 0.2917 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0385 | 0.2667 | 0.0146 | 0.8 | 0.0099 | 0.1 |
| No log | 28.0 | 224 | 1.0051 | 0.0157 | 0.0235 | 0.017 | 0.0157 | -1.0 | -1.0 | 0.0854 | 0.0917 | 0.2917 | 0.2917 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0385 | 0.2667 | 0.0146 | 0.8 | 0.0099 | 0.1 |
| No log | 29.0 | 232 | 1.0046 | 0.0157 | 0.0235 | 0.017 | 0.0157 | -1.0 | -1.0 | 0.0854 | 0.0917 | 0.2917 | 0.2917 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0385 | 0.2667 | 0.0146 | 0.8 | 0.0099 | 0.1 |
| No log | 30.0 | 240 | 1.0051 | 0.0157 | 0.0235 | 0.017 | 0.0157 | -1.0 | -1.0 | 0.0854 | 0.0917 | 0.2917 | 0.2917 | -1.0 | -1.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0385 | 0.2667 | 0.0146 | 0.8 | 0.0099 | 0.1 |
### Framework versions
- Transformers 4.48.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"left",
"right",
"up",
"down",
"?"
] |
GAVLM/original-detr-arrows |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"left",
"right",
"up",
"down",
"?"
] |
aesat/detr-finetuned-chess |
# DETR (End-to-End Object Detection) model with ResNet-50 backbone fine-tuned on chess pieces
<!-- Provide a quick summary of what the model is/does. -->
DEtection TRansformer (DETR) model trained end-to-end on a chess-piece recognition dataset
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
The DETR model is an encoder-decoder transformer with a convolutional backbone. Two heads are added on top of the decoder outputs in order to perform object detection: a linear layer for the class labels and an MLP (multi-layer perceptron) for the bounding boxes. The model uses so-called object queries to detect objects in an image. Each object query looks for a particular object in the image. For COCO, the number of object queries is set to 100.
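As a quick, illustrative check of the architecture described above (not part of this card's training recipe), the snippet below loads the base `facebook/detr-resnet-50` checkpoint and prints the pieces mentioned in the description: the number of object queries and the two prediction heads on top of the decoder.

```python
from transformers import DetrForObjectDetection

# Base DETR checkpoint; the chess fine-tune shares this architecture apart from
# the size of the classification head.
model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-50", revision="no_timm")

print(model.config.num_queries)        # 100 object queries, one candidate box each
print(model.class_labels_classifier)   # linear layer producing class logits
print(model.bbox_predictor)            # 3-layer MLP producing normalized box coordinates
```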
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### How To Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
```python
from transformers import DetrImageProcessor, DetrForObjectDetection
import torch
from PIL import Image
import requests

# Load an example image (replace the URL with a chessboard photo for real use).
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# Load the image processor and the fine-tuned checkpoint from this repository.
processor = DetrImageProcessor.from_pretrained("aesat/detr-finetuned-chess", revision="no_timm")
model = DetrForObjectDetection.from_pretrained("aesat/detr-finetuned-chess", revision="no_timm")

inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)

# convert outputs (bounding boxes and class logits) to COCO API
# let's only keep detections with score > 0.9
target_sizes = torch.tensor([image.size[::-1]])
results = processor.post_process_object_detection(outputs, target_sizes=target_sizes, threshold=0.9)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    box = [round(i, 2) for i in box.tolist()]
    print(
        f"Detected {model.config.id2label[label.item()]} with confidence "
        f"{round(score.item(), 3)} at location {box}"
    )
```
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12"
] |
khoalvd/detr-resnet-50_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_hf with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |
hzli/voc-detr |
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"label_0",
"label_1",
"label_2",
"label_3",
"label_4",
"label_5",
"label_6",
"label_7",
"label_8",
"label_9",
"label_10",
"label_11",
"label_12",
"label_13",
"label_14",
"label_15",
"label_16",
"label_17",
"label_18",
"label_19"
] |
dltpdn/detr_finetuned_cppe5 |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr_finetuned_cppe5
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
| [
"coverall",
"face_shield",
"gloves",
"goggles",
"mask"
] |