| Column | Type | Min length | Max length |
|----------------|--------|------------|------------|
| `model_id` | string | 9 | 102 |
| `model_card` | string | 4 | 343k |
| `model_labels` | list | 2 | 50.8k |
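As a rough illustration of the schema above, the sketch below iterates a few rows and touches each column. It assumes the dump is reachable through the `datasets` library; `"user/model-card-dump"` is a hypothetical placeholder path, not the actual repository name.

```python
# Minimal sketch, assuming the dump can be loaded with the `datasets` library.
# "user/model-card-dump" is a hypothetical placeholder, not a real repo id.
from datasets import load_dataset

ds = load_dataset("user/model-card-dump", split="train")

for row in ds.select(range(3)):
    print(row["model_id"])           # string, e.g. "MedicalVision/test_remove_2"
    print(len(row["model_card"]))    # string: the raw model-card / README text
    print(row["model_labels"][:5])   # list of label names for the detection head
```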
MedicalVision/test_remove_2
## Original result

```
Not provided
```

## After training result

```
IoU metric: bbox
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.006
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=100 ] = 0.016
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=100 ] = 0.004
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.000
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.006
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 1 ] = 0.041
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets= 10 ] = 0.077
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.083
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.000
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.000
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.085
```

## Config

- dataset: VinXray
- original model: hustvl/yolos-tiny
- lr: 0.0001
- dropout_rate: 0.1
- weight_decay: 0.0001
- max_epochs: 1
- train samples: 67234

## Logging

### Training process

```
{'validation_loss': tensor(7.8284, device='cuda:1'), 'validation_loss_ce': tensor(2.7671, device='cuda:1'), 'validation_loss_bbox': tensor(0.5730, device='cuda:1'), 'validation_loss_giou': tensor(1.0983, device='cuda:1'), 'validation_cardinality_error': tensor(98.8125, device='cuda:1')}
{'training_loss': tensor(1.3821, device='cuda:1'), 'train_loss_ce': tensor(0.1972, device='cuda:1'), 'train_loss_bbox': tensor(0.0681, device='cuda:1'), 'train_loss_giou': tensor(0.4223, device='cuda:1'), 'train_cardinality_error': tensor(0.4118, device='cuda:1'), 'validation_loss': tensor(1.6166, device='cuda:1'), 'validation_loss_ce': tensor(0.2388, device='cuda:1'), 'validation_loss_bbox': tensor(0.0936, device='cuda:1'), 'validation_loss_giou': tensor(0.4548, device='cuda:1'), 'validation_cardinality_error': tensor(0.5118, device='cuda:1')}
```

## Examples

{'size': tensor([560, 512]), 'image_id': tensor([1]), 'class_labels': tensor([], dtype=torch.int64), 'boxes': tensor([], size=(0, 4)), 'area': tensor([]), 'iscrowd': tensor([], dtype=torch.int64), 'orig_size': tensor([2580, 2332])}

![Example](./example.png)
[ "aortic enlargement", "atelectasis", "calcification", "cardiomegaly", "consolidation", "ild", "infiltration", "lung opacity", "nodule/mass", "other lesion", "pleural effusion", "pleural thickening", "pneumothorax", "pulmonary fibrosis", "no finding" ]
0llheaven/Conditional-detr-finetuned
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "pneumonia", "0ther" ]
ArjDLAI/tatr-finetuned-for-dlai
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "table", "table column", "table row", "table column header", "table projected row header", "table spanning cell" ]
0llheaven/Conditional-detr-finetuned-V2
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "pneumonia", "0ther" ]
abhis91/detr_model
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6" ]
abhis91/detr_new_model
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6" ]
0llheaven/Conditional-detr-finetuned-V3
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "pneumonia", "0ther" ]
abhis91/detr_new_model_darkcircle
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6" ]
skyBluezz/detr-finetuned
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6", "label_7", "label_8", "label_9", "label_10", "label_11", "label_12", "label_13", "label_14", "label_15", "label_16", "label_17", "label_18", "label_19", "label_20" ]
danelcsb/rtdetr_v2_r18vd
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "person", "bicycle", "car", "motorbike", "aeroplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "backpack", "umbrella", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "sofa", "pottedplant", "bed", "diningtable", "toilet", "tvmonitor", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush" ]
Manuappu5670/detr-finetuned-cppe-5-10k-steps
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-finetuned-cppe-5-10k-steps This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. It achieves the following results on the evaluation set: - Loss: 1.2683 - Map: 0.2876 - Map 50: 0.548 - Map 75: 0.268 - Map Coverall: 0.5615 - Map Face Shield: 0.2396 - Map Gloves: 0.2085 - Map Goggles: 0.1386 - Map Mask: 0.2901 - Map Large: 0.4466 - Map Medium: 0.2337 - Map Small: 0.0849 - Mar 1: 0.2822 - Mar 10: 0.4586 - Mar 100: 0.471 - Mar 100 Coverall: 0.6995 - Mar 100 Face Shield: 0.5013 - Mar 100 Gloves: 0.3946 - Mar 100 Goggles: 0.3754 - Mar 100 Mask: 0.384 - Mar Large: 0.6399 - Mar Medium: 0.4132 - Mar Small: 0.2103 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100.0 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Coverall | Map Face Shield | Map Gloves | Map Goggles | Map Mask | Map Large | Map Medium | Map Small | Mar 1 | Mar 10 | Mar 100 | Mar 100 Coverall | Mar 100 Face Shield | Mar 100 Gloves | Mar 100 Goggles | Mar 100 Mask | Mar Large | Mar Medium | Mar Small | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:------------:|:---------------:|:----------:|:-----------:|:--------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:----------------:|:-------------------:|:--------------:|:---------------:|:------------:|:---------:|:----------:|:---------:| | 2.5588 | 1.0 | 107 | 2.5836 | 0.0189 | 0.0371 | 0.017 | 0.0728 | 0.0 | 0.0017 | 0.0 | 0.0201 | 0.0198 | 0.0199 | 0.004 | 0.045 | 0.0925 | 0.1133 | 0.2428 | 0.0 | 0.1125 | 0.0 | 0.2111 | 0.1708 | 0.0911 | 0.0268 | | 2.2402 | 2.0 | 214 | 2.7766 | 0.0381 | 0.0854 | 0.0301 | 0.1538 | 0.0 | 0.0027 | 0.0 | 0.0338 | 0.0476 | 0.0275 | 0.0078 | 0.0705 | 0.1321 | 0.1381 | 0.4532 | 0.0 | 0.0576 | 0.0 | 0.1796 | 0.1649 | 0.0957 | 0.0334 | | 2.1058 | 3.0 | 321 | 2.1333 | 0.0441 | 0.106 | 0.0342 | 0.1568 | 0.0 | 0.0175 | 0.0 | 0.0464 | 0.0512 | 0.0314 | 0.0193 | 0.076 | 0.1619 | 0.1837 | 0.5527 | 0.0 | 0.1621 | 0.0 | 0.2036 | 0.2292 | 0.1393 | 0.0466 | | 1.8196 | 4.0 | 428 | 2.0629 | 0.051 | 0.1266 | 0.0344 | 0.1591 | 0.0 | 0.0068 | 0.0007 | 0.0883 | 0.065 | 0.0424 | 0.0161 | 0.0825 | 0.162 | 0.189 | 0.5351 | 0.0 | 0.1576 | 0.0046 | 0.2476 | 0.2274 | 0.1422 | 0.06 | | 1.7891 | 5.0 | 535 | 2.0399 | 0.0539 | 0.1328 | 0.0344 | 0.1232 | 0.0003 | 0.0279 | 0.0012 | 0.1167 | 0.0796 | 0.0677 | 0.0243 | 0.071 | 0.1731 | 0.2076 | 0.5383 | 0.0025 | 0.2254 | 0.0292 | 0.2427 | 0.2783 | 0.1554 | 0.0689 | | 1.7706 | 6.0 | 642 | 1.8604 | 0.0952 | 0.222 | 0.0713 | 0.3453 | 0.002 | 0.0296 | 0.0008 | 0.0983 | 0.107 | 0.0771 | 0.029 | 0.1126 | 0.2069 | 0.2277 | 0.6108 | 0.0405 | 0.1924 | 0.0892 | 0.2058 | 0.3101 | 0.1681 | 0.0683 | | 1.6854 | 7.0 | 749 | 1.8219 | 0.1078 | 0.2395 | 0.0842 | 0.3538 | 0.0165 | 0.0316 | 0.0084 | 0.1289 | 0.1379 | 0.0985 | 0.0377 | 0.1338 | 0.2557 | 
0.2776 | 0.5527 | 0.1203 | 0.2491 | 0.1677 | 0.2982 | 0.3665 | 0.2327 | 0.1161 | | 1.5801 | 8.0 | 856 | 1.7859 | 0.1217 | 0.2614 | 0.1017 | 0.4204 | 0.0264 | 0.0411 | 0.0024 | 0.1184 | 0.1512 | 0.0877 | 0.0184 | 0.1429 | 0.2905 | 0.3167 | 0.6414 | 0.3089 | 0.2567 | 0.1015 | 0.2751 | 0.4161 | 0.2536 | 0.0875 | | 1.5666 | 9.0 | 963 | 1.7921 | 0.1176 | 0.2803 | 0.0871 | 0.3524 | 0.0458 | 0.0532 | 0.0066 | 0.13 | 0.1568 | 0.1007 | 0.0249 | 0.1566 | 0.2935 | 0.3133 | 0.5626 | 0.2785 | 0.2835 | 0.1862 | 0.2556 | 0.4029 | 0.2667 | 0.1035 | | 1.5349 | 10.0 | 1070 | 1.7596 | 0.1323 | 0.2877 | 0.1122 | 0.4035 | 0.0477 | 0.0512 | 0.0115 | 0.1479 | 0.2039 | 0.0962 | 0.0298 | 0.1723 | 0.3053 | 0.3315 | 0.5811 | 0.2975 | 0.2857 | 0.2108 | 0.2827 | 0.4874 | 0.26 | 0.0993 | | 1.5319 | 11.0 | 1177 | 1.8014 | 0.1077 | 0.2538 | 0.0798 | 0.3197 | 0.0526 | 0.0417 | 0.006 | 0.1183 | 0.1372 | 0.0797 | 0.0359 | 0.1409 | 0.271 | 0.291 | 0.5347 | 0.2468 | 0.2482 | 0.1477 | 0.2773 | 0.3782 | 0.2435 | 0.1001 | | 1.5349 | 12.0 | 1284 | 1.7285 | 0.1314 | 0.3 | 0.0875 | 0.3845 | 0.0591 | 0.0477 | 0.0095 | 0.156 | 0.1712 | 0.1022 | 0.0687 | 0.1497 | 0.2993 | 0.3162 | 0.5545 | 0.3177 | 0.2629 | 0.1538 | 0.292 | 0.4104 | 0.2602 | 0.1458 | | 1.6402 | 13.0 | 1391 | 1.7097 | 0.1283 | 0.2851 | 0.0997 | 0.3808 | 0.0348 | 0.0565 | 0.0109 | 0.1587 | 0.1878 | 0.0979 | 0.0372 | 0.1636 | 0.3236 | 0.3445 | 0.5842 | 0.2899 | 0.2973 | 0.2615 | 0.2898 | 0.4481 | 0.2919 | 0.1307 | | 1.5345 | 14.0 | 1498 | 1.7460 | 0.13 | 0.2783 | 0.1091 | 0.3747 | 0.0664 | 0.0602 | 0.0213 | 0.1276 | 0.1994 | 0.0909 | 0.0476 | 0.1856 | 0.3292 | 0.3538 | 0.5595 | 0.4038 | 0.2839 | 0.2538 | 0.268 | 0.4851 | 0.2861 | 0.1474 | | 1.4773 | 15.0 | 1605 | 1.6503 | 0.152 | 0.3114 | 0.1354 | 0.4279 | 0.07 | 0.0538 | 0.016 | 0.1922 | 0.222 | 0.1119 | 0.0422 | 0.2026 | 0.3373 | 0.3607 | 0.6185 | 0.3506 | 0.2576 | 0.2431 | 0.3338 | 0.5023 | 0.2948 | 0.1211 | | 1.4822 | 16.0 | 1712 | 1.6648 | 0.1429 | 0.304 | 0.1187 | 0.4522 | 0.0567 | 0.0463 | 0.0083 | 0.1512 | 0.2121 | 0.1058 | 0.0414 | 0.1637 | 0.3194 | 0.3386 | 0.641 | 0.3456 | 0.2411 | 0.1815 | 0.284 | 0.4395 | 0.2802 | 0.1365 | | 1.4557 | 17.0 | 1819 | 1.6038 | 0.1592 | 0.3358 | 0.1331 | 0.4363 | 0.0785 | 0.0755 | 0.0167 | 0.1888 | 0.2358 | 0.1184 | 0.0566 | 0.191 | 0.3525 | 0.3758 | 0.6396 | 0.343 | 0.3129 | 0.2477 | 0.3356 | 0.5437 | 0.2989 | 0.1301 | | 1.4644 | 18.0 | 1926 | 1.5534 | 0.1588 | 0.3444 | 0.1257 | 0.4351 | 0.0745 | 0.0715 | 0.0269 | 0.1859 | 0.2247 | 0.1262 | 0.0442 | 0.197 | 0.3433 | 0.3615 | 0.6387 | 0.3443 | 0.2964 | 0.22 | 0.308 | 0.4799 | 0.2992 | 0.1397 | | 1.3847 | 19.0 | 2033 | 1.5755 | 0.1583 | 0.3576 | 0.1261 | 0.4503 | 0.0837 | 0.0715 | 0.0187 | 0.1675 | 0.2379 | 0.1209 | 0.0607 | 0.1929 | 0.354 | 0.3715 | 0.6356 | 0.3544 | 0.3121 | 0.2492 | 0.3062 | 0.4949 | 0.3157 | 0.1753 | | 1.4198 | 20.0 | 2140 | 1.7037 | 0.1494 | 0.3249 | 0.1212 | 0.4156 | 0.0408 | 0.0673 | 0.0314 | 0.1921 | 0.2208 | 0.1153 | 0.0494 | 0.1936 | 0.3216 | 0.3395 | 0.6095 | 0.2924 | 0.2629 | 0.2292 | 0.3036 | 0.4689 | 0.2765 | 0.1058 | | 1.3866 | 21.0 | 2247 | 1.5504 | 0.1694 | 0.3681 | 0.1379 | 0.4553 | 0.0792 | 0.0844 | 0.0318 | 0.196 | 0.2608 | 0.1283 | 0.0575 | 0.2032 | 0.3554 | 0.371 | 0.6365 | 0.3342 | 0.3196 | 0.2523 | 0.3124 | 0.5051 | 0.3138 | 0.1409 | | 1.3592 | 22.0 | 2354 | 1.5890 | 0.1604 | 0.3461 | 0.1333 | 0.4449 | 0.0731 | 0.0836 | 0.0173 | 0.183 | 0.2533 | 0.1223 | 0.0404 | 0.1985 | 0.3634 | 0.3789 | 0.6324 | 0.357 | 0.3254 | 0.2785 | 0.3013 | 0.5302 | 0.3158 | 0.1263 | | 1.3602 | 23.0 | 2461 | 1.6250 
| 0.1629 | 0.3567 | 0.1342 | 0.4446 | 0.0775 | 0.0669 | 0.0233 | 0.2023 | 0.2478 | 0.1267 | 0.043 | 0.1933 | 0.3375 | 0.3551 | 0.6167 | 0.3354 | 0.2777 | 0.2415 | 0.304 | 0.4965 | 0.2959 | 0.094 | | 1.3773 | 24.0 | 2568 | 1.5816 | 0.1766 | 0.3699 | 0.1473 | 0.4844 | 0.0793 | 0.0792 | 0.026 | 0.2142 | 0.2542 | 0.1417 | 0.0498 | 0.1935 | 0.3434 | 0.3658 | 0.6482 | 0.3177 | 0.3219 | 0.2292 | 0.312 | 0.4903 | 0.3218 | 0.1367 | | 1.3918 | 25.0 | 2675 | 1.5896 | 0.1545 | 0.3315 | 0.1288 | 0.4496 | 0.0587 | 0.0753 | 0.0218 | 0.1673 | 0.2443 | 0.1112 | 0.0442 | 0.1822 | 0.347 | 0.3684 | 0.6203 | 0.3772 | 0.2996 | 0.2538 | 0.2911 | 0.49 | 0.3188 | 0.1396 | | 1.3254 | 26.0 | 2782 | 1.5416 | 0.1723 | 0.3638 | 0.1517 | 0.4706 | 0.0722 | 0.0743 | 0.0394 | 0.2052 | 0.2513 | 0.1429 | 0.0496 | 0.1992 | 0.3594 | 0.3776 | 0.6333 | 0.3747 | 0.3121 | 0.2585 | 0.3093 | 0.5114 | 0.319 | 0.1536 | | 1.3393 | 27.0 | 2889 | 1.4658 | 0.184 | 0.3695 | 0.1653 | 0.4922 | 0.0744 | 0.0949 | 0.0226 | 0.2356 | 0.297 | 0.1286 | 0.0564 | 0.2007 | 0.3848 | 0.4005 | 0.6383 | 0.3861 | 0.3103 | 0.3308 | 0.3373 | 0.5409 | 0.3371 | 0.1833 | | 1.2816 | 28.0 | 2996 | 1.5467 | 0.1792 | 0.3764 | 0.1497 | 0.4785 | 0.0672 | 0.0855 | 0.0317 | 0.2331 | 0.265 | 0.1387 | 0.0474 | 0.2154 | 0.3713 | 0.3842 | 0.6351 | 0.362 | 0.3085 | 0.2738 | 0.3413 | 0.5269 | 0.321 | 0.1462 | | 1.298 | 29.0 | 3103 | 1.4971 | 0.1777 | 0.3909 | 0.1386 | 0.482 | 0.0694 | 0.0983 | 0.0289 | 0.21 | 0.2805 | 0.134 | 0.0686 | 0.2097 | 0.3774 | 0.3919 | 0.6541 | 0.3532 | 0.3201 | 0.2908 | 0.3413 | 0.5229 | 0.3339 | 0.1733 | | 1.3079 | 30.0 | 3210 | 1.5049 | 0.1741 | 0.3718 | 0.1437 | 0.4674 | 0.071 | 0.0998 | 0.0228 | 0.2097 | 0.2751 | 0.1147 | 0.0593 | 0.2156 | 0.3669 | 0.383 | 0.6279 | 0.319 | 0.3259 | 0.3015 | 0.3404 | 0.5281 | 0.2951 | 0.1667 | | 1.2494 | 31.0 | 3317 | 1.4427 | 0.1919 | 0.4041 | 0.1543 | 0.4876 | 0.0968 | 0.1214 | 0.0282 | 0.2253 | 0.3002 | 0.1385 | 0.0653 | 0.2212 | 0.382 | 0.3987 | 0.6419 | 0.3937 | 0.3634 | 0.2738 | 0.3209 | 0.5467 | 0.3201 | 0.1964 | | 1.2191 | 32.0 | 3424 | 1.4607 | 0.182 | 0.3715 | 0.1605 | 0.48 | 0.0694 | 0.1278 | 0.0247 | 0.2083 | 0.304 | 0.1289 | 0.0594 | 0.2245 | 0.3891 | 0.4068 | 0.6279 | 0.3873 | 0.3424 | 0.3508 | 0.3253 | 0.5578 | 0.3428 | 0.1695 | | 1.2032 | 33.0 | 3531 | 1.4382 | 0.2014 | 0.4103 | 0.1757 | 0.4792 | 0.1109 | 0.1354 | 0.0283 | 0.2534 | 0.3126 | 0.1516 | 0.0608 | 0.2291 | 0.3929 | 0.4089 | 0.6324 | 0.3949 | 0.3576 | 0.3123 | 0.3471 | 0.5595 | 0.3468 | 0.1781 | | 1.2524 | 34.0 | 3638 | 1.5703 | 0.1823 | 0.3973 | 0.1353 | 0.4378 | 0.0926 | 0.12 | 0.0292 | 0.2316 | 0.2772 | 0.1383 | 0.0732 | 0.2236 | 0.3887 | 0.4039 | 0.6086 | 0.4165 | 0.3348 | 0.3185 | 0.3413 | 0.5526 | 0.3554 | 0.1525 | | 1.242 | 35.0 | 3745 | 1.5388 | 0.1906 | 0.4011 | 0.1575 | 0.4706 | 0.0915 | 0.1242 | 0.039 | 0.2278 | 0.3073 | 0.1399 | 0.0547 | 0.2317 | 0.3938 | 0.4173 | 0.6144 | 0.4038 | 0.329 | 0.3969 | 0.3422 | 0.574 | 0.3629 | 0.1648 | | 1.229 | 36.0 | 3852 | 1.4810 | 0.1744 | 0.3838 | 0.1452 | 0.4299 | 0.0909 | 0.118 | 0.0195 | 0.2136 | 0.2838 | 0.1259 | 0.0516 | 0.2159 | 0.3755 | 0.3965 | 0.5802 | 0.4177 | 0.3424 | 0.3215 | 0.3204 | 0.5457 | 0.3359 | 0.1789 | | 1.2041 | 37.0 | 3959 | 1.4555 | 0.1896 | 0.3921 | 0.155 | 0.4721 | 0.0731 | 0.1309 | 0.0379 | 0.2339 | 0.2935 | 0.1495 | 0.0486 | 0.2215 | 0.3904 | 0.4061 | 0.6243 | 0.4114 | 0.3527 | 0.3015 | 0.3404 | 0.5462 | 0.3605 | 0.1592 | | 1.1762 | 38.0 | 4066 | 1.4493 | 0.1818 | 0.3896 | 0.1507 | 0.4733 | 0.0844 | 0.1198 | 0.0236 | 0.2081 | 0.2867 | 0.1387 | 0.046 | 0.2103 | 
0.3867 | 0.411 | 0.6387 | 0.4367 | 0.3607 | 0.3046 | 0.3142 | 0.532 | 0.362 | 0.1857 | | 1.1731 | 39.0 | 4173 | 1.4147 | 0.1999 | 0.4084 | 0.1802 | 0.5015 | 0.1047 | 0.1363 | 0.0317 | 0.2254 | 0.3095 | 0.1505 | 0.0556 | 0.2195 | 0.4139 | 0.4268 | 0.6441 | 0.4519 | 0.3598 | 0.3569 | 0.3213 | 0.5706 | 0.3736 | 0.1939 | | 1.1373 | 40.0 | 4280 | 1.4616 | 0.2065 | 0.4203 | 0.179 | 0.5155 | 0.0951 | 0.1367 | 0.0446 | 0.2406 | 0.3142 | 0.1577 | 0.0629 | 0.2322 | 0.3984 | 0.416 | 0.6653 | 0.4051 | 0.3558 | 0.32 | 0.3338 | 0.5772 | 0.3599 | 0.1669 | | 1.1555 | 41.0 | 4387 | 1.4085 | 0.2056 | 0.4298 | 0.1852 | 0.5084 | 0.0938 | 0.1467 | 0.0318 | 0.2473 | 0.3073 | 0.1579 | 0.0613 | 0.2332 | 0.4107 | 0.4286 | 0.6599 | 0.4506 | 0.3522 | 0.3308 | 0.3493 | 0.5641 | 0.3887 | 0.1693 | | 1.1251 | 42.0 | 4494 | 1.4251 | 0.2016 | 0.4199 | 0.1708 | 0.4991 | 0.0886 | 0.1526 | 0.0353 | 0.2325 | 0.3155 | 0.1518 | 0.0622 | 0.2355 | 0.4061 | 0.4188 | 0.6369 | 0.4266 | 0.3571 | 0.3262 | 0.3471 | 0.5733 | 0.36 | 0.1816 | | 1.13 | 43.0 | 4601 | 1.4109 | 0.2082 | 0.4235 | 0.1871 | 0.5153 | 0.0981 | 0.1493 | 0.0355 | 0.243 | 0.3314 | 0.1561 | 0.0468 | 0.2499 | 0.4077 | 0.4286 | 0.6527 | 0.4443 | 0.3504 | 0.3538 | 0.3418 | 0.5721 | 0.3896 | 0.1574 | | 1.1054 | 44.0 | 4708 | 1.4445 | 0.1986 | 0.414 | 0.1641 | 0.4908 | 0.0998 | 0.1489 | 0.0334 | 0.2202 | 0.3213 | 0.1555 | 0.0467 | 0.2271 | 0.3984 | 0.4155 | 0.6622 | 0.3987 | 0.3696 | 0.3123 | 0.3347 | 0.5447 | 0.3817 | 0.1669 | | 1.0976 | 45.0 | 4815 | 1.3522 | 0.2204 | 0.4545 | 0.1927 | 0.5164 | 0.1406 | 0.1522 | 0.0447 | 0.2481 | 0.3496 | 0.1758 | 0.0601 | 0.2479 | 0.4202 | 0.4408 | 0.6626 | 0.4316 | 0.3692 | 0.3785 | 0.3622 | 0.5881 | 0.4096 | 0.1567 | | 1.0902 | 46.0 | 4922 | 1.3911 | 0.2255 | 0.459 | 0.1903 | 0.5246 | 0.1499 | 0.1554 | 0.0443 | 0.2531 | 0.3737 | 0.1667 | 0.0562 | 0.2534 | 0.4091 | 0.427 | 0.6608 | 0.4278 | 0.3321 | 0.3492 | 0.3649 | 0.5908 | 0.3789 | 0.1419 | | 1.0924 | 47.0 | 5029 | 1.4129 | 0.2179 | 0.438 | 0.1968 | 0.5126 | 0.1287 | 0.1623 | 0.0357 | 0.2502 | 0.3678 | 0.1568 | 0.064 | 0.2464 | 0.4227 | 0.4414 | 0.6613 | 0.4671 | 0.3656 | 0.3615 | 0.3516 | 0.6118 | 0.3921 | 0.156 | | 1.0578 | 48.0 | 5136 | 1.3591 | 0.226 | 0.458 | 0.1983 | 0.5164 | 0.1453 | 0.1699 | 0.0535 | 0.245 | 0.3499 | 0.1859 | 0.0795 | 0.2654 | 0.4282 | 0.4467 | 0.6649 | 0.4291 | 0.3714 | 0.4092 | 0.3587 | 0.6001 | 0.4059 | 0.1825 | | 1.0784 | 49.0 | 5243 | 1.3628 | 0.2351 | 0.4789 | 0.2056 | 0.5266 | 0.1709 | 0.1683 | 0.0821 | 0.2278 | 0.3749 | 0.1851 | 0.0632 | 0.2463 | 0.4193 | 0.434 | 0.6635 | 0.4241 | 0.3723 | 0.3769 | 0.3333 | 0.5954 | 0.3687 | 0.1871 | | 1.0538 | 50.0 | 5350 | 1.3747 | 0.2235 | 0.4678 | 0.1824 | 0.5198 | 0.1498 | 0.1553 | 0.0658 | 0.2268 | 0.3455 | 0.1741 | 0.0559 | 0.251 | 0.4235 | 0.441 | 0.6698 | 0.4658 | 0.3688 | 0.3646 | 0.336 | 0.5987 | 0.3796 | 0.2044 | | 1.073 | 51.0 | 5457 | 1.3670 | 0.2327 | 0.4743 | 0.2008 | 0.5208 | 0.1665 | 0.153 | 0.0708 | 0.2526 | 0.3665 | 0.1921 | 0.065 | 0.2572 | 0.4244 | 0.4447 | 0.6685 | 0.4519 | 0.3754 | 0.3738 | 0.3538 | 0.6049 | 0.3983 | 0.1842 | | 1.046 | 52.0 | 5564 | 1.3740 | 0.2274 | 0.4689 | 0.2021 | 0.5104 | 0.1316 | 0.1589 | 0.085 | 0.2511 | 0.3452 | 0.1893 | 0.0699 | 0.2603 | 0.4256 | 0.4417 | 0.6599 | 0.443 | 0.3853 | 0.3646 | 0.3556 | 0.6094 | 0.3834 | 0.1673 | | 1.0357 | 53.0 | 5671 | 1.3717 | 0.234 | 0.4706 | 0.205 | 0.5205 | 0.1471 | 0.1758 | 0.0706 | 0.2559 | 0.3477 | 0.1915 | 0.0707 | 0.2489 | 0.4459 | 0.4583 | 0.6752 | 0.4696 | 0.3844 | 0.4108 | 0.3516 | 0.6231 | 0.4067 | 0.1743 | | 1.0409 | 54.0 | 
5778 | 1.3340 | 0.2379 | 0.4808 | 0.2053 | 0.5229 | 0.1567 | 0.1758 | 0.078 | 0.2562 | 0.3685 | 0.185 | 0.0666 | 0.257 | 0.4239 | 0.4391 | 0.6788 | 0.4519 | 0.3853 | 0.3246 | 0.3551 | 0.6022 | 0.3845 | 0.1809 | | 0.9965 | 55.0 | 5885 | 1.3820 | 0.2278 | 0.4612 | 0.2021 | 0.5011 | 0.1362 | 0.1706 | 0.0689 | 0.262 | 0.357 | 0.1874 | 0.05 | 0.2588 | 0.4149 | 0.4334 | 0.6586 | 0.4165 | 0.3915 | 0.3354 | 0.3649 | 0.5898 | 0.3901 | 0.1513 | | 1.0164 | 56.0 | 5992 | 1.3389 | 0.2296 | 0.4745 | 0.2029 | 0.5153 | 0.1461 | 0.1701 | 0.0799 | 0.2365 | 0.3576 | 0.1839 | 0.053 | 0.2526 | 0.4205 | 0.438 | 0.6743 | 0.4203 | 0.4062 | 0.3462 | 0.3431 | 0.5991 | 0.3765 | 0.1948 | | 1.0269 | 57.0 | 6099 | 1.3383 | 0.2428 | 0.4846 | 0.2082 | 0.517 | 0.1608 | 0.1881 | 0.0976 | 0.2505 | 0.3809 | 0.1975 | 0.0578 | 0.2657 | 0.4294 | 0.4426 | 0.6757 | 0.4532 | 0.383 | 0.3369 | 0.364 | 0.6041 | 0.3911 | 0.1599 | | 1.0046 | 58.0 | 6206 | 1.3327 | 0.242 | 0.5002 | 0.2153 | 0.5332 | 0.1674 | 0.1622 | 0.1032 | 0.2439 | 0.3925 | 0.1925 | 0.0694 | 0.2554 | 0.4239 | 0.4404 | 0.682 | 0.4595 | 0.3629 | 0.3415 | 0.356 | 0.6121 | 0.3883 | 0.1649 | | 0.9824 | 59.0 | 6313 | 1.3146 | 0.2374 | 0.4831 | 0.2061 | 0.532 | 0.1447 | 0.1729 | 0.0735 | 0.2638 | 0.3835 | 0.1987 | 0.0489 | 0.2584 | 0.4291 | 0.4468 | 0.6811 | 0.4506 | 0.3777 | 0.3508 | 0.3738 | 0.6235 | 0.405 | 0.1578 | | 0.993 | 60.0 | 6420 | 1.3323 | 0.2473 | 0.4911 | 0.2301 | 0.5446 | 0.1633 | 0.1693 | 0.0918 | 0.2675 | 0.3966 | 0.1955 | 0.062 | 0.2581 | 0.4351 | 0.4493 | 0.6959 | 0.4494 | 0.3741 | 0.3585 | 0.3684 | 0.6193 | 0.399 | 0.1689 | | 0.9693 | 61.0 | 6527 | 1.3266 | 0.2482 | 0.5025 | 0.2301 | 0.531 | 0.1638 | 0.1818 | 0.0934 | 0.271 | 0.3887 | 0.208 | 0.0719 | 0.2538 | 0.4263 | 0.4394 | 0.6739 | 0.4203 | 0.3759 | 0.3569 | 0.3702 | 0.6136 | 0.3829 | 0.1565 | | 0.9659 | 62.0 | 6634 | 1.3103 | 0.2489 | 0.5113 | 0.2211 | 0.5394 | 0.1635 | 0.2051 | 0.0788 | 0.2577 | 0.3992 | 0.1974 | 0.0613 | 0.2531 | 0.4417 | 0.4587 | 0.6874 | 0.443 | 0.4036 | 0.3938 | 0.3658 | 0.6242 | 0.4077 | 0.1785 | | 0.9812 | 63.0 | 6741 | 1.3091 | 0.2484 | 0.5071 | 0.2221 | 0.5391 | 0.1971 | 0.1813 | 0.068 | 0.2565 | 0.3881 | 0.2023 | 0.059 | 0.261 | 0.4321 | 0.4472 | 0.6842 | 0.4443 | 0.3683 | 0.3815 | 0.3578 | 0.6004 | 0.4011 | 0.1762 | | 0.9443 | 64.0 | 6848 | 1.3034 | 0.255 | 0.5053 | 0.234 | 0.5466 | 0.1946 | 0.196 | 0.0808 | 0.2569 | 0.4079 | 0.1967 | 0.0655 | 0.2622 | 0.4358 | 0.4519 | 0.6932 | 0.4658 | 0.3781 | 0.3569 | 0.3653 | 0.6256 | 0.4023 | 0.1695 | | 0.9468 | 65.0 | 6955 | 1.3067 | 0.2543 | 0.5115 | 0.2286 | 0.5465 | 0.2078 | 0.1787 | 0.0851 | 0.2533 | 0.3954 | 0.2185 | 0.0509 | 0.2679 | 0.4375 | 0.4469 | 0.6806 | 0.4595 | 0.375 | 0.3646 | 0.3547 | 0.6126 | 0.4035 | 0.1647 | | 0.9427 | 66.0 | 7062 | 1.3082 | 0.2566 | 0.5135 | 0.2296 | 0.5537 | 0.1822 | 0.1903 | 0.1002 | 0.2565 | 0.4108 | 0.2056 | 0.0582 | 0.2631 | 0.4291 | 0.444 | 0.6941 | 0.4481 | 0.3754 | 0.3446 | 0.3578 | 0.6226 | 0.3932 | 0.164 | | 0.9407 | 67.0 | 7169 | 1.2939 | 0.2554 | 0.5046 | 0.2354 | 0.5447 | 0.1809 | 0.1784 | 0.1029 | 0.2699 | 0.3997 | 0.209 | 0.0651 | 0.2775 | 0.4442 | 0.459 | 0.6883 | 0.4405 | 0.4009 | 0.3769 | 0.3884 | 0.6138 | 0.4135 | 0.2002 | | 0.9415 | 68.0 | 7276 | 1.3124 | 0.2646 | 0.5194 | 0.2399 | 0.5569 | 0.1971 | 0.1767 | 0.1011 | 0.2914 | 0.4107 | 0.2057 | 0.0701 | 0.2679 | 0.4475 | 0.4654 | 0.6973 | 0.457 | 0.3781 | 0.4 | 0.3947 | 0.6329 | 0.4028 | 0.2046 | | 0.9355 | 69.0 | 7383 | 1.3057 | 0.2559 | 0.5018 | 0.2431 | 0.564 | 0.1882 | 0.1702 | 0.075 | 0.2823 | 0.397 | 0.2078 | 0.0678 | 
0.2693 | 0.4378 | 0.4526 | 0.7054 | 0.457 | 0.3714 | 0.3354 | 0.3938 | 0.6233 | 0.3913 | 0.2025 | | 0.9219 | 70.0 | 7490 | 1.3124 | 0.2597 | 0.5199 | 0.237 | 0.5355 | 0.2125 | 0.1808 | 0.0888 | 0.2806 | 0.3964 | 0.2092 | 0.0786 | 0.2741 | 0.4348 | 0.4492 | 0.6739 | 0.4671 | 0.3933 | 0.3323 | 0.3796 | 0.6152 | 0.3894 | 0.1811 | | 0.924 | 71.0 | 7597 | 1.2846 | 0.2617 | 0.5187 | 0.2391 | 0.5557 | 0.1992 | 0.1934 | 0.0901 | 0.2704 | 0.4049 | 0.2025 | 0.0702 | 0.2746 | 0.4484 | 0.4624 | 0.6851 | 0.4582 | 0.3875 | 0.4015 | 0.3796 | 0.6209 | 0.4033 | 0.197 | | 0.9059 | 72.0 | 7704 | 1.3033 | 0.2654 | 0.5153 | 0.2509 | 0.5439 | 0.2186 | 0.1882 | 0.0907 | 0.2857 | 0.4047 | 0.2232 | 0.0667 | 0.2793 | 0.4522 | 0.4674 | 0.6883 | 0.4861 | 0.3893 | 0.3892 | 0.384 | 0.6274 | 0.4195 | 0.2061 | | 0.898 | 73.0 | 7811 | 1.3032 | 0.2566 | 0.5055 | 0.2341 | 0.5384 | 0.188 | 0.1921 | 0.0934 | 0.271 | 0.4023 | 0.1998 | 0.0624 | 0.2678 | 0.4448 | 0.4627 | 0.677 | 0.4747 | 0.3955 | 0.3908 | 0.3756 | 0.618 | 0.4065 | 0.2145 | | 0.8891 | 74.0 | 7918 | 1.3166 | 0.2575 | 0.5163 | 0.2318 | 0.5505 | 0.1876 | 0.1786 | 0.0896 | 0.2811 | 0.3978 | 0.2063 | 0.0807 | 0.2714 | 0.4399 | 0.4573 | 0.6937 | 0.462 | 0.396 | 0.3477 | 0.3871 | 0.6207 | 0.405 | 0.2097 | | 0.8795 | 75.0 | 8025 | 1.2989 | 0.2629 | 0.5072 | 0.2471 | 0.5572 | 0.198 | 0.1819 | 0.0975 | 0.28 | 0.4177 | 0.2137 | 0.0777 | 0.2667 | 0.4464 | 0.4623 | 0.6973 | 0.4582 | 0.4004 | 0.3738 | 0.3818 | 0.6225 | 0.4086 | 0.2155 | | 0.8835 | 76.0 | 8132 | 1.2978 | 0.2714 | 0.5307 | 0.2606 | 0.5548 | 0.2234 | 0.1905 | 0.1016 | 0.2867 | 0.4302 | 0.2229 | 0.086 | 0.2744 | 0.4566 | 0.4724 | 0.6946 | 0.5038 | 0.3964 | 0.3831 | 0.384 | 0.6389 | 0.4226 | 0.2132 | | 0.9 | 77.0 | 8239 | 1.2916 | 0.2674 | 0.5237 | 0.242 | 0.5506 | 0.2127 | 0.1916 | 0.1094 | 0.2727 | 0.4176 | 0.2194 | 0.0704 | 0.2639 | 0.4432 | 0.4565 | 0.6914 | 0.4532 | 0.4004 | 0.36 | 0.3773 | 0.6091 | 0.3982 | 0.2424 | | 0.8872 | 78.0 | 8346 | 1.2752 | 0.2744 | 0.5395 | 0.2546 | 0.5543 | 0.2139 | 0.1967 | 0.121 | 0.2864 | 0.4319 | 0.2262 | 0.0621 | 0.2721 | 0.4467 | 0.4647 | 0.6964 | 0.481 | 0.3879 | 0.3692 | 0.3889 | 0.6395 | 0.4137 | 0.1906 | | 0.8767 | 79.0 | 8453 | 1.2916 | 0.2737 | 0.5397 | 0.2503 | 0.5436 | 0.2175 | 0.1985 | 0.1219 | 0.287 | 0.4288 | 0.2291 | 0.0837 | 0.2797 | 0.4519 | 0.4664 | 0.6959 | 0.4873 | 0.392 | 0.3631 | 0.3938 | 0.6321 | 0.4188 | 0.1945 | | 0.8785 | 80.0 | 8560 | 1.3004 | 0.2719 | 0.537 | 0.2452 | 0.549 | 0.2249 | 0.1836 | 0.1143 | 0.2876 | 0.426 | 0.2257 | 0.0863 | 0.2821 | 0.4519 | 0.4615 | 0.6892 | 0.4835 | 0.3839 | 0.3585 | 0.3924 | 0.6326 | 0.4125 | 0.1763 | | 0.8733 | 81.0 | 8667 | 1.2918 | 0.2688 | 0.5325 | 0.2473 | 0.5406 | 0.2208 | 0.1963 | 0.1138 | 0.2725 | 0.421 | 0.2187 | 0.0796 | 0.2771 | 0.4499 | 0.4635 | 0.6946 | 0.5038 | 0.3893 | 0.3554 | 0.3747 | 0.6206 | 0.4117 | 0.2144 | | 0.8823 | 82.0 | 8774 | 1.3056 | 0.2712 | 0.5299 | 0.2497 | 0.5278 | 0.2231 | 0.1861 | 0.1298 | 0.289 | 0.4239 | 0.2287 | 0.0651 | 0.2797 | 0.4489 | 0.4588 | 0.6869 | 0.4759 | 0.3737 | 0.3723 | 0.3853 | 0.6287 | 0.4063 | 0.1962 | | 0.8704 | 83.0 | 8881 | 1.2819 | 0.2695 | 0.5333 | 0.2459 | 0.5283 | 0.2208 | 0.1939 | 0.1213 | 0.2834 | 0.4246 | 0.2255 | 0.0779 | 0.279 | 0.4489 | 0.4591 | 0.6869 | 0.4759 | 0.3938 | 0.3569 | 0.3818 | 0.6371 | 0.3994 | 0.1888 | | 0.8465 | 84.0 | 8988 | 1.3033 | 0.2735 | 0.5394 | 0.2551 | 0.5387 | 0.2369 | 0.1953 | 0.1169 | 0.2797 | 0.4277 | 0.2244 | 0.0877 | 0.2786 | 0.4465 | 0.4575 | 0.6797 | 0.481 | 0.3951 | 0.3477 | 0.384 | 0.633 | 0.4044 | 0.1802 | | 0.8366 | 
85.0 | 9095 | 1.3053 | 0.2708 | 0.531 | 0.2427 | 0.5338 | 0.2317 | 0.1927 | 0.1134 | 0.2822 | 0.4223 | 0.2165 | 0.0786 | 0.2723 | 0.4379 | 0.4517 | 0.6752 | 0.4823 | 0.375 | 0.3477 | 0.3782 | 0.6049 | 0.4051 | 0.1898 | | 0.8491 | 86.0 | 9202 | 1.2840 | 0.2805 | 0.5355 | 0.2549 | 0.5392 | 0.2504 | 0.2054 | 0.1197 | 0.288 | 0.4282 | 0.2341 | 0.0783 | 0.2831 | 0.4491 | 0.4645 | 0.6811 | 0.4962 | 0.4018 | 0.3569 | 0.3867 | 0.6288 | 0.4189 | 0.199 | | 0.8473 | 87.0 | 9309 | 1.2829 | 0.2785 | 0.5435 | 0.2557 | 0.538 | 0.2432 | 0.2038 | 0.1273 | 0.2802 | 0.4339 | 0.2234 | 0.0868 | 0.2807 | 0.4509 | 0.4637 | 0.6815 | 0.4848 | 0.3879 | 0.3815 | 0.3827 | 0.6366 | 0.4085 | 0.2053 | | 0.844 | 88.0 | 9416 | 1.2751 | 0.2794 | 0.5433 | 0.2532 | 0.5424 | 0.2415 | 0.2012 | 0.1279 | 0.2842 | 0.4367 | 0.2218 | 0.0767 | 0.2785 | 0.4568 | 0.4691 | 0.6829 | 0.5051 | 0.3969 | 0.3754 | 0.3853 | 0.6326 | 0.4168 | 0.2101 | | 0.8411 | 89.0 | 9523 | 1.2844 | 0.2813 | 0.5461 | 0.2606 | 0.5418 | 0.2438 | 0.2045 | 0.1335 | 0.2831 | 0.4414 | 0.2267 | 0.0947 | 0.2824 | 0.4511 | 0.4647 | 0.6847 | 0.4911 | 0.3969 | 0.3662 | 0.3849 | 0.6306 | 0.4107 | 0.2043 | | 0.8375 | 90.0 | 9630 | 1.2949 | 0.2758 | 0.5364 | 0.2516 | 0.5433 | 0.2321 | 0.1937 | 0.1287 | 0.2811 | 0.4359 | 0.2251 | 0.0942 | 0.2797 | 0.4481 | 0.462 | 0.6824 | 0.481 | 0.3929 | 0.3677 | 0.3858 | 0.6303 | 0.4087 | 0.2091 | | 0.8338 | 91.0 | 9737 | 1.2828 | 0.2843 | 0.5496 | 0.2604 | 0.5478 | 0.2493 | 0.2043 | 0.1324 | 0.2877 | 0.4388 | 0.2413 | 0.0914 | 0.2825 | 0.4592 | 0.4735 | 0.6937 | 0.5025 | 0.4071 | 0.3738 | 0.3902 | 0.636 | 0.4261 | 0.2145 | | 0.83 | 92.0 | 9844 | 1.3003 | 0.2866 | 0.5615 | 0.2591 | 0.5463 | 0.2636 | 0.2034 | 0.1303 | 0.2896 | 0.4329 | 0.244 | 0.0747 | 0.2854 | 0.4557 | 0.4687 | 0.6914 | 0.5013 | 0.3942 | 0.3738 | 0.3827 | 0.6259 | 0.4222 | 0.1996 | | 0.8328 | 93.0 | 9951 | 1.2828 | 0.288 | 0.5537 | 0.2649 | 0.5568 | 0.2477 | 0.2126 | 0.1374 | 0.2854 | 0.4461 | 0.2385 | 0.0873 | 0.2819 | 0.4572 | 0.4693 | 0.7009 | 0.4873 | 0.4022 | 0.3769 | 0.3791 | 0.6381 | 0.4181 | 0.186 | | 0.8267 | 94.0 | 10058 | 1.2746 | 0.2883 | 0.5541 | 0.2651 | 0.5599 | 0.249 | 0.212 | 0.1343 | 0.2866 | 0.4479 | 0.2336 | 0.0997 | 0.2809 | 0.4596 | 0.4717 | 0.7045 | 0.4911 | 0.4022 | 0.3769 | 0.3836 | 0.6444 | 0.4129 | 0.2108 | | 0.8184 | 95.0 | 10165 | 1.2821 | 0.2885 | 0.5582 | 0.2654 | 0.5595 | 0.2428 | 0.2099 | 0.1405 | 0.29 | 0.4466 | 0.2342 | 0.0796 | 0.2815 | 0.4591 | 0.4703 | 0.7 | 0.4823 | 0.3973 | 0.3846 | 0.3871 | 0.6396 | 0.4184 | 0.2009 | | 0.8347 | 96.0 | 10272 | 1.2907 | 0.2886 | 0.5566 | 0.2638 | 0.5606 | 0.243 | 0.211 | 0.1368 | 0.2915 | 0.4469 | 0.2346 | 0.0904 | 0.2789 | 0.4536 | 0.4663 | 0.7005 | 0.4785 | 0.3955 | 0.3708 | 0.3862 | 0.6404 | 0.4093 | 0.1899 | | 0.8128 | 97.0 | 10379 | 1.2762 | 0.2894 | 0.5568 | 0.2646 | 0.5633 | 0.2445 | 0.2112 | 0.1379 | 0.2902 | 0.4477 | 0.2343 | 0.094 | 0.2824 | 0.4589 | 0.4704 | 0.7005 | 0.4949 | 0.3942 | 0.3738 | 0.3884 | 0.6454 | 0.4109 | 0.2085 | | 0.8138 | 98.0 | 10486 | 1.2712 | 0.2856 | 0.5452 | 0.2679 | 0.5633 | 0.2304 | 0.2102 | 0.137 | 0.2873 | 0.4444 | 0.2318 | 0.0849 | 0.2804 | 0.4581 | 0.4701 | 0.7023 | 0.4937 | 0.3978 | 0.3754 | 0.3813 | 0.6402 | 0.4112 | 0.204 | | 0.8153 | 99.0 | 10593 | 1.2698 | 0.2874 | 0.5483 | 0.2658 | 0.5605 | 0.2384 | 0.2089 | 0.1397 | 0.2896 | 0.4474 | 0.2323 | 0.0861 | 0.282 | 0.4581 | 0.4693 | 0.6986 | 0.4962 | 0.3938 | 0.3738 | 0.384 | 0.6407 | 0.4107 | 0.2069 | | 0.8216 | 100.0 | 10700 | 1.2683 | 0.2876 | 0.548 | 0.268 | 0.5615 | 0.2396 | 0.2085 | 0.1386 | 0.2901 | 
0.4466 | 0.2337 | 0.0849 | 0.2822 | 0.4586 | 0.471 | 0.6995 | 0.5013 | 0.3946 | 0.3754 | 0.384 | 0.6399 | 0.4132 | 0.2103 |

### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
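The pipe-separated metric columns in the long training table above follow the COCO-style mAP/mAR convention used throughout these cards (overall and per-class mAP at IoU 0.50:0.95, 0.50, 0.75, plus recall at 1/10/100 detections). As a rough illustration only, not the script used to produce this card, metrics of this shape can be computed with `torchmetrics`; the box values, label ids, and `xyxy` format below are placeholders.

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# class_metrics=True adds per-class entries analogous to the
# per-class Map/Mar columns reported in the table above.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# One dict per image; boxes are absolute pixel xyxy coordinates (assumed format).
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([0]),  # e.g. 0 = "coverall" in the label list below
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 115.0, 225.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"], results["mar_100"])
```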
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
0llheaven/Conditional-detr-finetuned-V4
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "pneumonia", "0ther" ]
0llheaven/Conditional-detr-finetuned-V5
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
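Since the "How to Get Started" section of this card is still a placeholder, here is a minimal, hypothetical sketch of running this pneumonia detector on a chest X-ray and drawing the predicted boxes. The image file name, threshold, and the assumption that the checkpoint works with the generic `object-detection` pipeline are illustrative, not taken from the card.

```python
from PIL import Image, ImageDraw
from transformers import pipeline

# Checkpoint id comes from this entry; the X-ray path is a placeholder.
detector = pipeline("object-detection", model="0llheaven/Conditional-detr-finetuned-V5")
image = Image.open("chest_xray.png").convert("RGB")

draw = ImageDraw.Draw(image)
for det in detector(image, threshold=0.5):
    box = det["box"]
    draw.rectangle((box["xmin"], box["ymin"], box["xmax"], box["ymax"]), outline="red", width=3)
    draw.text((box["xmin"], box["ymin"]), f'{det["label"]} {det["score"]:.2f}', fill="red")

image.save("chest_xray_annotated.png")
```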
[ "pneumonia", "0ther" ]
sergiopaniego/detr_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr_finetuned_cppe5

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
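For a quick smoke test of this checkpoint, the generic `object-detection` pipeline should be enough. This is a hedged sketch rather than an official usage snippet from the author; the image path and score threshold are placeholders.

```python
from transformers import pipeline

# Loads the fine-tuned conditional DETR checkpoint named in this card.
detector = pipeline("object-detection", model="sergiopaniego/detr_finetuned_cppe5")

# Returns a list of {"score", "label", "box": {"xmin", "ymin", "xmax", "ymax"}} dicts.
detections = detector("ppe_photo.jpg", threshold=0.5)  # placeholder image path
for det in detections:
    print(det["label"], round(det["score"], 3), det["box"])
```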
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
fimbit/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# detr-resnet-50_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
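The hyperparameters above map fairly directly onto `transformers.TrainingArguments` (the listed Adam betas and epsilon are its defaults). The sketch below shows one plausible way to express that configuration; the output directory is a placeholder, and the model, dataset, and collator wiring are not part of the original card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-resnet-50_finetuned_cppe5",  # placeholder path
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,  # "mixed_precision_training: Native AMP"
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are already the defaults.
)
```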
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
fimbit/detr_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr_finetuned_cppe5 This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.6381 - Map: 0.1544 - Map 50: 0.3211 - Map 75: 0.1335 - Map Small: 0.0306 - Map Medium: 0.1213 - Map Large: 0.2305 - Mar 1: 0.1679 - Mar 10: 0.3519 - Mar 100: 0.391 - Mar Small: 0.1517 - Mar Medium: 0.3448 - Mar Large: 0.5355 - Map Coverall: 0.453 - Mar 100 Coverall: 0.6428 - Map Face Shield: 0.0221 - Mar 100 Face Shield: 0.3165 - Map Gloves: 0.0499 - Mar 100 Gloves: 0.342 - Map Goggles: 0.0449 - Mar 100 Goggles: 0.2846 - Map Mask: 0.202 - Mar 100 Mask: 0.3693 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 48 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:| | No log | 1.0 | 18 | 6.9596 | 0.0001 | 0.0005 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.002 | 0.005 | 0.0144 | 0.0 | 0.0095 | 0.0214 | 0.0003 | 0.0599 | 0.0 | 0.0 | 0.0 | 0.0013 | 0.0 | 0.0 | 0.0001 | 0.0107 | | No log | 2.0 | 36 | 3.3034 | 0.0011 | 0.0045 | 0.0003 | 0.0011 | 0.0014 | 0.0031 | 0.0048 | 0.0328 | 0.0561 | 0.032 | 0.0559 | 0.0517 | 0.0021 | 0.1086 | 0.0 | 0.0063 | 0.0002 | 0.0263 | 0.0017 | 0.0508 | 0.0014 | 0.0884 | | No log | 3.0 | 54 | 2.8530 | 0.0039 | 0.0115 | 0.002 | 0.0008 | 0.0014 | 0.0113 | 0.0167 | 0.0698 | 0.1033 | 0.0214 | 0.0651 | 0.1309 | 0.0156 | 0.3122 | 0.0 | 0.0101 | 0.0011 | 0.0719 | 0.0002 | 0.0308 | 0.0025 | 0.0916 | | No log | 4.0 | 72 | 2.5424 | 0.0156 | 0.0329 | 0.0151 | 0.0015 | 0.0092 | 0.0199 | 0.0413 | 0.0949 | 0.1383 | 0.0522 | 0.1064 | 0.1807 | 0.0689 | 0.3369 | 0.0001 | 0.0013 | 0.002 | 0.1321 | 0.0005 | 0.0385 | 0.0063 | 0.1827 | | No log | 5.0 | 90 | 2.4114 | 0.0201 | 0.0454 | 0.0151 | 0.0046 | 0.0166 | 0.0272 | 0.0528 | 0.1104 | 0.1562 | 0.0615 | 0.1094 | 0.2159 | 0.079 | 0.3577 | 0.0 | 0.0 | 0.0043 | 0.1723 | 0.0 | 0.0 | 0.0172 | 0.2511 | | No log | 6.0 | 108 | 2.2955 | 0.0352 | 0.0765 | 0.0298 | 0.0072 | 0.0221 | 0.0371 | 0.067 | 0.1331 | 0.1811 | 0.0667 | 0.1508 | 0.2155 | 0.1438 | 0.4477 | 0.0 | 0.0 | 0.0052 | 0.1737 | 0.0052 | 0.0169 | 0.0218 | 0.2671 | | No log | 7.0 | 126 | 2.2319 | 0.0333 | 0.0741 | 0.0268 | 0.0098 | 0.0259 | 0.0484 | 0.0829 | 0.1645 | 0.2172 | 0.0771 | 0.1655 | 0.2879 | 0.1201 | 0.491 | 0.0 | 0.0 | 0.0103 | 0.2362 | 
0.0018 | 0.0585 | 0.0343 | 0.3004 | | No log | 8.0 | 144 | 2.1602 | 0.0346 | 0.0762 | 0.027 | 0.0105 | 0.0319 | 0.0475 | 0.0822 | 0.1698 | 0.2212 | 0.0784 | 0.166 | 0.2791 | 0.114 | 0.5378 | 0.0012 | 0.0063 | 0.0061 | 0.2027 | 0.0036 | 0.0385 | 0.0482 | 0.3204 | | No log | 9.0 | 162 | 2.1318 | 0.0365 | 0.0751 | 0.0317 | 0.008 | 0.0388 | 0.0493 | 0.0876 | 0.1838 | 0.2341 | 0.0799 | 0.1945 | 0.2747 | 0.115 | 0.5775 | 0.0 | 0.0 | 0.0058 | 0.1929 | 0.0054 | 0.0646 | 0.0562 | 0.3356 | | No log | 10.0 | 180 | 2.0494 | 0.0454 | 0.1034 | 0.0363 | 0.0132 | 0.0438 | 0.0645 | 0.1057 | 0.2013 | 0.2497 | 0.0892 | 0.1899 | 0.36 | 0.1363 | 0.5279 | 0.0001 | 0.0051 | 0.0082 | 0.2491 | 0.0086 | 0.0923 | 0.0736 | 0.3742 | | No log | 11.0 | 198 | 2.0013 | 0.0505 | 0.1115 | 0.0411 | 0.0108 | 0.0482 | 0.0709 | 0.1015 | 0.2235 | 0.269 | 0.0854 | 0.2097 | 0.3776 | 0.1646 | 0.5914 | 0.0005 | 0.0177 | 0.0113 | 0.2562 | 0.0061 | 0.1215 | 0.0699 | 0.3582 | | No log | 12.0 | 216 | 1.9699 | 0.057 | 0.1211 | 0.0445 | 0.0117 | 0.0476 | 0.078 | 0.0962 | 0.2212 | 0.2676 | 0.0758 | 0.215 | 0.3634 | 0.1992 | 0.6122 | 0.0004 | 0.0127 | 0.0093 | 0.2527 | 0.006 | 0.1169 | 0.0702 | 0.3436 | | No log | 13.0 | 234 | 1.9105 | 0.0722 | 0.1588 | 0.06 | 0.0183 | 0.0591 | 0.1058 | 0.1318 | 0.2622 | 0.3075 | 0.1026 | 0.2644 | 0.4241 | 0.2304 | 0.6252 | 0.0018 | 0.062 | 0.0125 | 0.2848 | 0.0221 | 0.1862 | 0.0942 | 0.3791 | | No log | 14.0 | 252 | 1.8849 | 0.0859 | 0.1809 | 0.0771 | 0.0189 | 0.0681 | 0.1184 | 0.1271 | 0.2626 | 0.3093 | 0.1056 | 0.265 | 0.4173 | 0.2761 | 0.632 | 0.0044 | 0.0962 | 0.0153 | 0.2835 | 0.0203 | 0.1615 | 0.1136 | 0.3733 | | No log | 15.0 | 270 | 1.8380 | 0.0968 | 0.2026 | 0.0867 | 0.0139 | 0.0679 | 0.1375 | 0.1275 | 0.2733 | 0.3172 | 0.1078 | 0.2588 | 0.4298 | 0.3325 | 0.645 | 0.0111 | 0.1367 | 0.0173 | 0.3022 | 0.0124 | 0.1369 | 0.1108 | 0.3653 | | No log | 16.0 | 288 | 1.8123 | 0.1153 | 0.2438 | 0.101 | 0.0254 | 0.0862 | 0.1513 | 0.14 | 0.2974 | 0.3346 | 0.1297 | 0.2825 | 0.443 | 0.3832 | 0.6392 | 0.0256 | 0.2114 | 0.0192 | 0.3125 | 0.0221 | 0.16 | 0.1265 | 0.3498 | | No log | 17.0 | 306 | 1.7964 | 0.1199 | 0.2621 | 0.1026 | 0.0219 | 0.094 | 0.1591 | 0.1306 | 0.2957 | 0.3384 | 0.1264 | 0.2885 | 0.454 | 0.3926 | 0.6374 | 0.0236 | 0.1987 | 0.0221 | 0.3152 | 0.0271 | 0.2062 | 0.1343 | 0.3347 | | No log | 18.0 | 324 | 1.7520 | 0.1294 | 0.2814 | 0.1075 | 0.0242 | 0.1067 | 0.1774 | 0.1399 | 0.319 | 0.3555 | 0.138 | 0.3067 | 0.4831 | 0.4066 | 0.6541 | 0.0262 | 0.2165 | 0.0302 | 0.3263 | 0.0286 | 0.2246 | 0.1555 | 0.356 | | No log | 19.0 | 342 | 1.7232 | 0.1373 | 0.2907 | 0.1166 | 0.0273 | 0.1082 | 0.1956 | 0.1483 | 0.3258 | 0.3608 | 0.1471 | 0.3052 | 0.5008 | 0.4284 | 0.645 | 0.016 | 0.2266 | 0.0355 | 0.3402 | 0.0305 | 0.2292 | 0.1764 | 0.3631 | | No log | 20.0 | 360 | 1.7113 | 0.1395 | 0.3024 | 0.1141 | 0.0301 | 0.1092 | 0.2093 | 0.1525 | 0.3247 | 0.3575 | 0.1575 | 0.3006 | 0.4969 | 0.4258 | 0.6293 | 0.0225 | 0.2418 | 0.0372 | 0.3366 | 0.0254 | 0.2277 | 0.1869 | 0.352 | | No log | 21.0 | 378 | 1.6864 | 0.1447 | 0.3079 | 0.1238 | 0.0295 | 0.1154 | 0.2157 | 0.1598 | 0.342 | 0.374 | 0.1575 | 0.3211 | 0.5222 | 0.4284 | 0.6437 | 0.0231 | 0.2671 | 0.045 | 0.3384 | 0.0349 | 0.2615 | 0.1923 | 0.3591 | | No log | 22.0 | 396 | 1.6746 | 0.1495 | 0.3155 | 0.1282 | 0.03 | 0.115 | 0.2223 | 0.169 | 0.3466 | 0.379 | 0.1673 | 0.3227 | 0.5279 | 0.4376 | 0.6464 | 0.0258 | 0.2759 | 0.0458 | 0.3362 | 0.0436 | 0.2677 | 0.1946 | 0.3689 | | No log | 23.0 | 414 | 1.6604 | 0.1499 | 0.311 | 0.1336 | 0.0313 | 0.1178 | 0.2233 | 0.161 | 0.3479 
| 0.3836 | 0.1599 | 0.3352 | 0.5265 | 0.4435 | 0.6486 | 0.0246 | 0.3013 | 0.0458 | 0.3339 | 0.0411 | 0.2677 | 0.1944 | 0.3662 | | No log | 24.0 | 432 | 1.6552 | 0.1508 | 0.3167 | 0.1301 | 0.0284 | 0.1209 | 0.2256 | 0.1645 | 0.3503 | 0.389 | 0.1621 | 0.342 | 0.533 | 0.4469 | 0.6446 | 0.0229 | 0.3025 | 0.046 | 0.3429 | 0.0399 | 0.2846 | 0.1985 | 0.3702 | | No log | 25.0 | 450 | 1.6465 | 0.1506 | 0.3124 | 0.13 | 0.0287 | 0.1185 | 0.2266 | 0.1611 | 0.3505 | 0.3869 | 0.1588 | 0.339 | 0.5355 | 0.4472 | 0.6446 | 0.0209 | 0.2962 | 0.0473 | 0.342 | 0.0404 | 0.2831 | 0.1974 | 0.3684 | | No log | 26.0 | 468 | 1.6419 | 0.1526 | 0.3209 | 0.1298 | 0.0283 | 0.1197 | 0.2258 | 0.1625 | 0.3453 | 0.3854 | 0.1545 | 0.338 | 0.5263 | 0.4531 | 0.6428 | 0.022 | 0.2911 | 0.0467 | 0.3438 | 0.0441 | 0.2785 | 0.197 | 0.3707 | | No log | 27.0 | 486 | 1.6383 | 0.1546 | 0.3235 | 0.1354 | 0.0288 | 0.1247 | 0.2274 | 0.164 | 0.3475 | 0.3872 | 0.1546 | 0.3396 | 0.5312 | 0.4539 | 0.645 | 0.0216 | 0.2975 | 0.0493 | 0.3438 | 0.048 | 0.2815 | 0.2004 | 0.3684 | | 3.1149 | 28.0 | 504 | 1.6393 | 0.1544 | 0.3208 | 0.1328 | 0.0306 | 0.1218 | 0.2298 | 0.1668 | 0.3518 | 0.3902 | 0.1524 | 0.3423 | 0.5368 | 0.4535 | 0.6414 | 0.0222 | 0.3114 | 0.0496 | 0.3429 | 0.045 | 0.2862 | 0.2015 | 0.3693 | | 3.1149 | 29.0 | 522 | 1.6383 | 0.1544 | 0.3221 | 0.1332 | 0.0308 | 0.1218 | 0.2306 | 0.167 | 0.3515 | 0.3909 | 0.1517 | 0.3436 | 0.536 | 0.4533 | 0.6428 | 0.0219 | 0.3152 | 0.05 | 0.342 | 0.0449 | 0.2846 | 0.2021 | 0.3698 | | 3.1149 | 30.0 | 540 | 1.6381 | 0.1544 | 0.3211 | 0.1335 | 0.0306 | 0.1213 | 0.2305 | 0.1679 | 0.3519 | 0.391 | 0.1517 | 0.3448 | 0.5355 | 0.453 | 0.6428 | 0.0221 | 0.3165 | 0.0499 | 0.342 | 0.0449 | 0.2846 | 0.202 | 0.3693 | ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.0+cu121 - Datasets 2.21.0 - Tokenizers 0.19.1
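To relate the per-class metrics above to actual predictions, a hedged inference sketch using standard DETR-style post-processing is shown below. The image path and threshold are placeholders, and loading this checkpoint through the Auto classes is an assumption, not something stated in the card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo = "fimbit/detr_finetuned_cppe5"  # repo id from this entry
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForObjectDetection.from_pretrained(repo)

image = Image.open("worker.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score.item():.2f} at {box.tolist()}")
```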
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
danelcsb/rtdetr_v2_r34vd
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "person", "bicycle", "car", "motorbike", "aeroplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "backpack", "umbrella", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "sofa", "pottedplant", "bed", "diningtable", "toilet", "tvmonitor", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush" ]
danelcsb/rtdetr_v2_r50vd
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "person", "bicycle", "car", "motorbike", "aeroplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "backpack", "umbrella", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "sofa", "pottedplant", "bed", "diningtable", "toilet", "tvmonitor", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush" ]
danelcsb/rtdetr_v2_r101vd
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "person", "bicycle", "car", "motorbike", "aeroplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "backpack", "umbrella", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "sofa", "pottedplant", "bed", "diningtable", "toilet", "tvmonitor", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush" ]
Wuwani/queue_detection_cctv
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "cashier", "cx" ]
joe611/joe-detr-resnet-50-hardhat-finetuned-12-dc-50-manual-upload
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# joe-detr-resnet-50-hardhat-finetuned-12-dc-50-manual-upload

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 12
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 1000
- mixed_precision_training: Native AMP

### Training results

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
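The sections above are still mostly placeholders. As a minimal inference sketch (assuming the checkpoint loads with the generic `transformers` object-detection classes; the image path is a placeholder, not part of the original card):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "joe611/joe-detr-resnet-50-hardhat-finetuned-12-dc-50-manual-upload"
image_processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

# "site.jpg" is a placeholder path; substitute any construction-site photo.
image = Image.open("site.jpg").convert("RGB")
inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into thresholded detections in pixel coordinates.
results = image_processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)[0]

for score, label_id, box in zip(results["scores"], results["labels"], results["boxes"]):
    label = model.config.id2label[label_id.item()]
    print(f"{label}: {score:.2f} {[round(c, 2) for c in box.tolist()]}")
```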
[ "head", "helmet", "person" ]
ashaduzzaman/detr_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# Model Card for DETR Finetuned on CPPE-5

## Model Overview

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on a custom dataset focused on detecting personal protective equipment (PPE) items. The fine-tuning has optimized the model to recognize various PPE elements such as face shields, masks, gloves, and goggles.

The model is based on the DEtection TRansformer (DETR) architecture, leveraging a ResNet-50 backbone for feature extraction. This fine-tuned version retains DETR's core object-detection functionality but is specifically adjusted to detect items relevant to occupational safety (PPE).

## Model Performance

The model achieves the following metrics on its evaluation set:

- **Loss**: 1.2294
- **mAP** (mean Average Precision):
  - Overall: 0.2366
  - 50 IoU threshold: 0.4852
  - 75 IoU threshold: 0.2032
  - Small objects: 0.1082
  - Medium objects: 0.2086
  - Large objects: 0.3408
- **mAR** (mean Average Recall):
  - At 1 detection: 0.2819
  - At 10 detections: 0.4463
  - At 100 detections: 0.4665
  - Small objects: 0.249
  - Medium objects: 0.4004
  - Large objects: 0.5893

Precision and recall vary across the individual categories (face shields, gloves, goggles, masks), with room for improvement, particularly for small objects such as goggles.

## Intended Use and Limitations

### Intended Use

- Detecting personal protective equipment (PPE) in images or video streams.
- Monitoring workplace safety by ensuring proper usage of PPE items such as masks, gloves, face shields, and goggles.
- Suitable for industries like construction, healthcare, and manufacturing where PPE detection is critical for compliance and safety.

### Limitations

- The model may not generalize well to non-PPE items or general object detection tasks.
- Performance on small or occluded objects can be limited, as indicated by the lower mAP and mAR scores for small objects.
- The model was trained on a dataset specific to PPE detection, so its performance on images outside of this domain might be inconsistent.

## Training and Evaluation Data

The fine-tuning dataset is not documented in detail here, but the model name and label set point to CPPE-5, a personal protective equipment dataset covering items such as face shields, masks, goggles, and gloves.

## Training Procedure

### Hyperparameters:

- **Learning rate**: 5e-05
- **Train batch size**: 8
- **Eval batch size**: 8
- **Optimizer**: Adam (betas=(0.9, 0.999), epsilon=1e-08)
- **Learning rate scheduler**: Cosine decay
- **Number of epochs**: 30
- **Seed**: 42

The model was trained for 30 epochs with Adam optimization, using a learning rate of 5e-05 and cosine learning rate decay. The training was conducted with a batch size of 8 for both training and evaluation.
## Evaluation Results

The following are performance metrics captured during the training process across multiple epochs:

| Epoch | Validation Loss | mAP    | mAP 50 | mAP 75 | mAR    | Comments                |
|-------|-----------------|--------|--------|--------|--------|-------------------------|
| 1     | 2.1073          | 0.0518 | 0.1075 | 0.0423 | 0.2819 | Initial training        |
| 5     | 1.6220          | 0.1223 | 0.2258 | 0.1115 | 0.4463 | Significant improvement |
| 10    | 1.5033          | 0.155  | 0.3265 | 0.1325 | 0.5032 | Stable performance      |
| 20    | 1.2649          | 0.2211 | 0.4427 | 0.1952 | 0.5867 | Peak performance        |
| 25    | 1.2347          | 0.2333 | 0.4831 | 0.1989 | 0.5966 | Final metrics           |

## Limitations and Ethical Considerations

### Limitations:

- **Domain-specific**: The model performs well in PPE-related object detection but may not generalize to other tasks.
- **Bias**: If the dataset is skewed or limited, certain PPE items may be under-represented, leading to poorer performance for some categories.
- **Real-time Applications**: The model might not meet the latency requirements for real-time detection in high-throughput environments.

### Ethical Considerations:

- **Privacy**: Using this model in surveillance scenarios (e.g., workplaces) may raise concerns about employee privacy, especially if applied without clear consent.
- **Misuse**: Improper use of this model could lead to incorrect enforcement of safety regulations.

## Future Work

- **Dataset Improvements**: Expanding the dataset to include more diverse PPE items, environments, and object scales could improve model performance, especially for smaller objects.
- **Model Efficiency**: Further fine-tuning or model distillation may help make the model more suitable for real-time applications.
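The card does not include a usage snippet. As a minimal sketch for the safety-monitoring use case described above (assuming the checkpoint loads with the generic `transformers` object-detection classes; the example image URL is only a placeholder), detections can be summarized per PPE category:

```python
import torch
import requests
from collections import Counter
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "ashaduzzaman/detr_finetuned_cppe5"
image_processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

# Example image; replace with your own workplace photo.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw).convert("RGB")

inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

detections = image_processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)[0]

# Count how many instances of each PPE category were detected above the threshold.
counts = Counter(model.config.id2label[label.item()] for label in detections["labels"])
for category, count in counts.items():
    print(f"{category}: {count}")
```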
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
IDEA-Research/dab-detr-resnet-50-dc5
# Model Card for Model ID ## Table of Contents 1. [Model Details](#model-details) 2. [Model Sources](#model-sources) 3. [How to Get Started with the Model](#how-to-get-started-with-the-model) 4. [Training Details](#training-details) 5. [Evaluation](#evaluation) 6. [Model Architecture and Objective](#model-architecture-and-objective) 7. [Citation](#citation) ## Model Details ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_convergence_plot.png) > We present in this paper a novel query formulation using dynamic anchor boxes for DETR (DEtection TRansformer) and offer a deeper understanding of the role of queries in DETR. This new formulation directly uses box coordinates as queries in Transformer decoders and dynamically updates them layer-by-layer. Using box coordinates not only helps using explicit positional priors to improve the query-to-feature similarity and eliminate the slow training convergence issue in DETR, but also allows us to modulate the positional attention map using the box width and height information. Such a design makes it clear that queries in DETR can be implemented as performing soft ROI pooling layer-by-layer in a cascade manner. As a result, it leads to the best performance on MS-COCO benchmark among the DETR-like detection models under the same setting, e.g., AP 45.7\% using ResNet50-DC5 as backbone trained in 50 epochs. We also conducted extensive experiments to confirm our analysis and verify the effectiveness of our methods. ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** Shilong Liu, Feng Li, Hao Zhang, Xiao Yang, Xianbiao Qi, Hang Su, Jun Zhu, Lei Zhang - **Funded by:** IDEA-Research - **Shared by:** David Hajdu - **Model type:** DAB-DETR - **License:** Apache-2.0 ### Model Sources <!-- Provide the basic links for the model. --> - **Repository:** https://github.com/IDEA-Research/DAB-DETR - **Paper:** https://arxiv.org/abs/2201.12329 ## How to Get Started with the Model Use the code below to get started with the model. 
```python
import torch
import requests

from PIL import Image
from transformers import AutoModelForObjectDetection, AutoImageProcessor

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

image_processor = AutoImageProcessor.from_pretrained("IDEA-Research/dab-detr-resnet-50-dc5")
model = AutoModelForObjectDetection.from_pretrained("IDEA-Research/dab-detr-resnet-50-dc5")

inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.3)

for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```

This should output

```
cat: 0.89 [344.17, 20.93, 640.53, 371.3]
cat: 0.88 [16.19, 53.44, 315.77, 469.12]
remote: 0.87 [40.35, 73.28, 175.18, 117.59]
couch: 0.60 [0.1, 0.88, 640.08, 476.5]
remote: 0.55 [333.52, 77.34, 369.16, 191.01]
```

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

The DAB-DETR model was trained on [COCO 2017 object detection](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively.

### Training Procedure

Following Deformable DETR and Conditional DETR, we use 300 anchors as queries. We select the 300 predicted boxes and labels with the largest classification logits for evaluation as well. We also use focal loss (Lin et al., 2020) with α = 0.25, γ = 2 for classification. The same loss terms are used in bipartite matching and final loss calculation, but with different coefficients. Classification loss with coefficient 2.0 is used in bipartite matching but 1.0 in the final loss. L1 loss with coefficient 5.0 and GIoU loss (Rezatofighi et al., 2019) with coefficient 2.0 are consistent in both the matching and the final loss calculation procedures. All models are trained on 16 GPUs with 1 image per GPU, and AdamW (Loshchilov & Hutter, 2018) is used for training with weight decay 10⁻⁴. The learning rates for the backbone and the other modules are set to 10⁻⁵ and 10⁻⁴ respectively. We train our models for 50 epochs and drop the learning rate by 0.1 after 40 epochs. All models are trained on Nvidia A100 GPUs. We search hyperparameters with batch size 64, and all results in our paper are reported with batch size 16.

#### Preprocessing

Images are resized/rescaled such that the shortest side is at least 480 and at most 800 pixels and the longest side is at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).
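As an aside, the 300-box selection described under Training Procedure (keeping the 300 highest classification logits across all queries and classes) can be sketched as follows. This continues from `outputs` in the snippet above and is only a simplified illustration of what `post_process_object_detection` does internally for sigmoid-classification DETR variants, not the library implementation:

```python
import torch

# logits: (batch, num_queries=300, num_labels); pred_boxes: (batch, 300, 4) in normalized (cx, cy, w, h).
prob = outputs.logits.sigmoid()
batch_size, num_queries, num_labels = prob.shape

# Rank all (query, class) pairs by score and keep the top num_queries of them.
topk_scores, topk_indexes = prob.view(batch_size, -1).topk(num_queries, dim=1)
topk_labels = topk_indexes % num_labels    # class index of each kept prediction
topk_box_idx = topk_indexes // num_labels  # which query each kept prediction came from

# Gather the corresponding (still normalized) boxes.
topk_boxes = torch.gather(outputs.pred_boxes, 1, topk_box_idx.unsqueeze(-1).repeat(1, 1, 4))
print(topk_scores.shape, topk_labels.shape, topk_boxes.shape)  # (1, 300) (1, 300) (1, 300, 4)
```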
### Training Hyperparameters - **Training regime:** <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> | **Key** | **Value** | |-----------------------------|---------------------------------------------------------------------------------------------------------------------------------------------| | **activation_dropout** | `0.0` | | **activation_function** | `prelu` | | **attention_dropout** | `0.0` | | **auxiliary_loss** | `false` | | **backbone** | `resnet50` | | **bbox_cost** | `5` | | **bbox_loss_coefficient** | `5` | | **class_cost** | `2` | | **cls_loss_coefficient** | `2` | | **decoder_attention_heads** | `8` | | **decoder_ffn_dim** | `2048` | | **decoder_layers** | `6` | | **dropout** | `0.1` | | **encoder_attention_heads** | `8` | | **encoder_ffn_dim** | `2048` | | **encoder_layers** | `6` | | **focal_alpha** | `0.25` | | **giou_cost** | `2` | | **giou_loss_coefficient** | `2` | | **hidden_size** | `256` | | **init_std** | `0.02` | | **init_xavier_std** | `1.0` | | **initializer_bias_prior_prob** | `null` | | **keep_query_pos** | `false` | | **normalize_before** | `false` | | **num_hidden_layers** | `6` | | **num_patterns** | `0` | | **num_queries** | `300` | | **query_dim** | `4` | | **random_refpoints_xy** | `false` | | **sine_position_embedding_scale** | `null` | | **temperature_height** | `20` | | **temperature_width** | `20` | ## Evaluation ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_results.png) ### Model Architecture and Objective ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_model_arch.png) Overview of DAB-DETR. We extract image spatial features using a CNN backbone followed with Transformer encoders to refine the CNN features. Then dual queries, including positional queries (anchor boxes) and content queries (decoder embeddings), are fed into the decoder to probe the objects which correspond to the anchors and have similar patterns with the content queries. The dual queries are updated layer-by-layer to get close to the target ground-truth objects gradually. The outputs of the final decoder layer are used to predict the objects with labels and boxes by prediction heads, and then a bipartite graph matching is conducted to calculate loss as in DETR. ## Citation <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** ```bibtex @inproceedings{ liu2022dabdetr, title={{DAB}-{DETR}: Dynamic Anchor Boxes are Better Queries for {DETR}}, author={Shilong Liu and Feng Li and Hao Zhang and Xiao Yang and Xianbiao Qi and Hang Su and Jun Zhu and Lei Zhang}, booktitle={International Conference on Learning Representations}, year={2022}, url={https://openreview.net/forum?id=oMI9PjOb9Jl} } ``` ## Model Card Authors [David Hajdu](https://huggingface.co/davidhajdu)
[ "n/a", "person", "bicycle", "car", "motorcycle", "airplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "n/a", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "n/a", "backpack", "umbrella", "n/a", "n/a", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "n/a", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "couch", "potted plant", "bed", "n/a", "dining table", "n/a", "n/a", "toilet", "n/a", "tv", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "n/a", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush" ]
aviola/detrDominoTest
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detrDominoTest This model is a fine-tuned version of [aviola/detrDominoTest](https://huggingface.co/aviola/detrDominoTest) on the temp_domino2 dataset. It achieves the following results on the evaluation set: - Loss: 0.3054 - Map: 0.795 - Map 50: 0.9644 - Map 75: 0.9489 - Map Small: 0.6649 - Map Medium: 0.8058 - Map Large: 0.875 - Mar 1: 0.563 - Mar 10: 0.8475 - Mar 100: 0.8475 - Mar Small: 0.7197 - Mar Medium: 0.8572 - Mar Large: 0.875 - Map Pip-1: 0.7408 - Mar 100 Pip-1: 0.8615 - Map Pip-10: 0.8189 - Mar 100 Pip-10: 0.8524 - Map Pip-11: 0.8134 - Mar 100 Pip-11: 0.8462 - Map Pip-12: 0.8307 - Mar 100 Pip-12: 0.8579 - Map Pip-2: 0.841 - Mar 100 Pip-2: 0.855 - Map Pip-3: 0.7812 - Mar 100 Pip-3: 0.8533 - Map Pip-4: 0.8175 - Mar 100 Pip-4: 0.8429 - Map Pip-5: 0.7701 - Mar 100 Pip-5: 0.8355 - Map Pip-6: 0.7159 - Mar 100 Pip-6: 0.7682 - Map Pip-7: 0.8037 - Mar 100 Pip-7: 0.8692 - Map Pip-8: 0.843 - Mar 100 Pip-8: 0.8813 - Map Pip-9: 0.7638 - Mar 100 Pip-9: 0.8467 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 100 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Pip-1 | Mar 100 Pip-1 | Map Pip-10 | Mar 100 Pip-10 | Map Pip-11 | Mar 100 Pip-11 | Map Pip-12 | Mar 100 Pip-12 | Map Pip-2 | Mar 100 Pip-2 | Map Pip-3 | Mar 100 Pip-3 | Map Pip-4 | Mar 100 Pip-4 | Map Pip-5 | Mar 100 Pip-5 | Map Pip-6 | Mar 100 Pip-6 | Map Pip-7 | Mar 100 Pip-7 | Map Pip-8 | Mar 100 Pip-8 | Map Pip-9 | Mar 100 Pip-9 | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:---------:|:-------------:|:----------:|:--------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:---------:|:-------------:|:---------:|:-------------:|:---------:|:-------------:|:---------:|:-------------:|:---------:|:-------------:|:---------:|:-------------:|:---------:|:-------------:| | No log | 1.0 | 39 | 0.3831 | 0.7707 | 0.9684 | 0.927 | 0.5938 | 0.792 | 0.8 | 0.5445 | 0.8254 | 0.8283 | 0.6727 | 0.8425 | 0.925 | 0.6593 | 0.8154 | 0.8327 | 0.8476 | 0.7972 | 0.85 | 0.8122 | 0.8421 | 0.7815 | 0.84 | 0.8186 | 0.86 | 0.7848 | 0.819 | 0.7498 | 0.8097 | 0.7032 | 0.7727 | 0.7729 | 0.8385 | 0.7822 | 0.8375 | 0.7537 | 0.8067 | | No log | 2.0 | 78 | 0.3396 | 0.7759 | 0.9671 | 0.935 | 0.5735 | 0.7961 | 0.95 | 0.552 | 0.8251 | 0.8251 | 0.5955 | 0.8415 | 0.95 | 0.7095 | 0.8 | 0.8458 | 0.8762 | 0.7965 | 0.8269 | 0.8161 | 0.8526 | 0.7549 | 0.795 | 0.7708 | 0.8333 | 0.7731 | 0.7952 | 0.7655 | 0.8226 | 0.6989 | 0.7682 | 0.7717 | 0.8423 | 0.8346 | 0.875 | 0.7732 | 0.8133 | | No log | 3.0 | 117 | 0.3413 | 0.7665 | 0.9736 | 0.945 | 0.6044 | 0.7818 | 0.95 | 0.5434 | 0.8206 | 0.8216 | 0.6561 | 0.8315 | 0.95 | 0.6818 | 0.8154 | 0.8044 | 0.8381 | 0.763 | 0.8038 | 0.814 | 
0.8368 | 0.8183 | 0.855 | 0.7968 | 0.8267 | 0.7756 | 0.8143 | 0.773 | 0.829 | 0.6994 | 0.75 | 0.7752 | 0.8538 | 0.7946 | 0.8562 | 0.7014 | 0.78 | | No log | 4.0 | 156 | 0.3630 | 0.7517 | 0.9697 | 0.9465 | 0.6005 | 0.7677 | 0.9 | 0.534 | 0.8094 | 0.8094 | 0.6439 | 0.8207 | 0.9 | 0.6363 | 0.7692 | 0.8175 | 0.8476 | 0.7821 | 0.8192 | 0.7616 | 0.8053 | 0.776 | 0.815 | 0.7621 | 0.82 | 0.7999 | 0.819 | 0.7593 | 0.8194 | 0.6738 | 0.7409 | 0.7666 | 0.8462 | 0.7832 | 0.8313 | 0.7022 | 0.78 | | No log | 5.0 | 195 | 0.3685 | 0.7546 | 0.9572 | 0.9281 | 0.5432 | 0.7767 | 0.7625 | 0.5394 | 0.8172 | 0.8187 | 0.6152 | 0.8331 | 0.875 | 0.6488 | 0.8231 | 0.8343 | 0.8619 | 0.7554 | 0.8077 | 0.784 | 0.8474 | 0.7966 | 0.845 | 0.7246 | 0.82 | 0.8225 | 0.8476 | 0.7397 | 0.8065 | 0.7159 | 0.7545 | 0.7259 | 0.8 | 0.7993 | 0.8438 | 0.708 | 0.7667 | | No log | 6.0 | 234 | 0.3723 | 0.7631 | 0.9685 | 0.9171 | 0.6247 | 0.7764 | 0.925 | 0.5454 | 0.8146 | 0.8146 | 0.6773 | 0.8233 | 0.925 | 0.7055 | 0.8385 | 0.7739 | 0.8048 | 0.7866 | 0.8192 | 0.7976 | 0.8368 | 0.786 | 0.82 | 0.746 | 0.8133 | 0.8275 | 0.8476 | 0.7449 | 0.7968 | 0.7101 | 0.75 | 0.7107 | 0.7923 | 0.8261 | 0.8562 | 0.7418 | 0.8 | | No log | 7.0 | 273 | 0.3554 | 0.7663 | 0.9724 | 0.9386 | 0.6053 | 0.7857 | 0.925 | 0.5409 | 0.8192 | 0.8198 | 0.6591 | 0.8314 | 0.925 | 0.7293 | 0.8308 | 0.8056 | 0.8476 | 0.7934 | 0.8269 | 0.8256 | 0.8526 | 0.7477 | 0.795 | 0.751 | 0.8333 | 0.8182 | 0.8524 | 0.7572 | 0.8129 | 0.7043 | 0.7591 | 0.746 | 0.7923 | 0.8267 | 0.875 | 0.6905 | 0.76 | | No log | 8.0 | 312 | 0.3772 | 0.753 | 0.961 | 0.9144 | 0.5717 | 0.774 | 0.925 | 0.5344 | 0.8075 | 0.8084 | 0.6303 | 0.8232 | 0.925 | 0.7172 | 0.8 | 0.7985 | 0.8238 | 0.7954 | 0.8346 | 0.7836 | 0.8211 | 0.7747 | 0.815 | 0.7661 | 0.8467 | 0.7922 | 0.8238 | 0.7433 | 0.8032 | 0.6735 | 0.7455 | 0.6932 | 0.7885 | 0.7621 | 0.8125 | 0.7359 | 0.7867 | | No log | 9.0 | 351 | 0.3680 | 0.7603 | 0.9688 | 0.9231 | 0.605 | 0.7738 | 0.9 | 0.5446 | 0.8082 | 0.8088 | 0.6455 | 0.82 | 0.9 | 0.7341 | 0.8231 | 0.8035 | 0.8476 | 0.7777 | 0.8115 | 0.8294 | 0.8474 | 0.7754 | 0.81 | 0.7356 | 0.8 | 0.8142 | 0.8333 | 0.7692 | 0.8226 | 0.6743 | 0.7409 | 0.7332 | 0.7962 | 0.7726 | 0.8125 | 0.7042 | 0.76 | | No log | 10.0 | 390 | 0.3842 | 0.7361 | 0.9719 | 0.9261 | 0.5901 | 0.7512 | 0.875 | 0.53 | 0.7929 | 0.7932 | 0.6364 | 0.8044 | 0.9 | 0.7093 | 0.7615 | 0.7597 | 0.8143 | 0.7931 | 0.8231 | 0.7435 | 0.7947 | 0.7316 | 0.765 | 0.7113 | 0.8067 | 0.7918 | 0.8238 | 0.7465 | 0.8032 | 0.6395 | 0.7045 | 0.7224 | 0.7885 | 0.7574 | 0.8125 | 0.7269 | 0.82 | | No log | 11.0 | 429 | 0.3707 | 0.7449 | 0.9691 | 0.932 | 0.5949 | 0.7595 | 0.975 | 0.5337 | 0.806 | 0.8063 | 0.6379 | 0.8169 | 0.975 | 0.7366 | 0.8462 | 0.7883 | 0.8333 | 0.7493 | 0.7962 | 0.7859 | 0.8158 | 0.7558 | 0.8 | 0.7327 | 0.8267 | 0.773 | 0.8095 | 0.7741 | 0.829 | 0.6793 | 0.7409 | 0.7383 | 0.8231 | 0.7239 | 0.7812 | 0.7013 | 0.7733 | | No log | 12.0 | 468 | 0.3725 | 0.7494 | 0.9619 | 0.9344 | 0.6046 | 0.7639 | 0.9 | 0.5368 | 0.8088 | 0.8095 | 0.6545 | 0.8202 | 0.9 | 0.7211 | 0.8308 | 0.7904 | 0.819 | 0.7713 | 0.8115 | 0.7896 | 0.8368 | 0.7352 | 0.765 | 0.7698 | 0.8133 | 0.781 | 0.8286 | 0.7739 | 0.8323 | 0.6444 | 0.7318 | 0.7599 | 0.8308 | 0.7156 | 0.7875 | 0.7401 | 0.8267 | | 0.3708 | 13.0 | 507 | 0.3603 | 0.7604 | 0.9737 | 0.9428 | 0.6148 | 0.7774 | 0.9 | 0.5354 | 0.8098 | 0.8102 | 0.65 | 0.8218 | 0.9 | 0.7263 | 0.7923 | 0.787 | 0.8333 | 0.7927 | 0.8346 | 0.7742 | 0.8316 | 0.7697 | 0.8 | 0.76 | 0.7933 | 0.7854 | 0.8286 | 0.7544 | 0.8161 | 0.6845 | 0.7455 | 0.7327 
| 0.7962 | 0.7767 | 0.8313 | 0.7808 | 0.82 | | 0.3708 | 14.0 | 546 | 0.3518 | 0.7604 | 0.9671 | 0.9351 | 0.6309 | 0.7722 | 0.95 | 0.5431 | 0.8149 | 0.8159 | 0.6606 | 0.8254 | 0.95 | 0.7241 | 0.8231 | 0.8124 | 0.8238 | 0.7864 | 0.8154 | 0.8048 | 0.8316 | 0.7808 | 0.825 | 0.7626 | 0.8267 | 0.778 | 0.819 | 0.7292 | 0.7903 | 0.6355 | 0.7136 | 0.7794 | 0.8654 | 0.7803 | 0.85 | 0.7509 | 0.8067 | | 0.3708 | 15.0 | 585 | 0.3711 | 0.7405 | 0.9639 | 0.896 | 0.5804 | 0.7557 | 0.95 | 0.5378 | 0.8032 | 0.8032 | 0.6121 | 0.8161 | 0.95 | 0.6714 | 0.8154 | 0.7787 | 0.8143 | 0.7892 | 0.8192 | 0.739 | 0.8 | 0.7384 | 0.79 | 0.7617 | 0.8267 | 0.7737 | 0.8048 | 0.7659 | 0.8258 | 0.602 | 0.6773 | 0.7855 | 0.8538 | 0.767 | 0.8313 | 0.7131 | 0.78 | | 0.3708 | 16.0 | 624 | 0.3562 | 0.7587 | 0.9622 | 0.9351 | 0.5894 | 0.7776 | 0.925 | 0.5471 | 0.8182 | 0.8193 | 0.6364 | 0.8322 | 0.925 | 0.6482 | 0.8385 | 0.8094 | 0.8429 | 0.7884 | 0.8115 | 0.8029 | 0.8263 | 0.7913 | 0.82 | 0.7394 | 0.8 | 0.7939 | 0.8333 | 0.7428 | 0.829 | 0.6968 | 0.7591 | 0.7733 | 0.8269 | 0.7765 | 0.8375 | 0.7413 | 0.8067 | | 0.3708 | 17.0 | 663 | 0.3939 | 0.7434 | 0.963 | 0.9271 | 0.6029 | 0.7557 | 0.925 | 0.5355 | 0.7964 | 0.7975 | 0.6455 | 0.8072 | 0.925 | 0.687 | 0.7769 | 0.7825 | 0.8143 | 0.7534 | 0.7808 | 0.7847 | 0.8211 | 0.7474 | 0.795 | 0.7514 | 0.8 | 0.7445 | 0.7905 | 0.7378 | 0.8258 | 0.7045 | 0.7545 | 0.734 | 0.7923 | 0.7368 | 0.825 | 0.7574 | 0.7933 | | 0.3708 | 18.0 | 702 | 0.3832 | 0.7299 | 0.9633 | 0.8936 | 0.5973 | 0.7443 | 0.925 | 0.5284 | 0.7942 | 0.7945 | 0.6258 | 0.8045 | 0.925 | 0.6571 | 0.8154 | 0.7968 | 0.8286 | 0.7588 | 0.7885 | 0.7437 | 0.7737 | 0.723 | 0.78 | 0.7491 | 0.8 | 0.7686 | 0.8095 | 0.7393 | 0.8097 | 0.6504 | 0.7273 | 0.7116 | 0.7962 | 0.767 | 0.8188 | 0.6931 | 0.7867 | | 0.3708 | 19.0 | 741 | 0.3596 | 0.756 | 0.9635 | 0.9066 | 0.5885 | 0.7727 | 0.875 | 0.541 | 0.8184 | 0.8184 | 0.6621 | 0.8296 | 0.875 | 0.7242 | 0.8538 | 0.821 | 0.8476 | 0.777 | 0.8038 | 0.7652 | 0.8105 | 0.7732 | 0.82 | 0.7665 | 0.82 | 0.7971 | 0.8381 | 0.7583 | 0.8194 | 0.6624 | 0.7455 | 0.7384 | 0.8308 | 0.7717 | 0.8313 | 0.7173 | 0.8 | | 0.3708 | 20.0 | 780 | 0.3683 | 0.7612 | 0.9639 | 0.9358 | 0.631 | 0.779 | 0.7375 | 0.5374 | 0.8183 | 0.8197 | 0.6833 | 0.8311 | 0.85 | 0.689 | 0.8154 | 0.8223 | 0.8381 | 0.7803 | 0.8038 | 0.8172 | 0.8263 | 0.7537 | 0.8 | 0.7671 | 0.8133 | 0.7806 | 0.8333 | 0.7725 | 0.8355 | 0.6862 | 0.7591 | 0.7538 | 0.8538 | 0.7661 | 0.8375 | 0.7454 | 0.82 | | 0.3708 | 21.0 | 819 | 0.3645 | 0.7675 | 0.9677 | 0.9265 | 0.5878 | 0.7858 | 0.85 | 0.5424 | 0.8246 | 0.8252 | 0.6682 | 0.8384 | 0.85 | 0.7228 | 0.8308 | 0.7889 | 0.8238 | 0.7897 | 0.8115 | 0.7997 | 0.8474 | 0.7476 | 0.805 | 0.802 | 0.84 | 0.7969 | 0.8333 | 0.7595 | 0.8065 | 0.6952 | 0.7818 | 0.7647 | 0.8462 | 0.7934 | 0.8562 | 0.75 | 0.82 | | 0.3708 | 22.0 | 858 | 0.3777 | 0.7607 | 0.9619 | 0.93 | 0.6276 | 0.776 | 0.9 | 0.5388 | 0.8223 | 0.8249 | 0.697 | 0.8348 | 0.9 | 0.6858 | 0.8538 | 0.7868 | 0.8238 | 0.7764 | 0.8077 | 0.8306 | 0.8526 | 0.7746 | 0.83 | 0.7922 | 0.84 | 0.8176 | 0.8476 | 0.746 | 0.8032 | 0.6699 | 0.7636 | 0.7264 | 0.8269 | 0.78 | 0.85 | 0.7416 | 0.8 | | 0.3708 | 23.0 | 897 | 0.3532 | 0.7677 | 0.9615 | 0.9309 | 0.6114 | 0.7815 | 0.875 | 0.5458 | 0.8214 | 0.8217 | 0.6652 | 0.8335 | 0.875 | 0.6714 | 0.8077 | 0.7988 | 0.8238 | 0.7901 | 0.8269 | 0.842 | 0.8579 | 0.7779 | 0.82 | 0.7714 | 0.8133 | 0.8097 | 0.8381 | 0.7556 | 0.8129 | 0.691 | 0.75 | 0.77 | 0.8462 | 0.7881 | 0.8375 | 0.7469 | 0.8267 | | 0.3708 | 24.0 | 936 | 0.3546 | 0.7696 | 0.9681 | 0.9385 
| 0.6058 | 0.7819 | 0.9 | 0.5429 | 0.8229 | 0.8233 | 0.6788 | 0.8331 | 0.9 | 0.7156 | 0.8154 | 0.792 | 0.8286 | 0.8082 | 0.8462 | 0.8238 | 0.8526 | 0.7677 | 0.815 | 0.7925 | 0.8267 | 0.782 | 0.8238 | 0.7649 | 0.8226 | 0.7122 | 0.7682 | 0.7567 | 0.8231 | 0.7703 | 0.8438 | 0.749 | 0.8133 | | 0.3708 | 25.0 | 975 | 0.3651 | 0.764 | 0.963 | 0.9319 | 0.5945 | 0.7788 | 0.925 | 0.5363 | 0.819 | 0.8213 | 0.6576 | 0.8332 | 0.925 | 0.6889 | 0.8154 | 0.7908 | 0.8381 | 0.8041 | 0.8346 | 0.8226 | 0.8526 | 0.7512 | 0.8 | 0.7845 | 0.8267 | 0.7923 | 0.8143 | 0.757 | 0.8194 | 0.7231 | 0.7955 | 0.7693 | 0.8269 | 0.7477 | 0.8188 | 0.7362 | 0.8133 | | 0.3647 | 26.0 | 1014 | 0.3579 | 0.7522 | 0.9627 | 0.9224 | 0.5684 | 0.7718 | 0.7875 | 0.5353 | 0.8172 | 0.8193 | 0.647 | 0.8316 | 0.9 | 0.65 | 0.8231 | 0.7766 | 0.8286 | 0.7873 | 0.8192 | 0.798 | 0.8316 | 0.7617 | 0.825 | 0.7624 | 0.8133 | 0.7986 | 0.8381 | 0.7514 | 0.8226 | 0.7005 | 0.7773 | 0.7357 | 0.8231 | 0.8009 | 0.8562 | 0.7037 | 0.7733 | | 0.3647 | 27.0 | 1053 | 0.3478 | 0.7722 | 0.9654 | 0.9393 | 0.5952 | 0.7921 | 0.9 | 0.542 | 0.8329 | 0.8329 | 0.6515 | 0.8486 | 0.9 | 0.6669 | 0.8308 | 0.8077 | 0.8333 | 0.7836 | 0.8192 | 0.8066 | 0.8368 | 0.8162 | 0.86 | 0.8172 | 0.86 | 0.8157 | 0.8381 | 0.7643 | 0.8323 | 0.6787 | 0.7727 | 0.763 | 0.8423 | 0.8063 | 0.8562 | 0.7401 | 0.8133 | | 0.3647 | 28.0 | 1092 | 0.3548 | 0.7622 | 0.9605 | 0.9301 | 0.5752 | 0.7805 | 0.8781 | 0.5393 | 0.8186 | 0.8195 | 0.6394 | 0.8329 | 0.9 | 0.6486 | 0.8231 | 0.8143 | 0.8429 | 0.7778 | 0.8077 | 0.7877 | 0.8158 | 0.7795 | 0.815 | 0.8061 | 0.8533 | 0.7792 | 0.8095 | 0.7595 | 0.8419 | 0.6903 | 0.7682 | 0.7596 | 0.8192 | 0.7893 | 0.8375 | 0.754 | 0.8 | | 0.3647 | 29.0 | 1131 | 0.3761 | 0.7473 | 0.9562 | 0.9307 | 0.6266 | 0.7609 | 0.875 | 0.5315 | 0.8078 | 0.8089 | 0.6667 | 0.8193 | 0.875 | 0.6492 | 0.8308 | 0.7912 | 0.8238 | 0.7703 | 0.8038 | 0.7751 | 0.8 | 0.7692 | 0.795 | 0.7556 | 0.82 | 0.7916 | 0.8238 | 0.7694 | 0.8194 | 0.6874 | 0.75 | 0.7689 | 0.8423 | 0.7746 | 0.8375 | 0.6649 | 0.76 | | 0.3647 | 30.0 | 1170 | 0.3424 | 0.7624 | 0.9653 | 0.9307 | 0.6605 | 0.7723 | 0.8833 | 0.5458 | 0.8201 | 0.8201 | 0.697 | 0.829 | 0.9 | 0.6695 | 0.8385 | 0.7992 | 0.8238 | 0.793 | 0.8192 | 0.7972 | 0.8211 | 0.7894 | 0.825 | 0.7615 | 0.82 | 0.789 | 0.8238 | 0.7719 | 0.829 | 0.6836 | 0.7318 | 0.7466 | 0.8269 | 0.804 | 0.8687 | 0.7442 | 0.8133 | | 0.3647 | 31.0 | 1209 | 0.3459 | 0.7745 | 0.9651 | 0.9226 | 0.6118 | 0.7894 | 0.9 | 0.541 | 0.8231 | 0.8239 | 0.6561 | 0.8369 | 0.9 | 0.6797 | 0.8154 | 0.8204 | 0.8381 | 0.7959 | 0.8269 | 0.8347 | 0.8526 | 0.7722 | 0.815 | 0.7904 | 0.8267 | 0.803 | 0.8286 | 0.7616 | 0.8194 | 0.7078 | 0.7545 | 0.763 | 0.8346 | 0.8241 | 0.8687 | 0.7414 | 0.8067 | | 0.3647 | 32.0 | 1248 | 0.3604 | 0.7687 | 0.9698 | 0.9452 | 0.606 | 0.784 | 0.825 | 0.5444 | 0.8159 | 0.8159 | 0.6591 | 0.8285 | 0.825 | 0.7222 | 0.8385 | 0.7988 | 0.8286 | 0.7875 | 0.8115 | 0.7993 | 0.8263 | 0.7737 | 0.81 | 0.8202 | 0.8467 | 0.7956 | 0.8143 | 0.7673 | 0.8129 | 0.7102 | 0.7682 | 0.7578 | 0.8308 | 0.782 | 0.85 | 0.71 | 0.7533 | | 0.3647 | 33.0 | 1287 | 0.3488 | 0.7746 | 0.9719 | 0.9411 | 0.635 | 0.7886 | 0.825 | 0.5467 | 0.8234 | 0.8239 | 0.6848 | 0.8356 | 0.825 | 0.7174 | 0.8077 | 0.8026 | 0.8286 | 0.8074 | 0.8385 | 0.8236 | 0.8526 | 0.7782 | 0.815 | 0.789 | 0.8333 | 0.8001 | 0.8286 | 0.7664 | 0.8226 | 0.7297 | 0.7864 | 0.7594 | 0.8308 | 0.765 | 0.85 | 0.7566 | 0.7933 | | 0.3647 | 34.0 | 1326 | 0.3774 | 0.7615 | 0.9606 | 0.9323 | 0.6755 | 0.7723 | 0.825 | 0.535 | 0.8195 | 0.8199 | 0.7303 | 0.8276 | 
0.825 | 0.6784 | 0.8154 | 0.7705 | 0.8095 | 0.7803 | 0.8115 | 0.8217 | 0.8579 | 0.7548 | 0.8 | 0.7654 | 0.8133 | 0.7767 | 0.819 | 0.7505 | 0.8194 | 0.7157 | 0.7818 | 0.7716 | 0.8423 | 0.8356 | 0.8687 | 0.7173 | 0.8 | | 0.3647 | 35.0 | 1365 | 0.3671 | 0.7461 | 0.9647 | 0.8973 | 0.5755 | 0.764 | 0.85 | 0.5404 | 0.8079 | 0.8079 | 0.6167 | 0.8233 | 0.85 | 0.6925 | 0.8077 | 0.7819 | 0.8143 | 0.7778 | 0.8077 | 0.7873 | 0.8158 | 0.7288 | 0.8 | 0.7817 | 0.8067 | 0.7738 | 0.8095 | 0.7303 | 0.8129 | 0.6881 | 0.7682 | 0.7126 | 0.8038 | 0.7963 | 0.8687 | 0.7026 | 0.78 | | 0.3647 | 36.0 | 1404 | 0.3604 | 0.7603 | 0.9628 | 0.9211 | 0.5709 | 0.7786 | 0.85 | 0.5435 | 0.8112 | 0.8112 | 0.6303 | 0.8268 | 0.85 | 0.7226 | 0.8308 | 0.7665 | 0.7905 | 0.8099 | 0.8385 | 0.8065 | 0.8316 | 0.7669 | 0.815 | 0.7489 | 0.8 | 0.789 | 0.8048 | 0.7648 | 0.8161 | 0.7246 | 0.7864 | 0.7536 | 0.8231 | 0.7548 | 0.825 | 0.7152 | 0.7733 | | 0.3647 | 37.0 | 1443 | 0.3428 | 0.7776 | 0.9712 | 0.9339 | 0.6261 | 0.7926 | 0.9 | 0.5532 | 0.8306 | 0.8313 | 0.6848 | 0.8446 | 0.9 | 0.7509 | 0.8615 | 0.7924 | 0.8238 | 0.8001 | 0.8346 | 0.8503 | 0.8684 | 0.7855 | 0.825 | 0.8124 | 0.8467 | 0.8012 | 0.8286 | 0.7723 | 0.8194 | 0.7011 | 0.7727 | 0.7839 | 0.85 | 0.749 | 0.8313 | 0.7321 | 0.8133 | | 0.3647 | 38.0 | 1482 | 0.3478 | 0.7699 | 0.962 | 0.9428 | 0.6391 | 0.7855 | 0.9 | 0.5478 | 0.8235 | 0.8235 | 0.6682 | 0.8371 | 0.9 | 0.6912 | 0.8308 | 0.7782 | 0.8 | 0.8063 | 0.8308 | 0.8305 | 0.8579 | 0.7655 | 0.805 | 0.7891 | 0.8333 | 0.7948 | 0.8333 | 0.7555 | 0.8161 | 0.7119 | 0.7773 | 0.7909 | 0.8423 | 0.795 | 0.8687 | 0.7301 | 0.7867 | | 0.3448 | 39.0 | 1521 | 0.3364 | 0.7651 | 0.9705 | 0.9359 | 0.6459 | 0.7773 | 0.875 | 0.5441 | 0.8202 | 0.8205 | 0.6879 | 0.8314 | 0.875 | 0.736 | 0.8385 | 0.7913 | 0.8143 | 0.8102 | 0.8423 | 0.8351 | 0.8526 | 0.7775 | 0.815 | 0.7768 | 0.8467 | 0.7973 | 0.8286 | 0.7415 | 0.8161 | 0.6688 | 0.7364 | 0.7834 | 0.8577 | 0.7613 | 0.8313 | 0.7018 | 0.7667 | | 0.3448 | 40.0 | 1560 | 0.3587 | 0.7637 | 0.9703 | 0.9207 | 0.6239 | 0.7762 | 0.925 | 0.5447 | 0.8144 | 0.8144 | 0.6682 | 0.8239 | 0.925 | 0.7714 | 0.8538 | 0.7711 | 0.8095 | 0.8072 | 0.8231 | 0.8018 | 0.8263 | 0.7764 | 0.825 | 0.7827 | 0.82 | 0.764 | 0.7952 | 0.745 | 0.8226 | 0.6794 | 0.7636 | 0.7655 | 0.8308 | 0.8111 | 0.8562 | 0.6887 | 0.7467 | | 0.3448 | 41.0 | 1599 | 0.3255 | 0.7904 | 0.9805 | 0.962 | 0.6426 | 0.8061 | 0.9 | 0.5546 | 0.8325 | 0.8331 | 0.6985 | 0.8436 | 0.9 | 0.7856 | 0.8462 | 0.8096 | 0.8286 | 0.8266 | 0.85 | 0.8263 | 0.8421 | 0.8067 | 0.86 | 0.7991 | 0.84 | 0.7951 | 0.8333 | 0.7505 | 0.8129 | 0.7429 | 0.7818 | 0.7766 | 0.8462 | 0.8191 | 0.8562 | 0.7468 | 0.8 | | 0.3448 | 42.0 | 1638 | 0.3370 | 0.7869 | 0.9703 | 0.9425 | 0.6512 | 0.7984 | 0.925 | 0.557 | 0.8354 | 0.8354 | 0.697 | 0.846 | 0.925 | 0.7076 | 0.8231 | 0.7993 | 0.8333 | 0.8252 | 0.8462 | 0.8316 | 0.8579 | 0.8062 | 0.845 | 0.8117 | 0.86 | 0.7905 | 0.819 | 0.7614 | 0.8258 | 0.7499 | 0.8045 | 0.7784 | 0.8269 | 0.8097 | 0.8562 | 0.7711 | 0.8267 | | 0.3448 | 43.0 | 1677 | 0.3588 | 0.7607 | 0.9568 | 0.9296 | 0.6168 | 0.7724 | 0.875 | 0.548 | 0.8159 | 0.8159 | 0.653 | 0.8267 | 0.875 | 0.7101 | 0.8462 | 0.7958 | 0.8095 | 0.7779 | 0.8115 | 0.7812 | 0.8053 | 0.7649 | 0.795 | 0.7917 | 0.8467 | 0.7927 | 0.819 | 0.7425 | 0.8226 | 0.7341 | 0.7773 | 0.7517 | 0.8269 | 0.7666 | 0.8438 | 0.7194 | 0.7867 | | 0.3448 | 44.0 | 1716 | 0.3477 | 0.7795 | 0.9678 | 0.9398 | 0.61 | 0.7952 | 0.875 | 0.5486 | 0.8234 | 0.8238 | 0.6697 | 0.8381 | 0.875 | 0.8197 | 0.8538 | 0.7896 | 0.8238 | 0.8269 | 0.8462 | 0.8326 
| 0.8579 | 0.7767 | 0.81 | 0.7852 | 0.8333 | 0.8019 | 0.8238 | 0.7393 | 0.8065 | 0.6834 | 0.7636 | 0.7739 | 0.8308 | 0.8006 | 0.8625 | 0.7235 | 0.7733 | | 0.3448 | 45.0 | 1755 | 0.3447 | 0.7744 | 0.9709 | 0.9261 | 0.6342 | 0.7876 | 0.9 | 0.551 | 0.8255 | 0.8255 | 0.6758 | 0.8388 | 0.9 | 0.7865 | 0.8308 | 0.7672 | 0.8 | 0.8035 | 0.8346 | 0.8338 | 0.8579 | 0.7888 | 0.825 | 0.7846 | 0.86 | 0.7846 | 0.819 | 0.7549 | 0.8194 | 0.7213 | 0.7955 | 0.7713 | 0.8462 | 0.7688 | 0.8313 | 0.7278 | 0.7867 | | 0.3448 | 46.0 | 1794 | 0.3430 | 0.7741 | 0.9704 | 0.9294 | 0.6212 | 0.7883 | 0.925 | 0.5513 | 0.8192 | 0.8205 | 0.6697 | 0.8326 | 0.925 | 0.7891 | 0.8231 | 0.7791 | 0.819 | 0.8068 | 0.8269 | 0.8379 | 0.8579 | 0.7579 | 0.795 | 0.7864 | 0.8467 | 0.7933 | 0.8143 | 0.7645 | 0.8355 | 0.6911 | 0.7682 | 0.7841 | 0.8423 | 0.7769 | 0.8438 | 0.722 | 0.7733 | | 0.3448 | 47.0 | 1833 | 0.3547 | 0.7624 | 0.9709 | 0.9265 | 0.6266 | 0.7786 | 0.9 | 0.5453 | 0.817 | 0.817 | 0.6864 | 0.8279 | 0.9 | 0.7801 | 0.8615 | 0.7803 | 0.8048 | 0.8041 | 0.8308 | 0.7921 | 0.8263 | 0.7849 | 0.815 | 0.7573 | 0.8267 | 0.7775 | 0.8 | 0.7644 | 0.8323 | 0.6757 | 0.7591 | 0.7818 | 0.8423 | 0.7467 | 0.8188 | 0.7045 | 0.7867 | | 0.3448 | 48.0 | 1872 | 0.3337 | 0.7789 | 0.9694 | 0.9436 | 0.6685 | 0.7908 | 0.9 | 0.5532 | 0.8315 | 0.8319 | 0.7212 | 0.8408 | 0.9 | 0.7529 | 0.8385 | 0.8186 | 0.8524 | 0.827 | 0.8462 | 0.8253 | 0.8474 | 0.7835 | 0.83 | 0.7971 | 0.8467 | 0.7951 | 0.8286 | 0.7789 | 0.8484 | 0.675 | 0.7591 | 0.7713 | 0.8231 | 0.7884 | 0.8687 | 0.7333 | 0.7933 | | 0.3448 | 49.0 | 1911 | 0.3230 | 0.7799 | 0.9706 | 0.9481 | 0.6638 | 0.7912 | 0.9 | 0.5525 | 0.8289 | 0.8293 | 0.7015 | 0.8402 | 0.9 | 0.7015 | 0.8231 | 0.7753 | 0.8048 | 0.8337 | 0.8538 | 0.8148 | 0.8368 | 0.7871 | 0.82 | 0.8297 | 0.8667 | 0.7949 | 0.8238 | 0.7769 | 0.8387 | 0.7055 | 0.7773 | 0.7986 | 0.85 | 0.7769 | 0.8562 | 0.7639 | 0.8 | | 0.3448 | 50.0 | 1950 | 0.3338 | 0.7779 | 0.9668 | 0.9304 | 0.6896 | 0.7882 | 0.9 | 0.5514 | 0.8258 | 0.8287 | 0.7242 | 0.8375 | 0.9 | 0.7171 | 0.8462 | 0.7719 | 0.8143 | 0.8369 | 0.8615 | 0.8328 | 0.8526 | 0.7718 | 0.81 | 0.7655 | 0.8133 | 0.7822 | 0.819 | 0.7728 | 0.8323 | 0.7233 | 0.7591 | 0.8051 | 0.8615 | 0.8148 | 0.8813 | 0.7404 | 0.7933 | | 0.3448 | 51.0 | 1989 | 0.3329 | 0.7742 | 0.9633 | 0.9366 | 0.6585 | 0.7849 | 0.9 | 0.5551 | 0.8314 | 0.8314 | 0.7045 | 0.8412 | 0.9 | 0.7177 | 0.8385 | 0.7831 | 0.8381 | 0.8298 | 0.8577 | 0.7877 | 0.8421 | 0.7898 | 0.835 | 0.7565 | 0.8267 | 0.7795 | 0.8238 | 0.7742 | 0.8355 | 0.7239 | 0.7636 | 0.7991 | 0.8538 | 0.8084 | 0.8687 | 0.741 | 0.7933 | | 0.3244 | 52.0 | 2028 | 0.3387 | 0.7726 | 0.9641 | 0.9352 | 0.6517 | 0.7867 | 0.9 | 0.5514 | 0.8285 | 0.8285 | 0.697 | 0.8386 | 0.9 | 0.7406 | 0.8538 | 0.8301 | 0.8667 | 0.8229 | 0.8462 | 0.7583 | 0.8158 | 0.8058 | 0.85 | 0.7491 | 0.8 | 0.7802 | 0.8143 | 0.7618 | 0.8323 | 0.7198 | 0.7636 | 0.7965 | 0.85 | 0.7871 | 0.8562 | 0.7196 | 0.7933 | | 0.3244 | 53.0 | 2067 | 0.3371 | 0.7832 | 0.9671 | 0.9449 | 0.6874 | 0.7952 | 0.9 | 0.5569 | 0.8328 | 0.8331 | 0.7333 | 0.8412 | 0.9 | 0.7871 | 0.8615 | 0.7975 | 0.8286 | 0.816 | 0.85 | 0.8129 | 0.8421 | 0.7926 | 0.83 | 0.7737 | 0.8333 | 0.799 | 0.8286 | 0.7682 | 0.8323 | 0.702 | 0.7545 | 0.8012 | 0.8423 | 0.8436 | 0.8875 | 0.7049 | 0.8067 | | 0.3244 | 54.0 | 2106 | 0.3511 | 0.7682 | 0.9667 | 0.9373 | 0.6718 | 0.7788 | 0.9 | 0.5493 | 0.8221 | 0.8225 | 0.7197 | 0.8302 | 0.9 | 0.7665 | 0.8231 | 0.7621 | 0.8095 | 0.8072 | 0.8346 | 0.8005 | 0.8316 | 0.7891 | 0.835 | 0.755 | 0.82 | 0.7594 | 0.8143 | 0.7664 | 0.829 | 
0.7 | 0.7682 | 0.7882 | 0.8423 | 0.8087 | 0.8687 | 0.7147 | 0.7933 | | 0.3244 | 55.0 | 2145 | 0.3340 | 0.7674 | 0.9672 | 0.9298 | 0.6237 | 0.7799 | 0.9 | 0.5504 | 0.8273 | 0.8276 | 0.7152 | 0.8359 | 0.9 | 0.7155 | 0.8231 | 0.7725 | 0.8095 | 0.8097 | 0.8308 | 0.7913 | 0.8316 | 0.7796 | 0.835 | 0.7677 | 0.8333 | 0.7688 | 0.8095 | 0.7588 | 0.8387 | 0.7163 | 0.7909 | 0.7751 | 0.8346 | 0.8289 | 0.8875 | 0.7247 | 0.8067 | | 0.3244 | 56.0 | 2184 | 0.3461 | 0.7747 | 0.9619 | 0.9308 | 0.6657 | 0.7857 | 0.9 | 0.557 | 0.8247 | 0.8251 | 0.7106 | 0.834 | 0.9 | 0.7096 | 0.8308 | 0.8014 | 0.8286 | 0.821 | 0.8423 | 0.7874 | 0.8158 | 0.8007 | 0.83 | 0.8116 | 0.8533 | 0.7689 | 0.8095 | 0.7625 | 0.829 | 0.7082 | 0.7727 | 0.8057 | 0.8538 | 0.8221 | 0.875 | 0.697 | 0.76 | | 0.3244 | 57.0 | 2223 | 0.3470 | 0.7747 | 0.9593 | 0.943 | 0.6614 | 0.786 | 0.9 | 0.5548 | 0.8299 | 0.8299 | 0.7182 | 0.8383 | 0.9 | 0.7153 | 0.8385 | 0.7961 | 0.8333 | 0.8345 | 0.8538 | 0.7896 | 0.8421 | 0.7963 | 0.83 | 0.8181 | 0.8467 | 0.7944 | 0.8286 | 0.7582 | 0.829 | 0.6994 | 0.7682 | 0.8031 | 0.8654 | 0.7977 | 0.8562 | 0.6936 | 0.7667 | | 0.3244 | 58.0 | 2262 | 0.3367 | 0.7766 | 0.9662 | 0.9395 | 0.6466 | 0.7903 | 0.7875 | 0.5509 | 0.8302 | 0.8302 | 0.6924 | 0.8405 | 0.9 | 0.7642 | 0.8385 | 0.7876 | 0.8286 | 0.8196 | 0.8462 | 0.8063 | 0.8474 | 0.7951 | 0.825 | 0.777 | 0.84 | 0.7838 | 0.8238 | 0.7518 | 0.8355 | 0.7024 | 0.7636 | 0.7848 | 0.8462 | 0.8286 | 0.875 | 0.7181 | 0.7933 | | 0.3244 | 59.0 | 2301 | 0.3343 | 0.7839 | 0.9692 | 0.9297 | 0.6242 | 0.8023 | 0.7875 | 0.5508 | 0.8327 | 0.8347 | 0.6788 | 0.8473 | 0.9 | 0.7949 | 0.8538 | 0.823 | 0.8524 | 0.8115 | 0.8423 | 0.8057 | 0.8421 | 0.7891 | 0.84 | 0.7982 | 0.8467 | 0.7915 | 0.8333 | 0.7713 | 0.8258 | 0.698 | 0.7682 | 0.8114 | 0.8692 | 0.8257 | 0.8687 | 0.6864 | 0.7733 | | 0.3244 | 60.0 | 2340 | 0.3160 | 0.7843 | 0.9699 | 0.9412 | 0.6567 | 0.7996 | 0.875 | 0.5588 | 0.8369 | 0.8369 | 0.7015 | 0.8484 | 0.875 | 0.7567 | 0.8462 | 0.8036 | 0.8476 | 0.8449 | 0.8654 | 0.8155 | 0.8474 | 0.8116 | 0.84 | 0.7813 | 0.84 | 0.777 | 0.819 | 0.7591 | 0.8258 | 0.7329 | 0.7864 | 0.8063 | 0.8692 | 0.8221 | 0.8687 | 0.7011 | 0.7867 | | 0.3244 | 61.0 | 2379 | 0.3186 | 0.7788 | 0.97 | 0.9377 | 0.6608 | 0.7902 | 0.875 | 0.5533 | 0.8332 | 0.8336 | 0.6985 | 0.8458 | 0.875 | 0.7187 | 0.8308 | 0.7957 | 0.8333 | 0.8303 | 0.8577 | 0.831 | 0.8526 | 0.8094 | 0.84 | 0.8014 | 0.8333 | 0.785 | 0.8333 | 0.7568 | 0.8226 | 0.7048 | 0.7682 | 0.7863 | 0.85 | 0.8166 | 0.875 | 0.7092 | 0.8067 | | 0.3244 | 62.0 | 2418 | 0.3139 | 0.7736 | 0.9665 | 0.9358 | 0.6263 | 0.7854 | 0.9 | 0.5528 | 0.8299 | 0.8314 | 0.697 | 0.8418 | 0.9 | 0.7243 | 0.8308 | 0.7997 | 0.8333 | 0.7954 | 0.8269 | 0.8232 | 0.8526 | 0.8103 | 0.835 | 0.754 | 0.8267 | 0.7928 | 0.819 | 0.7659 | 0.829 | 0.7033 | 0.7727 | 0.7879 | 0.8615 | 0.8047 | 0.8687 | 0.722 | 0.82 | | 0.3244 | 63.0 | 2457 | 0.3227 | 0.7887 | 0.9736 | 0.9476 | 0.6921 | 0.799 | 0.9 | 0.5551 | 0.8387 | 0.8387 | 0.7424 | 0.8467 | 0.9 | 0.7834 | 0.8615 | 0.8091 | 0.8476 | 0.8349 | 0.8577 | 0.8292 | 0.8526 | 0.7917 | 0.835 | 0.8085 | 0.86 | 0.7871 | 0.8143 | 0.7692 | 0.8419 | 0.7118 | 0.7773 | 0.8088 | 0.8731 | 0.7981 | 0.85 | 0.7325 | 0.7933 | | 0.3244 | 64.0 | 2496 | 0.3161 | 0.7804 | 0.9686 | 0.9448 | 0.6907 | 0.789 | 0.9 | 0.5544 | 0.8354 | 0.8362 | 0.7409 | 0.8438 | 0.9 | 0.7723 | 0.8692 | 0.7887 | 0.8333 | 0.8127 | 0.8423 | 0.7941 | 0.8368 | 0.7913 | 0.835 | 0.788 | 0.8533 | 0.8054 | 0.8333 | 0.7517 | 0.829 | 0.7488 | 0.7909 | 0.7987 | 0.8615 | 0.7913 | 0.8562 | 0.7217 | 0.7933 | | 0.2973 
| 65.0 | 2535 | 0.3186 | 0.7846 | 0.9666 | 0.9405 | 0.6644 | 0.7976 | 0.85 | 0.5538 | 0.8374 | 0.8374 | 0.7273 | 0.8467 | 0.85 | 0.8024 | 0.8538 | 0.7887 | 0.8238 | 0.8159 | 0.8423 | 0.8234 | 0.8526 | 0.812 | 0.86 | 0.7879 | 0.8533 | 0.7942 | 0.8238 | 0.7602 | 0.8323 | 0.7184 | 0.7818 | 0.8023 | 0.8769 | 0.8258 | 0.875 | 0.6846 | 0.7733 | | 0.2973 | 66.0 | 2574 | 0.3230 | 0.7824 | 0.9629 | 0.9384 | 0.659 | 0.7941 | 0.875 | 0.5561 | 0.8418 | 0.8423 | 0.7258 | 0.8513 | 0.875 | 0.7691 | 0.8769 | 0.807 | 0.8381 | 0.7975 | 0.8385 | 0.825 | 0.8474 | 0.8294 | 0.87 | 0.7628 | 0.8467 | 0.8065 | 0.8333 | 0.7636 | 0.8355 | 0.7133 | 0.7773 | 0.7965 | 0.8692 | 0.8084 | 0.875 | 0.7098 | 0.8 | | 0.2973 | 67.0 | 2613 | 0.3210 | 0.7839 | 0.9637 | 0.9361 | 0.6755 | 0.7947 | 0.875 | 0.5596 | 0.8434 | 0.8434 | 0.7273 | 0.8533 | 0.875 | 0.7315 | 0.8692 | 0.815 | 0.8524 | 0.8115 | 0.85 | 0.8449 | 0.8684 | 0.8356 | 0.87 | 0.7724 | 0.8467 | 0.8201 | 0.8429 | 0.751 | 0.8194 | 0.707 | 0.7636 | 0.7984 | 0.8692 | 0.8127 | 0.8687 | 0.7063 | 0.8 | | 0.2973 | 68.0 | 2652 | 0.3237 | 0.7861 | 0.9655 | 0.9493 | 0.682 | 0.7961 | 0.875 | 0.5601 | 0.8426 | 0.8433 | 0.7364 | 0.8517 | 0.875 | 0.7583 | 0.8846 | 0.8046 | 0.8381 | 0.813 | 0.85 | 0.8225 | 0.8421 | 0.8129 | 0.855 | 0.7957 | 0.8533 | 0.8048 | 0.8381 | 0.7667 | 0.8387 | 0.722 | 0.7727 | 0.7927 | 0.8654 | 0.828 | 0.875 | 0.7118 | 0.8067 | | 0.2973 | 69.0 | 2691 | 0.3199 | 0.7825 | 0.9684 | 0.9462 | 0.6812 | 0.7922 | 0.875 | 0.5576 | 0.8384 | 0.8403 | 0.75 | 0.8469 | 0.875 | 0.7381 | 0.8692 | 0.8035 | 0.8333 | 0.7955 | 0.8385 | 0.8152 | 0.8368 | 0.8194 | 0.865 | 0.772 | 0.8333 | 0.8116 | 0.8333 | 0.7617 | 0.8194 | 0.7319 | 0.8 | 0.8026 | 0.8731 | 0.8295 | 0.8813 | 0.7093 | 0.8 | | 0.2973 | 70.0 | 2730 | 0.3144 | 0.7844 | 0.9671 | 0.9439 | 0.6711 | 0.794 | 0.875 | 0.5592 | 0.8409 | 0.8428 | 0.7409 | 0.8508 | 0.875 | 0.7423 | 0.8615 | 0.8172 | 0.8429 | 0.8043 | 0.8423 | 0.8186 | 0.8474 | 0.8122 | 0.855 | 0.7951 | 0.84 | 0.7987 | 0.8286 | 0.7633 | 0.829 | 0.7176 | 0.7864 | 0.7947 | 0.8731 | 0.8172 | 0.8938 | 0.7312 | 0.8133 | | 0.2973 | 71.0 | 2769 | 0.3160 | 0.7883 | 0.9743 | 0.9428 | 0.6638 | 0.7998 | 0.875 | 0.5575 | 0.8407 | 0.8424 | 0.7227 | 0.8517 | 0.875 | 0.7803 | 0.8692 | 0.828 | 0.8571 | 0.8113 | 0.8462 | 0.8335 | 0.8526 | 0.8126 | 0.855 | 0.7893 | 0.84 | 0.7934 | 0.8238 | 0.759 | 0.8258 | 0.7196 | 0.7864 | 0.8123 | 0.8769 | 0.7994 | 0.8562 | 0.7208 | 0.82 | | 0.2973 | 72.0 | 2808 | 0.3126 | 0.7862 | 0.9724 | 0.9444 | 0.6806 | 0.7953 | 0.875 | 0.5582 | 0.841 | 0.8424 | 0.7409 | 0.85 | 0.875 | 0.7453 | 0.8615 | 0.8237 | 0.8524 | 0.7994 | 0.8308 | 0.8224 | 0.8474 | 0.8292 | 0.86 | 0.7968 | 0.84 | 0.7884 | 0.8238 | 0.7712 | 0.8419 | 0.7228 | 0.7864 | 0.8025 | 0.8692 | 0.8052 | 0.875 | 0.7281 | 0.82 | | 0.2973 | 73.0 | 2847 | 0.3082 | 0.7896 | 0.9708 | 0.9463 | 0.6745 | 0.8021 | 0.875 | 0.5595 | 0.8444 | 0.8454 | 0.7288 | 0.8548 | 0.875 | 0.7778 | 0.8769 | 0.8244 | 0.8571 | 0.7996 | 0.8385 | 0.8326 | 0.8579 | 0.8208 | 0.85 | 0.763 | 0.8467 | 0.7888 | 0.8238 | 0.7714 | 0.8355 | 0.7132 | 0.7864 | 0.8152 | 0.8769 | 0.8191 | 0.875 | 0.7498 | 0.82 | | 0.2973 | 74.0 | 2886 | 0.3101 | 0.7935 | 0.9729 | 0.9513 | 0.6723 | 0.8077 | 0.875 | 0.5598 | 0.8458 | 0.8469 | 0.7333 | 0.8562 | 0.875 | 0.8075 | 0.8692 | 0.8271 | 0.8571 | 0.8077 | 0.8423 | 0.8319 | 0.8579 | 0.8345 | 0.86 | 0.7732 | 0.8533 | 0.8136 | 0.8429 | 0.7574 | 0.829 | 0.7122 | 0.7864 | 0.8045 | 0.8692 | 0.8001 | 0.8687 | 0.7522 | 0.8267 | | 0.2973 | 75.0 | 2925 | 0.3094 | 0.7829 | 0.9619 | 0.9425 | 0.6697 | 0.7952 
| 0.875 | 0.5573 | 0.8418 | 0.8418 | 0.7182 | 0.852 | 0.875 | 0.7548 | 0.8692 | 0.8009 | 0.8429 | 0.8063 | 0.8423 | 0.8228 | 0.8474 | 0.8045 | 0.835 | 0.7582 | 0.84 | 0.7975 | 0.8286 | 0.7637 | 0.8323 | 0.7199 | 0.7864 | 0.7981 | 0.8692 | 0.8077 | 0.8687 | 0.7605 | 0.84 | | 0.2973 | 76.0 | 2964 | 0.3021 | 0.7847 | 0.9702 | 0.9443 | 0.6618 | 0.7977 | 0.875 | 0.5604 | 0.8437 | 0.844 | 0.7152 | 0.8551 | 0.875 | 0.7506 | 0.8538 | 0.8136 | 0.8524 | 0.8015 | 0.8385 | 0.8289 | 0.8526 | 0.8311 | 0.855 | 0.77 | 0.8533 | 0.7943 | 0.8286 | 0.7766 | 0.8387 | 0.7137 | 0.7818 | 0.791 | 0.8654 | 0.8005 | 0.875 | 0.7442 | 0.8333 | | 0.2728 | 77.0 | 3003 | 0.3078 | 0.7897 | 0.9675 | 0.9508 | 0.6532 | 0.8038 | 0.875 | 0.5568 | 0.8426 | 0.8426 | 0.7091 | 0.8536 | 0.875 | 0.7906 | 0.8538 | 0.807 | 0.8381 | 0.8039 | 0.8423 | 0.8284 | 0.8526 | 0.844 | 0.87 | 0.7799 | 0.86 | 0.7876 | 0.819 | 0.7775 | 0.8419 | 0.7118 | 0.7727 | 0.7916 | 0.8577 | 0.7997 | 0.8625 | 0.7546 | 0.84 | | 0.2728 | 78.0 | 3042 | 0.3073 | 0.7935 | 0.9634 | 0.9439 | 0.6684 | 0.8063 | 0.875 | 0.5638 | 0.8491 | 0.8491 | 0.7136 | 0.8601 | 0.875 | 0.756 | 0.8692 | 0.812 | 0.8524 | 0.8189 | 0.85 | 0.8361 | 0.8632 | 0.8104 | 0.845 | 0.7815 | 0.8467 | 0.8143 | 0.8476 | 0.774 | 0.8323 | 0.7187 | 0.7727 | 0.8152 | 0.8808 | 0.8068 | 0.8625 | 0.7782 | 0.8667 | | 0.2728 | 79.0 | 3081 | 0.3117 | 0.7861 | 0.9667 | 0.946 | 0.6695 | 0.7963 | 0.875 | 0.561 | 0.8446 | 0.8446 | 0.7288 | 0.8532 | 0.875 | 0.7423 | 0.8615 | 0.803 | 0.8476 | 0.805 | 0.8346 | 0.8163 | 0.8421 | 0.816 | 0.85 | 0.783 | 0.8667 | 0.808 | 0.8429 | 0.7622 | 0.8258 | 0.725 | 0.7864 | 0.7987 | 0.8692 | 0.8362 | 0.8813 | 0.7377 | 0.8267 | | 0.2728 | 80.0 | 3120 | 0.3131 | 0.7875 | 0.9621 | 0.9428 | 0.6658 | 0.7993 | 0.875 | 0.5599 | 0.8436 | 0.8436 | 0.7197 | 0.8528 | 0.875 | 0.7455 | 0.8615 | 0.816 | 0.8619 | 0.82 | 0.85 | 0.8177 | 0.8526 | 0.8289 | 0.85 | 0.7627 | 0.8533 | 0.8057 | 0.8381 | 0.7692 | 0.8355 | 0.7129 | 0.7727 | 0.7961 | 0.8654 | 0.8306 | 0.875 | 0.7452 | 0.8067 | | 0.2728 | 81.0 | 3159 | 0.3104 | 0.792 | 0.9635 | 0.9457 | 0.6798 | 0.8009 | 0.875 | 0.5609 | 0.8476 | 0.8476 | 0.7318 | 0.8563 | 0.875 | 0.7586 | 0.8769 | 0.8082 | 0.8476 | 0.8082 | 0.8462 | 0.8404 | 0.8632 | 0.8343 | 0.85 | 0.7755 | 0.8533 | 0.7909 | 0.8286 | 0.775 | 0.8323 | 0.7231 | 0.7818 | 0.8079 | 0.8769 | 0.8322 | 0.875 | 0.7496 | 0.84 | | 0.2728 | 82.0 | 3198 | 0.3063 | 0.7931 | 0.9639 | 0.9469 | 0.6657 | 0.8033 | 0.875 | 0.5593 | 0.8485 | 0.8485 | 0.7288 | 0.8573 | 0.875 | 0.746 | 0.8615 | 0.818 | 0.8524 | 0.8034 | 0.8423 | 0.8448 | 0.8684 | 0.8299 | 0.855 | 0.7792 | 0.8533 | 0.8002 | 0.8286 | 0.7654 | 0.8323 | 0.719 | 0.7818 | 0.82 | 0.8846 | 0.8302 | 0.875 | 0.7612 | 0.8467 | | 0.2728 | 83.0 | 3237 | 0.3069 | 0.7947 | 0.965 | 0.9489 | 0.6776 | 0.8044 | 0.875 | 0.5604 | 0.847 | 0.847 | 0.7333 | 0.8559 | 0.875 | 0.7467 | 0.8538 | 0.8142 | 0.8476 | 0.8142 | 0.8462 | 0.8436 | 0.8684 | 0.8185 | 0.84 | 0.7807 | 0.86 | 0.8182 | 0.8429 | 0.7604 | 0.8355 | 0.726 | 0.7773 | 0.8124 | 0.8769 | 0.8389 | 0.875 | 0.7624 | 0.84 | | 0.2728 | 84.0 | 3276 | 0.3071 | 0.7929 | 0.9646 | 0.9484 | 0.6733 | 0.8028 | 0.875 | 0.5605 | 0.8447 | 0.8452 | 0.7242 | 0.8542 | 0.875 | 0.7446 | 0.8615 | 0.811 | 0.8476 | 0.8129 | 0.8462 | 0.839 | 0.8632 | 0.8409 | 0.855 | 0.7704 | 0.84 | 0.8199 | 0.8476 | 0.7642 | 0.8194 | 0.7101 | 0.7682 | 0.8087 | 0.8731 | 0.8384 | 0.8813 | 0.7546 | 0.84 | | 0.2728 | 85.0 | 3315 | 0.3051 | 0.7919 | 0.9642 | 0.9484 | 0.671 | 0.8025 | 0.875 | 0.5622 | 0.8473 | 0.8473 | 0.7197 | 0.8566 | 0.875 | 0.748 | 
0.8692 | 0.8187 | 0.8476 | 0.8167 | 0.8462 | 0.8329 | 0.8632 | 0.8233 | 0.845 | 0.7791 | 0.86 | 0.7975 | 0.8333 | 0.7671 | 0.829 | 0.7133 | 0.7727 | 0.8028 | 0.8731 | 0.8397 | 0.8813 | 0.7637 | 0.8467 | | 0.2728 | 86.0 | 3354 | 0.3082 | 0.7922 | 0.964 | 0.9479 | 0.6935 | 0.8011 | 0.875 | 0.5605 | 0.8453 | 0.8453 | 0.7424 | 0.8529 | 0.875 | 0.7513 | 0.8692 | 0.8101 | 0.8429 | 0.8079 | 0.8423 | 0.8271 | 0.8526 | 0.84 | 0.855 | 0.7738 | 0.8533 | 0.7953 | 0.8333 | 0.767 | 0.8258 | 0.7192 | 0.7773 | 0.8104 | 0.8769 | 0.8517 | 0.8813 | 0.7529 | 0.8333 | | 0.2728 | 87.0 | 3393 | 0.3085 | 0.7917 | 0.9631 | 0.9476 | 0.6712 | 0.8034 | 0.875 | 0.562 | 0.8456 | 0.8461 | 0.7212 | 0.8559 | 0.875 | 0.7538 | 0.8692 | 0.8173 | 0.8476 | 0.8139 | 0.8462 | 0.831 | 0.8579 | 0.8095 | 0.845 | 0.7788 | 0.8533 | 0.8002 | 0.8381 | 0.7672 | 0.8226 | 0.7228 | 0.7727 | 0.813 | 0.8731 | 0.8356 | 0.8813 | 0.7572 | 0.8467 | | 0.2728 | 88.0 | 3432 | 0.3055 | 0.7929 | 0.9628 | 0.9472 | 0.669 | 0.8052 | 0.875 | 0.562 | 0.8473 | 0.8473 | 0.7242 | 0.8574 | 0.875 | 0.7436 | 0.8538 | 0.8254 | 0.8571 | 0.8197 | 0.8538 | 0.8304 | 0.8579 | 0.8236 | 0.85 | 0.7707 | 0.8533 | 0.8037 | 0.8429 | 0.7703 | 0.8323 | 0.7142 | 0.7682 | 0.8185 | 0.8769 | 0.8355 | 0.875 | 0.7595 | 0.8467 | | 0.2728 | 89.0 | 3471 | 0.3054 | 0.7939 | 0.9633 | 0.9475 | 0.6657 | 0.8064 | 0.875 | 0.5628 | 0.8472 | 0.8472 | 0.7106 | 0.8579 | 0.875 | 0.7378 | 0.8538 | 0.8267 | 0.8571 | 0.8183 | 0.85 | 0.8361 | 0.8632 | 0.8213 | 0.85 | 0.7731 | 0.8533 | 0.8028 | 0.8381 | 0.7727 | 0.8355 | 0.7163 | 0.7682 | 0.8122 | 0.8692 | 0.843 | 0.8813 | 0.7662 | 0.8467 | | 0.2674 | 90.0 | 3510 | 0.3074 | 0.7927 | 0.9634 | 0.9479 | 0.6694 | 0.8042 | 0.875 | 0.5623 | 0.8466 | 0.8466 | 0.7197 | 0.857 | 0.875 | 0.7378 | 0.8538 | 0.815 | 0.8476 | 0.8143 | 0.8462 | 0.8293 | 0.8579 | 0.8305 | 0.855 | 0.7732 | 0.8533 | 0.8143 | 0.8429 | 0.7654 | 0.8323 | 0.7197 | 0.7727 | 0.8058 | 0.8692 | 0.8438 | 0.8813 | 0.7629 | 0.8467 | | 0.2674 | 91.0 | 3549 | 0.3062 | 0.7949 | 0.9644 | 0.9491 | 0.6722 | 0.8053 | 0.875 | 0.5626 | 0.8476 | 0.8476 | 0.7288 | 0.8565 | 0.875 | 0.7415 | 0.8615 | 0.8163 | 0.8524 | 0.8095 | 0.8423 | 0.8264 | 0.8526 | 0.8404 | 0.855 | 0.7812 | 0.8533 | 0.8162 | 0.8476 | 0.77 | 0.8323 | 0.7203 | 0.7727 | 0.8099 | 0.8731 | 0.8431 | 0.8813 | 0.7641 | 0.8467 | | 0.2674 | 92.0 | 3588 | 0.3069 | 0.7931 | 0.9637 | 0.9483 | 0.6626 | 0.8049 | 0.875 | 0.5623 | 0.846 | 0.846 | 0.7152 | 0.8559 | 0.875 | 0.7433 | 0.8615 | 0.8252 | 0.8571 | 0.8134 | 0.8462 | 0.8343 | 0.8632 | 0.823 | 0.85 | 0.7856 | 0.8533 | 0.8143 | 0.8381 | 0.7699 | 0.8355 | 0.717 | 0.7682 | 0.7917 | 0.8577 | 0.8431 | 0.8813 | 0.7563 | 0.84 | | 0.2674 | 93.0 | 3627 | 0.3068 | 0.7936 | 0.9637 | 0.9483 | 0.6672 | 0.8048 | 0.875 | 0.5635 | 0.8472 | 0.8472 | 0.7152 | 0.8572 | 0.875 | 0.7415 | 0.8615 | 0.8198 | 0.8571 | 0.8152 | 0.8462 | 0.8306 | 0.8579 | 0.8245 | 0.85 | 0.7753 | 0.8533 | 0.8154 | 0.8429 | 0.7694 | 0.8323 | 0.7154 | 0.7682 | 0.8075 | 0.8692 | 0.8431 | 0.8813 | 0.7652 | 0.8467 | | 0.2674 | 94.0 | 3666 | 0.3064 | 0.7939 | 0.9637 | 0.9483 | 0.6611 | 0.8056 | 0.875 | 0.563 | 0.8472 | 0.8472 | 0.7106 | 0.8578 | 0.875 | 0.7378 | 0.8538 | 0.821 | 0.8571 | 0.8152 | 0.8462 | 0.8308 | 0.8579 | 0.8301 | 0.85 | 0.7753 | 0.8533 | 0.8166 | 0.8429 | 0.7701 | 0.8355 | 0.7138 | 0.7682 | 0.8089 | 0.8731 | 0.8431 | 0.8813 | 0.7641 | 0.8467 | | 0.2674 | 95.0 | 3705 | 0.3051 | 0.7949 | 0.9644 | 0.9489 | 0.6634 | 0.806 | 0.875 | 0.5634 | 0.8475 | 0.8475 | 0.7152 | 0.8578 | 0.875 | 0.7371 | 0.8538 | 0.8202 | 0.8571 | 0.8154 | 
0.8462 | 0.831 | 0.8579 | 0.8319 | 0.85 | 0.781 | 0.8533 | 0.8183 | 0.8429 | 0.7704 | 0.8355 | 0.7154 | 0.7682 | 0.8116 | 0.8769 | 0.8431 | 0.8813 | 0.7629 | 0.8467 | | 0.2674 | 96.0 | 3744 | 0.3059 | 0.7941 | 0.9644 | 0.9491 | 0.6619 | 0.8052 | 0.875 | 0.5625 | 0.8466 | 0.8466 | 0.7152 | 0.8566 | 0.875 | 0.7408 | 0.8615 | 0.8181 | 0.8524 | 0.8154 | 0.8462 | 0.8307 | 0.8579 | 0.8416 | 0.855 | 0.7812 | 0.8533 | 0.8183 | 0.8429 | 0.7657 | 0.8323 | 0.7159 | 0.7682 | 0.7938 | 0.8615 | 0.8431 | 0.8813 | 0.7641 | 0.8467 | | 0.2674 | 97.0 | 3783 | 0.3058 | 0.7939 | 0.9644 | 0.9491 | 0.6665 | 0.8049 | 0.875 | 0.5629 | 0.8466 | 0.8466 | 0.7152 | 0.8566 | 0.875 | 0.7408 | 0.8615 | 0.8176 | 0.8524 | 0.8134 | 0.8462 | 0.8312 | 0.8579 | 0.8419 | 0.855 | 0.7812 | 0.8533 | 0.8166 | 0.8429 | 0.7657 | 0.8323 | 0.7159 | 0.7682 | 0.7939 | 0.8615 | 0.843 | 0.8813 | 0.765 | 0.8467 | | 0.2674 | 98.0 | 3822 | 0.3055 | 0.7942 | 0.9644 | 0.9489 | 0.6626 | 0.8053 | 0.875 | 0.5627 | 0.8469 | 0.8469 | 0.7152 | 0.8569 | 0.875 | 0.7408 | 0.8615 | 0.8189 | 0.8524 | 0.8134 | 0.8462 | 0.8307 | 0.8579 | 0.8415 | 0.855 | 0.7812 | 0.8533 | 0.8175 | 0.8429 | 0.7701 | 0.8355 | 0.7159 | 0.7682 | 0.7939 | 0.8615 | 0.843 | 0.8813 | 0.7638 | 0.8467 | | 0.2674 | 99.0 | 3861 | 0.3054 | 0.7951 | 0.9644 | 0.9489 | 0.6649 | 0.8059 | 0.875 | 0.563 | 0.8475 | 0.8475 | 0.7197 | 0.8572 | 0.875 | 0.7408 | 0.8615 | 0.8199 | 0.8524 | 0.8134 | 0.8462 | 0.8307 | 0.8579 | 0.841 | 0.855 | 0.7812 | 0.8533 | 0.8175 | 0.8429 | 0.7701 | 0.8355 | 0.7159 | 0.7682 | 0.8037 | 0.8692 | 0.843 | 0.8813 | 0.7638 | 0.8467 | | 0.2674 | 100.0 | 3900 | 0.3054 | 0.795 | 0.9644 | 0.9489 | 0.6649 | 0.8058 | 0.875 | 0.563 | 0.8475 | 0.8475 | 0.7197 | 0.8572 | 0.875 | 0.7408 | 0.8615 | 0.8189 | 0.8524 | 0.8134 | 0.8462 | 0.8307 | 0.8579 | 0.841 | 0.855 | 0.7812 | 0.8533 | 0.8175 | 0.8429 | 0.7701 | 0.8355 | 0.7159 | 0.7682 | 0.8037 | 0.8692 | 0.843 | 0.8813 | 0.7638 | 0.8467 | ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.0+cu121 - Datasets 2.21.0 - Tokenizers 0.19.1
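The card above reports detailed per-class metrics but no usage snippet. As a hypothetical sketch (assuming the checkpoint loads with the generic `transformers` object-detection classes and that class names follow the `pip-N` pattern shown in the metrics; the image path is a placeholder), the detections can be turned into a total pip count:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "aviola/detrDominoTest"
image_processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

# "dominoes.jpg" is a placeholder; use any photo of domino tiles.
image = Image.open("dominoes.jpg").convert("RGB")
inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

detections = image_processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)[0]

total_pips = 0
for score, label_id in zip(detections["scores"], detections["labels"]):
    name = model.config.id2label[label_id.item()]  # e.g. "pip-7"
    total_pips += int(name.split("-")[1])          # sum the pip value of each detected face
    print(f"{name}: {score:.2f}")
print(f"total pips detected: {total_pips}")
```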
[ "pip-1", "pip-10", "pip-11", "pip-12", "pip-2", "pip-3", "pip-4", "pip-5", "pip-6", "pip-7", "pip-8", "pip-9" ]
IDEA-Research/dab-detr-resnet-50-dc5-fixxy
# Model Card for Model ID ## Table of Contents 1. [Model Details](#model-details) 2. [Model Sources](#model-sources) 3. [How to Get Started with the Model](#how-to-get-started-with-the-model) 4. [Training Details](#training-details) 5. [Evaluation](#evaluation) 6. [Model Architecture and Objective](#model-architecture-and-objective) 7. [Citation](#citation) ## Model Details ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_convergence_plot.png) > We present in this paper a novel query formulation using dynamic anchor boxes for DETR (DEtection TRansformer) and offer a deeper understanding of the role of queries in DETR. This new formulation directly uses box coordinates as queries in Transformer decoders and dynamically updates them layer-by-layer. Using box coordinates not only helps using explicit positional priors to improve the query-to-feature similarity and eliminate the slow training convergence issue in DETR, but also allows us to modulate the positional attention map using the box width and height information. Such a design makes it clear that queries in DETR can be implemented as performing soft ROI pooling layer-by-layer in a cascade manner. As a result, it leads to the best performance on MS-COCO benchmark among the DETR-like detection models under the same setting, e.g., AP 45.7\% using ResNet50-DC5 as backbone trained in 50 epochs. We also conducted extensive experiments to confirm our analysis and verify the effectiveness of our methods. ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** Shilong Liu, Feng Li, Hao Zhang, Xiao Yang, Xianbiao Qi, Hang Su, Jun Zhu, Lei Zhang - **Funded by:** IDEA-Research - **Shared by:** David Hajdu - **Model type:** DAB-DETR - **License:** Apache-2.0 ### Model Sources <!-- Provide the basic links for the model. --> - **Repository:** https://github.com/IDEA-Research/DAB-DETR - **Paper:** https://arxiv.org/abs/2201.12329 ## How to Get Started with the Model Use the code below to get started with the model. 
```python
import torch
import requests

from PIL import Image
from transformers import AutoModelForObjectDetection, AutoImageProcessor

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

image_processor = AutoImageProcessor.from_pretrained("IDEA-Research/dab-detr-resnet-50-dc5-fixxy")
model = AutoModelForObjectDetection.from_pretrained("IDEA-Research/dab-detr-resnet-50-dc5-fixxy")

inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.3)

for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```

This should output

```
remote: 0.85 [41.41, 72.6, 177.42, 118.84]
cat: 0.84 [343.45, 21.74, 641.99, 368.87]
cat: 0.82 [13.25, 54.13, 318.95, 470.27]
remote: 0.70 [333.44, 76.56, 369.1, 189.68]
couch: 0.55 [-0.95, 0.03, 639.02, 476.81]
```

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

The DAB-DETR model was trained on [COCO 2017 object detection](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively.

### Training Procedure

Following Deformable DETR and Conditional DETR, we use 300 anchors as queries. We select the 300 predicted boxes and labels with the largest classification logits for evaluation as well. We also use focal loss (Lin et al., 2020) with α = 0.25, γ = 2 for classification. The same loss terms are used in bipartite matching and in the final loss calculation, but with different coefficients. Classification loss with coefficient 2.0 is used in bipartite matching but 1.0 in the final loss. L1 loss with coefficient 5.0 and GIoU loss (Rezatofighi et al., 2019) with coefficient 2.0 are consistent in both the matching and the final loss calculation procedures. All models are trained on 16 GPUs with 1 image per GPU, and AdamW (Loshchilov & Hutter, 2018) is used for training with weight decay 10⁻⁴. The learning rates for the backbone and the other modules are set to 10⁻⁵ and 10⁻⁴ respectively. We train our models for 50 epochs and drop the learning rate by 0.1 after 40 epochs. All models are trained on Nvidia A100 GPUs. We search hyperparameters with batch size 64, and all results in our paper are reported with batch size 16.

#### Preprocessing

Images are resized/rescaled such that the shortest side is at least 480 and at most 800 pixels and the longest side is at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).
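For reference, the focal loss used in the training procedure above (α = 0.25, γ = 2) is typically written as

$$\mathrm{FL}(p_t) = -\,\alpha_t\,(1 - p_t)^{\gamma}\,\log(p_t), \qquad \alpha_t = 0.25,\; \gamma = 2,$$

where \\(p_t\\) is the predicted probability of the ground-truth class; the \\((1 - p_t)^{\gamma}\\) factor down-weights easy, well-classified examples.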
### Training Hyperparameters - **Training regime:** <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> | **Key** | **Value** | |-----------------------------|---------------------------------------------------------------------------------------------------------------------------------------------| | **activation_dropout** | `0.0` | | **activation_function** | `prelu` | | **attention_dropout** | `0.0` | | **auxiliary_loss** | `false` | | **backbone** | `resnet50` | | **bbox_cost** | `5` | | **bbox_loss_coefficient** | `5` | | **class_cost** | `2` | | **cls_loss_coefficient** | `2` | | **decoder_attention_heads** | `8` | | **decoder_ffn_dim** | `2048` | | **decoder_layers** | `6` | | **dropout** | `0.1` | | **encoder_attention_heads** | `8` | | **encoder_ffn_dim** | `2048` | | **encoder_layers** | `6` | | **focal_alpha** | `0.25` | | **giou_cost** | `2` | | **giou_loss_coefficient** | `2` | | **hidden_size** | `256` | | **init_std** | `0.02` | | **init_xavier_std** | `1.0` | | **initializer_bias_prior_prob** | `null` | | **keep_query_pos** | `false` | | **normalize_before** | `false` | | **num_hidden_layers** | `6` | | **num_patterns** | `0` | | **num_queries** | `300` | | **query_dim** | `4` | | **random_refpoints_xy** | `false` | | **sine_position_embedding_scale** | `null` | | **temperature_height** | `20` | | **temperature_width** | `20` | ## Evaluation ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_results.png) ### Model Architecture and Objective ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_model_arch.png) Overview of DAB-DETR. We extract image spatial features using a CNN backbone followed with Transformer encoders to refine the CNN features. Then dual queries, including positional queries (anchor boxes) and content queries (decoder embeddings), are fed into the decoder to probe the objects which correspond to the anchors and have similar patterns with the content queries. The dual queries are updated layer-by-layer to get close to the target ground-truth objects gradually. The outputs of the final decoder layer are used to predict the objects with labels and boxes by prediction heads, and then a bipartite graph matching is conducted to calculate loss as in DETR. ## Citation <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** ```bibtex @inproceedings{ liu2022dabdetr, title={{DAB}-{DETR}: Dynamic Anchor Boxes are Better Queries for {DETR}}, author={Shilong Liu and Feng Li and Hao Zhang and Xiao Yang and Xianbiao Qi and Hang Su and Jun Zhu and Lei Zhang}, booktitle={International Conference on Learning Representations}, year={2022}, url={https://openreview.net/forum?id=oMI9PjOb9Jl} } ``` ## Model Card Authors [David Hajdu](https://huggingface.co/davidhajdu)
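The hyperparameter table earlier in this card mirrors the checkpoint's configuration, so the same values can be read programmatically. A small sketch, assuming a `transformers` release that includes DAB-DETR support (the expected outputs in the comments are simply the values from the table):

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("IDEA-Research/dab-detr-resnet-50-dc5-fixxy")

# A few of the values from the hyperparameter table, read straight from the hub config.
print(config.num_queries)            # 300
print(config.activation_function)    # "prelu"
print(config.query_dim)              # 4
print(config.temperature_width, config.temperature_height)  # 20 20
```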
[ "n/a", "person", "bicycle", "car", "motorcycle", "airplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "n/a", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "n/a", "backpack", "umbrella", "n/a", "n/a", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "n/a", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "couch", "potted plant", "bed", "n/a", "dining table", "n/a", "n/a", "toilet", "n/a", "tv", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "n/a", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush" ]
IDEA-Research/dab-detr-resnet-50-pat3
# Model Card for Model ID ## Table of Contents 1. [Model Details](#model-details) 2. [Model Sources](#model-sources) 3. [How to Get Started with the Model](#how-to-get-started-with-the-model) 4. [Training Details](#training-details) 5. [Evaluation](#evaluation) 6. [Model Architecture and Objective](#model-architecture-and-objective) 7. [Citation](#citation) ## Model Details ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_convergence_plot.png) > We present in this paper a novel query formulation using dynamic anchor boxes for DETR (DEtection TRansformer) and offer a deeper understanding of the role of queries in DETR. This new formulation directly uses box coordinates as queries in Transformer decoders and dynamically updates them layer-by-layer. Using box coordinates not only helps using explicit positional priors to improve the query-to-feature similarity and eliminate the slow training convergence issue in DETR, but also allows us to modulate the positional attention map using the box width and height information. Such a design makes it clear that queries in DETR can be implemented as performing soft ROI pooling layer-by-layer in a cascade manner. As a result, it leads to the best performance on MS-COCO benchmark among the DETR-like detection models under the same setting, e.g., AP 45.7\% using ResNet50-DC5 as backbone trained in 50 epochs. We also conducted extensive experiments to confirm our analysis and verify the effectiveness of our methods. ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** Shilong Liu, Feng Li, Hao Zhang, Xiao Yang, Xianbiao Qi, Hang Su, Jun Zhu, Lei Zhang - **Funded by:** IDEA-Research - **Shared by:** David Hajdu - **Model type:** DAB-DETR - **License:** Apache-2.0 ### Model Sources <!-- Provide the basic links for the model. --> - **Repository:** https://github.com/IDEA-Research/DAB-DETR - **Paper:** https://arxiv.org/abs/2201.12329 ## How to Get Started with the Model Use the code below to get started with the model. 
```python
import torch
import requests

from PIL import Image
from transformers import AutoModelForObjectDetection, AutoImageProcessor

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

image_processor = AutoImageProcessor.from_pretrained("IDEA-Research/dab-detr-resnet-50-pat3")
model = AutoModelForObjectDetection.from_pretrained("IDEA-Research/dab-detr-resnet-50-pat3")

inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.3)

for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```

This should output

```
cat: 0.85 [12.57, 49.83, 319.89, 472.63]
remote: 0.84 [38.19, 72.69, 176.99, 118.93]
cat: 0.81 [342.33, 20.66, 640.16, 374.93]
couch: 0.62 [-0.02, 1.33, 639.94, 475.61]
remote: 0.59 [334.27, 75.04, 367.96, 189.94]
```

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

The DAB-DETR model was trained on [COCO 2017 object detection](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively.

### Training Procedure

Following Deformable DETR and Conditional DETR, we use 300 anchors as queries. We select the 300 predicted boxes and labels with the largest classification logits for evaluation as well. We also use focal loss (Lin et al., 2020) with α = 0.25, γ = 2 for classification. The same loss terms are used in bipartite matching and in the final loss calculation, but with different coefficients. Classification loss with coefficient 2.0 is used in bipartite matching but 1.0 in the final loss. L1 loss with coefficient 5.0 and GIoU loss (Rezatofighi et al., 2019) with coefficient 2.0 are consistent in both the matching and the final loss calculation procedures. All models are trained on 16 GPUs with 1 image per GPU, and AdamW (Loshchilov & Hutter, 2018) is used for training with weight decay 10⁻⁴. The learning rates for the backbone and the other modules are set to 10⁻⁵ and 10⁻⁴ respectively. We train our models for 50 epochs and drop the learning rate by 0.1 after 40 epochs. All models are trained on Nvidia A100 GPUs. We search hyperparameters with batch size 64, and all results in our paper are reported with batch size 16.

#### Preprocessing

Images are resized/rescaled such that the shortest side is at least 480 and at most 800 pixels and the longest side is at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).
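A rough sketch of the resize rule described above (shorter side targeted to 800 at inference, longer side capped at 1333). The helper name is ours, not from the repository:

```python
def dab_detr_target_size(height, width, shortest=800, longest=1333):
    """Scale so the shorter side hits `shortest`, then shrink if the longer side would exceed `longest`."""
    scale = shortest / min(height, width)
    if max(height, width) * scale > longest:
        scale = longest / max(height, width)
    return round(height * scale), round(width * scale)

print(dab_detr_target_size(480, 640))   # (800, 1067)
print(dab_detr_target_size(500, 2000))  # (333, 1333)
```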
### Training Hyperparameters - **Training regime:** <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> | **Key** | **Value** | |-----------------------------|---------------------------------------------------------------------------------------------------------------------------------------------| | **activation_dropout** | `0.0` | | **activation_function** | `prelu` | | **attention_dropout** | `0.0` | | **auxiliary_loss** | `false` | | **backbone** | `resnet50` | | **bbox_cost** | `5` | | **bbox_loss_coefficient** | `5` | | **class_cost** | `2` | | **cls_loss_coefficient** | `2` | | **decoder_attention_heads** | `8` | | **decoder_ffn_dim** | `2048` | | **decoder_layers** | `6` | | **dropout** | `0.1` | | **encoder_attention_heads** | `8` | | **encoder_ffn_dim** | `2048` | | **encoder_layers** | `6` | | **focal_alpha** | `0.25` | | **giou_cost** | `2` | | **giou_loss_coefficient** | `2` | | **hidden_size** | `256` | | **init_std** | `0.02` | | **init_xavier_std** | `1.0` | | **initializer_bias_prior_prob** | `null` | | **keep_query_pos** | `false` | | **normalize_before** | `false` | | **num_hidden_layers** | `6` | | **num_patterns** | `0` | | **num_queries** | `300` | | **query_dim** | `4` | | **random_refpoints_xy** | `false` | | **sine_position_embedding_scale** | `null` | | **temperature_height** | `20` | | **temperature_width** | `20` | ## Evaluation ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_results.png) ### Model Architecture and Objective ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_model_arch.png) Overview of DAB-DETR. We extract image spatial features using a CNN backbone followed with Transformer encoders to refine the CNN features. Then dual queries, including positional queries (anchor boxes) and content queries (decoder embeddings), are fed into the decoder to probe the objects which correspond to the anchors and have similar patterns with the content queries. The dual queries are updated layer-by-layer to get close to the target ground-truth objects gradually. The outputs of the final decoder layer are used to predict the objects with labels and boxes by prediction heads, and then a bipartite graph matching is conducted to calculate loss as in DETR. ## Citation <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** ```bibtex @inproceedings{ liu2022dabdetr, title={{DAB}-{DETR}: Dynamic Anchor Boxes are Better Queries for {DETR}}, author={Shilong Liu and Feng Li and Hao Zhang and Xiao Yang and Xianbiao Qi and Hang Su and Jun Zhu and Lei Zhang}, booktitle={International Conference on Learning Representations}, year={2022}, url={https://openreview.net/forum?id=oMI9PjOb9Jl} } ``` ## Model Card Authors [David Hajdu](https://huggingface.co/davidhajdu)
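As an alternative to the manual inference snippet earlier in this card, the checkpoint should also work with the generic object-detection pipeline, since it only relies on the same auto classes (untested here, shown as a sketch):

```python
from transformers import pipeline

detector = pipeline("object-detection", model="IDEA-Research/dab-detr-resnet-50-pat3")
outputs = detector("http://images.cocodataset.org/val2017/000000039769.jpg", threshold=0.3)
for det in outputs:
    print(det["label"], round(det["score"], 2), det["box"])
```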
[ "n/a", "person", "bicycle", "car", "motorcycle", "airplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "n/a", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "n/a", "backpack", "umbrella", "n/a", "n/a", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "n/a", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "couch", "potted plant", "bed", "n/a", "dining table", "n/a", "n/a", "toilet", "n/a", "tv", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "n/a", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush" ]
IDEA-Research/dab-detr-resnet-50-dc5-pat3
# Model Card for Model ID ## Table of Contents 1. [Model Details](#model-details) 2. [Model Sources](#model-sources) 3. [How to Get Started with the Model](#how-to-get-started-with-the-model) 4. [Training Details](#training-details) 5. [Evaluation](#evaluation) 6. [Model Architecture and Objective](#model-architecture-and-objective) 7. [Citation](#citation) ## Model Details ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_convergence_plot.png) > We present in this paper a novel query formulation using dynamic anchor boxes for DETR (DEtection TRansformer) and offer a deeper understanding of the role of queries in DETR. This new formulation directly uses box coordinates as queries in Transformer decoders and dynamically updates them layer-by-layer. Using box coordinates not only helps using explicit positional priors to improve the query-to-feature similarity and eliminate the slow training convergence issue in DETR, but also allows us to modulate the positional attention map using the box width and height information. Such a design makes it clear that queries in DETR can be implemented as performing soft ROI pooling layer-by-layer in a cascade manner. As a result, it leads to the best performance on MS-COCO benchmark among the DETR-like detection models under the same setting, e.g., AP 45.7\% using ResNet50-DC5 as backbone trained in 50 epochs. We also conducted extensive experiments to confirm our analysis and verify the effectiveness of our methods. ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** Shilong Liu, Feng Li, Hao Zhang, Xiao Yang, Xianbiao Qi, Hang Su, Jun Zhu, Lei Zhang - **Funded by:** IDEA-Research - **Shared by:** David Hajdu - **Model type:** DAB-DETR - **License:** Apache-2.0 ### Model Sources <!-- Provide the basic links for the model. --> - **Repository:** https://github.com/IDEA-Research/DAB-DETR - **Paper:** https://arxiv.org/abs/2201.12329 ## How to Get Started with the Model Use the code below to get started with the model. 
```python
import torch
import requests

from PIL import Image
from transformers import AutoModelForObjectDetection, AutoImageProcessor

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

image_processor = AutoImageProcessor.from_pretrained("IDEA-Research/dab-detr-resnet-50-dc5-pat3")
model = AutoModelForObjectDetection.from_pretrained("IDEA-Research/dab-detr-resnet-50-dc5-pat3")

inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

results = image_processor.post_process_object_detection(outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.3)

for result in results:
    for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
        score, label = score.item(), label_id.item()
        box = [round(i, 2) for i in box.tolist()]
        print(f"{model.config.id2label[label]}: {score:.2f} {box}")
```

This should output

```
remote: 0.84 [39.72, 73.18, 177.02, 119.15]
cat: 0.82 [341.19, 23.94, 641.08, 369.33]
cat: 0.82 [11.81, 50.36, 318.21, 472.41]
remote: 0.81 [334.1, 77.0, 368.36, 189.37]
couch: 0.52 [0.22, 1.87, 640.21, 474.03]
```

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

The DAB-DETR model was trained on [COCO 2017 object detection](https://cocodataset.org/#download), a dataset consisting of 118k/5k annotated images for training/validation respectively.

### Training Procedure

Following Deformable DETR and Conditional DETR, we use 300 anchors as queries. We select the 300 predicted boxes and labels with the largest classification logits for evaluation as well. We also use focal loss (Lin et al., 2020) with α = 0.25, γ = 2 for classification. The same loss terms are used in bipartite matching and in the final loss calculation, but with different coefficients. Classification loss with coefficient 2.0 is used in bipartite matching but 1.0 in the final loss. L1 loss with coefficient 5.0 and GIoU loss (Rezatofighi et al., 2019) with coefficient 2.0 are consistent in both the matching and the final loss calculation procedures. All models are trained on 16 GPUs with 1 image per GPU, and AdamW (Loshchilov & Hutter, 2018) is used for training with weight decay 10⁻⁴. The learning rates for the backbone and the other modules are set to 10⁻⁵ and 10⁻⁴ respectively. We train our models for 50 epochs and drop the learning rate by 0.1 after 40 epochs. All models are trained on Nvidia A100 GPUs. We search hyperparameters with batch size 64, and all results in our paper are reported with batch size 16.

#### Preprocessing

Images are resized/rescaled such that the shortest side is at least 480 and at most 800 pixels and the longest side is at most 1333 pixels, and normalized across the RGB channels with the ImageNet mean (0.485, 0.456, 0.406) and standard deviation (0.229, 0.224, 0.225).
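To make the bipartite matching described in the training procedure above concrete, here is a toy sketch of how the three cost terms (class 2.0, L1 5.0, GIoU 2.0) are combined into one cost matrix and solved with the Hungarian algorithm. The cost matrices here are random placeholders, not real model outputs:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def hungarian_match(cost_class, cost_bbox, cost_giou):
    # Weighted sum of the per-pair costs, shape [num_queries, num_targets].
    cost = 2.0 * cost_class + 5.0 * cost_bbox + 2.0 * cost_giou
    return linear_sum_assignment(cost)  # (query indices, target indices)

rng = np.random.default_rng(0)
num_queries, num_targets = 300, 5
pred_idx, tgt_idx = hungarian_match(
    rng.random((num_queries, num_targets)),
    rng.random((num_queries, num_targets)),
    rng.random((num_queries, num_targets)),
)
print(list(zip(pred_idx.tolist(), tgt_idx.tolist())))  # one matched query per target
```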
### Training Hyperparameters - **Training regime:** <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> | **Key** | **Value** | |-----------------------------|---------------------------------------------------------------------------------------------------------------------------------------------| | **activation_dropout** | `0.0` | | **activation_function** | `prelu` | | **attention_dropout** | `0.0` | | **auxiliary_loss** | `false` | | **backbone** | `resnet50` | | **bbox_cost** | `5` | | **bbox_loss_coefficient** | `5` | | **class_cost** | `2` | | **cls_loss_coefficient** | `2` | | **decoder_attention_heads** | `8` | | **decoder_ffn_dim** | `2048` | | **decoder_layers** | `6` | | **dropout** | `0.1` | | **encoder_attention_heads** | `8` | | **encoder_ffn_dim** | `2048` | | **encoder_layers** | `6` | | **focal_alpha** | `0.25` | | **giou_cost** | `2` | | **giou_loss_coefficient** | `2` | | **hidden_size** | `256` | | **init_std** | `0.02` | | **init_xavier_std** | `1.0` | | **initializer_bias_prior_prob** | `null` | | **keep_query_pos** | `false` | | **normalize_before** | `false` | | **num_hidden_layers** | `6` | | **num_patterns** | `0` | | **num_queries** | `300` | | **query_dim** | `4` | | **random_refpoints_xy** | `false` | | **sine_position_embedding_scale** | `null` | | **temperature_height** | `20` | | **temperature_width** | `20` | ## Evaluation ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_results.png) ### Model Architecture and Objective ![image/png](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/dab_detr_model_arch.png) Overview of DAB-DETR. We extract image spatial features using a CNN backbone followed with Transformer encoders to refine the CNN features. Then dual queries, including positional queries (anchor boxes) and content queries (decoder embeddings), are fed into the decoder to probe the objects which correspond to the anchors and have similar patterns with the content queries. The dual queries are updated layer-by-layer to get close to the target ground-truth objects gradually. The outputs of the final decoder layer are used to predict the objects with labels and boxes by prediction heads, and then a bipartite graph matching is conducted to calculate loss as in DETR. ## Citation <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** ```bibtex @inproceedings{ liu2022dabdetr, title={{DAB}-{DETR}: Dynamic Anchor Boxes are Better Queries for {DETR}}, author={Shilong Liu and Feng Li and Hao Zhang and Xiao Yang and Xianbiao Qi and Hang Su and Jun Zhu and Lei Zhang}, booktitle={International Conference on Learning Representations}, year={2022}, url={https://openreview.net/forum?id=oMI9PjOb9Jl} } ``` ## Model Card Authors [David Hajdu](https://huggingface.co/davidhajdu)
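For reference, the generalized IoU used by the GIoU loss mentioned in this card is defined as (with \\(C\\) the smallest box enclosing prediction \\(A\\) and ground truth \\(B\\)):

$$\mathrm{GIoU}(A, B) = \mathrm{IoU}(A, B) - \frac{|C \setminus (A \cup B)|}{|C|}, \qquad \mathcal{L}_{\mathrm{GIoU}} = 1 - \mathrm{GIoU}(A, B).$$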
[ "n/a", "person", "bicycle", "car", "motorcycle", "airplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "n/a", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "n/a", "backpack", "umbrella", "n/a", "n/a", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "n/a", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "couch", "potted plant", "bed", "n/a", "dining table", "n/a", "n/a", "toilet", "n/a", "tv", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "n/a", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush" ]
joe611/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 100 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.1+cu121 - Datasets 3.0.0 - Tokenizers 0.19.1
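The training script itself is not included in the card, but a `TrainingArguments` sketch matching the hyperparameters listed above would look roughly like this (the output directory and the `remove_unused_columns` flag are assumptions; Adam betas (0.9, 0.999) and epsilon 1e-8 are the library defaults):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="detr-resnet-50_finetuned_cppe5",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                    # "Native AMP" mixed precision; requires a GPU
    remove_unused_columns=False,  # commonly needed for object-detection collators
)
```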
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
joe611/detr-resnet-50_finetuned_cppe5-take3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5-take3 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.0+cu121 - Datasets 2.21.0 - Tokenizers 0.19.1
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
Jannis997/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.1+cu121 - Datasets 3.0.0 - Tokenizers 0.19.1
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
joe611/detr-resnet-50_finetuned_cppe5-take4
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5-take4 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.0+cu121 - Datasets 2.21.0 - Tokenizers 0.19.1
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
0llheaven/Conditional-detr-finetuned-V6
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
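The "How to Get Started" section above is still a placeholder. A hypothetical sketch, assuming the checkpoint loads with the standard object-detection auto classes and that the example image path exists locally:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo = "0llheaven/Conditional-detr-finetuned-V6"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForObjectDetection.from_pretrained(repo)

image = Image.open("example_chest_xray.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

results = processor.post_process_object_detection(
    outputs, target_sizes=torch.tensor([image.size[::-1]]), threshold=0.5
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 2), [round(v, 1) for v in box.tolist()])
```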
[ "pneumonia", "0ther" ]
wu999/detr-finetuned-XVIEW2
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6", "label_7", "label_8", "label_9", "label_10", "label_11", "label_12", "label_13", "label_14", "label_15", "label_16", "label_17", "label_18", "label_19", "label_20", "label_21", "label_22", "label_23", "label_24", "label_25", "label_26", "label_27", "label_28", "label_29", "label_30", "label_31", "label_32", "label_33", "label_34", "label_35", "label_36", "label_37", "label_38", "label_39", "label_40", "label_41", "label_42", "label_43", "label_44", "label_45", "label_46", "label_47", "label_48", "label_49", "label_50", "label_51", "label_52", "label_53", "label_54", "label_55", "label_56", "label_57", "label_58", "label_59", "label_60" ]
joe611/chickens
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2613 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 300 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:------:|:---------------:| | 1.4696 | 1.0 | 497 | 1.3836 | | 1.0964 | 2.0 | 994 | 1.0764 | | 0.8905 | 3.0 | 1491 | 0.8395 | | 0.8093 | 4.0 | 1988 | 0.7073 | | 0.7444 | 5.0 | 2485 | 0.6422 | | 0.6324 | 6.0 | 2982 | 0.5710 | | 0.6004 | 7.0 | 3479 | 0.5129 | | 0.5475 | 8.0 | 3976 | 0.4670 | | 0.4901 | 9.0 | 4473 | 0.4419 | | 0.4855 | 10.0 | 4970 | 0.4930 | | 0.4547 | 11.0 | 5467 | 0.4002 | | 0.4106 | 12.0 | 5964 | 0.3824 | | 0.4184 | 13.0 | 6461 | 0.3699 | | 0.4038 | 14.0 | 6958 | 0.3770 | | 0.3967 | 15.0 | 7455 | 0.3576 | | 0.4071 | 16.0 | 7952 | 0.3368 | | 0.3439 | 17.0 | 8449 | 0.3470 | | 0.4089 | 18.0 | 8946 | 0.3332 | | 0.3769 | 19.0 | 9443 | 0.3318 | | 0.3565 | 20.0 | 9940 | 0.3513 | | 0.3357 | 21.0 | 10437 | 0.3220 | | 0.3294 | 22.0 | 10934 | 0.3278 | | 0.3482 | 23.0 | 11431 | 0.3140 | | 0.3326 | 24.0 | 11928 | 0.2937 | | 0.3304 | 25.0 | 12425 | 0.3053 | | 0.3439 | 26.0 | 12922 | 0.3023 | | 0.3336 | 27.0 | 13419 | 0.2763 | | 0.318 | 28.0 | 13916 | 0.2977 | | 0.3446 | 29.0 | 14413 | 0.3100 | | 0.3352 | 30.0 | 14910 | 0.3298 | | 0.339 | 31.0 | 15407 | 0.2906 | | 0.3152 | 32.0 | 15904 | 0.3107 | | 0.3235 | 33.0 | 16401 | 0.2959 | | 0.3098 | 34.0 | 16898 | 0.2984 | | 0.3152 | 35.0 | 17395 | 0.2776 | | 0.2926 | 36.0 | 17892 | 0.2892 | | 0.2881 | 37.0 | 18389 | 0.2864 | | 0.3131 | 38.0 | 18886 | 0.2814 | | 0.2829 | 39.0 | 19383 | 0.2890 | | 0.3168 | 40.0 | 19880 | 0.2696 | | 0.3284 | 41.0 | 20377 | 0.2716 | | 0.285 | 42.0 | 20874 | 0.2740 | | 0.2797 | 43.0 | 21371 | 0.2866 | | 0.2888 | 44.0 | 21868 | 0.2714 | | 0.2898 | 45.0 | 22365 | 0.2794 | | 0.2615 | 46.0 | 22862 | 0.2730 | | 0.2661 | 47.0 | 23359 | 0.2964 | | 0.2806 | 48.0 | 23856 | 0.3475 | | 0.2591 | 49.0 | 24353 | 0.2749 | | 0.2824 | 50.0 | 24850 | 0.2791 | | 0.2599 | 51.0 | 25347 | 0.2899 | | 0.2908 | 52.0 | 25844 | 0.2889 | | 0.2776 | 53.0 | 26341 | 0.2812 | | 0.2722 | 54.0 | 26838 | 0.2819 | | 0.2519 | 55.0 | 27335 | 0.2671 | | 0.2735 | 56.0 | 27832 | 0.2716 | | 0.2731 | 57.0 | 28329 | 0.2663 | | 0.2604 | 58.0 | 28826 | 0.2728 | | 0.2602 | 59.0 | 29323 | 0.2742 | | 0.2433 | 60.0 | 29820 | 0.2565 | | 0.2603 | 61.0 | 30317 | 0.2921 | | 0.2515 | 62.0 | 30814 | 0.2737 | | 0.2618 | 63.0 | 31311 | 0.2619 | | 0.248 | 64.0 | 31808 | 0.2770 | | 0.2481 | 65.0 | 32305 | 0.2586 | | 0.2556 | 66.0 | 32802 | 0.2797 | | 0.2515 | 67.0 | 33299 | 0.2464 | | 0.2372 | 68.0 | 33796 | 0.2500 | | 0.239 | 69.0 | 34293 | 0.2562 | | 0.2387 | 70.0 | 34790 | 0.2522 | | 0.2329 | 71.0 | 35287 | 0.2628 | | 0.252 | 72.0 | 35784 | 0.2543 | | 0.263 | 73.0 | 36281 | 0.2615 | | 0.2501 | 74.0 | 36778 | 0.2893 | | 0.2521 | 75.0 | 37275 | 
0.2733 | | 0.2855 | 76.0 | 37772 | 0.2708 | | 0.2387 | 77.0 | 38269 | 0.2532 | | 0.2516 | 78.0 | 38766 | 0.2686 | | 0.2318 | 79.0 | 39263 | 0.2537 | | 0.2436 | 80.0 | 39760 | 0.2452 | | 0.2237 | 81.0 | 40257 | 0.2581 | | 0.2278 | 82.0 | 40754 | 0.2575 | | 0.2303 | 83.0 | 41251 | 0.2526 | | 0.2256 | 84.0 | 41748 | 0.2563 | | 0.2379 | 85.0 | 42245 | 0.2690 | | 0.2154 | 86.0 | 42742 | 0.2685 | | 0.2348 | 87.0 | 43239 | 0.2547 | | 0.2471 | 88.0 | 43736 | 0.2442 | | 0.2234 | 89.0 | 44233 | 0.2715 | | 0.2206 | 90.0 | 44730 | 0.2643 | | 0.214 | 91.0 | 45227 | 0.2647 | | 0.2181 | 92.0 | 45724 | 0.2597 | | 0.204 | 93.0 | 46221 | 0.2730 | | 0.213 | 94.0 | 46718 | 0.2392 | | 0.2107 | 95.0 | 47215 | 0.2479 | | 0.2161 | 96.0 | 47712 | 0.2695 | | 0.2173 | 97.0 | 48209 | 0.2525 | | 0.2215 | 98.0 | 48706 | 0.2475 | | 0.2057 | 99.0 | 49203 | 0.2686 | | 0.2231 | 100.0 | 49700 | 0.2786 | | 0.2055 | 101.0 | 50197 | 0.2644 | | 0.2121 | 102.0 | 50694 | 0.2705 | | 0.2337 | 103.0 | 51191 | 0.2538 | | 0.2172 | 104.0 | 51688 | 0.2561 | | 0.2207 | 105.0 | 52185 | 0.2673 | | 0.213 | 106.0 | 52682 | 0.2871 | | 0.2191 | 107.0 | 53179 | 0.2733 | | 0.2075 | 108.0 | 53676 | 0.2595 | | 0.2293 | 109.0 | 54173 | 0.2706 | | 0.2166 | 110.0 | 54670 | 0.2642 | | 0.2011 | 111.0 | 55167 | 0.2628 | | 0.2067 | 112.0 | 55664 | 0.2650 | | 0.215 | 113.0 | 56161 | 0.2807 | | 0.1979 | 114.0 | 56658 | 0.2764 | | 0.2349 | 115.0 | 57155 | 0.2569 | | 0.2029 | 116.0 | 57652 | 0.2613 | | 0.1968 | 117.0 | 58149 | 0.2720 | | 0.1898 | 118.0 | 58646 | 0.2752 | | 0.2093 | 119.0 | 59143 | 0.2696 | | 0.21 | 120.0 | 59640 | 0.2635 | | 0.1968 | 121.0 | 60137 | 0.2684 | | 0.2006 | 122.0 | 60634 | 0.2500 | | 0.2042 | 123.0 | 61131 | 0.2479 | | 0.194 | 124.0 | 61628 | 0.2486 | | 0.1938 | 125.0 | 62125 | 0.2541 | | 0.1968 | 126.0 | 62622 | 0.2792 | | 0.2102 | 127.0 | 63119 | 0.2635 | | 0.2041 | 128.0 | 63616 | 0.2413 | | 0.1975 | 129.0 | 64113 | 0.2604 | | 0.2017 | 130.0 | 64610 | 0.2682 | | 0.198 | 131.0 | 65107 | 0.2646 | | 0.2011 | 132.0 | 65604 | 0.2753 | | 0.1792 | 133.0 | 66101 | 0.2599 | | 0.2035 | 134.0 | 66598 | 0.2660 | | 0.2014 | 135.0 | 67095 | 0.2687 | | 0.2053 | 136.0 | 67592 | 0.2678 | | 0.2042 | 137.0 | 68089 | 0.2752 | | 0.1992 | 138.0 | 68586 | 0.2697 | | 0.1783 | 139.0 | 69083 | 0.2608 | | 0.179 | 140.0 | 69580 | 0.2735 | | 0.1845 | 141.0 | 70077 | 0.2649 | | 0.1868 | 142.0 | 70574 | 0.2666 | | 0.1997 | 143.0 | 71071 | 0.2565 | | 0.1755 | 144.0 | 71568 | 0.2666 | | 0.1846 | 145.0 | 72065 | 0.2677 | | 0.1784 | 146.0 | 72562 | 0.2621 | | 0.1888 | 147.0 | 73059 | 0.2523 | | 0.1963 | 148.0 | 73556 | 0.2599 | | 0.1905 | 149.0 | 74053 | 0.2512 | | 0.1789 | 150.0 | 74550 | 0.2675 | | 0.1916 | 151.0 | 75047 | 0.2443 | | 0.1841 | 152.0 | 75544 | 0.2371 | | 0.1884 | 153.0 | 76041 | 0.2528 | | 0.1767 | 154.0 | 76538 | 0.2542 | | 0.1691 | 155.0 | 77035 | 0.2565 | | 0.189 | 156.0 | 77532 | 0.2482 | | 0.1746 | 157.0 | 78029 | 0.2665 | | 0.1739 | 158.0 | 78526 | 0.2599 | | 0.1751 | 159.0 | 79023 | 0.2576 | | 0.1664 | 160.0 | 79520 | 0.2616 | | 0.177 | 161.0 | 80017 | 0.2581 | | 0.1791 | 162.0 | 80514 | 0.2534 | | 0.1825 | 163.0 | 81011 | 0.2673 | | 0.1779 | 164.0 | 81508 | 0.2576 | | 0.1734 | 165.0 | 82005 | 0.2537 | | 0.1726 | 166.0 | 82502 | 0.2429 | | 0.1644 | 167.0 | 82999 | 0.2354 | | 0.1767 | 168.0 | 83496 | 0.2446 | | 0.1749 | 169.0 | 83993 | 0.2498 | | 0.1819 | 170.0 | 84490 | 0.2415 | | 0.1863 | 171.0 | 84987 | 0.2361 | | 0.1675 | 172.0 | 85484 | 0.2550 | | 0.1693 | 173.0 | 85981 | 0.2461 | | 0.1658 | 174.0 | 86478 | 0.2413 | | 0.1712 | 175.0 
| 86975 | 0.2457 | | 0.1729 | 176.0 | 87472 | 0.2433 | | 0.1645 | 177.0 | 87969 | 0.2487 | | 0.1676 | 178.0 | 88466 | 0.2384 | | 0.1592 | 179.0 | 88963 | 0.2575 | | 0.1707 | 180.0 | 89460 | 0.2612 | | 0.1628 | 181.0 | 89957 | 0.2500 | | 0.1631 | 182.0 | 90454 | 0.2399 | | 0.1678 | 183.0 | 90951 | 0.2463 | | 0.1634 | 184.0 | 91448 | 0.2618 | | 0.1702 | 185.0 | 91945 | 0.2560 | | 0.1692 | 186.0 | 92442 | 0.2552 | | 0.1633 | 187.0 | 92939 | 0.2492 | | 0.1601 | 188.0 | 93436 | 0.2477 | | 0.1711 | 189.0 | 93933 | 0.2603 | | 0.1495 | 190.0 | 94430 | 0.2665 | | 0.1751 | 191.0 | 94927 | 0.2545 | | 0.1765 | 192.0 | 95424 | 0.2546 | | 0.1529 | 193.0 | 95921 | 0.2523 | | 0.1669 | 194.0 | 96418 | 0.2678 | | 0.1552 | 195.0 | 96915 | 0.2672 | | 0.1564 | 196.0 | 97412 | 0.2561 | | 0.1597 | 197.0 | 97909 | 0.2483 | | 0.1592 | 198.0 | 98406 | 0.2612 | | 0.156 | 199.0 | 98903 | 0.2685 | | 0.1627 | 200.0 | 99400 | 0.2564 | | 0.1606 | 201.0 | 99897 | 0.2618 | | 0.1514 | 202.0 | 100394 | 0.2450 | | 0.1524 | 203.0 | 100891 | 0.2638 | | 0.1597 | 204.0 | 101388 | 0.2518 | | 0.1549 | 205.0 | 101885 | 0.2497 | | 0.1542 | 206.0 | 102382 | 0.2605 | | 0.167 | 207.0 | 102879 | 0.2610 | | 0.1531 | 208.0 | 103376 | 0.2616 | | 0.1511 | 209.0 | 103873 | 0.2555 | | 0.1482 | 210.0 | 104370 | 0.2496 | | 0.1665 | 211.0 | 104867 | 0.2520 | | 0.16 | 212.0 | 105364 | 0.2573 | | 0.1539 | 213.0 | 105861 | 0.2669 | | 0.1463 | 214.0 | 106358 | 0.2463 | | 0.1502 | 215.0 | 106855 | 0.2514 | | 0.1487 | 216.0 | 107352 | 0.2596 | | 0.152 | 217.0 | 107849 | 0.2625 | | 0.1567 | 218.0 | 108346 | 0.2588 | | 0.1558 | 219.0 | 108843 | 0.2558 | | 0.1531 | 220.0 | 109340 | 0.2549 | | 0.1561 | 221.0 | 109837 | 0.2558 | | 0.1525 | 222.0 | 110334 | 0.2534 | | 0.1539 | 223.0 | 110831 | 0.2542 | | 0.1673 | 224.0 | 111328 | 0.2475 | | 0.1501 | 225.0 | 111825 | 0.2654 | | 0.1476 | 226.0 | 112322 | 0.2636 | | 0.1506 | 227.0 | 112819 | 0.2603 | | 0.1521 | 228.0 | 113316 | 0.2581 | | 0.1459 | 229.0 | 113813 | 0.2546 | | 0.1519 | 230.0 | 114310 | 0.2526 | | 0.1546 | 231.0 | 114807 | 0.2575 | | 0.147 | 232.0 | 115304 | 0.2566 | | 0.1501 | 233.0 | 115801 | 0.2531 | | 0.1451 | 234.0 | 116298 | 0.2570 | | 0.1586 | 235.0 | 116795 | 0.2616 | | 0.1499 | 236.0 | 117292 | 0.2516 | | 0.1565 | 237.0 | 117789 | 0.2565 | | 0.1521 | 238.0 | 118286 | 0.2578 | | 0.1526 | 239.0 | 118783 | 0.2568 | | 0.153 | 240.0 | 119280 | 0.2526 | | 0.1486 | 241.0 | 119777 | 0.2600 | | 0.1534 | 242.0 | 120274 | 0.2584 | | 0.1434 | 243.0 | 120771 | 0.2589 | | 0.1474 | 244.0 | 121268 | 0.2589 | | 0.1375 | 245.0 | 121765 | 0.2571 | | 0.1426 | 246.0 | 122262 | 0.2562 | | 0.14 | 247.0 | 122759 | 0.2534 | | 0.1448 | 248.0 | 123256 | 0.2564 | | 0.1506 | 249.0 | 123753 | 0.2575 | | 0.1454 | 250.0 | 124250 | 0.2595 | | 0.1499 | 251.0 | 124747 | 0.2585 | | 0.1497 | 252.0 | 125244 | 0.2569 | | 0.1503 | 253.0 | 125741 | 0.2613 | | 0.1496 | 254.0 | 126238 | 0.2590 | | 0.1489 | 255.0 | 126735 | 0.2607 | | 0.141 | 256.0 | 127232 | 0.2595 | | 0.1535 | 257.0 | 127729 | 0.2593 | | 0.1435 | 258.0 | 128226 | 0.2603 | | 0.1461 | 259.0 | 128723 | 0.2601 | | 0.1568 | 260.0 | 129220 | 0.2573 | | 0.1392 | 261.0 | 129717 | 0.2579 | | 0.1458 | 262.0 | 130214 | 0.2682 | | 0.1492 | 263.0 | 130711 | 0.2634 | | 0.1502 | 264.0 | 131208 | 0.2636 | | 0.145 | 265.0 | 131705 | 0.2649 | | 0.1493 | 266.0 | 132202 | 0.2633 | | 0.1466 | 267.0 | 132699 | 0.2622 | | 0.1526 | 268.0 | 133196 | 0.2594 | | 0.1415 | 269.0 | 133693 | 0.2598 | | 0.1463 | 270.0 | 134190 | 0.2623 | | 0.15 | 271.0 | 134687 | 0.2618 | | 0.1454 | 272.0 | 
135184 | 0.2588 | | 0.1445 | 273.0 | 135681 | 0.2547 | | 0.1456 | 274.0 | 136178 | 0.2609 | | 0.15 | 275.0 | 136675 | 0.2598 | | 0.1458 | 276.0 | 137172 | 0.2630 | | 0.1379 | 277.0 | 137669 | 0.2625 | | 0.1409 | 278.0 | 138166 | 0.2604 | | 0.1424 | 279.0 | 138663 | 0.2607 | | 0.1479 | 280.0 | 139160 | 0.2612 | | 0.1407 | 281.0 | 139657 | 0.2610 | | 0.1507 | 282.0 | 140154 | 0.2611 | | 0.1442 | 283.0 | 140651 | 0.2618 | | 0.1497 | 284.0 | 141148 | 0.2610 | | 0.146 | 285.0 | 141645 | 0.2600 | | 0.1687 | 286.0 | 142142 | 0.2620 | | 0.1409 | 287.0 | 142639 | 0.2610 | | 0.1507 | 288.0 | 143136 | 0.2585 | | 0.1474 | 289.0 | 143633 | 0.2608 | | 0.1431 | 290.0 | 144130 | 0.2594 | | 0.1496 | 291.0 | 144627 | 0.2617 | | 0.1444 | 292.0 | 145124 | 0.2596 | | 0.1382 | 293.0 | 145621 | 0.2612 | | 0.151 | 294.0 | 146118 | 0.2614 | | 0.1357 | 295.0 | 146615 | 0.2615 | | 0.1436 | 296.0 | 147112 | 0.2614 | | 0.1473 | 297.0 | 147609 | 0.2613 | | 0.1423 | 298.0 | 148106 | 0.2613 | | 0.1495 | 299.0 | 148603 | 0.2613 | | 0.1457 | 300.0 | 149100 | 0.2613 | ### Framework versions - Transformers 4.45.2 - Pytorch 2.4.1+cu121 - Datasets 2.19.2 - Tokenizers 0.20.0
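The results above follow the cosine learning-rate schedule listed in the hyperparameters (lr 1e-5, 300 epochs at 497 steps per epoch). A small sketch of such a schedule with the `transformers` helper; the warmup length is an assumption, since the card does not state one:

```python
import torch
from transformers import get_cosine_schedule_with_warmup

params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.Adam(params, lr=1e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=300 * 497
)

for _ in range(497):            # one "epoch" of optimizer/scheduler steps
    optimizer.step()
    scheduler.step()
print(scheduler.get_last_lr())  # learning rate after the first epoch of cosine decay
```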
[ "chicken", "duck", "plant" ]
NOTURBU/detr-resnet-50_finetuned_cppe5
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
joe611/detr_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.2803 - Map: 0.2236 - Map 50: 0.4243 - Map 75: 0.206 - Map Small: 0.087 - Map Medium: 0.2507 - Map Large: 0.2808 - Mar 1: 0.28 - Mar 10: 0.4603 - Mar 100: 0.4703 - Mar Small: 0.2619 - Mar Medium: 0.4212 - Mar Large: 0.5738 - Map Coverall: 0.5876 - Mar 100 Coverall: 0.7507 - Map Face Shield: 0.0978 - Mar 100 Face Shield: 0.5019 - Map Gloves: 0.1088 - Mar 100 Gloves: 0.3443 - Map Goggles: 0.0251 - Mar 100 Goggles: 0.3109 - Map Mask: 0.2988 - Mar 100 Mask: 0.4436 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:| | No log | 1.0 | 101 | 2.2664 | 0.0299 | 0.0624 | 0.0218 | 0.008 | 0.0698 | 0.0295 | 0.081 | 0.1593 | 0.1948 | 0.0526 | 0.1853 | 0.2126 | 0.1248 | 0.642 | 0.0 | 0.0 | 0.0053 | 0.0805 | 0.0 | 0.0 | 0.0193 | 0.2517 | | No log | 2.0 | 202 | 2.0505 | 0.0534 | 0.1073 | 0.0505 | 0.0065 | 0.095 | 0.0546 | 0.0986 | 0.1795 | 0.2122 | 0.0649 | 0.2148 | 0.2305 | 0.2306 | 0.67 | 0.0 | 0.0 | 0.0054 | 0.1437 | 0.0 | 0.0 | 0.0311 | 0.2471 | | No log | 3.0 | 303 | 1.9191 | 0.0388 | 0.0853 | 0.032 | 0.0069 | 0.0715 | 0.0418 | 0.1113 | 0.2058 | 0.2444 | 0.0838 | 0.2401 | 0.2708 | 0.1412 | 0.6947 | 0.0 | 0.0 | 0.0138 | 0.1943 | 0.0 | 0.0 | 0.0391 | 0.3331 | | No log | 4.0 | 404 | 1.9140 | 0.0689 | 0.1334 | 0.0686 | 0.0119 | 0.1069 | 0.0613 | 0.1073 | 0.2019 | 0.2281 | 0.1048 | 0.2354 | 0.2333 | 0.2665 | 0.6 | 0.0184 | 0.05 | 0.015 | 0.1828 | 0.0 | 0.0 | 0.0445 | 0.3076 | | 2.2885 | 5.0 | 505 | 1.8535 | 0.065 | 0.1347 | 0.0579 | 0.0107 | 0.109 | 0.0681 | 0.1064 | 0.1994 | 0.2271 | 0.0641 | 0.2213 | 0.2554 | 0.2582 | 0.6707 | 0.0022 | 0.0426 | 0.0233 | 0.15 | 0.0 | 0.0 | 0.0414 | 0.2721 | | 2.2885 | 6.0 | 606 | 1.6854 | 0.106 | 0.2131 | 0.0866 | 0.0174 | 0.154 | 0.1124 | 0.1623 | 0.2819 | 0.2947 | 0.1207 | 0.2903 | 0.3191 | 0.3878 | 0.6953 | 0.0273 | 0.2389 | 0.0375 | 0.2167 | 0.0 | 0.0 | 0.0772 | 0.3227 | | 2.2885 | 7.0 | 707 | 1.7142 | 0.1321 | 0.2588 | 0.1213 | 0.0287 | 0.1694 | 0.1454 | 0.155 | 0.2548 | 0.2629 | 0.104 | 0.2571 | 0.2894 | 0.4419 | 0.6727 | 0.0194 | 0.1111 | 0.0291 | 0.1684 | 0.0158 | 0.0145 | 0.1541 | 0.3477 
| | 2.2885 | 8.0 | 808 | 1.6260 | 0.1329 | 0.2764 | 0.1116 | 0.0309 | 0.1806 | 0.133 | 0.1733 | 0.2959 | 0.3231 | 0.1237 | 0.3281 | 0.3412 | 0.4414 | 0.716 | 0.0365 | 0.263 | 0.0328 | 0.2598 | 0.002 | 0.0345 | 0.1517 | 0.3424 | | 2.2885 | 9.0 | 909 | 1.5600 | 0.144 | 0.2839 | 0.1295 | 0.0413 | 0.1806 | 0.1517 | 0.1962 | 0.34 | 0.357 | 0.1705 | 0.3582 | 0.3866 | 0.4652 | 0.6833 | 0.0401 | 0.3315 | 0.0331 | 0.2517 | 0.0346 | 0.1145 | 0.1472 | 0.4041 | | 1.8836 | 10.0 | 1010 | 1.6155 | 0.132 | 0.2677 | 0.1157 | 0.0192 | 0.1619 | 0.1535 | 0.1793 | 0.3314 | 0.3432 | 0.1246 | 0.3549 | 0.3759 | 0.4695 | 0.6727 | 0.0321 | 0.3333 | 0.0422 | 0.2201 | 0.0156 | 0.1236 | 0.1008 | 0.3663 | | 1.8836 | 11.0 | 1111 | 1.7756 | 0.131 | 0.273 | 0.1157 | 0.0451 | 0.1763 | 0.1382 | 0.169 | 0.2725 | 0.2772 | 0.1249 | 0.2801 | 0.2997 | 0.4245 | 0.654 | 0.0371 | 0.2019 | 0.04 | 0.1937 | 0.0051 | 0.0164 | 0.1482 | 0.3203 | | 1.8836 | 12.0 | 1212 | 1.5594 | 0.1479 | 0.2988 | 0.1274 | 0.0357 | 0.1797 | 0.1682 | 0.1802 | 0.339 | 0.354 | 0.1872 | 0.3288 | 0.4026 | 0.4925 | 0.6933 | 0.0419 | 0.3444 | 0.0535 | 0.254 | 0.0065 | 0.0655 | 0.145 | 0.4128 | | 1.8836 | 13.0 | 1313 | 1.5275 | 0.1686 | 0.3448 | 0.1442 | 0.0419 | 0.1905 | 0.1946 | 0.2034 | 0.3691 | 0.3835 | 0.162 | 0.3604 | 0.4459 | 0.5224 | 0.714 | 0.0573 | 0.4056 | 0.0557 | 0.2351 | 0.0319 | 0.2055 | 0.1757 | 0.3576 | | 1.8836 | 14.0 | 1414 | 1.4658 | 0.1716 | 0.3471 | 0.1529 | 0.0615 | 0.1918 | 0.1997 | 0.234 | 0.4032 | 0.4258 | 0.244 | 0.3949 | 0.4814 | 0.5247 | 0.72 | 0.0477 | 0.4907 | 0.0605 | 0.2787 | 0.0125 | 0.2382 | 0.2126 | 0.4012 | | 1.7112 | 15.0 | 1515 | 1.4980 | 0.1632 | 0.3423 | 0.149 | 0.0436 | 0.1925 | 0.1903 | 0.2106 | 0.3644 | 0.3841 | 0.155 | 0.3627 | 0.4528 | 0.5172 | 0.6987 | 0.0357 | 0.4056 | 0.0435 | 0.2425 | 0.023 | 0.2145 | 0.1963 | 0.3593 | | 1.7112 | 16.0 | 1616 | 1.4760 | 0.1673 | 0.3361 | 0.1462 | 0.0639 | 0.1893 | 0.1917 | 0.2036 | 0.376 | 0.3902 | 0.1922 | 0.3695 | 0.4412 | 0.5163 | 0.6953 | 0.049 | 0.4074 | 0.0763 | 0.2816 | 0.0062 | 0.1745 | 0.1886 | 0.3919 | | 1.7112 | 17.0 | 1717 | 1.4224 | 0.1841 | 0.3711 | 0.1608 | 0.1013 | 0.208 | 0.2253 | 0.2356 | 0.4147 | 0.4282 | 0.2339 | 0.3828 | 0.5104 | 0.5087 | 0.7247 | 0.0594 | 0.4426 | 0.0771 | 0.304 | 0.0179 | 0.2673 | 0.2575 | 0.4023 | | 1.7112 | 18.0 | 1818 | 1.4224 | 0.1909 | 0.3743 | 0.17 | 0.082 | 0.2007 | 0.2407 | 0.2446 | 0.4147 | 0.4283 | 0.2211 | 0.3755 | 0.5219 | 0.5453 | 0.7293 | 0.0738 | 0.4833 | 0.0762 | 0.2724 | 0.0141 | 0.2745 | 0.2454 | 0.382 | | 1.7112 | 19.0 | 1919 | 1.3652 | 0.2068 | 0.4087 | 0.1879 | 0.0874 | 0.2187 | 0.2478 | 0.2572 | 0.4189 | 0.4333 | 0.2443 | 0.4009 | 0.5066 | 0.5488 | 0.726 | 0.0785 | 0.4611 | 0.0913 | 0.3155 | 0.0226 | 0.2382 | 0.293 | 0.4256 | | 1.5736 | 20.0 | 2020 | 1.3381 | 0.2098 | 0.4139 | 0.1944 | 0.0826 | 0.2269 | 0.2571 | 0.2517 | 0.4231 | 0.437 | 0.223 | 0.4093 | 0.5108 | 0.553 | 0.7353 | 0.0926 | 0.4352 | 0.0892 | 0.2994 | 0.0241 | 0.2836 | 0.2899 | 0.4314 | | 1.5736 | 21.0 | 2121 | 1.3366 | 0.2142 | 0.4226 | 0.1973 | 0.088 | 0.2302 | 0.2616 | 0.2574 | 0.4299 | 0.4431 | 0.2459 | 0.4053 | 0.5214 | 0.5573 | 0.72 | 0.0851 | 0.4556 | 0.0986 | 0.3034 | 0.0345 | 0.3 | 0.2958 | 0.4366 | | 1.5736 | 22.0 | 2222 | 1.3208 | 0.2174 | 0.4119 | 0.1985 | 0.0864 | 0.2348 | 0.2689 | 0.2659 | 0.4506 | 0.4605 | 0.2751 | 0.4101 | 0.5576 | 0.5717 | 0.7447 | 0.0784 | 0.4593 | 0.108 | 0.3293 | 0.0223 | 0.3218 | 0.3066 | 0.4477 | | 1.5736 | 23.0 | 2323 | 1.3249 | 0.216 | 0.417 | 0.195 | 0.0897 | 0.2359 | 0.2655 | 0.2734 | 0.4431 | 0.4569 | 0.2478 | 
0.4077 | 0.5547 | 0.5606 | 0.744 | 0.085 | 0.4648 | 0.1042 | 0.3293 | 0.0297 | 0.3109 | 0.3006 | 0.4355 | | 1.5736 | 24.0 | 2424 | 1.3029 | 0.2179 | 0.4165 | 0.1978 | 0.0847 | 0.2401 | 0.2718 | 0.2732 | 0.4512 | 0.4602 | 0.2515 | 0.4124 | 0.5595 | 0.5791 | 0.748 | 0.0956 | 0.4889 | 0.106 | 0.3391 | 0.028 | 0.3 | 0.2808 | 0.425 | | 1.475 | 25.0 | 2525 | 1.3052 | 0.221 | 0.4208 | 0.2005 | 0.0787 | 0.2486 | 0.2767 | 0.2755 | 0.4533 | 0.4668 | 0.2224 | 0.4193 | 0.5809 | 0.5811 | 0.7473 | 0.0925 | 0.4796 | 0.1031 | 0.3339 | 0.0286 | 0.3291 | 0.2998 | 0.4442 | | 1.475 | 26.0 | 2626 | 1.2998 | 0.2212 | 0.4156 | 0.2065 | 0.0826 | 0.2454 | 0.2761 | 0.2715 | 0.4538 | 0.4688 | 0.2426 | 0.4206 | 0.5781 | 0.5802 | 0.7513 | 0.0918 | 0.4981 | 0.1098 | 0.3431 | 0.0245 | 0.3109 | 0.2996 | 0.4407 | | 1.475 | 27.0 | 2727 | 1.2836 | 0.2217 | 0.4217 | 0.202 | 0.0805 | 0.2452 | 0.2793 | 0.2791 | 0.4579 | 0.47 | 0.2482 | 0.4226 | 0.5736 | 0.581 | 0.7473 | 0.0916 | 0.4944 | 0.1094 | 0.3448 | 0.0261 | 0.3182 | 0.3003 | 0.4453 | | 1.475 | 28.0 | 2828 | 1.2790 | 0.2238 | 0.4264 | 0.2033 | 0.0864 | 0.2524 | 0.2806 | 0.278 | 0.4589 | 0.4701 | 0.2425 | 0.4246 | 0.5751 | 0.5848 | 0.7467 | 0.098 | 0.4926 | 0.1107 | 0.3489 | 0.026 | 0.3182 | 0.2994 | 0.4442 | | 1.475 | 29.0 | 2929 | 1.2804 | 0.2234 | 0.4256 | 0.2059 | 0.0865 | 0.2507 | 0.2808 | 0.2802 | 0.4603 | 0.4703 | 0.2614 | 0.4216 | 0.5738 | 0.5875 | 0.7507 | 0.0977 | 0.5019 | 0.1087 | 0.3431 | 0.025 | 0.3127 | 0.298 | 0.443 | | 1.4089 | 30.0 | 3030 | 1.2803 | 0.2236 | 0.4243 | 0.206 | 0.087 | 0.2507 | 0.2808 | 0.28 | 0.4603 | 0.4703 | 0.2619 | 0.4212 | 0.5738 | 0.5876 | 0.7507 | 0.0978 | 0.5019 | 0.1088 | 0.3443 | 0.0251 | 0.3109 | 0.2988 | 0.4436 | ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.1+cu121 - Datasets 3.0.0 - Tokenizers 0.19.1
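As a rough illustration of how this checkpoint could be used for inference, here is a minimal sketch. It assumes the `joe611/detr_finetuned_cppe5` repository ships the image processor config and an `id2label` mapping for the CPPE-5 classes (coverall, face_shield, gloves, goggles, mask), and that `example.jpg` is a hypothetical local test image:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "joe611/detr_finetuned_cppe5"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# rescale predicted boxes to the original image size and keep confident ones
target_sizes = torch.tensor([image.size[::-1]])
result = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(result["scores"], result["labels"], result["boxes"]):
    # id2label maps class indices to the CPPE-5 category names
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```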
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
jabed/detr-resnet-50_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50_finetuned_cppe5 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.1+cu121 - Datasets 3.0.1 - Tokenizers 0.19.1
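For reference, a minimal sketch of how the hyperparameters listed above might map onto `TrainingArguments`. The output directory, dataset loading, image processor, and collate function are placeholders and not part of this card:

```python
from transformers import TrainingArguments

# mirrors the listed hyperparameters; the optimizer (Adam, betas=(0.9, 0.999),
# epsilon=1e-08) corresponds to the Trainer's default AdamW settings
training_args = TrainingArguments(
    output_dir="detr-resnet-50_finetuned_cppe5",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,  # "mixed_precision_training: Native AMP"
    remove_unused_columns=False,  # commonly needed for object-detection collators (assumption)
)
```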
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
0llheaven/CON-DETR-V1
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "pneumonia", "normal" ]
0llheaven/CON-DETR-V2
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "pneumonia", "normal" ]
0llheaven/CON-DETR-V3
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "pneumonia", "normal" ]
0llheaven/CON-DETR-V4
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "pneumonia", "normal" ]
0llheaven/CON-DETR-V5
# Model Overview

This model is a DETR-based object detection model trained for medical image analysis with 4 classes: 0: Pneumonia, 1: Normal, 2: Pneumonia_bacteria, 3: Pneumonia_virus.

## Model Description

- Model Architecture: DEtection TRansformers (DETR)
- Training Data: Trained on a custom dataset of annotated medical images
- Intended Use: Designed for analyzing chest X-ray images to detect the presence and type of pneumonia, or to classify the image as normal.

## Uses

Example code to test the model:

```python
import os

import pandas as pd
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

folder_path = ""

processor = AutoImageProcessor.from_pretrained("0llheaven/CON-DETR-V5")
model = AutoModelForObjectDetection.from_pretrained("0llheaven/CON-DETR-V5")

results_list = []
for image_name in os.listdir(folder_path):
    if image_name.endswith((".jpg", ".png", ".jpeg")):
        image_path = os.path.join(folder_path, image_name)
        image = Image.open(image_path)

        # convert grayscale images to RGB
        if image.mode != "RGB":
            image = image.convert("RGB")

        # prediction
        inputs = processor(images=image, return_tensors="pt")
        with torch.no_grad():
            outputs = model(**inputs)

        target_sizes = torch.tensor([image.size[::-1]])
        results = processor.post_process_object_detection(outputs, target_sizes=target_sizes)

        print(f"Processing image: {image_name}")

        # detected boxes
        detected_any = False
        for result in results:
            scores = result["scores"]
            labels = result["labels"]
            boxes = result["boxes"]

            # keep at most 2 boxes with a confidence score above 0.5
            filtered_data = [
                (score, label, box)
                for score, label, box in zip(scores, labels, boxes)
                if score > 0.5
            ][:2]

            if filtered_data:
                detected_any = True

            for score, label, box in filtered_data:
                if label.item() == 0:
                    label_name = "Pneumonia"
                elif label.item() == 1:
                    label_name = "Normal"
                elif label.item() == 2:
                    label_name = "Pneumonia_bacteria"
                else:
                    label_name = "Pneumonia_virus"

                xmin, ymin, xmax, ymax = [round(i, 2) for i in box.tolist()]
                print(f" - Detected {label_name} with score {round(score.item(), 3)} at {xmin, ymin, xmax, ymax}")

                results_list.append({
                    "image_name": image_name,
                    "label": label_name,
                    "xmin": xmin,
                    "ymin": ymin,
                    "xmax": xmax,
                    "ymax": ymax,
                    "score": round(score.item(), 3),
                })

        if not detected_any:
            print(" - No Detect")
            results_list.append({
                "image_name": image_name,
                "label": "Other",
                "xmin": 0,
                "ymin": 0,
                "xmax": 0,
                "ymax": 0,
                "score": 0,
            })

results_df = pd.DataFrame(results_list)
print("\nFinal results:")
results_df.to_csv("testmodel.csv", index=False)
```
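For quick visual inspection of the detections, a minimal sketch using Pillow. The function expects already-decoded `(label_name, score, box)` tuples like the ones printed in the loop above; the values in the usage comment are hypothetical:

```python
from PIL import ImageDraw

def draw_detections(image, detections):
    """Draw (label_name, score, (xmin, ymin, xmax, ymax)) tuples onto a copy of a PIL image."""
    annotated = image.copy()
    draw = ImageDraw.Draw(annotated)
    for label_name, score, (xmin, ymin, xmax, ymax) in detections:
        draw.rectangle([xmin, ymin, xmax, ymax], outline="red", width=3)
        draw.text((xmin, max(0, ymin - 10)), f"{label_name} {score:.2f}", fill="red")
    return annotated

# Hypothetical usage inside the per-image loop above:
# annotated = draw_detections(image, [("Pneumonia", 0.87, (35.0, 40.0, 400.0, 380.0))])
# annotated.save(f"annotated_{image_name}")
```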
[ "pneumonia", "normal", "pneumonia_bacteria", "pneumonia_virus" ]
WANGTINGTING/finetuned-table-transformer-detection-v1
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "table", "table rotated" ]
WANGTINGTING/finetuned-table-transformer-structure-recognition-v1
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "table", "table column", "table row", "table column header", "table projected row header", "table spanning cell" ]
0llheaven/CON-DETR-V6
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "pneumonia", "normal", "pneumonia_bacteria", "pneumonia_virus" ]
Srajanseth84/detr-finetuned-balloon-v2
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6" ]
joe611/chickens-60
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-60 This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2476 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 60 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:-----:|:---------------:| | 1.2826 | 1.0 | 497 | 1.1718 | | 1.0565 | 2.0 | 994 | 0.9496 | | 0.8699 | 3.0 | 1491 | 0.8140 | | 0.9199 | 4.0 | 1988 | 0.7631 | | 0.7011 | 5.0 | 2485 | 0.6355 | | 0.7246 | 6.0 | 2982 | 0.6444 | | 0.5952 | 7.0 | 3479 | 0.5118 | | 0.5584 | 8.0 | 3976 | 0.4591 | | 0.4884 | 9.0 | 4473 | 0.4302 | | 0.4804 | 10.0 | 4970 | 0.4176 | | 0.4511 | 11.0 | 5467 | 0.3914 | | 0.4124 | 12.0 | 5964 | 0.3830 | | 0.4182 | 13.0 | 6461 | 0.3754 | | 0.4103 | 14.0 | 6958 | 0.3583 | | 0.4037 | 15.0 | 7455 | 0.3451 | | 0.3983 | 16.0 | 7952 | 0.3453 | | 0.3606 | 17.0 | 8449 | 0.3426 | | 0.3885 | 18.0 | 8946 | 0.3436 | | 0.3713 | 19.0 | 9443 | 0.3321 | | 0.339 | 20.0 | 9940 | 0.3199 | | 0.3345 | 21.0 | 10437 | 0.3119 | | 0.3215 | 22.0 | 10934 | 0.3189 | | 0.353 | 23.0 | 11431 | 0.3031 | | 0.338 | 24.0 | 11928 | 0.3152 | | 0.3183 | 25.0 | 12425 | 0.3113 | | 0.3219 | 26.0 | 12922 | 0.2882 | | 0.3128 | 27.0 | 13419 | 0.2872 | | 0.3032 | 28.0 | 13916 | 0.2876 | | 0.3129 | 29.0 | 14413 | 0.2835 | | 0.3062 | 30.0 | 14910 | 0.2832 | | 0.3214 | 31.0 | 15407 | 0.2668 | | 0.2891 | 32.0 | 15904 | 0.2838 | | 0.297 | 33.0 | 16401 | 0.2756 | | 0.296 | 34.0 | 16898 | 0.2753 | | 0.2882 | 35.0 | 17395 | 0.2706 | | 0.2704 | 36.0 | 17892 | 0.2644 | | 0.2709 | 37.0 | 18389 | 0.2643 | | 0.2852 | 38.0 | 18886 | 0.2640 | | 0.259 | 39.0 | 19383 | 0.2698 | | 0.276 | 40.0 | 19880 | 0.2602 | | 0.2867 | 41.0 | 20377 | 0.2499 | | 0.2986 | 42.0 | 20874 | 0.2672 | | 0.2577 | 43.0 | 21371 | 0.2577 | | 0.2583 | 44.0 | 21868 | 0.2487 | | 0.2599 | 45.0 | 22365 | 0.2570 | | 0.2498 | 46.0 | 22862 | 0.2529 | | 0.2496 | 47.0 | 23359 | 0.2522 | | 0.2596 | 48.0 | 23856 | 0.2568 | | 0.2352 | 49.0 | 24353 | 0.2525 | | 0.2552 | 50.0 | 24850 | 0.2452 | | 0.2428 | 51.0 | 25347 | 0.2496 | | 0.2538 | 52.0 | 25844 | 0.2483 | | 0.2657 | 53.0 | 26341 | 0.2510 | | 0.2532 | 54.0 | 26838 | 0.2510 | | 0.2412 | 55.0 | 27335 | 0.2485 | | 0.2822 | 56.0 | 27832 | 0.2491 | | 0.2694 | 57.0 | 28329 | 0.2482 | | 0.249 | 58.0 | 28826 | 0.2479 | | 0.2408 | 59.0 | 29323 | 0.2477 | | 0.2345 | 60.0 | 29820 | 0.2476 | ### Framework versions - Transformers 4.45.2 - Pytorch 2.4.1+cu121 - Datasets 2.19.2 - Tokenizers 0.20.0
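A minimal sketch of running this detector through the high-level pipeline API, assuming the `joe611/chickens-60` checkpoint on the Hub includes its image processor config and label mapping (chicken, duck, plant); `farm.jpg` is a hypothetical local image:

```python
from transformers import pipeline

# object-detection pipeline; labels should come back as chicken / duck / plant
detector = pipeline("object-detection", model="joe611/chickens-60")

detections = detector("farm.jpg", threshold=0.5)
for det in detections:
    print(det["label"], round(det["score"], 3), det["box"])
```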
[ "chicken", "duck", "plant" ]
0llheaven/CON-DETR-V7
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
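The "How to Get Started with the Model" section above is still a placeholder. One minimal way to try the checkpoint, assuming it is published under the `0llheaven/CON-DETR-V7` repo id above and is wired up as a standard Transformers object-detection model, is the high-level pipeline API; the sample image path is an assumption:

```python
# Hedged sketch: repo id assumed from the model id above; image path is a placeholder.
from transformers import pipeline

detector = pipeline("object-detection", model="0llheaven/CON-DETR-V7")

# The pipeline returns one dict per detection: {"score", "label", "box"}.
for detection in detector("chest_xray.png"):  # hypothetical input image
    print(detection["label"], round(detection["score"], 3), detection["box"])
```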
[ "pneumonia", "normal", "pneumonia_bacteria", "pneumonia_virus" ]
ethans333/detr-finetuned-cppe-5-10k-steps
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-finetuned-cppe-5-10k-steps This model is a fine-tuned version of [PekingU/rtdetr_r50vd](https://huggingface.co/PekingU/rtdetr_r50vd) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 13.8955 - Map: 0.1056 - Map 50: 0.2682 - Map 75: 0.0807 - Map Small: 0.0 - Map Medium: 0.0749 - Map Large: 0.3361 - Mar 1: 0.1272 - Mar 10: 0.1948 - Mar 100: 0.208 - Mar Small: 0.0 - Mar Medium: 0.1572 - Mar Large: 0.5072 - Map Hand: 0.1655 - Mar 100 Hand: 0.1906 - Map Knife: 0.0605 - Mar 100 Knife: 0.1768 - Map Radio: 0.0001 - Mar 100 Radio: 0.008 - Map Binos: 0.0613 - Mar 100 Binos: 0.1578 - Map Handgun: 0.0479 - Mar 100 Handgun: 0.1712 - Map Grenade: -1.0 - Mar 100 Grenade: -1.0 - Map Rifle: 0.2983 - Mar 100 Rifle: 0.5436 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 100.0 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Hand | Mar 100 Hand | Map Knife | Mar 100 Knife | Map Radio | Mar 100 Radio | Map Binos | Mar 100 Binos | Map Handgun | Mar 100 Handgun | Map Grenade | Mar 100 Grenade | Map Rifle | Mar 100 Rifle | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------:|:------------:|:---------:|:-------------:|:---------:|:-------------:|:---------:|:-------------:|:-----------:|:---------------:|:-----------:|:---------------:|:---------:|:-------------:| | 26.2445 | 1.0 | 243 | 11.0351 | 0.2021 | 0.3788 | 0.1827 | 0.0494 | 0.1709 | 0.6416 | 0.3117 | 0.546 | 0.5916 | 0.3089 | 0.5616 | 0.7861 | 0.3219 | 0.6676 | 0.221 | 0.6161 | 0.0383 | 0.384 | 0.1037 | 0.593 | 0.097 | 0.5542 | -1.0 | -1.0 | 0.4305 | 0.735 | | 13.8976 | 2.0 | 486 | 10.6770 | 0.267 | 0.4577 | 0.256 | 0.099 | 0.2159 | 0.6999 | 0.3652 | 0.5896 | 0.6423 | 0.4103 | 0.6191 | 0.7929 | 0.4229 | 0.689 | 0.2994 | 0.675 | 0.1294 | 0.456 | 0.1087 | 0.6336 | 0.1131 | 0.639 | -1.0 | -1.0 | 0.5288 | 0.7613 | | 12.4745 | 3.0 | 729 | 10.8013 | 0.3713 | 0.6319 | 0.377 | 0.1295 | 0.3456 | 0.7472 | 0.4297 | 0.6123 | 0.6566 | 0.4287 | 0.6406 | 0.8478 | 0.4393 | 0.6936 | 0.4227 | 0.6875 | 0.2402 | 0.516 | 0.1834 | 0.6633 | 0.3384 | 0.6017 | -1.0 | -1.0 | 0.6039 | 0.7773 | | 11.7775 | 4.0 | 972 | 11.1568 | 0.3555 | 0.6473 | 0.3376 | 0.0939 | 0.3101 | 0.6466 | 0.4029 | 0.5839 | 0.6172 | 0.3607 | 0.5772 | 0.7741 | 0.4485 | 0.652 | 0.3977 | 0.6821 | 0.1729 | 0.408 | 0.2821 | 0.632 | 0.2314 | 0.5847 | -1.0 | -1.0 | 0.6001 | 0.7442 | | 11.4179 | 5.0 | 1215 | 11.5190 | 0.3511 | 0.6276 | 0.3462 | 0.1138 | 0.2936 | 0.7477 | 0.3902 | 0.5407 | 0.5582 | 0.2909 | 0.5096 | 0.8256 | 0.4455 | 0.6224 | 0.4251 | 0.6518 | 0.1222 | 0.248 | 0.2528 | 0.557 | 0.2357 | 0.5237 | 
-1.0 | -1.0 | 0.6252 | 0.746 | | 10.701 | 6.0 | 1458 | 11.6731 | 0.3654 | 0.6875 | 0.3227 | 0.11 | 0.3099 | 0.671 | 0.398 | 0.5208 | 0.5236 | 0.2972 | 0.476 | 0.773 | 0.4361 | 0.5738 | 0.378 | 0.5518 | 0.1017 | 0.264 | 0.3391 | 0.5227 | 0.2934 | 0.4898 | -1.0 | -1.0 | 0.6443 | 0.7393 | | 10.3427 | 7.0 | 1701 | 11.3578 | 0.3595 | 0.6606 | 0.3333 | 0.0921 | 0.2978 | 0.7263 | 0.3836 | 0.516 | 0.5234 | 0.291 | 0.4699 | 0.7769 | 0.4384 | 0.5654 | 0.3974 | 0.6143 | 0.1092 | 0.236 | 0.333 | 0.5273 | 0.232 | 0.4492 | -1.0 | -1.0 | 0.647 | 0.7485 | | 10.1603 | 8.0 | 1944 | 11.8045 | 0.322 | 0.6255 | 0.2688 | 0.0877 | 0.2597 | 0.6186 | 0.3493 | 0.4776 | 0.4828 | 0.2725 | 0.424 | 0.7425 | 0.3921 | 0.5237 | 0.3264 | 0.5071 | 0.1358 | 0.268 | 0.2088 | 0.4305 | 0.2201 | 0.4237 | -1.0 | -1.0 | 0.6489 | 0.7436 | | 10.0173 | 9.0 | 2187 | 12.1883 | 0.2644 | 0.5553 | 0.2284 | 0.0489 | 0.2003 | 0.542 | 0.3044 | 0.3826 | 0.3843 | 0.2017 | 0.3197 | 0.7068 | 0.3369 | 0.4111 | 0.2471 | 0.3821 | 0.056 | 0.148 | 0.2111 | 0.3531 | 0.1684 | 0.3271 | -1.0 | -1.0 | 0.5666 | 0.684 | | 9.6875 | 10.0 | 2430 | 12.1560 | 0.2642 | 0.5203 | 0.2204 | 0.0508 | 0.1777 | 0.6493 | 0.2981 | 0.4006 | 0.411 | 0.2216 | 0.3384 | 0.7372 | 0.3607 | 0.46 | 0.2705 | 0.4536 | 0.0547 | 0.16 | 0.1428 | 0.3344 | 0.1455 | 0.3373 | -1.0 | -1.0 | 0.611 | 0.7209 | | 9.6552 | 11.0 | 2673 | 12.0592 | 0.3206 | 0.647 | 0.2632 | 0.0698 | 0.2566 | 0.6 | 0.3517 | 0.447 | 0.4593 | 0.2393 | 0.4043 | 0.7483 | 0.3659 | 0.4794 | 0.3822 | 0.5268 | 0.1181 | 0.232 | 0.3116 | 0.4305 | 0.1387 | 0.3746 | -1.0 | -1.0 | 0.6069 | 0.7123 | | 9.5691 | 12.0 | 2916 | 12.7867 | 0.2438 | 0.4925 | 0.2089 | 0.0762 | 0.1896 | 0.5726 | 0.2754 | 0.3667 | 0.3718 | 0.1302 | 0.3196 | 0.6834 | 0.3288 | 0.3975 | 0.2792 | 0.3929 | 0.077 | 0.116 | 0.262 | 0.3719 | 0.0639 | 0.3322 | -1.0 | -1.0 | 0.4518 | 0.6202 | | 9.3589 | 13.0 | 3159 | 12.5751 | 0.2707 | 0.5256 | 0.2447 | 0.0768 | 0.2379 | 0.5878 | 0.3125 | 0.4015 | 0.4109 | 0.0952 | 0.3635 | 0.7347 | 0.3247 | 0.4095 | 0.3035 | 0.4804 | 0.0879 | 0.144 | 0.2914 | 0.4156 | 0.1783 | 0.3407 | -1.0 | -1.0 | 0.4386 | 0.6755 | | 9.2296 | 14.0 | 3402 | 12.8074 | 0.2713 | 0.5674 | 0.2255 | 0.073 | 0.2352 | 0.5959 | 0.3204 | 0.4277 | 0.4395 | 0.2285 | 0.3956 | 0.7407 | 0.3357 | 0.4456 | 0.3858 | 0.5232 | 0.1399 | 0.236 | 0.2605 | 0.4078 | 0.131 | 0.3797 | -1.0 | -1.0 | 0.3752 | 0.6448 | | 9.1393 | 15.0 | 3645 | 12.8611 | 0.2091 | 0.458 | 0.1634 | 0.0471 | 0.1603 | 0.5465 | 0.2583 | 0.3526 | 0.3654 | 0.1712 | 0.32 | 0.7242 | 0.2391 | 0.3361 | 0.2929 | 0.4286 | 0.1155 | 0.16 | 0.2083 | 0.3633 | 0.0709 | 0.3085 | -1.0 | -1.0 | 0.3281 | 0.5957 | | 8.6971 | 16.0 | 3888 | 13.2202 | 0.1643 | 0.3535 | 0.131 | 0.0766 | 0.1227 | 0.4971 | 0.2033 | 0.3366 | 0.3553 | 0.1727 | 0.299 | 0.7236 | 0.2575 | 0.3282 | 0.2209 | 0.4107 | 0.0798 | 0.152 | 0.1059 | 0.2828 | 0.0197 | 0.3186 | -1.0 | -1.0 | 0.3019 | 0.6393 | | 8.8298 | 17.0 | 4131 | 12.9959 | 0.2316 | 0.4501 | 0.208 | 0.0754 | 0.1642 | 0.6608 | 0.275 | 0.3759 | 0.3856 | 0.2045 | 0.3213 | 0.7603 | 0.2923 | 0.3672 | 0.3374 | 0.4946 | 0.0624 | 0.14 | 0.1697 | 0.3203 | 0.0756 | 0.3458 | -1.0 | -1.0 | 0.4522 | 0.6454 | | 8.5902 | 18.0 | 4374 | 13.0290 | 0.2248 | 0.4844 | 0.1781 | 0.1132 | 0.1982 | 0.4629 | 0.2702 | 0.3483 | 0.3575 | 0.1163 | 0.3168 | 0.6249 | 0.2505 | 0.313 | 0.2429 | 0.3982 | 0.0655 | 0.096 | 0.2972 | 0.3805 | 0.092 | 0.3085 | -1.0 | -1.0 | 0.4008 | 0.6491 | | 8.7283 | 19.0 | 4617 | 13.3499 | 0.2062 | 0.4008 | 0.1895 | 0.0251 | 0.1647 | 0.4201 | 0.2501 | 0.3286 | 0.3394 | 0.0548 | 0.2949 | 
0.679 | 0.2881 | 0.3614 | 0.2011 | 0.3518 | 0.0011 | 0.036 | 0.2681 | 0.3664 | 0.0767 | 0.3119 | -1.0 | -1.0 | 0.402 | 0.6092 | | 8.5466 | 20.0 | 4860 | 12.9557 | 0.2371 | 0.487 | 0.1974 | 0.0287 | 0.1965 | 0.4002 | 0.2675 | 0.3524 | 0.3621 | 0.1426 | 0.3218 | 0.5405 | 0.2977 | 0.3739 | 0.1847 | 0.3518 | 0.0235 | 0.072 | 0.2699 | 0.3578 | 0.1516 | 0.3712 | -1.0 | -1.0 | 0.4952 | 0.646 | | 8.4748 | 21.0 | 5103 | 13.0234 | 0.2146 | 0.4234 | 0.181 | 0.0595 | 0.1596 | 0.6419 | 0.2459 | 0.3548 | 0.3708 | 0.1423 | 0.3196 | 0.794 | 0.3144 | 0.3965 | 0.2584 | 0.3821 | 0.0388 | 0.076 | 0.195 | 0.3297 | 0.0265 | 0.3881 | -1.0 | -1.0 | 0.4545 | 0.6521 | | 8.3742 | 22.0 | 5346 | 12.8936 | 0.2619 | 0.5216 | 0.222 | 0.0718 | 0.2017 | 0.6594 | 0.2911 | 0.4129 | 0.4436 | 0.1629 | 0.4076 | 0.8079 | 0.3292 | 0.448 | 0.2435 | 0.4196 | 0.0674 | 0.168 | 0.2667 | 0.4656 | 0.165 | 0.461 | -1.0 | -1.0 | 0.4998 | 0.6994 | | 8.3388 | 23.0 | 5589 | 12.8914 | 0.2867 | 0.5574 | 0.252 | 0.083 | 0.219 | 0.6348 | 0.3137 | 0.3969 | 0.4238 | 0.1786 | 0.3612 | 0.805 | 0.3192 | 0.408 | 0.3331 | 0.4786 | 0.0756 | 0.12 | 0.2837 | 0.4211 | 0.1154 | 0.3864 | -1.0 | -1.0 | 0.5933 | 0.7288 | | 8.2653 | 24.0 | 5832 | 13.0177 | 0.2601 | 0.5182 | 0.2312 | 0.0449 | 0.202 | 0.6073 | 0.2952 | 0.385 | 0.4092 | 0.1883 | 0.3547 | 0.8023 | 0.3058 | 0.4345 | 0.3332 | 0.5161 | 0.0353 | 0.08 | 0.2753 | 0.4398 | 0.1358 | 0.3593 | -1.0 | -1.0 | 0.4751 | 0.6252 | | 8.1082 | 25.0 | 6075 | 12.7751 | 0.2683 | 0.5175 | 0.2474 | 0.0724 | 0.2167 | 0.6102 | 0.3117 | 0.4138 | 0.4441 | 0.2429 | 0.3934 | 0.7998 | 0.2997 | 0.4411 | 0.3905 | 0.5268 | 0.0718 | 0.144 | 0.3215 | 0.4734 | 0.0467 | 0.4102 | -1.0 | -1.0 | 0.4794 | 0.6693 | | 8.1065 | 26.0 | 6318 | 12.9549 | 0.2826 | 0.5735 | 0.2454 | 0.0543 | 0.233 | 0.583 | 0.3189 | 0.4211 | 0.4429 | 0.1982 | 0.4008 | 0.7366 | 0.2931 | 0.3901 | 0.3671 | 0.5125 | 0.1329 | 0.22 | 0.2824 | 0.4305 | 0.1482 | 0.422 | -1.0 | -1.0 | 0.4721 | 0.6822 | | 8.1044 | 27.0 | 6561 | 12.9133 | 0.2826 | 0.5508 | 0.2619 | 0.0456 | 0.2471 | 0.5274 | 0.3083 | 0.4367 | 0.4608 | 0.1707 | 0.4215 | 0.7187 | 0.3608 | 0.4839 | 0.3253 | 0.4857 | 0.0825 | 0.212 | 0.3228 | 0.4688 | 0.1335 | 0.4356 | -1.0 | -1.0 | 0.4705 | 0.6785 | | 8.058 | 28.0 | 6804 | 13.1728 | 0.278 | 0.5244 | 0.2548 | 0.0709 | 0.2259 | 0.5298 | 0.3127 | 0.4338 | 0.4648 | 0.2356 | 0.4237 | 0.6889 | 0.3487 | 0.4786 | 0.3091 | 0.5 | 0.0727 | 0.176 | 0.335 | 0.4945 | 0.1083 | 0.4492 | -1.0 | -1.0 | 0.4944 | 0.6902 | | 8.0279 | 29.0 | 7047 | 12.7755 | 0.2481 | 0.5278 | 0.2067 | 0.0818 | 0.1918 | 0.5582 | 0.2779 | 0.3779 | 0.4014 | 0.1882 | 0.349 | 0.7132 | 0.3203 | 0.4348 | 0.3218 | 0.4643 | 0.0769 | 0.164 | 0.267 | 0.3938 | 0.073 | 0.3424 | -1.0 | -1.0 | 0.4294 | 0.6092 | | 8.1898 | 30.0 | 7290 | 13.1205 | 0.1991 | 0.431 | 0.1611 | 0.0705 | 0.1561 | 0.5312 | 0.2349 | 0.3144 | 0.3586 | 0.1133 | 0.3054 | 0.6877 | 0.2738 | 0.3913 | 0.2385 | 0.3839 | 0.0161 | 0.088 | 0.2596 | 0.3859 | 0.0445 | 0.3203 | -1.0 | -1.0 | 0.362 | 0.5822 | | 7.9297 | 31.0 | 7533 | 13.0575 | 0.225 | 0.472 | 0.1937 | 0.0809 | 0.1766 | 0.529 | 0.2585 | 0.3522 | 0.3838 | 0.1978 | 0.3235 | 0.6909 | 0.2634 | 0.3792 | 0.2321 | 0.4 | 0.0412 | 0.12 | 0.3014 | 0.3859 | 0.0624 | 0.3797 | -1.0 | -1.0 | 0.4493 | 0.638 | | 7.7829 | 32.0 | 7776 | 13.1299 | 0.1994 | 0.4066 | 0.1804 | 0.0351 | 0.1545 | 0.4497 | 0.2228 | 0.3092 | 0.3372 | 0.0869 | 0.2819 | 0.6158 | 0.288 | 0.3843 | 0.1393 | 0.2875 | 0.011 | 0.032 | 0.3102 | 0.4273 | 0.0575 | 0.3271 | -1.0 | -1.0 | 0.3905 | 0.565 | | 7.8297 | 33.0 | 8019 | 13.2561 | 
0.1601 | 0.36 | 0.1059 | 0.0619 | 0.1143 | 0.3273 | 0.1862 | 0.2922 | 0.3253 | 0.1427 | 0.261 | 0.612 | 0.2731 | 0.3816 | 0.1277 | 0.2911 | 0.0017 | 0.052 | 0.158 | 0.393 | 0.0546 | 0.2831 | -1.0 | -1.0 | 0.3454 | 0.5509 | | 7.7388 | 34.0 | 8262 | 13.3269 | 0.176 | 0.3842 | 0.1429 | 0.086 | 0.1382 | 0.4469 | 0.2019 | 0.3311 | 0.3678 | 0.1904 | 0.3219 | 0.6543 | 0.3193 | 0.4331 | 0.1714 | 0.3679 | 0.0062 | 0.104 | 0.234 | 0.4297 | 0.018 | 0.3373 | -1.0 | -1.0 | 0.3073 | 0.535 | | 7.6546 | 35.0 | 8505 | 13.2449 | 0.1929 | 0.4139 | 0.1485 | 0.0578 | 0.1398 | 0.3528 | 0.2142 | 0.3369 | 0.3658 | 0.2039 | 0.3171 | 0.5149 | 0.351 | 0.4632 | 0.0922 | 0.3143 | 0.0059 | 0.08 | 0.2666 | 0.4641 | 0.0252 | 0.3068 | -1.0 | -1.0 | 0.4166 | 0.5663 | | 7.6865 | 36.0 | 8748 | 12.8590 | 0.2182 | 0.4927 | 0.1486 | 0.0504 | 0.1742 | 0.3345 | 0.2531 | 0.3572 | 0.3905 | 0.1397 | 0.3484 | 0.4167 | 0.3121 | 0.4506 | 0.2231 | 0.4411 | 0.0152 | 0.12 | 0.2499 | 0.4414 | 0.0833 | 0.3169 | -1.0 | -1.0 | 0.4254 | 0.573 | | 7.5674 | 37.0 | 8991 | 13.0991 | 0.2028 | 0.4365 | 0.1547 | 0.076 | 0.1522 | 0.3704 | 0.2443 | 0.3659 | 0.394 | 0.2037 | 0.3357 | 0.6472 | 0.2982 | 0.4314 | 0.1944 | 0.4214 | 0.0112 | 0.132 | 0.2576 | 0.4484 | 0.0156 | 0.322 | -1.0 | -1.0 | 0.4396 | 0.6086 | | 7.6865 | 38.0 | 9234 | 13.0191 | 0.2235 | 0.4754 | 0.1727 | 0.0486 | 0.157 | 0.4769 | 0.2474 | 0.3656 | 0.3959 | 0.184 | 0.3362 | 0.6818 | 0.3335 | 0.4471 | 0.2282 | 0.4304 | 0.0151 | 0.116 | 0.2269 | 0.4094 | 0.0737 | 0.3119 | -1.0 | -1.0 | 0.4636 | 0.6607 | | 7.5458 | 39.0 | 9477 | 12.9589 | 0.227 | 0.4613 | 0.194 | 0.0543 | 0.1683 | 0.461 | 0.2564 | 0.3748 | 0.3962 | 0.1465 | 0.3396 | 0.6418 | 0.3456 | 0.4476 | 0.1435 | 0.3911 | 0.0426 | 0.108 | 0.3199 | 0.4453 | 0.0506 | 0.3441 | -1.0 | -1.0 | 0.4597 | 0.6411 | | 7.6026 | 40.0 | 9720 | 12.9822 | 0.248 | 0.4938 | 0.2332 | 0.0439 | 0.1997 | 0.4981 | 0.2806 | 0.3808 | 0.4053 | 0.0835 | 0.3527 | 0.6414 | 0.3479 | 0.4342 | 0.2802 | 0.4429 | 0.02 | 0.104 | 0.3117 | 0.4594 | 0.0669 | 0.3492 | -1.0 | -1.0 | 0.4616 | 0.6423 | | 7.3867 | 41.0 | 9963 | 12.7931 | 0.2375 | 0.4883 | 0.1956 | 0.0636 | 0.1762 | 0.5705 | 0.2573 | 0.355 | 0.3746 | 0.1059 | 0.3201 | 0.755 | 0.3128 | 0.4097 | 0.2361 | 0.3893 | 0.0126 | 0.092 | 0.2704 | 0.4016 | 0.0864 | 0.322 | -1.0 | -1.0 | 0.5064 | 0.6331 | | 7.4443 | 42.0 | 10206 | 12.8659 | 0.2047 | 0.4475 | 0.1633 | 0.0865 | 0.1575 | 0.4228 | 0.2384 | 0.3372 | 0.3603 | 0.1663 | 0.308 | 0.6599 | 0.3008 | 0.3947 | 0.1932 | 0.3732 | 0.0167 | 0.072 | 0.2681 | 0.4187 | 0.0454 | 0.3034 | -1.0 | -1.0 | 0.4038 | 0.6 | | 7.3614 | 43.0 | 10449 | 13.2190 | 0.1793 | 0.4103 | 0.1374 | 0.0465 | 0.1373 | 0.3392 | 0.196 | 0.2979 | 0.3223 | 0.1427 | 0.2712 | 0.6286 | 0.3247 | 0.4261 | 0.1526 | 0.3018 | 0.0144 | 0.056 | 0.1791 | 0.3672 | 0.0867 | 0.2763 | -1.0 | -1.0 | 0.3183 | 0.5067 | | 7.3409 | 44.0 | 10692 | 12.7442 | 0.2238 | 0.4835 | 0.1831 | 0.047 | 0.1724 | 0.5491 | 0.2615 | 0.355 | 0.3849 | 0.0973 | 0.3267 | 0.7573 | 0.2933 | 0.3945 | 0.2269 | 0.3875 | 0.0229 | 0.12 | 0.2521 | 0.3773 | 0.0887 | 0.3424 | -1.0 | -1.0 | 0.4588 | 0.6877 | | 7.2999 | 45.0 | 10935 | 12.9593 | 0.2451 | 0.4682 | 0.2493 | 0.0401 | 0.1951 | 0.5133 | 0.2693 | 0.3577 | 0.3696 | 0.0416 | 0.3152 | 0.6131 | 0.3254 | 0.3836 | 0.2363 | 0.3339 | 0.037 | 0.108 | 0.3388 | 0.3906 | 0.0332 | 0.3458 | -1.0 | -1.0 | 0.4999 | 0.6558 | | 7.2993 | 46.0 | 11178 | 13.8004 | 0.1653 | 0.3334 | 0.1374 | 0.0054 | 0.1194 | 0.4166 | 0.1933 | 0.2298 | 0.2315 | 0.0061 | 0.1856 | 0.4442 | 0.2548 | 0.2789 | 0.1144 | 0.1786 | 0.0128 | 
0.08 | 0.1795 | 0.1914 | 0.0109 | 0.1542 | -1.0 | -1.0 | 0.4195 | 0.5061 | | 7.2182 | 47.0 | 11421 | 12.9737 | 0.2579 | 0.4936 | 0.2512 | 0.0387 | 0.2088 | 0.4464 | 0.2868 | 0.3511 | 0.3597 | 0.0429 | 0.3032 | 0.6626 | 0.3061 | 0.3708 | 0.24 | 0.3482 | 0.013 | 0.052 | 0.3263 | 0.3859 | 0.1062 | 0.2966 | -1.0 | -1.0 | 0.5561 | 0.7049 | | 7.1549 | 48.0 | 11664 | 13.3703 | 0.2134 | 0.4164 | 0.1831 | 0.0864 | 0.1745 | 0.5287 | 0.2479 | 0.3341 | 0.3498 | 0.1454 | 0.3007 | 0.6704 | 0.2764 | 0.3381 | 0.259 | 0.3625 | 0.0183 | 0.096 | 0.2504 | 0.3125 | 0.0184 | 0.3068 | -1.0 | -1.0 | 0.458 | 0.6828 | | 7.0829 | 49.0 | 11907 | 13.5683 | 0.1903 | 0.4008 | 0.1437 | 0.0408 | 0.1422 | 0.3277 | 0.2157 | 0.2869 | 0.2968 | 0.0756 | 0.2224 | 0.6381 | 0.2638 | 0.3261 | 0.1967 | 0.3054 | 0.0028 | 0.016 | 0.2061 | 0.2695 | 0.0421 | 0.239 | -1.0 | -1.0 | 0.4302 | 0.6245 | | 7.0886 | 50.0 | 12150 | 13.1647 | 0.1956 | 0.41 | 0.1695 | 0.0614 | 0.1527 | 0.361 | 0.2306 | 0.2945 | 0.3079 | 0.0924 | 0.2341 | 0.6198 | 0.2381 | 0.2845 | 0.2057 | 0.2982 | 0.0194 | 0.036 | 0.2429 | 0.3078 | 0.0376 | 0.2508 | -1.0 | -1.0 | 0.4302 | 0.6699 | | 7.1215 | 51.0 | 12393 | 13.3009 | 0.2183 | 0.4324 | 0.192 | 0.0519 | 0.1759 | 0.5263 | 0.246 | 0.3264 | 0.3392 | 0.118 | 0.28 | 0.6531 | 0.2954 | 0.3551 | 0.1875 | 0.3536 | 0.0078 | 0.032 | 0.2764 | 0.3422 | 0.0712 | 0.2763 | -1.0 | -1.0 | 0.4716 | 0.6761 | | 7.0264 | 52.0 | 12636 | 12.7987 | 0.2666 | 0.5304 | 0.2404 | 0.0556 | 0.2193 | 0.6009 | 0.2991 | 0.3812 | 0.3911 | 0.1656 | 0.3279 | 0.7913 | 0.3279 | 0.3964 | 0.3026 | 0.4411 | 0.0187 | 0.076 | 0.3228 | 0.4039 | 0.1129 | 0.3407 | -1.0 | -1.0 | 0.515 | 0.6883 | | 7.0154 | 53.0 | 12879 | 12.9858 | 0.2263 | 0.4532 | 0.1961 | 0.0361 | 0.1721 | 0.5205 | 0.2555 | 0.3334 | 0.3479 | 0.0882 | 0.2891 | 0.7552 | 0.294 | 0.354 | 0.2641 | 0.3714 | 0.0215 | 0.06 | 0.2842 | 0.3594 | 0.0679 | 0.3102 | -1.0 | -1.0 | 0.4263 | 0.6325 | | 6.989 | 54.0 | 13122 | 13.0541 | 0.2385 | 0.4831 | 0.2079 | 0.0417 | 0.1886 | 0.6095 | 0.2659 | 0.3664 | 0.3887 | 0.1608 | 0.3265 | 0.7682 | 0.3134 | 0.3905 | 0.2493 | 0.4518 | 0.0226 | 0.112 | 0.2643 | 0.3594 | 0.0906 | 0.3492 | -1.0 | -1.0 | 0.4906 | 0.6693 | | 7.0821 | 55.0 | 13365 | 12.9868 | 0.2621 | 0.5188 | 0.2324 | 0.0939 | 0.217 | 0.6357 | 0.2897 | 0.3685 | 0.3909 | 0.1642 | 0.3368 | 0.7806 | 0.2999 | 0.3671 | 0.3128 | 0.4696 | 0.0413 | 0.084 | 0.2914 | 0.3836 | 0.1092 | 0.3305 | -1.0 | -1.0 | 0.5178 | 0.7104 | | 7.0275 | 56.0 | 13608 | 13.1371 | 0.2197 | 0.4404 | 0.1934 | 0.0434 | 0.1892 | 0.4125 | 0.2449 | 0.3361 | 0.3529 | 0.0766 | 0.3023 | 0.6506 | 0.3044 | 0.3681 | 0.1866 | 0.4036 | 0.0031 | 0.036 | 0.3 | 0.3812 | 0.0997 | 0.3068 | -1.0 | -1.0 | 0.4243 | 0.6215 | | 6.9655 | 57.0 | 13851 | 13.0132 | 0.2039 | 0.4321 | 0.1648 | 0.0292 | 0.1576 | 0.5681 | 0.2278 | 0.3018 | 0.3138 | 0.0358 | 0.2543 | 0.7199 | 0.2673 | 0.3185 | 0.2031 | 0.3429 | 0.0011 | 0.02 | 0.2311 | 0.3117 | 0.0763 | 0.2627 | -1.0 | -1.0 | 0.4444 | 0.627 | | 6.8743 | 58.0 | 14094 | 12.9247 | 0.2335 | 0.4756 | 0.2028 | 0.0821 | 0.1822 | 0.4712 | 0.265 | 0.3349 | 0.3486 | 0.1235 | 0.2925 | 0.5905 | 0.2733 | 0.3203 | 0.2532 | 0.3946 | 0.0257 | 0.068 | 0.2687 | 0.332 | 0.1012 | 0.3373 | -1.0 | -1.0 | 0.479 | 0.6393 | | 6.8568 | 59.0 | 14337 | 13.1372 | 0.2253 | 0.4659 | 0.1878 | 0.0478 | 0.1736 | 0.5795 | 0.2503 | 0.3265 | 0.3375 | 0.0877 | 0.2736 | 0.7258 | 0.2735 | 0.3318 | 0.2097 | 0.3589 | 0.0121 | 0.06 | 0.2475 | 0.3219 | 0.1092 | 0.3102 | -1.0 | -1.0 | 0.4995 | 0.6423 | | 6.8846 | 60.0 | 14580 | 13.1527 | 0.2017 | 0.436 | 0.1593 | 
0.0221 | 0.1578 | 0.4798 | 0.224 | 0.2969 | 0.3072 | 0.043 | 0.2539 | 0.6475 | 0.2622 | 0.3184 | 0.2092 | 0.3196 | 0.0055 | 0.032 | 0.2066 | 0.2688 | 0.0965 | 0.2915 | -1.0 | -1.0 | 0.4301 | 0.6129 | | 6.7842 | 61.0 | 14823 | 13.1684 | 0.18 | 0.386 | 0.153 | 0.0404 | 0.1401 | 0.5151 | 0.2028 | 0.2802 | 0.2892 | 0.0825 | 0.234 | 0.66 | 0.2491 | 0.2896 | 0.2106 | 0.3268 | 0.0005 | 0.02 | 0.2045 | 0.2727 | 0.0552 | 0.2763 | -1.0 | -1.0 | 0.3604 | 0.5497 | | 6.6136 | 62.0 | 15066 | 13.6925 | 0.1494 | 0.3463 | 0.1014 | 0.0284 | 0.1092 | 0.308 | 0.1704 | 0.243 | 0.2548 | 0.0921 | 0.1921 | 0.4772 | 0.2246 | 0.27 | 0.1147 | 0.25 | 0.0001 | 0.012 | 0.1629 | 0.2352 | 0.0697 | 0.2136 | -1.0 | -1.0 | 0.3243 | 0.5479 | | 6.647 | 63.0 | 15309 | 13.5645 | 0.1831 | 0.3939 | 0.1467 | 0.0131 | 0.1445 | 0.5261 | 0.2079 | 0.2673 | 0.276 | 0.0141 | 0.2236 | 0.5996 | 0.229 | 0.2655 | 0.203 | 0.3036 | 0.0071 | 0.048 | 0.1831 | 0.2391 | 0.078 | 0.2305 | -1.0 | -1.0 | 0.3984 | 0.5693 | | 6.6742 | 64.0 | 15552 | 13.7404 | 0.1699 | 0.3786 | 0.133 | 0.0071 | 0.1293 | 0.5695 | 0.1951 | 0.2495 | 0.2557 | 0.0071 | 0.1929 | 0.6471 | 0.2043 | 0.2352 | 0.1754 | 0.275 | 0.0005 | 0.016 | 0.1582 | 0.2078 | 0.1039 | 0.2424 | -1.0 | -1.0 | 0.3768 | 0.5577 | | 6.7936 | 65.0 | 15795 | 14.0013 | 0.1325 | 0.3081 | 0.0941 | 0.0147 | 0.108 | 0.1836 | 0.1591 | 0.2109 | 0.2185 | 0.0158 | 0.1697 | 0.2661 | 0.2038 | 0.229 | 0.0862 | 0.1607 | 0.0002 | 0.02 | 0.137 | 0.1828 | 0.0652 | 0.1983 | -1.0 | -1.0 | 0.3026 | 0.5202 | | 6.6426 | 66.0 | 16038 | 14.0935 | 0.1142 | 0.2794 | 0.081 | 0.0016 | 0.0733 | 0.2064 | 0.1418 | 0.1864 | 0.194 | 0.0014 | 0.1331 | 0.3155 | 0.1753 | 0.1986 | 0.0455 | 0.1179 | 0.0004 | 0.02 | 0.0895 | 0.1367 | 0.0604 | 0.178 | -1.0 | -1.0 | 0.3141 | 0.5129 | | 6.6421 | 67.0 | 16281 | 14.0872 | 0.1397 | 0.3349 | 0.1031 | 0.0059 | 0.1004 | 0.2509 | 0.1726 | 0.2173 | 0.2237 | 0.0137 | 0.1615 | 0.4434 | 0.1773 | 0.1993 | 0.1115 | 0.1821 | 0.0064 | 0.028 | 0.1302 | 0.1758 | 0.056 | 0.2102 | -1.0 | -1.0 | 0.3567 | 0.5466 | | 6.7091 | 68.0 | 16524 | 14.2243 | 0.1184 | 0.2764 | 0.09 | 0.0008 | 0.0917 | 0.2185 | 0.1476 | 0.1819 | 0.1888 | 0.0007 | 0.1334 | 0.4659 | 0.1844 | 0.2065 | 0.07 | 0.1125 | 0.0 | 0.0 | 0.1043 | 0.1445 | 0.0589 | 0.1508 | -1.0 | -1.0 | 0.2929 | 0.5184 | | 6.503 | 69.0 | 16767 | 14.0666 | 0.0983 | 0.2512 | 0.0708 | 0.0026 | 0.0809 | 0.2638 | 0.1199 | 0.1637 | 0.1664 | 0.0027 | 0.1233 | 0.3923 | 0.174 | 0.192 | 0.0533 | 0.0982 | 0.0 | 0.0 | 0.0643 | 0.1242 | 0.0321 | 0.1305 | -1.0 | -1.0 | 0.2663 | 0.4534 | | 6.529 | 70.0 | 17010 | 13.8595 | 0.121 | 0.2914 | 0.085 | 0.0016 | 0.0902 | 0.3914 | 0.1445 | 0.1983 | 0.2088 | 0.0014 | 0.1484 | 0.5212 | 0.1932 | 0.2207 | 0.0785 | 0.1696 | 0.0 | 0.0 | 0.0921 | 0.1516 | 0.0505 | 0.1797 | -1.0 | -1.0 | 0.3115 | 0.5313 | | 6.5712 | 71.0 | 17253 | 14.2668 | 0.0904 | 0.2277 | 0.058 | 0.005 | 0.0693 | 0.2145 | 0.1048 | 0.1441 | 0.1501 | 0.0047 | 0.1025 | 0.3444 | 0.1693 | 0.1848 | 0.038 | 0.0875 | 0.0 | 0.0 | 0.061 | 0.1023 | 0.0515 | 0.1186 | -1.0 | -1.0 | 0.2229 | 0.4074 | | 6.4963 | 72.0 | 17496 | 13.9886 | 0.0923 | 0.229 | 0.0631 | 0.001 | 0.0697 | 0.2177 | 0.1105 | 0.1571 | 0.1722 | 0.0037 | 0.1135 | 0.4009 | 0.1726 | 0.1941 | 0.0438 | 0.1054 | 0.0 | 0.0 | 0.0717 | 0.1352 | 0.035 | 0.1271 | -1.0 | -1.0 | 0.2306 | 0.4712 | | 6.486 | 73.0 | 17739 | 13.9064 | 0.0988 | 0.2554 | 0.067 | 0.0047 | 0.0774 | 0.2258 | 0.1229 | 0.1816 | 0.1901 | 0.0144 | 0.1397 | 0.3985 | 0.1763 | 0.2052 | 0.0455 | 0.1393 | 0.0 | 0.0 | 0.0894 | 0.1711 | 0.0279 | 0.1576 | -1.0 | -1.0 | 0.2535 
| 0.4675 | | 6.4018 | 74.0 | 17982 | 13.8191 | 0.1121 | 0.2841 | 0.0701 | 0.0039 | 0.0922 | 0.2385 | 0.1374 | 0.1979 | 0.2085 | 0.0071 | 0.1587 | 0.4444 | 0.1958 | 0.2232 | 0.0482 | 0.1643 | 0.0 | 0.0 | 0.103 | 0.1977 | 0.0591 | 0.161 | -1.0 | -1.0 | 0.2666 | 0.5049 | | 6.4593 | 75.0 | 18225 | 14.0884 | 0.1064 | 0.2594 | 0.0724 | 0.0009 | 0.0794 | 0.2061 | 0.1307 | 0.1783 | 0.1897 | 0.0017 | 0.1343 | 0.3326 | 0.1663 | 0.1912 | 0.039 | 0.1268 | 0.0 | 0.0 | 0.0953 | 0.1453 | 0.042 | 0.1576 | -1.0 | -1.0 | 0.2959 | 0.5172 | | 6.3809 | 76.0 | 18468 | 14.1367 | 0.096 | 0.2555 | 0.0638 | 0.0004 | 0.0718 | 0.1747 | 0.118 | 0.1769 | 0.1902 | 0.0003 | 0.1484 | 0.3919 | 0.1626 | 0.1861 | 0.05 | 0.1464 | 0.0099 | 0.008 | 0.067 | 0.1445 | 0.0357 | 0.1542 | -1.0 | -1.0 | 0.2505 | 0.5018 | | 6.3686 | 77.0 | 18711 | 14.0372 | 0.1107 | 0.2737 | 0.0759 | 0.0007 | 0.0779 | 0.3149 | 0.1358 | 0.1827 | 0.1956 | 0.0007 | 0.1422 | 0.5135 | 0.1541 | 0.1786 | 0.0834 | 0.1464 | 0.0 | 0.0 | 0.1108 | 0.1672 | 0.0311 | 0.1695 | -1.0 | -1.0 | 0.2849 | 0.5117 | | 6.3912 | 78.0 | 18954 | 14.0216 | 0.1082 | 0.2735 | 0.0729 | 0.0159 | 0.0745 | 0.2692 | 0.1306 | 0.2029 | 0.2177 | 0.0503 | 0.1619 | 0.4328 | 0.1739 | 0.1972 | 0.0647 | 0.2107 | 0.0 | 0.0 | 0.1049 | 0.1711 | 0.0296 | 0.2085 | -1.0 | -1.0 | 0.2762 | 0.5184 | | 6.2804 | 79.0 | 19197 | 13.8047 | 0.1445 | 0.331 | 0.1021 | 0.0103 | 0.1115 | 0.3195 | 0.1699 | 0.2319 | 0.241 | 0.038 | 0.1846 | 0.5322 | 0.205 | 0.2318 | 0.0768 | 0.1768 | 0.0149 | 0.012 | 0.1722 | 0.2242 | 0.0553 | 0.2356 | -1.0 | -1.0 | 0.3427 | 0.5656 | | 6.2077 | 80.0 | 19440 | 13.7026 | 0.1421 | 0.3342 | 0.1044 | 0.0186 | 0.1089 | 0.4093 | 0.1678 | 0.2271 | 0.2358 | 0.025 | 0.1862 | 0.5778 | 0.1877 | 0.2151 | 0.0876 | 0.1982 | 0.01 | 0.016 | 0.1416 | 0.1969 | 0.0758 | 0.2322 | -1.0 | -1.0 | 0.3498 | 0.5564 | | 6.1376 | 81.0 | 19683 | 13.8274 | 0.1333 | 0.3331 | 0.0935 | 0.015 | 0.1017 | 0.3837 | 0.1607 | 0.2199 | 0.2315 | 0.027 | 0.1752 | 0.58 | 0.187 | 0.2142 | 0.0991 | 0.1964 | 0.0 | 0.012 | 0.1275 | 0.1859 | 0.0634 | 0.2203 | -1.0 | -1.0 | 0.3228 | 0.5601 | | 6.3004 | 82.0 | 19926 | 13.5665 | 0.1499 | 0.3549 | 0.1033 | 0.019 | 0.114 | 0.5094 | 0.1716 | 0.2324 | 0.2429 | 0.0237 | 0.1899 | 0.6989 | 0.1887 | 0.2222 | 0.1305 | 0.225 | 0.005 | 0.004 | 0.1553 | 0.2141 | 0.0699 | 0.2237 | -1.0 | -1.0 | 0.3499 | 0.5687 | | 6.2376 | 83.0 | 20169 | 13.7511 | 0.1543 | 0.373 | 0.1158 | 0.0118 | 0.1216 | 0.5102 | 0.1823 | 0.2358 | 0.2449 | 0.0165 | 0.1935 | 0.6928 | 0.1976 | 0.2231 | 0.1393 | 0.225 | 0.0099 | 0.008 | 0.1343 | 0.1984 | 0.0736 | 0.2136 | -1.0 | -1.0 | 0.3712 | 0.6012 | | 6.2156 | 84.0 | 20412 | 13.9033 | 0.1305 | 0.3199 | 0.0954 | 0.0003 | 0.0947 | 0.4543 | 0.1539 | 0.2128 | 0.2195 | 0.0037 | 0.1681 | 0.5496 | 0.1771 | 0.199 | 0.101 | 0.1893 | 0.0101 | 0.024 | 0.1049 | 0.1625 | 0.019 | 0.1712 | -1.0 | -1.0 | 0.3709 | 0.5712 | | 6.1856 | 85.0 | 20655 | 13.7346 | 0.1312 | 0.3168 | 0.0975 | 0.0009 | 0.0935 | 0.3018 | 0.1608 | 0.2147 | 0.2227 | 0.004 | 0.1673 | 0.4538 | 0.1901 | 0.213 | 0.074 | 0.1839 | 0.0 | 0.0 | 0.1118 | 0.1688 | 0.0344 | 0.1898 | -1.0 | -1.0 | 0.3771 | 0.5804 | | 6.0757 | 86.0 | 20898 | 13.8015 | 0.1147 | 0.2863 | 0.0885 | 0.0016 | 0.0901 | 0.3154 | 0.139 | 0.1952 | 0.2043 | 0.0014 | 0.1532 | 0.489 | 0.1815 | 0.2027 | 0.0733 | 0.1571 | 0.0 | 0.0 | 0.0703 | 0.157 | 0.0412 | 0.1695 | -1.0 | -1.0 | 0.3218 | 0.5393 | | 6.1175 | 87.0 | 21141 | 13.9323 | 0.104 | 0.258 | 0.0765 | 0.0019 | 0.0766 | 0.3252 | 0.1283 | 0.1818 | 0.1954 | 0.0024 | 0.1453 | 0.5345 | 0.1746 | 0.1991 | 0.0577 
| 0.1518 | 0.0 | 0.0 | 0.052 | 0.1289 | 0.0413 | 0.1661 | -1.0 | -1.0 | 0.2985 | 0.5264 | | 6.1331 | 88.0 | 21384 | 13.9147 | 0.1055 | 0.2591 | 0.0784 | 0.0008 | 0.0765 | 0.261 | 0.1296 | 0.1909 | 0.2018 | 0.0007 | 0.1582 | 0.4977 | 0.1735 | 0.1969 | 0.0485 | 0.1643 | 0.0 | 0.0 | 0.0696 | 0.1648 | 0.0341 | 0.1661 | -1.0 | -1.0 | 0.3072 | 0.5184 | | 6.0443 | 89.0 | 21627 | 13.9121 | 0.1105 | 0.2734 | 0.0809 | 0.0 | 0.0785 | 0.2645 | 0.1353 | 0.1909 | 0.1999 | 0.0 | 0.1484 | 0.4379 | 0.1647 | 0.1873 | 0.057 | 0.1607 | 0.0 | 0.0 | 0.0854 | 0.1641 | 0.0427 | 0.1525 | -1.0 | -1.0 | 0.3136 | 0.535 | | 5.9599 | 90.0 | 21870 | 13.9595 | 0.1076 | 0.2627 | 0.0835 | 0.0004 | 0.0759 | 0.3224 | 0.129 | 0.1876 | 0.1964 | 0.0003 | 0.1426 | 0.4854 | 0.1712 | 0.1978 | 0.0679 | 0.1518 | 0.0 | 0.0 | 0.0557 | 0.1375 | 0.0449 | 0.1644 | -1.0 | -1.0 | 0.3058 | 0.527 | | 6.0833 | 91.0 | 22113 | 13.9040 | 0.1136 | 0.2787 | 0.0839 | 0.0005 | 0.08 | 0.3734 | 0.14 | 0.2011 | 0.2098 | 0.0037 | 0.1606 | 0.5747 | 0.1706 | 0.1979 | 0.0715 | 0.1714 | 0.0011 | 0.016 | 0.0622 | 0.1516 | 0.0505 | 0.1746 | -1.0 | -1.0 | 0.3259 | 0.5472 | | 5.9494 | 92.0 | 22356 | 13.9089 | 0.119 | 0.2737 | 0.0916 | 0.0011 | 0.0875 | 0.4188 | 0.1463 | 0.2095 | 0.2186 | 0.004 | 0.1689 | 0.6114 | 0.1763 | 0.204 | 0.0857 | 0.2071 | 0.0003 | 0.008 | 0.0627 | 0.1508 | 0.0589 | 0.1847 | -1.0 | -1.0 | 0.3301 | 0.5571 | | 6.0241 | 93.0 | 22599 | 13.9887 | 0.1061 | 0.2651 | 0.0797 | 0.0075 | 0.071 | 0.2641 | 0.1262 | 0.1911 | 0.202 | 0.0073 | 0.1508 | 0.4552 | 0.1522 | 0.1749 | 0.0677 | 0.1839 | 0.0001 | 0.004 | 0.0644 | 0.1625 | 0.0379 | 0.1492 | -1.0 | -1.0 | 0.3147 | 0.5374 | | 5.9258 | 94.0 | 22842 | 13.9146 | 0.1064 | 0.2572 | 0.0809 | 0.0034 | 0.0777 | 0.27 | 0.1308 | 0.1944 | 0.2029 | 0.0044 | 0.1565 | 0.4545 | 0.1677 | 0.1928 | 0.0581 | 0.1804 | 0.0002 | 0.008 | 0.0543 | 0.1445 | 0.0497 | 0.1542 | -1.0 | -1.0 | 0.3086 | 0.5374 | | 6.0891 | 95.0 | 23085 | 13.8317 | 0.1093 | 0.2658 | 0.0862 | 0.0048 | 0.0778 | 0.3427 | 0.1387 | 0.207 | 0.2165 | 0.0047 | 0.1623 | 0.5265 | 0.1744 | 0.2009 | 0.0644 | 0.2018 | 0.0002 | 0.008 | 0.0601 | 0.1641 | 0.041 | 0.1746 | -1.0 | -1.0 | 0.3159 | 0.5497 | | 5.9154 | 96.0 | 23328 | 13.8791 | 0.1047 | 0.2643 | 0.0799 | 0.0044 | 0.0761 | 0.3248 | 0.1298 | 0.1915 | 0.2058 | 0.005 | 0.1588 | 0.4714 | 0.1631 | 0.1855 | 0.0571 | 0.175 | 0.0002 | 0.008 | 0.0638 | 0.1555 | 0.0465 | 0.1729 | -1.0 | -1.0 | 0.2977 | 0.538 | | 5.9291 | 97.0 | 23571 | 13.8460 | 0.1147 | 0.2828 | 0.0836 | 0.0071 | 0.0813 | 0.3824 | 0.1393 | 0.2042 | 0.2135 | 0.007 | 0.1608 | 0.5407 | 0.1775 | 0.2039 | 0.0628 | 0.175 | 0.0 | 0.004 | 0.063 | 0.1664 | 0.0651 | 0.1881 | -1.0 | -1.0 | 0.3199 | 0.5436 | | 5.813 | 98.0 | 23814 | 13.8821 | 0.1091 | 0.2787 | 0.0806 | 0.0004 | 0.0807 | 0.337 | 0.1329 | 0.201 | 0.2121 | 0.0003 | 0.1615 | 0.5081 | 0.167 | 0.1935 | 0.0597 | 0.1679 | 0.0002 | 0.008 | 0.0701 | 0.1672 | 0.057 | 0.1898 | -1.0 | -1.0 | 0.3006 | 0.546 | | 5.8194 | 99.0 | 24057 | 13.8909 | 0.1086 | 0.2779 | 0.0772 | 0.0004 | 0.0744 | 0.3633 | 0.1325 | 0.1931 | 0.2077 | 0.0003 | 0.1527 | 0.5219 | 0.1686 | 0.1956 | 0.0634 | 0.1661 | 0.0 | 0.004 | 0.0585 | 0.1633 | 0.0481 | 0.1814 | -1.0 | -1.0 | 0.3133 | 0.5356 | | 5.8823 | 100.0 | 24300 | 13.8955 | 0.1056 | 0.2682 | 0.0807 | 0.0 | 0.0749 | 0.3361 | 0.1272 | 0.1948 | 0.208 | 0.0 | 0.1572 | 0.5072 | 0.1655 | 0.1906 | 0.0605 | 0.1768 | 0.0001 | 0.008 | 0.0613 | 0.1578 | 0.0479 | 0.1712 | -1.0 | -1.0 | 0.2983 | 0.5436 | ### Framework versions - Transformers 4.46.0.dev0 - Pytorch 2.4.1+cu118 - Datasets 
3.0.1 - Tokenizers 0.20.0
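The hyperparameter list above maps almost one-to-one onto πŸ€— `TrainingArguments`. The sketch below is a hedged reconstruction of that configuration, not the author's actual script: the `output_dir`, the dataset objects, and the collator are placeholders, and the class list is taken from the label set attached to this card.

```python
# Hedged reconstruction of the training configuration listed above; output_dir,
# dataset objects, and collator are placeholders, not taken from the card.
from transformers import AutoModelForObjectDetection, TrainingArguments

classes = ["hand", "knife", "radio", "binos", "handgun", "grenade", "rifle"]
id2label = dict(enumerate(classes))
label2id = {name: idx for idx, name in id2label.items()}

model = AutoModelForObjectDetection.from_pretrained(
    "PekingU/rtdetr_r50vd",
    id2label=id2label,
    label2id=label2id,
    ignore_mismatched_sizes=True,   # re-initialise the detection head for 7 classes
)

args = TrainingArguments(
    output_dir="detr-finetuned-cppe-5-10k-steps",   # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    optim="adamw_torch",            # betas=(0.9, 0.999) and epsilon=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                      # "Native AMP" mixed precision
    remove_unused_columns=False,    # keep pixel_values/labels for the collator
)

# A Trainer(model=model, args=args, train_dataset=..., eval_dataset=...,
# data_collator=...) call over the prepared image-folder dataset would follow.
```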
[ "hand", "knife", "radio", "binos", "handgun", "grenade", "rifle" ]
joe611/chickens-300-epoch-200-images
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-300-epoch-200-images This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5365 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 60 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 1.9097 | 1.0 | 99 | 1.7611 | | 1.6037 | 2.0 | 198 | 1.5045 | | 1.5109 | 3.0 | 297 | 1.3585 | | 1.3272 | 4.0 | 396 | 1.2303 | | 1.3035 | 5.0 | 495 | 1.2142 | | 1.1962 | 6.0 | 594 | 1.1744 | | 1.1761 | 7.0 | 693 | 1.0835 | | 1.1091 | 8.0 | 792 | 1.0690 | | 1.1074 | 9.0 | 891 | 1.0030 | | 0.9913 | 10.0 | 990 | 0.9990 | | 0.9685 | 11.0 | 1089 | 0.9787 | | 0.9439 | 12.0 | 1188 | 0.9659 | | 0.9102 | 13.0 | 1287 | 0.9237 | | 0.9849 | 14.0 | 1386 | 0.8619 | | 0.8692 | 15.0 | 1485 | 0.8463 | | 0.833 | 16.0 | 1584 | 0.8279 | | 0.8148 | 17.0 | 1683 | 0.7647 | | 0.7309 | 18.0 | 1782 | 0.8500 | | 0.7024 | 19.0 | 1881 | 0.7294 | | 0.73 | 20.0 | 1980 | 0.7389 | | 0.7115 | 21.0 | 2079 | 0.7617 | | 0.6883 | 22.0 | 2178 | 0.8105 | | 0.7151 | 23.0 | 2277 | 0.6843 | | 0.6598 | 24.0 | 2376 | 0.7199 | | 0.762 | 25.0 | 2475 | 0.6930 | | 0.6331 | 26.0 | 2574 | 0.6905 | | 0.6141 | 27.0 | 2673 | 0.6327 | | 0.6014 | 28.0 | 2772 | 0.6516 | | 0.5935 | 29.0 | 2871 | 0.6052 | | 0.557 | 30.0 | 2970 | 0.5846 | | 0.6292 | 31.0 | 3069 | 0.5857 | | 0.5536 | 32.0 | 3168 | 0.5905 | | 0.5309 | 33.0 | 3267 | 0.6030 | | 0.5065 | 34.0 | 3366 | 0.5743 | | 0.5429 | 35.0 | 3465 | 0.5699 | | 0.4981 | 36.0 | 3564 | 0.5755 | | 0.5184 | 37.0 | 3663 | 0.5689 | | 0.5338 | 38.0 | 3762 | 0.5646 | | 0.5218 | 39.0 | 3861 | 0.5688 | | 0.5102 | 40.0 | 3960 | 0.5738 | | 0.5052 | 41.0 | 4059 | 0.5599 | | 0.4897 | 42.0 | 4158 | 0.5454 | | 0.4916 | 43.0 | 4257 | 0.5393 | | 0.4696 | 44.0 | 4356 | 0.5470 | | 0.494 | 45.0 | 4455 | 0.5516 | | 0.465 | 46.0 | 4554 | 0.5459 | | 0.4733 | 47.0 | 4653 | 0.5483 | | 0.4929 | 48.0 | 4752 | 0.5316 | | 0.4822 | 49.0 | 4851 | 0.5306 | | 0.4594 | 50.0 | 4950 | 0.5351 | | 0.4598 | 51.0 | 5049 | 0.5348 | | 0.4511 | 52.0 | 5148 | 0.5432 | | 0.4561 | 53.0 | 5247 | 0.5467 | | 0.4653 | 54.0 | 5346 | 0.5444 | | 0.4475 | 55.0 | 5445 | 0.5423 | | 0.4734 | 56.0 | 5544 | 0.5368 | | 0.4815 | 57.0 | 5643 | 0.5363 | | 0.4365 | 58.0 | 5742 | 0.5366 | | 0.4596 | 59.0 | 5841 | 0.5365 | | 0.4757 | 60.0 | 5940 | 0.5365 | ### Framework versions - Transformers 4.45.2 - Pytorch 2.4.1+cu121 - Datasets 2.19.2 - Tokenizers 0.20.0
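For the cosine learning-rate schedule listed above, the small sketch below shows how the optimizer and scheduler would be wired up and how the rate decays over training. It is a hedged illustration on a dummy parameter: the 99 steps per epoch are read off the results table (step 99 at epoch 1), and no warmup is assumed because the card does not mention any.

```python
# Hedged sketch: reproduce the Adam + cosine-schedule combination from the
# hyperparameter list above on a dummy parameter, just to inspect the decay.
import torch
from transformers import get_cosine_schedule_with_warmup

dummy_param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.Adam([dummy_param], lr=1e-5, betas=(0.9, 0.999), eps=1e-8)

steps_per_epoch, num_epochs = 99, 60        # read from the results table above
total_steps = steps_per_epoch * num_epochs  # 5940 optimization steps
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=total_steps
)

for step in range(total_steps):
    optimizer.step()
    scheduler.step()
    if step % steps_per_epoch == 0:         # print once per "epoch"
        print(f"step {step:4d}: lr = {scheduler.get_last_lr()[0]:.2e}")
```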
[ "chicken", "duck", "plant" ]
joe611/chickens-60-epoch-200-images-aug
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-60-epoch-200-images-aug This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3172 - Map: 0.7813 - Map 50: 0.9672 - Map 75: 0.9061 - Map Small: 0.5029 - Map Medium: 0.7534 - Map Large: 0.8581 - Mar 1: 0.2609 - Mar 10: 0.8237 - Mar 100: 0.8407 - Mar Small: 0.5407 - Mar Medium: 0.8181 - Mar Large: 0.9033 - Map Chicken: 0.7893 - Mar 100 Chicken: 0.8558 - Map Duck: 0.7616 - Mar 100 Duck: 0.825 - Map Plant: 0.7929 - Mar 100 Plant: 0.8414 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 60 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:| | 1.39 | 1.0 | 497 | 1.4568 | 0.1658 | 0.2544 | 0.19 | 0.0007 | 0.1178 | 0.572 | 0.0735 | 0.2448 | 0.2951 | 0.0444 | 0.2733 | 0.7627 | 0.0281 | 0.15 | 0.0297 | 0.0324 | 0.4395 | 0.7029 | | 1.0818 | 2.0 | 994 | 1.2408 | 0.1924 | 0.2866 | 0.2221 | 0.0107 | 0.1379 | 0.6468 | 0.0738 | 0.2559 | 0.2789 | 0.0333 | 0.2522 | 0.75 | 0.038 | 0.1558 | 0.0 | 0.0 | 0.5391 | 0.6809 | | 1.0382 | 3.0 | 1491 | 1.1565 | 0.2389 | 0.3456 | 0.2667 | 0.0175 | 0.2088 | 0.6989 | 0.0965 | 0.3342 | 0.3793 | 0.3111 | 0.3535 | 0.7928 | 0.094 | 0.4112 | 0.0 | 0.0 | 0.6229 | 0.7266 | | 0.9389 | 4.0 | 1988 | 1.0233 | 0.2651 | 0.3695 | 0.3088 | 0.0187 | 0.2392 | 0.7309 | 0.104 | 0.3987 | 0.46 | 0.3519 | 0.4331 | 0.8088 | 0.137 | 0.6396 | 0.0 | 0.0 | 0.6583 | 0.7404 | | 0.8606 | 5.0 | 2485 | 0.9402 | 0.3365 | 0.502 | 0.3846 | 0.0152 | 0.3165 | 0.7061 | 0.1389 | 0.4708 | 0.5159 | 0.1667 | 0.494 | 0.7807 | 0.2694 | 0.6954 | 0.0976 | 0.1345 | 0.6424 | 0.7178 | | 0.8217 | 6.0 | 2982 | 0.8550 | 0.3399 | 0.497 | 0.3807 | 0.0512 | 0.3161 | 0.7385 | 0.1247 | 0.4612 | 0.4907 | 0.1222 | 0.4637 | 0.8108 | 0.304 | 0.6796 | 0.0474 | 0.0554 | 0.6685 | 0.7371 | | 0.8555 | 7.0 | 3479 | 0.7985 | 0.3644 | 0.512 | 0.4361 | 0.0648 | 0.3433 | 0.7396 | 0.1417 | 0.5009 | 0.531 | 0.2296 | 0.5061 | 0.801 | 0.3468 | 0.7642 | 0.0697 | 0.0926 | 0.6768 | 0.7363 | | 0.7846 | 8.0 | 3976 | 0.7775 | 0.4093 | 0.5643 | 0.4936 | 0.0811 | 0.3834 | 0.7663 | 0.1702 | 0.541 | 0.5874 | 0.2926 | 0.563 | 0.8176 | 0.3666 | 0.7942 | 0.1645 | 0.2162 | 0.6968 | 0.752 | | 1.1121 | 9.0 | 4473 | 0.7245 | 0.3964 | 0.5446 | 0.4803 | 0.0597 | 0.3762 | 0.759 | 0.1392 | 0.5093 | 0.5425 | 0.2889 | 0.5193 | 0.8111 | 0.4271 | 0.8033 | 0.0679 | 0.0764 | 0.6942 | 0.7479 | | 0.7181 | 10.0 | 4970 | 0.7218 | 0.3985 | 0.5517 | 0.4777 | 0.1619 | 
0.3747 | 0.7474 | 0.14 | 0.5086 | 0.5403 | 0.2852 | 0.5159 | 0.8033 | 0.4358 | 0.785 | 0.0785 | 0.0959 | 0.6812 | 0.74 | | 0.7338 | 11.0 | 5467 | 0.7031 | 0.5215 | 0.7323 | 0.6247 | 0.2342 | 0.4985 | 0.7549 | 0.1927 | 0.6175 | 0.6458 | 0.3667 | 0.6213 | 0.8124 | 0.5003 | 0.7454 | 0.3735 | 0.4466 | 0.6908 | 0.7455 | | 0.7107 | 12.0 | 5964 | 0.6559 | 0.4871 | 0.6743 | 0.5933 | 0.1353 | 0.4609 | 0.7674 | 0.1755 | 0.5688 | 0.5983 | 0.3222 | 0.5742 | 0.8131 | 0.5366 | 0.7946 | 0.227 | 0.2514 | 0.6979 | 0.7488 | | 0.6867 | 13.0 | 6461 | 0.6307 | 0.5689 | 0.7843 | 0.6921 | 0.2219 | 0.5409 | 0.7784 | 0.204 | 0.6467 | 0.6719 | 0.2407 | 0.6486 | 0.8265 | 0.5677 | 0.775 | 0.4308 | 0.4804 | 0.7081 | 0.7602 | | 0.6659 | 14.0 | 6958 | 0.6498 | 0.5874 | 0.834 | 0.7233 | 0.1414 | 0.564 | 0.7351 | 0.2123 | 0.6699 | 0.6914 | 0.337 | 0.6663 | 0.8023 | 0.5821 | 0.7738 | 0.5066 | 0.5655 | 0.6736 | 0.7348 | | 0.6979 | 15.0 | 7455 | 0.5984 | 0.6138 | 0.874 | 0.7383 | 0.1613 | 0.5832 | 0.7824 | 0.2239 | 0.6946 | 0.7171 | 0.2963 | 0.6913 | 0.8278 | 0.5625 | 0.7554 | 0.5718 | 0.6405 | 0.7071 | 0.7553 | | 0.5107 | 16.0 | 7952 | 0.5714 | 0.6356 | 0.9011 | 0.7713 | 0.2066 | 0.6084 | 0.789 | 0.2249 | 0.72 | 0.7417 | 0.3704 | 0.7177 | 0.8415 | 0.5761 | 0.7579 | 0.6135 | 0.6953 | 0.7172 | 0.7719 | | 0.6271 | 17.0 | 8449 | 0.5257 | 0.6636 | 0.9051 | 0.7937 | 0.351 | 0.6331 | 0.8044 | 0.2317 | 0.7393 | 0.7635 | 0.4556 | 0.7407 | 0.85 | 0.6165 | 0.7917 | 0.6398 | 0.7122 | 0.7346 | 0.7867 | | 0.5645 | 18.0 | 8946 | 0.5132 | 0.6835 | 0.9215 | 0.8205 | 0.4803 | 0.6611 | 0.792 | 0.2429 | 0.7496 | 0.769 | 0.5185 | 0.7498 | 0.8382 | 0.6527 | 0.7871 | 0.6679 | 0.7372 | 0.7299 | 0.7826 | | 0.6116 | 19.0 | 9443 | 0.5127 | 0.6574 | 0.9452 | 0.7874 | 0.4883 | 0.6233 | 0.7941 | 0.2285 | 0.7226 | 0.7382 | 0.5111 | 0.7106 | 0.8435 | 0.6334 | 0.7325 | 0.62 | 0.7095 | 0.7188 | 0.7727 | | 0.5896 | 20.0 | 9940 | 0.4924 | 0.6682 | 0.9411 | 0.8196 | 0.5009 | 0.6364 | 0.7906 | 0.2348 | 0.732 | 0.747 | 0.5222 | 0.7187 | 0.8438 | 0.6419 | 0.7417 | 0.6441 | 0.7284 | 0.7186 | 0.7709 | | 0.538 | 21.0 | 10437 | 0.4794 | 0.6745 | 0.947 | 0.8269 | 0.4886 | 0.6421 | 0.7972 | 0.2385 | 0.7357 | 0.7528 | 0.5222 | 0.7256 | 0.8536 | 0.6632 | 0.7508 | 0.6336 | 0.723 | 0.7266 | 0.7846 | | 0.581 | 22.0 | 10934 | 0.4552 | 0.6926 | 0.9556 | 0.8348 | 0.5355 | 0.662 | 0.8043 | 0.2412 | 0.7491 | 0.7662 | 0.5556 | 0.7429 | 0.8569 | 0.6813 | 0.7629 | 0.6591 | 0.7405 | 0.7374 | 0.7951 | | 0.4655 | 23.0 | 11431 | 0.4418 | 0.6984 | 0.9577 | 0.8555 | 0.506 | 0.6693 | 0.799 | 0.2408 | 0.7511 | 0.7678 | 0.5481 | 0.7446 | 0.849 | 0.7097 | 0.7825 | 0.6537 | 0.7345 | 0.7317 | 0.7865 | | 0.4435 | 24.0 | 11928 | 0.4251 | 0.7043 | 0.9552 | 0.8358 | 0.3469 | 0.6769 | 0.802 | 0.2439 | 0.7612 | 0.7763 | 0.3889 | 0.7496 | 0.8614 | 0.7102 | 0.7892 | 0.6703 | 0.7507 | 0.7325 | 0.7891 | | 0.521 | 25.0 | 12425 | 0.4064 | 0.7268 | 0.962 | 0.8799 | 0.3649 | 0.6998 | 0.8139 | 0.25 | 0.7767 | 0.7937 | 0.6222 | 0.7714 | 0.8657 | 0.7058 | 0.78 | 0.7239 | 0.7939 | 0.7507 | 0.807 | | 0.5358 | 26.0 | 12922 | 0.4021 | 0.7158 | 0.9596 | 0.8604 | 0.3895 | 0.6886 | 0.8158 | 0.2497 | 0.7677 | 0.7857 | 0.4148 | 0.7646 | 0.866 | 0.7198 | 0.8012 | 0.6743 | 0.7493 | 0.7533 | 0.8066 | | 0.478 | 27.0 | 13419 | 0.4045 | 0.712 | 0.9597 | 0.8763 | 0.4779 | 0.6811 | 0.8136 | 0.2431 | 0.7606 | 0.7788 | 0.5037 | 0.7558 | 0.8611 | 0.6946 | 0.7763 | 0.6954 | 0.7615 | 0.746 | 0.7986 | | 0.4684 | 28.0 | 13916 | 0.3945 | 0.7246 | 0.9589 | 0.8756 | 0.3809 | 0.6987 | 0.8248 | 0.2436 | 0.7704 | 0.7887 | 0.4481 | 0.7704 | 
0.868 | 0.7186 | 0.7837 | 0.6913 | 0.7682 | 0.7639 | 0.8141 | | 0.4758 | 29.0 | 14413 | 0.3802 | 0.7363 | 0.9607 | 0.8843 | 0.4282 | 0.7047 | 0.8264 | 0.2492 | 0.7806 | 0.7994 | 0.5 | 0.7777 | 0.8765 | 0.7271 | 0.8 | 0.7199 | 0.7811 | 0.762 | 0.817 | | 0.5724 | 30.0 | 14910 | 0.3773 | 0.7508 | 0.9608 | 0.8825 | 0.4597 | 0.7246 | 0.8255 | 0.2525 | 0.7929 | 0.8103 | 0.5741 | 0.787 | 0.8755 | 0.7633 | 0.8275 | 0.7277 | 0.7905 | 0.7612 | 0.8129 | | 0.5229 | 31.0 | 15407 | 0.3797 | 0.7341 | 0.964 | 0.8791 | 0.4828 | 0.705 | 0.8393 | 0.2508 | 0.7812 | 0.7987 | 0.5111 | 0.773 | 0.8843 | 0.7341 | 0.8037 | 0.7007 | 0.7757 | 0.7675 | 0.8166 | | 0.4247 | 32.0 | 15904 | 0.3720 | 0.7533 | 0.9598 | 0.8998 | 0.4806 | 0.7271 | 0.8303 | 0.2523 | 0.7979 | 0.8164 | 0.5185 | 0.7945 | 0.8758 | 0.7561 | 0.8271 | 0.7396 | 0.8061 | 0.7641 | 0.816 | | 0.4791 | 33.0 | 16401 | 0.3666 | 0.7556 | 0.968 | 0.8978 | 0.4072 | 0.725 | 0.8319 | 0.2561 | 0.7978 | 0.8154 | 0.4778 | 0.7944 | 0.8804 | 0.7659 | 0.8292 | 0.7335 | 0.7953 | 0.7676 | 0.8219 | | 0.4107 | 34.0 | 16898 | 0.3568 | 0.751 | 0.9592 | 0.8941 | 0.5087 | 0.722 | 0.8416 | 0.2537 | 0.7945 | 0.8128 | 0.5667 | 0.7901 | 0.8886 | 0.7645 | 0.8263 | 0.7143 | 0.7851 | 0.7742 | 0.8271 | | 0.5572 | 35.0 | 17395 | 0.3460 | 0.7607 | 0.9613 | 0.9006 | 0.4575 | 0.7377 | 0.838 | 0.259 | 0.8017 | 0.82 | 0.5111 | 0.8016 | 0.8817 | 0.7648 | 0.8304 | 0.7384 | 0.8 | 0.779 | 0.8295 | | 0.4188 | 36.0 | 17892 | 0.3547 | 0.7522 | 0.9623 | 0.8907 | 0.4372 | 0.7274 | 0.8258 | 0.2572 | 0.7951 | 0.8122 | 0.4852 | 0.7921 | 0.8729 | 0.7627 | 0.8288 | 0.7286 | 0.7919 | 0.7653 | 0.816 | | 0.3833 | 37.0 | 18389 | 0.3474 | 0.7589 | 0.9619 | 0.9009 | 0.484 | 0.731 | 0.8389 | 0.2569 | 0.8028 | 0.8208 | 0.5296 | 0.799 | 0.8876 | 0.7742 | 0.8404 | 0.7286 | 0.7939 | 0.7739 | 0.8281 | | 0.4402 | 38.0 | 18886 | 0.3546 | 0.7577 | 0.9662 | 0.9037 | 0.4664 | 0.73 | 0.8281 | 0.2579 | 0.7997 | 0.8166 | 0.5074 | 0.7943 | 0.8791 | 0.7628 | 0.8238 | 0.7449 | 0.8088 | 0.7654 | 0.8174 | | 0.4705 | 39.0 | 19383 | 0.3458 | 0.7676 | 0.9625 | 0.8974 | 0.556 | 0.7396 | 0.8375 | 0.2606 | 0.8096 | 0.8269 | 0.6111 | 0.8058 | 0.884 | 0.7782 | 0.8413 | 0.7503 | 0.8142 | 0.7743 | 0.8254 | | 0.4476 | 40.0 | 19880 | 0.3528 | 0.7613 | 0.9603 | 0.8951 | 0.4661 | 0.7346 | 0.8308 | 0.2579 | 0.8062 | 0.8248 | 0.5111 | 0.8027 | 0.8827 | 0.7736 | 0.8408 | 0.7424 | 0.8122 | 0.768 | 0.8215 | | 0.4515 | 41.0 | 20377 | 0.3540 | 0.7652 | 0.9646 | 0.8982 | 0.5283 | 0.7364 | 0.8366 | 0.257 | 0.8087 | 0.8274 | 0.5926 | 0.8047 | 0.8899 | 0.7645 | 0.8296 | 0.7573 | 0.8216 | 0.7738 | 0.8309 | | 0.5283 | 42.0 | 20874 | 0.3416 | 0.761 | 0.964 | 0.8967 | 0.4546 | 0.7305 | 0.8388 | 0.2547 | 0.8077 | 0.8256 | 0.5407 | 0.8031 | 0.8895 | 0.7714 | 0.8379 | 0.7368 | 0.8088 | 0.7748 | 0.8301 | | 0.4371 | 43.0 | 21371 | 0.3397 | 0.7609 | 0.9658 | 0.9026 | 0.4562 | 0.7316 | 0.8418 | 0.2555 | 0.8073 | 0.8254 | 0.5185 | 0.8032 | 0.8905 | 0.7686 | 0.8388 | 0.736 | 0.8074 | 0.778 | 0.8299 | | 0.3756 | 44.0 | 21868 | 0.3444 | 0.7645 | 0.9615 | 0.9075 | 0.5114 | 0.735 | 0.8403 | 0.2607 | 0.8085 | 0.8255 | 0.5815 | 0.8011 | 0.8925 | 0.7749 | 0.8379 | 0.744 | 0.8101 | 0.7746 | 0.8283 | | 0.4228 | 45.0 | 22365 | 0.3341 | 0.7744 | 0.9645 | 0.8991 | 0.5075 | 0.7461 | 0.8398 | 0.2617 | 0.8159 | 0.8332 | 0.5593 | 0.8117 | 0.8899 | 0.7876 | 0.8479 | 0.7597 | 0.8216 | 0.776 | 0.8301 | | 0.3976 | 46.0 | 22862 | 0.3294 | 0.7751 | 0.9648 | 0.907 | 0.4606 | 0.7469 | 0.8427 | 0.2638 | 0.8171 | 0.8352 | 0.5222 | 0.8156 | 0.8915 | 0.7821 | 0.8487 | 0.761 | 0.8223 | 0.7821 | 
0.8346 | | 0.4371 | 47.0 | 23359 | 0.3312 | 0.7744 | 0.9665 | 0.9009 | 0.4494 | 0.7465 | 0.8476 | 0.2596 | 0.8157 | 0.8334 | 0.5148 | 0.8127 | 0.8938 | 0.783 | 0.8471 | 0.7565 | 0.8182 | 0.7838 | 0.8348 | | 0.3589 | 48.0 | 23856 | 0.3307 | 0.7705 | 0.9664 | 0.8963 | 0.5143 | 0.742 | 0.8432 | 0.2577 | 0.8134 | 0.8309 | 0.5667 | 0.809 | 0.8931 | 0.7732 | 0.8425 | 0.7577 | 0.8176 | 0.7806 | 0.8326 | | 0.4881 | 49.0 | 24353 | 0.3237 | 0.7771 | 0.9655 | 0.9058 | 0.525 | 0.7485 | 0.8457 | 0.2593 | 0.8207 | 0.8383 | 0.5963 | 0.8163 | 0.8944 | 0.7878 | 0.8558 | 0.759 | 0.823 | 0.7843 | 0.8361 | | 0.3637 | 50.0 | 24850 | 0.3246 | 0.7762 | 0.9647 | 0.9041 | 0.5141 | 0.7451 | 0.8517 | 0.2588 | 0.8173 | 0.8346 | 0.5741 | 0.8102 | 0.899 | 0.7824 | 0.8475 | 0.761 | 0.8216 | 0.7852 | 0.8348 | | 0.3978 | 51.0 | 25347 | 0.3296 | 0.7744 | 0.9654 | 0.8952 | 0.5077 | 0.7459 | 0.8478 | 0.259 | 0.8172 | 0.8341 | 0.5519 | 0.8113 | 0.8954 | 0.7835 | 0.8475 | 0.7562 | 0.8223 | 0.7835 | 0.8324 | | 0.3942 | 52.0 | 25844 | 0.3241 | 0.7808 | 0.9681 | 0.9075 | 0.5314 | 0.753 | 0.8568 | 0.2612 | 0.8214 | 0.8393 | 0.5815 | 0.8172 | 0.9023 | 0.7874 | 0.8533 | 0.7624 | 0.8223 | 0.7925 | 0.8424 | | 0.4079 | 53.0 | 26341 | 0.3221 | 0.779 | 0.9708 | 0.9091 | 0.4946 | 0.7525 | 0.855 | 0.2598 | 0.8206 | 0.8377 | 0.5333 | 0.8162 | 0.9007 | 0.7867 | 0.8542 | 0.7583 | 0.8182 | 0.792 | 0.8408 | | 0.3715 | 54.0 | 26838 | 0.3191 | 0.7831 | 0.969 | 0.9074 | 0.5022 | 0.7551 | 0.8596 | 0.2606 | 0.824 | 0.8417 | 0.5407 | 0.8199 | 0.9046 | 0.789 | 0.855 | 0.7646 | 0.8257 | 0.7957 | 0.8443 | | 0.4328 | 55.0 | 27335 | 0.3184 | 0.7799 | 0.9686 | 0.9057 | 0.5077 | 0.7512 | 0.8586 | 0.2598 | 0.8217 | 0.8396 | 0.5481 | 0.8172 | 0.9023 | 0.7878 | 0.8562 | 0.7579 | 0.8209 | 0.7939 | 0.8416 | | 0.3676 | 56.0 | 27832 | 0.3192 | 0.7792 | 0.9683 | 0.9059 | 0.5051 | 0.7509 | 0.8561 | 0.2603 | 0.8218 | 0.8394 | 0.5481 | 0.8168 | 0.9007 | 0.7884 | 0.8562 | 0.7581 | 0.8223 | 0.791 | 0.8396 | | 0.3737 | 57.0 | 28329 | 0.3174 | 0.7814 | 0.9677 | 0.9071 | 0.5052 | 0.7529 | 0.8579 | 0.2606 | 0.823 | 0.8406 | 0.5481 | 0.8182 | 0.9023 | 0.7884 | 0.8558 | 0.7629 | 0.8243 | 0.7928 | 0.8416 | | 0.3532 | 58.0 | 28826 | 0.3172 | 0.7813 | 0.9674 | 0.9062 | 0.5029 | 0.7531 | 0.8581 | 0.2607 | 0.8236 | 0.8407 | 0.5407 | 0.8179 | 0.9029 | 0.7897 | 0.8562 | 0.7617 | 0.825 | 0.7926 | 0.8408 | | 0.356 | 59.0 | 29323 | 0.3172 | 0.7814 | 0.9672 | 0.9061 | 0.5029 | 0.7534 | 0.8587 | 0.2609 | 0.8236 | 0.8407 | 0.5407 | 0.818 | 0.9036 | 0.7887 | 0.8554 | 0.7616 | 0.825 | 0.794 | 0.8416 | | 0.4282 | 60.0 | 29820 | 0.3172 | 0.7813 | 0.9672 | 0.9061 | 0.5029 | 0.7534 | 0.8581 | 0.2609 | 0.8237 | 0.8407 | 0.5407 | 0.8181 | 0.9033 | 0.7893 | 0.8558 | 0.7616 | 0.825 | 0.7929 | 0.8414 | ### Framework versions - Transformers 4.45.2 - Pytorch 2.4.1+cu121 - Datasets 2.19.2 - Tokenizers 0.20.0
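The evaluation block above reports COCO-style mAP/mAR broken down by object size and by class. As a hedged sketch of how such numbers are typically computed (here with `torchmetrics` on dummy boxes rather than the real validation split, which this card does not publish), the metric can be driven like this:

```python
# Hedged sketch: COCO-style mAP/mAR as reported above, computed with torchmetrics
# on dummy boxes; a real evaluation would loop over the validation set and feed
# post-processed DETR outputs instead.
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 60.0, 80.0]]),
    "scores": torch.tensor([0.91]),
    "labels": torch.tensor([0]),          # 0 = chicken, 1 = duck, 2 = plant (assumed order)
}]
targets = [{
    "boxes": torch.tensor([[12.0, 11.0, 58.0, 79.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
# Keys mirror the card's columns: map, map_50, map_75, map_small/medium/large,
# mar_1, mar_10, mar_100, plus per-class values when class_metrics=True.
print({k: v for k, v in results.items() if k.startswith(("map", "mar"))})
```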
[ "chicken", "duck", "plant" ]
joe611/chickens-60-epoch-200-images
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-60-epoch-200-images This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5714 - Map: 0.6874 - Map 50: 0.9234 - Map 75: 0.8248 - Map Small: 0.5069 - Map Medium: 0.6714 - Map Large: 0.7654 - Mar 1: 0.2397 - Mar 10: 0.7424 - Mar 100: 0.7555 - Mar Small: 0.5185 - Mar Medium: 0.7332 - Mar Large: 0.8229 - Map Chicken: 0.7156 - Mar 100 Chicken: 0.7929 - Map Duck: 0.6392 - Mar 100 Duck: 0.7115 - Map Plant: 0.7073 - Mar 100 Plant: 0.7621 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 60 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:| | 1.8277 | 1.0 | 100 | 1.7954 | 0.0273 | 0.0429 | 0.0322 | 0.0023 | 0.0189 | 0.1724 | 0.0322 | 0.0974 | 0.252 | 0.1333 | 0.2442 | 0.3118 | 0.0101 | 0.4325 | 0.0066 | 0.0358 | 0.0651 | 0.2877 | | 1.6529 | 2.0 | 200 | 1.6153 | 0.0747 | 0.1142 | 0.0884 | 0.0115 | 0.0385 | 0.3795 | 0.0361 | 0.1619 | 0.2907 | 0.1407 | 0.2683 | 0.6732 | 0.0202 | 0.2592 | 0.0 | 0.0 | 0.2039 | 0.6129 | | 1.5757 | 3.0 | 300 | 1.5640 | 0.1063 | 0.1635 | 0.12 | 0.152 | 0.0606 | 0.4749 | 0.0447 | 0.2034 | 0.2538 | 0.2444 | 0.2323 | 0.6948 | 0.0348 | 0.1246 | 0.0 | 0.0 | 0.2842 | 0.6369 | | 1.3741 | 4.0 | 400 | 1.4431 | 0.1124 | 0.1636 | 0.1307 | 0.0213 | 0.0532 | 0.5408 | 0.0402 | 0.1769 | 0.2496 | 0.2259 | 0.2309 | 0.7725 | 0.015 | 0.0296 | 0.0 | 0.0 | 0.3222 | 0.7191 | | 1.3922 | 5.0 | 500 | 1.3206 | 0.1518 | 0.2268 | 0.1758 | 0.1359 | 0.0896 | 0.5684 | 0.051 | 0.2012 | 0.2536 | 0.2667 | 0.2257 | 0.7954 | 0.0208 | 0.0358 | 0.0 | 0.0 | 0.4346 | 0.725 | | 1.2718 | 6.0 | 600 | 1.2374 | 0.1656 | 0.2413 | 0.1854 | 0.1213 | 0.0965 | 0.6428 | 0.0467 | 0.2083 | 0.2438 | 0.1963 | 0.2155 | 0.7912 | 0.007 | 0.0121 | 0.0 | 0.0 | 0.4897 | 0.7193 | | 1.1692 | 7.0 | 700 | 1.2744 | 0.188 | 0.276 | 0.2086 | 0.0 | 0.1374 | 0.6403 | 0.0565 | 0.2173 | 0.2364 | 0.0 | 0.2066 | 0.7565 | 0.0238 | 0.0304 | 0.0 | 0.0 | 0.5402 | 0.6789 | | 1.1205 | 8.0 | 800 | 1.1032 | 0.2248 | 0.3159 | 0.2623 | 0.1032 | 0.1919 | 0.6805 | 0.0781 | 0.264 | 0.285 | 0.2481 | 0.2604 | 0.785 | 0.0718 | 0.1358 | 0.0 | 0.0 | 0.6025 | 0.7191 | | 1.1331 | 9.0 | 900 | 1.1086 | 0.2243 | 0.3351 | 0.2575 | 0.0355 | 0.1898 | 0.6761 | 0.0903 | 0.3044 | 0.331 | 0.2704 | 0.3025 | 0.7918 | 0.0842 | 0.2725 | 0.0002 | 0.0007 | 0.5884 | 0.7199 | | 1.0573 | 10.0 | 1000 | 1.0497 | 0.2609 | 0.3848 | 0.2961 | 0.1714 | 0.2329 | 0.7023 | 0.1019 | 
0.3431 | 0.3617 | 0.2556 | 0.331 | 0.7879 | 0.1495 | 0.3654 | 0.0084 | 0.0095 | 0.6248 | 0.7104 | | 1.2048 | 11.0 | 1100 | 1.0155 | 0.3063 | 0.4403 | 0.3541 | 0.2089 | 0.2905 | 0.6859 | 0.1203 | 0.4108 | 0.4412 | 0.2333 | 0.4181 | 0.7663 | 0.2528 | 0.5717 | 0.0443 | 0.0486 | 0.6217 | 0.7033 | | 1.0135 | 12.0 | 1200 | 0.9896 | 0.3031 | 0.4506 | 0.3501 | 0.3099 | 0.2756 | 0.6828 | 0.1195 | 0.4114 | 0.4319 | 0.3259 | 0.4 | 0.765 | 0.2659 | 0.5738 | 0.0333 | 0.0338 | 0.6101 | 0.6881 | | 1.0212 | 13.0 | 1300 | 1.0943 | 0.3275 | 0.4723 | 0.3971 | 0.2373 | 0.3166 | 0.638 | 0.1258 | 0.444 | 0.4607 | 0.2481 | 0.4422 | 0.7229 | 0.3622 | 0.6779 | 0.0289 | 0.0358 | 0.5915 | 0.6684 | | 0.9173 | 14.0 | 1400 | 0.9510 | 0.3389 | 0.4909 | 0.4013 | 0.2408 | 0.3205 | 0.6806 | 0.1251 | 0.4614 | 0.4832 | 0.363 | 0.4563 | 0.7683 | 0.359 | 0.7088 | 0.0406 | 0.0385 | 0.617 | 0.7023 | | 0.8862 | 15.0 | 1500 | 0.9053 | 0.3771 | 0.5484 | 0.438 | 0.1616 | 0.3555 | 0.7143 | 0.1432 | 0.4874 | 0.5095 | 0.3148 | 0.4825 | 0.7882 | 0.4009 | 0.7129 | 0.0861 | 0.0953 | 0.6442 | 0.7203 | | 0.9012 | 16.0 | 1600 | 0.9406 | 0.3889 | 0.5543 | 0.4661 | 0.185 | 0.3682 | 0.7036 | 0.1338 | 0.4758 | 0.4894 | 0.2704 | 0.466 | 0.7752 | 0.4708 | 0.6983 | 0.0509 | 0.0561 | 0.6451 | 0.7137 | | 0.864 | 17.0 | 1700 | 0.8412 | 0.3906 | 0.5531 | 0.4728 | 0.2214 | 0.3729 | 0.7195 | 0.141 | 0.4916 | 0.514 | 0.2407 | 0.4925 | 0.7938 | 0.4593 | 0.7454 | 0.055 | 0.0608 | 0.6574 | 0.7357 | | 0.7657 | 18.0 | 1800 | 0.8316 | 0.4046 | 0.5714 | 0.4843 | 0.1751 | 0.3865 | 0.7221 | 0.1539 | 0.5089 | 0.5296 | 0.3444 | 0.5031 | 0.7951 | 0.459 | 0.7546 | 0.0935 | 0.1054 | 0.6612 | 0.7287 | | 0.8597 | 19.0 | 1900 | 0.8379 | 0.4105 | 0.5831 | 0.4949 | 0.2155 | 0.3935 | 0.7089 | 0.1394 | 0.4875 | 0.5033 | 0.2222 | 0.4807 | 0.7801 | 0.5315 | 0.7371 | 0.0509 | 0.0541 | 0.649 | 0.7188 | | 0.7747 | 20.0 | 2000 | 0.8280 | 0.4182 | 0.5924 | 0.5143 | 0.2735 | 0.3993 | 0.6977 | 0.1392 | 0.4881 | 0.5025 | 0.2741 | 0.4789 | 0.7748 | 0.5537 | 0.735 | 0.0585 | 0.0601 | 0.6424 | 0.7125 | | 0.7621 | 21.0 | 2100 | 0.7842 | 0.4345 | 0.6189 | 0.5267 | 0.1441 | 0.4145 | 0.7219 | 0.1461 | 0.5071 | 0.5213 | 0.3333 | 0.496 | 0.7915 | 0.556 | 0.7404 | 0.0915 | 0.1 | 0.6559 | 0.7234 | | 0.7609 | 22.0 | 2200 | 0.7726 | 0.4249 | 0.5906 | 0.5262 | 0.1769 | 0.4043 | 0.7519 | 0.1291 | 0.4878 | 0.5039 | 0.3296 | 0.4788 | 0.8114 | 0.5697 | 0.7404 | 0.025 | 0.0223 | 0.68 | 0.749 | | 0.7386 | 23.0 | 2300 | 0.7575 | 0.4478 | 0.6312 | 0.5444 | 0.3547 | 0.4234 | 0.7451 | 0.1477 | 0.5171 | 0.5308 | 0.3778 | 0.5035 | 0.8108 | 0.5688 | 0.7392 | 0.0972 | 0.1074 | 0.6773 | 0.7457 | | 0.8086 | 24.0 | 2400 | 0.8128 | 0.4589 | 0.6456 | 0.5643 | 0.3348 | 0.4367 | 0.7173 | 0.1552 | 0.5206 | 0.5346 | 0.3407 | 0.5069 | 0.7863 | 0.5936 | 0.7433 | 0.1297 | 0.1419 | 0.6533 | 0.7186 | | 0.7901 | 25.0 | 2500 | 0.7641 | 0.4724 | 0.6601 | 0.5711 | 0.2783 | 0.4507 | 0.7255 | 0.1638 | 0.54 | 0.5534 | 0.2926 | 0.5284 | 0.7895 | 0.5919 | 0.7571 | 0.1618 | 0.1777 | 0.6636 | 0.7254 | | 0.7292 | 26.0 | 2600 | 0.7594 | 0.4769 | 0.6881 | 0.5651 | 0.3251 | 0.458 | 0.7166 | 0.1577 | 0.5427 | 0.555 | 0.3333 | 0.529 | 0.7918 | 0.5987 | 0.7479 | 0.1743 | 0.1899 | 0.6576 | 0.7273 | | 0.6989 | 27.0 | 2700 | 0.7371 | 0.5386 | 0.7406 | 0.6675 | 0.3238 | 0.5192 | 0.7394 | 0.1833 | 0.6022 | 0.6144 | 0.3222 | 0.5895 | 0.8033 | 0.644 | 0.7683 | 0.2977 | 0.3372 | 0.6741 | 0.7377 | | 0.7199 | 28.0 | 2800 | 0.6863 | 0.5527 | 0.7655 | 0.6766 | 0.3825 | 0.5321 | 0.7545 | 0.1845 | 0.6204 | 0.633 | 0.3889 | 0.6111 | 0.8141 | 0.6372 | 0.7758 | 
0.3264 | 0.3696 | 0.6944 | 0.7537 | | 0.6656 | 29.0 | 2900 | 0.6749 | 0.5543 | 0.7595 | 0.6732 | 0.3931 | 0.5343 | 0.7568 | 0.1765 | 0.608 | 0.6206 | 0.3963 | 0.5957 | 0.8183 | 0.6634 | 0.7704 | 0.3048 | 0.3385 | 0.6946 | 0.7529 | | 0.6462 | 30.0 | 3000 | 0.6701 | 0.5793 | 0.8205 | 0.6987 | 0.3248 | 0.5603 | 0.7541 | 0.1903 | 0.6341 | 0.6478 | 0.3333 | 0.6225 | 0.8196 | 0.6641 | 0.7571 | 0.3821 | 0.4345 | 0.6916 | 0.752 | | 0.6706 | 31.0 | 3100 | 0.6555 | 0.6146 | 0.8476 | 0.7482 | 0.364 | 0.5938 | 0.7543 | 0.2091 | 0.6742 | 0.6883 | 0.3704 | 0.6637 | 0.8193 | 0.6567 | 0.7583 | 0.4974 | 0.5547 | 0.6897 | 0.752 | | 0.6464 | 32.0 | 3200 | 0.6445 | 0.6235 | 0.8621 | 0.7518 | 0.4682 | 0.6042 | 0.7563 | 0.207 | 0.6797 | 0.6951 | 0.4889 | 0.6696 | 0.819 | 0.6768 | 0.7713 | 0.4988 | 0.5601 | 0.6947 | 0.7539 | | 0.5998 | 33.0 | 3300 | 0.6266 | 0.6292 | 0.8724 | 0.7738 | 0.4007 | 0.6132 | 0.7534 | 0.2113 | 0.6863 | 0.7008 | 0.4148 | 0.6787 | 0.818 | 0.6862 | 0.7758 | 0.5049 | 0.5682 | 0.6965 | 0.7584 | | 0.6153 | 34.0 | 3400 | 0.6388 | 0.6202 | 0.8619 | 0.7475 | 0.417 | 0.6012 | 0.7515 | 0.2027 | 0.6755 | 0.6903 | 0.4185 | 0.6657 | 0.8167 | 0.6957 | 0.7804 | 0.4766 | 0.5392 | 0.6884 | 0.7512 | | 0.5857 | 35.0 | 3500 | 0.6126 | 0.6426 | 0.8968 | 0.797 | 0.4822 | 0.6215 | 0.7521 | 0.218 | 0.6984 | 0.7132 | 0.4963 | 0.6895 | 0.8111 | 0.6907 | 0.7721 | 0.5477 | 0.6169 | 0.6894 | 0.7506 | | 0.5625 | 36.0 | 3600 | 0.6348 | 0.6329 | 0.8926 | 0.7794 | 0.4634 | 0.6143 | 0.7436 | 0.2184 | 0.6909 | 0.7051 | 0.4815 | 0.6816 | 0.8082 | 0.6803 | 0.7679 | 0.5335 | 0.6 | 0.685 | 0.7475 | | 0.5969 | 37.0 | 3700 | 0.6090 | 0.6643 | 0.9033 | 0.8282 | 0.4425 | 0.6453 | 0.7616 | 0.2295 | 0.7186 | 0.7334 | 0.4667 | 0.7117 | 0.8193 | 0.6971 | 0.7812 | 0.5944 | 0.6595 | 0.7013 | 0.7596 | | 0.5882 | 38.0 | 3800 | 0.6197 | 0.6534 | 0.9061 | 0.811 | 0.4838 | 0.6356 | 0.7542 | 0.2241 | 0.7077 | 0.722 | 0.5 | 0.7002 | 0.8105 | 0.6798 | 0.7638 | 0.5863 | 0.6514 | 0.6941 | 0.7508 | | 0.5455 | 39.0 | 3900 | 0.5811 | 0.6769 | 0.9265 | 0.8258 | 0.471 | 0.6573 | 0.7649 | 0.2312 | 0.7309 | 0.7461 | 0.4815 | 0.7225 | 0.8186 | 0.7126 | 0.7887 | 0.6156 | 0.6946 | 0.7023 | 0.7551 | | 0.5654 | 40.0 | 4000 | 0.6227 | 0.662 | 0.9225 | 0.7937 | 0.4556 | 0.6472 | 0.749 | 0.2297 | 0.723 | 0.7351 | 0.4593 | 0.7155 | 0.8078 | 0.7022 | 0.7858 | 0.5892 | 0.6676 | 0.6945 | 0.7518 | | 0.5414 | 41.0 | 4100 | 0.6159 | 0.6743 | 0.9156 | 0.813 | 0.4628 | 0.6597 | 0.753 | 0.234 | 0.7353 | 0.7477 | 0.4667 | 0.7275 | 0.8144 | 0.6968 | 0.7854 | 0.6274 | 0.7014 | 0.6987 | 0.7564 | | 0.5337 | 42.0 | 4200 | 0.5916 | 0.6724 | 0.9252 | 0.8298 | 0.4644 | 0.656 | 0.7617 | 0.2346 | 0.7323 | 0.7456 | 0.463 | 0.7248 | 0.8183 | 0.6908 | 0.7758 | 0.6232 | 0.702 | 0.7034 | 0.759 | | 0.5356 | 43.0 | 4300 | 0.6240 | 0.6713 | 0.9148 | 0.8097 | 0.5307 | 0.6517 | 0.767 | 0.2322 | 0.7304 | 0.7443 | 0.5333 | 0.7216 | 0.8225 | 0.6926 | 0.7825 | 0.6165 | 0.6899 | 0.7047 | 0.7605 | | 0.5438 | 44.0 | 4400 | 0.6095 | 0.677 | 0.9178 | 0.8307 | 0.5792 | 0.6598 | 0.7614 | 0.235 | 0.7335 | 0.7477 | 0.5815 | 0.7264 | 0.8176 | 0.6938 | 0.7796 | 0.6331 | 0.7041 | 0.7042 | 0.7594 | | 0.5397 | 45.0 | 4500 | 0.5821 | 0.6838 | 0.9288 | 0.8383 | 0.5075 | 0.6635 | 0.7693 | 0.2354 | 0.7406 | 0.7541 | 0.5185 | 0.7323 | 0.8232 | 0.7023 | 0.7883 | 0.6405 | 0.7108 | 0.7087 | 0.7633 | | 0.4965 | 46.0 | 4600 | 0.5899 | 0.6809 | 0.9232 | 0.8247 | 0.4549 | 0.6644 | 0.7616 | 0.2351 | 0.7373 | 0.7505 | 0.4741 | 0.7294 | 0.8203 | 0.6995 | 0.7833 | 0.6402 | 0.7068 | 0.7031 | 0.7613 | | 0.5718 | 47.0 | 4700 
| 0.5733 | 0.6856 | 0.9248 | 0.8121 | 0.4957 | 0.666 | 0.7741 | 0.2389 | 0.7435 | 0.7561 | 0.5074 | 0.7326 | 0.8301 | 0.7131 | 0.7942 | 0.6321 | 0.7074 | 0.7117 | 0.7666 | | 0.5055 | 48.0 | 4800 | 0.5774 | 0.6854 | 0.9236 | 0.8323 | 0.4892 | 0.6675 | 0.7673 | 0.2377 | 0.7399 | 0.7522 | 0.5037 | 0.7304 | 0.8196 | 0.7113 | 0.7887 | 0.6402 | 0.7088 | 0.7048 | 0.7592 | | 0.5249 | 49.0 | 4900 | 0.5793 | 0.6855 | 0.9241 | 0.8387 | 0.506 | 0.6677 | 0.7659 | 0.2398 | 0.7394 | 0.7541 | 0.5185 | 0.7315 | 0.8196 | 0.7064 | 0.7862 | 0.6437 | 0.7176 | 0.7065 | 0.7584 | | 0.5299 | 50.0 | 5000 | 0.5700 | 0.6888 | 0.9243 | 0.8388 | 0.4921 | 0.671 | 0.7666 | 0.2399 | 0.7411 | 0.7562 | 0.5037 | 0.7335 | 0.8235 | 0.7108 | 0.7892 | 0.648 | 0.7182 | 0.7075 | 0.7611 | | 0.5028 | 51.0 | 5100 | 0.5649 | 0.6917 | 0.9274 | 0.844 | 0.5059 | 0.6739 | 0.7707 | 0.2404 | 0.7468 | 0.7606 | 0.5296 | 0.7371 | 0.8297 | 0.7104 | 0.7892 | 0.6552 | 0.725 | 0.7095 | 0.7678 | | 0.5179 | 52.0 | 5200 | 0.5713 | 0.687 | 0.9277 | 0.8289 | 0.5104 | 0.6692 | 0.7662 | 0.2388 | 0.7419 | 0.7554 | 0.5222 | 0.7331 | 0.8216 | 0.7107 | 0.7921 | 0.6438 | 0.7128 | 0.7064 | 0.7611 | | 0.5294 | 53.0 | 5300 | 0.5721 | 0.6889 | 0.9253 | 0.8264 | 0.4982 | 0.672 | 0.7665 | 0.2405 | 0.7444 | 0.7575 | 0.5111 | 0.7357 | 0.8225 | 0.7144 | 0.7946 | 0.6447 | 0.7155 | 0.7076 | 0.7625 | | 0.528 | 54.0 | 5400 | 0.5722 | 0.6836 | 0.9247 | 0.8239 | 0.4957 | 0.6674 | 0.7658 | 0.2388 | 0.7388 | 0.754 | 0.5074 | 0.7324 | 0.8212 | 0.71 | 0.7937 | 0.6335 | 0.7068 | 0.7073 | 0.7615 | | 0.498 | 55.0 | 5500 | 0.5722 | 0.6844 | 0.926 | 0.8314 | 0.4957 | 0.6676 | 0.7653 | 0.2393 | 0.7405 | 0.7541 | 0.5074 | 0.7317 | 0.8232 | 0.7128 | 0.7929 | 0.634 | 0.7074 | 0.7065 | 0.7619 | | 0.5066 | 56.0 | 5600 | 0.5726 | 0.6853 | 0.9229 | 0.8279 | 0.4957 | 0.6691 | 0.7678 | 0.2396 | 0.741 | 0.7541 | 0.5074 | 0.7317 | 0.8242 | 0.7145 | 0.7929 | 0.6348 | 0.7068 | 0.7066 | 0.7627 | | 0.4986 | 57.0 | 5700 | 0.5714 | 0.6872 | 0.9231 | 0.8224 | 0.4957 | 0.6713 | 0.7657 | 0.2396 | 0.7422 | 0.7552 | 0.5074 | 0.733 | 0.8235 | 0.7165 | 0.7937 | 0.6373 | 0.7095 | 0.7077 | 0.7625 | | 0.4709 | 58.0 | 5800 | 0.5714 | 0.6869 | 0.9233 | 0.8248 | 0.5044 | 0.6709 | 0.7655 | 0.2394 | 0.7421 | 0.7552 | 0.5185 | 0.7331 | 0.8229 | 0.7157 | 0.7929 | 0.6374 | 0.7101 | 0.7075 | 0.7625 | | 0.4974 | 59.0 | 5900 | 0.5714 | 0.6872 | 0.9234 | 0.8248 | 0.5069 | 0.6712 | 0.7654 | 0.2394 | 0.7422 | 0.7553 | 0.5185 | 0.733 | 0.8229 | 0.7156 | 0.7929 | 0.6388 | 0.7108 | 0.7073 | 0.7621 | | 0.4991 | 60.0 | 6000 | 0.5714 | 0.6874 | 0.9234 | 0.8248 | 0.5069 | 0.6714 | 0.7654 | 0.2397 | 0.7424 | 0.7555 | 0.5185 | 0.7332 | 0.8229 | 0.7156 | 0.7929 | 0.6392 | 0.7115 | 0.7073 | 0.7621 | ### Framework versions - Transformers 4.45.2 - Pytorch 2.4.1+cu121 - Datasets 2.19.2 - Tokenizers 0.20.0
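The table above reports COCO-style detection metrics: mAP averaged over IoU thresholds 0.50:0.95, AP at IoU 0.50 and 0.75, size-stratified AP/AR, AR at 1/10/100 detections, and per-class AP/AR for the chicken, duck and plant classes. Purely as an illustration (not part of the original card, and with made-up boxes and label ids), this is roughly how such numbers can be computed with `torchmetrics`:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# COCO-style detection metric: mAP@[0.50:0.95], AP50, AP75, small/medium/large
# splits, AR@1/10/100, and per-class values when class_metrics=True.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# One dummy image with one prediction and one ground-truth box
# (labels 0/1/2 standing in for chicken/duck/plant).
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.91]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 115.0, 225.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"])
print(results["map_per_class"], results["mar_100_per_class"])
```

Treat this as a sketch rather than the card's actual evaluation code; the result key names can vary slightly across `torchmetrics` versions.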
[ "chicken", "duck", "plant" ]
joe611/chickens-150-epoch-1000-images
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-150-epoch-1000-images This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2206 - Map: 0.8755 - Map 50: 0.9863 - Map 75: 0.9647 - Map Small: 0.7847 - Map Medium: 0.8514 - Map Large: 0.913 - Mar 1: 0.2775 - Mar 10: 0.8909 - Mar 100: 0.9112 - Mar Small: 0.8111 - Mar Medium: 0.8936 - Mar Large: 0.9489 - Map Chicken: 0.8885 - Mar 100 Chicken: 0.9198 - Map Duck: 0.8801 - Mar 100 Duck: 0.9113 - Map Plant: 0.8578 - Mar 100 Plant: 0.9025 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 150 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:| | 1.4296 | 1.0 | 497 | 1.2938 | 0.2117 | 0.3 | 0.2428 | 0.0068 | 0.1615 | 0.6959 | 0.0719 | 0.3115 | 0.3548 | 0.0222 | 0.3329 | 0.8095 | 0.069 | 0.3186 | 0.0 | 0.0 | 0.5662 | 0.7459 | | 1.097 | 2.0 | 994 | 1.2351 | 0.3214 | 0.4717 | 0.3864 | 0.0119 | 0.2982 | 0.6584 | 0.1084 | 0.4079 | 0.4215 | 0.0111 | 0.3929 | 0.7338 | 0.3727 | 0.6065 | 0.0 | 0.0 | 0.5914 | 0.658 | | 0.8765 | 3.0 | 1491 | 0.8339 | 0.3861 | 0.5556 | 0.4625 | 0.1825 | 0.3645 | 0.7106 | 0.1193 | 0.4622 | 0.49 | 0.2444 | 0.4662 | 0.7777 | 0.5014 | 0.7474 | 0.0059 | 0.004 | 0.6511 | 0.7186 | | 0.7793 | 4.0 | 1988 | 0.7135 | 0.4536 | 0.6213 | 0.5595 | 0.1029 | 0.4369 | 0.7495 | 0.1331 | 0.4986 | 0.5176 | 0.3222 | 0.4968 | 0.8089 | 0.648 | 0.7814 | 0.0178 | 0.0179 | 0.695 | 0.7535 | | 0.7088 | 5.0 | 2485 | 0.6670 | 0.527 | 0.7739 | 0.6086 | 0.3287 | 0.5035 | 0.765 | 0.1664 | 0.5776 | 0.5948 | 0.463 | 0.5694 | 0.8223 | 0.6085 | 0.7251 | 0.2666 | 0.3007 | 0.706 | 0.7586 | | 0.6103 | 6.0 | 2982 | 0.5614 | 0.6368 | 0.8763 | 0.7839 | 0.4332 | 0.6138 | 0.7822 | 0.2067 | 0.682 | 0.7006 | 0.5074 | 0.6784 | 0.8318 | 0.687 | 0.7611 | 0.5042 | 0.5702 | 0.7192 | 0.7705 | | 0.6698 | 7.0 | 3479 | 0.5172 | 0.7023 | 0.9459 | 0.8568 | 0.4879 | 0.6833 | 0.7685 | 0.2365 | 0.75 | 0.7677 | 0.5222 | 0.7472 | 0.8357 | 0.7275 | 0.7899 | 0.6638 | 0.7351 | 0.7155 | 0.7781 | | 0.5203 | 8.0 | 3976 | 0.4650 | 0.7035 | 0.9674 | 0.854 | 0.4968 | 0.6796 | 0.8058 | 0.234 | 0.7519 | 0.7687 | 0.5593 | 0.747 | 0.8534 | 0.6979 | 0.7648 | 0.6669 | 0.745 | 0.7457 | 0.7963 | | 0.5365 | 9.0 | 4473 | 0.4477 | 0.7279 | 0.9651 | 0.8795 | 0.5507 | 0.6949 | 0.7979 | 0.2408 | 0.7645 | 0.7802 | 0.5852 | 0.7523 | 0.843 | 0.7435 | 0.7927 | 0.7124 | 0.7735 | 0.7279 | 0.7744 | | 0.4889 | 10.0 | 4970 | 0.4009 | 0.7319 | 0.9739 | 0.89 | 
0.4977 | 0.7089 | 0.8122 | 0.2401 | 0.7737 | 0.7885 | 0.5 | 0.7674 | 0.8656 | 0.7398 | 0.7887 | 0.7036 | 0.7722 | 0.7522 | 0.8047 | | 0.4643 | 11.0 | 5467 | 0.4179 | 0.7323 | 0.97 | 0.8815 | 0.4197 | 0.71 | 0.8031 | 0.2425 | 0.7734 | 0.7873 | 0.4407 | 0.7655 | 0.8515 | 0.7396 | 0.7935 | 0.7144 | 0.7775 | 0.743 | 0.7908 | | 0.4338 | 12.0 | 5964 | 0.3881 | 0.7467 | 0.9746 | 0.9038 | 0.4911 | 0.723 | 0.8134 | 0.2457 | 0.7873 | 0.8023 | 0.4963 | 0.779 | 0.8636 | 0.7562 | 0.8061 | 0.7318 | 0.802 | 0.7521 | 0.7988 | | 0.4975 | 13.0 | 6461 | 0.3863 | 0.7392 | 0.9759 | 0.9071 | 0.492 | 0.7212 | 0.8045 | 0.2436 | 0.7785 | 0.7948 | 0.6037 | 0.7773 | 0.8544 | 0.7233 | 0.7838 | 0.7426 | 0.7993 | 0.7518 | 0.8014 | | 0.4204 | 14.0 | 6958 | 0.3964 | 0.7501 | 0.9743 | 0.9108 | 0.4955 | 0.7265 | 0.8116 | 0.2481 | 0.7848 | 0.8006 | 0.5185 | 0.7785 | 0.857 | 0.7537 | 0.8085 | 0.7458 | 0.7954 | 0.7509 | 0.7979 | | 0.3908 | 15.0 | 7455 | 0.3705 | 0.7656 | 0.9758 | 0.9231 | 0.4787 | 0.7438 | 0.8173 | 0.2489 | 0.8011 | 0.817 | 0.5333 | 0.798 | 0.861 | 0.7789 | 0.8275 | 0.7536 | 0.8139 | 0.7643 | 0.8096 | | 0.4647 | 16.0 | 7952 | 0.4876 | 0.703 | 0.9518 | 0.8469 | 0.4959 | 0.6801 | 0.8005 | 0.2333 | 0.7424 | 0.7588 | 0.5296 | 0.7365 | 0.8459 | 0.6979 | 0.753 | 0.671 | 0.7377 | 0.7401 | 0.7857 | | 0.3955 | 17.0 | 8449 | 0.3684 | 0.7529 | 0.9755 | 0.9189 | 0.5456 | 0.7261 | 0.8323 | 0.2455 | 0.7847 | 0.8029 | 0.563 | 0.7788 | 0.8741 | 0.7656 | 0.813 | 0.7272 | 0.7854 | 0.7658 | 0.8104 | | 0.374 | 18.0 | 8946 | 0.3251 | 0.7842 | 0.9792 | 0.9469 | 0.5192 | 0.7597 | 0.8488 | 0.2526 | 0.8122 | 0.8324 | 0.5704 | 0.8122 | 0.8951 | 0.8014 | 0.8401 | 0.7628 | 0.8166 | 0.7885 | 0.8404 | | 0.3919 | 19.0 | 9443 | 0.3246 | 0.7695 | 0.9801 | 0.9274 | 0.5829 | 0.7471 | 0.8307 | 0.2513 | 0.8043 | 0.8236 | 0.7 | 0.8072 | 0.8787 | 0.7839 | 0.8352 | 0.7451 | 0.8013 | 0.7794 | 0.8342 | | 0.3258 | 20.0 | 9940 | 0.3678 | 0.7568 | 0.9765 | 0.9218 | 0.5451 | 0.7323 | 0.8296 | 0.245 | 0.791 | 0.8089 | 0.5741 | 0.7885 | 0.88 | 0.7644 | 0.813 | 0.7332 | 0.7901 | 0.7726 | 0.8238 | | 0.345 | 21.0 | 10437 | 0.3155 | 0.7869 | 0.9836 | 0.9327 | 0.5771 | 0.7597 | 0.8524 | 0.2557 | 0.817 | 0.8354 | 0.6222 | 0.8142 | 0.8967 | 0.811 | 0.8474 | 0.7566 | 0.8179 | 0.7929 | 0.8408 | | 0.3775 | 22.0 | 10934 | 0.3256 | 0.7804 | 0.9817 | 0.9292 | 0.5367 | 0.7616 | 0.8408 | 0.2528 | 0.8074 | 0.8277 | 0.5963 | 0.8109 | 0.8879 | 0.7902 | 0.834 | 0.7622 | 0.8119 | 0.7889 | 0.8373 | | 0.3266 | 23.0 | 11431 | 0.3161 | 0.791 | 0.9778 | 0.934 | 0.4913 | 0.767 | 0.8415 | 0.2568 | 0.823 | 0.8409 | 0.5296 | 0.8234 | 0.8905 | 0.8125 | 0.853 | 0.7758 | 0.8305 | 0.7846 | 0.8393 | | 0.3451 | 24.0 | 11928 | 0.3353 | 0.7617 | 0.9809 | 0.9165 | 0.5147 | 0.7361 | 0.8359 | 0.243 | 0.7971 | 0.8151 | 0.5259 | 0.794 | 0.8911 | 0.7954 | 0.8377 | 0.713 | 0.7742 | 0.7766 | 0.8336 | | 0.3157 | 25.0 | 12425 | 0.3157 | 0.8068 | 0.9761 | 0.9408 | 0.5057 | 0.7776 | 0.8537 | 0.261 | 0.8341 | 0.8516 | 0.5444 | 0.8306 | 0.8984 | 0.8333 | 0.8676 | 0.7959 | 0.8464 | 0.7911 | 0.8408 | | 0.3305 | 26.0 | 12922 | 0.3047 | 0.7949 | 0.9856 | 0.9436 | 0.5129 | 0.771 | 0.8555 | 0.2595 | 0.8242 | 0.8429 | 0.5259 | 0.8227 | 0.898 | 0.8026 | 0.8498 | 0.7864 | 0.8377 | 0.7959 | 0.8412 | | 0.3169 | 27.0 | 13419 | 0.3240 | 0.788 | 0.9812 | 0.9341 | 0.5832 | 0.7668 | 0.835 | 0.2552 | 0.8164 | 0.8357 | 0.6222 | 0.8194 | 0.8852 | 0.8002 | 0.8457 | 0.7793 | 0.8225 | 0.7843 | 0.8389 | | 0.3147 | 28.0 | 13916 | 0.2864 | 0.8171 | 0.9841 | 0.9487 | 0.5705 | 0.799 | 0.8541 | 0.2645 | 0.8427 | 0.8637 | 0.6296 | 
0.848 | 0.9036 | 0.8294 | 0.8725 | 0.8155 | 0.8609 | 0.8064 | 0.8578 | | 0.3031 | 29.0 | 14413 | 0.3105 | 0.8015 | 0.984 | 0.943 | 0.6201 | 0.7829 | 0.8466 | 0.2557 | 0.8303 | 0.8485 | 0.637 | 0.8316 | 0.8957 | 0.8321 | 0.8696 | 0.7737 | 0.8272 | 0.7986 | 0.8486 | | 0.3125 | 30.0 | 14910 | 0.2876 | 0.8172 | 0.9849 | 0.9524 | 0.6261 | 0.7956 | 0.8599 | 0.2614 | 0.8396 | 0.8588 | 0.6815 | 0.84 | 0.903 | 0.8489 | 0.8798 | 0.7961 | 0.8437 | 0.8066 | 0.8529 | | 0.2807 | 31.0 | 15407 | 0.2986 | 0.7807 | 0.9856 | 0.9329 | 0.5698 | 0.7637 | 0.8525 | 0.2527 | 0.8154 | 0.8356 | 0.6222 | 0.8198 | 0.9043 | 0.7957 | 0.8457 | 0.7416 | 0.804 | 0.8047 | 0.857 | | 0.3149 | 32.0 | 15904 | 0.2922 | 0.7853 | 0.9866 | 0.9404 | 0.6456 | 0.7663 | 0.8564 | 0.2519 | 0.8164 | 0.8368 | 0.6815 | 0.8195 | 0.9069 | 0.8093 | 0.8543 | 0.7393 | 0.7967 | 0.8072 | 0.8594 | | 0.3525 | 33.0 | 16401 | 0.2835 | 0.8239 | 0.9846 | 0.9479 | 0.6779 | 0.8052 | 0.863 | 0.2668 | 0.8438 | 0.8633 | 0.6889 | 0.8494 | 0.9072 | 0.8304 | 0.8664 | 0.8232 | 0.8589 | 0.818 | 0.8646 | | 0.3304 | 34.0 | 16898 | 0.2639 | 0.8231 | 0.9866 | 0.9401 | 0.6237 | 0.8069 | 0.8713 | 0.2624 | 0.8484 | 0.8692 | 0.6481 | 0.8543 | 0.9207 | 0.8348 | 0.8773 | 0.812 | 0.857 | 0.8224 | 0.8734 | | 0.3139 | 35.0 | 17395 | 0.2877 | 0.821 | 0.9836 | 0.9351 | 0.6336 | 0.8072 | 0.8445 | 0.2621 | 0.8463 | 0.8651 | 0.6556 | 0.8503 | 0.8997 | 0.8504 | 0.8858 | 0.8154 | 0.8556 | 0.7971 | 0.8539 | | 0.2777 | 36.0 | 17892 | 0.2773 | 0.8118 | 0.9843 | 0.9341 | 0.6721 | 0.7934 | 0.862 | 0.2619 | 0.8399 | 0.8605 | 0.7185 | 0.8444 | 0.9066 | 0.8356 | 0.8773 | 0.7883 | 0.8424 | 0.8113 | 0.8617 | | 0.279 | 37.0 | 18389 | 0.2657 | 0.8207 | 0.9873 | 0.9428 | 0.6517 | 0.7998 | 0.8775 | 0.2632 | 0.8457 | 0.8655 | 0.6852 | 0.8454 | 0.922 | 0.837 | 0.8773 | 0.8021 | 0.851 | 0.8231 | 0.8682 | | 0.3138 | 38.0 | 18886 | 0.2828 | 0.8075 | 0.9831 | 0.9424 | 0.5929 | 0.7918 | 0.8673 | 0.2557 | 0.8342 | 0.8538 | 0.6852 | 0.8384 | 0.9111 | 0.8197 | 0.8599 | 0.7818 | 0.8325 | 0.821 | 0.8689 | | 0.2836 | 39.0 | 19383 | 0.2714 | 0.819 | 0.9852 | 0.9512 | 0.6244 | 0.7991 | 0.8681 | 0.261 | 0.8434 | 0.8636 | 0.663 | 0.846 | 0.9138 | 0.8364 | 0.8749 | 0.8027 | 0.8503 | 0.8179 | 0.8656 | | 0.2865 | 40.0 | 19880 | 0.2703 | 0.8274 | 0.9848 | 0.9499 | 0.6799 | 0.8085 | 0.875 | 0.2633 | 0.8488 | 0.8698 | 0.7037 | 0.8544 | 0.9226 | 0.8425 | 0.8745 | 0.8118 | 0.8583 | 0.828 | 0.8766 | | 0.2896 | 41.0 | 20377 | 0.2730 | 0.8314 | 0.9866 | 0.9517 | 0.6703 | 0.8121 | 0.8662 | 0.2686 | 0.8551 | 0.8747 | 0.7259 | 0.8571 | 0.9118 | 0.8422 | 0.8802 | 0.8356 | 0.8815 | 0.8163 | 0.8625 | | 0.2856 | 42.0 | 20874 | 0.2779 | 0.817 | 0.9872 | 0.95 | 0.6066 | 0.8014 | 0.8686 | 0.2619 | 0.8414 | 0.8598 | 0.637 | 0.8425 | 0.9105 | 0.8347 | 0.8721 | 0.798 | 0.8477 | 0.8183 | 0.8598 | | 0.2687 | 43.0 | 21371 | 0.2878 | 0.8161 | 0.9833 | 0.9354 | 0.5987 | 0.794 | 0.8766 | 0.2616 | 0.8412 | 0.8602 | 0.637 | 0.8399 | 0.9184 | 0.8204 | 0.8611 | 0.8096 | 0.8583 | 0.8183 | 0.8611 | | 0.2705 | 44.0 | 21868 | 0.2773 | 0.8223 | 0.9865 | 0.9556 | 0.6594 | 0.7944 | 0.8865 | 0.2613 | 0.8451 | 0.8653 | 0.6667 | 0.8426 | 0.9272 | 0.8301 | 0.8709 | 0.8151 | 0.857 | 0.8217 | 0.8682 | | 0.2752 | 45.0 | 22365 | 0.2812 | 0.8222 | 0.9836 | 0.9532 | 0.7159 | 0.7973 | 0.8736 | 0.2654 | 0.8441 | 0.8637 | 0.7222 | 0.8445 | 0.9193 | 0.8343 | 0.8704 | 0.8137 | 0.8523 | 0.8186 | 0.8684 | | 0.303 | 46.0 | 22862 | 0.2877 | 0.8104 | 0.9843 | 0.9493 | 0.6484 | 0.7902 | 0.8585 | 0.2597 | 0.8376 | 0.8574 | 0.7037 | 0.8396 | 0.9082 | 0.8405 | 0.8737 | 0.7825 
| 0.8397 | 0.8081 | 0.8588 | | 0.2474 | 47.0 | 23359 | 0.2686 | 0.8106 | 0.9864 | 0.955 | 0.7045 | 0.7898 | 0.8751 | 0.2599 | 0.837 | 0.857 | 0.7148 | 0.8384 | 0.9236 | 0.832 | 0.866 | 0.7752 | 0.8311 | 0.8247 | 0.8738 | | 0.28 | 48.0 | 23856 | 0.2820 | 0.8212 | 0.9799 | 0.947 | 0.6201 | 0.7992 | 0.85 | 0.2668 | 0.8492 | 0.8686 | 0.6778 | 0.8501 | 0.9059 | 0.8486 | 0.885 | 0.8166 | 0.8642 | 0.7984 | 0.8564 | | 0.2537 | 49.0 | 24353 | 0.2863 | 0.827 | 0.9868 | 0.9457 | 0.7191 | 0.8072 | 0.8667 | 0.2664 | 0.8507 | 0.8695 | 0.7481 | 0.8509 | 0.9102 | 0.8446 | 0.8781 | 0.8205 | 0.8682 | 0.8158 | 0.8621 | | 0.283 | 50.0 | 24850 | 0.2646 | 0.8267 | 0.9852 | 0.9501 | 0.6957 | 0.8024 | 0.8854 | 0.2654 | 0.8488 | 0.8712 | 0.7296 | 0.851 | 0.9305 | 0.8414 | 0.8794 | 0.8097 | 0.8556 | 0.8292 | 0.8785 | | 0.2638 | 51.0 | 25347 | 0.2603 | 0.8305 | 0.986 | 0.9607 | 0.6926 | 0.8074 | 0.8768 | 0.2689 | 0.8541 | 0.8756 | 0.7407 | 0.8565 | 0.9226 | 0.8453 | 0.883 | 0.8232 | 0.8702 | 0.8229 | 0.8736 | | 0.2797 | 52.0 | 25844 | 0.2759 | 0.8154 | 0.982 | 0.9409 | 0.6262 | 0.7895 | 0.8707 | 0.2629 | 0.8404 | 0.8595 | 0.6444 | 0.8378 | 0.919 | 0.8232 | 0.8611 | 0.8138 | 0.857 | 0.8091 | 0.8604 | | 0.2555 | 53.0 | 26341 | 0.2707 | 0.8186 | 0.9874 | 0.9557 | 0.7187 | 0.7936 | 0.8731 | 0.2633 | 0.8396 | 0.8612 | 0.7593 | 0.8403 | 0.9197 | 0.8377 | 0.8648 | 0.8021 | 0.8523 | 0.8161 | 0.8666 | | 0.2599 | 54.0 | 26838 | 0.2860 | 0.8328 | 0.9826 | 0.9538 | 0.7138 | 0.8084 | 0.8797 | 0.2651 | 0.8554 | 0.8753 | 0.7333 | 0.8541 | 0.9269 | 0.8481 | 0.8842 | 0.8287 | 0.8695 | 0.8216 | 0.8721 | | 0.2764 | 55.0 | 27335 | 0.2851 | 0.8307 | 0.9854 | 0.9502 | 0.6945 | 0.8067 | 0.871 | 0.2648 | 0.8516 | 0.871 | 0.7481 | 0.8513 | 0.9167 | 0.8512 | 0.8822 | 0.8246 | 0.8662 | 0.8162 | 0.8645 | | 0.2482 | 56.0 | 27832 | 0.2743 | 0.827 | 0.9849 | 0.9531 | 0.7243 | 0.8057 | 0.8822 | 0.2644 | 0.8499 | 0.8697 | 0.7407 | 0.8505 | 0.9266 | 0.833 | 0.8664 | 0.8192 | 0.8662 | 0.8288 | 0.8764 | | 0.2518 | 57.0 | 28329 | 0.2665 | 0.8368 | 0.9852 | 0.9531 | 0.7424 | 0.8129 | 0.8819 | 0.2676 | 0.8604 | 0.8793 | 0.7741 | 0.8594 | 0.9226 | 0.8512 | 0.8883 | 0.832 | 0.8775 | 0.8273 | 0.8723 | | 0.2506 | 58.0 | 28826 | 0.2616 | 0.8275 | 0.9866 | 0.9573 | 0.7332 | 0.8064 | 0.8794 | 0.2646 | 0.8489 | 0.8693 | 0.7741 | 0.8533 | 0.923 | 0.8399 | 0.8717 | 0.8121 | 0.8563 | 0.8304 | 0.8799 | | 0.2477 | 59.0 | 29323 | 0.2728 | 0.8144 | 0.9871 | 0.9564 | 0.7406 | 0.7954 | 0.8644 | 0.259 | 0.8419 | 0.862 | 0.7704 | 0.844 | 0.9108 | 0.8279 | 0.8737 | 0.7971 | 0.8457 | 0.8182 | 0.8666 | | 0.2456 | 60.0 | 29820 | 0.2560 | 0.8395 | 0.9877 | 0.9524 | 0.7542 | 0.8183 | 0.8785 | 0.2673 | 0.8607 | 0.8791 | 0.7889 | 0.8601 | 0.9236 | 0.8563 | 0.8879 | 0.8332 | 0.8742 | 0.8289 | 0.8752 | | 0.2313 | 61.0 | 30317 | 0.2508 | 0.8373 | 0.9884 | 0.9592 | 0.7065 | 0.8171 | 0.8804 | 0.2671 | 0.8602 | 0.8803 | 0.7556 | 0.8621 | 0.9289 | 0.8531 | 0.8874 | 0.8283 | 0.8715 | 0.8306 | 0.882 | | 0.2436 | 62.0 | 30814 | 0.2384 | 0.8513 | 0.9875 | 0.9565 | 0.7027 | 0.8252 | 0.8928 | 0.2705 | 0.8687 | 0.8873 | 0.7481 | 0.868 | 0.9341 | 0.8634 | 0.8951 | 0.8491 | 0.8815 | 0.8415 | 0.8854 | | 0.2544 | 63.0 | 31311 | 0.2559 | 0.846 | 0.9867 | 0.9607 | 0.7734 | 0.8216 | 0.8831 | 0.2686 | 0.8646 | 0.8846 | 0.7963 | 0.8643 | 0.9262 | 0.8583 | 0.8964 | 0.8485 | 0.8801 | 0.8313 | 0.8773 | | 0.2368 | 64.0 | 31808 | 0.2500 | 0.837 | 0.9881 | 0.9532 | 0.7211 | 0.8143 | 0.876 | 0.2695 | 0.8577 | 0.877 | 0.7519 | 0.858 | 0.9226 | 0.8559 | 0.8915 | 0.8322 | 0.8662 | 0.8228 | 0.8732 | | 0.242 | 65.0 
| 32305 | 0.2462 | 0.8447 | 0.9824 | 0.9603 | 0.7452 | 0.8217 | 0.8837 | 0.2706 | 0.8649 | 0.8846 | 0.7815 | 0.865 | 0.9279 | 0.8586 | 0.8911 | 0.8459 | 0.8848 | 0.8296 | 0.8779 | | 0.2337 | 66.0 | 32802 | 0.2472 | 0.8522 | 0.9865 | 0.9642 | 0.6838 | 0.8326 | 0.8863 | 0.2713 | 0.8712 | 0.8911 | 0.7407 | 0.873 | 0.9279 | 0.8616 | 0.8935 | 0.8597 | 0.8974 | 0.8354 | 0.8824 | | 0.227 | 67.0 | 33299 | 0.2296 | 0.8499 | 0.9878 | 0.9683 | 0.775 | 0.8303 | 0.8844 | 0.2703 | 0.8716 | 0.8901 | 0.8185 | 0.8736 | 0.9266 | 0.8647 | 0.8988 | 0.8461 | 0.8861 | 0.8389 | 0.8855 | | 0.2456 | 68.0 | 33796 | 0.2465 | 0.8395 | 0.9877 | 0.9606 | 0.7729 | 0.8201 | 0.8807 | 0.2685 | 0.8621 | 0.8819 | 0.8074 | 0.8658 | 0.9256 | 0.8389 | 0.8794 | 0.8429 | 0.8828 | 0.8368 | 0.8836 | | 0.2178 | 69.0 | 34293 | 0.2571 | 0.8387 | 0.987 | 0.9563 | 0.7436 | 0.8174 | 0.89 | 0.2689 | 0.86 | 0.8797 | 0.7556 | 0.8613 | 0.9305 | 0.8455 | 0.8842 | 0.8348 | 0.8742 | 0.8357 | 0.8807 | | 0.227 | 70.0 | 34790 | 0.2518 | 0.8417 | 0.9859 | 0.9521 | 0.7424 | 0.8185 | 0.8821 | 0.2703 | 0.863 | 0.8838 | 0.7778 | 0.8658 | 0.9249 | 0.8624 | 0.8939 | 0.8327 | 0.8795 | 0.8299 | 0.8779 | | 0.2375 | 71.0 | 35287 | 0.2721 | 0.8418 | 0.9849 | 0.9579 | 0.7289 | 0.822 | 0.8758 | 0.2724 | 0.8621 | 0.8818 | 0.8111 | 0.8642 | 0.9167 | 0.8535 | 0.8874 | 0.8463 | 0.8848 | 0.8255 | 0.873 | | 0.2073 | 72.0 | 35784 | 0.2554 | 0.8385 | 0.9819 | 0.9567 | 0.7187 | 0.8173 | 0.8782 | 0.2712 | 0.8642 | 0.8828 | 0.763 | 0.8628 | 0.9216 | 0.8539 | 0.8939 | 0.8375 | 0.8828 | 0.8242 | 0.8717 | | 0.2305 | 73.0 | 36281 | 0.2584 | 0.8442 | 0.9849 | 0.9647 | 0.7547 | 0.8221 | 0.8856 | 0.2698 | 0.8616 | 0.8811 | 0.8037 | 0.8632 | 0.9256 | 0.8582 | 0.887 | 0.84 | 0.8742 | 0.8346 | 0.882 | | 0.2243 | 74.0 | 36778 | 0.2419 | 0.8471 | 0.9846 | 0.9556 | 0.7778 | 0.8243 | 0.8899 | 0.2725 | 0.8699 | 0.8893 | 0.8148 | 0.8708 | 0.9311 | 0.8617 | 0.9012 | 0.8414 | 0.8808 | 0.8381 | 0.8857 | | 0.227 | 75.0 | 37275 | 0.2311 | 0.861 | 0.9864 | 0.9575 | 0.7202 | 0.8386 | 0.8941 | 0.2759 | 0.8795 | 0.899 | 0.7407 | 0.8818 | 0.9334 | 0.8769 | 0.9057 | 0.8672 | 0.904 | 0.8388 | 0.8873 | | 0.2226 | 76.0 | 37772 | 0.2420 | 0.8568 | 0.9842 | 0.9558 | 0.7593 | 0.8367 | 0.8968 | 0.2761 | 0.8753 | 0.8948 | 0.8185 | 0.8782 | 0.9311 | 0.8707 | 0.8988 | 0.8561 | 0.8967 | 0.8436 | 0.8889 | | 0.2239 | 77.0 | 38269 | 0.2340 | 0.8483 | 0.988 | 0.9686 | 0.7653 | 0.8245 | 0.9028 | 0.2724 | 0.8682 | 0.8877 | 0.8296 | 0.8677 | 0.9403 | 0.8546 | 0.8879 | 0.8433 | 0.8828 | 0.8471 | 0.8924 | | 0.2188 | 78.0 | 38766 | 0.2398 | 0.8601 | 0.9842 | 0.9648 | 0.6889 | 0.8375 | 0.8977 | 0.2751 | 0.8794 | 0.8984 | 0.7333 | 0.8804 | 0.9351 | 0.8719 | 0.9045 | 0.866 | 0.904 | 0.8423 | 0.8867 | | 0.2114 | 79.0 | 39263 | 0.2431 | 0.8509 | 0.9857 | 0.9603 | 0.6987 | 0.8253 | 0.8979 | 0.2725 | 0.8698 | 0.889 | 0.7185 | 0.8691 | 0.9357 | 0.8592 | 0.8911 | 0.8555 | 0.8914 | 0.8378 | 0.8844 | | 0.2123 | 80.0 | 39760 | 0.2447 | 0.8598 | 0.9848 | 0.9637 | 0.6535 | 0.8392 | 0.8883 | 0.2746 | 0.8784 | 0.8988 | 0.6926 | 0.8814 | 0.9305 | 0.8734 | 0.9045 | 0.8751 | 0.9073 | 0.8309 | 0.8846 | | 0.2213 | 81.0 | 40257 | 0.2319 | 0.8665 | 0.9857 | 0.96 | 0.717 | 0.8465 | 0.8982 | 0.2787 | 0.8833 | 0.9036 | 0.7667 | 0.8875 | 0.9367 | 0.8749 | 0.9061 | 0.8784 | 0.9113 | 0.8462 | 0.8936 | | 0.2307 | 82.0 | 40754 | 0.2348 | 0.8536 | 0.987 | 0.96 | 0.7467 | 0.8337 | 0.8923 | 0.273 | 0.8717 | 0.8917 | 0.7815 | 0.8746 | 0.9348 | 0.8605 | 0.8883 | 0.8604 | 0.896 | 0.8398 | 0.8908 | | 0.2274 | 83.0 | 41251 | 0.2292 | 0.8573 | 0.9872 | 
0.9668 | 0.7512 | 0.8362 | 0.8966 | 0.2742 | 0.8763 | 0.8966 | 0.7667 | 0.8809 | 0.9334 | 0.8815 | 0.9142 | 0.8508 | 0.8868 | 0.8396 | 0.8889 | | 0.1966 | 84.0 | 41748 | 0.2411 | 0.8553 | 0.9844 | 0.9641 | 0.7674 | 0.8299 | 0.8978 | 0.272 | 0.874 | 0.8939 | 0.8074 | 0.8728 | 0.9348 | 0.8648 | 0.896 | 0.8626 | 0.9013 | 0.8384 | 0.8844 | | 0.2045 | 85.0 | 42245 | 0.2366 | 0.858 | 0.9859 | 0.9569 | 0.7941 | 0.8362 | 0.8868 | 0.2727 | 0.8785 | 0.8988 | 0.8444 | 0.8805 | 0.9289 | 0.8819 | 0.9113 | 0.8594 | 0.9 | 0.8327 | 0.885 | | 0.2067 | 86.0 | 42742 | 0.2460 | 0.8519 | 0.9862 | 0.9609 | 0.7314 | 0.8307 | 0.887 | 0.2704 | 0.8687 | 0.8876 | 0.7926 | 0.87 | 0.9262 | 0.8656 | 0.8955 | 0.8528 | 0.8848 | 0.8373 | 0.8826 | | 0.2104 | 87.0 | 43239 | 0.2477 | 0.8504 | 0.9854 | 0.9563 | 0.746 | 0.8321 | 0.8835 | 0.2699 | 0.869 | 0.8886 | 0.7741 | 0.8734 | 0.9246 | 0.8642 | 0.8931 | 0.8491 | 0.8887 | 0.8377 | 0.884 | | 0.2152 | 88.0 | 43736 | 0.2344 | 0.8651 | 0.9868 | 0.958 | 0.7668 | 0.8454 | 0.8888 | 0.2756 | 0.882 | 0.9016 | 0.8037 | 0.8859 | 0.9289 | 0.882 | 0.9105 | 0.8737 | 0.9053 | 0.8395 | 0.8889 | | 0.2177 | 89.0 | 44233 | 0.2322 | 0.8575 | 0.9849 | 0.9557 | 0.7543 | 0.8342 | 0.8972 | 0.273 | 0.8749 | 0.8945 | 0.8 | 0.8754 | 0.9377 | 0.8734 | 0.9004 | 0.8575 | 0.8921 | 0.8416 | 0.891 | | 0.2219 | 90.0 | 44730 | 0.2301 | 0.8646 | 0.9853 | 0.9625 | 0.745 | 0.8426 | 0.8921 | 0.2747 | 0.8827 | 0.9028 | 0.763 | 0.8862 | 0.9351 | 0.8795 | 0.9089 | 0.875 | 0.9093 | 0.8394 | 0.8902 | | 0.2127 | 91.0 | 45227 | 0.2331 | 0.8603 | 0.9868 | 0.9621 | 0.7677 | 0.8376 | 0.8954 | 0.2754 | 0.8772 | 0.8977 | 0.7889 | 0.8797 | 0.9357 | 0.8733 | 0.9036 | 0.866 | 0.9 | 0.8416 | 0.8895 | | 0.2101 | 92.0 | 45724 | 0.2309 | 0.8626 | 0.9868 | 0.9539 | 0.7802 | 0.8384 | 0.9057 | 0.2757 | 0.8802 | 0.9006 | 0.8259 | 0.8821 | 0.9426 | 0.8717 | 0.9016 | 0.8664 | 0.904 | 0.8498 | 0.8963 | | 0.2106 | 93.0 | 46221 | 0.2438 | 0.8628 | 0.9854 | 0.963 | 0.7465 | 0.8378 | 0.9023 | 0.2756 | 0.8793 | 0.8998 | 0.7815 | 0.8809 | 0.9397 | 0.8755 | 0.9061 | 0.867 | 0.902 | 0.8457 | 0.8914 | | 0.2181 | 94.0 | 46718 | 0.2464 | 0.8587 | 0.9859 | 0.9589 | 0.7119 | 0.8334 | 0.9053 | 0.2753 | 0.8766 | 0.8961 | 0.737 | 0.8774 | 0.9384 | 0.881 | 0.9097 | 0.853 | 0.8901 | 0.842 | 0.8885 | | 0.2053 | 95.0 | 47215 | 0.2526 | 0.8593 | 0.9848 | 0.9624 | 0.7102 | 0.8356 | 0.8947 | 0.2759 | 0.8771 | 0.8983 | 0.7481 | 0.882 | 0.9302 | 0.8716 | 0.904 | 0.8683 | 0.9033 | 0.838 | 0.8875 | | 0.2154 | 96.0 | 47712 | 0.2418 | 0.8593 | 0.9854 | 0.9623 | 0.7251 | 0.8362 | 0.9009 | 0.2757 | 0.8781 | 0.8984 | 0.7815 | 0.881 | 0.9397 | 0.8701 | 0.9012 | 0.8633 | 0.8993 | 0.8446 | 0.8945 | | 0.2141 | 97.0 | 48209 | 0.2390 | 0.8627 | 0.9856 | 0.9658 | 0.7413 | 0.8418 | 0.9045 | 0.2751 | 0.8804 | 0.9003 | 0.7593 | 0.8836 | 0.943 | 0.8707 | 0.9045 | 0.8645 | 0.8987 | 0.8529 | 0.8977 | | 0.2027 | 98.0 | 48706 | 0.2465 | 0.8573 | 0.9856 | 0.96 | 0.6934 | 0.8355 | 0.8974 | 0.2737 | 0.8745 | 0.8947 | 0.7185 | 0.8786 | 0.9364 | 0.8724 | 0.904 | 0.8548 | 0.8887 | 0.8448 | 0.8914 | | 0.2051 | 99.0 | 49203 | 0.2376 | 0.8632 | 0.9859 | 0.9616 | 0.7454 | 0.8403 | 0.8999 | 0.2759 | 0.8803 | 0.9004 | 0.7889 | 0.8826 | 0.9387 | 0.8805 | 0.9117 | 0.8634 | 0.8967 | 0.8457 | 0.8928 | | 0.2134 | 100.0 | 49700 | 0.2297 | 0.8605 | 0.9862 | 0.9603 | 0.7965 | 0.8384 | 0.9012 | 0.2747 | 0.8762 | 0.8964 | 0.8185 | 0.8796 | 0.9393 | 0.8716 | 0.9012 | 0.8605 | 0.8927 | 0.8492 | 0.8953 | | 0.2033 | 101.0 | 50197 | 0.2380 | 0.8613 | 0.987 | 0.9653 | 0.7704 | 0.8356 | 0.9071 | 0.2741 | 0.8777 
| 0.8968 | 0.7963 | 0.8779 | 0.9423 | 0.87 | 0.9016 | 0.8649 | 0.8954 | 0.849 | 0.8934 | | 0.196 | 102.0 | 50694 | 0.2312 | 0.8663 | 0.9867 | 0.9624 | 0.7646 | 0.8448 | 0.9052 | 0.2763 | 0.8829 | 0.9023 | 0.7963 | 0.8845 | 0.942 | 0.8796 | 0.9097 | 0.8703 | 0.902 | 0.8491 | 0.8951 | | 0.2097 | 103.0 | 51191 | 0.2271 | 0.8659 | 0.9862 | 0.9663 | 0.7552 | 0.8426 | 0.9066 | 0.2768 | 0.8824 | 0.9026 | 0.7741 | 0.8856 | 0.9403 | 0.8812 | 0.915 | 0.8669 | 0.8993 | 0.8497 | 0.8936 | | 0.1846 | 104.0 | 51688 | 0.2234 | 0.8703 | 0.9874 | 0.9612 | 0.7354 | 0.8483 | 0.9116 | 0.2771 | 0.8863 | 0.9062 | 0.7481 | 0.89 | 0.9459 | 0.8855 | 0.9158 | 0.8701 | 0.902 | 0.8553 | 0.9008 | | 0.1967 | 105.0 | 52185 | 0.2242 | 0.87 | 0.9863 | 0.9599 | 0.7643 | 0.8473 | 0.91 | 0.278 | 0.8879 | 0.9079 | 0.7852 | 0.892 | 0.9452 | 0.8863 | 0.9166 | 0.8695 | 0.9066 | 0.8543 | 0.9006 | | 0.1978 | 106.0 | 52682 | 0.2376 | 0.8643 | 0.987 | 0.9726 | 0.7906 | 0.8423 | 0.9016 | 0.2755 | 0.8792 | 0.8997 | 0.8111 | 0.8829 | 0.94 | 0.8726 | 0.902 | 0.8711 | 0.9013 | 0.8491 | 0.8957 | | 0.1898 | 107.0 | 53179 | 0.2256 | 0.8742 | 0.9876 | 0.9695 | 0.801 | 0.8524 | 0.9078 | 0.2776 | 0.8883 | 0.909 | 0.8296 | 0.893 | 0.9426 | 0.8872 | 0.9166 | 0.8796 | 0.9093 | 0.8558 | 0.9012 | | 0.1988 | 108.0 | 53676 | 0.2422 | 0.8666 | 0.984 | 0.9653 | 0.7536 | 0.8433 | 0.9054 | 0.2751 | 0.8804 | 0.9011 | 0.7852 | 0.8832 | 0.9413 | 0.8857 | 0.9138 | 0.868 | 0.896 | 0.8462 | 0.8934 | | 0.1866 | 109.0 | 54173 | 0.2213 | 0.8716 | 0.9866 | 0.9659 | 0.8065 | 0.8483 | 0.9131 | 0.278 | 0.8884 | 0.909 | 0.8259 | 0.8914 | 0.9452 | 0.8797 | 0.913 | 0.8792 | 0.9139 | 0.856 | 0.9 | | 0.1966 | 110.0 | 54670 | 0.2334 | 0.8698 | 0.9852 | 0.9698 | 0.7817 | 0.8469 | 0.9075 | 0.2783 | 0.8849 | 0.9049 | 0.8111 | 0.8875 | 0.942 | 0.8802 | 0.9121 | 0.8769 | 0.9066 | 0.8525 | 0.8959 | | 0.1906 | 111.0 | 55167 | 0.2242 | 0.8727 | 0.9863 | 0.9648 | 0.7727 | 0.8493 | 0.9145 | 0.2792 | 0.8877 | 0.9078 | 0.8074 | 0.8903 | 0.9485 | 0.8875 | 0.9174 | 0.8728 | 0.904 | 0.8577 | 0.9021 | | 0.1843 | 112.0 | 55664 | 0.2275 | 0.8684 | 0.986 | 0.9677 | 0.7769 | 0.8463 | 0.91 | 0.277 | 0.8854 | 0.9059 | 0.7926 | 0.8899 | 0.9443 | 0.8844 | 0.9162 | 0.8659 | 0.9013 | 0.855 | 0.9002 | | 0.1876 | 113.0 | 56161 | 0.2142 | 0.8796 | 0.9871 | 0.9656 | 0.7797 | 0.8564 | 0.9123 | 0.2799 | 0.8939 | 0.9142 | 0.8037 | 0.8973 | 0.9479 | 0.8987 | 0.9271 | 0.8815 | 0.9126 | 0.8584 | 0.9029 | | 0.1832 | 114.0 | 56658 | 0.2159 | 0.8763 | 0.9871 | 0.9705 | 0.7773 | 0.8549 | 0.9091 | 0.2786 | 0.8912 | 0.9123 | 0.7963 | 0.896 | 0.9475 | 0.8927 | 0.9239 | 0.8792 | 0.9099 | 0.857 | 0.9031 | | 0.1821 | 115.0 | 57155 | 0.2225 | 0.8725 | 0.987 | 0.9691 | 0.7763 | 0.8498 | 0.9113 | 0.2768 | 0.8884 | 0.9081 | 0.8 | 0.8914 | 0.9475 | 0.8884 | 0.919 | 0.8713 | 0.9026 | 0.8577 | 0.9027 | | 0.2108 | 116.0 | 57652 | 0.2359 | 0.8658 | 0.9851 | 0.959 | 0.7265 | 0.8431 | 0.9096 | 0.2741 | 0.882 | 0.9018 | 0.7481 | 0.8847 | 0.9462 | 0.8787 | 0.9117 | 0.865 | 0.8947 | 0.8537 | 0.899 | | 0.1785 | 117.0 | 58149 | 0.2293 | 0.8667 | 0.9862 | 0.9612 | 0.7804 | 0.8432 | 0.91 | 0.2757 | 0.884 | 0.9035 | 0.7963 | 0.8856 | 0.9456 | 0.8838 | 0.9154 | 0.8626 | 0.8974 | 0.8537 | 0.8979 | | 0.195 | 118.0 | 58646 | 0.2312 | 0.8754 | 0.9859 | 0.9685 | 0.7779 | 0.8501 | 0.9095 | 0.2786 | 0.8916 | 0.9119 | 0.8037 | 0.8938 | 0.9449 | 0.8936 | 0.9239 | 0.8813 | 0.9146 | 0.8512 | 0.8973 | | 0.1827 | 119.0 | 59143 | 0.2323 | 0.8723 | 0.9858 | 0.9654 | 0.7506 | 0.8498 | 0.9108 | 0.2762 | 0.8859 | 0.9069 | 0.7889 | 0.8897 | 0.9456 | 0.8858 
| 0.915 | 0.8768 | 0.9066 | 0.8543 | 0.899 | | 0.1918 | 120.0 | 59640 | 0.2302 | 0.8733 | 0.9858 | 0.965 | 0.7674 | 0.85 | 0.9129 | 0.2776 | 0.8879 | 0.9088 | 0.7926 | 0.8915 | 0.9466 | 0.888 | 0.9178 | 0.8748 | 0.9086 | 0.8571 | 0.9 | | 0.1748 | 121.0 | 60137 | 0.2332 | 0.8711 | 0.9855 | 0.9656 | 0.7674 | 0.8481 | 0.912 | 0.2772 | 0.8861 | 0.9064 | 0.7889 | 0.8889 | 0.9456 | 0.8856 | 0.9154 | 0.8721 | 0.9053 | 0.8555 | 0.8984 | | 0.1748 | 122.0 | 60634 | 0.2304 | 0.8691 | 0.9858 | 0.9651 | 0.7692 | 0.8448 | 0.9137 | 0.2773 | 0.8841 | 0.905 | 0.8 | 0.8872 | 0.9466 | 0.8793 | 0.9077 | 0.8746 | 0.9079 | 0.8534 | 0.8992 | | 0.1918 | 123.0 | 61131 | 0.2226 | 0.8749 | 0.9856 | 0.9652 | 0.773 | 0.85 | 0.9172 | 0.2787 | 0.8911 | 0.9114 | 0.7963 | 0.8945 | 0.9495 | 0.8862 | 0.9162 | 0.8786 | 0.9146 | 0.8598 | 0.9033 | | 0.1828 | 124.0 | 61628 | 0.2262 | 0.8685 | 0.9858 | 0.9647 | 0.7576 | 0.8439 | 0.9145 | 0.2758 | 0.8847 | 0.9057 | 0.7815 | 0.8886 | 0.9485 | 0.8822 | 0.9126 | 0.8677 | 0.9033 | 0.8554 | 0.9014 | | 0.1836 | 125.0 | 62125 | 0.2359 | 0.8703 | 0.9849 | 0.9592 | 0.781 | 0.8463 | 0.9128 | 0.2773 | 0.8851 | 0.9064 | 0.8074 | 0.8892 | 0.9472 | 0.8831 | 0.9134 | 0.8715 | 0.904 | 0.8563 | 0.9018 | | 0.1815 | 126.0 | 62622 | 0.2266 | 0.8733 | 0.9857 | 0.9615 | 0.776 | 0.8481 | 0.9162 | 0.2775 | 0.8893 | 0.91 | 0.7963 | 0.8921 | 0.9502 | 0.8887 | 0.9206 | 0.8741 | 0.9073 | 0.8571 | 0.9021 | | 0.1777 | 127.0 | 63119 | 0.2339 | 0.8736 | 0.9854 | 0.9648 | 0.773 | 0.8496 | 0.9157 | 0.2768 | 0.8888 | 0.9099 | 0.7963 | 0.8928 | 0.9489 | 0.882 | 0.9121 | 0.8824 | 0.9152 | 0.8565 | 0.9023 | | 0.1843 | 128.0 | 63616 | 0.2300 | 0.8739 | 0.9857 | 0.9649 | 0.7729 | 0.8508 | 0.9173 | 0.2776 | 0.8899 | 0.9106 | 0.7963 | 0.8938 | 0.9489 | 0.8837 | 0.9162 | 0.8796 | 0.9126 | 0.8586 | 0.9029 | | 0.1856 | 129.0 | 64113 | 0.2303 | 0.8731 | 0.9858 | 0.9648 | 0.7849 | 0.8495 | 0.9124 | 0.2767 | 0.8887 | 0.9095 | 0.8 | 0.8925 | 0.9482 | 0.8863 | 0.9174 | 0.8767 | 0.9086 | 0.8564 | 0.9023 | | 0.195 | 130.0 | 64610 | 0.2183 | 0.8759 | 0.9863 | 0.9648 | 0.7913 | 0.8521 | 0.9156 | 0.2776 | 0.8908 | 0.9114 | 0.8074 | 0.894 | 0.9498 | 0.8909 | 0.9219 | 0.8776 | 0.9086 | 0.8591 | 0.9037 | | 0.1903 | 131.0 | 65107 | 0.2282 | 0.8727 | 0.9862 | 0.9647 | 0.7858 | 0.8487 | 0.9151 | 0.2761 | 0.8877 | 0.9088 | 0.8037 | 0.8914 | 0.9482 | 0.8888 | 0.919 | 0.8728 | 0.9053 | 0.8566 | 0.902 | | 0.1913 | 132.0 | 65604 | 0.2240 | 0.8735 | 0.9861 | 0.9645 | 0.781 | 0.8507 | 0.9137 | 0.2769 | 0.8873 | 0.9083 | 0.8037 | 0.8915 | 0.9475 | 0.8878 | 0.9166 | 0.8749 | 0.906 | 0.8578 | 0.9023 | | 0.1827 | 133.0 | 66101 | 0.2221 | 0.873 | 0.9862 | 0.9645 | 0.7914 | 0.849 | 0.9163 | 0.2775 | 0.8893 | 0.9098 | 0.8074 | 0.8924 | 0.9492 | 0.8879 | 0.9182 | 0.8737 | 0.9079 | 0.8575 | 0.9031 | | 0.189 | 134.0 | 66598 | 0.2176 | 0.8765 | 0.9864 | 0.9649 | 0.7873 | 0.8542 | 0.9145 | 0.2775 | 0.8917 | 0.9126 | 0.8074 | 0.8962 | 0.9485 | 0.8898 | 0.9215 | 0.8803 | 0.9119 | 0.8595 | 0.9043 | | 0.1855 | 135.0 | 67095 | 0.2198 | 0.877 | 0.9871 | 0.9648 | 0.7846 | 0.8534 | 0.9149 | 0.2795 | 0.8913 | 0.9123 | 0.8074 | 0.8958 | 0.9479 | 0.8878 | 0.9174 | 0.883 | 0.9159 | 0.8601 | 0.9035 | | 0.1741 | 136.0 | 67592 | 0.2204 | 0.8755 | 0.9864 | 0.9648 | 0.7841 | 0.8509 | 0.9137 | 0.278 | 0.8904 | 0.9112 | 0.8074 | 0.8932 | 0.9489 | 0.8894 | 0.9202 | 0.8807 | 0.9119 | 0.8563 | 0.9016 | | 0.182 | 137.0 | 68089 | 0.2199 | 0.8782 | 0.9863 | 0.9648 | 0.7841 | 0.8544 | 0.9142 | 0.2787 | 0.893 | 0.914 | 0.8074 | 0.8965 | 0.9485 | 0.8942 | 0.9239 | 0.8837 | 0.9159 | 
0.8568 | 0.9021 | | 0.1886 | 138.0 | 68586 | 0.2184 | 0.8758 | 0.9864 | 0.9648 | 0.781 | 0.8528 | 0.9145 | 0.2774 | 0.8915 | 0.9122 | 0.8037 | 0.8951 | 0.9502 | 0.8902 | 0.9206 | 0.88 | 0.9119 | 0.8571 | 0.9041 | | 0.1862 | 139.0 | 69083 | 0.2182 | 0.877 | 0.9863 | 0.9648 | 0.7847 | 0.8528 | 0.9149 | 0.2774 | 0.8919 | 0.9127 | 0.8074 | 0.8948 | 0.9498 | 0.892 | 0.9215 | 0.8811 | 0.9139 | 0.8579 | 0.9027 | | 0.1815 | 140.0 | 69580 | 0.2195 | 0.8751 | 0.9864 | 0.9648 | 0.7847 | 0.8516 | 0.9119 | 0.2771 | 0.8902 | 0.9112 | 0.8074 | 0.8937 | 0.9489 | 0.8907 | 0.9211 | 0.8787 | 0.9099 | 0.8558 | 0.9025 | | 0.1856 | 141.0 | 70077 | 0.2212 | 0.8749 | 0.9862 | 0.9647 | 0.7841 | 0.8508 | 0.9137 | 0.2774 | 0.8901 | 0.9112 | 0.8074 | 0.8937 | 0.9492 | 0.888 | 0.919 | 0.8803 | 0.9119 | 0.8565 | 0.9027 | | 0.1912 | 142.0 | 70574 | 0.2191 | 0.8752 | 0.9863 | 0.9649 | 0.7847 | 0.8516 | 0.9143 | 0.2771 | 0.8906 | 0.9116 | 0.8074 | 0.8943 | 0.9495 | 0.8892 | 0.9206 | 0.879 | 0.9106 | 0.8574 | 0.9035 | | 0.1838 | 143.0 | 71071 | 0.2183 | 0.8759 | 0.9863 | 0.9648 | 0.7847 | 0.8522 | 0.915 | 0.2773 | 0.8913 | 0.9119 | 0.8074 | 0.8943 | 0.9502 | 0.8896 | 0.9206 | 0.8798 | 0.9113 | 0.8582 | 0.9037 | | 0.1982 | 144.0 | 71568 | 0.2195 | 0.8752 | 0.9862 | 0.9647 | 0.7847 | 0.8509 | 0.9152 | 0.2774 | 0.8905 | 0.9109 | 0.8111 | 0.8931 | 0.9495 | 0.8878 | 0.9186 | 0.8796 | 0.9113 | 0.8583 | 0.9029 | | 0.1732 | 145.0 | 72065 | 0.2198 | 0.8755 | 0.9863 | 0.9648 | 0.7847 | 0.8516 | 0.915 | 0.2773 | 0.8906 | 0.9114 | 0.8111 | 0.8935 | 0.9502 | 0.888 | 0.9202 | 0.8801 | 0.9106 | 0.8585 | 0.9033 | | 0.1908 | 146.0 | 72562 | 0.2208 | 0.875 | 0.9862 | 0.9647 | 0.7847 | 0.8507 | 0.9127 | 0.2775 | 0.8906 | 0.9109 | 0.8111 | 0.893 | 0.9492 | 0.8885 | 0.9194 | 0.8787 | 0.9106 | 0.8576 | 0.9025 | | 0.1781 | 147.0 | 73059 | 0.2207 | 0.8752 | 0.9863 | 0.9647 | 0.7847 | 0.8512 | 0.913 | 0.2775 | 0.8906 | 0.9109 | 0.8111 | 0.8933 | 0.9489 | 0.8878 | 0.919 | 0.8801 | 0.9113 | 0.8578 | 0.9025 | | 0.1829 | 148.0 | 73556 | 0.2207 | 0.8755 | 0.9863 | 0.9647 | 0.7847 | 0.8514 | 0.913 | 0.2775 | 0.8909 | 0.9112 | 0.8111 | 0.8936 | 0.9489 | 0.8885 | 0.9198 | 0.8801 | 0.9113 | 0.8578 | 0.9025 | | 0.1788 | 149.0 | 74053 | 0.2206 | 0.8755 | 0.9863 | 0.9647 | 0.7847 | 0.8514 | 0.913 | 0.2775 | 0.8909 | 0.9112 | 0.8111 | 0.8936 | 0.9489 | 0.8885 | 0.9198 | 0.8801 | 0.9113 | 0.8578 | 0.9025 | | 0.178 | 150.0 | 74550 | 0.2206 | 0.8755 | 0.9863 | 0.9647 | 0.7847 | 0.8514 | 0.913 | 0.2775 | 0.8909 | 0.9112 | 0.8111 | 0.8936 | 0.9489 | 0.8885 | 0.9198 | 0.8801 | 0.9113 | 0.8578 | 0.9025 | ### Framework versions - Transformers 4.45.2 - Pytorch 2.4.1+cu121 - Datasets 2.19.2 - Tokenizers 0.20.1
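The card leaves its usage sections empty, so the following is only a suggested inference sketch (not from the original author). It assumes the checkpoint loads with the standard `transformers` object-detection classes, which is the usual case for DETR fine-tunes such as this one; the image path is hypothetical.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "joe611/chickens-150-epoch-1000-images"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("flock.jpg")  # hypothetical local image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Turn raw logits/boxes into scored detections above a confidence threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {[round(v, 1) for v in box.tolist()]}")
```

If the checkpoint's config is intact, `id2label` should resolve to the chicken/duck/plant classes listed for this model.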
[ "chicken", "duck", "plant" ]
0llheaven/CON-DETR-V8
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
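The "How to Get Started with the Model" section above is an unfilled placeholder. Purely as a hedged illustration (the DETR-family architecture and object-detection task are assumptions based on the repository name and the pneumonia/normal label set, not statements from the author), a minimal pipeline call might look like this:

```python
from transformers import pipeline

# Assumption: the checkpoint is a DETR-family object detector usable via the
# generic object-detection pipeline; adjust if the config says otherwise.
detector = pipeline("object-detection", model="0llheaven/CON-DETR-V8")

# "cxr.png" is a hypothetical chest X-ray image path.
for det in detector("cxr.png", threshold=0.5):
    print(det["label"], round(det["score"], 3), det["box"])
```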
[ "pneumonia", "normal", "pneumonia_bacteria", "pneumonia_virus" ]
b09501048/detr
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/djengo890-national-taiwan-university/CVDPL_HW1_DETR/runs/gdqqmrof) [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/djengo890-national-taiwan-university/CVDPL_HW1_DETR/runs/gdqqmrof) # detr This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.1588 - Map: 0.0662 - Map 50: 0.1137 - Map 75: 0.0667 - Map Small: 0.0 - Map Medium: 0.0013 - Map Large: 0.0706 - Mar 1: 0.0705 - Mar 10: 0.1303 - Mar 100: 0.1374 - Mar Small: 0.0 - Mar Medium: 0.009 - Mar Large: 0.1536 - Map Person: 0.5545 - Mar 100 Person: 0.6988 - Map Ear: 0.0068 - Mar 100 Ear: 0.1599 - Map Earmuffs: 0.0 - Mar 100 Earmuffs: 0.0 - Map Face: 0.1494 - Mar 100 Face: 0.3924 - Map Face-guard: 0.0 - Mar 100 Face-guard: 0.0 - Map Face-mask-medical: 0.0 - Mar 100 Face-mask-medical: 0.0 - Map Foot: 0.0 - Mar 100 Foot: 0.0 - Map Tools: 0.0012 - Mar 100 Tools: 0.0722 - Map Glasses: 0.0 - Mar 100 Glasses: 0.0 - Map Gloves: 0.0 - Mar 100 Gloves: 0.0 - Map Helmet: 0.0 - Mar 100 Helmet: 0.0 - Map Hands: 0.1315 - Mar 100 Hands: 0.4111 - Map Head: 0.274 - Mar 100 Head: 0.5671 - Map Medical-suit: 0.0 - Mar 100 Medical-suit: 0.0 - Map Shoes: 0.0073 - Mar 100 Shoes: 0.035 - Map Safety-suit: 0.0 - Mar 100 Safety-suit: 0.0 - Map Safety-vest: 0.0 - Mar 100 Safety-vest: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Person | Mar 100 Person | Map Ear | Mar 100 Ear | Map Earmuffs | Mar 100 Earmuffs | Map Face | Mar 100 Face | Map Face-guard | Mar 100 Face-guard | Map Face-mask-medical | Mar 100 Face-mask-medical | Map Foot | Mar 100 Foot | Map Tools | Mar 100 Tools | Map Glasses | Mar 100 Glasses | Map Gloves | Mar 100 Gloves | Map Helmet | Mar 100 Helmet | Map Hands | Mar 100 Hands | Map Head | Mar 100 Head | Map Medical-suit | Mar 100 Medical-suit | Map Shoes | Mar 100 Shoes | Map Safety-suit | Mar 100 Safety-suit | Map Safety-vest | Mar 100 Safety-vest | 
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:-------:|:-----------:|:------------:|:----------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:---------:|:-------------:|:-----------:|:---------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:----------------:|:--------------------:|:---------:|:-------------:|:---------------:|:-------------------:|:---------------:|:-------------------:| | No log | 0.9913 | 57 | 2.9995 | 0.0253 | 0.0391 | 0.0271 | 0.0 | 0.0001 | 0.0263 | 0.0319 | 0.0569 | 0.0684 | 0.0 | 0.0005 | 0.0725 | 0.3757 | 0.6561 | 0.0002 | 0.0046 | 0.0 | 0.0 | 0.0006 | 0.0042 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0211 | 0.3065 | 0.0333 | 0.1919 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | No log | 2.0 | 115 | 2.6946 | 0.0312 | 0.0518 | 0.0323 | 0.0 | 0.0002 | 0.0327 | 0.0411 | 0.0775 | 0.0874 | 0.0 | 0.003 | 0.094 | 0.42 | 0.6762 | 0.0007 | 0.0246 | 0.0 | 0.0 | 0.0053 | 0.0515 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0373 | 0.3427 | 0.0669 | 0.3916 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | No log | 2.9913 | 172 | 2.5527 | 0.0325 | 0.0595 | 0.0308 | 0.0 | 0.002 | 0.0343 | 0.0441 | 0.0853 | 0.0961 | 0.0 | 0.0057 | 0.1058 | 0.3649 | 0.652 | 0.0021 | 0.0798 | 0.0 | 0.0 | 0.0142 | 0.0748 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0022 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0508 | 0.3384 | 0.1178 | 0.482 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | No log | 4.0 | 230 | 2.4637 | 0.0425 | 0.0788 | 0.0396 | 0.0 | 0.0006 | 0.0449 | 0.0503 | 0.0968 | 0.1059 | 0.0 | 0.0049 | 0.1168 | 0.451 | 0.6564 | 0.0018 | 0.0867 | 0.0 | 0.0 | 0.0548 | 0.1764 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0017 | 0.0225 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0725 | 0.3554 | 0.1402 | 0.5027 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | No log | 4.9913 | 287 | 2.3455 | 0.0527 | 0.0929 | 0.052 | 0.0 | 0.0006 | 0.0561 | 0.0625 | 0.1154 | 0.1234 | 0.0 | 0.0042 | 0.1375 | 0.4924 | 0.6771 | 0.0068 | 0.1346 | 0.0 | 0.0 | 0.1061 | 0.3505 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.0278 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0916 | 0.3802 | 0.1918 | 0.5232 | 0.0 | 0.0 | 0.0058 | 0.0047 | 0.0 | 0.0 | 0.0 | 0.0 | | No log | 6.0 | 345 | 2.2859 | 0.0538 | 0.0938 | 0.0534 | 0.0 | 0.0011 | 0.0572 | 0.0624 | 0.1175 | 0.1268 | 0.0 | 0.0051 | 0.1412 | 0.5106 | 0.6945 | 0.0037 | 0.1284 | 0.0 | 0.0 | 0.1108 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0503 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1048 | 0.4016 | 0.1793 | 0.5394 | 0.0 | 0.0 | 0.0055 | 0.0088 | 0.0 | 0.0 | 0.0 | 0.0 | | No log | 6.9913 | 402 | 2.2126 | 0.0609 | 0.1056 | 0.0596 | 0.0 | 0.0019 | 0.065 | 0.0681 | 0.1257 | 0.1319 | 0.0 | 0.0086 | 0.1469 | 0.5288 | 0.6915 | 0.0064 | 0.1412 | 0.0 | 0.0 | 0.1366 | 0.3886 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0517 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1204 | 0.3905 | 0.2371 | 0.5637 | 0.0 | 0.0 | 0.0055 | 0.0154 | 0.0 | 0.0 | 0.0 | 0.0 | | No log | 8.0 | 460 | 2.1794 | 0.0641 | 0.11 | 0.0643 | 0.0 | 0.0015 | 0.0685 | 0.0699 | 0.1294 | 0.1372 | 0.0 | 0.0089 | 0.1533 | 0.5525 | 0.6964 | 0.0064 | 0.1599 | 0.0 | 0.0 | 0.1375 | 
0.3892 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 | 0.0775 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1273 | 0.4089 | 0.2586 | 0.567 | 0.0 | 0.0 | 0.007 | 0.0333 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.8481 | 8.9913 | 517 | 2.1642 | 0.0658 | 0.113 | 0.0661 | 0.0 | 0.0016 | 0.0703 | 0.0702 | 0.1296 | 0.1369 | 0.0 | 0.0091 | 0.1532 | 0.5529 | 0.6969 | 0.0067 | 0.1606 | 0.0 | 0.0 | 0.1475 | 0.3875 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 | 0.0725 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.13 | 0.4102 | 0.2727 | 0.5659 | 0.0 | 0.0 | 0.0073 | 0.0344 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.8481 | 9.9130 | 570 | 2.1588 | 0.0662 | 0.1137 | 0.0667 | 0.0 | 0.0013 | 0.0706 | 0.0705 | 0.1303 | 0.1374 | 0.0 | 0.009 | 0.1536 | 0.5545 | 0.6988 | 0.0068 | 0.1599 | 0.0 | 0.0 | 0.1494 | 0.3924 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0012 | 0.0722 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1315 | 0.4111 | 0.274 | 0.5671 | 0.0 | 0.0 | 0.0073 | 0.035 | 0.0 | 0.0 | 0.0 | 0.0 | ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.1+cu121 - Datasets 3.0.1 - Tokenizers 0.19.1
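For readers who want to reproduce a comparable run, the hyperparameters listed above map fairly directly onto `transformers.TrainingArguments`. The sketch below is an illustration rather than the author's actual training script; the output directory is hypothetical, and dataset loading, the image processor and the data collator are omitted.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed in this card; with a per-device batch size
# of 32 and gradient_accumulation_steps=2 the effective batch size is 64.
training_args = TrainingArguments(
    output_dir="detr-finetune",   # hypothetical output path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=2,
    num_train_epochs=10,
    lr_scheduler_type="cosine",
    seed=42,
    fp16=True,                    # "Native AMP" mixed precision
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    remove_unused_columns=False,  # typically required for detection collators
)
```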
[ "person", "ear", "earmuffs", "face", "face-guard", "face-mask-medical", "foot", "tools", "glasses", "gloves", "helmet", "hands", "head", "medical-suit", "shoes", "safety-suit", "safety-vest" ]
doktor47/zinemind_msft_v7
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
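This card's usage sections are also unfilled. As a speculative sketch only (the "table" / "table rotated" label set suggests a document table detector, and compatibility with the generic object-detection pipeline is an assumption), detected tables could be cropped from a page image like this:

```python
from PIL import Image
from transformers import pipeline

# Assumption: the checkpoint behaves like a DETR-family table detector.
detector = pipeline("object-detection", model="doktor47/zinemind_msft_v7")

page = Image.open("scanned_page.png")  # hypothetical document page image
for i, det in enumerate(detector(page, threshold=0.7)):
    box = det["box"]  # pixel coordinates: xmin/ymin/xmax/ymax
    crop = page.crop((box["xmin"], box["ymin"], box["xmax"], box["ymax"]))
    crop.save(f"table_{i}_{det['label'].replace(' ', '_')}.png")
    print(det["label"], round(det["score"], 3), box)
```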
[ "table", "table rotated" ]
doktor47/zinemind_msft_v8
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
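The zinemind_msft_v8 card above is the untouched auto-generated template, so the only concrete information lives in the checkpoint itself. A minimal sketch for inspecting what the repo actually ships, assuming it hosts a standard `transformers` config (the repo id is taken from the entry above; nothing else is confirmed by the card):

```python
from transformers import AutoConfig

# Pull only the config; this reveals the architecture family and the label map
# without downloading the full weights.
config = AutoConfig.from_pretrained("doktor47/zinemind_msft_v8")
print(config.model_type)  # whatever architecture the repo declares
print(config.id2label)    # should line up with the label list recorded below
```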
[ "table", "table rotated" ]
gargeya2003/detr-layers-updated
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5" ]
gargeya2003/detr-pretrained
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5" ]
0llheaven/CON-DETR-V9
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
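The CON-DETR-V9 card is likewise the blank template, but the label list recorded below (pneumonia, normal, pneumonia_bacteria, pneumonia_virus) points to a chest X-ray detector. A hedged sketch using the generic `transformers` object-detection pipeline, assuming the checkpoint is pipeline-compatible; the input filename is a placeholder:

```python
from transformers import pipeline

# Assumes the repo exposes a standard object-detection head; not stated in the card.
detector = pipeline("object-detection", model="0llheaven/CON-DETR-V9")
predictions = detector("chest_xray.png", threshold=0.5)  # placeholder image path
for pred in predictions:
    print(pred["label"], round(pred["score"], 3), pred["box"])
```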
[ "pneumonia", "normal", "pneumonia_bacteria", "pneumonia_virus" ]
b09501048/detr2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/djengo890-national-taiwan-university/CVDPL_HW1_DETR/runs/p3ydxc4m) # detr2 This model is a fine-tuned version of [b09501048/detr](https://huggingface.co/b09501048/detr) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.6756 - Map: 0.1051 - Map 50: 0.1903 - Map 75: 0.1015 - Map Small: 0.0033 - Map Medium: 0.0045 - Map Large: 0.118 - Mar 1: 0.0866 - Mar 10: 0.1571 - Mar 100: 0.1653 - Mar Small: 0.0035 - Mar Medium: 0.0137 - Mar Large: 0.1915 - Map Person: 0.5572 - Mar 100 Person: 0.6813 - Map Ear: 0.0504 - Mar 100 Ear: 0.2096 - Map Earmuffs: 0.0 - Mar 100 Earmuffs: 0.0 - Map Face: 0.4079 - Mar 100 Face: 0.5174 - Map Face-guard: 0.0 - Mar 100 Face-guard: 0.0 - Map Face-mask-medical: 0.0 - Mar 100 Face-mask-medical: 0.0 - Map Foot: 0.0 - Mar 100 Foot: 0.0 - Map Tools: 0.0065 - Mar 100 Tools: 0.1896 - Map Glasses: 0.0074 - Mar 100 Glasses: 0.0812 - Map Gloves: 0.0007 - Mar 100 Gloves: 0.009 - Map Helmet: 0.0 - Mar 100 Helmet: 0.0 - Map Hands: 0.2666 - Mar 100 Hands: 0.4266 - Map Head: 0.473 - Mar 100 Head: 0.5522 - Map Medical-suit: 0.0 - Mar 100 Medical-suit: 0.0 - Map Shoes: 0.0168 - Mar 100 Shoes: 0.1435 - Map Safety-suit: 0.0 - Mar 100 Safety-suit: 0.0 - Map Safety-vest: 0.0 - Mar 100 Safety-vest: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Person | Mar 100 Person | Map Ear | Mar 100 Ear | Map Earmuffs | Mar 100 Earmuffs | Map Face | Mar 100 Face | Map Face-guard | Mar 100 Face-guard | Map Face-mask-medical | Mar 100 Face-mask-medical | Map Foot | Mar 100 Foot | Map Tools | Mar 100 Tools | Map Glasses | Mar 100 Glasses | Map Gloves | Mar 100 Gloves | Map Helmet | Mar 100 Helmet | Map Hands | Mar 100 Hands | Map Head | Mar 100 Head | Map Medical-suit | Mar 100 Medical-suit | Map Shoes | Mar 100 Shoes | Map Safety-suit | Mar 100 Safety-suit | Map Safety-vest | Mar 100 Safety-vest | 
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:-------:|:-----------:|:------------:|:----------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:---------:|:-------------:|:-----------:|:---------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:----------------:|:--------------------:|:---------:|:-------------:|:---------------:|:-------------------:|:---------------:|:-------------------:| | No log | 1.0 | 230 | 2.1829 | 0.0605 | 0.1167 | 0.0552 | 0.0009 | 0.0013 | 0.0672 | 0.0586 | 0.1154 | 0.1211 | 0.0007 | 0.0051 | 0.138 | 0.4117 | 0.609 | 0.0032 | 0.1099 | 0.0 | 0.0 | 0.1725 | 0.4168 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0001 | 0.0268 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1194 | 0.3523 | 0.3179 | 0.4968 | 0.0 | 0.0 | 0.0041 | 0.0467 | 0.0 | 0.0 | 0.0 | 0.0 | | No log | 2.0 | 460 | 2.0866 | 0.0711 | 0.1339 | 0.0672 | 0.0013 | 0.0011 | 0.079 | 0.0641 | 0.1206 | 0.126 | 0.0018 | 0.0084 | 0.1438 | 0.454 | 0.609 | 0.0052 | 0.1209 | 0.0 | 0.0 | 0.1989 | 0.4211 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0785 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1608 | 0.3471 | 0.3852 | 0.507 | 0.0 | 0.0 | 0.0038 | 0.058 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.5763 | 3.0 | 690 | 1.9861 | 0.0827 | 0.1539 | 0.0797 | 0.0009 | 0.0019 | 0.0922 | 0.0692 | 0.1281 | 0.1324 | 0.0007 | 0.0068 | 0.1522 | 0.4877 | 0.6318 | 0.0221 | 0.1291 | 0.0 | 0.0 | 0.3016 | 0.4585 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0006 | 0.0826 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1826 | 0.3567 | 0.4063 | 0.5112 | 0.0 | 0.0 | 0.0048 | 0.0814 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.5763 | 4.0 | 920 | 1.8935 | 0.0862 | 0.1625 | 0.0806 | 0.0002 | 0.0033 | 0.0962 | 0.0728 | 0.1312 | 0.136 | 0.0007 | 0.0101 | 0.1562 | 0.4722 | 0.6241 | 0.0232 | 0.1592 | 0.0 | 0.0 | 0.351 | 0.4621 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0009 | 0.1066 | 0.0004 | 0.0044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1775 | 0.3583 | 0.4342 | 0.5186 | 0.0 | 0.0 | 0.0066 | 0.079 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.32 | 5.0 | 1150 | 1.8171 | 0.0932 | 0.1746 | 0.0899 | 0.0028 | 0.0033 | 0.1042 | 0.079 | 0.1399 | 0.1469 | 0.0028 | 0.0106 | 0.1685 | 0.4946 | 0.6561 | 0.0379 | 0.1757 | 0.0 | 0.0 | 0.3903 | 0.4897 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0011 | 0.1625 | 0.0042 | 0.0262 | 0.0004 | 0.002 | 0.0 | 0.0 | 0.1991 | 0.3702 | 0.4482 | 0.5292 | 0.0 | 0.0 | 0.009 | 0.0858 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.32 | 6.0 | 1380 | 1.7576 | 0.0981 | 0.1822 | 0.0942 | 0.0015 | 0.0038 | 0.11 | 0.0818 | 0.1474 | 0.1541 | 0.0015 | 0.0123 | 0.1785 | 0.5247 | 0.658 | 0.0522 | 0.1868 | 0.0 | 0.0 | 0.3726 | 0.499 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0083 | 0.1448 | 0.0067 | 0.06 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2357 | 0.4101 | 0.4543 | 0.5349 | 0.0 | 0.0 | 0.0125 | 0.1263 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.1983 | 7.0 | 1610 | 1.7219 | 0.1017 | 0.1871 | 0.0961 | 0.0021 | 0.0059 | 0.1138 | 0.0854 | 0.1531 | 0.1601 | 0.0024 | 0.0153 | 0.1847 | 0.547 | 0.6753 | 0.0411 | 0.207 | 0.0 | 0.0 | 0.3952 | 0.4981 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0042 | 0.1893 | 0.0108 | 0.0662 | 0.0005 | 0.0035 | 0.0 | 0.0 | 0.256 | 0.4141 | 0.4619 | 0.5478 | 0.0 | 0.0 | 0.0123 | 0.1201 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.1983 | 8.0 | 1840 | 1.7015 | 0.1025 | 0.1879 | 0.099 | 0.0028 | 0.0048 | 0.1149 | 0.0852 | 0.1556 | 
0.1625 | 0.0029 | 0.0143 | 0.1884 | 0.5415 | 0.6693 | 0.0436 | 0.2041 | 0.0 | 0.0 | 0.4018 | 0.5101 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0056 | 0.1713 | 0.0086 | 0.0962 | 0.0 | 0.002 | 0.0 | 0.0 | 0.2632 | 0.4221 | 0.4633 | 0.5424 | 0.0 | 0.0 | 0.0155 | 0.1456 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.0688 | 9.0 | 2070 | 1.6853 | 0.1042 | 0.1896 | 0.0994 | 0.0033 | 0.0046 | 0.117 | 0.0862 | 0.156 | 0.1636 | 0.0033 | 0.014 | 0.1895 | 0.5507 | 0.6784 | 0.0481 | 0.206 | 0.0 | 0.0 | 0.4047 | 0.5127 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.005 | 0.1716 | 0.009 | 0.0887 | 0.0004 | 0.006 | 0.0 | 0.0 | 0.2662 | 0.4255 | 0.4697 | 0.5484 | 0.0 | 0.0 | 0.0169 | 0.1435 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.0688 | 10.0 | 2300 | 1.6756 | 0.1051 | 0.1903 | 0.1015 | 0.0033 | 0.0045 | 0.118 | 0.0866 | 0.1571 | 0.1653 | 0.0035 | 0.0137 | 0.1915 | 0.5572 | 0.6813 | 0.0504 | 0.2096 | 0.0 | 0.0 | 0.4079 | 0.5174 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0065 | 0.1896 | 0.0074 | 0.0812 | 0.0007 | 0.009 | 0.0 | 0.0 | 0.2666 | 0.4266 | 0.473 | 0.5522 | 0.0 | 0.0 | 0.0168 | 0.1435 | 0.0 | 0.0 | 0.0 | 0.0 | ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.1+cu121 - Datasets 3.0.1 - Tokenizers 0.19.1
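The detr2 card reports per-class mAP/mAR but stops short of a usage snippet. A minimal inference sketch, assuming the checkpoint loads through the standard `transformers` object-detection Auto classes (the card itself does not confirm the API, and the image path is a placeholder):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "b09501048/detr2"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("worksite.jpg")  # placeholder image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Map raw logits/boxes back to absolute (xmin, ymin, xmax, ymax) pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```

Given the low overall mAP (0.105), a fairly high confidence threshold mostly surfaces the strong classes from the table above (person, face, head, hands).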
[ "person", "ear", "earmuffs", "face", "face-guard", "face-mask-medical", "foot", "tools", "glasses", "gloves", "helmet", "hands", "head", "medical-suit", "shoes", "safety-suit", "safety-vest" ]
b09501048/detr3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/djengo890-national-taiwan-university/CVDPL_HW1_DETR/runs/cwd9gy2d) # detr3 This model is a fine-tuned version of [b09501048/detr2](https://huggingface.co/b09501048/detr2) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.7425 - Map: 0.1113 - Map 50: 0.2054 - Map 75: 0.1062 - Map Small: 0.0008 - Map Medium: 0.0074 - Map Large: 0.131 - Mar 1: 0.0965 - Mar 10: 0.169 - Mar 100: 0.1747 - Mar Small: 0.001 - Mar Medium: 0.0244 - Mar Large: 0.2102 - Map Person: 0.5378 - Mar 100 Person: 0.6529 - Map Ear: 0.0933 - Mar 100 Ear: 0.2379 - Map Earmuffs: 0.0 - Mar 100 Earmuffs: 0.0 - Map Face: 0.4246 - Mar 100 Face: 0.5219 - Map Face-guard: 0.0 - Mar 100 Face-guard: 0.0 - Map Face-mask-medical: 0.0 - Mar 100 Face-mask-medical: 0.0 - Map Foot: 0.0 - Mar 100 Foot: 0.0 - Map Tools: 0.0131 - Mar 100 Tools: 0.1422 - Map Glasses: 0.0232 - Mar 100 Glasses: 0.2111 - Map Gloves: 0.007 - Mar 100 Gloves: 0.0638 - Map Helmet: 0.0022 - Mar 100 Helmet: 0.0084 - Map Hands: 0.2866 - Mar 100 Hands: 0.4328 - Map Head: 0.4538 - Mar 100 Head: 0.5218 - Map Medical-suit: 0.0 - Mar 100 Medical-suit: 0.0 - Map Shoes: 0.0499 - Mar 100 Shoes: 0.1768 - Map Safety-suit: 0.0 - Mar 100 Safety-suit: 0.0 - Map Safety-vest: 0.0 - Mar 100 Safety-vest: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Person | Mar 100 Person | Map Ear | Mar 100 Ear | Map Earmuffs | Mar 100 Earmuffs | Map Face | Mar 100 Face | Map Face-guard | Mar 100 Face-guard | Map Face-mask-medical | Mar 100 Face-mask-medical | Map Foot | Mar 100 Foot | Map Tools | Mar 100 Tools | Map Glasses | Mar 100 Glasses | Map Gloves | Mar 100 Gloves | Map Helmet | Mar 100 Helmet | Map Hands | Mar 100 Hands | Map Head | Mar 100 Head | Map Medical-suit | Mar 100 Medical-suit | Map Shoes | Mar 100 Shoes | Map Safety-suit | Mar 100 Safety-suit | Map Safety-vest | Mar 100 Safety-vest | 
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:-------:|:-----------:|:------------:|:----------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:---------:|:-------------:|:-----------:|:---------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:----------------:|:--------------------:|:---------:|:-------------:|:---------------:|:-------------------:|:---------------:|:-------------------:| | No log | 1.0 | 230 | 2.0522 | 0.0803 | 0.1607 | 0.0725 | 0.0 | 0.0034 | 0.0929 | 0.0724 | 0.1318 | 0.1363 | 0.0 | 0.0112 | 0.1631 | 0.3995 | 0.5836 | 0.0156 | 0.1468 | 0.0 | 0.0 | 0.3369 | 0.4425 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0028 | 0.0751 | 0.0114 | 0.1396 | 0.001 | 0.0047 | 0.0 | 0.0 | 0.1877 | 0.3376 | 0.4006 | 0.4755 | 0.0 | 0.0 | 0.0091 | 0.1115 | 0.0 | 0.0 | 0.0 | 0.0 | | No log | 2.0 | 460 | 2.2256 | 0.0769 | 0.1575 | 0.0633 | 0.0 | 0.0046 | 0.0888 | 0.0681 | 0.1248 | 0.1291 | 0.0 | 0.0146 | 0.1528 | 0.4126 | 0.5645 | 0.0134 | 0.1174 | 0.0 | 0.0 | 0.3003 | 0.4032 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0007 | 0.0526 | 0.0056 | 0.1271 | 0.0002 | 0.0024 | 0.0 | 0.0 | 0.2003 | 0.3635 | 0.3566 | 0.4506 | 0.0 | 0.0 | 0.018 | 0.1128 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.2077 | 3.0 | 690 | 2.0148 | 0.0862 | 0.1699 | 0.0781 | 0.0 | 0.0031 | 0.1 | 0.0788 | 0.1402 | 0.1451 | 0.0 | 0.0108 | 0.1745 | 0.4617 | 0.5927 | 0.0353 | 0.1895 | 0.0 | 0.0 | 0.345 | 0.4593 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0039 | 0.1015 | 0.0142 | 0.1688 | 0.0003 | 0.0043 | 0.0 | 0.0 | 0.2037 | 0.3591 | 0.3848 | 0.4643 | 0.0 | 0.0 | 0.0162 | 0.1279 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.2077 | 4.0 | 920 | 1.9426 | 0.0942 | 0.1812 | 0.0873 | 0.0 | 0.0039 | 0.1105 | 0.0834 | 0.148 | 0.1529 | 0.0 | 0.0165 | 0.1832 | 0.435 | 0.5951 | 0.0562 | 0.2075 | 0.0 | 0.0 | 0.3875 | 0.4814 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.006 | 0.1112 | 0.0259 | 0.1736 | 0.0007 | 0.0016 | 0.0 | 0.0 | 0.2441 | 0.3994 | 0.4248 | 0.5024 | 0.0 | 0.0 | 0.0219 | 0.1264 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.0944 | 5.0 | 1150 | 1.8840 | 0.0983 | 0.1899 | 0.0909 | 0.0002 | 0.0054 | 0.1154 | 0.0859 | 0.1517 | 0.1563 | 0.0002 | 0.0181 | 0.187 | 0.4713 | 0.6164 | 0.0695 | 0.1853 | 0.0 | 0.0 | 0.3778 | 0.4833 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.007 | 0.1175 | 0.0214 | 0.1764 | 0.0033 | 0.0272 | 0.0 | 0.0 | 0.2512 | 0.3917 | 0.4353 | 0.5047 | 0.0 | 0.0 | 0.0343 | 0.1546 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.0944 | 6.0 | 1380 | 1.8355 | 0.1029 | 0.1959 | 0.0946 | 0.0 | 0.0071 | 0.121 | 0.0915 | 0.1601 | 0.1653 | 0.0 | 0.019 | 0.1986 | 0.503 | 0.6342 | 0.0869 | 0.2042 | 0.0 | 0.0 | 0.3815 | 0.4824 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0062 | 0.1206 | 0.022 | 0.2215 | 0.0066 | 0.065 | 0.0 | 0.0 | 0.2669 | 0.4158 | 0.437 | 0.5066 | 0.0 | 0.0 | 0.0394 | 0.1595 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.9921 | 7.0 | 1610 | 1.7879 | 0.1052 | 0.1999 | 0.0975 | 0.0 | 0.0064 | 0.1236 | 0.0936 | 0.1616 | 0.1668 | 0.0 | 0.0231 | 0.2004 | 0.5172 | 0.6394 | 0.0832 | 0.2128 | 0.0 | 0.0 | 0.3953 | 0.4923 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0077 | 0.1447 | 0.0237 | 0.2111 | 0.0025 | 0.0358 | 0.0023 | 0.0049 | 0.2686 | 0.4154 | 0.443 | 0.5138 | 0.0 | 0.0 | 0.0452 | 0.1661 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.9921 | 8.0 | 1840 | 1.7682 | 0.1097 | 0.2035 | 0.1065 | 0.0004 | 0.0058 | 
0.1291 | 0.0964 | 0.1669 | 0.1722 | 0.0007 | 0.0228 | 0.2067 | 0.5351 | 0.6544 | 0.0849 | 0.2151 | 0.0 | 0.0 | 0.4123 | 0.5122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0119 | 0.1362 | 0.0272 | 0.2146 | 0.0073 | 0.0646 | 0.0025 | 0.0056 | 0.2814 | 0.4331 | 0.455 | 0.5221 | 0.0 | 0.0 | 0.0476 | 0.17 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.8979 | 9.0 | 2070 | 1.7470 | 0.1107 | 0.2048 | 0.1049 | 0.0009 | 0.0076 | 0.1302 | 0.0971 | 0.1683 | 0.1744 | 0.001 | 0.0249 | 0.2098 | 0.5366 | 0.6539 | 0.0902 | 0.2351 | 0.0 | 0.0 | 0.4211 | 0.5206 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0123 | 0.1412 | 0.0235 | 0.2076 | 0.007 | 0.0669 | 0.0022 | 0.0084 | 0.2856 | 0.4318 | 0.4544 | 0.5229 | 0.0 | 0.0 | 0.0484 | 0.177 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.8979 | 10.0 | 2300 | 1.7425 | 0.1113 | 0.2054 | 0.1062 | 0.0008 | 0.0074 | 0.131 | 0.0965 | 0.169 | 0.1747 | 0.001 | 0.0244 | 0.2102 | 0.5378 | 0.6529 | 0.0933 | 0.2379 | 0.0 | 0.0 | 0.4246 | 0.5219 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0131 | 0.1422 | 0.0232 | 0.2111 | 0.007 | 0.0638 | 0.0022 | 0.0084 | 0.2866 | 0.4328 | 0.4538 | 0.5218 | 0.0 | 0.0 | 0.0499 | 0.1768 | 0.0 | 0.0 | 0.0 | 0.0 | ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.1+cu121 - Datasets 3.0.1 - Tokenizers 0.19.1
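The hyperparameters listed in the detr3 card map directly onto `transformers.TrainingArguments`; a sketch of how those settings might be expressed is below. The exact training script is not part of the card, so the output directory, evaluation cadence, and collator-related flags are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr3",            # assumption: not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=10,
    fp16=True,                     # "mixed_precision_training: Native AMP"
    eval_strategy="epoch",         # assumption: the results table logs one row per epoch
    save_strategy="epoch",         # assumption
    remove_unused_columns=False,   # typical for detection collators; assumption
)
```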
[ "person", "ear", "earmuffs", "face", "face-guard", "face-mask-medical", "foot", "tools", "glasses", "gloves", "helmet", "hands", "head", "medical-suit", "shoes", "safety-suit", "safety-vest" ]
b09501048/rtdetr
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/djengo890-national-taiwan-university/CVDPL_HW1_DETR/runs/zw503nk8) # rtdetr This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 10.4377 - Map: 0.2587 - Map 50: 0.4214 - Map 75: 0.2668 - Map Small: 0.0132 - Map Medium: 0.0707 - Map Large: 0.3083 - Mar 1: 0.2412 - Mar 10: 0.4736 - Mar 100: 0.4998 - Mar Small: 0.0211 - Mar Medium: 0.1684 - Mar Large: 0.5787 - Map Person: 0.6872 - Mar 100 Person: 0.7948 - Map Ear: 0.3267 - Mar 100 Ear: 0.4363 - Map Earmuffs: 0.1061 - Mar 100 Earmuffs: 0.3967 - Map Face: 0.5362 - Mar 100 Face: 0.6549 - Map Face-guard: 0.0236 - Mar 100 Face-guard: 0.51 - Map Face-mask-medical: 0.1466 - Mar 100 Face-mask-medical: 0.3479 - Map Foot: 0.1167 - Mar 100 Foot: 0.3963 - Map Tools: 0.125 - Mar 100 Tools: 0.3664 - Map Glasses: 0.2452 - Mar 100 Glasses: 0.4355 - Map Gloves: 0.3086 - Mar 100 Gloves: 0.4919 - Map Helmet: 0.2733 - Mar 100 Helmet: 0.4595 - Map Hands: 0.4959 - Mar 100 Hands: 0.6459 - Map Head: 0.6255 - Mar 100 Head: 0.7222 - Map Medical-suit: 0.0071 - Mar 100 Medical-suit: 0.6667 - Map Shoes: 0.2826 - Mar 100 Shoes: 0.4203 - Map Safety-suit: 0.0628 - Mar 100 Safety-suit: 0.6043 - Map Safety-vest: 0.0284 - Mar 100 Safety-vest: 0.1479 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Person | Mar 100 Person | Map Ear | Mar 100 Ear | Map Earmuffs | Mar 100 Earmuffs | Map Face | Mar 100 Face | Map Face-guard | Mar 100 Face-guard | Map Face-mask-medical | Mar 100 Face-mask-medical | Map Foot | Mar 100 Foot | Map Tools | Mar 100 Tools | Map Glasses | Mar 100 Glasses | Map Gloves | Mar 100 Gloves | Map Helmet | Mar 100 Helmet | Map Hands | Mar 100 Hands | Map Head | Mar 100 Head | Map Medical-suit | Mar 100 Medical-suit | Map Shoes | Mar 100 Shoes | Map Safety-suit | Mar 100 Safety-suit | Map Safety-vest | Mar 100 Safety-vest | 
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:-------:|:-----------:|:------------:|:----------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:---------:|:-------------:|:-----------:|:---------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:----------------:|:--------------------:|:---------:|:-------------:|:---------------:|:-------------------:|:---------------:|:-------------------:| | No log | 1.0 | 230 | 11.8687 | 0.1809 | 0.2959 | 0.1854 | 0.0001 | 0.0405 | 0.2126 | 0.1745 | 0.3538 | 0.3831 | 0.0003 | 0.1392 | 0.4463 | 0.6713 | 0.7877 | 0.2916 | 0.3832 | 0.0012 | 0.27 | 0.4942 | 0.6099 | 0.0003 | 0.11 | 0.0051 | 0.2792 | 0.0203 | 0.2963 | 0.0294 | 0.2341 | 0.1226 | 0.3419 | 0.1196 | 0.4622 | 0.0861 | 0.3709 | 0.4235 | 0.6114 | 0.5619 | 0.6973 | 0.0033 | 0.3333 | 0.2386 | 0.4097 | 0.0057 | 0.3043 | 0.0 | 0.0106 | | No log | 2.0 | 460 | 11.2318 | 0.2049 | 0.3336 | 0.2104 | 0.0001 | 0.0546 | 0.2415 | 0.2033 | 0.4067 | 0.4298 | 0.0009 | 0.1818 | 0.4943 | 0.6806 | 0.7963 | 0.299 | 0.4008 | 0.0019 | 0.2967 | 0.5212 | 0.6258 | 0.0013 | 0.3 | 0.0672 | 0.3167 | 0.0227 | 0.3556 | 0.0463 | 0.2657 | 0.1828 | 0.3808 | 0.2148 | 0.4744 | 0.1592 | 0.4101 | 0.4415 | 0.6258 | 0.5971 | 0.708 | 0.0043 | 0.45 | 0.2338 | 0.4048 | 0.0095 | 0.4696 | 0.0001 | 0.0255 | | 21.6045 | 3.0 | 690 | 10.5848 | 0.228 | 0.3728 | 0.2323 | 0.0023 | 0.0556 | 0.2715 | 0.229 | 0.4415 | 0.472 | 0.0054 | 0.1589 | 0.5472 | 0.6736 | 0.7933 | 0.3235 | 0.4293 | 0.015 | 0.3133 | 0.5366 | 0.6397 | 0.0043 | 0.46 | 0.1355 | 0.3646 | 0.0585 | 0.3759 | 0.0672 | 0.3417 | 0.213 | 0.425 | 0.2673 | 0.4793 | 0.2207 | 0.4772 | 0.4553 | 0.6371 | 0.6186 | 0.7185 | 0.0054 | 0.575 | 0.2525 | 0.4074 | 0.0273 | 0.4957 | 0.0018 | 0.0904 | | 21.6045 | 4.0 | 920 | 10.5421 | 0.2332 | 0.3782 | 0.2411 | 0.0009 | 0.0614 | 0.2771 | 0.223 | 0.4435 | 0.4738 | 0.0048 | 0.1859 | 0.5433 | 0.6844 | 0.797 | 0.3263 | 0.4265 | 0.0182 | 0.3733 | 0.5415 | 0.651 | 0.0071 | 0.35 | 0.1292 | 0.3625 | 0.0606 | 0.3815 | 0.0843 | 0.3587 | 0.213 | 0.4349 | 0.2804 | 0.4659 | 0.2003 | 0.4418 | 0.4684 | 0.6475 | 0.6229 | 0.7196 | 0.0096 | 0.6083 | 0.2897 | 0.4371 | 0.0254 | 0.5217 | 0.0029 | 0.0777 | | 16.2105 | 5.0 | 1150 | 10.5670 | 0.2425 | 0.4026 | 0.2462 | 0.0026 | 0.0678 | 0.2876 | 0.2248 | 0.4572 | 0.49 | 0.0084 | 0.175 | 0.5649 | 0.6759 | 0.7959 | 0.3303 | 0.4287 | 0.051 | 0.37 | 0.5377 | 0.6458 | 0.0389 | 0.54 | 0.1382 | 0.3313 | 0.0542 | 0.3833 | 0.0967 | 0.339 | 0.2201 | 0.4081 | 0.2746 | 0.4821 | 0.2307 | 0.4747 | 0.4704 | 0.6355 | 0.6247 | 0.727 | 0.0079 | 0.6833 | 0.2719 | 0.4297 | 0.0787 | 0.5174 | 0.0212 | 0.1372 | | 16.2105 | 6.0 | 1380 | 10.5205 | 0.2454 | 0.4021 | 0.2486 | 0.004 | 0.0642 | 0.2915 | 0.2318 | 0.466 | 0.4921 | 0.009 | 0.2002 | 0.565 | 0.6883 | 0.7967 | 0.3229 | 0.4303 | 0.0432 | 0.4167 | 0.533 | 0.6475 | 0.0206 | 0.49 | 0.134 | 0.3417 | 0.0763 | 0.3759 | 0.1108 | 0.3549 | 0.2424 | 0.4302 | 0.2975 | 0.4923 | 0.2172 | 0.4481 | 0.4765 | 0.6418 | 0.622 | 0.7188 | 0.0088 | 0.675 | 0.2824 | 0.4318 | 0.08 | 0.5478 | 0.0164 | 0.1255 | | 14.9514 | 7.0 | 1610 | 10.4281 | 0.2503 | 0.4074 | 0.2573 | 0.0122 | 0.071 | 0.2971 | 0.2336 | 0.4732 | 0.5003 | 0.0192 | 0.1699 | 0.5787 | 0.6909 | 0.797 | 0.3294 | 0.4358 | 0.0799 
| 0.3767 | 0.5398 | 0.6511 | 0.0195 | 0.55 | 0.1253 | 0.3229 | 0.0995 | 0.4037 | 0.1184 | 0.3798 | 0.2401 | 0.4262 | 0.3016 | 0.4878 | 0.2372 | 0.4582 | 0.4943 | 0.6485 | 0.6247 | 0.7213 | 0.0081 | 0.6583 | 0.2781 | 0.4214 | 0.0502 | 0.6087 | 0.019 | 0.1574 | | 14.9514 | 8.0 | 1840 | 10.4168 | 0.2591 | 0.4207 | 0.2665 | 0.0129 | 0.069 | 0.3082 | 0.2426 | 0.471 | 0.5001 | 0.0198 | 0.1687 | 0.5777 | 0.6901 | 0.7971 | 0.3275 | 0.4379 | 0.0953 | 0.3833 | 0.539 | 0.6537 | 0.0503 | 0.54 | 0.1408 | 0.3333 | 0.1041 | 0.387 | 0.1254 | 0.3735 | 0.2417 | 0.4314 | 0.3014 | 0.4854 | 0.2689 | 0.4557 | 0.4984 | 0.6511 | 0.6291 | 0.725 | 0.0068 | 0.6667 | 0.2833 | 0.4224 | 0.0785 | 0.6217 | 0.0245 | 0.1372 | | 14.5079 | 9.0 | 2070 | 10.4207 | 0.2605 | 0.4265 | 0.2676 | 0.013 | 0.072 | 0.3104 | 0.2384 | 0.4779 | 0.5015 | 0.0212 | 0.1712 | 0.5805 | 0.6874 | 0.7957 | 0.3287 | 0.439 | 0.109 | 0.3967 | 0.5364 | 0.6546 | 0.0376 | 0.52 | 0.1454 | 0.3333 | 0.11 | 0.4019 | 0.1238 | 0.3652 | 0.2454 | 0.439 | 0.309 | 0.4882 | 0.2851 | 0.4646 | 0.4972 | 0.6475 | 0.6276 | 0.7259 | 0.0068 | 0.6667 | 0.2853 | 0.4249 | 0.0651 | 0.6043 | 0.0291 | 0.1585 | | 14.5079 | 10.0 | 2300 | 10.4377 | 0.2587 | 0.4214 | 0.2668 | 0.0132 | 0.0707 | 0.3083 | 0.2412 | 0.4736 | 0.4998 | 0.0211 | 0.1684 | 0.5787 | 0.6872 | 0.7948 | 0.3267 | 0.4363 | 0.1061 | 0.3967 | 0.5362 | 0.6549 | 0.0236 | 0.51 | 0.1466 | 0.3479 | 0.1167 | 0.3963 | 0.125 | 0.3664 | 0.2452 | 0.4355 | 0.3086 | 0.4919 | 0.2733 | 0.4595 | 0.4959 | 0.6459 | 0.6255 | 0.7222 | 0.0071 | 0.6667 | 0.2826 | 0.4203 | 0.0628 | 0.6043 | 0.0284 | 0.1479 | ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.1+cu121 - Datasets 3.0.1 - Tokenizers 0.19.1
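The rtdetr card fine-tunes PekingU/rtdetr_r50vd_coco_o365 on the same 17 PPE classes. A sketch of the usual way such a head swap is set up, with the label names taken from the list that follows; everything else (variable names, reusing the base image processor) is an assumption rather than the author's script:

```python
from transformers import AutoImageProcessor, AutoModelForObjectDetection

labels = [
    "person", "ear", "earmuffs", "face", "face-guard", "face-mask-medical",
    "foot", "tools", "glasses", "gloves", "helmet", "hands", "head",
    "medical-suit", "shoes", "safety-suit", "safety-vest",
]
id2label = dict(enumerate(labels))
label2id = {name: idx for idx, name in id2label.items()}

base_checkpoint = "PekingU/rtdetr_r50vd_coco_o365"
processor = AutoImageProcessor.from_pretrained(base_checkpoint)
model = AutoModelForObjectDetection.from_pretrained(
    base_checkpoint,
    id2label=id2label,
    label2id=label2id,
    ignore_mismatched_sizes=True,  # re-initialize the classification head for 17 classes
)
```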
[ "person", "ear", "earmuffs", "face", "face-guard", "face-mask-medical", "foot", "tools", "glasses", "gloves", "helmet", "hands", "head", "medical-suit", "shoes", "safety-suit", "safety-vest" ]
doktor47/zinemind_msft_500
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "table", "table rotated" ]
b09501048/rtdetr2
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/djengo890-national-taiwan-university/CVDPL_HW1_RT_DETR/runs/fnra9lrw) # rtdetr2 This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 11.8738 - Map: 0.2766 - Map 50: 0.439 - Map 75: 0.2829 - Map Small: 0.1319 - Map Medium: 0.375 - Map Large: 0.4519 - Mar 1: 0.2272 - Mar 10: 0.438 - Mar 100: 0.4559 - Mar Small: 0.2268 - Mar Medium: 0.551 - Mar Large: 0.6909 - Map Person: 0.7159 - Mar 100 Person: 0.8345 - Map Ear: 0.3388 - Mar 100 Ear: 0.4368 - Map Earmuffs: 0.2418 - Mar 100 Earmuffs: 0.5135 - Map Face: 0.4365 - Mar 100 Face: 0.6137 - Map Face-guard: 0.1839 - Mar 100 Face-guard: 0.5625 - Map Face-mask-medical: 0.1733 - Mar 100 Face-mask-medical: 0.2842 - Map Foot: 0.0448 - Mar 100 Foot: 0.2647 - Map Tools: 0.0879 - Mar 100 Tools: 0.284 - Map Glasses: 0.224 - Mar 100 Glasses: 0.3644 - Map Gloves: 0.2994 - Mar 100 Gloves: 0.4226 - Map Helmet: 0.2581 - Mar 100 Helmet: 0.3714 - Map Hands: 0.5167 - Mar 100 Hands: 0.6166 - Map Head: 0.5923 - Mar 100 Head: 0.6739 - Map Medical-suit: 0.1135 - Mar 100 Medical-suit: 0.5556 - Map Shoes: 0.3707 - Mar 100 Shoes: 0.483 - Map Safety-suit: 0.0097 - Mar 100 Safety-suit: 0.315 - Map Safety-vest: 0.0956 - Mar 100 Safety-vest: 0.1532 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 300 - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Person | Mar 100 Person | Map Ear | Mar 100 Ear | Map Earmuffs | Mar 100 Earmuffs | Map Face | Mar 100 Face | Map Face-guard | Mar 100 Face-guard | Map Face-mask-medical | Mar 100 Face-mask-medical | Map Foot | Mar 100 Foot | Map Tools | Mar 100 Tools | Map Glasses | Mar 100 Glasses | Map Gloves | Mar 100 Gloves | Map Helmet | Mar 100 Helmet | Map Hands | Mar 100 Hands | Map Head | Mar 100 Head | Map Medical-suit | Mar 100 Medical-suit | Map Shoes | Mar 100 Shoes | Map Safety-suit | Mar 100 Safety-suit | Map Safety-vest | Mar 100 Safety-vest | 
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:-------:|:-----------:|:------------:|:----------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:---------:|:-------------:|:-----------:|:---------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:----------------:|:--------------------:|:---------:|:-------------:|:---------------:|:-------------------:|:---------------:|:-------------------:| | No log | 1.0 | 459 | 12.5722 | 0.1349 | 0.2143 | 0.143 | 0.0728 | 0.1775 | 0.1605 | 0.1436 | 0.2943 | 0.3139 | 0.1692 | 0.4142 | 0.5087 | 0.4573 | 0.8371 | 0.1857 | 0.4252 | 0.0002 | 0.1162 | 0.3926 | 0.613 | 0.001 | 0.2625 | 0.0572 | 0.1719 | 0.0004 | 0.102 | 0.0248 | 0.1706 | 0.0501 | 0.361 | 0.0807 | 0.1743 | 0.059 | 0.2911 | 0.2425 | 0.521 | 0.4662 | 0.6855 | 0.0 | 0.0222 | 0.2745 | 0.4592 | 0.0016 | 0.095 | 0.0002 | 0.0274 | | 26.3979 | 2.0 | 918 | 11.9482 | 0.208 | 0.3216 | 0.2213 | 0.1016 | 0.2792 | 0.3172 | 0.1846 | 0.363 | 0.3799 | 0.2024 | 0.4917 | 0.6211 | 0.633 | 0.8297 | 0.2283 | 0.335 | 0.0016 | 0.3622 | 0.5135 | 0.6272 | 0.0562 | 0.4875 | 0.1298 | 0.2737 | 0.0051 | 0.2137 | 0.0393 | 0.2209 | 0.1385 | 0.3452 | 0.2407 | 0.3783 | 0.1832 | 0.3482 | 0.4565 | 0.6032 | 0.5955 | 0.694 | 0.0002 | 0.0444 | 0.3128 | 0.4551 | 0.0009 | 0.17 | 0.0009 | 0.0694 | | 18.1729 | 3.0 | 1377 | 11.9715 | 0.2184 | 0.3418 | 0.2313 | 0.1126 | 0.3017 | 0.347 | 0.1909 | 0.3883 | 0.4101 | 0.2354 | 0.5085 | 0.6091 | 0.6705 | 0.8279 | 0.3031 | 0.4063 | 0.0117 | 0.3919 | 0.4866 | 0.6111 | 0.0355 | 0.5375 | 0.1147 | 0.2825 | 0.0111 | 0.2784 | 0.0514 | 0.2397 | 0.2188 | 0.4102 | 0.2171 | 0.3757 | 0.1851 | 0.3554 | 0.4664 | 0.5985 | 0.5942 | 0.6886 | 0.0014 | 0.2167 | 0.3408 | 0.4824 | 0.001 | 0.14 | 0.0037 | 0.129 | | 16.7823 | 4.0 | 1836 | 11.5629 | 0.2362 | 0.3642 | 0.2521 | 0.1213 | 0.3333 | 0.3732 | 0.2058 | 0.3925 | 0.4068 | 0.2236 | 0.5135 | 0.6145 | 0.6948 | 0.8366 | 0.3441 | 0.4262 | 0.0691 | 0.4216 | 0.5345 | 0.6544 | 0.0409 | 0.4625 | 0.1528 | 0.3053 | 0.0125 | 0.2647 | 0.0651 | 0.2936 | 0.1319 | 0.3079 | 0.2895 | 0.4088 | 0.2312 | 0.3634 | 0.4867 | 0.5981 | 0.6117 | 0.6891 | 0.0104 | 0.2056 | 0.3356 | 0.4696 | 0.001 | 0.115 | 0.0039 | 0.0935 | | 15.8096 | 5.0 | 2295 | 11.8381 | 0.2593 | 0.4074 | 0.2739 | 0.1281 | 0.3589 | 0.4355 | 0.2155 | 0.4214 | 0.4365 | 0.2201 | 0.5394 | 0.6675 | 0.6888 | 0.8275 | 0.3686 | 0.4572 | 0.1986 | 0.4378 | 0.5029 | 0.6389 | 0.1532 | 0.475 | 0.1791 | 0.3421 | 0.0229 | 0.2137 | 0.0583 | 0.2428 | 0.2389 | 0.4034 | 0.2943 | 0.4434 | 0.2326 | 0.3607 | 0.486 | 0.601 | 0.5887 | 0.6697 | 0.0124 | 0.4278 | 0.3458 | 0.4595 | 0.0035 | 0.25 | 0.0332 | 0.1694 | | 14.7312 | 6.0 | 2754 | 11.8431 | 0.2451 | 0.3926 | 0.2562 | 0.1243 | 0.3434 | 0.3958 | 0.2131 | 0.412 | 0.4263 | 0.2187 | 0.5118 | 0.6653 | 0.6987 | 0.8307 | 0.3293 | 0.4215 | 0.1879 | 0.4162 | 0.4487 | 0.6174 | 0.0248 | 0.4875 | 0.1453 | 0.3263 | 0.0302 | 0.249 | 0.0799 | 0.2737 | 0.2061 | 0.3531 | 0.2911 | 0.4035 | 0.2357 | 0.333 | 0.5077 | 0.6144 | 0.5511 | 0.6344 | 0.0402 | 0.5222 | 0.3495 | 0.4789 | 0.0018 | 0.15 | 0.0394 | 0.1355 | | 14.5207 | 7.0 | 3213 | 11.9739 | 0.2614 | 0.4212 | 0.2679 | 0.1246 | 0.3646 | 0.4079 | 0.2235 | 0.418 | 0.4289 | 0.2175 | 0.5351 | 0.617 | 0.6899 | 0.8326 | 
0.3247 | 0.419 | 0.2156 | 0.4541 | 0.4293 | 0.6056 | 0.1801 | 0.4875 | 0.1518 | 0.2807 | 0.0507 | 0.2294 | 0.084 | 0.2649 | 0.2206 | 0.3678 | 0.2696 | 0.3903 | 0.2628 | 0.3625 | 0.5094 | 0.612 | 0.5925 | 0.6774 | 0.021 | 0.4056 | 0.3423 | 0.4637 | 0.0132 | 0.3 | 0.0863 | 0.1387 | | 14.0764 | 8.0 | 3672 | 11.9295 | 0.2646 | 0.4292 | 0.2715 | 0.1278 | 0.3629 | 0.4405 | 0.2236 | 0.4189 | 0.4308 | 0.2232 | 0.5261 | 0.668 | 0.7004 | 0.8286 | 0.3267 | 0.4195 | 0.2031 | 0.4459 | 0.4416 | 0.6123 | 0.188 | 0.45 | 0.1675 | 0.3053 | 0.0274 | 0.2373 | 0.0765 | 0.2644 | 0.2267 | 0.3921 | 0.321 | 0.446 | 0.245 | 0.367 | 0.5036 | 0.6054 | 0.5888 | 0.6741 | 0.0382 | 0.3722 | 0.3382 | 0.4527 | 0.019 | 0.305 | 0.0858 | 0.1452 | | 13.772 | 9.0 | 4131 | 11.8863 | 0.2678 | 0.4293 | 0.2743 | 0.1306 | 0.3694 | 0.4456 | 0.2276 | 0.4405 | 0.4524 | 0.2331 | 0.5595 | 0.6747 | 0.7065 | 0.8322 | 0.3349 | 0.432 | 0.2274 | 0.5568 | 0.4314 | 0.6113 | 0.1785 | 0.5625 | 0.1666 | 0.3228 | 0.0354 | 0.2608 | 0.0893 | 0.2812 | 0.2225 | 0.3842 | 0.2942 | 0.4235 | 0.2533 | 0.367 | 0.5127 | 0.6124 | 0.5853 | 0.6719 | 0.043 | 0.4611 | 0.3693 | 0.4833 | 0.0097 | 0.245 | 0.092 | 0.1839 | | 13.5604 | 10.0 | 4590 | 11.8738 | 0.2766 | 0.439 | 0.2829 | 0.1319 | 0.375 | 0.4519 | 0.2272 | 0.438 | 0.4559 | 0.2268 | 0.551 | 0.6909 | 0.7159 | 0.8345 | 0.3388 | 0.4368 | 0.2418 | 0.5135 | 0.4365 | 0.6137 | 0.1839 | 0.5625 | 0.1733 | 0.2842 | 0.0448 | 0.2647 | 0.0879 | 0.284 | 0.224 | 0.3644 | 0.2994 | 0.4226 | 0.2581 | 0.3714 | 0.5167 | 0.6166 | 0.5923 | 0.6739 | 0.1135 | 0.5556 | 0.3707 | 0.483 | 0.0097 | 0.315 | 0.0956 | 0.1532 | ### Framework versions - Transformers 4.46.0.dev0 - Pytorch 2.4.1+cu121 - Datasets 3.0.1 - Tokenizers 0.20.1
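The Map / Mar columns in the rtdetr2 table follow the COCO convention: average precision and recall over IoU thresholds 0.50:0.95, single-threshold variants at 0.50 and 0.75, small/medium/large splits, and per-class breakdowns. The card does not say which evaluator produced them; a sketch of how such numbers are commonly computed with `torchmetrics` on a toy prediction/target pair:

```python
from torch import tensor
from torchmetrics.detection import MeanAveragePrecision

# A single image with one predicted and one ground-truth box (toy data).
preds = [{
    "boxes": tensor([[50.0, 40.0, 200.0, 220.0]]),
    "scores": tensor([0.87]),
    "labels": tensor([12]),  # index 12 = "head" in the 17-class list
}]
targets = [{
    "boxes": tensor([[55.0, 45.0, 195.0, 215.0]]),
    "labels": tensor([12]),
}]

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)
metric.update(preds, targets)
print(metric.compute())  # map, map_50, map_75, size splits, mar_1/10/100, per-class entries
```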
[ "person", "ear", "earmuffs", "face", "face-guard", "face-mask-medical", "foot", "tools", "glasses", "gloves", "helmet", "hands", "head", "medical-suit", "shoes", "safety-suit", "safety-vest" ]
gargeya2003/detr-pretrained-r101
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5" ]
b09501048/rtdetr3
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/djengo890-national-taiwan-university/CVDPL_HW1_RT_DETR/runs/krauv7q7) # rtdetr3 This model is a fine-tuned version of [b09501048/rtdetr2](https://huggingface.co/b09501048/rtdetr2) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 12.0504 - Map: 0.2831 - Map 50: 0.4512 - Map 75: 0.2953 - Map Small: 0.1426 - Map Medium: 0.3859 - Map Large: 0.469 - Mar 1: 0.2299 - Mar 10: 0.441 - Mar 100: 0.4546 - Mar Small: 0.2539 - Mar Medium: 0.5608 - Mar Large: 0.6712 - Map Person: 0.716 - Mar 100 Person: 0.8309 - Map Ear: 0.357 - Mar 100 Ear: 0.459 - Map Earmuffs: 0.2616 - Mar 100 Earmuffs: 0.4676 - Map Face: 0.4488 - Mar 100 Face: 0.6417 - Map Face-guard: 0.2019 - Mar 100 Face-guard: 0.5125 - Map Face-mask-medical: 0.1874 - Mar 100 Face-mask-medical: 0.2842 - Map Foot: 0.0467 - Mar 100 Foot: 0.2941 - Map Tools: 0.0755 - Mar 100 Tools: 0.2582 - Map Glasses: 0.2358 - Mar 100 Glasses: 0.3972 - Map Gloves: 0.2782 - Mar 100 Gloves: 0.4345 - Map Helmet: 0.2651 - Mar 100 Helmet: 0.3929 - Map Hands: 0.5228 - Mar 100 Hands: 0.6237 - Map Head: 0.571 - Mar 100 Head: 0.6738 - Map Medical-suit: 0.1583 - Mar 100 Medical-suit: 0.45 - Map Shoes: 0.3526 - Mar 100 Shoes: 0.4729 - Map Safety-suit: 0.0216 - Mar 100 Safety-suit: 0.35 - Map Safety-vest: 0.1126 - Mar 100 Safety-vest: 0.1855 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 300 - num_epochs: 5 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Person | Mar 100 Person | Map Ear | Mar 100 Ear | Map Earmuffs | Mar 100 Earmuffs | Map Face | Mar 100 Face | Map Face-guard | Mar 100 Face-guard | Map Face-mask-medical | Mar 100 Face-mask-medical | Map Foot | Mar 100 Foot | Map Tools | Mar 100 Tools | Map Glasses | Mar 100 Glasses | Map Gloves | Mar 100 Gloves | Map Helmet | Mar 100 Helmet | Map Hands | Mar 100 Hands | Map Head | Mar 100 Head | Map Medical-suit | Mar 100 Medical-suit | Map Shoes | Mar 100 Shoes | Map Safety-suit | Mar 100 Safety-suit | Map Safety-vest | Mar 100 Safety-vest | 
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------:|:--------------:|:-------:|:-----------:|:------------:|:----------------:|:--------:|:------------:|:--------------:|:------------------:|:---------------------:|:-------------------------:|:--------:|:------------:|:---------:|:-------------:|:-----------:|:---------------:|:----------:|:--------------:|:----------:|:--------------:|:---------:|:-------------:|:--------:|:------------:|:----------------:|:--------------------:|:---------:|:-------------:|:---------------:|:-------------------:|:---------------:|:-------------------:| | No log | 1.0 | 459 | 12.7473 | 0.2463 | 0.3958 | 0.2544 | 0.1222 | 0.3462 | 0.3923 | 0.2108 | 0.3909 | 0.404 | 0.2092 | 0.4925 | 0.6225 | 0.6449 | 0.827 | 0.3065 | 0.4 | 0.1965 | 0.4378 | 0.4368 | 0.603 | 0.1483 | 0.45 | 0.1711 | 0.2474 | 0.0568 | 0.2373 | 0.0683 | 0.2296 | 0.2028 | 0.3113 | 0.2313 | 0.3721 | 0.2246 | 0.35 | 0.5063 | 0.6106 | 0.5744 | 0.672 | 0.0043 | 0.2778 | 0.3112 | 0.4452 | 0.0066 | 0.235 | 0.0971 | 0.1613 | | 14.0301 | 2.0 | 918 | 11.8304 | 0.2764 | 0.4391 | 0.283 | 0.141 | 0.3804 | 0.4657 | 0.2313 | 0.4441 | 0.4637 | 0.244 | 0.5718 | 0.7022 | 0.6841 | 0.8373 | 0.3857 | 0.4744 | 0.2263 | 0.4865 | 0.4943 | 0.6643 | 0.2173 | 0.575 | 0.1887 | 0.3298 | 0.0288 | 0.2686 | 0.0586 | 0.2647 | 0.2346 | 0.4232 | 0.3175 | 0.4628 | 0.2462 | 0.4098 | 0.5093 | 0.6097 | 0.6284 | 0.7096 | 0.0157 | 0.3778 | 0.3542 | 0.4786 | 0.0162 | 0.34 | 0.0935 | 0.171 | | 13.9816 | 3.0 | 1377 | 12.0861 | 0.2814 | 0.4484 | 0.2909 | 0.141 | 0.3911 | 0.4422 | 0.2314 | 0.4479 | 0.4649 | 0.2391 | 0.5663 | 0.6917 | 0.7053 | 0.8295 | 0.3844 | 0.4867 | 0.2688 | 0.4703 | 0.426 | 0.6411 | 0.1494 | 0.475 | 0.1747 | 0.3035 | 0.0632 | 0.3275 | 0.0788 | 0.2598 | 0.2417 | 0.409 | 0.2273 | 0.3774 | 0.296 | 0.417 | 0.5231 | 0.6358 | 0.6 | 0.6995 | 0.1266 | 0.5167 | 0.3678 | 0.4991 | 0.0182 | 0.31 | 0.1321 | 0.2452 | | 13.4376 | 4.0 | 1836 | 12.0504 | 0.2831 | 0.4512 | 0.2953 | 0.1426 | 0.3859 | 0.469 | 0.2299 | 0.441 | 0.4546 | 0.2539 | 0.5608 | 0.6712 | 0.716 | 0.8309 | 0.357 | 0.459 | 0.2616 | 0.4676 | 0.4488 | 0.6417 | 0.2019 | 0.5125 | 0.1874 | 0.2842 | 0.0467 | 0.2941 | 0.0755 | 0.2582 | 0.2358 | 0.3972 | 0.2782 | 0.4345 | 0.2651 | 0.3929 | 0.5228 | 0.6237 | 0.571 | 0.6738 | 0.1583 | 0.45 | 0.3526 | 0.4729 | 0.0216 | 0.35 | 0.1126 | 0.1855 | | 13.0503 | 5.0 | 2295 | 12.2470 | 0.2781 | 0.4505 | 0.2859 | 0.1352 | 0.3917 | 0.4434 | 0.2417 | 0.4386 | 0.4518 | 0.2304 | 0.5572 | 0.6509 | 0.7016 | 0.8247 | 0.3341 | 0.4344 | 0.2608 | 0.4919 | 0.4105 | 0.6234 | 0.1222 | 0.4375 | 0.1875 | 0.2825 | 0.0837 | 0.302 | 0.0757 | 0.2683 | 0.2379 | 0.3938 | 0.2675 | 0.4119 | 0.295 | 0.4045 | 0.5211 | 0.6243 | 0.5795 | 0.6805 | 0.131 | 0.4833 | 0.3578 | 0.4833 | 0.0333 | 0.345 | 0.1284 | 0.1887 | ### Framework versions - Transformers 4.46.0.dev0 - Pytorch 2.4.1+cu121 - Datasets 3.0.1 - Tokenizers 0.20.1
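The card above reports evaluation metrics and hyperparameters but no usage snippet. As a hedged starting point, the sketch below assumes the `b09501048/rtdetr3` checkpoint loads through the standard `transformers` object-detection auto classes (consistent with the Transformers 4.46 version listed above); the image path `worksite.jpg` is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "b09501048/rtdetr3"  # repo id from this card
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("worksite.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) triples in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # PIL gives (width, height)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    # id2label maps indices to the safety-equipment classes listed with this entry
    # (person, ear, earmuffs, face, ...).
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```

The 0.5 threshold is only an example value; given the modest mAP reported above, it may need tuning per class.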
[ "person", "ear", "earmuffs", "face", "face-guard", "face-mask-medical", "foot", "tools", "glasses", "gloves", "helmet", "hands", "head", "medical-suit", "shoes", "safety-suit", "safety-vest" ]
b09501048/rtdetr4
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# rtdetr4

This model was trained from scratch on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.19.1
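Since the card lists hyperparameters but no training code, the following is only an illustrative mapping of those values onto `transformers.TrainingArguments`; the original training script, dataset, and data collator are not published with this entry, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rtdetr4-output",      # placeholder directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 from the card match the Trainer defaults,
    # so no optimizer overrides are needed here.
)
```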
[ "person", "ear", "earmuffs", "face", "face-guard", "face-mask-medical", "foot", "tools", "glasses", "gloves", "helmet", "hands", "head", "medical-suit", "shoes", "safety-suit", "safety-vest" ]
doktor47/zinemind_msft_200true
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "table", "table rotated" ]
doktor47/zinemind_msft_16temp
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
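The card itself gives no usage details, but the label set recorded just below (table, table column, table row, table column header, table projected row header, table spanning cell) matches the class names commonly used for table structure recognition with DETR-style detectors. Purely as a hedged check along those lines, one might inspect the label mapping stored with the checkpoint; the repository id is taken from this entry.

```python
from transformers import AutoConfig

# Hedged sketch: confirm that the checkpoint exposes the structure-recognition
# classes listed with this entry before building any inference code around it.
config = AutoConfig.from_pretrained("doktor47/zinemind_msft_16temp")
print(config.id2label)
```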
[ "table", "table column", "table row", "table column header", "table projected row header", "table spanning cell" ]
pabloOmega/curves-detection
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3" ]
gargeya2003/detr-layers-updated1
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5" ]
iwake/detr-finetuned-cppe-5-10k-steps
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-finetuned-cppe-5-10k-steps This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset. It achieves the following results on the evaluation set: - Loss: 1.2399 - Map: 0.296 - Map 50: 0.5758 - Map 75: 0.258 - Map Small: 0.0949 - Map Medium: 0.2452 - Map Large: 0.4635 - Mar 1: 0.2979 - Mar 10: 0.4713 - Mar 100: 0.4813 - Mar Small: 0.2026 - Mar Medium: 0.4228 - Mar Large: 0.6701 - Map Coverall: 0.5371 - Mar 100 Coverall: 0.6878 - Map Face Shield: 0.2634 - Mar 100 Face Shield: 0.4785 - Map Gloves: 0.2261 - Mar 100 Gloves: 0.4201 - Map Goggles: 0.1544 - Mar 100 Goggles: 0.3954 - Map Mask: 0.299 - Mar 100 Mask: 0.4249 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 1337 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 100.0 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:| | 3.1413 | 1.0 | 107 | 2.5672 | 0.0222 | 0.0515 | 0.0183 | 0.0012 | 0.0167 | 0.0238 | 0.0482 | 0.112 | 0.1464 | 0.0328 | 0.1046 | 0.1847 | 0.1017 | 0.4509 | 0.0 | 0.0 | 0.0037 | 0.1125 | 0.0 | 0.0 | 0.0058 | 0.1684 | | 2.1763 | 2.0 | 214 | 2.3691 | 0.0418 | 0.0978 | 0.03 | 0.0093 | 0.0232 | 0.0481 | 0.0632 | 0.14 | 0.1706 | 0.0603 | 0.1211 | 0.2081 | 0.1796 | 0.5027 | 0.0 | 0.0 | 0.0065 | 0.1835 | 0.0 | 0.0 | 0.0228 | 0.1667 | | 2.033 | 3.0 | 321 | 2.1971 | 0.051 | 0.1174 | 0.0452 | 0.0081 | 0.0458 | 0.0551 | 0.07 | 0.1497 | 0.1775 | 0.0554 | 0.1426 | 0.2071 | 0.2186 | 0.4986 | 0.0 | 0.0 | 0.012 | 0.2089 | 0.0 | 0.0 | 0.0241 | 0.18 | | 1.9673 | 4.0 | 428 | 2.2385 | 0.0525 | 0.1271 | 0.0364 | 0.0103 | 0.0458 | 0.0576 | 0.0808 | 0.1604 | 0.1836 | 0.0529 | 0.1426 | 0.2078 | 0.1835 | 0.5311 | 0.0013 | 0.0139 | 0.023 | 0.1839 | 0.0 | 0.0 | 0.0547 | 0.1889 | | 1.8638 | 5.0 | 535 | 2.1170 | 0.0726 | 0.1604 | 0.0617 | 0.0172 | 0.0471 | 0.0866 | 0.0858 | 0.1729 | 0.1906 | 0.0584 | 0.1324 | 0.238 | 0.2801 | 0.5338 | 0.0077 | 0.0506 | 0.0272 | 0.1795 | 0.0 | 0.0 | 0.0478 | 0.1889 | | 1.7537 | 6.0 | 642 | 1.8691 | 0.0996 | 0.2181 | 0.082 | 0.0343 | 0.0784 | 0.1225 | 0.1125 | 0.2125 | 0.2296 | 0.0823 | 0.1789 | 0.2897 | 0.3412 | 0.5608 | 0.0062 | 0.0848 | 0.0431 | 0.2576 | 0.0 | 0.0 | 0.1076 | 0.2449 | | 1.8525 | 7.0 | 749 | 1.9352 | 0.0883 | 0.1972 | 0.0729 | 0.0162 | 0.0604 | 0.1131 | 0.0985 | 
0.1901 | 0.2068 | 0.0537 | 0.1505 | 0.277 | 0.321 | 0.5514 | 0.004 | 0.0595 | 0.0186 | 0.1683 | 0.0 | 0.0 | 0.0981 | 0.2547 | | 1.7128 | 8.0 | 856 | 2.0212 | 0.0902 | 0.2058 | 0.0703 | 0.026 | 0.0967 | 0.1188 | 0.1292 | 0.2362 | 0.2487 | 0.0897 | 0.2289 | 0.3068 | 0.2383 | 0.4662 | 0.0293 | 0.2051 | 0.0422 | 0.1955 | 0.01 | 0.0723 | 0.1312 | 0.3044 | | 1.6811 | 9.0 | 963 | 1.8181 | 0.1123 | 0.2626 | 0.0793 | 0.0414 | 0.1039 | 0.1486 | 0.1474 | 0.2796 | 0.3016 | 0.1084 | 0.2635 | 0.3804 | 0.3228 | 0.5851 | 0.0444 | 0.243 | 0.0537 | 0.2612 | 0.0055 | 0.1262 | 0.1354 | 0.2924 | | 1.5809 | 10.0 | 1070 | 1.8683 | 0.1227 | 0.2838 | 0.0904 | 0.0536 | 0.106 | 0.1514 | 0.1464 | 0.2618 | 0.2856 | 0.1016 | 0.2436 | 0.3584 | 0.3417 | 0.5712 | 0.0665 | 0.2557 | 0.0441 | 0.2478 | 0.0089 | 0.0585 | 0.1525 | 0.2951 | | 1.5978 | 11.0 | 1177 | 1.7746 | 0.1169 | 0.2736 | 0.0897 | 0.0278 | 0.0922 | 0.1677 | 0.143 | 0.2665 | 0.2851 | 0.0903 | 0.2138 | 0.4063 | 0.3609 | 0.5932 | 0.0394 | 0.2051 | 0.0447 | 0.2241 | 0.0116 | 0.1431 | 0.1279 | 0.26 | | 1.5157 | 12.0 | 1284 | 1.7348 | 0.1261 | 0.2961 | 0.0899 | 0.0398 | 0.0923 | 0.1887 | 0.1445 | 0.2861 | 0.3027 | 0.1129 | 0.2438 | 0.4051 | 0.3806 | 0.555 | 0.0643 | 0.3 | 0.044 | 0.2545 | 0.0052 | 0.1323 | 0.1363 | 0.272 | | 1.4842 | 13.0 | 1391 | 1.6683 | 0.1459 | 0.3258 | 0.1245 | 0.0497 | 0.1165 | 0.2135 | 0.1808 | 0.333 | 0.3509 | 0.1028 | 0.2823 | 0.4944 | 0.4019 | 0.6036 | 0.0706 | 0.2987 | 0.0556 | 0.271 | 0.0121 | 0.2462 | 0.1894 | 0.3351 | | 1.4971 | 14.0 | 1498 | 1.8224 | 0.1376 | 0.3065 | 0.1079 | 0.0544 | 0.102 | 0.201 | 0.1601 | 0.2728 | 0.2865 | 0.1007 | 0.2205 | 0.4186 | 0.3695 | 0.5486 | 0.0501 | 0.2076 | 0.0497 | 0.2125 | 0.0215 | 0.1523 | 0.1971 | 0.3116 | | 1.4884 | 15.0 | 1605 | 1.6449 | 0.1567 | 0.3398 | 0.1244 | 0.0575 | 0.1118 | 0.2375 | 0.1708 | 0.3298 | 0.3492 | 0.1181 | 0.2709 | 0.5033 | 0.4296 | 0.6149 | 0.0709 | 0.3165 | 0.0662 | 0.2884 | 0.0101 | 0.1846 | 0.2068 | 0.3418 | | 1.4714 | 16.0 | 1712 | 1.6947 | 0.1451 | 0.3108 | 0.1146 | 0.0384 | 0.1007 | 0.2271 | 0.1673 | 0.3191 | 0.3475 | 0.1298 | 0.2767 | 0.5012 | 0.4384 | 0.6541 | 0.063 | 0.3089 | 0.0542 | 0.2799 | 0.0067 | 0.2062 | 0.1631 | 0.2884 | | 1.4282 | 17.0 | 1819 | 1.6426 | 0.1489 | 0.3459 | 0.1074 | 0.0545 | 0.1174 | 0.2215 | 0.176 | 0.3311 | 0.3488 | 0.1237 | 0.2875 | 0.4835 | 0.3759 | 0.5806 | 0.082 | 0.2608 | 0.0737 | 0.3246 | 0.0206 | 0.2538 | 0.1923 | 0.324 | | 1.401 | 18.0 | 1926 | 1.6453 | 0.1623 | 0.3508 | 0.1355 | 0.0404 | 0.1236 | 0.2432 | 0.1947 | 0.3464 | 0.3572 | 0.1363 | 0.286 | 0.5063 | 0.4247 | 0.6194 | 0.0808 | 0.3101 | 0.0892 | 0.2902 | 0.0189 | 0.2523 | 0.198 | 0.3142 | | 1.3853 | 19.0 | 2033 | 1.6557 | 0.1625 | 0.364 | 0.1221 | 0.0429 | 0.1351 | 0.2452 | 0.1883 | 0.3572 | 0.3791 | 0.1265 | 0.3225 | 0.5293 | 0.3997 | 0.6027 | 0.0739 | 0.338 | 0.1018 | 0.3281 | 0.0223 | 0.2723 | 0.2149 | 0.3542 | | 1.3623 | 20.0 | 2140 | 1.5289 | 0.1714 | 0.3623 | 0.1442 | 0.0628 | 0.1307 | 0.255 | 0.1846 | 0.3604 | 0.3811 | 0.1259 | 0.3197 | 0.5249 | 0.4419 | 0.6631 | 0.0922 | 0.3709 | 0.0966 | 0.3161 | 0.0104 | 0.2077 | 0.2158 | 0.3476 | | 1.3285 | 21.0 | 2247 | 1.5132 | 0.1702 | 0.3673 | 0.1336 | 0.0684 | 0.1297 | 0.2644 | 0.1868 | 0.366 | 0.3789 | 0.1566 | 0.3097 | 0.5341 | 0.4426 | 0.6509 | 0.0996 | 0.3899 | 0.0908 | 0.2906 | 0.0151 | 0.2385 | 0.203 | 0.3244 | | 1.3149 | 22.0 | 2354 | 1.4973 | 0.1811 | 0.3975 | 0.1431 | 0.0654 | 0.148 | 0.272 | 0.2062 | 0.3738 | 0.3921 | 0.1473 | 0.3242 | 0.5597 | 0.4239 | 0.6405 | 0.1129 | 0.3557 | 0.1112 | 0.317 | 0.0213 | 0.2923 | 
0.236 | 0.3551 | | 1.3068 | 23.0 | 2461 | 1.4898 | 0.177 | 0.3899 | 0.1367 | 0.0688 | 0.1308 | 0.2934 | 0.1963 | 0.3586 | 0.3792 | 0.1551 | 0.3026 | 0.5689 | 0.4363 | 0.6455 | 0.1079 | 0.3038 | 0.1159 | 0.3192 | 0.0137 | 0.2908 | 0.2111 | 0.3369 | | 1.2989 | 24.0 | 2568 | 1.5076 | 0.1798 | 0.4009 | 0.14 | 0.0721 | 0.1432 | 0.2728 | 0.1963 | 0.3758 | 0.4003 | 0.1595 | 0.343 | 0.5591 | 0.4407 | 0.6541 | 0.091 | 0.3266 | 0.1228 | 0.3705 | 0.0225 | 0.3046 | 0.2223 | 0.3458 | | 1.2774 | 25.0 | 2675 | 1.4761 | 0.1836 | 0.3988 | 0.151 | 0.0584 | 0.1545 | 0.2718 | 0.2089 | 0.3823 | 0.4049 | 0.2009 | 0.3468 | 0.5419 | 0.457 | 0.645 | 0.095 | 0.3519 | 0.1143 | 0.354 | 0.0281 | 0.32 | 0.2236 | 0.3533 | | 1.2854 | 26.0 | 2782 | 1.4837 | 0.183 | 0.3948 | 0.1453 | 0.0562 | 0.143 | 0.2978 | 0.2086 | 0.3706 | 0.39 | 0.1384 | 0.3259 | 0.5593 | 0.4526 | 0.6617 | 0.0929 | 0.3494 | 0.1228 | 0.3607 | 0.0288 | 0.2415 | 0.218 | 0.3369 | | 1.2664 | 27.0 | 2889 | 1.4817 | 0.1854 | 0.3881 | 0.15 | 0.0674 | 0.1359 | 0.3007 | 0.2194 | 0.3635 | 0.3848 | 0.1468 | 0.3192 | 0.5599 | 0.441 | 0.6351 | 0.1261 | 0.343 | 0.1203 | 0.3603 | 0.0206 | 0.2385 | 0.2189 | 0.3471 | | 1.2362 | 28.0 | 2996 | 1.5092 | 0.1895 | 0.4105 | 0.1573 | 0.0664 | 0.1529 | 0.2986 | 0.2278 | 0.3784 | 0.3927 | 0.1304 | 0.3329 | 0.5668 | 0.4319 | 0.6315 | 0.1316 | 0.3595 | 0.1424 | 0.3607 | 0.0227 | 0.2677 | 0.219 | 0.344 | | 1.2433 | 29.0 | 3103 | 1.5012 | 0.173 | 0.3884 | 0.124 | 0.0487 | 0.1368 | 0.2585 | 0.2016 | 0.3509 | 0.3601 | 0.1423 | 0.2913 | 0.4978 | 0.4395 | 0.595 | 0.0937 | 0.3354 | 0.1109 | 0.3054 | 0.0136 | 0.2308 | 0.2073 | 0.3338 | | 1.221 | 30.0 | 3210 | 1.4452 | 0.1939 | 0.4169 | 0.1656 | 0.0769 | 0.1538 | 0.2959 | 0.2181 | 0.372 | 0.3802 | 0.1333 | 0.3207 | 0.5319 | 0.4588 | 0.6437 | 0.1116 | 0.3582 | 0.1265 | 0.2996 | 0.0285 | 0.2508 | 0.2442 | 0.3489 | | 1.1826 | 31.0 | 3317 | 1.4250 | 0.2024 | 0.4309 | 0.1642 | 0.0577 | 0.1592 | 0.32 | 0.2186 | 0.3952 | 0.4116 | 0.1439 | 0.3663 | 0.5764 | 0.4798 | 0.6779 | 0.1187 | 0.3696 | 0.1517 | 0.3455 | 0.0314 | 0.3123 | 0.2304 | 0.3524 | | 1.2067 | 32.0 | 3424 | 1.4167 | 0.1991 | 0.4379 | 0.144 | 0.0729 | 0.1628 | 0.3051 | 0.2248 | 0.3895 | 0.4076 | 0.2067 | 0.3476 | 0.5385 | 0.4682 | 0.6518 | 0.1416 | 0.3595 | 0.1252 | 0.3384 | 0.0301 | 0.3338 | 0.2302 | 0.3547 | | 1.1873 | 33.0 | 3531 | 1.4461 | 0.1935 | 0.4322 | 0.143 | 0.0748 | 0.1479 | 0.3049 | 0.217 | 0.3865 | 0.4044 | 0.1737 | 0.3478 | 0.5585 | 0.4472 | 0.6315 | 0.0929 | 0.3696 | 0.1704 | 0.3429 | 0.0355 | 0.3585 | 0.2215 | 0.3196 | | 1.1774 | 34.0 | 3638 | 1.4112 | 0.2036 | 0.4482 | 0.1642 | 0.0695 | 0.1481 | 0.3243 | 0.2318 | 0.3955 | 0.4051 | 0.1743 | 0.3449 | 0.5468 | 0.4832 | 0.6631 | 0.1264 | 0.3975 | 0.1389 | 0.3281 | 0.0363 | 0.3 | 0.2332 | 0.3369 | | 1.1894 | 35.0 | 3745 | 1.4219 | 0.213 | 0.4398 | 0.1839 | 0.0646 | 0.1641 | 0.3189 | 0.2258 | 0.3885 | 0.4009 | 0.1443 | 0.3434 | 0.567 | 0.4973 | 0.673 | 0.1232 | 0.3291 | 0.153 | 0.3562 | 0.0402 | 0.3046 | 0.2515 | 0.3418 | | 1.1719 | 36.0 | 3852 | 1.3858 | 0.2075 | 0.43 | 0.1761 | 0.0632 | 0.1567 | 0.3405 | 0.2291 | 0.3966 | 0.4078 | 0.1714 | 0.3427 | 0.5842 | 0.4981 | 0.6883 | 0.1098 | 0.3696 | 0.1647 | 0.3554 | 0.0362 | 0.2785 | 0.2286 | 0.3471 | | 1.1191 | 37.0 | 3959 | 1.4685 | 0.1947 | 0.413 | 0.1557 | 0.0639 | 0.147 | 0.3064 | 0.2115 | 0.3668 | 0.3791 | 0.1312 | 0.3394 | 0.5027 | 0.4546 | 0.6514 | 0.1321 | 0.3177 | 0.1435 | 0.3491 | 0.0262 | 0.2415 | 0.2174 | 0.3356 | | 1.1376 | 38.0 | 4066 | 1.3801 | 0.2095 | 0.4365 | 0.1706 | 0.0729 | 0.1621 | 0.3205 | 0.2337 | 
0.3964 | 0.4109 | 0.1773 | 0.352 | 0.5531 | 0.4944 | 0.6734 | 0.1034 | 0.3582 | 0.1508 | 0.3679 | 0.0556 | 0.3015 | 0.2435 | 0.3533 | | 1.1187 | 39.0 | 4173 | 1.3977 | 0.2118 | 0.4348 | 0.1743 | 0.0573 | 0.1632 | 0.3241 | 0.2316 | 0.4068 | 0.4213 | 0.1385 | 0.3542 | 0.6066 | 0.492 | 0.677 | 0.1317 | 0.3987 | 0.1563 | 0.3442 | 0.0329 | 0.3369 | 0.2461 | 0.3498 | | 1.1005 | 40.0 | 4280 | 1.4000 | 0.2183 | 0.4578 | 0.1804 | 0.0648 | 0.1627 | 0.3479 | 0.234 | 0.4057 | 0.4236 | 0.1818 | 0.356 | 0.5934 | 0.496 | 0.6788 | 0.1287 | 0.3911 | 0.1675 | 0.3683 | 0.0411 | 0.32 | 0.2581 | 0.3596 | | 1.1131 | 41.0 | 4387 | 1.3965 | 0.2196 | 0.4648 | 0.186 | 0.0615 | 0.1652 | 0.352 | 0.2429 | 0.4085 | 0.4233 | 0.1522 | 0.3668 | 0.5934 | 0.4985 | 0.6743 | 0.1333 | 0.3709 | 0.1737 | 0.3656 | 0.0434 | 0.3431 | 0.2492 | 0.3627 | | 1.0952 | 42.0 | 4494 | 1.3614 | 0.2211 | 0.4568 | 0.1837 | 0.0791 | 0.1889 | 0.3373 | 0.236 | 0.4192 | 0.4331 | 0.18 | 0.3991 | 0.5683 | 0.5185 | 0.682 | 0.1322 | 0.4114 | 0.1665 | 0.3714 | 0.0418 | 0.3354 | 0.2466 | 0.3653 | | 1.0869 | 43.0 | 4601 | 1.3947 | 0.214 | 0.4677 | 0.1768 | 0.0648 | 0.1666 | 0.3402 | 0.2263 | 0.4023 | 0.4138 | 0.1509 | 0.3593 | 0.5697 | 0.4982 | 0.6468 | 0.1428 | 0.4291 | 0.1653 | 0.3415 | 0.0417 | 0.3031 | 0.2222 | 0.3484 | | 1.0787 | 44.0 | 4708 | 1.3556 | 0.2245 | 0.467 | 0.1869 | 0.0561 | 0.1729 | 0.3474 | 0.2474 | 0.4212 | 0.4368 | 0.1737 | 0.3889 | 0.6009 | 0.5125 | 0.6779 | 0.1332 | 0.4177 | 0.1801 | 0.3871 | 0.0402 | 0.32 | 0.2565 | 0.3813 | | 1.0596 | 45.0 | 4815 | 1.3947 | 0.2369 | 0.4907 | 0.1976 | 0.0707 | 0.1735 | 0.3883 | 0.2502 | 0.4183 | 0.4325 | 0.1603 | 0.3719 | 0.6108 | 0.5126 | 0.6608 | 0.1574 | 0.4228 | 0.188 | 0.3638 | 0.066 | 0.3477 | 0.2602 | 0.3676 | | 1.0826 | 46.0 | 4922 | 1.4612 | 0.208 | 0.4644 | 0.1612 | 0.0643 | 0.1703 | 0.3207 | 0.2345 | 0.3899 | 0.4046 | 0.1531 | 0.355 | 0.5565 | 0.4677 | 0.6315 | 0.12 | 0.3671 | 0.1762 | 0.3491 | 0.0452 | 0.3308 | 0.2308 | 0.3444 | | 1.0805 | 47.0 | 5029 | 1.3511 | 0.2227 | 0.4615 | 0.1878 | 0.0642 | 0.1708 | 0.375 | 0.2433 | 0.4212 | 0.437 | 0.1386 | 0.3853 | 0.6146 | 0.5022 | 0.6739 | 0.1304 | 0.4253 | 0.1903 | 0.3821 | 0.0631 | 0.34 | 0.2275 | 0.3636 | | 1.0771 | 48.0 | 5136 | 1.3661 | 0.2354 | 0.4877 | 0.2058 | 0.0745 | 0.1712 | 0.3831 | 0.2569 | 0.4247 | 0.437 | 0.1426 | 0.375 | 0.6206 | 0.508 | 0.6856 | 0.1561 | 0.4203 | 0.1792 | 0.3737 | 0.067 | 0.3369 | 0.2669 | 0.3684 | | 1.0628 | 49.0 | 5243 | 1.3718 | 0.2257 | 0.4696 | 0.1882 | 0.0666 | 0.1704 | 0.3716 | 0.2495 | 0.4194 | 0.4389 | 0.1565 | 0.3915 | 0.6076 | 0.517 | 0.6856 | 0.1506 | 0.4468 | 0.1632 | 0.3812 | 0.0448 | 0.3046 | 0.2529 | 0.376 | | 1.027 | 50.0 | 5350 | 1.3364 | 0.2191 | 0.4542 | 0.1799 | 0.0547 | 0.1651 | 0.3593 | 0.2381 | 0.4246 | 0.4389 | 0.1632 | 0.385 | 0.5987 | 0.522 | 0.6856 | 0.1332 | 0.443 | 0.1737 | 0.3955 | 0.0446 | 0.3138 | 0.2221 | 0.3564 | | 1.0418 | 51.0 | 5457 | 1.3681 | 0.2369 | 0.4783 | 0.195 | 0.0641 | 0.1763 | 0.3895 | 0.2596 | 0.4307 | 0.4484 | 0.1748 | 0.3867 | 0.626 | 0.5177 | 0.6712 | 0.167 | 0.4278 | 0.1996 | 0.3826 | 0.0542 | 0.3846 | 0.2461 | 0.3756 | | 1.0249 | 52.0 | 5564 | 1.3403 | 0.2343 | 0.4805 | 0.1894 | 0.0595 | 0.1842 | 0.3862 | 0.2507 | 0.4277 | 0.441 | 0.1404 | 0.3939 | 0.6093 | 0.511 | 0.6599 | 0.1725 | 0.4392 | 0.1738 | 0.354 | 0.0614 | 0.3692 | 0.2526 | 0.3827 | | 1.0091 | 53.0 | 5671 | 1.3439 | 0.2291 | 0.4832 | 0.1974 | 0.0653 | 0.1841 | 0.3715 | 0.2557 | 0.4301 | 0.445 | 0.1686 | 0.4035 | 0.6129 | 0.5162 | 0.6784 | 0.142 | 0.4278 | 0.1797 | 0.3754 | 0.052 | 0.3477 | 
0.2556 | 0.3956 | | 1.0185 | 54.0 | 5778 | 1.3464 | 0.236 | 0.4899 | 0.1981 | 0.0784 | 0.1762 | 0.3833 | 0.2564 | 0.4278 | 0.4451 | 0.159 | 0.3862 | 0.6195 | 0.5223 | 0.6874 | 0.1754 | 0.4329 | 0.1766 | 0.3812 | 0.0477 | 0.3446 | 0.2577 | 0.3791 | | 1.002 | 55.0 | 5885 | 1.3273 | 0.2397 | 0.5027 | 0.2123 | 0.0706 | 0.1783 | 0.4007 | 0.262 | 0.4312 | 0.4432 | 0.151 | 0.3825 | 0.6209 | 0.5223 | 0.6865 | 0.1774 | 0.443 | 0.1857 | 0.3741 | 0.0574 | 0.3262 | 0.2555 | 0.3862 | | 0.9966 | 56.0 | 5992 | 1.3200 | 0.2411 | 0.5005 | 0.1882 | 0.0878 | 0.1812 | 0.4035 | 0.2553 | 0.4318 | 0.4459 | 0.2299 | 0.3699 | 0.6179 | 0.5224 | 0.677 | 0.1827 | 0.4456 | 0.1869 | 0.3884 | 0.0847 | 0.3554 | 0.2289 | 0.3631 | | 0.9794 | 57.0 | 6099 | 1.3284 | 0.2498 | 0.5009 | 0.2216 | 0.0781 | 0.1934 | 0.4007 | 0.2653 | 0.4302 | 0.443 | 0.1738 | 0.3822 | 0.6232 | 0.5308 | 0.7018 | 0.19 | 0.4228 | 0.1929 | 0.3621 | 0.0746 | 0.3431 | 0.2607 | 0.3853 | | 1.0153 | 58.0 | 6206 | 1.3197 | 0.2536 | 0.5145 | 0.2171 | 0.078 | 0.2042 | 0.4004 | 0.275 | 0.4442 | 0.4588 | 0.1375 | 0.397 | 0.657 | 0.5125 | 0.6856 | 0.1862 | 0.4506 | 0.1927 | 0.3875 | 0.1011 | 0.3738 | 0.2753 | 0.3964 | | 0.9602 | 59.0 | 6313 | 1.2911 | 0.2552 | 0.5334 | 0.2117 | 0.0739 | 0.2123 | 0.4031 | 0.2659 | 0.435 | 0.4489 | 0.1789 | 0.4037 | 0.6197 | 0.531 | 0.686 | 0.1922 | 0.4291 | 0.1926 | 0.3705 | 0.099 | 0.3662 | 0.2611 | 0.3924 | | 0.972 | 60.0 | 6420 | 1.2744 | 0.2603 | 0.5274 | 0.2143 | 0.0811 | 0.2118 | 0.4104 | 0.2669 | 0.4441 | 0.4606 | 0.2039 | 0.4078 | 0.6325 | 0.5301 | 0.6883 | 0.189 | 0.462 | 0.208 | 0.4045 | 0.1011 | 0.3492 | 0.2735 | 0.3991 | | 0.9554 | 61.0 | 6527 | 1.3204 | 0.26 | 0.5405 | 0.222 | 0.0885 | 0.2028 | 0.4086 | 0.2648 | 0.4356 | 0.4484 | 0.1653 | 0.3894 | 0.6299 | 0.5376 | 0.6919 | 0.1997 | 0.4519 | 0.1799 | 0.3571 | 0.0969 | 0.3508 | 0.2858 | 0.3902 | | 0.9409 | 62.0 | 6634 | 1.3063 | 0.2531 | 0.5388 | 0.2049 | 0.077 | 0.2004 | 0.4041 | 0.263 | 0.4347 | 0.4482 | 0.1751 | 0.3891 | 0.6224 | 0.5131 | 0.6743 | 0.1969 | 0.4456 | 0.1894 | 0.3732 | 0.0927 | 0.3508 | 0.2735 | 0.3969 | | 0.9692 | 63.0 | 6741 | 1.3074 | 0.2522 | 0.5361 | 0.2037 | 0.0766 | 0.1991 | 0.4112 | 0.2517 | 0.4435 | 0.4561 | 0.176 | 0.3976 | 0.6311 | 0.5317 | 0.6914 | 0.1913 | 0.4532 | 0.2006 | 0.4 | 0.0854 | 0.3692 | 0.252 | 0.3667 | | 0.9391 | 64.0 | 6848 | 1.3006 | 0.2571 | 0.5233 | 0.2144 | 0.0743 | 0.2098 | 0.4183 | 0.2613 | 0.4439 | 0.4579 | 0.1965 | 0.4005 | 0.6377 | 0.5309 | 0.6896 | 0.1905 | 0.4506 | 0.1945 | 0.3951 | 0.0973 | 0.3677 | 0.2724 | 0.3867 | | 0.9332 | 65.0 | 6955 | 1.2621 | 0.2599 | 0.5177 | 0.2177 | 0.0728 | 0.2141 | 0.4172 | 0.2786 | 0.4613 | 0.4748 | 0.2141 | 0.4338 | 0.6329 | 0.5287 | 0.6892 | 0.2085 | 0.4709 | 0.1822 | 0.4085 | 0.1052 | 0.3938 | 0.2747 | 0.4116 | | 0.9279 | 66.0 | 7062 | 1.2837 | 0.2539 | 0.5219 | 0.2076 | 0.077 | 0.2045 | 0.4055 | 0.2681 | 0.4441 | 0.459 | 0.173 | 0.3982 | 0.6455 | 0.5309 | 0.6937 | 0.1965 | 0.4608 | 0.1829 | 0.3879 | 0.0852 | 0.3569 | 0.2742 | 0.3956 | | 0.9376 | 67.0 | 7169 | 1.2944 | 0.2589 | 0.5241 | 0.2166 | 0.0773 | 0.2102 | 0.4181 | 0.277 | 0.4434 | 0.4589 | 0.1887 | 0.4041 | 0.6343 | 0.5295 | 0.6793 | 0.2006 | 0.4519 | 0.1909 | 0.4054 | 0.101 | 0.3615 | 0.2726 | 0.3964 | | 0.9361 | 68.0 | 7276 | 1.3035 | 0.2623 | 0.5402 | 0.2119 | 0.0789 | 0.2189 | 0.4159 | 0.2807 | 0.4494 | 0.4643 | 0.1988 | 0.4179 | 0.6294 | 0.5236 | 0.6734 | 0.1949 | 0.4494 | 0.2046 | 0.4161 | 0.1056 | 0.3769 | 0.2826 | 0.4058 | | 0.921 | 69.0 | 7383 | 1.2813 | 0.2732 | 0.5391 | 0.2305 | 0.085 | 0.2224 | 0.4243 | 0.2703 
| 0.4524 | 0.466 | 0.2096 | 0.4196 | 0.6282 | 0.5322 | 0.6797 | 0.2234 | 0.4633 | 0.209 | 0.4062 | 0.1064 | 0.3692 | 0.295 | 0.4116 | | 0.9013 | 70.0 | 7490 | 1.3099 | 0.2688 | 0.5346 | 0.223 | 0.0741 | 0.2163 | 0.4203 | 0.2732 | 0.4447 | 0.4579 | 0.1724 | 0.4066 | 0.6318 | 0.5329 | 0.6748 | 0.2094 | 0.4481 | 0.215 | 0.4027 | 0.0994 | 0.3646 | 0.2873 | 0.3996 | | 0.9154 | 71.0 | 7597 | 1.2681 | 0.2728 | 0.5379 | 0.2291 | 0.0808 | 0.2241 | 0.4265 | 0.2743 | 0.445 | 0.4563 | 0.1687 | 0.4044 | 0.633 | 0.535 | 0.6788 | 0.2228 | 0.4532 | 0.2104 | 0.3911 | 0.11 | 0.3615 | 0.286 | 0.3969 | | 0.8979 | 72.0 | 7704 | 1.2593 | 0.269 | 0.5354 | 0.2264 | 0.0722 | 0.2152 | 0.4356 | 0.2752 | 0.4583 | 0.4686 | 0.1889 | 0.427 | 0.6399 | 0.5304 | 0.677 | 0.2063 | 0.4544 | 0.1968 | 0.4009 | 0.1263 | 0.4092 | 0.2852 | 0.4013 | | 0.8901 | 73.0 | 7811 | 1.2819 | 0.2656 | 0.5447 | 0.2306 | 0.0831 | 0.2037 | 0.4385 | 0.2656 | 0.4402 | 0.4524 | 0.1725 | 0.3957 | 0.6449 | 0.5208 | 0.6878 | 0.2185 | 0.4443 | 0.1941 | 0.3942 | 0.1169 | 0.3477 | 0.2775 | 0.388 | | 0.8943 | 74.0 | 7918 | 1.2786 | 0.268 | 0.5412 | 0.2261 | 0.0806 | 0.2127 | 0.4294 | 0.2696 | 0.4506 | 0.4598 | 0.1709 | 0.4042 | 0.6454 | 0.5438 | 0.6779 | 0.2051 | 0.4722 | 0.192 | 0.3915 | 0.1189 | 0.3631 | 0.2804 | 0.3942 | | 0.8789 | 75.0 | 8025 | 1.2746 | 0.2772 | 0.5569 | 0.2291 | 0.0829 | 0.226 | 0.4355 | 0.276 | 0.4553 | 0.4666 | 0.1862 | 0.4155 | 0.6381 | 0.5432 | 0.6901 | 0.2387 | 0.4759 | 0.2011 | 0.3884 | 0.1214 | 0.3846 | 0.2817 | 0.3938 | | 0.877 | 76.0 | 8132 | 1.2574 | 0.2757 | 0.5538 | 0.2251 | 0.0852 | 0.2285 | 0.4347 | 0.2706 | 0.4565 | 0.4684 | 0.1859 | 0.4177 | 0.6485 | 0.5329 | 0.6856 | 0.2301 | 0.4759 | 0.2069 | 0.3978 | 0.1224 | 0.3785 | 0.2865 | 0.4044 | | 0.8764 | 77.0 | 8239 | 1.2568 | 0.2776 | 0.5465 | 0.2391 | 0.0777 | 0.2323 | 0.4377 | 0.2747 | 0.4574 | 0.4707 | 0.1812 | 0.4223 | 0.646 | 0.5331 | 0.6815 | 0.2196 | 0.4759 | 0.2148 | 0.3996 | 0.1202 | 0.3785 | 0.3001 | 0.4182 | | 0.878 | 78.0 | 8346 | 1.2859 | 0.2726 | 0.5531 | 0.2368 | 0.0729 | 0.2235 | 0.4329 | 0.2742 | 0.4495 | 0.4608 | 0.1963 | 0.4066 | 0.6355 | 0.5247 | 0.6739 | 0.225 | 0.4557 | 0.2081 | 0.4071 | 0.1183 | 0.3631 | 0.287 | 0.404 | | 0.8631 | 79.0 | 8453 | 1.2818 | 0.2792 | 0.5567 | 0.2261 | 0.0776 | 0.2267 | 0.4525 | 0.2842 | 0.4602 | 0.4716 | 0.1764 | 0.4229 | 0.6576 | 0.5324 | 0.6784 | 0.2282 | 0.4582 | 0.2063 | 0.4027 | 0.1381 | 0.4123 | 0.2911 | 0.4062 | | 0.8718 | 80.0 | 8560 | 1.2793 | 0.2888 | 0.5613 | 0.2521 | 0.0779 | 0.2453 | 0.4497 | 0.2854 | 0.464 | 0.4748 | 0.1812 | 0.4356 | 0.6501 | 0.537 | 0.6734 | 0.2581 | 0.4861 | 0.2224 | 0.4125 | 0.1338 | 0.3985 | 0.2926 | 0.4036 | | 0.8574 | 81.0 | 8667 | 1.2768 | 0.2853 | 0.5614 | 0.2534 | 0.085 | 0.2389 | 0.4454 | 0.2847 | 0.4614 | 0.4718 | 0.1953 | 0.4209 | 0.6534 | 0.5334 | 0.6815 | 0.2342 | 0.4595 | 0.2176 | 0.4165 | 0.15 | 0.3985 | 0.2913 | 0.4031 | | 0.8564 | 82.0 | 8774 | 1.2782 | 0.2832 | 0.5646 | 0.2411 | 0.0917 | 0.2346 | 0.4385 | 0.284 | 0.4659 | 0.4774 | 0.2162 | 0.4213 | 0.6493 | 0.5293 | 0.6811 | 0.2323 | 0.4709 | 0.2203 | 0.4192 | 0.1401 | 0.4046 | 0.2941 | 0.4111 | | 0.8492 | 83.0 | 8881 | 1.2697 | 0.2816 | 0.5515 | 0.2545 | 0.0855 | 0.236 | 0.4398 | 0.2908 | 0.4688 | 0.479 | 0.2011 | 0.4241 | 0.6596 | 0.5252 | 0.6793 | 0.2385 | 0.4861 | 0.2054 | 0.4205 | 0.1463 | 0.4046 | 0.2925 | 0.4044 | | 0.8459 | 84.0 | 8988 | 1.2767 | 0.2863 | 0.5626 | 0.2473 | 0.0857 | 0.2386 | 0.4431 | 0.2897 | 0.4627 | 0.4723 | 0.1854 | 0.4164 | 0.6541 | 0.5434 | 0.6914 | 0.2267 | 0.4696 | 0.2308 | 0.4165 | 0.1434 
| 0.3877 | 0.287 | 0.396 | | 0.8295 | 85.0 | 9095 | 1.2625 | 0.2864 | 0.5699 | 0.2489 | 0.084 | 0.2389 | 0.4477 | 0.2889 | 0.4599 | 0.47 | 0.1786 | 0.4187 | 0.6499 | 0.5455 | 0.6923 | 0.2287 | 0.462 | 0.2252 | 0.421 | 0.1397 | 0.3754 | 0.2929 | 0.3991 | | 0.8474 | 86.0 | 9202 | 1.2640 | 0.2925 | 0.572 | 0.2551 | 0.0931 | 0.2401 | 0.453 | 0.2922 | 0.4686 | 0.4778 | 0.1842 | 0.4289 | 0.6645 | 0.544 | 0.6914 | 0.2442 | 0.4886 | 0.2219 | 0.4071 | 0.1569 | 0.3969 | 0.2952 | 0.4049 | | 0.8288 | 87.0 | 9309 | 1.2591 | 0.2863 | 0.563 | 0.2478 | 0.0803 | 0.2381 | 0.4456 | 0.2857 | 0.4635 | 0.4744 | 0.1918 | 0.425 | 0.6502 | 0.5464 | 0.6959 | 0.2381 | 0.4899 | 0.216 | 0.408 | 0.1411 | 0.3769 | 0.2897 | 0.4013 | | 0.8408 | 88.0 | 9416 | 1.2432 | 0.2919 | 0.5713 | 0.2494 | 0.0926 | 0.2446 | 0.4512 | 0.2944 | 0.4682 | 0.4771 | 0.2127 | 0.4237 | 0.6558 | 0.5439 | 0.6869 | 0.2523 | 0.4823 | 0.2304 | 0.4138 | 0.1399 | 0.3938 | 0.2932 | 0.4084 | | 0.835 | 89.0 | 9523 | 1.2472 | 0.2885 | 0.5652 | 0.2456 | 0.0925 | 0.241 | 0.4443 | 0.2892 | 0.4662 | 0.4773 | 0.2096 | 0.4296 | 0.6479 | 0.5483 | 0.6986 | 0.2465 | 0.4709 | 0.2226 | 0.4152 | 0.1322 | 0.3908 | 0.2928 | 0.4111 | | 0.8299 | 90.0 | 9630 | 1.2481 | 0.2898 | 0.5648 | 0.2528 | 0.0972 | 0.239 | 0.4457 | 0.2939 | 0.463 | 0.4703 | 0.1948 | 0.4196 | 0.6457 | 0.5405 | 0.6829 | 0.2402 | 0.4646 | 0.2221 | 0.4022 | 0.1512 | 0.3892 | 0.295 | 0.4124 | | 0.817 | 91.0 | 9737 | 1.2366 | 0.2924 | 0.576 | 0.2475 | 0.0951 | 0.2384 | 0.4528 | 0.2952 | 0.4674 | 0.4775 | 0.2033 | 0.4255 | 0.6498 | 0.5415 | 0.6914 | 0.2416 | 0.4696 | 0.2272 | 0.4089 | 0.1537 | 0.3985 | 0.2981 | 0.4191 | | 0.8196 | 92.0 | 9844 | 1.2502 | 0.2943 | 0.5749 | 0.2597 | 0.089 | 0.2433 | 0.4536 | 0.2961 | 0.4633 | 0.4753 | 0.1969 | 0.423 | 0.6582 | 0.5436 | 0.6883 | 0.2408 | 0.4722 | 0.2311 | 0.4138 | 0.1565 | 0.3862 | 0.2993 | 0.416 | | 0.8184 | 93.0 | 9951 | 1.2399 | 0.2955 | 0.5787 | 0.2571 | 0.0902 | 0.2481 | 0.4588 | 0.294 | 0.4652 | 0.4756 | 0.2001 | 0.4218 | 0.6572 | 0.5405 | 0.6869 | 0.2498 | 0.4722 | 0.2357 | 0.4254 | 0.158 | 0.3815 | 0.2936 | 0.412 | | 0.8053 | 94.0 | 10058 | 1.2512 | 0.2964 | 0.5754 | 0.2633 | 0.0953 | 0.2468 | 0.4592 | 0.2955 | 0.4662 | 0.4786 | 0.1966 | 0.4229 | 0.6605 | 0.5436 | 0.6937 | 0.2502 | 0.4747 | 0.2288 | 0.4143 | 0.1627 | 0.3969 | 0.2964 | 0.4133 | | 0.8152 | 95.0 | 10165 | 1.2426 | 0.2965 | 0.5709 | 0.2562 | 0.0884 | 0.2481 | 0.4587 | 0.2989 | 0.4688 | 0.4784 | 0.19 | 0.4268 | 0.6613 | 0.5405 | 0.6883 | 0.2481 | 0.4722 | 0.2282 | 0.4134 | 0.1713 | 0.4046 | 0.2945 | 0.4133 | | 0.8115 | 96.0 | 10272 | 1.2396 | 0.2958 | 0.5727 | 0.2662 | 0.0912 | 0.2454 | 0.4651 | 0.3015 | 0.4677 | 0.4773 | 0.2023 | 0.421 | 0.6692 | 0.538 | 0.6838 | 0.2547 | 0.4671 | 0.228 | 0.4205 | 0.16 | 0.3954 | 0.2983 | 0.4196 | | 0.7983 | 97.0 | 10379 | 1.2417 | 0.2936 | 0.5731 | 0.2486 | 0.0893 | 0.2465 | 0.4624 | 0.2996 | 0.4718 | 0.4819 | 0.2028 | 0.4284 | 0.6697 | 0.5382 | 0.6892 | 0.2593 | 0.4772 | 0.2267 | 0.4223 | 0.1469 | 0.4031 | 0.2969 | 0.4178 | | 0.8217 | 98.0 | 10486 | 1.2373 | 0.2948 | 0.5722 | 0.257 | 0.0894 | 0.2455 | 0.4597 | 0.2974 | 0.4696 | 0.4801 | 0.2037 | 0.4225 | 0.6672 | 0.5365 | 0.6878 | 0.2612 | 0.481 | 0.2291 | 0.4237 | 0.1513 | 0.3892 | 0.2958 | 0.4187 | | 0.7961 | 99.0 | 10593 | 1.2396 | 0.296 | 0.5758 | 0.2587 | 0.0911 | 0.2463 | 0.4624 | 0.2971 | 0.4705 | 0.4808 | 0.2004 | 0.4241 | 0.6697 | 0.5372 | 0.6883 | 0.2632 | 0.4797 | 0.228 | 0.421 | 0.1539 | 0.3908 | 0.2977 | 0.4244 | | 0.8025 | 100.0 | 10700 | 1.2399 | 0.296 | 0.5758 | 0.258 | 0.0949 | 0.2452 | 
0.4635 | 0.2979 | 0.4713 | 0.4813 | 0.2026 | 0.4228 | 0.6701 | 0.5371 | 0.6878 | 0.2634 | 0.4785 | 0.2261 | 0.4201 | 0.1544 | 0.3954 | 0.299 | 0.4249 | ### Framework versions - Transformers 4.46.0.dev0 - Pytorch 2.4.1+cu118 - Datasets 3.0.1 - Tokenizers 0.20.1
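For a quick way to try this checkpoint, a minimal sketch using the `transformers` object-detection pipeline is below; the image path is a placeholder, and the 0.5 threshold is only an example value.

```python
from transformers import pipeline

detector = pipeline(
    "object-detection",
    model="iwake/detr-finetuned-cppe-5-10k-steps",  # repo id from this card
)

detections = detector("ppe_example.jpg", threshold=0.5)  # placeholder image path
for det in detections:
    # Each entry carries the predicted CPPE-5 class (coverall, face_shield, gloves,
    # goggles, mask), a confidence score, and a pixel-space bounding box.
    print(det["label"], round(det["score"], 3), det["box"])
```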
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
doktor47/zinemind_msft_300temp
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "table", "table column", "table row", "table column header", "table projected row header", "table spanning cell" ]
WANGTINGTING/finetuned-table-transformer-structure-recognition-v2
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
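The "How to Get Started with the Model" section above is still a placeholder. Judging from the repository name and the label set that follows (table, table column, table row, ...), this appears to be a Table Transformer structure-recognition fine-tune, so inference would presumably follow the standard transformers object-detection flow. A minimal sketch under that assumption — the architecture, input image name, and threshold are assumptions, not details taken from the card:

```python
# Sketch only: assumes the checkpoint is a DETR-style detector that the
# object-detection pipeline can load directly.
from transformers import pipeline

detector = pipeline(
    "object-detection",
    model="WANGTINGTING/finetuned-table-transformer-structure-recognition-v2",
)

# "table_page.png" is a hypothetical input image containing a table.
for det in detector("table_page.png", threshold=0.7):
    # Each detection is a dict with a label, a confidence score, and a box.
    print(det["label"], round(det["score"], 3), det["box"])
```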
[ "table", "table column", "table row", "table column header", "table projected row header", "table spanning cell" ]
jonathansuru/detr-resnet-50-dc5-ano
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50-dc5-ano This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.0928 - Map: 0.2104 - Map 50: 0.462 - Map 75: 0.1556 - Map Small: 0.2131 - Map Medium: 0.2995 - Map Large: -1.0 - Mar 1: 0.065 - Mar 10: 0.2862 - Mar 100: 0.4225 - Mar Small: 0.424 - Mar Medium: 0.3929 - Mar Large: -1.0 - Map Trophozoite: 0.032 - Mar 100 Trophozoite: 0.3135 - Map Wbc: 0.3889 - Mar 100 Wbc: 0.5315 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - training_steps: 10000 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Trophozoite | Mar 100 Trophozoite | Map Wbc | Mar 100 Wbc | |:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:---------------:|:-------------------:|:-------:|:-----------:| | 2.62 | 0.1078 | 50 | 2.1271 | 0.0042 | 0.0131 | 0.0019 | 0.0042 | 0.0155 | -1.0 | 0.0062 | 0.0447 | 0.1115 | 0.1109 | 0.1214 | -1.0 | 0.005 | 0.1545 | 0.0034 | 0.0685 | | 2.4526 | 0.2155 | 100 | 2.0394 | 0.0049 | 0.0165 | 0.0014 | 0.0049 | 0.0372 | -1.0 | 0.0061 | 0.0369 | 0.1148 | 0.1138 | 0.1357 | -1.0 | 0.0073 | 0.1877 | 0.0026 | 0.0419 | | 1.9677 | 0.3233 | 150 | 1.9835 | 0.0059 | 0.0193 | 0.0022 | 0.0058 | 0.0453 | -1.0 | 0.0071 | 0.0353 | 0.1178 | 0.1167 | 0.1357 | -1.0 | 0.0086 | 0.2018 | 0.0032 | 0.0337 | | 2.5926 | 0.4310 | 200 | 1.9593 | 0.0072 | 0.0236 | 0.0027 | 0.0068 | 0.0614 | -1.0 | 0.0068 | 0.0262 | 0.1156 | 0.1148 | 0.0929 | -1.0 | 0.0104 | 0.2154 | 0.0039 | 0.0158 | | 1.8831 | 0.5388 | 250 | 1.9354 | 0.0086 | 0.0252 | 0.005 | 0.0081 | 0.0626 | -1.0 | 0.0093 | 0.0294 | 0.1237 | 0.1226 | 0.1286 | -1.0 | 0.0107 | 0.2271 | 0.0065 | 0.0203 | | 1.7079 | 0.6466 | 300 | 1.9019 | 0.0087 | 0.0277 | 0.0031 | 0.0086 | 0.0713 | -1.0 | 0.0105 | 0.0294 | 0.1281 | 0.1268 | 0.15 | -1.0 | 0.0105 | 0.2292 | 0.0069 | 0.0269 | | 2.1389 | 0.7543 | 350 | 1.8507 | 0.0135 | 0.0378 | 0.0057 | 0.0134 | 0.048 | -1.0 | 0.017 | 0.0465 | 0.147 | 0.1463 | 0.1286 | -1.0 | 0.011 | 0.235 | 0.016 | 0.059 | | 2.3653 | 0.8621 | 400 | 1.8474 | 0.0173 | 0.0456 | 0.0103 | 0.0169 | 0.1147 | -1.0 | 0.0222 | 0.0557 | 0.1534 | 0.1529 | 0.1214 | -1.0 | 0.0136 | 0.2336 | 0.021 | 0.0731 | | 1.866 | 0.9698 | 450 | 1.8293 | 0.0282 | 0.0677 | 0.0197 | 0.0278 | 0.1387 | -1.0 | 0.0331 | 0.0893 | 0.1938 | 0.1936 | 0.1714 | -1.0 | 0.0126 | 0.242 | 0.0439 | 0.1456 | | 2.0982 | 1.0776 | 500 | 1.8596 | 0.0337 | 0.0776 | 0.0256 | 0.0336 | 0.1432 | -1.0 | 0.0355 | 0.0986 | 0.2035 | 0.2034 | 0.1929 | -1.0 | 0.0088 | 0.2267 | 0.0587 | 0.1804 | | 1.5012 | 1.1853 | 550 | 1.7823 | 0.0355 | 0.0848 | 0.0242 | 0.0354 | 0.1608 | -1.0 | 
0.0365 | 0.1135 | 0.2235 | 0.2228 | 0.2714 | -1.0 | 0.0104 | 0.2433 | 0.0606 | 0.2036 | | 2.2179 | 1.2931 | 600 | 1.7600 | 0.0456 | 0.1073 | 0.0309 | 0.046 | 0.1035 | -1.0 | 0.0416 | 0.1439 | 0.2612 | 0.2604 | 0.35 | -1.0 | 0.0106 | 0.2499 | 0.0806 | 0.2725 | | 1.9926 | 1.4009 | 650 | 1.6850 | 0.0535 | 0.1172 | 0.0402 | 0.0535 | 0.1301 | -1.0 | 0.0449 | 0.154 | 0.2803 | 0.2791 | 0.4071 | -1.0 | 0.012 | 0.2692 | 0.0949 | 0.2914 | | 1.5461 | 1.5086 | 700 | 1.6928 | 0.061 | 0.1339 | 0.0481 | 0.0619 | 0.1446 | -1.0 | 0.0441 | 0.1662 | 0.2923 | 0.2917 | 0.3857 | -1.0 | 0.0104 | 0.2542 | 0.1116 | 0.3304 | | 1.6376 | 1.6164 | 750 | 1.7107 | 0.0754 | 0.1674 | 0.0565 | 0.0763 | 0.1474 | -1.0 | 0.047 | 0.1771 | 0.3048 | 0.3044 | 0.4071 | -1.0 | 0.0096 | 0.2472 | 0.1412 | 0.3625 | | 1.7446 | 1.7241 | 800 | 1.6416 | 0.0883 | 0.1997 | 0.0607 | 0.0899 | 0.1329 | -1.0 | 0.0494 | 0.2006 | 0.364 | 0.3643 | 0.4429 | -1.0 | 0.0108 | 0.261 | 0.1658 | 0.467 | | 1.6513 | 1.8319 | 850 | 1.5899 | 0.0988 | 0.2155 | 0.0732 | 0.1003 | 0.1206 | -1.0 | 0.0523 | 0.2181 | 0.3856 | 0.387 | 0.3643 | -1.0 | 0.0118 | 0.2704 | 0.1857 | 0.5009 | | 1.4462 | 1.9397 | 900 | 1.5681 | 0.1085 | 0.2465 | 0.0751 | 0.1098 | 0.1331 | -1.0 | 0.0523 | 0.2198 | 0.3817 | 0.384 | 0.2714 | -1.0 | 0.0116 | 0.2694 | 0.2054 | 0.4939 | | 2.1774 | 2.0474 | 950 | 1.5550 | 0.1235 | 0.2747 | 0.0904 | 0.1251 | 0.1514 | -1.0 | 0.0535 | 0.2272 | 0.3895 | 0.3918 | 0.2857 | -1.0 | 0.0116 | 0.2721 | 0.2354 | 0.507 | | 1.4214 | 2.1552 | 1000 | 1.5505 | 0.134 | 0.3 | 0.0941 | 0.1376 | 0.1319 | -1.0 | 0.0549 | 0.2282 | 0.3844 | 0.3868 | 0.2643 | -1.0 | 0.0117 | 0.2677 | 0.2563 | 0.501 | | 2.1869 | 2.2629 | 1050 | 1.5522 | 0.1381 | 0.3067 | 0.0917 | 0.1397 | 0.1458 | -1.0 | 0.0569 | 0.2313 | 0.3848 | 0.3875 | 0.2286 | -1.0 | 0.0145 | 0.2754 | 0.2617 | 0.4942 | | 1.7756 | 2.3707 | 1100 | 1.5011 | 0.1473 | 0.326 | 0.1012 | 0.1489 | 0.215 | -1.0 | 0.0555 | 0.2375 | 0.4078 | 0.4105 | 0.2786 | -1.0 | 0.0156 | 0.2795 | 0.279 | 0.536 | | 1.4246 | 2.4784 | 1150 | 1.4837 | 0.1435 | 0.3249 | 0.0988 | 0.1457 | 0.2315 | -1.0 | 0.0552 | 0.2368 | 0.4049 | 0.4071 | 0.3286 | -1.0 | 0.0164 | 0.2756 | 0.2706 | 0.5343 | | 1.3888 | 2.5862 | 1200 | 1.4599 | 0.1495 | 0.3364 | 0.1006 | 0.1516 | 0.2003 | -1.0 | 0.057 | 0.2316 | 0.4016 | 0.4042 | 0.2786 | -1.0 | 0.0127 | 0.2828 | 0.2864 | 0.5205 | | 1.4362 | 2.6940 | 1250 | 1.4548 | 0.1528 | 0.3524 | 0.1043 | 0.1543 | 0.2426 | -1.0 | 0.0557 | 0.2301 | 0.3902 | 0.3922 | 0.3143 | -1.0 | 0.0117 | 0.2728 | 0.2939 | 0.5076 | | 1.3783 | 2.8017 | 1300 | 1.4243 | 0.1614 | 0.366 | 0.1027 | 0.1645 | 0.1674 | -1.0 | 0.0549 | 0.2372 | 0.3914 | 0.3941 | 0.25 | -1.0 | 0.0118 | 0.275 | 0.3109 | 0.5078 | | 1.91 | 2.9095 | 1350 | 1.4263 | 0.1669 | 0.3804 | 0.1082 | 0.1698 | 0.2609 | -1.0 | 0.0574 | 0.2453 | 0.3907 | 0.3926 | 0.3357 | -1.0 | 0.0126 | 0.2656 | 0.3212 | 0.5158 | | 1.3072 | 3.0172 | 1400 | 1.4192 | 0.1604 | 0.3835 | 0.0959 | 0.1616 | 0.2433 | -1.0 | 0.0539 | 0.2325 | 0.3726 | 0.3749 | 0.2643 | -1.0 | 0.0123 | 0.2587 | 0.3085 | 0.4865 | | 1.469 | 3.125 | 1450 | 1.4252 | 0.1615 | 0.3796 | 0.1009 | 0.1638 | 0.2479 | -1.0 | 0.0546 | 0.2418 | 0.392 | 0.3941 | 0.3071 | -1.0 | 0.0134 | 0.282 | 0.3097 | 0.502 | | 1.6321 | 3.2328 | 1500 | 1.4402 | 0.1609 | 0.3781 | 0.1006 | 0.1639 | 0.1707 | -1.0 | 0.0544 | 0.237 | 0.3739 | 0.3756 | 0.3214 | -1.0 | 0.0133 | 0.2665 | 0.3085 | 0.4812 | | 2.4078 | 3.3405 | 1550 | 1.4123 | 0.166 | 0.3854 | 0.1034 | 0.1679 | 0.2461 | -1.0 | 0.06 | 0.2409 | 0.3824 | 0.384 | 0.3429 | -1.0 | 0.0143 | 0.2713 | 0.3178 | 
0.4935 | | 1.6876 | 3.4483 | 1600 | 1.4158 | 0.154 | 0.379 | 0.0945 | 0.1564 | 0.2257 | -1.0 | 0.0549 | 0.2246 | 0.3612 | 0.3622 | 0.3643 | -1.0 | 0.0127 | 0.2558 | 0.2953 | 0.4666 | | 1.4312 | 3.5560 | 1650 | 1.4093 | 0.1547 | 0.3771 | 0.0897 | 0.1574 | 0.1988 | -1.0 | 0.0521 | 0.2261 | 0.3736 | 0.3748 | 0.3571 | -1.0 | 0.0117 | 0.2739 | 0.2976 | 0.4734 | | 1.4903 | 3.6638 | 1700 | 1.4245 | 0.156 | 0.3771 | 0.093 | 0.1581 | 0.1942 | -1.0 | 0.0552 | 0.2267 | 0.3695 | 0.371 | 0.3214 | -1.0 | 0.011 | 0.2652 | 0.301 | 0.4737 | | 2.1859 | 3.7716 | 1750 | 1.3713 | 0.1617 | 0.3844 | 0.096 | 0.1646 | 0.2027 | -1.0 | 0.0564 | 0.2344 | 0.3844 | 0.3857 | 0.3643 | -1.0 | 0.0129 | 0.2818 | 0.3105 | 0.4871 | | 1.231 | 3.8793 | 1800 | 1.3659 | 0.1648 | 0.3866 | 0.1081 | 0.1673 | 0.2019 | -1.0 | 0.0554 | 0.2413 | 0.3937 | 0.3957 | 0.3143 | -1.0 | 0.0131 | 0.2834 | 0.3165 | 0.5041 | | 1.454 | 3.9871 | 1850 | 1.3651 | 0.1554 | 0.3827 | 0.0921 | 0.1583 | 0.2076 | -1.0 | 0.0525 | 0.2355 | 0.3794 | 0.3807 | 0.3571 | -1.0 | 0.0137 | 0.274 | 0.2972 | 0.4847 | | 2.4406 | 4.0948 | 1900 | 1.3590 | 0.1743 | 0.3987 | 0.1156 | 0.1761 | 0.2312 | -1.0 | 0.0579 | 0.2572 | 0.4066 | 0.408 | 0.3857 | -1.0 | 0.0172 | 0.2924 | 0.3313 | 0.5208 | | 1.3729 | 4.2026 | 1950 | 1.3334 | 0.1734 | 0.391 | 0.1157 | 0.176 | 0.2015 | -1.0 | 0.0573 | 0.2529 | 0.4031 | 0.4054 | 0.3071 | -1.0 | 0.0158 | 0.278 | 0.3309 | 0.5282 | | 2.1646 | 4.3103 | 2000 | 1.3248 | 0.1678 | 0.3922 | 0.1082 | 0.1712 | 0.2066 | -1.0 | 0.0536 | 0.2446 | 0.3884 | 0.3902 | 0.3286 | -1.0 | 0.0139 | 0.2727 | 0.3217 | 0.5041 | | 1.9208 | 4.4181 | 2050 | 1.3258 | 0.1604 | 0.3891 | 0.0897 | 0.1641 | 0.1997 | -1.0 | 0.0568 | 0.2372 | 0.3779 | 0.3796 | 0.3214 | -1.0 | 0.0124 | 0.2639 | 0.3085 | 0.4919 | | 1.0212 | 4.5259 | 2100 | 1.3385 | 0.1494 | 0.3737 | 0.0824 | 0.152 | 0.2085 | -1.0 | 0.053 | 0.2244 | 0.3661 | 0.3672 | 0.3643 | -1.0 | 0.0114 | 0.2623 | 0.2874 | 0.4699 | | 2.1047 | 4.6336 | 2150 | 1.3190 | 0.1587 | 0.3711 | 0.1031 | 0.1609 | 0.1565 | -1.0 | 0.0515 | 0.2415 | 0.387 | 0.3896 | 0.25 | -1.0 | 0.0132 | 0.2693 | 0.3041 | 0.5047 | | 1.3 | 4.7414 | 2200 | 1.2984 | 0.1682 | 0.3936 | 0.1054 | 0.1708 | 0.2185 | -1.0 | 0.0546 | 0.2415 | 0.3853 | 0.3869 | 0.3429 | -1.0 | 0.0139 | 0.2737 | 0.3226 | 0.4969 | | 2.0957 | 4.8491 | 2250 | 1.3023 | 0.1581 | 0.377 | 0.1005 | 0.1616 | 0.1674 | -1.0 | 0.0529 | 0.2427 | 0.3802 | 0.3821 | 0.3143 | -1.0 | 0.0147 | 0.2653 | 0.3015 | 0.4951 | | 1.1841 | 4.9569 | 2300 | 1.3213 | 0.1596 | 0.3899 | 0.0872 | 0.1636 | 0.1621 | -1.0 | 0.0563 | 0.24 | 0.3716 | 0.373 | 0.35 | -1.0 | 0.0158 | 0.2602 | 0.3033 | 0.483 | | 1.5673 | 5.0647 | 2350 | 1.3002 | 0.169 | 0.3933 | 0.0998 | 0.1728 | 0.2203 | -1.0 | 0.0567 | 0.2554 | 0.3897 | 0.3914 | 0.35 | -1.0 | 0.0203 | 0.2706 | 0.3177 | 0.5089 | | 2.2189 | 5.1724 | 2400 | 1.2900 | 0.1688 | 0.3994 | 0.0993 | 0.1725 | 0.1419 | -1.0 | 0.0566 | 0.2562 | 0.3928 | 0.3952 | 0.2786 | -1.0 | 0.0218 | 0.2801 | 0.3157 | 0.5055 | | 2.2594 | 5.2802 | 2450 | 1.3065 | 0.1638 | 0.3892 | 0.0991 | 0.1688 | 0.1498 | -1.0 | 0.0553 | 0.2477 | 0.3757 | 0.3772 | 0.35 | -1.0 | 0.0157 | 0.2496 | 0.312 | 0.5017 | | 1.245 | 5.3879 | 2500 | 1.2965 | 0.1631 | 0.4032 | 0.0854 | 0.1687 | 0.1485 | -1.0 | 0.0567 | 0.2439 | 0.3769 | 0.3781 | 0.3786 | -1.0 | 0.0168 | 0.2609 | 0.3093 | 0.4929 | | 1.7621 | 5.4957 | 2550 | 1.3153 | 0.1705 | 0.4046 | 0.1072 | 0.1749 | 0.144 | -1.0 | 0.0553 | 0.2435 | 0.3868 | 0.3886 | 0.3286 | -1.0 | 0.0139 | 0.2721 | 0.3271 | 0.5015 | | 1.1487 | 5.6034 | 2600 | 1.2965 | 0.1745 | 0.4099 | 0.1107 | 
0.179 | 0.1456 | -1.0 | 0.0588 | 0.245 | 0.3757 | 0.3775 | 0.3214 | -1.0 | 0.0137 | 0.2545 | 0.3354 | 0.4968 | | 1.167 | 5.7112 | 2650 | 1.2692 | 0.1771 | 0.4099 | 0.1141 | 0.1807 | 0.1511 | -1.0 | 0.0582 | 0.2501 | 0.3931 | 0.3951 | 0.3071 | -1.0 | 0.0168 | 0.2829 | 0.3374 | 0.5032 | | 1.1622 | 5.8190 | 2700 | 1.2755 | 0.1838 | 0.4053 | 0.1185 | 0.187 | 0.1644 | -1.0 | 0.0585 | 0.2528 | 0.3904 | 0.3925 | 0.3071 | -1.0 | 0.0155 | 0.2652 | 0.352 | 0.5156 | | 1.8749 | 5.9267 | 2750 | 1.2667 | 0.1843 | 0.4152 | 0.1192 | 0.1863 | 0.2427 | -1.0 | 0.0607 | 0.2573 | 0.3979 | 0.4002 | 0.2929 | -1.0 | 0.0198 | 0.2784 | 0.3488 | 0.5173 | | 1.3628 | 6.0345 | 2800 | 1.2517 | 0.1805 | 0.4221 | 0.106 | 0.1834 | 0.2231 | -1.0 | 0.06 | 0.2557 | 0.3883 | 0.3902 | 0.3214 | -1.0 | 0.0227 | 0.2717 | 0.3382 | 0.5048 | | 1.3127 | 6.1422 | 2850 | 1.2254 | 0.1877 | 0.4279 | 0.1202 | 0.1914 | 0.1884 | -1.0 | 0.0629 | 0.2611 | 0.3954 | 0.3974 | 0.3286 | -1.0 | 0.0202 | 0.2733 | 0.3552 | 0.5176 | | 1.7499 | 6.25 | 2900 | 1.2486 | 0.1757 | 0.412 | 0.1049 | 0.1794 | 0.1907 | -1.0 | 0.0597 | 0.2507 | 0.3877 | 0.3896 | 0.3214 | -1.0 | 0.0177 | 0.2747 | 0.3337 | 0.5007 | | 1.2982 | 6.3578 | 2950 | 1.2164 | 0.1892 | 0.4214 | 0.1298 | 0.1915 | 0.1861 | -1.0 | 0.06 | 0.2575 | 0.3974 | 0.4 | 0.2714 | -1.0 | 0.0198 | 0.2788 | 0.3586 | 0.5161 | | 1.2499 | 6.4655 | 3000 | 1.2346 | 0.1846 | 0.4172 | 0.1166 | 0.1897 | 0.1567 | -1.0 | 0.0609 | 0.2565 | 0.3969 | 0.3986 | 0.3571 | -1.0 | 0.0172 | 0.2776 | 0.352 | 0.5163 | | 1.8273 | 6.5733 | 3050 | 1.2508 | 0.1801 | 0.4138 | 0.1139 | 0.1842 | 0.1534 | -1.0 | 0.06 | 0.2498 | 0.3988 | 0.4004 | 0.35 | -1.0 | 0.0175 | 0.2907 | 0.3427 | 0.5068 | | 1.8044 | 6.6810 | 3100 | 1.2149 | 0.1816 | 0.4137 | 0.1157 | 0.1866 | 0.1636 | -1.0 | 0.0599 | 0.2576 | 0.3972 | 0.3991 | 0.3286 | -1.0 | 0.0203 | 0.284 | 0.343 | 0.5105 | | 1.8206 | 6.7888 | 3150 | 1.2197 | 0.1765 | 0.4044 | 0.1126 | 0.1809 | 0.1565 | -1.0 | 0.0603 | 0.2574 | 0.4005 | 0.4021 | 0.3571 | -1.0 | 0.0201 | 0.2825 | 0.3329 | 0.5185 | | 1.3453 | 6.8966 | 3200 | 1.2322 | 0.1819 | 0.4101 | 0.1216 | 0.1869 | 0.1497 | -1.0 | 0.0586 | 0.2499 | 0.3908 | 0.3927 | 0.3214 | -1.0 | 0.0144 | 0.2732 | 0.3494 | 0.5084 | | 1.5722 | 7.0043 | 3250 | 1.2202 | 0.1861 | 0.4251 | 0.1147 | 0.1895 | 0.2274 | -1.0 | 0.0613 | 0.246 | 0.3957 | 0.397 | 0.3857 | -1.0 | 0.014 | 0.2839 | 0.3582 | 0.5076 | | 1.1069 | 7.1121 | 3300 | 1.2042 | 0.1913 | 0.4262 | 0.1279 | 0.1946 | 0.2496 | -1.0 | 0.0616 | 0.2545 | 0.3988 | 0.4007 | 0.3286 | -1.0 | 0.0167 | 0.2816 | 0.3659 | 0.516 | | 1.9143 | 7.2198 | 3350 | 1.2001 | 0.1894 | 0.4261 | 0.1298 | 0.1934 | 0.189 | -1.0 | 0.0623 | 0.2583 | 0.4049 | 0.4066 | 0.3643 | -1.0 | 0.0173 | 0.2841 | 0.3616 | 0.5257 | | 1.929 | 7.3276 | 3400 | 1.2157 | 0.1732 | 0.4129 | 0.1044 | 0.1774 | 0.1823 | -1.0 | 0.0591 | 0.2464 | 0.3922 | 0.3937 | 0.3571 | -1.0 | 0.019 | 0.287 | 0.3274 | 0.4974 | | 1.9973 | 7.4353 | 3450 | 1.2210 | 0.1849 | 0.4237 | 0.1256 | 0.1896 | 0.1933 | -1.0 | 0.0599 | 0.2545 | 0.3963 | 0.3976 | 0.3929 | -1.0 | 0.0163 | 0.279 | 0.3535 | 0.5137 | | 1.6126 | 7.5431 | 3500 | 1.2591 | 0.1735 | 0.412 | 0.1083 | 0.178 | 0.1822 | -1.0 | 0.0585 | 0.238 | 0.3835 | 0.3847 | 0.3643 | -1.0 | 0.0147 | 0.2784 | 0.3323 | 0.4885 | | 1.0779 | 7.6509 | 3550 | 1.2142 | 0.1794 | 0.4139 | 0.1136 | 0.1827 | 0.202 | -1.0 | 0.0565 | 0.2499 | 0.3959 | 0.3973 | 0.3714 | -1.0 | 0.0177 | 0.2839 | 0.3411 | 0.508 | | 1.1365 | 7.7586 | 3600 | 1.2074 | 0.1894 | 0.4232 | 0.1298 | 0.1932 | 0.2062 | -1.0 | 0.0615 | 0.2712 | 0.4101 | 0.4118 | 0.3714 | 
-1.0 | 0.024 | 0.2895 | 0.3549 | 0.5308 | | 1.3007 | 7.8664 | 3650 | 1.2069 | 0.1856 | 0.4244 | 0.1162 | 0.1892 | 0.2222 | -1.0 | 0.0586 | 0.2654 | 0.4022 | 0.4035 | 0.3929 | -1.0 | 0.0209 | 0.2834 | 0.3504 | 0.5209 | | 1.9676 | 7.9741 | 3700 | 1.2110 | 0.1881 | 0.4201 | 0.1221 | 0.1913 | 0.2115 | -1.0 | 0.0605 | 0.2658 | 0.4047 | 0.4065 | 0.35 | -1.0 | 0.0214 | 0.2843 | 0.3547 | 0.5251 | | 1.1711 | 8.0819 | 3750 | 1.2074 | 0.1806 | 0.4224 | 0.1101 | 0.1835 | 0.2001 | -1.0 | 0.0586 | 0.2583 | 0.3907 | 0.3919 | 0.3857 | -1.0 | 0.0218 | 0.278 | 0.3395 | 0.5033 | | 1.9594 | 8.1897 | 3800 | 1.2350 | 0.185 | 0.423 | 0.1256 | 0.188 | 0.2282 | -1.0 | 0.0592 | 0.2596 | 0.3854 | 0.387 | 0.3643 | -1.0 | 0.0178 | 0.2551 | 0.3521 | 0.5157 | | 1.6838 | 8.2974 | 3850 | 1.1909 | 0.1863 | 0.4264 | 0.1183 | 0.1889 | 0.2103 | -1.0 | 0.0597 | 0.2676 | 0.3991 | 0.4006 | 0.3714 | -1.0 | 0.0235 | 0.2817 | 0.3492 | 0.5164 | | 1.0343 | 8.4052 | 3900 | 1.1952 | 0.1695 | 0.4115 | 0.0901 | 0.173 | 0.1697 | -1.0 | 0.0569 | 0.2485 | 0.3893 | 0.3903 | 0.4 | -1.0 | 0.0198 | 0.2864 | 0.3191 | 0.4923 | | 1.5406 | 8.5129 | 3950 | 1.2028 | 0.1895 | 0.4428 | 0.1111 | 0.1924 | 0.226 | -1.0 | 0.0592 | 0.2609 | 0.3924 | 0.3938 | 0.3714 | -1.0 | 0.0215 | 0.2772 | 0.3575 | 0.5076 | | 1.3279 | 8.6207 | 4000 | 1.2069 | 0.177 | 0.4358 | 0.0912 | 0.1791 | 0.2654 | -1.0 | 0.0561 | 0.2491 | 0.3853 | 0.3858 | 0.4357 | -1.0 | 0.0196 | 0.2837 | 0.3344 | 0.4869 | | 1.5472 | 8.7284 | 4050 | 1.1917 | 0.1908 | 0.4444 | 0.1167 | 0.1927 | 0.2728 | -1.0 | 0.0598 | 0.2601 | 0.4003 | 0.4015 | 0.3929 | -1.0 | 0.0224 | 0.294 | 0.3591 | 0.5065 | | 1.3427 | 8.8362 | 4100 | 1.1818 | 0.1937 | 0.4373 | 0.1261 | 0.1962 | 0.2326 | -1.0 | 0.0608 | 0.2607 | 0.3991 | 0.4007 | 0.35 | -1.0 | 0.0206 | 0.2894 | 0.3668 | 0.5087 | | 1.1626 | 8.9440 | 4150 | 1.1952 | 0.1878 | 0.4292 | 0.1192 | 0.1902 | 0.2533 | -1.0 | 0.0602 | 0.2548 | 0.3944 | 0.3955 | 0.4071 | -1.0 | 0.0174 | 0.28 | 0.3581 | 0.5089 | | 1.5528 | 9.0517 | 4200 | 1.2105 | 0.1832 | 0.4154 | 0.1134 | 0.1861 | 0.2042 | -1.0 | 0.0569 | 0.2513 | 0.403 | 0.4045 | 0.3643 | -1.0 | 0.0158 | 0.2944 | 0.3507 | 0.5116 | | 1.1406 | 9.1595 | 4250 | 1.1768 | 0.1937 | 0.44 | 0.1245 | 0.1977 | 0.203 | -1.0 | 0.0616 | 0.273 | 0.4063 | 0.4077 | 0.3929 | -1.0 | 0.0268 | 0.2912 | 0.3607 | 0.5215 | | 1.059 | 9.2672 | 4300 | 1.1723 | 0.1909 | 0.4391 | 0.1195 | 0.1954 | 0.2212 | -1.0 | 0.0611 | 0.2741 | 0.4061 | 0.4069 | 0.4429 | -1.0 | 0.0279 | 0.2912 | 0.354 | 0.5211 | | 1.1914 | 9.375 | 4350 | 1.1884 | 0.1817 | 0.4239 | 0.1155 | 0.1863 | 0.1956 | -1.0 | 0.0583 | 0.2611 | 0.4007 | 0.4016 | 0.4214 | -1.0 | 0.0236 | 0.2917 | 0.3398 | 0.5096 | | 1.1247 | 9.4828 | 4400 | 1.1743 | 0.1803 | 0.4343 | 0.1043 | 0.1847 | 0.1959 | -1.0 | 0.0599 | 0.2655 | 0.3965 | 0.3971 | 0.4429 | -1.0 | 0.0274 | 0.2913 | 0.3332 | 0.5016 | | 1.1566 | 9.5905 | 4450 | 1.1848 | 0.1949 | 0.4383 | 0.1281 | 0.1996 | 0.1943 | -1.0 | 0.0624 | 0.2827 | 0.4091 | 0.4105 | 0.4 | -1.0 | 0.0287 | 0.2862 | 0.3612 | 0.532 | | 1.6062 | 9.6983 | 4500 | 1.1890 | 0.1892 | 0.4224 | 0.1245 | 0.1945 | 0.1935 | -1.0 | 0.0614 | 0.2795 | 0.4126 | 0.414 | 0.4 | -1.0 | 0.0293 | 0.2943 | 0.3492 | 0.531 | | 2.0192 | 9.8060 | 4550 | 1.1862 | 0.1912 | 0.4246 | 0.1326 | 0.1976 | 0.1622 | -1.0 | 0.0627 | 0.2726 | 0.408 | 0.4096 | 0.3714 | -1.0 | 0.0244 | 0.2856 | 0.3581 | 0.5304 | | 1.4838 | 9.9138 | 4600 | 1.1674 | 0.1892 | 0.4346 | 0.1138 | 0.1931 | 0.2099 | -1.0 | 0.0586 | 0.2671 | 0.4019 | 0.4028 | 0.4286 | -1.0 | 0.0244 | 0.2871 | 0.354 | 0.5167 | | 1.1207 | 10.0216 | 4650 | 
1.1776 | 0.1892 | 0.4254 | 0.1317 | 0.1939 | 0.1753 | -1.0 | 0.0606 | 0.2681 | 0.4045 | 0.4058 | 0.3929 | -1.0 | 0.023 | 0.2854 | 0.3553 | 0.5235 | | 0.9931 | 10.1293 | 4700 | 1.1569 | 0.1874 | 0.4283 | 0.1215 | 0.1919 | 0.163 | -1.0 | 0.06 | 0.2667 | 0.4033 | 0.4045 | 0.3929 | -1.0 | 0.0229 | 0.2894 | 0.3519 | 0.5172 | | 2.0281 | 10.2371 | 4750 | 1.1877 | 0.1808 | 0.4322 | 0.1107 | 0.1836 | 0.2468 | -1.0 | 0.0574 | 0.2576 | 0.3874 | 0.388 | 0.4357 | -1.0 | 0.0236 | 0.2813 | 0.338 | 0.4935 | | 1.1893 | 10.3448 | 4800 | 1.1767 | 0.1805 | 0.4282 | 0.1085 | 0.1839 | 0.2218 | -1.0 | 0.0598 | 0.2568 | 0.3975 | 0.3982 | 0.45 | -1.0 | 0.0208 | 0.2852 | 0.3402 | 0.5099 | | 1.0194 | 10.4526 | 4850 | 1.1971 | 0.1901 | 0.4267 | 0.1304 | 0.1938 | 0.2603 | -1.0 | 0.0595 | 0.2594 | 0.3967 | 0.3976 | 0.4357 | -1.0 | 0.018 | 0.2753 | 0.3622 | 0.5182 | | 1.1556 | 10.5603 | 4900 | 1.1953 | 0.1876 | 0.4231 | 0.1297 | 0.1909 | 0.2337 | -1.0 | 0.0585 | 0.2558 | 0.403 | 0.4041 | 0.4071 | -1.0 | 0.0174 | 0.2907 | 0.3578 | 0.5153 | | 2.0626 | 10.6681 | 4950 | 1.1856 | 0.1858 | 0.4283 | 0.1187 | 0.1907 | 0.1974 | -1.0 | 0.0595 | 0.2572 | 0.4018 | 0.4031 | 0.3929 | -1.0 | 0.0194 | 0.2901 | 0.3523 | 0.5135 | | 1.0869 | 10.7759 | 5000 | 1.1844 | 0.1827 | 0.4278 | 0.1169 | 0.1865 | 0.2515 | -1.0 | 0.058 | 0.2504 | 0.3951 | 0.3959 | 0.4286 | -1.0 | 0.0177 | 0.2861 | 0.3477 | 0.5041 | | 1.6523 | 10.8836 | 5050 | 1.1843 | 0.1927 | 0.4417 | 0.1232 | 0.1956 | 0.2569 | -1.0 | 0.0603 | 0.2561 | 0.3985 | 0.3993 | 0.4286 | -1.0 | 0.0189 | 0.2852 | 0.3665 | 0.5118 | | 1.6716 | 10.9914 | 5100 | 1.1703 | 0.1937 | 0.4355 | 0.1372 | 0.1974 | 0.22 | -1.0 | 0.0625 | 0.2602 | 0.4086 | 0.4098 | 0.4 | -1.0 | 0.0195 | 0.2953 | 0.3679 | 0.5218 | | 1.5057 | 11.0991 | 5150 | 1.1783 | 0.1894 | 0.43 | 0.1237 | 0.1935 | 0.1858 | -1.0 | 0.0581 | 0.2632 | 0.4049 | 0.4065 | 0.3643 | -1.0 | 0.021 | 0.2912 | 0.3578 | 0.5185 | | 1.126 | 11.2069 | 5200 | 1.1629 | 0.1974 | 0.4371 | 0.1392 | 0.2013 | 0.2295 | -1.0 | 0.06 | 0.2666 | 0.4148 | 0.4167 | 0.3571 | -1.0 | 0.0215 | 0.295 | 0.3732 | 0.5346 | | 1.1369 | 11.3147 | 5250 | 1.1473 | 0.196 | 0.4411 | 0.1307 | 0.2001 | 0.2223 | -1.0 | 0.06 | 0.2701 | 0.4135 | 0.415 | 0.3929 | -1.0 | 0.0241 | 0.2944 | 0.3678 | 0.5327 | | 1.3188 | 11.4224 | 5300 | 1.1575 | 0.1948 | 0.431 | 0.1352 | 0.1997 | 0.175 | -1.0 | 0.0599 | 0.2686 | 0.4134 | 0.4153 | 0.3571 | -1.0 | 0.0232 | 0.2915 | 0.3664 | 0.5353 | | 1.5131 | 11.5302 | 5350 | 1.1621 | 0.1854 | 0.4336 | 0.1203 | 0.1888 | 0.2302 | -1.0 | 0.0592 | 0.2623 | 0.4001 | 0.4012 | 0.4071 | -1.0 | 0.0218 | 0.2871 | 0.3489 | 0.5131 | | 1.5959 | 11.6379 | 5400 | 1.1487 | 0.1959 | 0.4268 | 0.1381 | 0.2004 | 0.1946 | -1.0 | 0.0612 | 0.2664 | 0.4166 | 0.4181 | 0.3857 | -1.0 | 0.0211 | 0.3004 | 0.3708 | 0.5327 | | 0.9766 | 11.7457 | 5450 | 1.1456 | 0.2002 | 0.4397 | 0.1385 | 0.2046 | 0.1886 | -1.0 | 0.0617 | 0.2692 | 0.4156 | 0.4177 | 0.3357 | -1.0 | 0.0223 | 0.3012 | 0.378 | 0.5301 | | 1.0508 | 11.8534 | 5500 | 1.1405 | 0.2008 | 0.4457 | 0.1361 | 0.2039 | 0.2182 | -1.0 | 0.0609 | 0.2692 | 0.4116 | 0.4134 | 0.3571 | -1.0 | 0.0236 | 0.3002 | 0.3779 | 0.5231 | | 1.064 | 11.9612 | 5550 | 1.1395 | 0.1959 | 0.4391 | 0.1232 | 0.2003 | 0.1979 | -1.0 | 0.0604 | 0.2663 | 0.4129 | 0.4146 | 0.3571 | -1.0 | 0.0223 | 0.3023 | 0.3694 | 0.5234 | | 1.4142 | 12.0690 | 5600 | 1.1292 | 0.1942 | 0.4423 | 0.1295 | 0.1973 | 0.2 | -1.0 | 0.0599 | 0.2694 | 0.4122 | 0.4138 | 0.3643 | -1.0 | 0.0245 | 0.3001 | 0.3638 | 0.5243 | | 1.0224 | 12.1767 | 5650 | 1.1389 | 0.1953 | 0.4434 | 0.1319 | 0.199 | 
0.2225 | -1.0 | 0.0612 | 0.2694 | 0.411 | 0.4124 | 0.3929 | -1.0 | 0.0233 | 0.2939 | 0.3674 | 0.5281 | | 1.7465 | 12.2845 | 5700 | 1.1245 | 0.2023 | 0.45 | 0.133 | 0.2059 | 0.2339 | -1.0 | 0.0613 | 0.2785 | 0.4183 | 0.4202 | 0.35 | -1.0 | 0.0285 | 0.305 | 0.3761 | 0.5315 | | 1.0503 | 12.3922 | 5750 | 1.1337 | 0.1989 | 0.4506 | 0.1356 | 0.2024 | 0.2496 | -1.0 | 0.0621 | 0.2763 | 0.4076 | 0.409 | 0.3857 | -1.0 | 0.0267 | 0.2863 | 0.3712 | 0.5288 | | 2.2047 | 12.5 | 5800 | 1.1313 | 0.2001 | 0.442 | 0.1361 | 0.2038 | 0.2462 | -1.0 | 0.0641 | 0.2783 | 0.4165 | 0.4182 | 0.3643 | -1.0 | 0.0267 | 0.2997 | 0.3735 | 0.5333 | | 0.9121 | 12.6078 | 5850 | 1.1382 | 0.1974 | 0.4422 | 0.131 | 0.2011 | 0.261 | -1.0 | 0.0644 | 0.2839 | 0.4166 | 0.4182 | 0.3786 | -1.0 | 0.0297 | 0.2994 | 0.365 | 0.5337 | | 1.4769 | 12.7155 | 5900 | 1.1517 | 0.1982 | 0.4438 | 0.1299 | 0.2018 | 0.2747 | -1.0 | 0.0638 | 0.2828 | 0.4188 | 0.4202 | 0.4071 | -1.0 | 0.0289 | 0.3012 | 0.3675 | 0.5365 | | 0.9778 | 12.8233 | 5950 | 1.1364 | 0.1992 | 0.4417 | 0.1333 | 0.2025 | 0.2725 | -1.0 | 0.0638 | 0.2781 | 0.4108 | 0.4121 | 0.4071 | -1.0 | 0.0257 | 0.2904 | 0.3727 | 0.5312 | | 1.2711 | 12.9310 | 6000 | 1.1361 | 0.1957 | 0.44 | 0.1314 | 0.1991 | 0.2555 | -1.0 | 0.0624 | 0.2778 | 0.4109 | 0.4122 | 0.4 | -1.0 | 0.0266 | 0.2955 | 0.3647 | 0.5263 | | 1.3801 | 13.0388 | 6050 | 1.1367 | 0.201 | 0.4486 | 0.1383 | 0.2033 | 0.2743 | -1.0 | 0.0641 | 0.2819 | 0.4163 | 0.4179 | 0.3786 | -1.0 | 0.0291 | 0.3008 | 0.3729 | 0.5318 | | 1.0824 | 13.1466 | 6100 | 1.1392 | 0.1998 | 0.4471 | 0.132 | 0.2039 | 0.2257 | -1.0 | 0.0647 | 0.2819 | 0.4177 | 0.4198 | 0.3357 | -1.0 | 0.0304 | 0.3051 | 0.3693 | 0.5304 | | 1.5716 | 13.2543 | 6150 | 1.1566 | 0.1972 | 0.4372 | 0.1287 | 0.2012 | 0.2389 | -1.0 | 0.0633 | 0.2714 | 0.414 | 0.4154 | 0.3929 | -1.0 | 0.0233 | 0.2963 | 0.3711 | 0.5317 | | 1.1803 | 13.3621 | 6200 | 1.1526 | 0.1979 | 0.434 | 0.14 | 0.2017 | 0.225 | -1.0 | 0.0627 | 0.2654 | 0.4151 | 0.4168 | 0.3714 | -1.0 | 0.0205 | 0.297 | 0.3753 | 0.5331 | | 1.1254 | 13.4698 | 6250 | 1.1398 | 0.2017 | 0.4489 | 0.1347 | 0.2058 | 0.219 | -1.0 | 0.0637 | 0.271 | 0.4175 | 0.4194 | 0.3571 | -1.0 | 0.0269 | 0.3038 | 0.3766 | 0.5312 | | 1.3144 | 13.5776 | 6300 | 1.1227 | 0.2002 | 0.4478 | 0.1359 | 0.2032 | 0.2688 | -1.0 | 0.0631 | 0.2753 | 0.4109 | 0.412 | 0.4143 | -1.0 | 0.0257 | 0.2953 | 0.3746 | 0.5265 | | 1.2006 | 13.6853 | 6350 | 1.1343 | 0.197 | 0.439 | 0.1288 | 0.2014 | 0.1836 | -1.0 | 0.0627 | 0.2718 | 0.4193 | 0.4211 | 0.3571 | -1.0 | 0.0245 | 0.3079 | 0.3695 | 0.5308 | | 1.1371 | 13.7931 | 6400 | 1.1213 | 0.2015 | 0.4491 | 0.1405 | 0.2048 | 0.2258 | -1.0 | 0.0632 | 0.2745 | 0.4173 | 0.4186 | 0.4143 | -1.0 | 0.0255 | 0.3001 | 0.3775 | 0.5346 | | 1.1677 | 13.9009 | 6450 | 1.1327 | 0.201 | 0.4465 | 0.1361 | 0.2046 | 0.2225 | -1.0 | 0.063 | 0.2714 | 0.4158 | 0.4173 | 0.3929 | -1.0 | 0.0242 | 0.2992 | 0.3778 | 0.5324 | | 0.9811 | 14.0086 | 6500 | 1.1240 | 0.2035 | 0.4463 | 0.1453 | 0.2073 | 0.2446 | -1.0 | 0.0628 | 0.2746 | 0.4131 | 0.4147 | 0.3714 | -1.0 | 0.0241 | 0.2931 | 0.383 | 0.533 | | 1.0895 | 14.1164 | 6550 | 1.1297 | 0.2025 | 0.4458 | 0.1367 | 0.2064 | 0.2368 | -1.0 | 0.0628 | 0.2703 | 0.4147 | 0.4163 | 0.3786 | -1.0 | 0.0241 | 0.302 | 0.3809 | 0.5275 | | 0.9775 | 14.2241 | 6600 | 1.1194 | 0.1951 | 0.448 | 0.1288 | 0.1976 | 0.2813 | -1.0 | 0.0618 | 0.2644 | 0.4061 | 0.4071 | 0.4214 | -1.0 | 0.0238 | 0.2981 | 0.3664 | 0.5141 | | 1.6864 | 14.3319 | 6650 | 1.1241 | 0.204 | 0.4507 | 0.142 | 0.2069 | 0.2688 | -1.0 | 0.0634 | 0.2686 | 0.4166 | 0.4179 | 
0.4071 | -1.0 | 0.0239 | 0.3044 | 0.3842 | 0.5288 | | 1.0167 | 14.4397 | 6700 | 1.1158 | 0.2066 | 0.4539 | 0.1431 | 0.2099 | 0.2349 | -1.0 | 0.063 | 0.274 | 0.4194 | 0.421 | 0.3786 | -1.0 | 0.0254 | 0.3053 | 0.3878 | 0.5336 | | 1.8267 | 14.5474 | 6750 | 1.1167 | 0.2013 | 0.4506 | 0.1388 | 0.2055 | 0.2316 | -1.0 | 0.0637 | 0.2725 | 0.418 | 0.4198 | 0.3571 | -1.0 | 0.0268 | 0.3092 | 0.3759 | 0.5269 | | 1.4236 | 14.6552 | 6800 | 1.1245 | 0.2053 | 0.4558 | 0.1471 | 0.2085 | 0.2314 | -1.0 | 0.0651 | 0.2784 | 0.4158 | 0.4176 | 0.3571 | -1.0 | 0.0316 | 0.3042 | 0.3789 | 0.5275 | | 1.333 | 14.7629 | 6850 | 1.1058 | 0.2136 | 0.4706 | 0.1492 | 0.2166 | 0.2664 | -1.0 | 0.0656 | 0.2893 | 0.4246 | 0.4265 | 0.3571 | -1.0 | 0.0363 | 0.3127 | 0.391 | 0.5366 | | 1.7656 | 14.8707 | 6900 | 1.1022 | 0.2096 | 0.4651 | 0.1451 | 0.2124 | 0.2538 | -1.0 | 0.0658 | 0.2846 | 0.4174 | 0.4189 | 0.3857 | -1.0 | 0.0315 | 0.3043 | 0.3877 | 0.5305 | | 1.1243 | 14.9784 | 6950 | 1.1080 | 0.2133 | 0.466 | 0.1523 | 0.2157 | 0.2785 | -1.0 | 0.0664 | 0.2854 | 0.4193 | 0.4207 | 0.4 | -1.0 | 0.0322 | 0.3067 | 0.3944 | 0.5318 | | 1.658 | 15.0862 | 7000 | 1.1115 | 0.2043 | 0.4622 | 0.1339 | 0.2068 | 0.2819 | -1.0 | 0.063 | 0.2774 | 0.4105 | 0.4114 | 0.4357 | -1.0 | 0.0304 | 0.3028 | 0.3782 | 0.5183 | | 1.3946 | 15.1940 | 7050 | 1.1119 | 0.2027 | 0.4568 | 0.1345 | 0.2065 | 0.2632 | -1.0 | 0.0643 | 0.2772 | 0.4118 | 0.4129 | 0.4143 | -1.0 | 0.0287 | 0.3021 | 0.3768 | 0.5214 | | 1.4881 | 15.3017 | 7100 | 1.1042 | 0.2012 | 0.4562 | 0.136 | 0.2038 | 0.2673 | -1.0 | 0.0634 | 0.2788 | 0.4124 | 0.4132 | 0.4429 | -1.0 | 0.0287 | 0.3047 | 0.3737 | 0.5201 | | 1.2228 | 15.4095 | 7150 | 1.1150 | 0.2025 | 0.4539 | 0.1344 | 0.2055 | 0.2469 | -1.0 | 0.063 | 0.2742 | 0.4133 | 0.4144 | 0.4143 | -1.0 | 0.0264 | 0.3049 | 0.3787 | 0.5217 | | 1.0396 | 15.5172 | 7200 | 1.1073 | 0.205 | 0.4595 | 0.1403 | 0.2076 | 0.2674 | -1.0 | 0.0635 | 0.2782 | 0.4149 | 0.4161 | 0.4143 | -1.0 | 0.0289 | 0.3029 | 0.3811 | 0.5269 | | 1.0606 | 15.625 | 7250 | 1.1043 | 0.2073 | 0.4517 | 0.145 | 0.2096 | 0.2465 | -1.0 | 0.0652 | 0.2794 | 0.4202 | 0.4219 | 0.3714 | -1.0 | 0.0286 | 0.3096 | 0.386 | 0.5308 | | 1.2285 | 15.7328 | 7300 | 1.1250 | 0.2011 | 0.4389 | 0.1366 | 0.2043 | 0.2387 | -1.0 | 0.0622 | 0.2716 | 0.4151 | 0.4164 | 0.4 | -1.0 | 0.0238 | 0.3038 | 0.3784 | 0.5265 | | 1.089 | 15.8405 | 7350 | 1.1176 | 0.2023 | 0.4445 | 0.1385 | 0.205 | 0.247 | -1.0 | 0.0621 | 0.2753 | 0.4151 | 0.4166 | 0.3857 | -1.0 | 0.0255 | 0.3042 | 0.3791 | 0.526 | | 1.1228 | 15.9483 | 7400 | 1.1060 | 0.2046 | 0.4523 | 0.1357 | 0.2072 | 0.2847 | -1.0 | 0.0651 | 0.2763 | 0.4197 | 0.4211 | 0.4 | -1.0 | 0.0271 | 0.3114 | 0.382 | 0.5281 | | 0.8598 | 16.0560 | 7450 | 1.1095 | 0.2052 | 0.4533 | 0.1402 | 0.2081 | 0.2613 | -1.0 | 0.0626 | 0.2799 | 0.419 | 0.4201 | 0.4214 | -1.0 | 0.0277 | 0.3096 | 0.3827 | 0.5283 | | 1.2708 | 16.1638 | 7500 | 1.1059 | 0.2047 | 0.4556 | 0.1388 | 0.2071 | 0.2837 | -1.0 | 0.0635 | 0.2804 | 0.4153 | 0.4163 | 0.4286 | -1.0 | 0.0285 | 0.3059 | 0.3809 | 0.5247 | | 1.8101 | 16.2716 | 7550 | 1.1023 | 0.2103 | 0.4652 | 0.1497 | 0.2133 | 0.2499 | -1.0 | 0.0649 | 0.283 | 0.4204 | 0.4217 | 0.4 | -1.0 | 0.0311 | 0.3114 | 0.3894 | 0.5294 | | 1.0946 | 16.3793 | 7600 | 1.1104 | 0.2084 | 0.4541 | 0.1422 | 0.2107 | 0.2748 | -1.0 | 0.0637 | 0.2786 | 0.4188 | 0.4202 | 0.4 | -1.0 | 0.0288 | 0.305 | 0.3879 | 0.5327 | | 1.6373 | 16.4871 | 7650 | 1.1160 | 0.2097 | 0.4637 | 0.1519 | 0.212 | 0.2893 | -1.0 | 0.0641 | 0.2809 | 0.4182 | 0.4192 | 0.4357 | -1.0 | 0.0316 | 0.3048 | 0.3877 | 0.5315 | | 
1.3377 | 16.5948 | 7700 | 1.1163 | 0.2075 | 0.4579 | 0.1441 | 0.2102 | 0.265 | -1.0 | 0.0628 | 0.2787 | 0.4221 | 0.4232 | 0.4286 | -1.0 | 0.0295 | 0.3117 | 0.3856 | 0.5326 | | 1.8192 | 16.7026 | 7750 | 1.1129 | 0.2039 | 0.4637 | 0.1318 | 0.2071 | 0.2707 | -1.0 | 0.0619 | 0.2803 | 0.4181 | 0.4192 | 0.4143 | -1.0 | 0.0306 | 0.3098 | 0.3773 | 0.5263 | | 1.0494 | 16.8103 | 7800 | 1.1104 | 0.2079 | 0.4713 | 0.1387 | 0.2109 | 0.2665 | -1.0 | 0.0636 | 0.287 | 0.4151 | 0.4163 | 0.4214 | -1.0 | 0.0332 | 0.3008 | 0.3825 | 0.5295 | | 1.2957 | 16.9181 | 7850 | 1.1126 | 0.2033 | 0.462 | 0.1334 | 0.2063 | 0.2712 | -1.0 | 0.0627 | 0.2838 | 0.4115 | 0.4124 | 0.4286 | -1.0 | 0.0312 | 0.2989 | 0.3753 | 0.524 | | 1.6719 | 17.0259 | 7900 | 1.1025 | 0.2116 | 0.4709 | 0.148 | 0.215 | 0.2696 | -1.0 | 0.0646 | 0.2906 | 0.4239 | 0.4252 | 0.4143 | -1.0 | 0.0349 | 0.3108 | 0.3883 | 0.5369 | | 2.2098 | 17.1336 | 7950 | 1.1015 | 0.2086 | 0.4623 | 0.1494 | 0.2122 | 0.272 | -1.0 | 0.0641 | 0.2871 | 0.424 | 0.4252 | 0.4143 | -1.0 | 0.0318 | 0.3142 | 0.3853 | 0.5337 | | 1.7784 | 17.2414 | 8000 | 1.1134 | 0.2076 | 0.4642 | 0.144 | 0.2107 | 0.2688 | -1.0 | 0.0639 | 0.2841 | 0.4244 | 0.4256 | 0.4143 | -1.0 | 0.0308 | 0.3158 | 0.3845 | 0.533 | | 1.0112 | 17.3491 | 8050 | 1.1034 | 0.2097 | 0.4648 | 0.1519 | 0.2129 | 0.272 | -1.0 | 0.0643 | 0.2861 | 0.4261 | 0.4274 | 0.4 | -1.0 | 0.0327 | 0.3185 | 0.3867 | 0.5336 | | 1.0751 | 17.4569 | 8100 | 1.1009 | 0.2087 | 0.4678 | 0.1473 | 0.2115 | 0.2899 | -1.0 | 0.0636 | 0.288 | 0.4229 | 0.424 | 0.4286 | -1.0 | 0.0347 | 0.3139 | 0.3828 | 0.532 | | 1.2879 | 17.5647 | 8150 | 1.1009 | 0.211 | 0.4707 | 0.1426 | 0.2147 | 0.2868 | -1.0 | 0.0657 | 0.2911 | 0.4227 | 0.4238 | 0.4286 | -1.0 | 0.0366 | 0.3103 | 0.3855 | 0.5352 | | 1.3776 | 17.6724 | 8200 | 1.1000 | 0.2081 | 0.4646 | 0.1396 | 0.2111 | 0.2873 | -1.0 | 0.0645 | 0.2868 | 0.4261 | 0.4272 | 0.4286 | -1.0 | 0.0325 | 0.3187 | 0.3838 | 0.5336 | | 1.0149 | 17.7802 | 8250 | 1.0993 | 0.2131 | 0.469 | 0.1508 | 0.2163 | 0.2803 | -1.0 | 0.0657 | 0.2926 | 0.4244 | 0.4255 | 0.4286 | -1.0 | 0.0371 | 0.3129 | 0.3891 | 0.5359 | | 1.2494 | 17.8879 | 8300 | 1.0979 | 0.2117 | 0.4685 | 0.1448 | 0.2149 | 0.282 | -1.0 | 0.0651 | 0.2912 | 0.4271 | 0.4284 | 0.4143 | -1.0 | 0.0371 | 0.3178 | 0.3862 | 0.5363 | | 1.163 | 17.9957 | 8350 | 1.0946 | 0.21 | 0.4656 | 0.1399 | 0.2135 | 0.2528 | -1.0 | 0.0646 | 0.2906 | 0.4232 | 0.4246 | 0.4 | -1.0 | 0.0368 | 0.3132 | 0.3832 | 0.5333 | | 1.516 | 18.1034 | 8400 | 1.0930 | 0.2115 | 0.4646 | 0.1483 | 0.2148 | 0.2571 | -1.0 | 0.0644 | 0.2919 | 0.4265 | 0.428 | 0.4 | -1.0 | 0.0374 | 0.3146 | 0.3856 | 0.5385 | | 1.199 | 18.2112 | 8450 | 1.0872 | 0.2157 | 0.4698 | 0.1497 | 0.2192 | 0.2637 | -1.0 | 0.066 | 0.296 | 0.4257 | 0.427 | 0.4143 | -1.0 | 0.0388 | 0.3115 | 0.3926 | 0.5398 | | 1.1083 | 18.3190 | 8500 | 1.0956 | 0.2114 | 0.4682 | 0.1409 | 0.2141 | 0.288 | -1.0 | 0.0648 | 0.2926 | 0.4255 | 0.4267 | 0.4214 | -1.0 | 0.037 | 0.3155 | 0.3857 | 0.5355 | | 1.1415 | 18.4267 | 8550 | 1.0883 | 0.2112 | 0.466 | 0.1434 | 0.2152 | 0.2507 | -1.0 | 0.0655 | 0.2949 | 0.4269 | 0.4283 | 0.4 | -1.0 | 0.0371 | 0.319 | 0.3852 | 0.5349 | | 1.451 | 18.5345 | 8600 | 1.0935 | 0.2107 | 0.4668 | 0.1439 | 0.2142 | 0.2476 | -1.0 | 0.0642 | 0.2922 | 0.4278 | 0.4293 | 0.3857 | -1.0 | 0.037 | 0.322 | 0.3844 | 0.5336 | | 1.3117 | 18.6422 | 8650 | 1.1024 | 0.2099 | 0.4581 | 0.1484 | 0.2133 | 0.2578 | -1.0 | 0.0633 | 0.2879 | 0.4239 | 0.4252 | 0.4071 | -1.0 | 0.033 | 0.3137 | 0.3868 | 0.534 | | 1.0472 | 18.75 | 8700 | 1.0960 | 0.2079 | 0.4603 | 0.1455 | 
0.2103 | 0.2834 | -1.0 | 0.0624 | 0.2896 | 0.4195 | 0.4204 | 0.4429 | -1.0 | 0.0333 | 0.3088 | 0.3824 | 0.5302 | | 1.0329 | 18.8578 | 8750 | 1.0874 | 0.2111 | 0.4609 | 0.1524 | 0.2146 | 0.2738 | -1.0 | 0.0635 | 0.2934 | 0.4221 | 0.4234 | 0.4143 | -1.0 | 0.0343 | 0.3096 | 0.388 | 0.5346 | | 1.3704 | 18.9655 | 8800 | 1.0882 | 0.2115 | 0.4688 | 0.1507 | 0.2148 | 0.2562 | -1.0 | 0.0636 | 0.2912 | 0.4235 | 0.4251 | 0.3786 | -1.0 | 0.0365 | 0.3157 | 0.3864 | 0.5312 | | 2.1077 | 19.0733 | 8850 | 1.0882 | 0.2107 | 0.4673 | 0.1527 | 0.2131 | 0.2712 | -1.0 | 0.0631 | 0.2896 | 0.4216 | 0.4231 | 0.3857 | -1.0 | 0.0353 | 0.3145 | 0.3861 | 0.5288 | | 1.3446 | 19.1810 | 8900 | 1.0947 | 0.2083 | 0.4593 | 0.1564 | 0.2114 | 0.2681 | -1.0 | 0.0628 | 0.2851 | 0.4212 | 0.4227 | 0.3857 | -1.0 | 0.0336 | 0.3147 | 0.383 | 0.5276 | | 1.0836 | 19.2888 | 8950 | 1.0971 | 0.2112 | 0.4646 | 0.1533 | 0.2141 | 0.2694 | -1.0 | 0.0638 | 0.288 | 0.422 | 0.4233 | 0.4 | -1.0 | 0.034 | 0.3131 | 0.3884 | 0.5308 | | 1.1653 | 19.3966 | 9000 | 1.0909 | 0.2084 | 0.4632 | 0.1504 | 0.2113 | 0.2694 | -1.0 | 0.0624 | 0.2869 | 0.4179 | 0.4192 | 0.4 | -1.0 | 0.0344 | 0.3101 | 0.3825 | 0.5256 | | 2.1015 | 19.5043 | 9050 | 1.0924 | 0.2091 | 0.4625 | 0.151 | 0.2124 | 0.278 | -1.0 | 0.0634 | 0.2871 | 0.4211 | 0.4225 | 0.4 | -1.0 | 0.0338 | 0.3149 | 0.3844 | 0.5273 | | 1.0125 | 19.6121 | 9100 | 1.0994 | 0.2086 | 0.4646 | 0.1483 | 0.2106 | 0.2771 | -1.0 | 0.0634 | 0.2854 | 0.4191 | 0.4205 | 0.3929 | -1.0 | 0.0317 | 0.3119 | 0.3855 | 0.5263 | | 1.631 | 19.7198 | 9150 | 1.0970 | 0.2071 | 0.4617 | 0.1453 | 0.2089 | 0.2796 | -1.0 | 0.0636 | 0.283 | 0.418 | 0.4194 | 0.3929 | -1.0 | 0.0319 | 0.3134 | 0.3823 | 0.5227 | | 1.1955 | 19.8276 | 9200 | 1.0946 | 0.2083 | 0.4653 | 0.1491 | 0.2109 | 0.2762 | -1.0 | 0.064 | 0.2856 | 0.4204 | 0.4219 | 0.3857 | -1.0 | 0.0328 | 0.3142 | 0.3838 | 0.5266 | | 1.4341 | 19.9353 | 9250 | 1.0979 | 0.2106 | 0.4655 | 0.156 | 0.2131 | 0.2922 | -1.0 | 0.0658 | 0.2842 | 0.4212 | 0.4226 | 0.3857 | -1.0 | 0.0318 | 0.3137 | 0.3894 | 0.5286 | | 1.3996 | 20.0431 | 9300 | 1.0955 | 0.2077 | 0.4656 | 0.1448 | 0.21 | 0.2925 | -1.0 | 0.0633 | 0.2842 | 0.4172 | 0.4185 | 0.3929 | -1.0 | 0.0323 | 0.3117 | 0.3831 | 0.5227 | | 1.5216 | 20.1509 | 9350 | 1.0997 | 0.2086 | 0.4642 | 0.1534 | 0.2107 | 0.2871 | -1.0 | 0.0647 | 0.2844 | 0.4208 | 0.4223 | 0.3857 | -1.0 | 0.0316 | 0.3167 | 0.3857 | 0.525 | | 1.4524 | 20.2586 | 9400 | 1.0988 | 0.2082 | 0.4635 | 0.1475 | 0.2102 | 0.2866 | -1.0 | 0.0639 | 0.2848 | 0.4198 | 0.4212 | 0.3857 | -1.0 | 0.0316 | 0.3147 | 0.3848 | 0.5249 | | 1.3485 | 20.3664 | 9450 | 1.0998 | 0.2075 | 0.4637 | 0.1431 | 0.2093 | 0.2869 | -1.0 | 0.0644 | 0.2821 | 0.4188 | 0.4203 | 0.3857 | -1.0 | 0.0314 | 0.3142 | 0.3836 | 0.5234 | | 2.4621 | 20.4741 | 9500 | 1.0980 | 0.2084 | 0.4617 | 0.1439 | 0.2102 | 0.2809 | -1.0 | 0.0639 | 0.2833 | 0.4199 | 0.4212 | 0.3929 | -1.0 | 0.0314 | 0.3139 | 0.3855 | 0.5259 | | 0.9933 | 20.5819 | 9550 | 1.0934 | 0.2105 | 0.4641 | 0.1528 | 0.2134 | 0.2782 | -1.0 | 0.0642 | 0.2863 | 0.4228 | 0.4243 | 0.3857 | -1.0 | 0.032 | 0.3149 | 0.389 | 0.5307 | | 1.1412 | 20.6897 | 9600 | 1.0980 | 0.2097 | 0.4615 | 0.1524 | 0.2119 | 0.2903 | -1.0 | 0.0649 | 0.2853 | 0.4197 | 0.4211 | 0.3929 | -1.0 | 0.0316 | 0.311 | 0.3877 | 0.5285 | | 1.4535 | 20.7974 | 9650 | 1.0932 | 0.2106 | 0.4635 | 0.1498 | 0.2127 | 0.2903 | -1.0 | 0.0647 | 0.2862 | 0.4213 | 0.4227 | 0.3929 | -1.0 | 0.0322 | 0.3135 | 0.3891 | 0.5291 | | 1.3408 | 20.9052 | 9700 | 1.0924 | 0.2119 | 0.463 | 0.1554 | 0.2139 | 0.2849 | -1.0 | 0.0645 | 0.2872 | 
0.4217 | 0.4232 | 0.3857 | -1.0 | 0.0323 | 0.3125 | 0.3916 | 0.5308 | | 1.2117 | 21.0129 | 9750 | 1.0933 | 0.2104 | 0.4613 | 0.1551 | 0.2124 | 0.2995 | -1.0 | 0.0647 | 0.2855 | 0.4211 | 0.4225 | 0.3929 | -1.0 | 0.0313 | 0.3122 | 0.3894 | 0.5301 | | 1.3188 | 21.1207 | 9800 | 1.0909 | 0.2095 | 0.4611 | 0.1546 | 0.2119 | 0.2995 | -1.0 | 0.0651 | 0.285 | 0.4216 | 0.423 | 0.3929 | -1.0 | 0.0316 | 0.3139 | 0.3874 | 0.5294 | | 1.5079 | 21.2284 | 9850 | 1.0923 | 0.21 | 0.4628 | 0.1534 | 0.213 | 0.2849 | -1.0 | 0.0651 | 0.2856 | 0.422 | 0.4235 | 0.3857 | -1.0 | 0.032 | 0.3141 | 0.3879 | 0.5298 | | 0.9257 | 21.3362 | 9900 | 1.0920 | 0.2099 | 0.4621 | 0.1547 | 0.2124 | 0.2995 | -1.0 | 0.0645 | 0.2852 | 0.4213 | 0.4227 | 0.3929 | -1.0 | 0.0318 | 0.3129 | 0.3881 | 0.5298 | | 1.3689 | 21.4440 | 9950 | 1.0928 | 0.2107 | 0.4622 | 0.1539 | 0.2131 | 0.3049 | -1.0 | 0.065 | 0.2862 | 0.4222 | 0.4236 | 0.4 | -1.0 | 0.0321 | 0.3141 | 0.3893 | 0.5304 | | 1.4012 | 21.5517 | 10000 | 1.0928 | 0.2104 | 0.462 | 0.1556 | 0.2131 | 0.2995 | -1.0 | 0.065 | 0.2862 | 0.4225 | 0.424 | 0.3929 | -1.0 | 0.032 | 0.3135 | 0.3889 | 0.5315 | ### Framework versions - Transformers 4.44.2 - Pytorch 2.4.1+cu121 - Datasets 2.21.0 - Tokenizers 0.19.1
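The "Intended uses & limitations" and "Training and evaluation data" sections are left empty, but the base checkpoint (facebook/detr-resnet-50-dc5) and the two classes (trophozoite, wbc) make the inference path reasonably clear. A minimal sketch, assuming the fine-tuned weights are published under the repository name and load through the standard DETR classes — the card itself does not include usage code, and the input image name and threshold below are placeholders:

```python
# Sketch only: standard transformers object-detection flow for a DETR fine-tune.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "jonathansuru/detr-resnet-50-dc5-ano"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("smear.jpg").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into labelled boxes in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(
    detections["scores"], detections["labels"], detections["boxes"]
):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```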
[ "trophozoite", "wbc" ]
acervos-digitais/conditional-detr-resnet-50-ft-0915-e192-augm
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
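The Evaluation and Results sections above are placeholders. The repository name points at a Conditional DETR (ResNet-50) fine-tune, and the label set that follows (awning, balcony, ramp) suggests an object-detection task, so the usual way to fill those sections in would be a COCO-style mAP/mAR evaluation. A toy sketch with torchmetrics, purely as an illustration of the metric call rather than anything taken from this card — the boxes, scores, and class id are made up:

```python
# Toy sketch: COCO-style mAP/mAR with torchmetrics (requires pycocotools).
import torch
from torchmetrics.detection import MeanAveragePrecision

# One image, one predicted box vs. one ground-truth box (absolute xyxy pixels).
preds = [{
    "boxes": torch.tensor([[50.0, 60.0, 200.0, 220.0]]),
    "scores": torch.tensor([0.83]),
    "labels": torch.tensor([1]),  # hypothetical class id, e.g. "balcony"
}]
targets = [{
    "boxes": torch.tensor([[55.0, 62.0, 205.0, 225.0]]),
    "labels": torch.tensor([1]),
}]

metric = MeanAveragePrecision(box_format="xyxy")
metric.update(preds, targets)
print(metric.compute())  # dict with map, map_50, map_75, mar_1, mar_10, mar_100, ...
```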
[ "awning", "balcony", "ramp" ]
acervos-digitais/conditional-detr-resnet-50-ft-0915-e256-augm2
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "awning", "balcony", "ramp" ]
skiba16/flowchart
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6", "label_7", "label_8", "label_9", "label_10", "label_11", "label_12", "label_13", "label_14", "label_15" ]
acervos-digitais/conditional-detr-resnet-50-ft-0915-e256-augm3
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "awning", "balcony", "ramp" ]
0llheaven/CON-DETR-V10
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "pneumonia", "normal" ]
jaxnwagner/detr-resnet-50-dc5-fashionpedia-finetuned-jw
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-resnet-50-dc5-fashionpedia-finetuned-jw This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unknown dataset. It achieves the following results on the evaluation set: - eval_loss: 3.4234 - eval_map: 0.0022 - eval_map_50: 0.0063 - eval_map_75: 0.0008 - eval_map_small: 0.0022 - eval_map_medium: 0.003 - eval_map_large: 0.0007 - eval_mar_1: 0.0034 - eval_mar_10: 0.0093 - eval_mar_100: 0.0131 - eval_mar_small: 0.009 - eval_mar_medium: 0.0171 - eval_mar_large: 0.0144 - eval_map_shirt, blouse: 0.0 - eval_mar_100_shirt, blouse: 0.0 - eval_map_top, t-shirt, sweatshirt: 0.0 - eval_mar_100_top, t-shirt, sweatshirt: 0.0 - eval_map_sweater: 0.0 - eval_mar_100_sweater: 0.0 - eval_map_cardigan: 0.0 - eval_mar_100_cardigan: 0.0 - eval_map_jacket: 0.0 - eval_mar_100_jacket: 0.0 - eval_map_vest: 0.0 - eval_mar_100_vest: 0.0 - eval_map_pants: 0.0 - eval_mar_100_pants: 0.0 - eval_map_shorts: 0.0 - eval_mar_100_shorts: 0.0 - eval_map_skirt: 0.0 - eval_mar_100_skirt: 0.0 - eval_map_coat: 0.0 - eval_mar_100_coat: 0.0 - eval_map_dress: 0.0 - eval_mar_100_dress: 0.0 - eval_map_jumpsuit: 0.0 - eval_mar_100_jumpsuit: 0.0 - eval_map_cape: 0.0 - eval_mar_100_cape: 0.0 - eval_map_glasses: 0.0 - eval_mar_100_glasses: 0.0 - eval_map_hat: 0.0 - eval_mar_100_hat: 0.0 - eval_map_headband, head covering, hair accessory: 0.0 - eval_mar_100_headband, head covering, hair accessory: 0.0 - eval_map_tie: 0.0 - eval_mar_100_tie: 0.0 - eval_map_glove: 0.0 - eval_mar_100_glove: 0.0 - eval_map_watch: 0.0 - eval_mar_100_watch: 0.0 - eval_map_belt: 0.0 - eval_mar_100_belt: 0.0 - eval_map_leg warmer: 0.0 - eval_mar_100_leg warmer: 0.0 - eval_map_tights, stockings: 0.0 - eval_mar_100_tights, stockings: 0.0 - eval_map_sock: 0.0 - eval_mar_100_sock: 0.0 - eval_map_shoe: 0.0939 - eval_mar_100_shoe: 0.3148 - eval_map_bag, wallet: 0.0 - eval_mar_100_bag, wallet: 0.0 - eval_map_scarf: 0.0 - eval_mar_100_scarf: 0.0 - eval_map_umbrella: 0.0 - eval_mar_100_umbrella: 0.0 - eval_map_hood: 0.0 - eval_mar_100_hood: 0.0 - eval_map_collar: 0.0 - eval_mar_100_collar: 0.0 - eval_map_lapel: 0.0 - eval_mar_100_lapel: 0.0 - eval_map_epaulette: 0.0 - eval_mar_100_epaulette: 0.0 - eval_map_sleeve: 0.0076 - eval_mar_100_sleeve: 0.2888 - eval_map_pocket: 0.0 - eval_mar_100_pocket: 0.0 - eval_map_neckline: 0.0 - eval_mar_100_neckline: 0.0 - eval_map_buckle: 0.0 - eval_mar_100_buckle: 0.0 - eval_map_zipper: 0.0 - eval_mar_100_zipper: 0.0 - eval_map_applique: 0.0 - eval_mar_100_applique: 0.0 - eval_map_bead: 0.0 - eval_mar_100_bead: 0.0 - eval_map_bow: 0.0 - eval_mar_100_bow: 0.0 - eval_map_flower: 0.0 - eval_mar_100_flower: 0.0 - eval_map_fringe: 0.0 - eval_mar_100_fringe: 0.0 - eval_map_ribbon: 0.0 - eval_mar_100_ribbon: 0.0 - eval_map_rivet: 0.0 - eval_mar_100_rivet: 0.0 - eval_map_ruffle: 0.0 - eval_mar_100_ruffle: 0.0 - eval_map_sequin: 0.0 - eval_mar_100_sequin: 0.0 - eval_map_tassel: 0.0 - eval_mar_100_tassel: 0.0 - eval_runtime: 202.1296 - eval_samples_per_second: 5.729 - eval_steps_per_second: 1.435 - epoch: 0.0482 - step: 550 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - 
learning_rate: 1e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - training_steps: 10000 - mixed_precision_training: Native AMP ### Framework versions - Transformers 4.45.2 - Pytorch 2.4.1+cu121 - Datasets 3.0.1 - Tokenizers 0.20.1
[ "shirt, blouse", "top, t-shirt, sweatshirt", "sweater", "cardigan", "jacket", "vest", "pants", "shorts", "skirt", "coat", "dress", "jumpsuit", "cape", "glasses", "hat", "headband, head covering, hair accessory", "tie", "glove", "watch", "belt", "leg warmer", "tights, stockings", "sock", "shoe", "bag, wallet", "scarf", "umbrella", "hood", "collar", "lapel", "epaulette", "sleeve", "pocket", "neckline", "buckle", "zipper", "applique", "bead", "bow", "flower", "fringe", "ribbon", "rivet", "ruffle", "sequin", "tassel" ]
hassinaLetxbe/lightning_logs
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "table", "table rotated" ]
joe611/chickens-composite-201010101010-150-epochs-w-transform
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-composite-201010101010-150-epochs-w-transform This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2975 - Map: 0.776 - Map 50: 0.9553 - Map 75: 0.9037 - Map Small: 0.3024 - Map Medium: 0.7689 - Map Large: 0.8175 - Mar 1: 0.3029 - Mar 10: 0.8101 - Mar 100: 0.8143 - Mar Small: 0.3347 - Mar Medium: 0.8109 - Mar Large: 0.8477 - Map Chicken: 0.7778 - Mar 100 Chicken: 0.8216 - Map Duck: 0.7035 - Mar 100 Duck: 0.7385 - Map Plant: 0.8467 - Mar 100 Plant: 0.8828 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 150 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:| | 1.4015 | 1.0 | 350 | 1.6223 | 0.0731 | 0.1264 | 0.0779 | 0.0 | 0.037 | 0.1336 | 0.0509 | 0.213 | 0.2862 | 0.0 | 0.2222 | 0.3151 | 0.0601 | 0.3724 | 0.0 | 0.0 | 0.1591 | 0.4862 | | 1.3304 | 2.0 | 700 | 1.3527 | 0.0857 | 0.1237 | 0.0981 | 0.0031 | 0.0206 | 0.1401 | 0.0706 | 0.2057 | 0.3043 | 0.0458 | 0.2488 | 0.3303 | 0.0287 | 0.1681 | 0.0 | 0.0 | 0.2283 | 0.7448 | | 1.2364 | 3.0 | 1050 | 1.2735 | 0.1332 | 0.1913 | 0.1536 | 0.0016 | 0.044 | 0.1722 | 0.0822 | 0.2264 | 0.2923 | 0.0542 | 0.2517 | 0.2954 | 0.0214 | 0.1114 | 0.0 | 0.0 | 0.378 | 0.7657 | | 1.2061 | 4.0 | 1400 | 1.1804 | 0.185 | 0.2661 | 0.2162 | 0.0095 | 0.061 | 0.2246 | 0.0791 | 0.2487 | 0.2697 | 0.0792 | 0.2316 | 0.2708 | 0.021 | 0.0703 | 0.0 | 0.0 | 0.534 | 0.7389 | | 1.1017 | 5.0 | 1750 | 1.0807 | 0.2234 | 0.3147 | 0.2598 | 0.0063 | 0.1273 | 0.2447 | 0.0998 | 0.2939 | 0.3097 | 0.0458 | 0.2816 | 0.289 | 0.0772 | 0.1751 | 0.0 | 0.0 | 0.5929 | 0.754 | | 0.961 | 6.0 | 2100 | 1.0394 | 0.2444 | 0.3482 | 0.2877 | 0.0422 | 0.1562 | 0.2637 | 0.1053 | 0.316 | 0.3257 | 0.15 | 0.3029 | 0.3035 | 0.1075 | 0.247 | 0.0 | 0.0 | 0.6256 | 0.7301 | | 1.0604 | 7.0 | 2450 | 1.0072 | 0.2696 | 0.3797 | 0.3146 | 0.0116 | 0.1993 | 0.2941 | 0.1062 | 0.3387 | 0.3414 | 0.0625 | 0.3076 | 0.3341 | 0.1639 | 0.2973 | 0.0 | 0.0 | 0.6451 | 0.7268 | | 0.9828 | 8.0 | 2800 | 0.9735 | 0.3228 | 0.4682 | 0.3684 | 0.0274 | 0.2773 | 0.329 | 0.1177 | 0.4097 | 0.4157 | 0.0667 | 0.3937 | 0.4051 | 0.2995 | 0.5119 | 0.0 | 0.0 | 0.6689 | 0.7351 | | 0.9783 | 9.0 | 3150 | 0.9098 | 0.3259 | 0.4752 | 0.3836 | 0.0058 | 0.2781 | 0.3507 | 0.1163 | 0.4218 | 0.4295 | 0.0458 | 0.3973 | 0.4243 | 0.2965 | 0.5416 | 0.0 | 0.0 | 0.6811 | 0.7469 | | 0.8591 | 10.0 | 3500 | 0.8607 | 0.3583 | 0.5239 | 0.4245 | 0.0219 | 0.3264 | 
0.382 | 0.1172 | 0.4577 | 0.4629 | 0.1125 | 0.442 | 0.4668 | 0.4081 | 0.6616 | 0.0 | 0.0 | 0.6668 | 0.7272 | | 0.9684 | 11.0 | 3850 | 0.8250 | 0.3463 | 0.4889 | 0.403 | 0.0029 | 0.3009 | 0.3763 | 0.1204 | 0.4592 | 0.4703 | 0.05 | 0.441 | 0.4834 | 0.3604 | 0.6578 | 0.0 | 0.0 | 0.6784 | 0.7531 | | 0.7974 | 12.0 | 4200 | 0.7572 | 0.3763 | 0.5392 | 0.4355 | 0.0201 | 0.3315 | 0.4073 | 0.1202 | 0.4752 | 0.4828 | 0.0667 | 0.4484 | 0.5146 | 0.4459 | 0.6995 | 0.0 | 0.0 | 0.683 | 0.749 | | 0.9251 | 13.0 | 4550 | 0.8174 | 0.3836 | 0.5534 | 0.4489 | 0.0182 | 0.3402 | 0.4084 | 0.1256 | 0.472 | 0.4744 | 0.0958 | 0.4455 | 0.4968 | 0.4663 | 0.6838 | 0.0 | 0.0 | 0.6846 | 0.7393 | | 0.6452 | 14.0 | 4900 | 0.7647 | 0.3911 | 0.555 | 0.4545 | 0.0541 | 0.3545 | 0.4092 | 0.1268 | 0.4828 | 0.4908 | 0.125 | 0.465 | 0.5067 | 0.4792 | 0.7168 | 0.0 | 0.0 | 0.6942 | 0.7556 | | 0.811 | 15.0 | 5250 | 0.7892 | 0.3914 | 0.5657 | 0.4546 | 0.0211 | 0.3425 | 0.4176 | 0.1308 | 0.4679 | 0.4737 | 0.0583 | 0.4379 | 0.4973 | 0.4858 | 0.6643 | 0.0 | 0.0 | 0.6884 | 0.7569 | | 0.7441 | 16.0 | 5600 | 0.6832 | 0.4031 | 0.5674 | 0.4806 | 0.0188 | 0.357 | 0.4267 | 0.1274 | 0.49 | 0.4988 | 0.1417 | 0.459 | 0.5234 | 0.4941 | 0.7205 | 0.0 | 0.0 | 0.7153 | 0.7757 | | 0.7059 | 17.0 | 5950 | 0.6710 | 0.4046 | 0.5732 | 0.4775 | 0.0451 | 0.3559 | 0.4176 | 0.1269 | 0.4867 | 0.4921 | 0.1417 | 0.4571 | 0.5155 | 0.4877 | 0.7 | 0.0 | 0.0015 | 0.726 | 0.7749 | | 0.6968 | 18.0 | 6300 | 0.7010 | 0.3908 | 0.5742 | 0.4667 | 0.0249 | 0.3462 | 0.414 | 0.123 | 0.4627 | 0.4723 | 0.1375 | 0.4367 | 0.4901 | 0.4805 | 0.6735 | 0.0 | 0.0 | 0.6918 | 0.7435 | | 0.9007 | 19.0 | 6650 | 0.6574 | 0.4006 | 0.5705 | 0.4698 | 0.0228 | 0.3679 | 0.4237 | 0.127 | 0.4826 | 0.4873 | 0.1083 | 0.4574 | 0.5214 | 0.4835 | 0.6897 | 0.0018 | 0.0108 | 0.7166 | 0.7615 | | 0.6598 | 20.0 | 7000 | 0.6620 | 0.4129 | 0.571 | 0.4961 | 0.0867 | 0.376 | 0.4058 | 0.1297 | 0.4912 | 0.4972 | 0.1583 | 0.4654 | 0.5031 | 0.5112 | 0.7011 | 0.001 | 0.0077 | 0.7267 | 0.7828 | | 0.6775 | 21.0 | 7350 | 0.6212 | 0.4181 | 0.5838 | 0.4786 | 0.1091 | 0.3748 | 0.4463 | 0.1355 | 0.4945 | 0.5006 | 0.1667 | 0.4661 | 0.5274 | 0.5058 | 0.7022 | 0.0099 | 0.0077 | 0.7387 | 0.7921 | | 0.7116 | 22.0 | 7700 | 0.6413 | 0.4356 | 0.5963 | 0.5087 | 0.0639 | 0.3964 | 0.4662 | 0.1357 | 0.5039 | 0.5096 | 0.175 | 0.4705 | 0.5478 | 0.5642 | 0.733 | 0.0069 | 0.0108 | 0.7358 | 0.7849 | | 0.6683 | 23.0 | 8050 | 0.6176 | 0.4366 | 0.5973 | 0.5318 | 0.117 | 0.3984 | 0.4514 | 0.1318 | 0.4979 | 0.5051 | 0.1958 | 0.4741 | 0.5288 | 0.5704 | 0.7227 | 0.0 | 0.0 | 0.7394 | 0.7925 | | 0.6173 | 24.0 | 8400 | 0.6898 | 0.4073 | 0.5832 | 0.4957 | 0.1175 | 0.3651 | 0.4295 | 0.1334 | 0.4789 | 0.4827 | 0.1833 | 0.4447 | 0.5203 | 0.5061 | 0.6741 | 0.0012 | 0.0138 | 0.7147 | 0.7603 | | 0.7656 | 25.0 | 8750 | 0.6743 | 0.4172 | 0.5985 | 0.4949 | 0.0938 | 0.382 | 0.4159 | 0.1235 | 0.4816 | 0.4891 | 0.2083 | 0.4657 | 0.4935 | 0.5175 | 0.6714 | 0.0007 | 0.0077 | 0.7332 | 0.7883 | | 0.7286 | 26.0 | 9100 | 0.6246 | 0.4313 | 0.6086 | 0.5104 | 0.0967 | 0.3909 | 0.4659 | 0.1462 | 0.5096 | 0.5129 | 0.1333 | 0.473 | 0.5574 | 0.5719 | 0.7216 | 0.0106 | 0.0523 | 0.7113 | 0.7649 | | 0.6441 | 27.0 | 9450 | 0.5930 | 0.4447 | 0.6055 | 0.5258 | 0.1523 | 0.3983 | 0.4661 | 0.1356 | 0.5053 | 0.5126 | 0.2417 | 0.4755 | 0.5325 | 0.5773 | 0.7276 | 0.002 | 0.0031 | 0.7547 | 0.8071 | | 0.651 | 28.0 | 9800 | 0.6006 | 0.4415 | 0.601 | 0.4975 | 0.0922 | 0.4016 | 0.4779 | 0.1338 | 0.5053 | 0.5115 | 0.1583 | 0.477 | 0.542 | 0.5672 | 0.7281 | 0.0 | 0.0 | 0.7575 | 0.8063 | | 0.5137 
| 29.0 | 10150 | 0.6239 | 0.4396 | 0.608 | 0.5041 | 0.0615 | 0.398 | 0.4678 | 0.1336 | 0.4979 | 0.5035 | 0.1625 | 0.4708 | 0.5345 | 0.5793 | 0.7205 | 0.0012 | 0.0046 | 0.7382 | 0.7854 | | 0.6163 | 30.0 | 10500 | 0.5765 | 0.4481 | 0.618 | 0.5296 | 0.06 | 0.4093 | 0.478 | 0.1363 | 0.501 | 0.5056 | 0.1375 | 0.4702 | 0.5379 | 0.5884 | 0.7157 | 0.0002 | 0.0015 | 0.7558 | 0.7996 | | 0.5929 | 31.0 | 10850 | 0.5546 | 0.4725 | 0.6243 | 0.5649 | 0.1042 | 0.4247 | 0.5053 | 0.1429 | 0.5154 | 0.5243 | 0.2333 | 0.4833 | 0.5631 | 0.6445 | 0.7508 | 0.0 | 0.0 | 0.773 | 0.8222 | | 0.6341 | 32.0 | 11200 | 0.5550 | 0.4694 | 0.6319 | 0.5652 | 0.099 | 0.431 | 0.4864 | 0.1431 | 0.5118 | 0.5164 | 0.2375 | 0.4821 | 0.536 | 0.6453 | 0.7416 | 0.002 | 0.0031 | 0.7609 | 0.8046 | | 0.5382 | 33.0 | 11550 | 0.5301 | 0.4826 | 0.6686 | 0.565 | 0.1157 | 0.4468 | 0.5038 | 0.1521 | 0.5268 | 0.5312 | 0.1917 | 0.5007 | 0.5459 | 0.6525 | 0.7378 | 0.0261 | 0.0415 | 0.7693 | 0.8142 | | 0.6319 | 34.0 | 11900 | 0.5459 | 0.4745 | 0.6429 | 0.5585 | 0.1157 | 0.4352 | 0.4948 | 0.1465 | 0.5139 | 0.521 | 0.2333 | 0.4819 | 0.5443 | 0.6451 | 0.7357 | 0.0118 | 0.0185 | 0.7666 | 0.8088 | | 0.5791 | 35.0 | 12250 | 0.5426 | 0.4938 | 0.6786 | 0.5782 | 0.1625 | 0.4568 | 0.5391 | 0.1691 | 0.5562 | 0.5636 | 0.2917 | 0.5256 | 0.608 | 0.6614 | 0.7411 | 0.0507 | 0.1338 | 0.7693 | 0.8159 | | 0.6093 | 36.0 | 12600 | 0.5230 | 0.4876 | 0.6655 | 0.5737 | 0.1815 | 0.4426 | 0.5191 | 0.1512 | 0.5362 | 0.5435 | 0.2958 | 0.5061 | 0.5667 | 0.6688 | 0.7535 | 0.0234 | 0.0585 | 0.7704 | 0.8184 | | 0.5854 | 37.0 | 12950 | 0.5303 | 0.487 | 0.6786 | 0.5749 | 0.1405 | 0.4664 | 0.4994 | 0.1533 | 0.5285 | 0.5382 | 0.2333 | 0.52 | 0.5525 | 0.6514 | 0.733 | 0.0504 | 0.0754 | 0.7592 | 0.8063 | | 0.5897 | 38.0 | 13300 | 0.5234 | 0.5375 | 0.7548 | 0.6217 | 0.153 | 0.5249 | 0.534 | 0.1855 | 0.5919 | 0.5955 | 0.2208 | 0.5888 | 0.5844 | 0.6594 | 0.7297 | 0.1785 | 0.2354 | 0.7746 | 0.8213 | | 0.6085 | 39.0 | 13650 | 0.5286 | 0.5221 | 0.7329 | 0.6022 | 0.1777 | 0.4899 | 0.5307 | 0.1756 | 0.5782 | 0.5872 | 0.3083 | 0.5628 | 0.5936 | 0.637 | 0.7059 | 0.1514 | 0.2323 | 0.778 | 0.8234 | | 0.5987 | 40.0 | 14000 | 0.5026 | 0.6007 | 0.8007 | 0.7017 | 0.0701 | 0.575 | 0.6082 | 0.2161 | 0.641 | 0.65 | 0.1583 | 0.6284 | 0.6654 | 0.6862 | 0.7449 | 0.3459 | 0.3954 | 0.77 | 0.8096 | | 0.6375 | 41.0 | 14350 | 0.4839 | 0.6234 | 0.8405 | 0.7077 | 0.1601 | 0.611 | 0.5966 | 0.2305 | 0.6674 | 0.6747 | 0.2694 | 0.665 | 0.6393 | 0.6765 | 0.7432 | 0.4108 | 0.4615 | 0.783 | 0.8192 | | 0.5196 | 42.0 | 14700 | 0.4946 | 0.6597 | 0.8971 | 0.7776 | 0.1203 | 0.6492 | 0.6616 | 0.2678 | 0.7068 | 0.7104 | 0.25 | 0.7015 | 0.7165 | 0.6663 | 0.7216 | 0.5288 | 0.5862 | 0.7842 | 0.8234 | | 0.6201 | 43.0 | 15050 | 0.4747 | 0.6575 | 0.8965 | 0.7788 | 0.128 | 0.6487 | 0.6622 | 0.2554 | 0.7108 | 0.7142 | 0.2611 | 0.7053 | 0.7295 | 0.6691 | 0.7308 | 0.5213 | 0.5908 | 0.782 | 0.8209 | | 0.6425 | 44.0 | 15400 | 0.4644 | 0.677 | 0.8952 | 0.8202 | 0.1405 | 0.6713 | 0.689 | 0.258 | 0.7204 | 0.7237 | 0.2042 | 0.7202 | 0.7215 | 0.7091 | 0.7638 | 0.5395 | 0.5846 | 0.7823 | 0.8226 | | 0.6206 | 45.0 | 15750 | 0.4516 | 0.6604 | 0.9031 | 0.7835 | 0.1947 | 0.6629 | 0.6599 | 0.259 | 0.7179 | 0.7207 | 0.2542 | 0.7277 | 0.6962 | 0.687 | 0.7481 | 0.5097 | 0.5877 | 0.7844 | 0.8264 | | 0.5001 | 46.0 | 16100 | 0.4551 | 0.6782 | 0.9126 | 0.8187 | 0.2606 | 0.6633 | 0.7248 | 0.2623 | 0.722 | 0.7262 | 0.2875 | 0.7149 | 0.7613 | 0.6951 | 0.7481 | 0.5637 | 0.6092 | 0.7757 | 0.8213 | | 0.5677 | 47.0 | 16450 | 0.4636 | 0.6845 | 0.921 | 0.8275 | 
0.1943 | 0.6686 | 0.7419 | 0.269 | 0.7221 | 0.7307 | 0.2625 | 0.7206 | 0.7832 | 0.6948 | 0.7465 | 0.5848 | 0.6277 | 0.7739 | 0.818 | | 0.5983 | 48.0 | 16800 | 0.4499 | 0.688 | 0.9201 | 0.8086 | 0.1483 | 0.6765 | 0.7213 | 0.2747 | 0.7312 | 0.735 | 0.2292 | 0.7285 | 0.7601 | 0.6915 | 0.7492 | 0.5879 | 0.6308 | 0.7846 | 0.8251 | | 0.5536 | 49.0 | 17150 | 0.4433 | 0.6963 | 0.9324 | 0.848 | 0.1873 | 0.6879 | 0.7475 | 0.2787 | 0.7422 | 0.7455 | 0.2417 | 0.7428 | 0.7798 | 0.6874 | 0.7454 | 0.6187 | 0.6631 | 0.7827 | 0.828 | | 0.5126 | 50.0 | 17500 | 0.4459 | 0.6854 | 0.9285 | 0.8089 | 0.1594 | 0.6739 | 0.7289 | 0.2755 | 0.733 | 0.7381 | 0.2583 | 0.7288 | 0.7768 | 0.6833 | 0.7422 | 0.6008 | 0.6538 | 0.7721 | 0.8184 | | 0.4895 | 51.0 | 17850 | 0.4430 | 0.6927 | 0.9286 | 0.7998 | 0.1293 | 0.6822 | 0.7355 | 0.2755 | 0.7336 | 0.7377 | 0.2472 | 0.7279 | 0.7761 | 0.6859 | 0.7389 | 0.6172 | 0.66 | 0.775 | 0.8142 | | 0.5019 | 52.0 | 18200 | 0.4188 | 0.6973 | 0.9156 | 0.8225 | 0.1778 | 0.6843 | 0.7255 | 0.2709 | 0.7375 | 0.7418 | 0.3042 | 0.7284 | 0.7771 | 0.7109 | 0.7649 | 0.5945 | 0.6385 | 0.7864 | 0.8222 | | 0.4635 | 53.0 | 18550 | 0.4386 | 0.6864 | 0.9184 | 0.8136 | 0.2141 | 0.6724 | 0.7268 | 0.2635 | 0.7309 | 0.7344 | 0.2917 | 0.7237 | 0.7653 | 0.6954 | 0.753 | 0.5759 | 0.6231 | 0.7878 | 0.8272 | | 0.53 | 54.0 | 18900 | 0.4101 | 0.6987 | 0.935 | 0.831 | 0.183 | 0.6723 | 0.7558 | 0.2682 | 0.7419 | 0.7468 | 0.2806 | 0.7268 | 0.7938 | 0.6862 | 0.747 | 0.611 | 0.6569 | 0.7989 | 0.8364 | | 0.3912 | 55.0 | 19250 | 0.4047 | 0.7052 | 0.9364 | 0.8331 | 0.1803 | 0.6879 | 0.7368 | 0.2777 | 0.7462 | 0.7518 | 0.2875 | 0.7422 | 0.7696 | 0.7061 | 0.7627 | 0.6105 | 0.6508 | 0.799 | 0.8418 | | 0.4196 | 56.0 | 19600 | 0.3918 | 0.7121 | 0.9409 | 0.8263 | 0.2154 | 0.6994 | 0.7414 | 0.2832 | 0.7538 | 0.7588 | 0.2542 | 0.7477 | 0.7848 | 0.6904 | 0.7497 | 0.6406 | 0.6831 | 0.8053 | 0.8435 | | 0.4695 | 57.0 | 19950 | 0.3751 | 0.7179 | 0.9405 | 0.8433 | 0.1949 | 0.7119 | 0.753 | 0.282 | 0.7606 | 0.7653 | 0.2583 | 0.7658 | 0.7947 | 0.7241 | 0.7778 | 0.6225 | 0.6708 | 0.8071 | 0.8473 | | 0.432 | 58.0 | 20300 | 0.4059 | 0.6973 | 0.9353 | 0.8266 | 0.1685 | 0.6833 | 0.7306 | 0.2677 | 0.7411 | 0.7467 | 0.25 | 0.7378 | 0.7835 | 0.7025 | 0.7616 | 0.5965 | 0.6415 | 0.7929 | 0.8368 | | 0.4664 | 59.0 | 20650 | 0.4190 | 0.6831 | 0.9366 | 0.8001 | 0.2139 | 0.6734 | 0.7117 | 0.2701 | 0.743 | 0.7492 | 0.3403 | 0.7367 | 0.7794 | 0.6592 | 0.7319 | 0.5967 | 0.6769 | 0.7936 | 0.8389 | | 0.4202 | 60.0 | 21000 | 0.3975 | 0.718 | 0.9382 | 0.8345 | 0.2012 | 0.7137 | 0.754 | 0.2864 | 0.7605 | 0.7664 | 0.2597 | 0.7633 | 0.7968 | 0.7311 | 0.7789 | 0.6254 | 0.68 | 0.7976 | 0.8402 | | 0.4697 | 61.0 | 21350 | 0.3969 | 0.7085 | 0.9339 | 0.8368 | 0.2817 | 0.697 | 0.7559 | 0.2763 | 0.7493 | 0.7547 | 0.3083 | 0.75 | 0.7983 | 0.7257 | 0.7768 | 0.5953 | 0.6431 | 0.8045 | 0.8444 | | 0.4766 | 62.0 | 21700 | 0.3912 | 0.7324 | 0.944 | 0.8479 | 0.2239 | 0.7227 | 0.7739 | 0.285 | 0.7731 | 0.7787 | 0.3458 | 0.7729 | 0.8189 | 0.7392 | 0.7876 | 0.6444 | 0.6938 | 0.8135 | 0.8548 | | 0.5348 | 63.0 | 22050 | 0.4089 | 0.7171 | 0.9399 | 0.8457 | 0.2197 | 0.6999 | 0.7707 | 0.2848 | 0.7551 | 0.7609 | 0.2972 | 0.752 | 0.8049 | 0.7181 | 0.7686 | 0.6393 | 0.6862 | 0.7939 | 0.828 | | 0.4411 | 64.0 | 22400 | 0.3839 | 0.7136 | 0.9221 | 0.8386 | 0.2142 | 0.6964 | 0.7817 | 0.281 | 0.7555 | 0.7599 | 0.3 | 0.7479 | 0.824 | 0.7251 | 0.7805 | 0.599 | 0.6462 | 0.8168 | 0.8531 | | 0.515 | 65.0 | 22750 | 0.3916 | 0.7188 | 0.9328 | 0.8476 | 0.246 | 0.7069 | 0.7728 | 0.2823 | 0.7571 | 0.761 | 
0.3083 | 0.7549 | 0.8045 | 0.7311 | 0.7773 | 0.625 | 0.6631 | 0.8002 | 0.8427 | | 0.4149 | 66.0 | 23100 | 0.3791 | 0.7229 | 0.9347 | 0.8742 | 0.2479 | 0.7118 | 0.7682 | 0.2787 | 0.7656 | 0.7694 | 0.3431 | 0.7588 | 0.8183 | 0.7257 | 0.7816 | 0.6256 | 0.6692 | 0.8173 | 0.8573 | | 0.4423 | 67.0 | 23450 | 0.3892 | 0.733 | 0.9353 | 0.8591 | 0.2376 | 0.7209 | 0.7761 | 0.2911 | 0.7717 | 0.7762 | 0.3333 | 0.7658 | 0.8147 | 0.7347 | 0.7876 | 0.6544 | 0.6938 | 0.8099 | 0.8473 | | 0.5056 | 68.0 | 23800 | 0.3703 | 0.739 | 0.9434 | 0.8816 | 0.2454 | 0.7267 | 0.7895 | 0.2912 | 0.7782 | 0.7834 | 0.3681 | 0.7753 | 0.8227 | 0.7363 | 0.7859 | 0.6674 | 0.7154 | 0.8131 | 0.849 | | 0.4528 | 69.0 | 24150 | 0.3878 | 0.7179 | 0.9405 | 0.8289 | 0.2635 | 0.7022 | 0.7609 | 0.287 | 0.7622 | 0.7679 | 0.3319 | 0.7579 | 0.7985 | 0.7037 | 0.7551 | 0.6305 | 0.6892 | 0.8196 | 0.8594 | | 0.488 | 70.0 | 24500 | 0.3665 | 0.7358 | 0.9358 | 0.8612 | 0.1846 | 0.7237 | 0.8002 | 0.2857 | 0.7726 | 0.7781 | 0.2764 | 0.7696 | 0.8338 | 0.7373 | 0.7854 | 0.656 | 0.6923 | 0.8141 | 0.8565 | | 0.4911 | 71.0 | 24850 | 0.3742 | 0.732 | 0.9426 | 0.8599 | 0.2712 | 0.7253 | 0.7568 | 0.2891 | 0.768 | 0.773 | 0.3292 | 0.7725 | 0.794 | 0.7239 | 0.7768 | 0.6607 | 0.6923 | 0.8114 | 0.8498 | | 0.517 | 72.0 | 25200 | 0.3744 | 0.7242 | 0.9268 | 0.8444 | 0.2803 | 0.7068 | 0.7958 | 0.2807 | 0.766 | 0.7722 | 0.3611 | 0.7574 | 0.8345 | 0.7385 | 0.7962 | 0.629 | 0.6769 | 0.8052 | 0.8435 | | 0.5344 | 73.0 | 25550 | 0.3824 | 0.7233 | 0.9412 | 0.8574 | 0.24 | 0.7044 | 0.7595 | 0.2847 | 0.764 | 0.7713 | 0.3361 | 0.7542 | 0.8068 | 0.7125 | 0.7649 | 0.6431 | 0.6985 | 0.8142 | 0.8506 | | 0.4292 | 74.0 | 25900 | 0.3551 | 0.7303 | 0.948 | 0.8483 | 0.2411 | 0.7229 | 0.7523 | 0.2896 | 0.7727 | 0.7788 | 0.3694 | 0.7734 | 0.7878 | 0.7327 | 0.7865 | 0.6445 | 0.6969 | 0.8138 | 0.8531 | | 0.4941 | 75.0 | 26250 | 0.3586 | 0.7332 | 0.9367 | 0.8555 | 0.212 | 0.716 | 0.7782 | 0.2862 | 0.7714 | 0.7775 | 0.3708 | 0.7611 | 0.8171 | 0.7346 | 0.7892 | 0.6423 | 0.6846 | 0.8228 | 0.8586 | | 0.433 | 76.0 | 26600 | 0.3784 | 0.7205 | 0.9407 | 0.8413 | 0.2176 | 0.7119 | 0.7569 | 0.2884 | 0.7628 | 0.7691 | 0.3569 | 0.7612 | 0.7945 | 0.7145 | 0.7741 | 0.6521 | 0.6969 | 0.7951 | 0.8364 | | 0.3602 | 77.0 | 26950 | 0.3571 | 0.7429 | 0.9484 | 0.8797 | 0.238 | 0.7339 | 0.7756 | 0.2939 | 0.7814 | 0.7866 | 0.3431 | 0.7795 | 0.8153 | 0.7461 | 0.7962 | 0.6758 | 0.72 | 0.8067 | 0.8435 | | 0.5168 | 78.0 | 27300 | 0.3580 | 0.732 | 0.9512 | 0.8732 | 0.2239 | 0.725 | 0.7628 | 0.2918 | 0.775 | 0.7798 | 0.325 | 0.7697 | 0.8062 | 0.7216 | 0.7735 | 0.6649 | 0.7169 | 0.8094 | 0.849 | | 0.4287 | 79.0 | 27650 | 0.3531 | 0.7378 | 0.9356 | 0.868 | 0.223 | 0.7154 | 0.8017 | 0.2856 | 0.7741 | 0.7781 | 0.3292 | 0.7575 | 0.8337 | 0.75 | 0.8032 | 0.649 | 0.6815 | 0.8142 | 0.8494 | | 0.4838 | 80.0 | 28000 | 0.3632 | 0.7289 | 0.9457 | 0.8408 | 0.2269 | 0.7157 | 0.758 | 0.2868 | 0.7738 | 0.7787 | 0.3069 | 0.7668 | 0.8026 | 0.7357 | 0.7876 | 0.6467 | 0.7062 | 0.8044 | 0.8423 | | 0.3844 | 81.0 | 28350 | 0.3603 | 0.7311 | 0.9351 | 0.8466 | 0.3043 | 0.7198 | 0.7787 | 0.2811 | 0.7659 | 0.7734 | 0.3625 | 0.7657 | 0.8154 | 0.7447 | 0.7903 | 0.6253 | 0.6677 | 0.8232 | 0.8623 | | 0.3887 | 82.0 | 28700 | 0.3477 | 0.7471 | 0.9451 | 0.8617 | 0.29 | 0.7337 | 0.7957 | 0.2904 | 0.7833 | 0.7889 | 0.3458 | 0.7784 | 0.8352 | 0.76 | 0.8038 | 0.6526 | 0.6985 | 0.8288 | 0.8644 | | 0.422 | 83.0 | 29050 | 0.3432 | 0.7492 | 0.9564 | 0.8753 | 0.3003 | 0.7328 | 0.7857 | 0.2914 | 0.7909 | 0.7942 | 0.3597 | 0.7773 | 0.8278 | 0.7519 | 0.8049 | 
0.6699 | 0.7154 | 0.8258 | 0.8623 | | 0.4219 | 84.0 | 29400 | 0.3515 | 0.7418 | 0.9475 | 0.9055 | 0.294 | 0.7297 | 0.7856 | 0.2908 | 0.7823 | 0.7876 | 0.3556 | 0.7794 | 0.8159 | 0.7394 | 0.7919 | 0.6644 | 0.7108 | 0.8216 | 0.8603 | | 0.429 | 85.0 | 29750 | 0.3381 | 0.7523 | 0.9483 | 0.895 | 0.3192 | 0.7365 | 0.8041 | 0.2935 | 0.7933 | 0.7985 | 0.3875 | 0.7841 | 0.841 | 0.7576 | 0.8076 | 0.677 | 0.7262 | 0.8222 | 0.8619 | | 0.457 | 86.0 | 30100 | 0.3434 | 0.7485 | 0.9477 | 0.8982 | 0.3254 | 0.734 | 0.7861 | 0.2916 | 0.7886 | 0.7938 | 0.4111 | 0.7803 | 0.8199 | 0.7415 | 0.7897 | 0.6769 | 0.7246 | 0.8271 | 0.8669 | | 0.4805 | 87.0 | 30450 | 0.3396 | 0.7527 | 0.9491 | 0.8731 | 0.2913 | 0.7379 | 0.7962 | 0.2953 | 0.791 | 0.7974 | 0.3806 | 0.7856 | 0.8398 | 0.7607 | 0.7973 | 0.6669 | 0.7262 | 0.8304 | 0.8686 | | 0.5091 | 88.0 | 30800 | 0.3460 | 0.7494 | 0.955 | 0.8928 | 0.3707 | 0.7286 | 0.7815 | 0.2891 | 0.7873 | 0.7918 | 0.4542 | 0.7713 | 0.8255 | 0.749 | 0.7957 | 0.6717 | 0.7154 | 0.8274 | 0.8644 | | 0.384 | 89.0 | 31150 | 0.3321 | 0.7612 | 0.956 | 0.8905 | 0.3394 | 0.7421 | 0.7949 | 0.2938 | 0.7984 | 0.8033 | 0.4056 | 0.7859 | 0.8393 | 0.77 | 0.8151 | 0.6863 | 0.7323 | 0.8272 | 0.8623 | | 0.3461 | 90.0 | 31500 | 0.3449 | 0.7479 | 0.9489 | 0.9036 | 0.2912 | 0.7337 | 0.7889 | 0.2911 | 0.7872 | 0.7917 | 0.3736 | 0.779 | 0.8273 | 0.7508 | 0.8005 | 0.6669 | 0.7123 | 0.8259 | 0.8623 | | 0.3522 | 91.0 | 31850 | 0.3477 | 0.7454 | 0.9489 | 0.9094 | 0.2996 | 0.7248 | 0.7908 | 0.2867 | 0.7835 | 0.7882 | 0.3931 | 0.7685 | 0.8328 | 0.7475 | 0.7968 | 0.6608 | 0.7046 | 0.828 | 0.8632 | | 0.4073 | 92.0 | 32200 | 0.3417 | 0.7573 | 0.9536 | 0.88 | 0.3281 | 0.7373 | 0.8216 | 0.2923 | 0.7948 | 0.7988 | 0.3806 | 0.7803 | 0.8521 | 0.7564 | 0.7962 | 0.6851 | 0.7323 | 0.8304 | 0.8678 | | 0.3508 | 93.0 | 32550 | 0.3322 | 0.7535 | 0.9553 | 0.902 | 0.3025 | 0.739 | 0.7892 | 0.2932 | 0.7922 | 0.7972 | 0.4083 | 0.7845 | 0.8239 | 0.7578 | 0.7995 | 0.6795 | 0.7292 | 0.8231 | 0.8628 | | 0.4263 | 94.0 | 32900 | 0.3284 | 0.7634 | 0.9567 | 0.9142 | 0.3374 | 0.7459 | 0.7983 | 0.2932 | 0.8004 | 0.8073 | 0.4194 | 0.7945 | 0.8351 | 0.7751 | 0.82 | 0.6775 | 0.7277 | 0.8376 | 0.8741 | | 0.46 | 95.0 | 33250 | 0.3282 | 0.758 | 0.9479 | 0.902 | 0.2693 | 0.7405 | 0.8062 | 0.2977 | 0.7958 | 0.7999 | 0.3569 | 0.7844 | 0.8423 | 0.7626 | 0.8108 | 0.6911 | 0.7308 | 0.8203 | 0.8582 | | 0.4361 | 96.0 | 33600 | 0.3247 | 0.7663 | 0.9483 | 0.9055 | 0.2739 | 0.7462 | 0.8103 | 0.3015 | 0.8008 | 0.8064 | 0.3556 | 0.792 | 0.8398 | 0.7676 | 0.813 | 0.7044 | 0.7431 | 0.8271 | 0.8632 | | 0.343 | 97.0 | 33950 | 0.3357 | 0.7522 | 0.9453 | 0.8961 | 0.2342 | 0.7367 | 0.7839 | 0.295 | 0.7902 | 0.7957 | 0.3347 | 0.7839 | 0.8263 | 0.7539 | 0.8043 | 0.6778 | 0.72 | 0.8248 | 0.8628 | | 0.4763 | 98.0 | 34300 | 0.3274 | 0.7608 | 0.949 | 0.9007 | 0.2723 | 0.7537 | 0.7928 | 0.2991 | 0.7953 | 0.8006 | 0.3458 | 0.7961 | 0.8276 | 0.7645 | 0.8081 | 0.6956 | 0.7354 | 0.8224 | 0.8582 | | 0.3525 | 99.0 | 34650 | 0.3304 | 0.7626 | 0.9535 | 0.8913 | 0.2551 | 0.7493 | 0.8115 | 0.2978 | 0.7966 | 0.8015 | 0.3333 | 0.7925 | 0.8417 | 0.765 | 0.8076 | 0.6972 | 0.7338 | 0.8256 | 0.8632 | | 0.5142 | 100.0 | 35000 | 0.3300 | 0.7609 | 0.9573 | 0.8886 | 0.234 | 0.7473 | 0.8121 | 0.2947 | 0.7944 | 0.7986 | 0.3375 | 0.7844 | 0.8452 | 0.7689 | 0.8092 | 0.6877 | 0.7246 | 0.8262 | 0.8619 | | 0.3987 | 101.0 | 35350 | 0.3356 | 0.7459 | 0.9502 | 0.8889 | 0.2912 | 0.7386 | 0.7918 | 0.2904 | 0.7865 | 0.792 | 0.3514 | 0.7874 | 0.8307 | 0.7616 | 0.8092 | 0.6519 | 0.7046 | 0.8242 | 0.8623 | | 
0.4309 | 102.0 | 35700 | 0.3218 | 0.7667 | 0.9485 | 0.91 | 0.2605 | 0.7601 | 0.8034 | 0.2989 | 0.8023 | 0.8082 | 0.3917 | 0.8015 | 0.8409 | 0.7676 | 0.8151 | 0.6973 | 0.7385 | 0.8352 | 0.8711 | | 0.4543 | 103.0 | 36050 | 0.3195 | 0.7658 | 0.9469 | 0.9061 | 0.2682 | 0.7535 | 0.8213 | 0.2993 | 0.8023 | 0.8087 | 0.3333 | 0.8026 | 0.8545 | 0.7755 | 0.8227 | 0.688 | 0.7338 | 0.8339 | 0.8695 | | 0.3316 | 104.0 | 36400 | 0.3195 | 0.7666 | 0.9589 | 0.9012 | 0.3094 | 0.7547 | 0.815 | 0.2961 | 0.8038 | 0.8079 | 0.3667 | 0.7984 | 0.8522 | 0.771 | 0.8184 | 0.6887 | 0.7308 | 0.8401 | 0.8745 | | 0.4275 | 105.0 | 36750 | 0.3173 | 0.7674 | 0.9531 | 0.902 | 0.3075 | 0.7516 | 0.821 | 0.2976 | 0.8031 | 0.808 | 0.3639 | 0.7974 | 0.8531 | 0.7684 | 0.82 | 0.6887 | 0.7246 | 0.8452 | 0.8795 | | 0.3399 | 106.0 | 37100 | 0.3105 | 0.7714 | 0.9511 | 0.901 | 0.2893 | 0.7599 | 0.8194 | 0.297 | 0.8038 | 0.8095 | 0.3708 | 0.8004 | 0.8517 | 0.7792 | 0.8243 | 0.6982 | 0.7323 | 0.8367 | 0.872 | | 0.3859 | 107.0 | 37450 | 0.3100 | 0.7683 | 0.951 | 0.9038 | 0.2746 | 0.7586 | 0.8203 | 0.299 | 0.8044 | 0.8089 | 0.3514 | 0.805 | 0.8557 | 0.7688 | 0.8189 | 0.6933 | 0.7292 | 0.8429 | 0.8787 | | 0.3285 | 108.0 | 37800 | 0.3173 | 0.761 | 0.9487 | 0.893 | 0.2779 | 0.7499 | 0.8031 | 0.2975 | 0.7967 | 0.8022 | 0.3389 | 0.7951 | 0.8393 | 0.7591 | 0.8054 | 0.6827 | 0.7246 | 0.8411 | 0.8766 | | 0.3632 | 109.0 | 38150 | 0.3196 | 0.7641 | 0.9504 | 0.9004 | 0.2845 | 0.7493 | 0.8196 | 0.2977 | 0.8 | 0.8045 | 0.3264 | 0.7936 | 0.852 | 0.7637 | 0.8114 | 0.6981 | 0.7338 | 0.8304 | 0.8682 | | 0.3822 | 110.0 | 38500 | 0.3104 | 0.7717 | 0.9569 | 0.9052 | 0.2832 | 0.7598 | 0.8187 | 0.2998 | 0.8048 | 0.81 | 0.3542 | 0.8014 | 0.8507 | 0.7658 | 0.8103 | 0.705 | 0.7385 | 0.8443 | 0.8812 | | 0.3604 | 111.0 | 38850 | 0.3147 | 0.7642 | 0.9573 | 0.9025 | 0.2674 | 0.7525 | 0.8096 | 0.2976 | 0.7952 | 0.8036 | 0.3556 | 0.7956 | 0.8455 | 0.7643 | 0.8097 | 0.6913 | 0.7246 | 0.837 | 0.8766 | | 0.3675 | 112.0 | 39200 | 0.3032 | 0.7796 | 0.9506 | 0.9105 | 0.3037 | 0.771 | 0.8156 | 0.3035 | 0.8112 | 0.8155 | 0.3639 | 0.8091 | 0.8482 | 0.774 | 0.8168 | 0.7138 | 0.7462 | 0.851 | 0.8837 | | 0.3636 | 113.0 | 39550 | 0.3086 | 0.7772 | 0.9558 | 0.9092 | 0.3139 | 0.7687 | 0.8098 | 0.3036 | 0.8078 | 0.8128 | 0.3708 | 0.8087 | 0.8401 | 0.7752 | 0.8162 | 0.7154 | 0.7462 | 0.841 | 0.8762 | | 0.3617 | 114.0 | 39900 | 0.3060 | 0.781 | 0.9498 | 0.9043 | 0.2909 | 0.7753 | 0.8284 | 0.3044 | 0.8118 | 0.8173 | 0.3306 | 0.8169 | 0.8619 | 0.7771 | 0.8189 | 0.7121 | 0.7431 | 0.8537 | 0.89 | | 0.4996 | 115.0 | 40250 | 0.3068 | 0.7785 | 0.9515 | 0.9003 | 0.2886 | 0.7718 | 0.831 | 0.3022 | 0.808 | 0.8127 | 0.3319 | 0.8092 | 0.8581 | 0.7719 | 0.8146 | 0.7205 | 0.7431 | 0.8431 | 0.8803 | | 0.3962 | 116.0 | 40600 | 0.3063 | 0.7794 | 0.949 | 0.9137 | 0.2904 | 0.7713 | 0.8324 | 0.3045 | 0.8108 | 0.8169 | 0.3333 | 0.8138 | 0.8621 | 0.7806 | 0.8222 | 0.707 | 0.7415 | 0.8505 | 0.887 | | 0.3377 | 117.0 | 40950 | 0.3142 | 0.7714 | 0.952 | 0.9051 | 0.2819 | 0.7615 | 0.8298 | 0.3017 | 0.8045 | 0.8094 | 0.3375 | 0.8046 | 0.861 | 0.7705 | 0.813 | 0.701 | 0.7354 | 0.8428 | 0.8799 | | 0.4085 | 118.0 | 41300 | 0.3037 | 0.7762 | 0.9483 | 0.905 | 0.2893 | 0.7661 | 0.829 | 0.3037 | 0.8121 | 0.8164 | 0.3292 | 0.8102 | 0.8625 | 0.7819 | 0.8243 | 0.6976 | 0.74 | 0.8492 | 0.8849 | | 0.3813 | 119.0 | 41650 | 0.3068 | 0.7753 | 0.9451 | 0.9041 | 0.2979 | 0.767 | 0.8197 | 0.3023 | 0.807 | 0.8113 | 0.3403 | 0.8051 | 0.8571 | 0.7773 | 0.8222 | 0.7053 | 0.7323 | 0.8434 | 0.8795 | | 0.4054 | 120.0 | 42000 | 0.3127 | 
0.7702 | 0.9488 | 0.9035 | 0.2737 | 0.7634 | 0.8059 | 0.3004 | 0.8024 | 0.8073 | 0.3319 | 0.8043 | 0.8419 | 0.7746 | 0.8157 | 0.6925 | 0.7246 | 0.8435 | 0.8816 | | 0.3453 | 121.0 | 42350 | 0.3038 | 0.7794 | 0.9495 | 0.9069 | 0.2828 | 0.7721 | 0.8237 | 0.3045 | 0.8119 | 0.8174 | 0.3486 | 0.812 | 0.8586 | 0.7833 | 0.8254 | 0.7085 | 0.7415 | 0.8464 | 0.8854 | | 0.3874 | 122.0 | 42700 | 0.3002 | 0.7794 | 0.9591 | 0.9073 | 0.2916 | 0.771 | 0.8207 | 0.3031 | 0.8137 | 0.8186 | 0.3819 | 0.8111 | 0.8586 | 0.7792 | 0.8211 | 0.7088 | 0.7477 | 0.8502 | 0.887 | | 0.4545 | 123.0 | 43050 | 0.3016 | 0.7745 | 0.9586 | 0.9081 | 0.2964 | 0.7645 | 0.8215 | 0.3021 | 0.8126 | 0.8173 | 0.3889 | 0.8085 | 0.8622 | 0.7709 | 0.8168 | 0.7081 | 0.7523 | 0.8446 | 0.8828 | | 0.4242 | 124.0 | 43400 | 0.3007 | 0.7835 | 0.9573 | 0.9159 | 0.2929 | 0.7739 | 0.8208 | 0.3033 | 0.8162 | 0.8214 | 0.3889 | 0.814 | 0.8564 | 0.7807 | 0.8249 | 0.7171 | 0.7508 | 0.8526 | 0.8887 | | 0.423 | 125.0 | 43750 | 0.2983 | 0.7814 | 0.9614 | 0.8973 | 0.3039 | 0.7709 | 0.8232 | 0.3038 | 0.8166 | 0.8214 | 0.3694 | 0.8134 | 0.8603 | 0.776 | 0.8227 | 0.7191 | 0.7569 | 0.849 | 0.8845 | | 0.3747 | 126.0 | 44100 | 0.2976 | 0.7816 | 0.9521 | 0.9029 | 0.2774 | 0.7752 | 0.8204 | 0.3067 | 0.8139 | 0.8191 | 0.3375 | 0.8161 | 0.8547 | 0.7834 | 0.8265 | 0.7135 | 0.7462 | 0.848 | 0.8845 | | 0.4194 | 127.0 | 44450 | 0.2991 | 0.7776 | 0.9518 | 0.8988 | 0.2871 | 0.7702 | 0.8165 | 0.3034 | 0.8102 | 0.8157 | 0.3431 | 0.8118 | 0.8501 | 0.7777 | 0.8227 | 0.7039 | 0.7369 | 0.8512 | 0.8874 | | 0.3864 | 128.0 | 44800 | 0.2994 | 0.7796 | 0.9572 | 0.9063 | 0.2901 | 0.7682 | 0.8159 | 0.3047 | 0.8137 | 0.8188 | 0.3806 | 0.8101 | 0.8513 | 0.7767 | 0.8232 | 0.7102 | 0.7462 | 0.8518 | 0.887 | | 0.4013 | 129.0 | 45150 | 0.2991 | 0.7797 | 0.9578 | 0.9028 | 0.2809 | 0.7716 | 0.8163 | 0.3049 | 0.8123 | 0.8193 | 0.3611 | 0.8147 | 0.8505 | 0.7781 | 0.8243 | 0.7118 | 0.7477 | 0.8493 | 0.8858 | | 0.3756 | 130.0 | 45500 | 0.3002 | 0.7767 | 0.9582 | 0.9068 | 0.2797 | 0.7678 | 0.8177 | 0.3024 | 0.8094 | 0.8145 | 0.3611 | 0.8079 | 0.8552 | 0.7732 | 0.8189 | 0.7085 | 0.74 | 0.8486 | 0.8845 | | 0.3507 | 131.0 | 45850 | 0.3037 | 0.7744 | 0.9504 | 0.9034 | 0.2899 | 0.7679 | 0.8095 | 0.3031 | 0.8086 | 0.8144 | 0.3653 | 0.8099 | 0.8465 | 0.7793 | 0.8216 | 0.6955 | 0.7369 | 0.8484 | 0.8845 | | 0.4179 | 132.0 | 46200 | 0.3052 | 0.7697 | 0.9505 | 0.9035 | 0.28 | 0.7616 | 0.8083 | 0.3015 | 0.8043 | 0.8096 | 0.3431 | 0.8036 | 0.845 | 0.7759 | 0.8189 | 0.69 | 0.7308 | 0.8433 | 0.8791 | | 0.3436 | 133.0 | 46550 | 0.3007 | 0.7766 | 0.9518 | 0.909 | 0.2915 | 0.7676 | 0.8283 | 0.3029 | 0.8125 | 0.8175 | 0.3472 | 0.8109 | 0.8685 | 0.7817 | 0.8286 | 0.6965 | 0.7369 | 0.8518 | 0.887 | | 0.4076 | 134.0 | 46900 | 0.3019 | 0.7761 | 0.9518 | 0.903 | 0.2906 | 0.767 | 0.8268 | 0.3035 | 0.8113 | 0.8162 | 0.3472 | 0.8095 | 0.8602 | 0.7801 | 0.8254 | 0.7008 | 0.74 | 0.8476 | 0.8833 | | 0.4175 | 135.0 | 47250 | 0.3041 | 0.7742 | 0.9505 | 0.8999 | 0.2809 | 0.766 | 0.8132 | 0.3033 | 0.8077 | 0.8128 | 0.3389 | 0.8071 | 0.8468 | 0.7742 | 0.8189 | 0.705 | 0.74 | 0.8436 | 0.8795 | | 0.3912 | 136.0 | 47600 | 0.2998 | 0.7775 | 0.9505 | 0.9002 | 0.2799 | 0.7702 | 0.8179 | 0.3045 | 0.8098 | 0.8145 | 0.3347 | 0.8106 | 0.8479 | 0.7774 | 0.8205 | 0.7076 | 0.74 | 0.8475 | 0.8828 | | 0.3579 | 137.0 | 47950 | 0.3000 | 0.7779 | 0.9488 | 0.8999 | 0.2835 | 0.7703 | 0.8142 | 0.3045 | 0.8107 | 0.8152 | 0.3264 | 0.8116 | 0.8462 | 0.7787 | 0.8216 | 0.7068 | 0.7415 | 0.8481 | 0.8824 | | 0.3883 | 138.0 | 48300 | 0.2983 | 0.7743 | 0.9488 | 
0.9004 | 0.2879 | 0.7659 | 0.8169 | 0.3029 | 0.8095 | 0.8141 | 0.3264 | 0.81 | 0.85 | 0.7775 | 0.8205 | 0.6976 | 0.7385 | 0.8478 | 0.8833 | | 0.3039 | 139.0 | 48650 | 0.2989 | 0.7746 | 0.9487 | 0.9004 | 0.2962 | 0.7684 | 0.812 | 0.303 | 0.8104 | 0.8148 | 0.3347 | 0.8124 | 0.8458 | 0.7773 | 0.8211 | 0.6997 | 0.74 | 0.8468 | 0.8833 | | 0.399 | 140.0 | 49000 | 0.2999 | 0.7751 | 0.9535 | 0.9044 | 0.2967 | 0.7671 | 0.8175 | 0.3023 | 0.81 | 0.8145 | 0.3361 | 0.8104 | 0.8491 | 0.7778 | 0.8222 | 0.7003 | 0.7385 | 0.8471 | 0.8828 | | 0.4337 | 141.0 | 49350 | 0.3001 | 0.7737 | 0.9534 | 0.9018 | 0.2912 | 0.7666 | 0.8178 | 0.3024 | 0.8093 | 0.8138 | 0.3306 | 0.8104 | 0.8521 | 0.775 | 0.8189 | 0.7007 | 0.74 | 0.8455 | 0.8824 | | 0.4288 | 142.0 | 49700 | 0.2999 | 0.7741 | 0.9553 | 0.9008 | 0.3024 | 0.7667 | 0.8173 | 0.3017 | 0.8089 | 0.8131 | 0.3347 | 0.8093 | 0.8477 | 0.7771 | 0.8211 | 0.699 | 0.7354 | 0.8463 | 0.8828 | | 0.405 | 143.0 | 50050 | 0.2993 | 0.7748 | 0.9553 | 0.9036 | 0.2955 | 0.7673 | 0.8169 | 0.3014 | 0.8087 | 0.8131 | 0.3389 | 0.8092 | 0.8472 | 0.7768 | 0.8211 | 0.7031 | 0.7369 | 0.8445 | 0.8812 | | 0.318 | 144.0 | 50400 | 0.2982 | 0.7759 | 0.9553 | 0.9037 | 0.3026 | 0.7689 | 0.8176 | 0.3029 | 0.8105 | 0.8147 | 0.3389 | 0.8107 | 0.8509 | 0.7775 | 0.8216 | 0.7039 | 0.74 | 0.8464 | 0.8824 | | 0.4394 | 145.0 | 50750 | 0.2973 | 0.7762 | 0.9553 | 0.9037 | 0.3026 | 0.7689 | 0.8179 | 0.3029 | 0.8109 | 0.8151 | 0.3389 | 0.8109 | 0.8512 | 0.7777 | 0.8222 | 0.7036 | 0.74 | 0.8473 | 0.8833 | | 0.4312 | 146.0 | 51100 | 0.2976 | 0.7761 | 0.9553 | 0.9037 | 0.3024 | 0.7689 | 0.8179 | 0.3029 | 0.8106 | 0.8148 | 0.3347 | 0.8109 | 0.8511 | 0.7779 | 0.8216 | 0.7037 | 0.74 | 0.8467 | 0.8828 | | 0.3617 | 147.0 | 51450 | 0.2975 | 0.776 | 0.9553 | 0.9037 | 0.3024 | 0.7689 | 0.8175 | 0.3029 | 0.8101 | 0.8143 | 0.3347 | 0.8109 | 0.8477 | 0.7778 | 0.8216 | 0.7034 | 0.7385 | 0.8467 | 0.8828 | | 0.4041 | 148.0 | 51800 | 0.2975 | 0.776 | 0.9553 | 0.9037 | 0.3024 | 0.7689 | 0.8175 | 0.3029 | 0.8101 | 0.8143 | 0.3347 | 0.8109 | 0.8477 | 0.7778 | 0.8216 | 0.7034 | 0.7385 | 0.8467 | 0.8828 | | 0.368 | 149.0 | 52150 | 0.2975 | 0.776 | 0.9553 | 0.9037 | 0.3024 | 0.7689 | 0.8175 | 0.3029 | 0.8101 | 0.8143 | 0.3347 | 0.8109 | 0.8477 | 0.7778 | 0.8216 | 0.7035 | 0.7385 | 0.8467 | 0.8828 | | 0.4667 | 150.0 | 52500 | 0.2975 | 0.776 | 0.9553 | 0.9037 | 0.3024 | 0.7689 | 0.8175 | 0.3029 | 0.8101 | 0.8143 | 0.3347 | 0.8109 | 0.8477 | 0.7778 | 0.8216 | 0.7035 | 0.7385 | 0.8467 | 0.8828 | ### Framework versions - Transformers 4.45.2 - Pytorch 2.5.0+cu121 - Datasets 2.19.2 - Tokenizers 0.20.1
[ "chicken", "duck", "plant" ]
joe611/chickens-composite-201010101010-150-epochs-wo-transform
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-composite-201010101010-150-epochs-wo-transform This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3142 - Map: 0.8032 - Map 50: 0.9625 - Map 75: 0.911 - Map Small: 0.3791 - Map Medium: 0.7895 - Map Large: 0.8284 - Mar 1: 0.3146 - Mar 10: 0.8323 - Mar 100: 0.8373 - Mar Small: 0.4583 - Mar Medium: 0.8271 - Mar Large: 0.8479 - Map Chicken: 0.7941 - Mar 100 Chicken: 0.8308 - Map Duck: 0.7529 - Mar 100 Duck: 0.7908 - Map Plant: 0.8628 - Mar 100 Plant: 0.8904 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 150 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:| | 1.5437 | 1.0 | 350 | 1.4444 | 0.1836 | 0.2645 | 0.2088 | 0.0024 | 0.102 | 0.2106 | 0.0906 | 0.2712 | 0.3273 | 0.1042 | 0.295 | 0.3156 | 0.0947 | 0.2146 | 0.0 | 0.0 | 0.456 | 0.7674 | | 1.3317 | 2.0 | 700 | 1.2074 | 0.2413 | 0.3385 | 0.2689 | 0.013 | 0.1737 | 0.2718 | 0.1022 | 0.3248 | 0.3517 | 0.1833 | 0.317 | 0.3403 | 0.0962 | 0.2827 | 0.0 | 0.0 | 0.6277 | 0.7724 | | 1.0403 | 3.0 | 1050 | 0.9834 | 0.337 | 0.4816 | 0.3897 | 0.0559 | 0.3018 | 0.347 | 0.1244 | 0.43 | 0.4471 | 0.0708 | 0.4203 | 0.4638 | 0.3211 | 0.5741 | 0.0 | 0.0 | 0.6898 | 0.7674 | | 0.9619 | 4.0 | 1400 | 0.9247 | 0.395 | 0.5553 | 0.4642 | 0.0046 | 0.3756 | 0.4021 | 0.1322 | 0.4758 | 0.4816 | 0.05 | 0.4588 | 0.5039 | 0.4988 | 0.6892 | 0.0 | 0.0 | 0.6862 | 0.7556 | | 0.8927 | 5.0 | 1750 | 0.8666 | 0.413 | 0.5877 | 0.4812 | 0.0108 | 0.3877 | 0.4293 | 0.1334 | 0.4698 | 0.4733 | 0.0625 | 0.4465 | 0.4946 | 0.5352 | 0.6665 | 0.0 | 0.0 | 0.7037 | 0.7536 | | 0.7936 | 6.0 | 2100 | 0.7851 | 0.42 | 0.6071 | 0.5021 | 0.0278 | 0.3927 | 0.4341 | 0.1335 | 0.466 | 0.4713 | 0.1167 | 0.4455 | 0.4889 | 0.5803 | 0.6822 | 0.0 | 0.0 | 0.6798 | 0.7318 | | 0.8043 | 7.0 | 2450 | 0.6971 | 0.4369 | 0.6075 | 0.522 | 0.0833 | 0.4103 | 0.4474 | 0.1388 | 0.4874 | 0.4918 | 0.125 | 0.4646 | 0.4977 | 0.6043 | 0.7081 | 0.0 | 0.0 | 0.7066 | 0.7674 | | 0.6806 | 8.0 | 2800 | 0.6553 | 0.4425 | 0.6199 | 0.5389 | 0.0684 | 0.4145 | 0.4554 | 0.1378 | 0.4839 | 0.4871 | 0.1208 | 0.4617 | 0.5023 | 0.6214 | 0.7065 | 0.0 | 0.0 | 0.7062 | 0.7548 | | 0.7617 | 9.0 | 3150 | 0.6102 | 0.4605 | 0.6501 | 0.5485 | 0.0579 | 0.4315 | 0.4708 | 0.148 | 0.5005 | 0.5036 | 0.1125 | 0.476 | 0.5101 | 0.6075 | 0.6903 | 0.0493 | 0.0462 | 0.7249 | 0.7745 | | 0.6627 | 10.0 | 3500 | 0.5567 | 0.5263 | 0.7006 | 0.6179 | 0.0392 | 
0.5125 | 0.5077 | 0.1793 | 0.5599 | 0.5668 | 0.125 | 0.5522 | 0.5419 | 0.6815 | 0.7476 | 0.161 | 0.16 | 0.7362 | 0.7929 | | 0.6337 | 11.0 | 3850 | 0.5077 | 0.6189 | 0.8134 | 0.7494 | 0.1854 | 0.6072 | 0.5787 | 0.2267 | 0.6608 | 0.6653 | 0.2333 | 0.6579 | 0.6164 | 0.6869 | 0.7541 | 0.3978 | 0.4277 | 0.7719 | 0.8142 | | 0.5744 | 12.0 | 4200 | 0.5425 | 0.6042 | 0.8536 | 0.7352 | 0.1136 | 0.5885 | 0.6158 | 0.2312 | 0.6516 | 0.6562 | 0.1875 | 0.6487 | 0.6582 | 0.6455 | 0.7054 | 0.4321 | 0.48 | 0.7349 | 0.7833 | | 0.5931 | 13.0 | 4550 | 0.5184 | 0.6297 | 0.8751 | 0.754 | 0.194 | 0.6103 | 0.6116 | 0.247 | 0.676 | 0.6798 | 0.2611 | 0.6641 | 0.6579 | 0.6433 | 0.7119 | 0.4788 | 0.52 | 0.767 | 0.8075 | | 0.5293 | 14.0 | 4900 | 0.5515 | 0.6446 | 0.9257 | 0.7953 | 0.209 | 0.6282 | 0.663 | 0.2492 | 0.6888 | 0.6971 | 0.3653 | 0.6831 | 0.7265 | 0.6187 | 0.6951 | 0.5622 | 0.6 | 0.753 | 0.7962 | | 0.5711 | 15.0 | 5250 | 0.4966 | 0.662 | 0.8953 | 0.7996 | 0.1808 | 0.6618 | 0.6218 | 0.2638 | 0.7075 | 0.7126 | 0.3597 | 0.716 | 0.6631 | 0.6713 | 0.7465 | 0.5559 | 0.5877 | 0.7589 | 0.8038 | | 0.5324 | 16.0 | 5600 | 0.4573 | 0.686 | 0.9184 | 0.8183 | 0.1296 | 0.6714 | 0.7185 | 0.2724 | 0.7357 | 0.7391 | 0.3056 | 0.7307 | 0.7594 | 0.6869 | 0.7616 | 0.6051 | 0.6431 | 0.7659 | 0.8126 | | 0.4658 | 17.0 | 5950 | 0.4699 | 0.6826 | 0.9317 | 0.8134 | 0.1838 | 0.6611 | 0.7147 | 0.2652 | 0.7243 | 0.7286 | 0.3444 | 0.715 | 0.7491 | 0.6649 | 0.7308 | 0.6034 | 0.6369 | 0.7794 | 0.818 | | 0.5225 | 18.0 | 6300 | 0.4092 | 0.7188 | 0.9439 | 0.8623 | 0.2122 | 0.7013 | 0.7485 | 0.2865 | 0.7579 | 0.7628 | 0.2778 | 0.7516 | 0.7879 | 0.7012 | 0.7546 | 0.6645 | 0.7015 | 0.7907 | 0.8322 | | 0.4893 | 19.0 | 6650 | 0.4379 | 0.7009 | 0.9255 | 0.839 | 0.2062 | 0.6864 | 0.7387 | 0.2742 | 0.7477 | 0.7513 | 0.3181 | 0.7391 | 0.777 | 0.6982 | 0.7654 | 0.6195 | 0.6631 | 0.7851 | 0.8255 | | 0.464 | 20.0 | 7000 | 0.4458 | 0.6968 | 0.9143 | 0.8486 | 0.23 | 0.6735 | 0.706 | 0.2702 | 0.7311 | 0.7342 | 0.3097 | 0.7167 | 0.7428 | 0.691 | 0.7427 | 0.6164 | 0.6385 | 0.783 | 0.8213 | | 0.4851 | 21.0 | 7350 | 0.4262 | 0.7213 | 0.9504 | 0.8578 | 0.2028 | 0.722 | 0.731 | 0.2839 | 0.759 | 0.7664 | 0.3292 | 0.7667 | 0.7773 | 0.7052 | 0.7605 | 0.6713 | 0.7062 | 0.7873 | 0.8326 | | 0.441 | 22.0 | 7700 | 0.4094 | 0.7204 | 0.9432 | 0.8603 | 0.1826 | 0.7075 | 0.7426 | 0.2862 | 0.7563 | 0.7624 | 0.3319 | 0.7501 | 0.7837 | 0.7054 | 0.7605 | 0.6582 | 0.6862 | 0.7977 | 0.8406 | | 0.4809 | 23.0 | 8050 | 0.4196 | 0.7225 | 0.9508 | 0.8596 | 0.2156 | 0.7013 | 0.7522 | 0.2852 | 0.7526 | 0.7624 | 0.3847 | 0.7484 | 0.786 | 0.7003 | 0.7519 | 0.6734 | 0.7046 | 0.794 | 0.8305 | | 0.4187 | 24.0 | 8400 | 0.4153 | 0.7183 | 0.9423 | 0.8634 | 0.2267 | 0.7093 | 0.7365 | 0.2819 | 0.754 | 0.7636 | 0.4069 | 0.7581 | 0.7696 | 0.7083 | 0.7741 | 0.6535 | 0.6846 | 0.7932 | 0.8322 | | 0.4433 | 25.0 | 8750 | 0.4256 | 0.718 | 0.9319 | 0.8574 | 0.1879 | 0.7126 | 0.7096 | 0.2848 | 0.7564 | 0.7618 | 0.3681 | 0.7606 | 0.758 | 0.7023 | 0.7622 | 0.6505 | 0.6815 | 0.8012 | 0.8418 | | 0.4378 | 26.0 | 9100 | 0.4343 | 0.7086 | 0.9443 | 0.8609 | 0.2489 | 0.6932 | 0.7301 | 0.2811 | 0.7503 | 0.7543 | 0.3194 | 0.7459 | 0.7697 | 0.6834 | 0.7416 | 0.6563 | 0.6969 | 0.7862 | 0.8243 | | 0.4133 | 27.0 | 9450 | 0.3921 | 0.7275 | 0.9622 | 0.8503 | 0.2967 | 0.7127 | 0.7436 | 0.2841 | 0.7689 | 0.7751 | 0.4597 | 0.762 | 0.7845 | 0.702 | 0.7584 | 0.6858 | 0.7277 | 0.7945 | 0.8393 | | 0.3858 | 28.0 | 9800 | 0.3947 | 0.7291 | 0.9455 | 0.8637 | 0.2977 | 0.7148 | 0.7149 | 0.283 | 0.7649 | 0.7698 | 0.4167 | 0.7605 | 0.7499 
| 0.7188 | 0.7697 | 0.6628 | 0.6923 | 0.8056 | 0.8473 | | 0.3894 | 29.0 | 10150 | 0.3838 | 0.7414 | 0.9645 | 0.8848 | 0.2583 | 0.7319 | 0.7503 | 0.2892 | 0.7737 | 0.7821 | 0.3556 | 0.7806 | 0.7888 | 0.7282 | 0.7724 | 0.6967 | 0.7308 | 0.7991 | 0.8431 | | 0.4011 | 30.0 | 10500 | 0.3980 | 0.7118 | 0.9478 | 0.8771 | 0.327 | 0.7049 | 0.7038 | 0.2753 | 0.7538 | 0.7604 | 0.4444 | 0.7591 | 0.7393 | 0.6953 | 0.7573 | 0.6396 | 0.6846 | 0.8005 | 0.8393 | | 0.3976 | 31.0 | 10850 | 0.3923 | 0.7214 | 0.9274 | 0.8621 | 0.3205 | 0.715 | 0.6903 | 0.2826 | 0.7631 | 0.7675 | 0.4597 | 0.7644 | 0.7331 | 0.7239 | 0.7816 | 0.6264 | 0.6708 | 0.8139 | 0.8502 | | 0.4064 | 32.0 | 11200 | 0.3617 | 0.7444 | 0.9589 | 0.8895 | 0.2602 | 0.7389 | 0.7661 | 0.2951 | 0.7814 | 0.7907 | 0.4 | 0.7888 | 0.8002 | 0.7215 | 0.7784 | 0.6972 | 0.7415 | 0.8146 | 0.8523 | | 0.3552 | 33.0 | 11550 | 0.3888 | 0.7253 | 0.9597 | 0.8833 | 0.3399 | 0.7186 | 0.7424 | 0.286 | 0.7593 | 0.7693 | 0.4667 | 0.7641 | 0.7781 | 0.684 | 0.7378 | 0.6826 | 0.7215 | 0.8093 | 0.8485 | | 0.3485 | 34.0 | 11900 | 0.3882 | 0.7383 | 0.9488 | 0.8832 | 0.2804 | 0.7323 | 0.7306 | 0.2893 | 0.7722 | 0.7818 | 0.4403 | 0.7786 | 0.7669 | 0.7156 | 0.7681 | 0.6877 | 0.7215 | 0.8115 | 0.8556 | | 0.3426 | 35.0 | 12250 | 0.3524 | 0.755 | 0.962 | 0.8873 | 0.3185 | 0.7475 | 0.7798 | 0.2931 | 0.789 | 0.7966 | 0.4347 | 0.7894 | 0.811 | 0.7349 | 0.7827 | 0.7032 | 0.7446 | 0.8269 | 0.8623 | | 0.3678 | 36.0 | 12600 | 0.3401 | 0.7564 | 0.9394 | 0.8883 | 0.3199 | 0.7528 | 0.7433 | 0.2995 | 0.7916 | 0.7961 | 0.4028 | 0.7963 | 0.776 | 0.7433 | 0.7935 | 0.6998 | 0.7308 | 0.8261 | 0.864 | | 0.3788 | 37.0 | 12950 | 0.3824 | 0.7359 | 0.9555 | 0.8777 | 0.2943 | 0.7225 | 0.7575 | 0.2885 | 0.7708 | 0.7776 | 0.4125 | 0.7694 | 0.7848 | 0.711 | 0.7551 | 0.6848 | 0.7246 | 0.812 | 0.8531 | | 0.3609 | 38.0 | 13300 | 0.3569 | 0.7542 | 0.9582 | 0.8952 | 0.3691 | 0.7533 | 0.7651 | 0.2966 | 0.7902 | 0.7939 | 0.4083 | 0.7929 | 0.7972 | 0.748 | 0.7941 | 0.696 | 0.7308 | 0.8187 | 0.8569 | | 0.367 | 39.0 | 13650 | 0.3714 | 0.7371 | 0.9503 | 0.8834 | 0.2573 | 0.7288 | 0.7517 | 0.2892 | 0.7722 | 0.7772 | 0.4083 | 0.7713 | 0.7806 | 0.7318 | 0.7773 | 0.6572 | 0.6969 | 0.8223 | 0.8573 | | 0.3228 | 40.0 | 14000 | 0.3741 | 0.7537 | 0.9519 | 0.8918 | 0.2533 | 0.7438 | 0.7904 | 0.2968 | 0.7844 | 0.7909 | 0.3764 | 0.7848 | 0.8191 | 0.7372 | 0.7773 | 0.7003 | 0.7338 | 0.8237 | 0.8615 | | 0.3797 | 41.0 | 14350 | 0.3437 | 0.7511 | 0.9489 | 0.8886 | 0.3072 | 0.7526 | 0.7326 | 0.2941 | 0.7852 | 0.7917 | 0.4472 | 0.7971 | 0.7562 | 0.7414 | 0.7946 | 0.6843 | 0.7138 | 0.8277 | 0.8665 | | 0.343 | 42.0 | 14700 | 0.3426 | 0.7562 | 0.958 | 0.8925 | 0.3526 | 0.7446 | 0.7669 | 0.3 | 0.7947 | 0.7992 | 0.4556 | 0.7907 | 0.7982 | 0.7392 | 0.7946 | 0.7073 | 0.7431 | 0.8221 | 0.8598 | | 0.3832 | 43.0 | 15050 | 0.3543 | 0.7519 | 0.9558 | 0.8896 | 0.2981 | 0.744 | 0.7796 | 0.2961 | 0.7898 | 0.7963 | 0.4458 | 0.7902 | 0.812 | 0.7314 | 0.7897 | 0.6966 | 0.7338 | 0.8278 | 0.8653 | | 0.3598 | 44.0 | 15400 | 0.3587 | 0.7532 | 0.955 | 0.883 | 0.29 | 0.7437 | 0.7834 | 0.298 | 0.7905 | 0.7987 | 0.4375 | 0.7889 | 0.8305 | 0.7261 | 0.773 | 0.7051 | 0.7554 | 0.8285 | 0.8678 | | 0.3291 | 45.0 | 15750 | 0.3615 | 0.7348 | 0.9388 | 0.8716 | 0.3122 | 0.7263 | 0.7593 | 0.2884 | 0.7716 | 0.777 | 0.4056 | 0.7691 | 0.7898 | 0.7415 | 0.7924 | 0.6416 | 0.68 | 0.8212 | 0.8586 | | 0.3413 | 46.0 | 16100 | 0.3522 | 0.7652 | 0.9493 | 0.9093 | 0.3521 | 0.7663 | 0.771 | 0.3011 | 0.7941 | 0.8015 | 0.5014 | 0.8034 | 0.7926 | 0.7561 | 0.8022 | 0.711 | 0.7415 | 0.8286 
| 0.8607 | | 0.3297 | 47.0 | 16450 | 0.3396 | 0.7528 | 0.9491 | 0.8659 | 0.251 | 0.7489 | 0.7604 | 0.2935 | 0.7845 | 0.7918 | 0.4111 | 0.7878 | 0.7878 | 0.7532 | 0.7978 | 0.6835 | 0.7169 | 0.8218 | 0.8607 | | 0.4019 | 48.0 | 16800 | 0.3745 | 0.7449 | 0.9518 | 0.9059 | 0.3767 | 0.7367 | 0.7638 | 0.2879 | 0.7804 | 0.787 | 0.5306 | 0.777 | 0.7941 | 0.7386 | 0.7849 | 0.6758 | 0.7154 | 0.8201 | 0.8607 | | 0.3571 | 49.0 | 17150 | 0.3319 | 0.7704 | 0.9698 | 0.9093 | 0.266 | 0.7688 | 0.7706 | 0.3002 | 0.802 | 0.811 | 0.4944 | 0.808 | 0.7931 | 0.7525 | 0.7995 | 0.7253 | 0.76 | 0.8335 | 0.8736 | | 0.3508 | 50.0 | 17500 | 0.3552 | 0.7541 | 0.9536 | 0.9068 | 0.3095 | 0.7509 | 0.7732 | 0.2944 | 0.7918 | 0.797 | 0.4861 | 0.7908 | 0.8 | 0.741 | 0.7897 | 0.7028 | 0.7462 | 0.8184 | 0.8552 | | 0.313 | 51.0 | 17850 | 0.3600 | 0.7557 | 0.959 | 0.9004 | 0.3194 | 0.7501 | 0.777 | 0.2992 | 0.7956 | 0.8013 | 0.4444 | 0.7912 | 0.8199 | 0.755 | 0.8022 | 0.6846 | 0.7369 | 0.8274 | 0.8649 | | 0.2893 | 52.0 | 18200 | 0.3484 | 0.7629 | 0.954 | 0.9007 | 0.365 | 0.7567 | 0.7776 | 0.2971 | 0.8018 | 0.8078 | 0.5347 | 0.8004 | 0.8109 | 0.7355 | 0.7827 | 0.7127 | 0.7615 | 0.8407 | 0.8791 | | 0.3432 | 53.0 | 18550 | 0.3346 | 0.7647 | 0.9589 | 0.8921 | 0.3011 | 0.7614 | 0.7826 | 0.3004 | 0.7985 | 0.8056 | 0.4792 | 0.8028 | 0.8105 | 0.7619 | 0.8027 | 0.6939 | 0.7354 | 0.8382 | 0.8787 | | 0.3566 | 54.0 | 18900 | 0.3534 | 0.7691 | 0.9465 | 0.9061 | 0.372 | 0.7659 | 0.7765 | 0.2994 | 0.8024 | 0.8086 | 0.4944 | 0.8027 | 0.802 | 0.7495 | 0.7978 | 0.7236 | 0.7569 | 0.8342 | 0.8711 | | 0.297 | 55.0 | 19250 | 0.3647 | 0.7428 | 0.9312 | 0.8657 | 0.2769 | 0.7551 | 0.6911 | 0.2893 | 0.7778 | 0.7827 | 0.3653 | 0.7935 | 0.7267 | 0.7341 | 0.7854 | 0.667 | 0.6969 | 0.8274 | 0.8657 | | 0.3159 | 56.0 | 19600 | 0.3336 | 0.7649 | 0.9349 | 0.8937 | 0.3414 | 0.7644 | 0.7598 | 0.3014 | 0.7966 | 0.8046 | 0.475 | 0.8053 | 0.7863 | 0.7721 | 0.813 | 0.6878 | 0.7292 | 0.8349 | 0.8715 | | 0.2795 | 57.0 | 19950 | 0.3222 | 0.7678 | 0.9544 | 0.8981 | 0.3804 | 0.7636 | 0.7921 | 0.3032 | 0.8055 | 0.8116 | 0.5153 | 0.8059 | 0.8184 | 0.7635 | 0.8124 | 0.7016 | 0.7492 | 0.8384 | 0.8732 | | 0.3153 | 58.0 | 20300 | 0.3529 | 0.7514 | 0.9391 | 0.8861 | 0.3047 | 0.7446 | 0.7522 | 0.2949 | 0.787 | 0.7972 | 0.4625 | 0.7905 | 0.7825 | 0.7445 | 0.7951 | 0.6757 | 0.7246 | 0.8339 | 0.872 | | 0.2882 | 59.0 | 20650 | 0.3469 | 0.758 | 0.9544 | 0.8836 | 0.3194 | 0.7452 | 0.7847 | 0.2939 | 0.7896 | 0.799 | 0.45 | 0.7851 | 0.8134 | 0.7586 | 0.7973 | 0.6839 | 0.7277 | 0.8315 | 0.872 | | 0.3166 | 60.0 | 21000 | 0.3246 | 0.7821 | 0.9609 | 0.9142 | 0.2961 | 0.7786 | 0.8174 | 0.3098 | 0.818 | 0.8226 | 0.4069 | 0.8178 | 0.847 | 0.7591 | 0.8065 | 0.7524 | 0.7846 | 0.8346 | 0.8766 | | 0.2749 | 61.0 | 21350 | 0.3453 | 0.7704 | 0.9571 | 0.8932 | 0.3194 | 0.7621 | 0.7971 | 0.3032 | 0.8017 | 0.8104 | 0.4917 | 0.7984 | 0.8234 | 0.767 | 0.8065 | 0.7132 | 0.7585 | 0.831 | 0.8661 | | 0.327 | 62.0 | 21700 | 0.3354 | 0.761 | 0.9595 | 0.8987 | 0.3261 | 0.7501 | 0.7863 | 0.2929 | 0.7959 | 0.8059 | 0.5125 | 0.7931 | 0.8244 | 0.7423 | 0.7886 | 0.7063 | 0.7538 | 0.8345 | 0.8753 | | 0.291 | 63.0 | 22050 | 0.3457 | 0.766 | 0.9525 | 0.8941 | 0.3428 | 0.7557 | 0.7868 | 0.2944 | 0.8015 | 0.8081 | 0.4986 | 0.7952 | 0.8164 | 0.757 | 0.8032 | 0.7064 | 0.7492 | 0.8346 | 0.872 | | 0.3024 | 64.0 | 22400 | 0.3337 | 0.7763 | 0.9623 | 0.9039 | 0.3293 | 0.7675 | 0.7978 | 0.301 | 0.8103 | 0.8196 | 0.4875 | 0.8089 | 0.8242 | 0.7675 | 0.8103 | 0.7212 | 0.7723 | 0.8403 | 0.8762 | | 0.3294 | 65.0 | 22750 | 0.3286 | 0.7801 
| 0.9593 | 0.9138 | 0.3716 | 0.7679 | 0.7969 | 0.3031 | 0.8176 | 0.8247 | 0.5167 | 0.8125 | 0.8253 | 0.7565 | 0.8059 | 0.7416 | 0.7908 | 0.8421 | 0.8774 | | 0.2871 | 66.0 | 23100 | 0.3343 | 0.7728 | 0.9461 | 0.9006 | 0.3951 | 0.7615 | 0.7828 | 0.3002 | 0.8071 | 0.8148 | 0.5028 | 0.805 | 0.8116 | 0.7651 | 0.8135 | 0.708 | 0.7523 | 0.8451 | 0.8787 | | 0.2817 | 67.0 | 23450 | 0.3599 | 0.7556 | 0.9453 | 0.8834 | 0.3488 | 0.7548 | 0.7602 | 0.2984 | 0.7935 | 0.8016 | 0.4306 | 0.7988 | 0.7965 | 0.7469 | 0.7957 | 0.6913 | 0.74 | 0.8285 | 0.869 | | 0.3097 | 68.0 | 23800 | 0.3383 | 0.7787 | 0.9695 | 0.9127 | 0.3165 | 0.7658 | 0.8031 | 0.3033 | 0.814 | 0.8206 | 0.5014 | 0.8083 | 0.8311 | 0.7704 | 0.813 | 0.727 | 0.7738 | 0.8388 | 0.8749 | | 0.3027 | 69.0 | 24150 | 0.3267 | 0.7832 | 0.9673 | 0.9114 | 0.391 | 0.7743 | 0.787 | 0.3057 | 0.8173 | 0.8247 | 0.5694 | 0.8129 | 0.8174 | 0.7694 | 0.8151 | 0.7351 | 0.78 | 0.845 | 0.8791 | | 0.3024 | 70.0 | 24500 | 0.3390 | 0.7565 | 0.9424 | 0.8789 | 0.3252 | 0.7561 | 0.7391 | 0.2961 | 0.7926 | 0.8006 | 0.5111 | 0.7957 | 0.7729 | 0.7521 | 0.7978 | 0.6769 | 0.7277 | 0.8404 | 0.8762 | | 0.2909 | 71.0 | 24850 | 0.3243 | 0.7767 | 0.9543 | 0.9042 | 0.3505 | 0.7718 | 0.773 | 0.3071 | 0.8129 | 0.8191 | 0.5361 | 0.8128 | 0.8049 | 0.7615 | 0.807 | 0.737 | 0.78 | 0.8316 | 0.8703 | | 0.2781 | 72.0 | 25200 | 0.3461 | 0.7713 | 0.9565 | 0.895 | 0.3546 | 0.7629 | 0.7718 | 0.2997 | 0.8068 | 0.8129 | 0.5181 | 0.8006 | 0.8019 | 0.7673 | 0.8059 | 0.7118 | 0.76 | 0.8348 | 0.8728 | | 0.2758 | 73.0 | 25550 | 0.3375 | 0.7712 | 0.961 | 0.9114 | 0.3736 | 0.7518 | 0.7913 | 0.3003 | 0.8086 | 0.8144 | 0.5444 | 0.7972 | 0.8275 | 0.7526 | 0.8011 | 0.7232 | 0.7677 | 0.8378 | 0.8745 | | 0.2856 | 74.0 | 25900 | 0.3367 | 0.7794 | 0.9591 | 0.9068 | 0.4018 | 0.7631 | 0.8069 | 0.3066 | 0.8129 | 0.8184 | 0.5139 | 0.8036 | 0.8351 | 0.7648 | 0.8059 | 0.7348 | 0.7738 | 0.8387 | 0.8753 | | 0.293 | 75.0 | 26250 | 0.3328 | 0.7762 | 0.9625 | 0.9053 | 0.3438 | 0.7604 | 0.8022 | 0.3035 | 0.8097 | 0.8178 | 0.4917 | 0.8016 | 0.8306 | 0.7649 | 0.8065 | 0.7183 | 0.7662 | 0.8453 | 0.8808 | | 0.2868 | 76.0 | 26600 | 0.3262 | 0.7758 | 0.9664 | 0.9157 | 0.4012 | 0.7623 | 0.7971 | 0.3022 | 0.8112 | 0.8198 | 0.55 | 0.8044 | 0.8314 | 0.7605 | 0.8049 | 0.7303 | 0.7785 | 0.8366 | 0.8762 | | 0.2669 | 77.0 | 26950 | 0.3290 | 0.775 | 0.9625 | 0.9052 | 0.3396 | 0.7725 | 0.7822 | 0.3018 | 0.812 | 0.8188 | 0.5153 | 0.813 | 0.8159 | 0.7637 | 0.8054 | 0.7213 | 0.7754 | 0.8398 | 0.8757 | | 0.2585 | 78.0 | 27300 | 0.3479 | 0.7722 | 0.9553 | 0.888 | 0.3224 | 0.7582 | 0.8014 | 0.2984 | 0.809 | 0.8139 | 0.4736 | 0.7975 | 0.8332 | 0.761 | 0.8086 | 0.7164 | 0.7585 | 0.8392 | 0.8745 | | 0.2512 | 79.0 | 27650 | 0.3280 | 0.7906 | 0.9539 | 0.9084 | 0.3698 | 0.7809 | 0.8018 | 0.3101 | 0.8251 | 0.8319 | 0.5194 | 0.819 | 0.8376 | 0.778 | 0.8216 | 0.748 | 0.7938 | 0.8459 | 0.8803 | | 0.2529 | 80.0 | 28000 | 0.3304 | 0.775 | 0.957 | 0.903 | 0.362 | 0.7602 | 0.7888 | 0.3028 | 0.8067 | 0.8137 | 0.5375 | 0.7961 | 0.8207 | 0.7624 | 0.7973 | 0.7156 | 0.7646 | 0.8469 | 0.8791 | | 0.2543 | 81.0 | 28350 | 0.3159 | 0.7978 | 0.9656 | 0.9222 | 0.3357 | 0.7878 | 0.8056 | 0.311 | 0.8283 | 0.8349 | 0.5514 | 0.8203 | 0.838 | 0.7839 | 0.8189 | 0.7635 | 0.8046 | 0.8461 | 0.8812 | | 0.2537 | 82.0 | 28700 | 0.3115 | 0.7896 | 0.9572 | 0.907 | 0.3415 | 0.7823 | 0.7967 | 0.3026 | 0.8214 | 0.8289 | 0.5486 | 0.8149 | 0.8224 | 0.7839 | 0.8216 | 0.7307 | 0.7785 | 0.8543 | 0.8866 | | 0.2482 | 83.0 | 29050 | 0.3099 | 0.785 | 0.9531 | 0.896 | 0.3091 | 0.7826 | 0.798 | 
0.3099 | 0.8176 | 0.8257 | 0.4736 | 0.8223 | 0.8283 | 0.782 | 0.8195 | 0.7198 | 0.7677 | 0.8531 | 0.89 | | 0.273 | 84.0 | 29400 | 0.3065 | 0.783 | 0.9514 | 0.9083 | 0.3725 | 0.7769 | 0.8031 | 0.3042 | 0.8161 | 0.8227 | 0.4597 | 0.8188 | 0.8311 | 0.7792 | 0.8184 | 0.7129 | 0.7615 | 0.857 | 0.8883 | | 0.2552 | 85.0 | 29750 | 0.3265 | 0.7725 | 0.9459 | 0.8938 | 0.3329 | 0.7668 | 0.8017 | 0.3007 | 0.8044 | 0.811 | 0.4264 | 0.8055 | 0.8265 | 0.7583 | 0.8005 | 0.7135 | 0.7585 | 0.8455 | 0.8741 | | 0.2756 | 86.0 | 30100 | 0.3271 | 0.7745 | 0.9483 | 0.8889 | 0.3715 | 0.7667 | 0.81 | 0.3027 | 0.8077 | 0.8142 | 0.4542 | 0.8077 | 0.8367 | 0.7656 | 0.8076 | 0.7013 | 0.7477 | 0.8567 | 0.8874 | | 0.2802 | 87.0 | 30450 | 0.3171 | 0.7847 | 0.9569 | 0.9092 | 0.3655 | 0.7784 | 0.8151 | 0.3067 | 0.8153 | 0.8223 | 0.5042 | 0.815 | 0.8386 | 0.7808 | 0.8195 | 0.7172 | 0.7585 | 0.8561 | 0.8891 | | 0.2833 | 88.0 | 30800 | 0.3153 | 0.7884 | 0.9564 | 0.9124 | 0.436 | 0.7826 | 0.8054 | 0.3067 | 0.8179 | 0.8248 | 0.5347 | 0.8174 | 0.8329 | 0.7813 | 0.8168 | 0.7256 | 0.7692 | 0.8583 | 0.8883 | | 0.2537 | 89.0 | 31150 | 0.3190 | 0.7853 | 0.9594 | 0.8947 | 0.3973 | 0.7796 | 0.8065 | 0.3041 | 0.8141 | 0.8202 | 0.5319 | 0.8141 | 0.8331 | 0.7705 | 0.8135 | 0.7268 | 0.7554 | 0.8587 | 0.8916 | | 0.2614 | 90.0 | 31500 | 0.3238 | 0.7816 | 0.949 | 0.8917 | 0.3037 | 0.7778 | 0.8115 | 0.3056 | 0.8138 | 0.8216 | 0.4264 | 0.8154 | 0.8459 | 0.7746 | 0.82 | 0.7149 | 0.7554 | 0.8553 | 0.8895 | | 0.2468 | 91.0 | 31850 | 0.3191 | 0.7732 | 0.9591 | 0.9 | 0.2864 | 0.7634 | 0.8006 | 0.3004 | 0.803 | 0.8122 | 0.4472 | 0.8002 | 0.8281 | 0.7632 | 0.8119 | 0.7085 | 0.7477 | 0.8479 | 0.877 | | 0.2667 | 92.0 | 32200 | 0.3343 | 0.7793 | 0.9474 | 0.8901 | 0.3104 | 0.7745 | 0.7971 | 0.3016 | 0.8089 | 0.8169 | 0.3944 | 0.8109 | 0.8268 | 0.7804 | 0.8173 | 0.7074 | 0.7538 | 0.8501 | 0.8795 | | 0.2457 | 93.0 | 32550 | 0.3362 | 0.7828 | 0.9527 | 0.9025 | 0.3748 | 0.7739 | 0.8097 | 0.3033 | 0.8112 | 0.8205 | 0.4792 | 0.8097 | 0.8329 | 0.7764 | 0.8184 | 0.7195 | 0.7615 | 0.8525 | 0.8816 | | 0.2463 | 94.0 | 32900 | 0.3280 | 0.79 | 0.9436 | 0.8999 | 0.4029 | 0.7824 | 0.8046 | 0.3046 | 0.8178 | 0.8269 | 0.4833 | 0.8188 | 0.8274 | 0.7904 | 0.8303 | 0.7207 | 0.76 | 0.8588 | 0.8904 | | 0.2782 | 95.0 | 33250 | 0.3176 | 0.7949 | 0.9554 | 0.9038 | 0.4182 | 0.7849 | 0.8197 | 0.3078 | 0.8247 | 0.8332 | 0.5181 | 0.8215 | 0.8483 | 0.7924 | 0.8351 | 0.7412 | 0.7846 | 0.851 | 0.8799 | | 0.2642 | 96.0 | 33600 | 0.3379 | 0.7816 | 0.9392 | 0.9032 | 0.3129 | 0.773 | 0.8077 | 0.3064 | 0.8136 | 0.8217 | 0.4403 | 0.8092 | 0.8358 | 0.7773 | 0.8195 | 0.7167 | 0.7677 | 0.8507 | 0.8778 | | 0.2453 | 97.0 | 33950 | 0.3346 | 0.7828 | 0.9449 | 0.903 | 0.3503 | 0.7709 | 0.8109 | 0.3085 | 0.8144 | 0.821 | 0.4569 | 0.8071 | 0.8397 | 0.7714 | 0.8168 | 0.7214 | 0.7631 | 0.8557 | 0.8833 | | 0.2553 | 98.0 | 34300 | 0.3367 | 0.7831 | 0.9423 | 0.893 | 0.3334 | 0.7783 | 0.7984 | 0.3079 | 0.8144 | 0.8205 | 0.4333 | 0.8137 | 0.8281 | 0.7762 | 0.8157 | 0.7157 | 0.7585 | 0.8574 | 0.8874 | | 0.2848 | 99.0 | 34650 | 0.3425 | 0.7795 | 0.9322 | 0.9022 | 0.3715 | 0.7749 | 0.7858 | 0.3069 | 0.813 | 0.8172 | 0.4569 | 0.8149 | 0.8094 | 0.7811 | 0.8249 | 0.7073 | 0.7477 | 0.85 | 0.8791 | | 0.2522 | 100.0 | 35000 | 0.3385 | 0.778 | 0.9354 | 0.9049 | 0.3456 | 0.773 | 0.7779 | 0.3064 | 0.8081 | 0.8125 | 0.4486 | 0.8071 | 0.7997 | 0.7777 | 0.8189 | 0.7 | 0.7354 | 0.8564 | 0.8833 | | 0.2422 | 101.0 | 35350 | 0.3334 | 0.7805 | 0.9413 | 0.9059 | 0.3302 | 0.7754 | 0.7926 | 0.307 | 0.8098 | 0.8167 | 0.4625 | 0.8109 
| 0.8164 | 0.7803 | 0.8178 | 0.7081 | 0.7477 | 0.8532 | 0.8845 | | 0.268 | 102.0 | 35700 | 0.3327 | 0.7844 | 0.9431 | 0.8985 | 0.3643 | 0.7754 | 0.7989 | 0.3062 | 0.8119 | 0.8189 | 0.4528 | 0.8119 | 0.8189 | 0.7851 | 0.8254 | 0.7135 | 0.7477 | 0.8548 | 0.8837 | | 0.2624 | 103.0 | 36050 | 0.3171 | 0.7927 | 0.953 | 0.905 | 0.3682 | 0.7854 | 0.8108 | 0.311 | 0.8251 | 0.8298 | 0.4611 | 0.8223 | 0.8406 | 0.787 | 0.827 | 0.7324 | 0.7723 | 0.8587 | 0.89 | | 0.2431 | 104.0 | 36400 | 0.3263 | 0.7863 | 0.9461 | 0.9126 | 0.357 | 0.7813 | 0.7857 | 0.3095 | 0.8199 | 0.8266 | 0.4819 | 0.818 | 0.8196 | 0.7862 | 0.8308 | 0.7183 | 0.7615 | 0.8544 | 0.8874 | | 0.2745 | 105.0 | 36750 | 0.3305 | 0.7856 | 0.9342 | 0.9089 | 0.361 | 0.7786 | 0.789 | 0.3061 | 0.8204 | 0.826 | 0.475 | 0.8154 | 0.8231 | 0.7859 | 0.8297 | 0.7148 | 0.7615 | 0.8562 | 0.8866 | | 0.2401 | 106.0 | 37100 | 0.3107 | 0.7931 | 0.9462 | 0.9042 | 0.4462 | 0.7846 | 0.7912 | 0.3105 | 0.8241 | 0.829 | 0.5417 | 0.8194 | 0.8233 | 0.7922 | 0.833 | 0.725 | 0.7615 | 0.8621 | 0.8925 | | 0.2175 | 107.0 | 37450 | 0.3198 | 0.7867 | 0.9462 | 0.9044 | 0.3899 | 0.7794 | 0.7979 | 0.3061 | 0.8178 | 0.8239 | 0.4986 | 0.8158 | 0.8234 | 0.781 | 0.8243 | 0.7223 | 0.76 | 0.8568 | 0.8874 | | 0.2373 | 108.0 | 37800 | 0.3101 | 0.8 | 0.9591 | 0.9153 | 0.4018 | 0.7911 | 0.8239 | 0.3146 | 0.8331 | 0.8397 | 0.5347 | 0.8302 | 0.849 | 0.7883 | 0.8341 | 0.7512 | 0.7938 | 0.8606 | 0.8912 | | 0.2445 | 109.0 | 38150 | 0.3331 | 0.7918 | 0.9538 | 0.9067 | 0.3785 | 0.7816 | 0.8149 | 0.3099 | 0.8236 | 0.8288 | 0.4486 | 0.8193 | 0.8401 | 0.7798 | 0.8243 | 0.7387 | 0.7754 | 0.8568 | 0.8866 | | 0.2324 | 110.0 | 38500 | 0.3111 | 0.7989 | 0.9604 | 0.916 | 0.369 | 0.7966 | 0.8081 | 0.3133 | 0.8288 | 0.8342 | 0.4653 | 0.8305 | 0.8331 | 0.7747 | 0.8178 | 0.7517 | 0.7862 | 0.8703 | 0.8987 | | 0.2398 | 111.0 | 38850 | 0.3139 | 0.7931 | 0.9537 | 0.9118 | 0.3924 | 0.7902 | 0.8025 | 0.3087 | 0.8252 | 0.8297 | 0.4681 | 0.8248 | 0.8328 | 0.7805 | 0.8227 | 0.737 | 0.7723 | 0.8619 | 0.8941 | | 0.2258 | 112.0 | 39200 | 0.3175 | 0.7933 | 0.9606 | 0.9032 | 0.3495 | 0.7889 | 0.8042 | 0.3123 | 0.8233 | 0.8292 | 0.4444 | 0.8229 | 0.8299 | 0.7832 | 0.8232 | 0.7316 | 0.7708 | 0.8651 | 0.8937 | | 0.2799 | 113.0 | 39550 | 0.3123 | 0.7967 | 0.9602 | 0.9093 | 0.3468 | 0.7924 | 0.8136 | 0.3135 | 0.8283 | 0.8346 | 0.4639 | 0.8275 | 0.8417 | 0.7787 | 0.8222 | 0.7483 | 0.7892 | 0.863 | 0.8925 | | 0.2519 | 114.0 | 39900 | 0.3271 | 0.7909 | 0.9562 | 0.9082 | 0.3513 | 0.7835 | 0.819 | 0.3077 | 0.8212 | 0.8277 | 0.4486 | 0.8189 | 0.842 | 0.7722 | 0.8168 | 0.7424 | 0.7785 | 0.8581 | 0.8879 | | 0.2557 | 115.0 | 40250 | 0.3177 | 0.7906 | 0.9568 | 0.9169 | 0.396 | 0.7846 | 0.8019 | 0.3126 | 0.8231 | 0.8289 | 0.4778 | 0.8216 | 0.8284 | 0.7784 | 0.8238 | 0.7332 | 0.7708 | 0.8602 | 0.8921 | | 0.2462 | 116.0 | 40600 | 0.3061 | 0.7935 | 0.9599 | 0.9197 | 0.3886 | 0.7869 | 0.8129 | 0.3127 | 0.826 | 0.8319 | 0.4778 | 0.8256 | 0.8363 | 0.7729 | 0.8189 | 0.7453 | 0.7831 | 0.8623 | 0.8937 | | 0.253 | 117.0 | 40950 | 0.3069 | 0.7991 | 0.9613 | 0.914 | 0.3884 | 0.7891 | 0.8163 | 0.3129 | 0.8282 | 0.8342 | 0.5167 | 0.8242 | 0.838 | 0.7888 | 0.8303 | 0.7482 | 0.78 | 0.8603 | 0.8925 | | 0.2348 | 118.0 | 41300 | 0.2984 | 0.7996 | 0.9628 | 0.9167 | 0.4061 | 0.7915 | 0.8082 | 0.3129 | 0.831 | 0.8365 | 0.5097 | 0.8279 | 0.8293 | 0.7883 | 0.8303 | 0.7513 | 0.7877 | 0.8591 | 0.8916 | | 0.2436 | 119.0 | 41650 | 0.3055 | 0.8014 | 0.9596 | 0.9181 | 0.3964 | 0.7896 | 0.8258 | 0.3154 | 0.8315 | 0.8369 | 0.4819 | 0.8271 | 0.8479 | 0.7928 | 0.8324 | 
0.749 | 0.7846 | 0.8625 | 0.8937 | | 0.2455 | 120.0 | 42000 | 0.3067 | 0.7979 | 0.96 | 0.9164 | 0.3928 | 0.7834 | 0.8194 | 0.3107 | 0.8274 | 0.8323 | 0.4736 | 0.8203 | 0.8409 | 0.7907 | 0.8303 | 0.7404 | 0.7754 | 0.8627 | 0.8912 | | 0.217 | 121.0 | 42350 | 0.3032 | 0.7997 | 0.9595 | 0.9177 | 0.3902 | 0.7908 | 0.8199 | 0.3134 | 0.83 | 0.8349 | 0.4667 | 0.8271 | 0.8441 | 0.7905 | 0.8319 | 0.7483 | 0.7815 | 0.8604 | 0.8912 | | 0.2433 | 122.0 | 42700 | 0.3069 | 0.7972 | 0.9591 | 0.9129 | 0.3566 | 0.788 | 0.8152 | 0.3147 | 0.829 | 0.834 | 0.4486 | 0.8259 | 0.8388 | 0.7896 | 0.8297 | 0.7433 | 0.7815 | 0.8588 | 0.8908 | | 0.2745 | 123.0 | 43050 | 0.3063 | 0.8 | 0.9592 | 0.9155 | 0.3829 | 0.7887 | 0.8228 | 0.3133 | 0.8315 | 0.8367 | 0.4833 | 0.8274 | 0.8425 | 0.7933 | 0.8351 | 0.745 | 0.7815 | 0.8618 | 0.8933 | | 0.2635 | 124.0 | 43400 | 0.3075 | 0.8018 | 0.9621 | 0.9203 | 0.4227 | 0.7873 | 0.8196 | 0.3138 | 0.8321 | 0.8377 | 0.5181 | 0.8255 | 0.8409 | 0.7961 | 0.8341 | 0.7488 | 0.7862 | 0.8606 | 0.8929 | | 0.2672 | 125.0 | 43750 | 0.3128 | 0.799 | 0.9562 | 0.9116 | 0.3784 | 0.7869 | 0.8222 | 0.3123 | 0.8281 | 0.8332 | 0.4611 | 0.8231 | 0.8431 | 0.7913 | 0.8303 | 0.7449 | 0.7785 | 0.8606 | 0.8908 | | 0.214 | 126.0 | 44100 | 0.3209 | 0.8028 | 0.9594 | 0.9162 | 0.3835 | 0.7903 | 0.826 | 0.3143 | 0.831 | 0.836 | 0.4556 | 0.8261 | 0.8461 | 0.7911 | 0.8286 | 0.7568 | 0.7908 | 0.8604 | 0.8887 | | 0.2384 | 127.0 | 44450 | 0.3114 | 0.8068 | 0.9625 | 0.915 | 0.3897 | 0.7961 | 0.8197 | 0.3162 | 0.8352 | 0.8398 | 0.4722 | 0.8322 | 0.8374 | 0.7977 | 0.8341 | 0.7557 | 0.7908 | 0.8669 | 0.8946 | | 0.2426 | 128.0 | 44800 | 0.3207 | 0.8018 | 0.9598 | 0.9143 | 0.3814 | 0.79 | 0.8257 | 0.3151 | 0.8309 | 0.8364 | 0.4542 | 0.8264 | 0.8451 | 0.7879 | 0.827 | 0.7551 | 0.7923 | 0.8623 | 0.89 | | 0.2177 | 129.0 | 45150 | 0.3058 | 0.8053 | 0.96 | 0.9175 | 0.3808 | 0.7938 | 0.8254 | 0.3164 | 0.8339 | 0.8391 | 0.4514 | 0.8313 | 0.8467 | 0.7992 | 0.8362 | 0.7527 | 0.7877 | 0.8641 | 0.8933 | | 0.2481 | 130.0 | 45500 | 0.3097 | 0.8022 | 0.9591 | 0.9164 | 0.3838 | 0.7881 | 0.8285 | 0.3137 | 0.8297 | 0.8346 | 0.4431 | 0.8229 | 0.8478 | 0.7912 | 0.8286 | 0.7507 | 0.7831 | 0.8647 | 0.8921 | | 0.282 | 131.0 | 45850 | 0.3087 | 0.8022 | 0.9623 | 0.9136 | 0.3772 | 0.7886 | 0.8213 | 0.3133 | 0.8301 | 0.8355 | 0.4653 | 0.824 | 0.8404 | 0.7893 | 0.8281 | 0.7537 | 0.7877 | 0.8636 | 0.8908 | | 0.2458 | 132.0 | 46200 | 0.3126 | 0.8053 | 0.9606 | 0.916 | 0.3876 | 0.7917 | 0.826 | 0.3146 | 0.8331 | 0.8377 | 0.4667 | 0.8266 | 0.8469 | 0.7961 | 0.8314 | 0.7551 | 0.7892 | 0.8648 | 0.8925 | | 0.2216 | 133.0 | 46550 | 0.3079 | 0.8054 | 0.9623 | 0.9137 | 0.3886 | 0.7951 | 0.8247 | 0.3154 | 0.8344 | 0.8391 | 0.475 | 0.8305 | 0.8465 | 0.7974 | 0.8341 | 0.7535 | 0.7892 | 0.8654 | 0.8941 | | 0.2697 | 134.0 | 46900 | 0.3107 | 0.8037 | 0.9621 | 0.9129 | 0.3784 | 0.7923 | 0.8267 | 0.3148 | 0.8335 | 0.8387 | 0.4667 | 0.8291 | 0.8484 | 0.7925 | 0.8308 | 0.7512 | 0.7908 | 0.8675 | 0.8946 | | 0.2298 | 135.0 | 47250 | 0.3102 | 0.8028 | 0.9622 | 0.9104 | 0.3822 | 0.7908 | 0.8235 | 0.3142 | 0.8321 | 0.8369 | 0.4625 | 0.8266 | 0.8439 | 0.7942 | 0.8324 | 0.7492 | 0.7846 | 0.865 | 0.8937 | | 0.2089 | 136.0 | 47600 | 0.3118 | 0.8045 | 0.9621 | 0.9138 | 0.3833 | 0.7914 | 0.8244 | 0.3149 | 0.8333 | 0.8385 | 0.4681 | 0.8275 | 0.847 | 0.7952 | 0.833 | 0.7533 | 0.7892 | 0.8649 | 0.8933 | | 0.255 | 137.0 | 47950 | 0.3076 | 0.806 | 0.9623 | 0.9141 | 0.3833 | 0.7941 | 0.8236 | 0.3152 | 0.8357 | 0.8409 | 0.4722 | 0.8309 | 0.8457 | 0.8001 | 0.8368 | 0.7535 | 0.7923 | 0.8644 | 
0.8937 | | 0.1938 | 138.0 | 48300 | 0.3101 | 0.8057 | 0.9623 | 0.914 | 0.3795 | 0.7942 | 0.8277 | 0.314 | 0.8347 | 0.8398 | 0.4639 | 0.8298 | 0.8484 | 0.7968 | 0.833 | 0.7575 | 0.7938 | 0.8629 | 0.8925 | | 0.2402 | 139.0 | 48650 | 0.3098 | 0.8035 | 0.9622 | 0.9108 | 0.3794 | 0.7911 | 0.8266 | 0.3137 | 0.832 | 0.8375 | 0.4639 | 0.8273 | 0.8474 | 0.7916 | 0.8281 | 0.753 | 0.7908 | 0.8657 | 0.8937 | | 0.2203 | 140.0 | 49000 | 0.3117 | 0.8059 | 0.9623 | 0.914 | 0.3825 | 0.7921 | 0.8285 | 0.3143 | 0.8338 | 0.8395 | 0.4639 | 0.8282 | 0.8484 | 0.7942 | 0.8314 | 0.758 | 0.7923 | 0.8657 | 0.895 | | 0.2386 | 141.0 | 49350 | 0.3135 | 0.8044 | 0.962 | 0.9137 | 0.3833 | 0.791 | 0.8283 | 0.3144 | 0.8333 | 0.8388 | 0.4681 | 0.8277 | 0.8488 | 0.7951 | 0.8314 | 0.7547 | 0.7923 | 0.8634 | 0.8929 | | 0.2262 | 142.0 | 49700 | 0.3132 | 0.8049 | 0.9626 | 0.9142 | 0.3834 | 0.7919 | 0.8283 | 0.3137 | 0.8317 | 0.8372 | 0.4681 | 0.8266 | 0.8474 | 0.7944 | 0.8303 | 0.7534 | 0.7877 | 0.8669 | 0.8937 | | 0.2054 | 143.0 | 50050 | 0.3116 | 0.8038 | 0.9625 | 0.911 | 0.3792 | 0.79 | 0.8295 | 0.3149 | 0.8327 | 0.8378 | 0.4583 | 0.8271 | 0.8496 | 0.7954 | 0.833 | 0.7524 | 0.7892 | 0.8635 | 0.8912 | | 0.2126 | 144.0 | 50400 | 0.3119 | 0.8037 | 0.9625 | 0.911 | 0.3792 | 0.7897 | 0.8294 | 0.3151 | 0.8326 | 0.8376 | 0.4583 | 0.8269 | 0.8496 | 0.7953 | 0.8324 | 0.7524 | 0.7892 | 0.8633 | 0.8912 | | 0.2538 | 145.0 | 50750 | 0.3135 | 0.8033 | 0.9625 | 0.911 | 0.3792 | 0.7889 | 0.8281 | 0.3146 | 0.8323 | 0.8373 | 0.4583 | 0.8266 | 0.848 | 0.7941 | 0.8308 | 0.7529 | 0.7908 | 0.8628 | 0.8904 | | 0.2391 | 146.0 | 51100 | 0.3143 | 0.8033 | 0.9625 | 0.911 | 0.3791 | 0.7895 | 0.8284 | 0.3146 | 0.8323 | 0.8373 | 0.4583 | 0.8271 | 0.8479 | 0.7941 | 0.8308 | 0.7529 | 0.7908 | 0.8629 | 0.8904 | | 0.2148 | 147.0 | 51450 | 0.3143 | 0.8032 | 0.9625 | 0.911 | 0.3791 | 0.7894 | 0.8284 | 0.3146 | 0.8321 | 0.8371 | 0.4583 | 0.8269 | 0.8479 | 0.7938 | 0.8303 | 0.7529 | 0.7908 | 0.8628 | 0.8904 | | 0.2581 | 148.0 | 51800 | 0.3142 | 0.8032 | 0.9625 | 0.911 | 0.3791 | 0.7895 | 0.8284 | 0.3146 | 0.8323 | 0.8373 | 0.4583 | 0.8271 | 0.8479 | 0.7941 | 0.8308 | 0.7529 | 0.7908 | 0.8628 | 0.8904 | | 0.2359 | 149.0 | 52150 | 0.3142 | 0.8032 | 0.9625 | 0.911 | 0.3791 | 0.7895 | 0.8284 | 0.3146 | 0.8323 | 0.8373 | 0.4583 | 0.8271 | 0.8479 | 0.7941 | 0.8308 | 0.7529 | 0.7908 | 0.8628 | 0.8904 | | 0.2258 | 150.0 | 52500 | 0.3142 | 0.8032 | 0.9625 | 0.911 | 0.3791 | 0.7895 | 0.8284 | 0.3146 | 0.8323 | 0.8373 | 0.4583 | 0.8271 | 0.8479 | 0.7941 | 0.8308 | 0.7529 | 0.7908 | 0.8628 | 0.8904 | ### Framework versions - Transformers 4.45.2 - Pytorch 2.5.0+cu121 - Datasets 2.19.2 - Tokenizers 0.20.1
[ "chicken", "duck", "plant" ]
yzhany0r/detr_finetuned_cppe5
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr_finetuned_cppe5 This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.1933 - Map: 0.0 - Map 50: 0.0 - Map 75: 0.0 - Map Small: 0.0 - Map Medium: -1.0 - Map Large: -1.0 - Mar 1: 0.0 - Mar 10: 0.0 - Mar 100: 0.0 - Mar Small: 0.0 - Mar Medium: -1.0 - Mar Large: -1.0 - Map Coverall: 0.0 - Mar 100 Coverall: 0.0 - Map Face Shield: 0.0 - Mar 100 Face Shield: 0.0 - Map Gloves: 0.0 - Mar 100 Gloves: 0.0 - Map Goggles: 0.0 - Mar 100 Goggles: 0.0 - Map Mask: 0.0 - Mar 100 Mask: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: cosine - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask | |:-------------:|:-----:|:----:|:---------------:|:---:|:------:|:------:|:---------:|:----------:|:---------:|:-----:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:| | No log | 1.0 | 213 | 1.7858 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | No log | 2.0 | 426 | 1.7974 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.612 | 3.0 | 639 | 1.6086 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 2.612 | 4.0 | 852 | 1.5644 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.3972 | 5.0 | 1065 | 1.4356 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.3972 | 6.0 | 1278 | 1.4547 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.3972 | 7.0 | 1491 | 1.4207 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.2125 | 8.0 | 1704 | 1.3967 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.2125 | 9.0 | 1917 | 1.3162 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 
0.0 | 0.0 | | 1.09 | 10.0 | 2130 | 1.3086 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 1.09 | 11.0 | 2343 | 1.3013 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.9743 | 12.0 | 2556 | 1.2823 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.9743 | 13.0 | 2769 | 1.2798 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.9743 | 14.0 | 2982 | 1.2379 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.8793 | 15.0 | 3195 | 1.2404 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.8793 | 16.0 | 3408 | 1.2136 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.7806 | 17.0 | 3621 | 1.2239 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.7806 | 18.0 | 3834 | 1.2372 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.7053 | 19.0 | 4047 | 1.2269 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.7053 | 20.0 | 4260 | 1.2231 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.7053 | 21.0 | 4473 | 1.2135 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.6369 | 22.0 | 4686 | 1.2037 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.6369 | 23.0 | 4899 | 1.2048 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.5831 | 24.0 | 5112 | 1.1930 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.5831 | 25.0 | 5325 | 1.2022 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.5447 | 26.0 | 5538 | 1.1945 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.5447 | 27.0 | 5751 | 1.1970 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.5447 | 28.0 | 5964 | 1.1923 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.5184 | 29.0 | 6177 | 1.1936 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | | 0.5184 | 30.0 | 6390 | 1.1933 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 
0.0 | 0.0 | 0.0 | 0.0 | ### Framework versions - Transformers 4.46.0 - Pytorch 2.2.2 - Datasets 3.0.2 - Tokenizers 0.20.1
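The hyperparameters listed above are the standard `transformers` Trainer settings, so they map directly onto `TrainingArguments`. A minimal sketch of that mapping (argument names are the generic Trainer ones; the output directory is a placeholder, not taken from this repository):

<pre>
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters reported in this card.
training_args = TrainingArguments(
    output_dir="detr_finetuned_cppe5",  # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                # AdamW with default betas=(0.9, 0.999) and eps=1e-08
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
</pre>

Passing these to `Trainer` together with the CPPE-5 data and an object-detection collator would reproduce the schedule shown in the results table.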
[ "coverall", "face_shield", "gloves", "goggles", "mask" ]
Garon16/rtdetr_r50vd_russia_plate_detector
# RT-DETR Russian car plate detection with classification by type This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 4.1673 - Map: 0.8829 - Map 50: 0.9858 - Map 75: 0.9736 - Map Car-plates-and-these-types: -1.0 - Map Large: 0.9689 - Map Medium: 0.9125 - Map N P: 0.857 - Map P P: 0.9087 - Map Small: 0.696 - Mar 1: 0.8686 - Mar 10: 0.9299 - Mar 100: 0.9357 - Mar 100 Car-plates-and-these-types: -1.0 - Mar 100 N P: 0.9169 - Mar 100 P P: 0.9545 - Mar Large: 0.9844 - Mar Medium: 0.958 - Mar Small: 0.8354 ## Model description License plate detection model for vehicles in the Russian Federation; at the moment there are 2 classes, n_p and p_p: regular plates and police plates. ## Intended uses & limitations Usage example: <pre>
from transformers import AutoModelForObjectDetection, AutoImageProcessor
from PIL import Image
import torch
import supervision as sv

DEVICE = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = AutoModelForObjectDetection.from_pretrained('Garon16/rtdetr_r50vd_russia_plate_detector').to(DEVICE)
processor = AutoImageProcessor.from_pretrained('Garon16/rtdetr_r50vd_russia_plate_detector')

path = 'path/to/image'
image = Image.open(path)

inputs = processor(image, return_tensors="pt").to(DEVICE)
with torch.no_grad():
    outputs = model(**inputs)

# Rescale predictions back to the original image size and keep detections above the threshold
w, h = image.size
results = processor.post_process_object_detection(
    outputs, target_sizes=[(h, w)], threshold=0.3)

detections = sv.Detections.from_transformers(results[0]).with_nms(0.3)
labels = [
    model.config.id2label[class_id]
    for class_id
    in detections.class_id
]

# Draw boxes and class labels on a copy of the input image
annotated_image = image.copy()
annotated_image = sv.BoundingBoxAnnotator().annotate(annotated_image, detections)
annotated_image = sv.LabelAnnotator().annotate(annotated_image, detections, labels=labels)

grid = sv.create_tiles(
    [annotated_image],
    grid_size=(1, 1),
    single_tile_size=(512, 512),
    tile_padding_color=sv.Color.WHITE,
    tile_margin_color=sv.Color.WHITE
)

sv.plot_image(grid, size=(10, 10))
</pre> ## Training and evaluation data Trained on my own dataset - https://universe.roboflow.com/testcarplate/russian-license-plates-classification-by-this-type ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 300 - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Car-plates-and-these-types | Map Large | Map Medium | Map N P | Map P P | Map Small | Mar 1 | Mar 10 | Mar 100 | Mar 100 Car-plates-and-these-types | Mar 100 N P | Mar 100 P P | Mar Large | Mar Medium | Mar Small | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:------------------------------:|:---------:|:----------:|:-------:|:-------:|:---------:|:------:|:------:|:-------:|:----------------------------------:|:-----------:|:-----------:|:---------:|:----------:|:---------:| | No log | 1.0 | 109 | 64.6127 | 0.035 | 0.0558 | 0.0379 | -1.0 | 0.0039 | 0.0663 | 0.0191 | 0.0508 | 0.0071 | 0.1523 | 0.3009 | 0.3361 | -1.0 | 0.3179 | 0.3543 | 0.7625 | 0.3788 | 0.1157 | | No log | 2.0 | 218 | 15.4008 | 0.8237 | 0.9418 | 0.9327 |
-1.0 | 0.893 | 0.879 | 0.7945 | 0.8529 | 0.4319 | 0.8203 | 0.8924 | 0.9018 | -1.0 | 0.8766 | 0.9269 | 0.9656 | 0.9324 | 0.7653 | | No log | 3.0 | 327 | 9.4050 | 0.8439 | 0.9566 | 0.9479 | -1.0 | 0.9439 | 0.8908 | 0.8158 | 0.872 | 0.5171 | 0.8416 | 0.908 | 0.9144 | -1.0 | 0.9002 | 0.9286 | 0.9781 | 0.9368 | 0.8051 | | No log | 4.0 | 436 | 7.9164 | 0.8493 | 0.9665 | 0.9543 | -1.0 | 0.9567 | 0.8903 | 0.8338 | 0.8648 | 0.5581 | 0.8481 | 0.9159 | 0.9267 | -1.0 | 0.9173 | 0.936 | 0.975 | 0.949 | 0.8185 | | 70.2867 | 5.0 | 545 | 6.8177 | 0.8525 | 0.9723 | 0.9602 | -1.0 | 0.9521 | 0.8918 | 0.8234 | 0.8816 | 0.6025 | 0.8438 | 0.9214 | 0.9279 | -1.0 | 0.9181 | 0.9378 | 0.975 | 0.9492 | 0.8211 | | 70.2867 | 6.0 | 654 | 6.0182 | 0.854 | 0.9744 | 0.9619 | -1.0 | 0.9574 | 0.8912 | 0.8251 | 0.8829 | 0.6123 | 0.8438 | 0.9176 | 0.927 | -1.0 | 0.9137 | 0.9403 | 0.9781 | 0.9503 | 0.8163 | | 70.2867 | 7.0 | 763 | 5.4024 | 0.8731 | 0.9772 | 0.9667 | -1.0 | 0.9635 | 0.9113 | 0.8462 | 0.9001 | 0.6376 | 0.8608 | 0.9275 | 0.9336 | -1.0 | 0.9202 | 0.9471 | 0.9781 | 0.956 | 0.8266 | | 70.2867 | 8.0 | 872 | 5.2224 | 0.8726 | 0.9809 | 0.9767 | -1.0 | 0.9582 | 0.9069 | 0.8487 | 0.8966 | 0.6472 | 0.8625 | 0.9265 | 0.9301 | -1.0 | 0.9137 | 0.9464 | 0.9875 | 0.9528 | 0.8232 | | 70.2867 | 9.0 | 981 | 4.7844 | 0.8679 | 0.9821 | 0.9687 | -1.0 | 0.9574 | 0.9023 | 0.8451 | 0.8907 | 0.6382 | 0.8606 | 0.9213 | 0.9283 | -1.0 | 0.9119 | 0.9448 | 0.9844 | 0.952 | 0.8165 | | 4.2466 | 10.0 | 1090 | 5.1437 | 0.8729 | 0.9816 | 0.9762 | -1.0 | 0.9577 | 0.9028 | 0.8448 | 0.901 | 0.6686 | 0.8605 | 0.9296 | 0.9359 | -1.0 | 0.9203 | 0.9514 | 0.9781 | 0.9567 | 0.8413 | | 4.2466 | 11.0 | 1199 | 4.5169 | 0.8858 | 0.9828 | 0.9768 | -1.0 | 0.9707 | 0.9162 | 0.8628 | 0.9087 | 0.6734 | 0.8695 | 0.9264 | 0.931 | -1.0 | 0.9121 | 0.95 | 0.9781 | 0.9538 | 0.823 | | 4.2466 | 12.0 | 1308 | 4.5858 | 0.8813 | 0.9865 | 0.9744 | -1.0 | 0.9623 | 0.9126 | 0.8585 | 0.9041 | 0.6815 | 0.8671 | 0.9308 | 0.9355 | -1.0 | 0.9185 | 0.9526 | 0.9812 | 0.9583 | 0.8308 | | 4.2466 | 13.0 | 1417 | 4.5345 | 0.8778 | 0.9843 | 0.9726 | -1.0 | 0.957 | 0.9101 | 0.8526 | 0.903 | 0.6754 | 0.8628 | 0.9281 | 0.9335 | -1.0 | 0.9158 | 0.9512 | 0.9812 | 0.9557 | 0.8314 | | 3.589 | 14.0 | 1526 | 4.3003 | 0.8885 | 0.9857 | 0.9759 | -1.0 | 0.9656 | 0.9189 | 0.8642 | 0.9128 | 0.6957 | 0.8724 | 0.9334 | 0.9375 | -1.0 | 0.9194 | 0.9555 | 0.9875 | 0.959 | 0.8375 | | 3.589 | 15.0 | 1635 | 4.3999 | 0.8819 | 0.986 | 0.9741 | -1.0 | 0.9606 | 0.9118 | 0.8575 | 0.9064 | 0.6892 | 0.8659 | 0.9283 | 0.9336 | -1.0 | 0.9137 | 0.9534 | 0.9844 | 0.9566 | 0.8245 | | 3.589 | 16.0 | 1744 | 4.2719 | 0.8796 | 0.986 | 0.9726 | -1.0 | 0.9661 | 0.9093 | 0.8543 | 0.905 | 0.6914 | 0.8649 | 0.927 | 0.9313 | -1.0 | 0.9121 | 0.9505 | 0.9875 | 0.9543 | 0.8266 | | 3.589 | 17.0 | 1853 | 4.2497 | 0.8838 | 0.9845 | 0.9733 | -1.0 | 0.9656 | 0.9141 | 0.8599 | 0.9077 | 0.6997 | 0.8678 | 0.9295 | 0.9352 | -1.0 | 0.9141 | 0.9562 | 0.9812 | 0.958 | 0.832 | | 3.589 | 18.0 | 1962 | 4.2807 | 0.8829 | 0.9855 | 0.9754 | -1.0 | 0.9673 | 0.9121 | 0.8558 | 0.9099 | 0.6964 | 0.8683 | 0.9286 | 0.9337 | -1.0 | 0.9126 | 0.9548 | 0.9844 | 0.9555 | 0.8357 | | 3.2442 | 19.0 | 2071 | 4.1978 | 0.8835 | 0.9861 | 0.9748 | -1.0 | 0.9675 | 0.9121 | 0.8559 | 0.911 | 0.6932 | 0.8691 | 0.9272 | 0.9336 | -1.0 | 0.9134 | 0.9538 | 0.9844 | 0.9557 | 0.8337 | | 3.2442 | 20.0 | 2180 | 4.1673 | 0.8829 | 0.9858 | 0.9736 | -1.0 | 0.9689 | 0.9125 | 0.857 | 0.9087 | 0.696 | 0.8686 | 0.9299 | 0.9357 | -1.0 | 0.9169 | 0.9545 | 0.9844 | 0.958 | 0.8354 | ### 
Framework versions - Transformers 4.46.0.dev0 - Pytorch 2.5.0+cu124 - Tokenizers 0.20.1
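The `supervision` package in the example above is only used for NMS and visualisation; the detections themselves come straight from `post_process_object_detection`, which returns one dict of `scores`, `labels` and `boxes` (in `(x_min, y_min, x_max, y_max)` pixel coordinates) per image. A minimal, dependency-light sketch of reading that output directly (same repo id and threshold as the example above; the image path is a placeholder):

<pre>
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo = "Garon16/rtdetr_r50vd_russia_plate_detector"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForObjectDetection.from_pretrained(repo)

image = Image.open("path/to/image")  # placeholder path
inputs = processor(image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Rescale boxes to the original image size and drop low-confidence detections
w, h = image.size
result = processor.post_process_object_detection(
    outputs, target_sizes=[(h, w)], threshold=0.3
)[0]

for score, label_id, box in zip(result["scores"], result["labels"], result["boxes"]):
    x_min, y_min, x_max, y_max = box.tolist()
    print(f"{model.config.id2label[label_id.item()]}: "
          f"score={score.item():.2f}, box=({x_min:.0f}, {y_min:.0f}, {x_max:.0f}, {y_max:.0f})")
</pre>

Each printed label should be either `n_p` or `p_p`, matching the two classes described in the model description.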
[ "car-plates-and-these-types", "n_p", "p_p" ]
doktor47/zinemind_msftall_early
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
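The template above leaves the getting-started section empty. Judging only from the label set published alongside this checkpoint (see below: table, table column, table row, table column header, table projected row header, table spanning cell), it appears to be a table structure recognition detector; a minimal sketch that assumes it follows the standard `transformers` object-detection API (this assumption, the image path and the threshold are illustrative and not confirmed by the card):

<pre>
# Assumption: the checkpoint loads with AutoModelForObjectDetection; verify against the actual config.
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo = "doktor47/zinemind_msftall_early"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForObjectDetection.from_pretrained(repo)

image = Image.open("page.png")  # placeholder: an image of a document page containing a table
inputs = processor(image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

w, h = image.size
result = processor.post_process_object_detection(
    outputs, target_sizes=[(h, w)], threshold=0.5
)[0]

for label_id, box in zip(result["labels"], result["boxes"]):
    print(model.config.id2label[label_id.item()], [round(v) for v in box.tolist()])
</pre>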
[ "table", "table column", "table row", "table column header", "table projected row header", "table spanning cell" ]
joe611/chickens-composite-101818181818-150-epochs-wo-transform
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-composite-101818181818-150-epochs-wo-transform This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2920 - Map: 0.8077 - Map 50: 0.9477 - Map 75: 0.9158 - Map Small: 0.3091 - Map Medium: 0.8073 - Map Large: 0.8068 - Mar 1: 0.3412 - Mar 10: 0.8477 - Mar 100: 0.8503 - Mar Small: 0.3681 - Mar Medium: 0.8532 - Mar Large: 0.8416 - Map Chicken: 0.8113 - Mar 100 Chicken: 0.8536 - Map Duck: 0.7376 - Mar 100 Duck: 0.7884 - Map Plant: 0.8741 - Mar 100 Plant: 0.9087 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 150 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Chicken | Map Duck | Map Large | Map Medium | Map Plant | Map Small | Mar 1 | Mar 10 | Mar 100 | Mar 100 Chicken | Mar 100 Duck | Mar 100 Plant | Mar Large | Mar Medium | Mar Small | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:-----------:|:--------:|:---------:|:----------:|:---------:|:---------:|:------:|:------:|:-------:|:---------------:|:------------:|:-------------:|:---------:|:----------:|:---------:| | 1.4053 | 1.0 | 500 | 1.2477 | 0.2132 | 0.2977 | 0.2494 | 0.0752 | 0.0 | 0.2736 | 0.111 | 0.5644 | 0.0246 | 0.0896 | 0.344 | 0.4277 | 0.5375 | 0.0 | 0.7455 | 0.4384 | 0.3889 | 0.125 | | 1.1026 | 2.0 | 1000 | 0.9474 | 0.3289 | 0.4761 | 0.3739 | 0.2886 | 0.0 | 0.3581 | 0.2865 | 0.6982 | 0.0533 | 0.1172 | 0.4404 | 0.453 | 0.5972 | 0.0 | 0.7617 | 0.4725 | 0.4198 | 0.075 | | 0.9145 | 3.0 | 1500 | 0.8323 | 0.3774 | 0.5429 | 0.4501 | 0.4292 | 0.0 | 0.3887 | 0.3387 | 0.7029 | 0.056 | 0.1275 | 0.4604 | 0.4671 | 0.6444 | 0.0 | 0.7569 | 0.4759 | 0.434 | 0.1417 | | 0.7969 | 4.0 | 2000 | 0.7482 | 0.4014 | 0.5722 | 0.4757 | 0.469 | 0.0 | 0.4236 | 0.3599 | 0.7351 | 0.0509 | 0.1308 | 0.4796 | 0.484 | 0.6673 | 0.0 | 0.7846 | 0.5137 | 0.4489 | 0.1 | | 0.7362 | 5.0 | 2500 | 0.6636 | 0.4105 | 0.5683 | 0.4776 | 0.497 | 0.0 | 0.4218 | 0.37 | 0.7346 | 0.0401 | 0.1347 | 0.4905 | 0.4936 | 0.6996 | 0.0 | 0.7813 | 0.5035 | 0.4576 | 0.1042 | | 0.7851 | 6.0 | 3000 | 0.6610 | 0.4002 | 0.5655 | 0.4674 | 0.4651 | 0.0 | 0.4233 | 0.3654 | 0.7355 | 0.0907 | 0.1327 | 0.4844 | 0.4895 | 0.6871 | 0.0 | 0.7813 | 0.5251 | 0.4573 | 0.125 | | 0.6384 | 7.0 | 3500 | 0.6013 | 0.4291 | 0.5761 | 0.5129 | 0.5187 | 0.0 | 0.4431 | 0.4007 | 0.7686 | 0.1241 | 0.1398 | 0.5048 | 0.5076 | 0.7105 | 0.0 | 0.8123 | 0.5196 | 0.4804 | 0.175 | | 0.6665 | 8.0 | 4000 | 0.5634 | 0.4512 | 0.6017 | 0.5271 | 0.5892 | 0.0 | 0.4484 | 0.4258 | 0.7643 | 0.1577 | 0.1449 | 0.5129 | 0.5176 | 0.7435 | 0.0 | 0.8093 | 0.5259 | 0.4958 | 0.2125 | | 0.6543 | 9.0 | 4500 | 0.5299 | 0.4662 | 0.611 | 0.5561 | 0.6205 | 0.0005 | 0.4742 | 0.4369 | 0.7774 | 0.1541 | 0.146 | 0.5226 | 0.5262 | 0.7516 | 0.0021 | 0.825 | 0.5364 | 0.5003 | 0.1792 | | 0.6318 | 10.0 | 5000 | 0.5264 | 0.4936 | 0.6349 | 0.5793 | 0.6865 
| 0.0113 | 0.5046 | 0.4713 | 0.7831 | 0.1658 | 0.1576 | 0.5431 | 0.5473 | 0.779 | 0.0284 | 0.8343 | 0.5586 | 0.5251 | 0.2167 | | 0.5645 | 11.0 | 5500 | 0.5428 | 0.4775 | 0.6367 | 0.5697 | 0.6568 | 0.0055 | 0.4917 | 0.4478 | 0.7701 | 0.1445 | 0.1543 | 0.5246 | 0.5273 | 0.7411 | 0.0242 | 0.8166 | 0.5456 | 0.4978 | 0.2042 | | 0.5886 | 12.0 | 6000 | 0.5083 | 0.4952 | 0.6519 | 0.5862 | 0.6893 | 0.0087 | 0.5185 | 0.4623 | 0.7875 | 0.1526 | 0.161 | 0.5425 | 0.55 | 0.7633 | 0.0621 | 0.8247 | 0.5662 | 0.5207 | 0.2958 | | 0.5457 | 13.0 | 6500 | 0.4834 | 0.5798 | 0.7671 | 0.7027 | 0.6983 | 0.256 | 0.562 | 0.5678 | 0.7851 | 0.137 | 0.2139 | 0.6144 | 0.6182 | 0.7484 | 0.2779 | 0.8283 | 0.5973 | 0.6102 | 0.2083 | | 0.5674 | 14.0 | 7000 | 0.4656 | 0.6557 | 0.8508 | 0.7891 | 0.7219 | 0.4557 | 0.6289 | 0.649 | 0.7896 | 0.2017 | 0.2701 | 0.6958 | 0.6988 | 0.7685 | 0.4958 | 0.8319 | 0.6629 | 0.6961 | 0.3083 | | 0.5682 | 15.0 | 7500 | 0.4718 | 0.6374 | 0.8625 | 0.7728 | 0.6638 | 0.4603 | 0.6314 | 0.6244 | 0.7882 | 0.1348 | 0.2676 | 0.682 | 0.6845 | 0.7234 | 0.5032 | 0.8268 | 0.6692 | 0.6795 | 0.175 | | 0.4899 | 16.0 | 8000 | 0.4192 | 0.7045 | 0.9058 | 0.841 | 0.7302 | 0.5906 | 0.7083 | 0.6839 | 0.7927 | 0.1918 | 0.2982 | 0.7452 | 0.748 | 0.777 | 0.6316 | 0.8355 | 0.7423 | 0.734 | 0.2333 | | 0.3987 | 17.0 | 8500 | 0.3887 | 0.7215 | 0.9245 | 0.8647 | 0.724 | 0.6257 | 0.7247 | 0.7142 | 0.8149 | 0.2032 | 0.3116 | 0.7643 | 0.7692 | 0.7722 | 0.6811 | 0.8545 | 0.7637 | 0.762 | 0.2861 | | 0.4619 | 18.0 | 9000 | 0.4065 | 0.7078 | 0.9116 | 0.8648 | 0.729 | 0.5958 | 0.7406 | 0.6949 | 0.7987 | 0.1958 | 0.302 | 0.752 | 0.7545 | 0.777 | 0.6453 | 0.8413 | 0.777 | 0.7438 | 0.2458 | | 0.4152 | 19.0 | 9500 | 0.4048 | 0.7007 | 0.9187 | 0.8475 | 0.7139 | 0.5803 | 0.7123 | 0.6925 | 0.8079 | 0.2193 | 0.2994 | 0.7487 | 0.7523 | 0.7645 | 0.6442 | 0.8482 | 0.7519 | 0.7481 | 0.2875 | | 0.433 | 20.0 | 10000 | 0.3881 | 0.7122 | 0.8942 | 0.8579 | 0.7317 | 0.5846 | 0.6779 | 0.7154 | 0.8203 | 0.23 | 0.2993 | 0.7515 | 0.7553 | 0.7855 | 0.6221 | 0.8584 | 0.7165 | 0.7616 | 0.2958 | | 0.4217 | 21.0 | 10500 | 0.3993 | 0.7098 | 0.9189 | 0.854 | 0.7091 | 0.6125 | 0.6944 | 0.7001 | 0.8078 | 0.1268 | 0.3026 | 0.7515 | 0.755 | 0.7621 | 0.6526 | 0.8503 | 0.7351 | 0.7505 | 0.1875 | | 0.437 | 22.0 | 11000 | 0.3756 | 0.7397 | 0.9455 | 0.8872 | 0.7461 | 0.663 | 0.7388 | 0.7359 | 0.8099 | 0.3 | 0.3132 | 0.7856 | 0.7882 | 0.7839 | 0.7284 | 0.8524 | 0.7758 | 0.7878 | 0.3722 | | 0.3823 | 23.0 | 11500 | 0.3662 | 0.7413 | 0.9302 | 0.8634 | 0.7438 | 0.6558 | 0.7313 | 0.7434 | 0.8243 | 0.2147 | 0.3125 | 0.7779 | 0.7815 | 0.7823 | 0.6979 | 0.8645 | 0.768 | 0.7838 | 0.2917 | | 0.4099 | 24.0 | 12000 | 0.3782 | 0.7106 | 0.9135 | 0.837 | 0.7338 | 0.5835 | 0.7303 | 0.7055 | 0.8144 | 0.17 | 0.3035 | 0.7523 | 0.756 | 0.7823 | 0.6368 | 0.8488 | 0.7634 | 0.7529 | 0.2208 | | 0.4394 | 25.0 | 12500 | 0.3560 | 0.7486 | 0.9406 | 0.9005 | 0.7494 | 0.6785 | 0.7588 | 0.7379 | 0.818 | 0.3099 | 0.3224 | 0.7901 | 0.7924 | 0.796 | 0.7242 | 0.8569 | 0.7922 | 0.7844 | 0.3792 | | 0.4073 | 26.0 | 13000 | 0.3494 | 0.7445 | 0.9451 | 0.8774 | 0.7541 | 0.6621 | 0.7622 | 0.7349 | 0.8174 | 0.2726 | 0.3184 | 0.7848 | 0.7883 | 0.7944 | 0.7126 | 0.8578 | 0.8 | 0.7812 | 0.3361 | | 0.3584 | 27.0 | 13500 | 0.3502 | 0.7466 | 0.929 | 0.8916 | 0.7425 | 0.6693 | 0.7625 | 0.7331 | 0.8279 | 0.1946 | 0.3176 | 0.7842 | 0.7892 | 0.7831 | 0.7168 | 0.8678 | 0.7998 | 0.7816 | 0.2583 | | 0.3832 | 28.0 | 14000 | 0.3387 | 0.7665 | 0.9475 | 0.8894 | 0.7693 | 0.7016 | 0.7768 | 0.7563 | 0.8287 | 0.256 | 0.3242 | 
0.8051 | 0.8091 | 0.8069 | 0.7516 | 0.869 | 0.8141 | 0.8013 | 0.3611 | | 0.4091 | 29.0 | 14500 | 0.3726 | 0.7636 | 0.9538 | 0.8979 | 0.7684 | 0.6981 | 0.7624 | 0.7576 | 0.8243 | 0.3196 | 0.3197 | 0.7993 | 0.8036 | 0.8048 | 0.7442 | 0.8617 | 0.7988 | 0.7974 | 0.4361 | | 0.3381 | 30.0 | 15000 | 0.3592 | 0.7447 | 0.9546 | 0.9023 | 0.7316 | 0.6702 | 0.7521 | 0.7442 | 0.8323 | 0.3173 | 0.3134 | 0.7879 | 0.7908 | 0.7742 | 0.7263 | 0.872 | 0.7925 | 0.7903 | 0.3958 | | 0.3368 | 31.0 | 15500 | 0.3480 | 0.7512 | 0.9404 | 0.87 | 0.7578 | 0.6736 | 0.7649 | 0.7399 | 0.8222 | 0.3136 | 0.323 | 0.7971 | 0.8015 | 0.7996 | 0.7379 | 0.8669 | 0.8056 | 0.7943 | 0.425 | | 0.3684 | 32.0 | 16000 | 0.3740 | 0.7176 | 0.9117 | 0.841 | 0.7508 | 0.5789 | 0.7257 | 0.7114 | 0.8233 | 0.2731 | 0.3027 | 0.7582 | 0.7646 | 0.7988 | 0.6295 | 0.8657 | 0.7583 | 0.7638 | 0.3708 | | 0.3512 | 33.0 | 16500 | 0.3633 | 0.746 | 0.9453 | 0.8793 | 0.7502 | 0.6716 | 0.7449 | 0.7318 | 0.8162 | 0.2404 | 0.3147 | 0.783 | 0.7873 | 0.7875 | 0.7158 | 0.8587 | 0.7851 | 0.7773 | 0.2931 | | 0.339 | 34.0 | 17000 | 0.3639 | 0.7315 | 0.9132 | 0.8556 | 0.7576 | 0.6165 | 0.7388 | 0.7248 | 0.8204 | 0.191 | 0.3135 | 0.7729 | 0.7766 | 0.7996 | 0.6674 | 0.863 | 0.7715 | 0.7757 | 0.2681 | | 0.3473 | 35.0 | 17500 | 0.3505 | 0.7395 | 0.9401 | 0.8612 | 0.7276 | 0.6659 | 0.7272 | 0.7386 | 0.8248 | 0.3197 | 0.3182 | 0.7808 | 0.7871 | 0.7815 | 0.7116 | 0.8684 | 0.7637 | 0.7899 | 0.4139 | | 0.3245 | 36.0 | 18000 | 0.3572 | 0.7362 | 0.941 | 0.869 | 0.7426 | 0.6396 | 0.7436 | 0.725 | 0.8263 | 0.2533 | 0.3179 | 0.7794 | 0.7844 | 0.798 | 0.6926 | 0.8627 | 0.788 | 0.7784 | 0.3181 | | 0.3034 | 37.0 | 18500 | 0.3292 | 0.7672 | 0.9507 | 0.9006 | 0.7605 | 0.7016 | 0.7688 | 0.7618 | 0.8394 | 0.3576 | 0.3269 | 0.8055 | 0.8101 | 0.804 | 0.7453 | 0.881 | 0.8074 | 0.8081 | 0.4722 | | 0.3034 | 38.0 | 19000 | 0.3557 | 0.752 | 0.9405 | 0.8849 | 0.7566 | 0.6644 | 0.7595 | 0.7431 | 0.8351 | 0.1968 | 0.323 | 0.7902 | 0.7963 | 0.804 | 0.7147 | 0.8702 | 0.7997 | 0.7888 | 0.2806 | | 0.3203 | 39.0 | 19500 | 0.3412 | 0.7583 | 0.936 | 0.881 | 0.7689 | 0.6778 | 0.7761 | 0.7493 | 0.8282 | 0.238 | 0.3233 | 0.8044 | 0.8074 | 0.8137 | 0.7389 | 0.8696 | 0.8151 | 0.802 | 0.3167 | | 0.3294 | 40.0 | 20000 | 0.3277 | 0.7661 | 0.9502 | 0.9104 | 0.7762 | 0.6952 | 0.7706 | 0.7589 | 0.8268 | 0.2645 | 0.3239 | 0.8081 | 0.8143 | 0.8234 | 0.7505 | 0.869 | 0.8173 | 0.8087 | 0.3597 | | 0.3371 | 41.0 | 20500 | 0.3199 | 0.7796 | 0.9531 | 0.9052 | 0.7818 | 0.7261 | 0.7832 | 0.7775 | 0.8309 | 0.2026 | 0.3321 | 0.8172 | 0.8244 | 0.8262 | 0.7779 | 0.869 | 0.8192 | 0.8216 | 0.3097 | | 0.3152 | 42.0 | 21000 | 0.3230 | 0.7752 | 0.9505 | 0.9004 | 0.777 | 0.7122 | 0.7646 | 0.7776 | 0.8363 | 0.2606 | 0.3261 | 0.8152 | 0.8197 | 0.821 | 0.76 | 0.878 | 0.8078 | 0.8211 | 0.3556 | | 0.2923 | 43.0 | 21500 | 0.3489 | 0.7628 | 0.9461 | 0.9041 | 0.7626 | 0.6946 | 0.7674 | 0.756 | 0.8311 | 0.1877 | 0.3223 | 0.8015 | 0.8074 | 0.8056 | 0.7495 | 0.8672 | 0.8105 | 0.8008 | 0.2806 | | 0.2911 | 44.0 | 22000 | 0.3462 | 0.7604 | 0.9551 | 0.902 | 0.7409 | 0.704 | 0.7573 | 0.7514 | 0.8362 | 0.2793 | 0.3231 | 0.7995 | 0.8035 | 0.7847 | 0.7547 | 0.8711 | 0.8025 | 0.7941 | 0.3542 | | 0.3483 | 45.0 | 22500 | 0.3127 | 0.7808 | 0.949 | 0.9075 | 0.7884 | 0.7041 | 0.7737 | 0.781 | 0.8497 | 0.1944 | 0.3301 | 0.8204 | 0.8267 | 0.827 | 0.7716 | 0.8816 | 0.8127 | 0.8258 | 0.3458 | | 0.2995 | 46.0 | 23000 | 0.3279 | 0.7731 | 0.945 | 0.9019 | 0.7756 | 0.7061 | 0.7637 | 0.773 | 0.8376 | 0.3203 | 0.3282 | 0.8118 | 0.8157 | 0.8181 | 0.7537 | 0.8753 | 
0.8122 | 0.8145 | 0.3875 | | 0.3051 | 47.0 | 23500 | 0.3230 | 0.7755 | 0.9534 | 0.8973 | 0.7728 | 0.7064 | 0.7557 | 0.7788 | 0.8471 | 0.3002 | 0.326 | 0.8161 | 0.8203 | 0.8141 | 0.7642 | 0.8825 | 0.8009 | 0.8228 | 0.3903 | | 0.3151 | 48.0 | 24000 | 0.3249 | 0.7727 | 0.9556 | 0.8934 | 0.7583 | 0.7109 | 0.758 | 0.7667 | 0.8488 | 0.2545 | 0.3243 | 0.8145 | 0.8188 | 0.8052 | 0.7695 | 0.8816 | 0.8023 | 0.8114 | 0.3583 | | 0.2858 | 49.0 | 24500 | 0.3337 | 0.7602 | 0.9435 | 0.8897 | 0.7603 | 0.6782 | 0.7742 | 0.7478 | 0.8421 | 0.2511 | 0.3254 | 0.801 | 0.8056 | 0.8077 | 0.7295 | 0.8795 | 0.8134 | 0.7968 | 0.3528 | | 0.2805 | 50.0 | 25000 | 0.3290 | 0.771 | 0.9535 | 0.8821 | 0.7626 | 0.6968 | 0.7587 | 0.7754 | 0.8534 | 0.25 | 0.3266 | 0.8125 | 0.8168 | 0.8105 | 0.7526 | 0.8873 | 0.8029 | 0.8227 | 0.3403 | | 0.3061 | 51.0 | 25500 | 0.3256 | 0.7722 | 0.9437 | 0.8933 | 0.7766 | 0.6889 | 0.7676 | 0.7686 | 0.8511 | 0.2907 | 0.3235 | 0.8103 | 0.8128 | 0.819 | 0.7347 | 0.8846 | 0.8068 | 0.8132 | 0.3458 | | 0.3193 | 52.0 | 26000 | 0.3298 | 0.7714 | 0.9492 | 0.902 | 0.7804 | 0.695 | 0.7758 | 0.7642 | 0.8389 | 0.2703 | 0.3251 | 0.8076 | 0.8107 | 0.8218 | 0.7368 | 0.8735 | 0.8187 | 0.8049 | 0.3375 | | 0.2874 | 53.0 | 26500 | 0.3233 | 0.7678 | 0.9355 | 0.8823 | 0.7867 | 0.6681 | 0.7805 | 0.762 | 0.8485 | 0.2712 | 0.3252 | 0.808 | 0.811 | 0.8266 | 0.7232 | 0.8831 | 0.8179 | 0.8097 | 0.3347 | | 0.274 | 54.0 | 27000 | 0.3293 | 0.7717 | 0.9471 | 0.9046 | 0.764 | 0.7013 | 0.7675 | 0.7685 | 0.8498 | 0.2921 | 0.3274 | 0.8088 | 0.8121 | 0.8109 | 0.7442 | 0.8813 | 0.8026 | 0.8127 | 0.3736 | | 0.3281 | 55.0 | 27500 | 0.3472 | 0.7735 | 0.9504 | 0.9009 | 0.7682 | 0.6983 | 0.7653 | 0.7668 | 0.8541 | 0.3038 | 0.3281 | 0.8107 | 0.8139 | 0.8105 | 0.7453 | 0.8858 | 0.8027 | 0.8109 | 0.3889 | | 0.2792 | 56.0 | 28000 | 0.3254 | 0.7822 | 0.9473 | 0.9008 | 0.7724 | 0.7218 | 0.788 | 0.7787 | 0.8523 | 0.2316 | 0.3338 | 0.8169 | 0.8217 | 0.8185 | 0.7621 | 0.8843 | 0.8252 | 0.8197 | 0.3208 | | 0.2993 | 57.0 | 28500 | 0.3367 | 0.7803 | 0.9461 | 0.9146 | 0.7651 | 0.7266 | 0.7959 | 0.7758 | 0.8493 | 0.3025 | 0.3289 | 0.8147 | 0.8221 | 0.8121 | 0.7705 | 0.8837 | 0.8277 | 0.8184 | 0.4361 | | 0.3099 | 58.0 | 29000 | 0.3266 | 0.7741 | 0.9413 | 0.9049 | 0.7669 | 0.6962 | 0.7867 | 0.7667 | 0.859 | 0.2785 | 0.3255 | 0.8114 | 0.8182 | 0.8157 | 0.7495 | 0.8895 | 0.8228 | 0.8111 | 0.3903 | | 0.3006 | 59.0 | 29500 | 0.3277 | 0.7811 | 0.9463 | 0.8999 | 0.767 | 0.7194 | 0.7902 | 0.7779 | 0.8569 | 0.2351 | 0.3316 | 0.8173 | 0.8203 | 0.8052 | 0.7674 | 0.8883 | 0.8225 | 0.819 | 0.3264 | | 0.293 | 60.0 | 30000 | 0.3289 | 0.775 | 0.9453 | 0.9004 | 0.7754 | 0.7093 | 0.7913 | 0.7725 | 0.8403 | 0.3003 | 0.3296 | 0.8108 | 0.8155 | 0.8214 | 0.7463 | 0.8789 | 0.8294 | 0.8104 | 0.4042 | | 0.293 | 61.0 | 30500 | 0.3381 | 0.7689 | 0.944 | 0.9 | 0.7733 | 0.6846 | 0.7807 | 0.7619 | 0.8488 | 0.2801 | 0.3265 | 0.8021 | 0.8097 | 0.8121 | 0.7337 | 0.8834 | 0.8088 | 0.8084 | 0.3472 | | 0.3054 | 62.0 | 31000 | 0.3289 | 0.7743 | 0.9535 | 0.8912 | 0.7672 | 0.7019 | 0.7817 | 0.7676 | 0.8538 | 0.3075 | 0.3307 | 0.8142 | 0.8184 | 0.8125 | 0.7568 | 0.8858 | 0.8138 | 0.8166 | 0.3931 | | 0.2988 | 63.0 | 31500 | 0.3311 | 0.7717 | 0.9403 | 0.8903 | 0.7761 | 0.6895 | 0.7881 | 0.7662 | 0.8496 | 0.3147 | 0.332 | 0.8132 | 0.8155 | 0.8177 | 0.7442 | 0.8846 | 0.8234 | 0.8121 | 0.3958 | | 0.292 | 64.0 | 32000 | 0.3342 | 0.7766 | 0.9481 | 0.9092 | 0.7738 | 0.7078 | 0.7886 | 0.7735 | 0.8482 | 0.3001 | 0.3296 | 0.8156 | 0.8196 | 0.8181 | 0.7579 | 0.8828 | 0.8229 | 0.8182 | 0.3792 | | 0.2722 
| 65.0 | 32500 | 0.3164 | 0.7786 | 0.9492 | 0.9022 | 0.7827 | 0.6865 | 0.7771 | 0.7759 | 0.8665 | 0.3049 | 0.3279 | 0.8165 | 0.8227 | 0.8306 | 0.7411 | 0.8964 | 0.8132 | 0.8217 | 0.3972 | | 0.2741 | 66.0 | 33000 | 0.3280 | 0.7785 | 0.9444 | 0.8962 | 0.7717 | 0.7064 | 0.7973 | 0.7746 | 0.8573 | 0.2521 | 0.3303 | 0.8219 | 0.8262 | 0.8202 | 0.7642 | 0.8943 | 0.8314 | 0.8253 | 0.3528 | | 0.2583 | 67.0 | 33500 | 0.3298 | 0.7684 | 0.9435 | 0.911 | 0.7665 | 0.6877 | 0.7855 | 0.7589 | 0.8511 | 0.2861 | 0.3227 | 0.8092 | 0.8128 | 0.8153 | 0.7368 | 0.8861 | 0.8197 | 0.8066 | 0.3792 | | 0.2687 | 68.0 | 34000 | 0.3116 | 0.7854 | 0.9422 | 0.9099 | 0.7809 | 0.7136 | 0.7954 | 0.7815 | 0.8617 | 0.3415 | 0.3333 | 0.8256 | 0.8306 | 0.8246 | 0.7705 | 0.8967 | 0.8299 | 0.8304 | 0.4347 | | 0.2551 | 69.0 | 34500 | 0.3138 | 0.7862 | 0.9438 | 0.9003 | 0.7851 | 0.7196 | 0.7939 | 0.7841 | 0.8538 | 0.252 | 0.3329 | 0.8226 | 0.8268 | 0.8335 | 0.7589 | 0.888 | 0.8324 | 0.8257 | 0.3347 | | 0.2599 | 70.0 | 35000 | 0.3066 | 0.7808 | 0.9454 | 0.9049 | 0.7823 | 0.7096 | 0.7951 | 0.7726 | 0.8506 | 0.2996 | 0.3309 | 0.8205 | 0.8238 | 0.8274 | 0.7589 | 0.8849 | 0.8362 | 0.8197 | 0.3528 | | 0.2557 | 71.0 | 35500 | 0.3199 | 0.7818 | 0.9501 | 0.9122 | 0.7759 | 0.7061 | 0.7918 | 0.7726 | 0.8635 | 0.2611 | 0.3298 | 0.8214 | 0.8246 | 0.8202 | 0.76 | 0.8937 | 0.8264 | 0.8182 | 0.3569 | | 0.2659 | 72.0 | 36000 | 0.3060 | 0.7937 | 0.9493 | 0.9115 | 0.7909 | 0.7209 | 0.7933 | 0.7942 | 0.8693 | 0.3496 | 0.3357 | 0.8295 | 0.8333 | 0.8331 | 0.7663 | 0.9006 | 0.8241 | 0.8341 | 0.4222 | | 0.2815 | 73.0 | 36500 | 0.3145 | 0.783 | 0.9491 | 0.9155 | 0.7768 | 0.7141 | 0.7839 | 0.7768 | 0.8579 | 0.3012 | 0.329 | 0.8165 | 0.8214 | 0.819 | 0.7568 | 0.8886 | 0.8181 | 0.8173 | 0.3792 | | 0.2303 | 74.0 | 37000 | 0.3187 | 0.7823 | 0.9535 | 0.9207 | 0.7765 | 0.7248 | 0.7897 | 0.7723 | 0.8457 | 0.3227 | 0.3288 | 0.8211 | 0.8247 | 0.8246 | 0.7684 | 0.881 | 0.8234 | 0.8153 | 0.4236 | | 0.2871 | 75.0 | 37500 | 0.3217 | 0.7787 | 0.9401 | 0.906 | 0.771 | 0.7157 | 0.7772 | 0.7782 | 0.8493 | 0.3092 | 0.3292 | 0.8229 | 0.8263 | 0.8214 | 0.7684 | 0.8892 | 0.8157 | 0.8304 | 0.4028 | | 0.2432 | 76.0 | 38000 | 0.3139 | 0.7759 | 0.94 | 0.9042 | 0.7785 | 0.6999 | 0.7796 | 0.7728 | 0.8491 | 0.2778 | 0.3292 | 0.8176 | 0.8212 | 0.8234 | 0.7516 | 0.8886 | 0.8145 | 0.8236 | 0.3361 | | 0.2781 | 77.0 | 38500 | 0.3370 | 0.7739 | 0.9349 | 0.8953 | 0.7758 | 0.6934 | 0.7756 | 0.7738 | 0.8524 | 0.3185 | 0.3297 | 0.816 | 0.8184 | 0.8214 | 0.7421 | 0.8919 | 0.8123 | 0.8188 | 0.3875 | | 0.2527 | 78.0 | 39000 | 0.3022 | 0.7879 | 0.9393 | 0.9096 | 0.7889 | 0.7108 | 0.7881 | 0.7877 | 0.8641 | 0.2717 | 0.3349 | 0.8261 | 0.8289 | 0.8355 | 0.7537 | 0.8976 | 0.8242 | 0.8304 | 0.3542 | | 0.2705 | 79.0 | 39500 | 0.3070 | 0.7803 | 0.9486 | 0.9085 | 0.2776 | 0.7774 | 0.7799 | 0.3326 | 0.8234 | 0.8259 | 0.3333 | 0.8255 | 0.8194 | 0.7755 | 0.823 | 0.7104 | 0.7621 | 0.855 | 0.8925 | | 0.2275 | 80.0 | 40000 | 0.3107 | 0.7739 | 0.9315 | 0.8937 | 0.2574 | 0.7743 | 0.7726 | 0.3256 | 0.815 | 0.8186 | 0.3278 | 0.8189 | 0.8164 | 0.7871 | 0.8335 | 0.6775 | 0.7284 | 0.8573 | 0.894 | | 0.2616 | 81.0 | 40500 | 0.3109 | 0.7855 | 0.941 | 0.9013 | 0.3217 | 0.7877 | 0.7872 | 0.333 | 0.8254 | 0.8286 | 0.3903 | 0.8308 | 0.825 | 0.7883 | 0.8294 | 0.7026 | 0.7558 | 0.8655 | 0.9006 | | 0.2355 | 82.0 | 41000 | 0.3161 | 0.7823 | 0.9425 | 0.8978 | 0.2979 | 0.7793 | 0.78 | 0.3332 | 0.8228 | 0.8274 | 0.3875 | 0.827 | 0.8129 | 0.792 | 0.8351 | 0.6955 | 0.7505 | 0.8594 | 0.8967 | | 0.252 | 83.0 | 41500 | 0.3297 | 0.7764 | 
0.9371 | 0.9031 | 0.2282 | 0.7748 | 0.776 | 0.3291 | 0.8189 | 0.8222 | 0.3236 | 0.8247 | 0.8133 | 0.7746 | 0.8218 | 0.6971 | 0.7495 | 0.8577 | 0.8955 | | 0.2559 | 84.0 | 42000 | 0.3112 | 0.7837 | 0.931 | 0.8905 | 0.2496 | 0.7793 | 0.788 | 0.3368 | 0.8241 | 0.8267 | 0.3083 | 0.8267 | 0.8234 | 0.7946 | 0.8343 | 0.6957 | 0.7495 | 0.8607 | 0.8964 | | 0.23 | 85.0 | 42500 | 0.3214 | 0.7768 | 0.9357 | 0.8973 | 0.2582 | 0.7721 | 0.7927 | 0.3265 | 0.8163 | 0.8201 | 0.3583 | 0.8189 | 0.8206 | 0.7768 | 0.8242 | 0.6963 | 0.74 | 0.8574 | 0.8961 | | 0.211 | 86.0 | 43000 | 0.3098 | 0.7871 | 0.9396 | 0.901 | 0.3 | 0.7813 | 0.7916 | 0.3318 | 0.8257 | 0.8289 | 0.3764 | 0.8275 | 0.8251 | 0.7872 | 0.8327 | 0.7158 | 0.7589 | 0.8583 | 0.8952 | | 0.2234 | 87.0 | 43500 | 0.2975 | 0.8018 | 0.9482 | 0.9091 | 0.3233 | 0.7983 | 0.805 | 0.3387 | 0.8387 | 0.8425 | 0.4181 | 0.8408 | 0.8362 | 0.8003 | 0.8403 | 0.7399 | 0.7853 | 0.8652 | 0.9018 | | 0.2361 | 88.0 | 44000 | 0.3144 | 0.7859 | 0.9288 | 0.8954 | 0.2457 | 0.7859 | 0.7829 | 0.33 | 0.8247 | 0.8282 | 0.3167 | 0.8301 | 0.8197 | 0.8026 | 0.8464 | 0.6983 | 0.7442 | 0.8568 | 0.894 | | 0.2305 | 89.0 | 44500 | 0.3158 | 0.8005 | 0.9495 | 0.9093 | 0.2671 | 0.8012 | 0.79 | 0.3364 | 0.8388 | 0.8416 | 0.3361 | 0.8459 | 0.8235 | 0.7997 | 0.8395 | 0.7393 | 0.7874 | 0.8626 | 0.8979 | | 0.2235 | 90.0 | 45000 | 0.3080 | 0.7924 | 0.9411 | 0.9083 | 0.2627 | 0.7869 | 0.7957 | 0.3358 | 0.8323 | 0.8366 | 0.3611 | 0.837 | 0.8321 | 0.7875 | 0.8331 | 0.7227 | 0.7737 | 0.867 | 0.903 | | 0.2371 | 91.0 | 45500 | 0.3000 | 0.7932 | 0.9547 | 0.9102 | 0.307 | 0.7927 | 0.7888 | 0.3305 | 0.834 | 0.838 | 0.3736 | 0.8412 | 0.8265 | 0.7979 | 0.8351 | 0.7177 | 0.7779 | 0.864 | 0.9009 | | 0.2136 | 92.0 | 46000 | 0.3052 | 0.7917 | 0.9531 | 0.9044 | 0.3165 | 0.7878 | 0.7869 | 0.3339 | 0.8328 | 0.8356 | 0.4056 | 0.8368 | 0.8202 | 0.7917 | 0.8343 | 0.7304 | 0.7789 | 0.8529 | 0.8937 | | 0.2609 | 93.0 | 46500 | 0.3135 | 0.7798 | 0.933 | 0.8879 | 0.3041 | 0.7746 | 0.7992 | 0.3315 | 0.8231 | 0.8266 | 0.3569 | 0.8259 | 0.8365 | 0.7878 | 0.8343 | 0.6877 | 0.7442 | 0.864 | 0.9012 | | 0.2258 | 94.0 | 47000 | 0.2984 | 0.7982 | 0.9455 | 0.9074 | 0.308 | 0.7923 | 0.7977 | 0.341 | 0.839 | 0.8418 | 0.3847 | 0.8413 | 0.8305 | 0.802 | 0.8464 | 0.7331 | 0.7832 | 0.8595 | 0.8958 | | 0.2249 | 95.0 | 47500 | 0.3127 | 0.7872 | 0.9452 | 0.9067 | 0.2976 | 0.7879 | 0.7895 | 0.3353 | 0.8316 | 0.8349 | 0.375 | 0.8363 | 0.8277 | 0.7907 | 0.8375 | 0.7184 | 0.7737 | 0.8527 | 0.8934 | | 0.2353 | 96.0 | 48000 | 0.3098 | 0.7896 | 0.9428 | 0.9046 | 0.2852 | 0.7846 | 0.7905 | 0.3345 | 0.8299 | 0.8338 | 0.3681 | 0.834 | 0.8252 | 0.7922 | 0.8375 | 0.717 | 0.7663 | 0.8596 | 0.8976 | | 0.2361 | 97.0 | 48500 | 0.3121 | 0.7897 | 0.9378 | 0.8908 | 0.2774 | 0.7896 | 0.7944 | 0.3377 | 0.8317 | 0.835 | 0.3486 | 0.8377 | 0.8316 | 0.7923 | 0.8379 | 0.7171 | 0.7674 | 0.8596 | 0.8997 | | 0.2273 | 98.0 | 49000 | 0.2956 | 0.7965 | 0.9503 | 0.903 | 0.3052 | 0.7928 | 0.7924 | 0.338 | 0.8425 | 0.8451 | 0.4028 | 0.8467 | 0.8306 | 0.7937 | 0.8419 | 0.7282 | 0.7905 | 0.8676 | 0.9027 | | 0.2461 | 99.0 | 49500 | 0.3038 | 0.7906 | 0.9376 | 0.8969 | 0.3125 | 0.7844 | 0.796 | 0.3367 | 0.8322 | 0.8346 | 0.3889 | 0.8342 | 0.8332 | 0.7937 | 0.8399 | 0.718 | 0.7653 | 0.8603 | 0.8985 | | 0.2195 | 100.0 | 50000 | 0.2938 | 0.7953 | 0.9426 | 0.8928 | 0.3242 | 0.7928 | 0.7988 | 0.3396 | 0.8379 | 0.8402 | 0.4069 | 0.8414 | 0.8343 | 0.8022 | 0.8472 | 0.7161 | 0.7695 | 0.8676 | 0.9039 | | 0.2093 | 101.0 | 50500 | 0.3043 | 0.7953 | 0.9499 | 0.9023 | 0.3206 | 0.7908 | 0.7915 | 
0.3373 | 0.837 | 0.8409 | 0.4042 | 0.8421 | 0.8302 | 0.7931 | 0.8415 | 0.7238 | 0.7779 | 0.869 | 0.9033 | | 0.2161 | 102.0 | 51000 | 0.3034 | 0.7945 | 0.9405 | 0.9057 | 0.3314 | 0.79 | 0.8063 | 0.3401 | 0.8365 | 0.8393 | 0.4139 | 0.8369 | 0.8437 | 0.7972 | 0.846 | 0.724 | 0.7705 | 0.8624 | 0.9015 | | 0.219 | 103.0 | 51500 | 0.2984 | 0.7964 | 0.9511 | 0.9086 | 0.3009 | 0.7991 | 0.7869 | 0.338 | 0.8375 | 0.842 | 0.3792 | 0.8462 | 0.8272 | 0.7948 | 0.8435 | 0.7312 | 0.7821 | 0.8631 | 0.9003 | | 0.2472 | 104.0 | 52000 | 0.3095 | 0.7956 | 0.9511 | 0.9115 | 0.2932 | 0.7917 | 0.8018 | 0.3359 | 0.8359 | 0.8397 | 0.4 | 0.8382 | 0.8348 | 0.7952 | 0.8383 | 0.7274 | 0.78 | 0.8644 | 0.9009 | | 0.2457 | 105.0 | 52500 | 0.3061 | 0.7929 | 0.9371 | 0.9027 | 0.2809 | 0.784 | 0.8069 | 0.3377 | 0.8327 | 0.8359 | 0.3417 | 0.8316 | 0.8457 | 0.8015 | 0.8448 | 0.7138 | 0.7653 | 0.8634 | 0.8976 | | 0.2017 | 106.0 | 53000 | 0.3000 | 0.8017 | 0.9475 | 0.9114 | 0.2991 | 0.7983 | 0.8011 | 0.3372 | 0.8439 | 0.8466 | 0.3764 | 0.8475 | 0.8398 | 0.8056 | 0.8496 | 0.7309 | 0.7863 | 0.8687 | 0.9039 | | 0.2055 | 107.0 | 53500 | 0.2946 | 0.8083 | 0.9481 | 0.9106 | 0.2911 | 0.8083 | 0.7989 | 0.3434 | 0.8477 | 0.8513 | 0.3667 | 0.8562 | 0.8344 | 0.8141 | 0.8536 | 0.7396 | 0.7947 | 0.8713 | 0.9054 | | 0.221 | 108.0 | 54000 | 0.2895 | 0.8036 | 0.9508 | 0.9127 | 0.3249 | 0.8017 | 0.7996 | 0.3407 | 0.8445 | 0.8478 | 0.4083 | 0.849 | 0.8383 | 0.8034 | 0.8464 | 0.7377 | 0.7937 | 0.8696 | 0.9033 | | 0.2018 | 109.0 | 54500 | 0.2988 | 0.8 | 0.9398 | 0.9075 | 0.265 | 0.7982 | 0.8017 | 0.3413 | 0.8414 | 0.8438 | 0.3292 | 0.8465 | 0.8359 | 0.8103 | 0.8516 | 0.7241 | 0.78 | 0.8656 | 0.8997 | | 0.2116 | 110.0 | 55000 | 0.2982 | 0.8012 | 0.9461 | 0.9134 | 0.3103 | 0.7977 | 0.8061 | 0.3412 | 0.843 | 0.8453 | 0.3722 | 0.8487 | 0.8362 | 0.8007 | 0.8419 | 0.7373 | 0.7937 | 0.8655 | 0.9003 | | 0.2151 | 111.0 | 55500 | 0.2987 | 0.8005 | 0.9404 | 0.8995 | 0.2806 | 0.799 | 0.8037 | 0.3428 | 0.8413 | 0.8449 | 0.3514 | 0.8491 | 0.837 | 0.8043 | 0.85 | 0.7294 | 0.7842 | 0.8676 | 0.9006 | | 0.2168 | 112.0 | 56000 | 0.2926 | 0.8076 | 0.9475 | 0.9101 | 0.2939 | 0.804 | 0.8077 | 0.3404 | 0.8471 | 0.8506 | 0.3681 | 0.8533 | 0.8372 | 0.808 | 0.8476 | 0.7416 | 0.7968 | 0.8732 | 0.9072 | | 0.2245 | 113.0 | 56500 | 0.2975 | 0.7991 | 0.9446 | 0.9068 | 0.2872 | 0.7977 | 0.797 | 0.335 | 0.8404 | 0.8435 | 0.3458 | 0.8469 | 0.8342 | 0.8001 | 0.8423 | 0.7274 | 0.7832 | 0.8698 | 0.9051 | | 0.235 | 114.0 | 57000 | 0.2973 | 0.8075 | 0.9449 | 0.908 | 0.3041 | 0.8058 | 0.8076 | 0.3413 | 0.8483 | 0.8517 | 0.375 | 0.8554 | 0.8403 | 0.8073 | 0.8512 | 0.7428 | 0.7968 | 0.8724 | 0.9069 | | 0.2008 | 115.0 | 57500 | 0.3147 | 0.8024 | 0.9466 | 0.9154 | 0.298 | 0.7987 | 0.8062 | 0.3396 | 0.8438 | 0.8464 | 0.3611 | 0.847 | 0.8412 | 0.7963 | 0.8423 | 0.7436 | 0.7958 | 0.8671 | 0.9012 | | 0.2225 | 116.0 | 58000 | 0.2958 | 0.802 | 0.9409 | 0.9039 | 0.2781 | 0.8002 | 0.8078 | 0.3399 | 0.8459 | 0.8488 | 0.3444 | 0.8512 | 0.8418 | 0.8009 | 0.8464 | 0.7359 | 0.7958 | 0.8691 | 0.9042 | | 0.1969 | 117.0 | 58500 | 0.2989 | 0.8062 | 0.9446 | 0.9094 | 0.2932 | 0.8073 | 0.8038 | 0.34 | 0.846 | 0.8489 | 0.3639 | 0.8523 | 0.841 | 0.8083 | 0.8528 | 0.74 | 0.7895 | 0.8702 | 0.9045 | | 0.2242 | 118.0 | 59000 | 0.2910 | 0.8075 | 0.9496 | 0.9127 | 0.2838 | 0.805 | 0.8004 | 0.3422 | 0.8481 | 0.8508 | 0.3597 | 0.8513 | 0.8397 | 0.8093 | 0.854 | 0.7413 | 0.7937 | 0.8719 | 0.9048 | | 0.2193 | 119.0 | 59500 | 0.2938 | 0.8062 | 0.9437 | 0.9114 | 0.2695 | 0.8073 | 0.8067 | 0.3416 | 0.848 | 0.8511 | 0.3403 | 
0.8555 | 0.843 | 0.8109 | 0.8548 | 0.7422 | 0.7968 | 0.8656 | 0.9015 | | 0.1929 | 120.0 | 60000 | 0.2947 | 0.8036 | 0.9466 | 0.9071 | 0.2826 | 0.8013 | 0.8005 | 0.3405 | 0.845 | 0.8482 | 0.3597 | 0.8502 | 0.8404 | 0.8025 | 0.848 | 0.7364 | 0.7905 | 0.8718 | 0.906 | | 0.234 | 121.0 | 60500 | 0.2966 | 0.8051 | 0.9469 | 0.9099 | 0.2836 | 0.8044 | 0.8052 | 0.3415 | 0.8458 | 0.8485 | 0.3472 | 0.8516 | 0.841 | 0.8082 | 0.8504 | 0.7361 | 0.7895 | 0.8709 | 0.9057 | | 0.2487 | 122.0 | 61000 | 0.2978 | 0.8028 | 0.9442 | 0.9099 | 0.3001 | 0.8003 | 0.8034 | 0.3393 | 0.8427 | 0.8455 | 0.3681 | 0.848 | 0.8394 | 0.8041 | 0.8464 | 0.7369 | 0.7884 | 0.8672 | 0.9018 | | 0.2144 | 123.0 | 61500 | 0.3070 | 0.8028 | 0.9434 | 0.9065 | 0.2893 | 0.8061 | 0.7976 | 0.3389 | 0.845 | 0.8474 | 0.3486 | 0.8535 | 0.8387 | 0.7999 | 0.8452 | 0.7387 | 0.7926 | 0.8698 | 0.9045 | | 0.2 | 124.0 | 62000 | 0.3050 | 0.8024 | 0.9412 | 0.9047 | 0.2815 | 0.8016 | 0.8082 | 0.341 | 0.8424 | 0.8449 | 0.3264 | 0.8467 | 0.8447 | 0.8076 | 0.8504 | 0.7302 | 0.7811 | 0.8695 | 0.9033 | | 0.216 | 125.0 | 62500 | 0.2965 | 0.8061 | 0.9436 | 0.9075 | 0.3028 | 0.8071 | 0.8086 | 0.342 | 0.8471 | 0.8496 | 0.3597 | 0.8543 | 0.8407 | 0.8073 | 0.8504 | 0.7383 | 0.7916 | 0.8728 | 0.9069 | | 0.2556 | 126.0 | 63000 | 0.3021 | 0.7961 | 0.9403 | 0.9039 | 0.2741 | 0.7967 | 0.7992 | 0.3378 | 0.8386 | 0.8413 | 0.3389 | 0.8435 | 0.839 | 0.7994 | 0.8448 | 0.7243 | 0.7768 | 0.8645 | 0.9024 | | 0.212 | 127.0 | 63500 | 0.3013 | 0.8022 | 0.9435 | 0.9073 | 0.2989 | 0.7997 | 0.809 | 0.3371 | 0.8421 | 0.8456 | 0.3639 | 0.8472 | 0.8447 | 0.8029 | 0.8448 | 0.7313 | 0.7853 | 0.8724 | 0.9069 | | 0.2067 | 128.0 | 64000 | 0.3011 | 0.8016 | 0.9445 | 0.9093 | 0.2706 | 0.8003 | 0.8064 | 0.3377 | 0.843 | 0.8455 | 0.325 | 0.8487 | 0.8421 | 0.8046 | 0.8468 | 0.7339 | 0.7863 | 0.8663 | 0.9033 | | 0.2224 | 129.0 | 64500 | 0.3004 | 0.8021 | 0.9448 | 0.91 | 0.2758 | 0.8017 | 0.8014 | 0.3393 | 0.8433 | 0.846 | 0.3347 | 0.8498 | 0.8375 | 0.8056 | 0.8488 | 0.7294 | 0.7832 | 0.8714 | 0.906 | | 0.2127 | 130.0 | 65000 | 0.3002 | 0.8015 | 0.9417 | 0.9065 | 0.3047 | 0.8002 | 0.8029 | 0.3385 | 0.8422 | 0.845 | 0.3694 | 0.8462 | 0.8412 | 0.8059 | 0.8496 | 0.7281 | 0.7789 | 0.8705 | 0.9063 | | 0.2123 | 131.0 | 65500 | 0.2958 | 0.8062 | 0.9469 | 0.9108 | 0.3028 | 0.8056 | 0.8099 | 0.3402 | 0.8479 | 0.8505 | 0.3778 | 0.8542 | 0.8448 | 0.8095 | 0.8536 | 0.7341 | 0.7884 | 0.8749 | 0.9093 | | 0.2066 | 132.0 | 66000 | 0.2969 | 0.8044 | 0.9469 | 0.9015 | 0.3025 | 0.8018 | 0.806 | 0.3398 | 0.8456 | 0.8485 | 0.3778 | 0.8499 | 0.8416 | 0.8096 | 0.8528 | 0.7301 | 0.7842 | 0.8733 | 0.9084 | | 0.2252 | 133.0 | 66500 | 0.2954 | 0.8052 | 0.9468 | 0.9083 | 0.3014 | 0.8039 | 0.8042 | 0.3398 | 0.8467 | 0.8492 | 0.3694 | 0.8511 | 0.8418 | 0.8066 | 0.8508 | 0.7358 | 0.7884 | 0.8733 | 0.9084 | | 0.2008 | 134.0 | 67000 | 0.2946 | 0.806 | 0.9471 | 0.9108 | 0.3066 | 0.8033 | 0.8064 | 0.3403 | 0.8473 | 0.8499 | 0.3778 | 0.8507 | 0.8432 | 0.8082 | 0.8512 | 0.7386 | 0.7916 | 0.871 | 0.9069 | | 0.2195 | 135.0 | 67500 | 0.2956 | 0.8044 | 0.9439 | 0.9084 | 0.3235 | 0.8027 | 0.8062 | 0.3408 | 0.8446 | 0.8471 | 0.3889 | 0.8487 | 0.8405 | 0.81 | 0.854 | 0.7303 | 0.7789 | 0.8731 | 0.9084 | | 0.2044 | 136.0 | 68000 | 0.2923 | 0.8052 | 0.9439 | 0.9099 | 0.307 | 0.8025 | 0.8113 | 0.3411 | 0.8454 | 0.848 | 0.3681 | 0.8488 | 0.8457 | 0.8091 | 0.8532 | 0.7311 | 0.7811 | 0.8753 | 0.9096 | | 0.2717 | 137.0 | 68500 | 0.2937 | 0.8071 | 0.9445 | 0.9129 | 0.2932 | 0.8037 | 0.8121 | 0.3404 | 0.8468 | 0.8494 | 0.3611 | 0.8506 | 0.8457 | 
0.8116 | 0.8536 | 0.7365 | 0.7874 | 0.8732 | 0.9072 | | 0.2083 | 138.0 | 69000 | 0.2941 | 0.8062 | 0.9444 | 0.9128 | 0.3005 | 0.8043 | 0.8084 | 0.3403 | 0.8467 | 0.8494 | 0.3639 | 0.8519 | 0.8433 | 0.8074 | 0.8512 | 0.7362 | 0.7884 | 0.8749 | 0.9084 | | 0.2154 | 139.0 | 69500 | 0.2936 | 0.8055 | 0.9439 | 0.9101 | 0.307 | 0.8027 | 0.8108 | 0.3399 | 0.8445 | 0.847 | 0.3681 | 0.8476 | 0.846 | 0.8119 | 0.854 | 0.7297 | 0.7789 | 0.875 | 0.9081 | | 0.1902 | 140.0 | 70000 | 0.2949 | 0.8063 | 0.9444 | 0.91 | 0.3034 | 0.8041 | 0.8116 | 0.3398 | 0.8452 | 0.8478 | 0.3639 | 0.8485 | 0.8465 | 0.8125 | 0.8548 | 0.7301 | 0.7789 | 0.8762 | 0.9096 | | 0.2147 | 141.0 | 70500 | 0.2934 | 0.8086 | 0.9477 | 0.9116 | 0.3069 | 0.8086 | 0.8095 | 0.3406 | 0.8487 | 0.8513 | 0.3681 | 0.8546 | 0.8441 | 0.8111 | 0.8532 | 0.7396 | 0.7916 | 0.875 | 0.909 | | 0.2194 | 142.0 | 71000 | 0.2932 | 0.8077 | 0.9475 | 0.9156 | 0.3069 | 0.8069 | 0.8075 | 0.3401 | 0.8481 | 0.8508 | 0.3681 | 0.8534 | 0.8425 | 0.8106 | 0.8536 | 0.739 | 0.7905 | 0.8734 | 0.9081 | | 0.1975 | 143.0 | 71500 | 0.2929 | 0.8078 | 0.9477 | 0.9157 | 0.3077 | 0.8077 | 0.8063 | 0.3403 | 0.8481 | 0.8507 | 0.3681 | 0.8533 | 0.842 | 0.8099 | 0.8532 | 0.7384 | 0.7895 | 0.875 | 0.9093 | | 0.2057 | 144.0 | 72000 | 0.2928 | 0.8073 | 0.9478 | 0.9158 | 0.3091 | 0.8059 | 0.8091 | 0.3407 | 0.8477 | 0.8502 | 0.3681 | 0.8521 | 0.8437 | 0.8101 | 0.8532 | 0.7369 | 0.7884 | 0.875 | 0.909 | | 0.2006 | 145.0 | 72500 | 0.2929 | 0.8079 | 0.9478 | 0.9158 | 0.3048 | 0.8069 | 0.809 | 0.34 | 0.8479 | 0.8504 | 0.3639 | 0.8528 | 0.8437 | 0.8102 | 0.8524 | 0.7385 | 0.7895 | 0.875 | 0.9093 | | 0.2465 | 146.0 | 73000 | 0.2919 | 0.8069 | 0.9477 | 0.9158 | 0.3091 | 0.8062 | 0.8072 | 0.341 | 0.8471 | 0.8497 | 0.3681 | 0.8524 | 0.8423 | 0.8104 | 0.8532 | 0.7364 | 0.7874 | 0.8738 | 0.9084 | | 0.2178 | 147.0 | 73500 | 0.2920 | 0.8076 | 0.9477 | 0.9157 | 0.3091 | 0.8073 | 0.8067 | 0.3411 | 0.8476 | 0.8502 | 0.3681 | 0.8533 | 0.8415 | 0.8111 | 0.8528 | 0.7378 | 0.7895 | 0.8738 | 0.9084 | | 0.1954 | 148.0 | 74000 | 0.2920 | 0.8076 | 0.9477 | 0.9158 | 0.3091 | 0.8074 | 0.8067 | 0.3412 | 0.8478 | 0.8504 | 0.3681 | 0.8535 | 0.8415 | 0.8111 | 0.8532 | 0.7378 | 0.7895 | 0.8738 | 0.9084 | | 0.2306 | 149.0 | 74500 | 0.2920 | 0.8076 | 0.9477 | 0.9158 | 0.3091 | 0.8073 | 0.8067 | 0.3412 | 0.8476 | 0.8502 | 0.3681 | 0.8532 | 0.8415 | 0.8113 | 0.8536 | 0.7376 | 0.7884 | 0.8738 | 0.9084 | | 0.186 | 150.0 | 75000 | 0.2920 | 0.8077 | 0.9477 | 0.9158 | 0.3091 | 0.8073 | 0.8068 | 0.3412 | 0.8477 | 0.8503 | 0.3681 | 0.8532 | 0.8416 | 0.8113 | 0.8536 | 0.7376 | 0.7884 | 0.8741 | 0.9087 | ### Framework versions - Transformers 4.45.2 - Pytorch 2.5.0+cu121 - Datasets 2.19.2 - Tokenizers 0.20.1
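The Map / Mar columns in the table above are COCO-style detection metrics (overall, at IoU 0.50 and 0.75, per size bucket, and per class). The card does not include its evaluation code, so the snippet below is only an illustrative sketch of how numbers in this format are commonly produced with `torchmetrics`; the boxes, scores and labels are made-up placeholders, not data from this model.

```python
# Illustrative only: COCO-style mAP/mAR in the same format as the table above.
# Requires torchmetrics (and its pycocotools backend); inputs are invented placeholders.
import torch
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),  # xyxy pixel coordinates
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([0]),
}]
targets = [{
    "boxes": torch.tensor([[12.0, 18.0, 108.0, 215.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
print(metric.compute())  # map, map_50, map_75, map_small, mar_1, mar_10, mar_100, ...
```

The returned keys (`map`, `map_50`, `map_75`, `map_small`, `mar_100`, per-class values, and so on) correspond to the column names used throughout the table.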
[ "chicken", "duck", "plant" ]
joe611/chickens-composite-101818181818-150-epochs-w-transform
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-composite-101818181818-150-epochs-w-transform This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2836 - Map: 0.8018 - Map 50: 0.9463 - Map 75: 0.9145 - Map Small: 0.2935 - Map Medium: 0.8039 - Map Large: 0.8211 - Mar 1: 0.3373 - Mar 10: 0.8376 - Mar 100: 0.8406 - Mar Small: 0.3208 - Mar Medium: 0.8422 - Mar Large: 0.8517 - Map Chicken: 0.7988 - Mar 100 Chicken: 0.8423 - Map Duck: 0.7436 - Mar 100 Duck: 0.7811 - Map Plant: 0.8631 - Mar 100 Plant: 0.8985 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: cosine - num_epochs: 150 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Chicken | Map Duck | Map Large | Map Medium | Map Plant | Map Small | Mar 1 | Mar 10 | Mar 100 | Mar 100 Chicken | Mar 100 Duck | Mar 100 Plant | Mar Large | Mar Medium | Mar Small | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:-----------:|:--------:|:---------:|:----------:|:---------:|:---------:|:------:|:------:|:-------:|:---------------:|:------------:|:-------------:|:---------:|:----------:|:---------:| | 1.4317 | 1.0 | 500 | 1.3917 | 0.056 | 0.0904 | 0.0642 | 0.0846 | 0.0 | 0.0696 | 0.0412 | 0.0836 | 0.003 | 0.0722 | 0.183 | 0.3538 | 0.4125 | 0.0 | 0.6488 | 0.386 | 0.2795 | 0.0875 | | 1.6037 | 2.0 | 1000 | 1.3131 | 0.1163 | 0.1793 | 0.1296 | 0.1068 | 0.0 | 0.1522 | 0.0596 | 0.242 | 0.0025 | 0.0819 | 0.2454 | 0.3232 | 0.271 | 0.0 | 0.6985 | 0.3418 | 0.2492 | 0.0542 | | 1.1619 | 3.0 | 1500 | 1.1420 | 0.1636 | 0.242 | 0.1853 | 0.1266 | 0.0 | 0.1891 | 0.0873 | 0.3643 | 0.0126 | 0.0984 | 0.2657 | 0.3361 | 0.2613 | 0.0 | 0.747 | 0.3305 | 0.3038 | 0.0958 | | 0.9864 | 4.0 | 2000 | 1.0863 | 0.1869 | 0.2828 | 0.2189 | 0.1381 | 0.0 | 0.2059 | 0.1139 | 0.4227 | 0.0095 | 0.1091 | 0.3124 | 0.3584 | 0.3319 | 0.0 | 0.7434 | 0.3588 | 0.3247 | 0.1042 | | 1.0153 | 5.0 | 2500 | 0.9843 | 0.2516 | 0.368 | 0.2843 | 0.1965 | 0.0 | 0.2692 | 0.1987 | 0.5584 | 0.0182 | 0.1116 | 0.385 | 0.3999 | 0.471 | 0.0 | 0.7286 | 0.3982 | 0.3709 | 0.0292 | | 1.0348 | 6.0 | 3000 | 0.9389 | 0.2894 | 0.4253 | 0.3333 | 0.2643 | 0.0 | 0.3088 | 0.2406 | 0.6039 | 0.0325 | 0.1185 | 0.4012 | 0.41 | 0.4875 | 0.0 | 0.7425 | 0.4119 | 0.3821 | 0.0625 | | 0.9358 | 7.0 | 3500 | 0.9323 | 0.3286 | 0.4759 | 0.3838 | 0.3865 | 0.0 | 0.3379 | 0.2985 | 0.5993 | 0.0644 | 0.1248 | 0.4488 | 0.4594 | 0.6246 | 0.0 | 0.7536 | 0.4607 | 0.4365 | 0.1042 | | 0.8624 | 8.0 | 4000 | 0.8708 | 0.3548 | 0.5201 | 0.4162 | 0.4469 | 0.0 | 0.385 | 0.3228 | 0.6173 | 0.0751 | 0.1293 | 0.4576 | 0.4661 | 0.6597 | 0.0 | 0.7386 | 0.4866 | 0.4467 | 0.1083 | | 0.792 | 9.0 | 4500 | 0.8009 | 0.3608 | 0.5319 | 0.4073 | 0.4321 | 0.0 | 0.3703 | 0.3288 | 0.6504 | 0.0255 | 0.1259 | 0.4572 | 0.4659 | 0.6419 | 0.0 | 0.7557 | 0.4743 | 0.4347 | 0.0792 | | 
0.8859 | 10.0 | 5000 | 0.7880 | 0.3762 | 0.5421 | 0.4381 | 0.4584 | 0.0 | 0.3838 | 0.3499 | 0.67 | 0.0358 | 0.1286 | 0.4716 | 0.4773 | 0.6798 | 0.0 | 0.7521 | 0.4847 | 0.4581 | 0.0417 | | 0.8035 | 11.0 | 5500 | 0.7518 | 0.3937 | 0.5584 | 0.4631 | 0.4786 | 0.0 | 0.4146 | 0.3597 | 0.7024 | 0.0998 | 0.1322 | 0.4802 | 0.4847 | 0.6887 | 0.0 | 0.7654 | 0.5092 | 0.4556 | 0.125 | | 0.7966 | 12.0 | 6000 | 0.7039 | 0.4002 | 0.5585 | 0.4634 | 0.4848 | 0.0 | 0.4089 | 0.3703 | 0.7158 | 0.0782 | 0.1322 | 0.4837 | 0.4876 | 0.6952 | 0.0 | 0.7678 | 0.4956 | 0.463 | 0.125 | | 0.8266 | 13.0 | 6500 | 0.6752 | 0.4044 | 0.5623 | 0.4787 | 0.4951 | 0.0 | 0.429 | 0.3726 | 0.7179 | 0.0758 | 0.1353 | 0.49 | 0.493 | 0.7008 | 0.0 | 0.7783 | 0.5195 | 0.4612 | 0.125 | | 0.7708 | 14.0 | 7000 | 0.6588 | 0.4207 | 0.5743 | 0.4969 | 0.5342 | 0.0 | 0.4323 | 0.3834 | 0.728 | 0.0601 | 0.1383 | 0.4988 | 0.5029 | 0.7262 | 0.0 | 0.7825 | 0.5217 | 0.4681 | 0.1917 | | 0.6287 | 15.0 | 7500 | 0.6549 | 0.404 | 0.579 | 0.451 | 0.482 | 0.0 | 0.4252 | 0.3643 | 0.7299 | 0.0328 | 0.1354 | 0.4821 | 0.4868 | 0.6726 | 0.0 | 0.788 | 0.5127 | 0.4479 | 0.1125 | | 0.6559 | 16.0 | 8000 | 0.6334 | 0.4031 | 0.5814 | 0.4652 | 0.5083 | 0.0 | 0.4247 | 0.3728 | 0.7011 | 0.1067 | 0.1366 | 0.4836 | 0.4889 | 0.6948 | 0.0 | 0.772 | 0.5142 | 0.4574 | 0.1708 | | 0.7151 | 17.0 | 8500 | 0.5907 | 0.4471 | 0.6062 | 0.5339 | 0.5945 | 0.0004 | 0.4537 | 0.4183 | 0.7465 | 0.0949 | 0.1433 | 0.5082 | 0.5117 | 0.7331 | 0.0021 | 0.8 | 0.5282 | 0.4792 | 0.1667 | | 0.7495 | 18.0 | 9000 | 0.6227 | 0.4276 | 0.6057 | 0.492 | 0.5491 | 0.0 | 0.4429 | 0.3988 | 0.7338 | 0.1324 | 0.1411 | 0.4889 | 0.4923 | 0.6859 | 0.0 | 0.791 | 0.5099 | 0.4582 | 0.2042 | | 0.6698 | 19.0 | 9500 | 0.5812 | 0.4552 | 0.6164 | 0.5403 | 0.6232 | 0.0 | 0.4591 | 0.4315 | 0.7423 | 0.1785 | 0.1485 | 0.5075 | 0.5096 | 0.7359 | 0.0 | 0.7928 | 0.5173 | 0.4808 | 0.2375 | | 0.7048 | 20.0 | 10000 | 0.5521 | 0.4675 | 0.6247 | 0.5517 | 0.6487 | 0.0 | 0.468 | 0.4456 | 0.7538 | 0.156 | 0.149 | 0.5148 | 0.5202 | 0.7524 | 0.0 | 0.8081 | 0.5259 | 0.4972 | 0.2333 | | 0.6155 | 21.0 | 10500 | 0.5570 | 0.4642 | 0.6334 | 0.561 | 0.6367 | 0.0002 | 0.4718 | 0.4286 | 0.7556 | 0.0656 | 0.1467 | 0.5026 | 0.5054 | 0.7141 | 0.0011 | 0.8009 | 0.5204 | 0.4702 | 0.2 | | 0.6012 | 22.0 | 11000 | 0.5256 | 0.4777 | 0.6409 | 0.5649 | 0.6609 | 0.002 | 0.4887 | 0.4472 | 0.7703 | 0.1022 | 0.1486 | 0.5167 | 0.5207 | 0.7379 | 0.0011 | 0.8232 | 0.5359 | 0.4937 | 0.1833 | | 0.5228 | 23.0 | 11500 | 0.5485 | 0.477 | 0.642 | 0.5768 | 0.6618 | 0.0158 | 0.47 | 0.455 | 0.7533 | 0.1221 | 0.1505 | 0.511 | 0.5136 | 0.7347 | 0.0084 | 0.7976 | 0.5099 | 0.4941 | 0.1958 | | 0.6076 | 24.0 | 12000 | 0.5497 | 0.5072 | 0.6724 | 0.6053 | 0.6875 | 0.0909 | 0.5003 | 0.4843 | 0.7432 | 0.184 | 0.1727 | 0.5426 | 0.5459 | 0.7528 | 0.0895 | 0.7955 | 0.5402 | 0.5191 | 0.2375 | | 0.5342 | 25.0 | 12500 | 0.5213 | 0.5077 | 0.6727 | 0.6119 | 0.6796 | 0.0854 | 0.506 | 0.4767 | 0.7579 | 0.1671 | 0.1687 | 0.5411 | 0.5459 | 0.7492 | 0.0832 | 0.8054 | 0.5459 | 0.5162 | 0.2833 | | 0.5668 | 26.0 | 13000 | 0.5126 | 0.5918 | 0.7825 | 0.7045 | 0.6827 | 0.3173 | 0.5874 | 0.5738 | 0.7754 | 0.0969 | 0.2343 | 0.6311 | 0.6358 | 0.7456 | 0.3379 | 0.8238 | 0.6284 | 0.6238 | 0.2542 | | 0.5328 | 27.0 | 13500 | 0.4993 | 0.6162 | 0.8218 | 0.748 | 0.685 | 0.4089 | 0.5979 | 0.6021 | 0.7549 | 0.181 | 0.2525 | 0.6585 | 0.6611 | 0.7423 | 0.4389 | 0.8021 | 0.6403 | 0.6464 | 0.2375 | | 0.5303 | 28.0 | 14000 | 0.4917 | 0.6485 | 0.8365 | 0.7794 | 0.7124 | 0.4644 | 0.6271 | 0.6461 | 0.7688 | 0.1745 | 0.268 
| 0.6879 | 0.6904 | 0.7677 | 0.4937 | 0.8096 | 0.6613 | 0.6893 | 0.2375 | | 0.5268 | 29.0 | 14500 | 0.4792 | 0.6577 | 0.8834 | 0.7838 | 0.6831 | 0.5047 | 0.6532 | 0.6404 | 0.7854 | 0.1625 | 0.2733 | 0.7018 | 0.7059 | 0.7387 | 0.5495 | 0.8295 | 0.6884 | 0.6985 | 0.2458 | | 0.49 | 30.0 | 15000 | 0.4947 | 0.6475 | 0.8795 | 0.7532 | 0.6762 | 0.5044 | 0.6672 | 0.6287 | 0.7618 | 0.1773 | 0.2766 | 0.6916 | 0.6951 | 0.729 | 0.5526 | 0.8036 | 0.7074 | 0.6779 | 0.3069 | | 0.6711 | 31.0 | 15500 | 0.4538 | 0.6746 | 0.8957 | 0.7991 | 0.6955 | 0.5497 | 0.6853 | 0.663 | 0.7787 | 0.1179 | 0.2935 | 0.7228 | 0.7262 | 0.7585 | 0.5989 | 0.8211 | 0.7289 | 0.7156 | 0.2583 | | 0.4278 | 32.0 | 16000 | 0.4328 | 0.7014 | 0.9158 | 0.8303 | 0.7142 | 0.608 | 0.712 | 0.6875 | 0.7821 | 0.1313 | 0.3039 | 0.7437 | 0.7487 | 0.7694 | 0.6474 | 0.8295 | 0.7503 | 0.7407 | 0.2125 | | 0.5014 | 33.0 | 16500 | 0.4380 | 0.6845 | 0.8992 | 0.8313 | 0.7093 | 0.5781 | 0.7182 | 0.6693 | 0.766 | 0.1623 | 0.2957 | 0.731 | 0.7354 | 0.7681 | 0.6274 | 0.8105 | 0.7652 | 0.7183 | 0.2875 | | 0.5762 | 34.0 | 17000 | 0.4470 | 0.6873 | 0.9159 | 0.8232 | 0.6738 | 0.6083 | 0.7118 | 0.6753 | 0.7799 | 0.182 | 0.2987 | 0.7358 | 0.7396 | 0.7399 | 0.6526 | 0.8262 | 0.7558 | 0.725 | 0.3097 | | 0.5993 | 35.0 | 17500 | 0.4372 | 0.6971 | 0.9303 | 0.8432 | 0.687 | 0.6295 | 0.7122 | 0.6768 | 0.7748 | 0.1631 | 0.3041 | 0.7391 | 0.745 | 0.746 | 0.6716 | 0.8175 | 0.7473 | 0.7288 | 0.3486 | | 0.4911 | 36.0 | 18000 | 0.4101 | 0.7015 | 0.932 | 0.8411 | 0.693 | 0.6238 | 0.7166 | 0.6893 | 0.7878 | 0.1979 | 0.3028 | 0.7472 | 0.7521 | 0.7569 | 0.6632 | 0.8361 | 0.7673 | 0.74 | 0.2958 | | 0.4781 | 37.0 | 18500 | 0.4187 | 0.6914 | 0.9079 | 0.8294 | 0.6926 | 0.59 | 0.7231 | 0.6789 | 0.7918 | 0.1162 | 0.2947 | 0.7356 | 0.7394 | 0.754 | 0.6295 | 0.8346 | 0.7661 | 0.7262 | 0.2139 | | 0.5799 | 38.0 | 19000 | 0.4119 | 0.701 | 0.9439 | 0.8374 | 0.6948 | 0.6172 | 0.7107 | 0.6877 | 0.7909 | 0.1665 | 0.2996 | 0.7478 | 0.7511 | 0.7581 | 0.6621 | 0.8331 | 0.7529 | 0.7375 | 0.2819 | | 0.5361 | 39.0 | 19500 | 0.4146 | 0.7078 | 0.9387 | 0.8622 | 0.7061 | 0.6333 | 0.7228 | 0.6982 | 0.784 | 0.2637 | 0.2969 | 0.7483 | 0.7521 | 0.7577 | 0.6705 | 0.828 | 0.7665 | 0.7408 | 0.3819 | | 0.493 | 40.0 | 20000 | 0.4011 | 0.7074 | 0.924 | 0.8545 | 0.7149 | 0.609 | 0.7326 | 0.6932 | 0.7982 | 0.1959 | 0.3027 | 0.7478 | 0.7529 | 0.769 | 0.6495 | 0.8404 | 0.7748 | 0.7458 | 0.2444 | | 0.5708 | 41.0 | 20500 | 0.3944 | 0.7356 | 0.9499 | 0.8782 | 0.7288 | 0.6795 | 0.7563 | 0.7228 | 0.7986 | 0.1492 | 0.3188 | 0.7753 | 0.78 | 0.7766 | 0.7211 | 0.8425 | 0.7968 | 0.7703 | 0.2722 | | 0.6682 | 42.0 | 21000 | 0.3919 | 0.7261 | 0.9477 | 0.8652 | 0.7259 | 0.6498 | 0.7382 | 0.713 | 0.8025 | 0.2406 | 0.3087 | 0.7701 | 0.7739 | 0.7782 | 0.7 | 0.8434 | 0.7811 | 0.7649 | 0.3153 | | 0.4174 | 43.0 | 21500 | 0.4134 | 0.703 | 0.9523 | 0.8325 | 0.6781 | 0.6314 | 0.7123 | 0.6946 | 0.7994 | 0.2342 | 0.2981 | 0.746 | 0.7501 | 0.7355 | 0.6768 | 0.838 | 0.7524 | 0.741 | 0.3486 | | 0.5008 | 44.0 | 22000 | 0.3907 | 0.7355 | 0.9505 | 0.8609 | 0.7255 | 0.673 | 0.736 | 0.7287 | 0.8081 | 0.2461 | 0.3096 | 0.7722 | 0.7761 | 0.7714 | 0.7063 | 0.8506 | 0.7728 | 0.7712 | 0.3347 | | 0.4455 | 45.0 | 22500 | 0.3722 | 0.7412 | 0.9463 | 0.8804 | 0.7153 | 0.6929 | 0.7561 | 0.7266 | 0.8155 | 0.2147 | 0.3152 | 0.7778 | 0.7825 | 0.7613 | 0.7305 | 0.8557 | 0.7959 | 0.7692 | 0.2931 | | 0.5233 | 46.0 | 23000 | 0.3724 | 0.7403 | 0.9481 | 0.8889 | 0.7207 | 0.6869 | 0.759 | 0.7225 | 0.8132 | 0.2499 | 0.3154 | 0.7816 | 0.7857 | 0.7718 | 0.7305 | 0.8548 | 
0.7988 | 0.7691 | 0.3375 | | 0.4255 | 47.0 | 23500 | 0.3741 | 0.7311 | 0.9513 | 0.8635 | 0.7092 | 0.678 | 0.7399 | 0.7221 | 0.806 | 0.2056 | 0.3123 | 0.7751 | 0.7799 | 0.7661 | 0.7295 | 0.844 | 0.7819 | 0.7699 | 0.3014 | | 0.446 | 48.0 | 24000 | 0.3888 | 0.734 | 0.953 | 0.8869 | 0.7125 | 0.6834 | 0.747 | 0.7208 | 0.806 | 0.2461 | 0.3127 | 0.774 | 0.7762 | 0.7565 | 0.7232 | 0.8491 | 0.7858 | 0.7643 | 0.3 | | 0.4059 | 49.0 | 24500 | 0.3750 | 0.7303 | 0.9415 | 0.8951 | 0.7104 | 0.6759 | 0.7358 | 0.7234 | 0.8046 | 0.1287 | 0.3159 | 0.772 | 0.7755 | 0.7698 | 0.7137 | 0.8431 | 0.7806 | 0.7671 | 0.25 | | 0.4609 | 50.0 | 25000 | 0.3579 | 0.7437 | 0.9486 | 0.875 | 0.7298 | 0.6908 | 0.7409 | 0.7385 | 0.8105 | 0.2214 | 0.3188 | 0.7875 | 0.7908 | 0.7855 | 0.7379 | 0.8491 | 0.7783 | 0.7858 | 0.3389 | | 0.5203 | 51.0 | 25500 | 0.3841 | 0.7283 | 0.943 | 0.8761 | 0.7195 | 0.6715 | 0.7375 | 0.7157 | 0.794 | 0.2607 | 0.3134 | 0.7707 | 0.7735 | 0.7698 | 0.7158 | 0.8349 | 0.7808 | 0.7655 | 0.3417 | | 0.5682 | 52.0 | 26000 | 0.3522 | 0.7435 | 0.9479 | 0.8956 | 0.7475 | 0.6641 | 0.7571 | 0.7368 | 0.8189 | 0.2501 | 0.3177 | 0.7855 | 0.7888 | 0.7976 | 0.7084 | 0.8605 | 0.7951 | 0.781 | 0.3403 | | 0.4698 | 53.0 | 26500 | 0.3583 | 0.7416 | 0.9407 | 0.8975 | 0.7269 | 0.6814 | 0.7537 | 0.7276 | 0.8165 | 0.2662 | 0.3177 | 0.7856 | 0.7884 | 0.7802 | 0.7274 | 0.8575 | 0.7972 | 0.777 | 0.3833 | | 0.4656 | 54.0 | 27000 | 0.3621 | 0.7368 | 0.9507 | 0.8858 | 0.7244 | 0.6673 | 0.7422 | 0.7276 | 0.8186 | 0.3068 | 0.3144 | 0.7779 | 0.7819 | 0.7742 | 0.7126 | 0.8587 | 0.7846 | 0.7719 | 0.3778 | | 0.4325 | 55.0 | 27500 | 0.3537 | 0.7429 | 0.9449 | 0.8827 | 0.7334 | 0.6752 | 0.7695 | 0.7297 | 0.8202 | 0.3058 | 0.3175 | 0.7872 | 0.7908 | 0.7871 | 0.7284 | 0.8569 | 0.8056 | 0.7754 | 0.3431 | | 0.4239 | 56.0 | 28000 | 0.3582 | 0.7506 | 0.9524 | 0.8949 | 0.7427 | 0.695 | 0.7701 | 0.7386 | 0.8141 | 0.2394 | 0.3188 | 0.7915 | 0.7955 | 0.7972 | 0.74 | 0.8494 | 0.8032 | 0.7856 | 0.325 | | 0.4582 | 57.0 | 28500 | 0.3465 | 0.7593 | 0.9528 | 0.9059 | 0.7571 | 0.7023 | 0.7844 | 0.7501 | 0.8185 | 0.2384 | 0.3198 | 0.8005 | 0.8042 | 0.8117 | 0.7474 | 0.8536 | 0.8198 | 0.7968 | 0.35 | | 0.5418 | 58.0 | 29000 | 0.3444 | 0.7597 | 0.9552 | 0.8909 | 0.7571 | 0.7036 | 0.7629 | 0.7523 | 0.8185 | 0.2644 | 0.3224 | 0.8001 | 0.8047 | 0.8085 | 0.7495 | 0.856 | 0.7968 | 0.8034 | 0.3389 | | 0.4257 | 59.0 | 29500 | 0.3479 | 0.7565 | 0.9489 | 0.8967 | 0.7483 | 0.6999 | 0.7716 | 0.7497 | 0.8213 | 0.2546 | 0.3221 | 0.7978 | 0.8012 | 0.8004 | 0.7442 | 0.859 | 0.8109 | 0.7944 | 0.3306 | | 0.3616 | 60.0 | 30000 | 0.3678 | 0.74 | 0.9476 | 0.869 | 0.7166 | 0.6764 | 0.7445 | 0.7328 | 0.8269 | 0.2011 | 0.3123 | 0.7792 | 0.7834 | 0.7645 | 0.7211 | 0.8648 | 0.7838 | 0.7761 | 0.3208 | | 0.4692 | 61.0 | 30500 | 0.3567 | 0.7391 | 0.9526 | 0.9076 | 0.7237 | 0.6736 | 0.7352 | 0.7386 | 0.8199 | 0.2564 | 0.3134 | 0.7841 | 0.7874 | 0.7698 | 0.7284 | 0.8642 | 0.777 | 0.7848 | 0.35 | | 0.3664 | 62.0 | 31000 | 0.3445 | 0.7543 | 0.9457 | 0.8969 | 0.7524 | 0.6917 | 0.7841 | 0.7436 | 0.8186 | 0.1985 | 0.3187 | 0.7946 | 0.7982 | 0.804 | 0.7305 | 0.8599 | 0.8191 | 0.7871 | 0.3319 | | 0.4617 | 63.0 | 31500 | 0.3477 | 0.7501 | 0.9319 | 0.8863 | 0.7356 | 0.677 | 0.7548 | 0.7512 | 0.8376 | 0.1933 | 0.3224 | 0.7898 | 0.7932 | 0.7891 | 0.7158 | 0.8747 | 0.7865 | 0.7936 | 0.2667 | | 0.4785 | 64.0 | 32000 | 0.3276 | 0.7646 | 0.9489 | 0.9162 | 0.7483 | 0.7102 | 0.7797 | 0.7639 | 0.8351 | 0.2695 | 0.3244 | 0.8064 | 0.8101 | 0.7988 | 0.7547 | 0.8768 | 0.8188 | 0.8073 | 0.3278 | | 0.3807 | 65.0 | 
32500 | 0.3293 | 0.7682 | 0.9436 | 0.9062 | 0.759 | 0.7055 | 0.7908 | 0.7561 | 0.84 | 0.2285 | 0.3257 | 0.8096 | 0.8141 | 0.8137 | 0.7547 | 0.8738 | 0.8229 | 0.8057 | 0.3181 | | 0.3815 | 66.0 | 33000 | 0.3422 | 0.7608 | 0.9451 | 0.9046 | 0.7518 | 0.6895 | 0.7784 | 0.7544 | 0.8411 | 0.2752 | 0.3221 | 0.8004 | 0.8053 | 0.8028 | 0.7358 | 0.8774 | 0.8167 | 0.8019 | 0.3236 | | 0.4308 | 67.0 | 33500 | 0.3536 | 0.7486 | 0.9475 | 0.8884 | 0.7248 | 0.6897 | 0.7676 | 0.7404 | 0.8312 | 0.2703 | 0.3171 | 0.7907 | 0.796 | 0.7855 | 0.7379 | 0.8648 | 0.8097 | 0.7885 | 0.3292 | | 0.4524 | 68.0 | 34000 | 0.3540 | 0.7421 | 0.9371 | 0.8874 | 0.7334 | 0.6738 | 0.7598 | 0.7385 | 0.8191 | 0.2067 | 0.3174 | 0.784 | 0.7874 | 0.7887 | 0.7179 | 0.8557 | 0.7999 | 0.7867 | 0.2833 | | 0.4625 | 69.0 | 34500 | 0.3366 | 0.7633 | 0.9428 | 0.8915 | 0.7428 | 0.7156 | 0.7827 | 0.7584 | 0.8314 | 0.2485 | 0.3264 | 0.7999 | 0.803 | 0.7935 | 0.7526 | 0.863 | 0.8165 | 0.8011 | 0.2917 | | 0.3982 | 70.0 | 35000 | 0.3199 | 0.7704 | 0.9511 | 0.9008 | 0.7549 | 0.7186 | 0.7657 | 0.7739 | 0.8377 | 0.2725 | 0.3271 | 0.81 | 0.8137 | 0.8145 | 0.7579 | 0.8687 | 0.8115 | 0.8156 | 0.3375 | | 0.4455 | 71.0 | 35500 | 0.3327 | 0.7582 | 0.9422 | 0.889 | 0.7602 | 0.6811 | 0.7635 | 0.7576 | 0.8332 | 0.2142 | 0.3214 | 0.7964 | 0.8012 | 0.8129 | 0.7242 | 0.8666 | 0.8043 | 0.7968 | 0.3139 | | 0.4408 | 72.0 | 36000 | 0.3362 | 0.7589 | 0.9435 | 0.896 | 0.7509 | 0.6959 | 0.7749 | 0.7537 | 0.8299 | 0.2235 | 0.32 | 0.7978 | 0.8009 | 0.804 | 0.7368 | 0.8617 | 0.8137 | 0.7951 | 0.2667 | | 0.5039 | 73.0 | 36500 | 0.3251 | 0.7653 | 0.946 | 0.8872 | 0.7631 | 0.7017 | 0.7696 | 0.7687 | 0.8311 | 0.1256 | 0.3259 | 0.8012 | 0.8046 | 0.8089 | 0.7389 | 0.866 | 0.8065 | 0.8053 | 0.1542 | | 0.4364 | 74.0 | 37000 | 0.3245 | 0.771 | 0.9521 | 0.8988 | 0.7656 | 0.7039 | 0.7728 | 0.771 | 0.8435 | 0.31 | 0.3269 | 0.8085 | 0.812 | 0.8077 | 0.7463 | 0.8819 | 0.8133 | 0.8129 | 0.3389 | | 0.4864 | 75.0 | 37500 | 0.3360 | 0.7616 | 0.9475 | 0.8927 | 0.7553 | 0.7025 | 0.7811 | 0.7584 | 0.827 | 0.2521 | 0.3234 | 0.8024 | 0.8062 | 0.8109 | 0.7421 | 0.8657 | 0.8202 | 0.8042 | 0.2819 | | 0.5108 | 76.0 | 38000 | 0.3516 | 0.7529 | 0.9519 | 0.8799 | 0.7343 | 0.696 | 0.7748 | 0.7415 | 0.8284 | 0.1942 | 0.3176 | 0.792 | 0.7956 | 0.7867 | 0.7368 | 0.8633 | 0.8118 | 0.7864 | 0.2861 | | 0.4386 | 77.0 | 38500 | 0.3303 | 0.7701 | 0.9422 | 0.8958 | 0.7672 | 0.6976 | 0.7998 | 0.7631 | 0.8454 | 0.2447 | 0.3269 | 0.8062 | 0.8096 | 0.8133 | 0.7347 | 0.8807 | 0.8336 | 0.8044 | 0.3 | | 0.4438 | 78.0 | 39000 | 0.3459 | 0.7568 | 0.9427 | 0.8977 | 0.7401 | 0.7021 | 0.762 | 0.7538 | 0.8281 | 0.2318 | 0.3208 | 0.7949 | 0.7986 | 0.7964 | 0.7358 | 0.8636 | 0.799 | 0.7978 | 0.3097 | | 0.4384 | 79.0 | 39500 | 0.3290 | 0.7697 | 0.9422 | 0.8978 | 0.7709 | 0.6985 | 0.7908 | 0.7704 | 0.8396 | 0.2272 | 0.3268 | 0.8059 | 0.8105 | 0.8185 | 0.7368 | 0.8762 | 0.8231 | 0.8139 | 0.2917 | | 0.46 | 80.0 | 40000 | 0.3328 | 0.7611 | 0.9379 | 0.8931 | 0.7507 | 0.6942 | 0.7708 | 0.7591 | 0.8383 | 0.248 | 0.3217 | 0.7986 | 0.8017 | 0.8004 | 0.7326 | 0.872 | 0.802 | 0.8022 | 0.3208 | | 0.3882 | 81.0 | 40500 | 0.3387 | 0.7631 | 0.9478 | 0.9052 | 0.7469 | 0.7052 | 0.7825 | 0.7585 | 0.837 | 0.2465 | 0.3205 | 0.7999 | 0.8038 | 0.7883 | 0.7495 | 0.8735 | 0.8156 | 0.7998 | 0.3375 | | 0.4055 | 82.0 | 41000 | 0.3298 | 0.7623 | 0.934 | 0.8977 | 0.7419 | 0.6995 | 0.7882 | 0.7618 | 0.8456 | 0.2022 | 0.324 | 0.7988 | 0.8018 | 0.7907 | 0.7326 | 0.8819 | 0.8235 | 0.8002 | 0.2667 | | 0.4664 | 83.0 | 41500 | 0.3408 | 0.7543 | 0.9434 | 0.9031 | 
0.74 | 0.6868 | 0.7696 | 0.7523 | 0.8362 | 0.191 | 0.3201 | 0.7926 | 0.7957 | 0.7867 | 0.7274 | 0.8729 | 0.8058 | 0.7927 | 0.2792 | | 0.3745 | 84.0 | 42000 | 0.3231 | 0.7661 | 0.9464 | 0.9053 | 0.7459 | 0.7051 | 0.7858 | 0.7607 | 0.8472 | 0.2229 | 0.3269 | 0.8049 | 0.8086 | 0.7976 | 0.7474 | 0.8807 | 0.8229 | 0.8032 | 0.3417 | | 0.3388 | 85.0 | 42500 | 0.3270 | 0.7699 | 0.9526 | 0.9007 | 0.7726 | 0.704 | 0.7731 | 0.7753 | 0.8332 | 0.2572 | 0.3231 | 0.8089 | 0.8127 | 0.8161 | 0.7505 | 0.8714 | 0.8128 | 0.8145 | 0.3764 | | 0.3615 | 86.0 | 43000 | 0.3178 | 0.7756 | 0.9496 | 0.9091 | 0.7714 | 0.7045 | 0.7916 | 0.7747 | 0.8508 | 0.294 | 0.3264 | 0.8123 | 0.8173 | 0.8153 | 0.7484 | 0.8883 | 0.8225 | 0.8143 | 0.3681 | | 0.3616 | 87.0 | 43500 | 0.3388 | 0.7613 | 0.953 | 0.9002 | 0.7401 | 0.7073 | 0.7624 | 0.7605 | 0.8366 | 0.2755 | 0.3209 | 0.7992 | 0.8019 | 0.7899 | 0.7453 | 0.8705 | 0.802 | 0.7984 | 0.3861 | | 0.4107 | 88.0 | 44000 | 0.3161 | 0.7772 | 0.9513 | 0.9018 | 0.7613 | 0.7223 | 0.7856 | 0.7778 | 0.8479 | 0.2612 | 0.3308 | 0.8152 | 0.8191 | 0.8129 | 0.76 | 0.8843 | 0.8207 | 0.8195 | 0.3403 | | 0.3945 | 89.0 | 44500 | 0.3433 | 0.7521 | 0.9474 | 0.8903 | 0.7469 | 0.6782 | 0.7605 | 0.7471 | 0.8312 | 0.2602 | 0.3186 | 0.7914 | 0.7948 | 0.7935 | 0.7232 | 0.8678 | 0.8016 | 0.7921 | 0.3042 | | 0.3239 | 90.0 | 45000 | 0.3282 | 0.7692 | 0.9409 | 0.8894 | 0.7633 | 0.7078 | 0.7844 | 0.7662 | 0.8366 | 0.1813 | 0.3234 | 0.8066 | 0.8111 | 0.8161 | 0.7411 | 0.8762 | 0.826 | 0.8066 | 0.25 | | 0.3919 | 91.0 | 45500 | 0.3136 | 0.7808 | 0.944 | 0.8978 | 0.7711 | 0.7221 | 0.7836 | 0.7854 | 0.8491 | 0.2101 | 0.3293 | 0.8148 | 0.8175 | 0.8153 | 0.7537 | 0.8834 | 0.8187 | 0.8229 | 0.2375 | | 0.3625 | 92.0 | 46000 | 0.3011 | 0.7892 | 0.9492 | 0.9019 | 0.7869 | 0.7321 | 0.7997 | 0.793 | 0.8487 | 0.2453 | 0.3354 | 0.8236 | 0.827 | 0.8339 | 0.7653 | 0.8819 | 0.8349 | 0.8282 | 0.2917 | | 0.3943 | 93.0 | 46500 | 0.3042 | 0.7873 | 0.944 | 0.9037 | 0.7763 | 0.7373 | 0.7998 | 0.7891 | 0.8482 | 0.243 | 0.3365 | 0.8226 | 0.8263 | 0.8206 | 0.7747 | 0.8837 | 0.8331 | 0.8285 | 0.2958 | | 0.4379 | 94.0 | 47000 | 0.3146 | 0.7777 | 0.9441 | 0.9048 | 0.7555 | 0.726 | 0.7806 | 0.7786 | 0.8515 | 0.2692 | 0.3311 | 0.8143 | 0.8185 | 0.8052 | 0.7611 | 0.8892 | 0.8154 | 0.8207 | 0.3417 | | 0.4484 | 95.0 | 47500 | 0.3113 | 0.779 | 0.946 | 0.9019 | 0.7684 | 0.7188 | 0.7913 | 0.7848 | 0.8498 | 0.2473 | 0.329 | 0.8134 | 0.8175 | 0.8161 | 0.7484 | 0.888 | 0.8243 | 0.8231 | 0.2972 | | 0.4899 | 96.0 | 48000 | 0.3086 | 0.7834 | 0.9491 | 0.9066 | 0.7749 | 0.7207 | 0.7926 | 0.7875 | 0.8547 | 0.2902 | 0.3289 | 0.8209 | 0.825 | 0.8246 | 0.76 | 0.8904 | 0.8227 | 0.8281 | 0.3431 | | 0.3551 | 97.0 | 48500 | 0.3072 | 0.7814 | 0.9519 | 0.9083 | 0.7687 | 0.7259 | 0.7879 | 0.7862 | 0.8497 | 0.2733 | 0.3296 | 0.8196 | 0.8228 | 0.8169 | 0.7642 | 0.8873 | 0.8284 | 0.8258 | 0.3417 | | 0.4379 | 98.0 | 49000 | 0.3053 | 0.7827 | 0.9443 | 0.9023 | 0.7828 | 0.7111 | 0.79 | 0.7874 | 0.8543 | 0.2485 | 0.33 | 0.8195 | 0.8242 | 0.8327 | 0.7516 | 0.8883 | 0.8254 | 0.83 | 0.3167 | | 0.4418 | 99.0 | 49500 | 0.3088 | 0.7777 | 0.9379 | 0.9037 | 0.7765 | 0.7073 | 0.7923 | 0.7806 | 0.8494 | 0.2146 | 0.3282 | 0.8159 | 0.819 | 0.825 | 0.7453 | 0.8867 | 0.8287 | 0.8192 | 0.2833 | | 0.2906 | 100.0 | 50000 | 0.3106 | 0.7833 | 0.9427 | 0.9046 | 0.7741 | 0.7203 | 0.7894 | 0.7899 | 0.8556 | 0.2257 | 0.3311 | 0.8202 | 0.8226 | 0.8185 | 0.7611 | 0.8883 | 0.8244 | 0.8286 | 0.2708 | | 0.3827 | 101.0 | 50500 | 0.3013 | 0.7814 | 0.9387 | 0.9 | 0.7832 | 0.7065 | 0.799 | 0.7836 | 0.8545 
| 0.2251 | 0.3292 | 0.8172 | 0.8205 | 0.829 | 0.7421 | 0.8904 | 0.8339 | 0.8222 | 0.2958 | | 0.3339 | 102.0 | 51000 | 0.2997 | 0.7851 | 0.9457 | 0.8962 | 0.7877 | 0.7196 | 0.7903 | 0.7883 | 0.8481 | 0.2424 | 0.3314 | 0.8212 | 0.825 | 0.8359 | 0.7558 | 0.8834 | 0.8284 | 0.828 | 0.2917 | | 0.357 | 103.0 | 51500 | 0.2963 | 0.7903 | 0.9517 | 0.9017 | 0.7853 | 0.7276 | 0.8004 | 0.7939 | 0.858 | 0.2363 | 0.3335 | 0.8288 | 0.8332 | 0.8319 | 0.7705 | 0.8973 | 0.8394 | 0.8385 | 0.3083 | | 0.4407 | 104.0 | 52000 | 0.3016 | 0.7852 | 0.9482 | 0.9112 | 0.7777 | 0.7203 | 0.8001 | 0.7873 | 0.8577 | 0.2538 | 0.3305 | 0.8196 | 0.8237 | 0.8218 | 0.7579 | 0.8916 | 0.8326 | 0.8276 | 0.3306 | | 0.4194 | 105.0 | 52500 | 0.2960 | 0.7943 | 0.9484 | 0.9091 | 0.7902 | 0.7302 | 0.8199 | 0.7941 | 0.8625 | 0.2392 | 0.3322 | 0.8303 | 0.8346 | 0.8407 | 0.7663 | 0.8967 | 0.8524 | 0.8371 | 0.2931 | | 0.3059 | 106.0 | 53000 | 0.2887 | 0.7964 | 0.9489 | 0.9098 | 0.7968 | 0.7332 | 0.8078 | 0.8042 | 0.8592 | 0.2745 | 0.337 | 0.8335 | 0.8372 | 0.8435 | 0.7726 | 0.8955 | 0.8418 | 0.8461 | 0.3167 | | 0.3604 | 107.0 | 53500 | 0.2979 | 0.7902 | 0.9446 | 0.9084 | 0.7865 | 0.7265 | 0.8061 | 0.7953 | 0.8576 | 0.2311 | 0.3335 | 0.8237 | 0.8296 | 0.8339 | 0.7632 | 0.8919 | 0.842 | 0.8357 | 0.2875 | | 0.3597 | 108.0 | 54000 | 0.2955 | 0.7931 | 0.9517 | 0.9122 | 0.788 | 0.7333 | 0.8022 | 0.7966 | 0.8579 | 0.2544 | 0.3345 | 0.8306 | 0.8342 | 0.8347 | 0.7758 | 0.8922 | 0.8379 | 0.8394 | 0.3056 | | 0.4033 | 109.0 | 54500 | 0.3068 | 0.7823 | 0.9455 | 0.9069 | 0.7722 | 0.7216 | 0.799 | 0.7859 | 0.8529 | 0.2415 | 0.3324 | 0.8192 | 0.8224 | 0.8202 | 0.7589 | 0.888 | 0.8313 | 0.8274 | 0.2875 | | 0.3405 | 110.0 | 55000 | 0.2968 | 0.7905 | 0.9508 | 0.9073 | 0.7773 | 0.7367 | 0.799 | 0.7934 | 0.8577 | 0.2545 | 0.3333 | 0.8242 | 0.8279 | 0.8202 | 0.7705 | 0.8931 | 0.8306 | 0.8325 | 0.3306 | | 0.4235 | 111.0 | 55500 | 0.2834 | 0.796 | 0.9439 | 0.9146 | 0.7929 | 0.734 | 0.8125 | 0.7985 | 0.8609 | 0.2387 | 0.3396 | 0.8318 | 0.8355 | 0.8391 | 0.7705 | 0.897 | 0.846 | 0.8395 | 0.275 | | 0.3267 | 112.0 | 56000 | 0.2922 | 0.7909 | 0.9459 | 0.8997 | 0.7878 | 0.7231 | 0.8014 | 0.7924 | 0.8617 | 0.2623 | 0.3327 | 0.8267 | 0.8307 | 0.8359 | 0.76 | 0.8961 | 0.8337 | 0.8336 | 0.3194 | | 0.4151 | 113.0 | 56500 | 0.2878 | 0.7963 | 0.9485 | 0.9066 | 0.7945 | 0.7302 | 0.8067 | 0.7993 | 0.8643 | 0.2725 | 0.3336 | 0.8297 | 0.8344 | 0.8419 | 0.7621 | 0.8991 | 0.8388 | 0.8363 | 0.3222 | | 0.3984 | 114.0 | 57000 | 0.2887 | 0.7961 | 0.9482 | 0.9034 | 0.7917 | 0.7335 | 0.8072 | 0.8031 | 0.8632 | 0.2598 | 0.3352 | 0.8317 | 0.8364 | 0.8395 | 0.7716 | 0.8982 | 0.8394 | 0.8429 | 0.3056 | | 0.435 | 115.0 | 57500 | 0.2955 | 0.7951 | 0.9509 | 0.9083 | 0.7895 | 0.7356 | 0.8081 | 0.7996 | 0.8601 | 0.2608 | 0.3337 | 0.8293 | 0.8341 | 0.8335 | 0.7747 | 0.894 | 0.8391 | 0.8399 | 0.3097 | | 0.2948 | 116.0 | 58000 | 0.2954 | 0.7912 | 0.9492 | 0.911 | 0.7927 | 0.7214 | 0.808 | 0.7955 | 0.8594 | 0.263 | 0.3315 | 0.8273 | 0.8321 | 0.8403 | 0.7621 | 0.894 | 0.8393 | 0.8355 | 0.3097 | | 0.3556 | 117.0 | 58500 | 0.2893 | 0.8 | 0.949 | 0.9138 | 0.8025 | 0.7364 | 0.8168 | 0.8018 | 0.861 | 0.2696 | 0.3371 | 0.8363 | 0.8403 | 0.848 | 0.7779 | 0.8949 | 0.8499 | 0.8413 | 0.3139 | | 0.4662 | 118.0 | 59000 | 0.2960 | 0.792 | 0.9494 | 0.9061 | 0.7866 | 0.7254 | 0.8113 | 0.7922 | 0.8641 | 0.2575 | 0.3315 | 0.8242 | 0.8288 | 0.8319 | 0.7579 | 0.8967 | 0.8416 | 0.8296 | 0.3014 | | 0.3439 | 119.0 | 59500 | 0.2869 | 0.8048 | 0.9485 | 0.9121 | 0.7975 | 0.7471 | 0.8194 | 0.8073 | 0.8698 | 0.2751 | 0.3363 | 
0.8369 | 0.8415 | 0.8444 | 0.7779 | 0.9021 | 0.8479 | 0.8438 | 0.3236 | | 0.3117 | 120.0 | 60000 | 0.2921 | 0.8007 | 0.9491 | 0.9075 | 0.7956 | 0.7394 | 0.8103 | 0.8041 | 0.8671 | 0.2751 | 0.3369 | 0.8355 | 0.839 | 0.8419 | 0.7768 | 0.8982 | 0.8412 | 0.8429 | 0.3139 | | 0.3281 | 121.0 | 60500 | 0.2870 | 0.8047 | 0.9481 | 0.906 | 0.8027 | 0.7437 | 0.8166 | 0.8088 | 0.8678 | 0.2744 | 0.3363 | 0.839 | 0.8423 | 0.8476 | 0.7789 | 0.9003 | 0.8483 | 0.8455 | 0.3194 | | 0.4216 | 122.0 | 61000 | 0.2867 | 0.8002 | 0.9466 | 0.9099 | 0.7953 | 0.7381 | 0.8133 | 0.8004 | 0.8671 | 0.274 | 0.335 | 0.8344 | 0.8383 | 0.8411 | 0.7747 | 0.8991 | 0.8453 | 0.8417 | 0.3111 | | 0.3912 | 123.0 | 61500 | 0.2869 | 0.799 | 0.9464 | 0.9033 | 0.7943 | 0.7372 | 0.8142 | 0.7997 | 0.8655 | 0.2714 | 0.335 | 0.8348 | 0.8379 | 0.8387 | 0.7758 | 0.8991 | 0.8473 | 0.8402 | 0.3111 | | 0.4112 | 124.0 | 62000 | 0.2943 | 0.7917 | 0.9463 | 0.9072 | 0.7856 | 0.7284 | 0.8082 | 0.7913 | 0.8612 | 0.2981 | 0.3337 | 0.8265 | 0.8297 | 0.8294 | 0.7663 | 0.8934 | 0.8396 | 0.8308 | 0.3194 | | 0.3761 | 125.0 | 62500 | 0.2863 | 0.7997 | 0.9463 | 0.9047 | 0.7952 | 0.7353 | 0.8225 | 0.7974 | 0.8685 | 0.2858 | 0.3367 | 0.8335 | 0.8366 | 0.8387 | 0.7705 | 0.9006 | 0.85 | 0.8365 | 0.3069 | | 0.3983 | 126.0 | 63000 | 0.2887 | 0.7969 | 0.9465 | 0.9018 | 0.7927 | 0.735 | 0.8091 | 0.8003 | 0.8631 | 0.2874 | 0.3347 | 0.8321 | 0.8351 | 0.8371 | 0.7705 | 0.8976 | 0.8419 | 0.8395 | 0.3111 | | 0.449 | 127.0 | 63500 | 0.2940 | 0.7942 | 0.9466 | 0.8993 | 0.7913 | 0.7316 | 0.8021 | 0.7971 | 0.8596 | 0.2904 | 0.3336 | 0.8276 | 0.8308 | 0.8327 | 0.7663 | 0.8934 | 0.8345 | 0.8351 | 0.3153 | | 0.3092 | 128.0 | 64000 | 0.2910 | 0.8006 | 0.9466 | 0.9066 | 0.8017 | 0.7349 | 0.8129 | 0.8011 | 0.8651 | 0.2758 | 0.3348 | 0.8346 | 0.8376 | 0.8427 | 0.7716 | 0.8985 | 0.8462 | 0.84 | 0.3153 | | 0.3511 | 129.0 | 64500 | 0.2894 | 0.8008 | 0.9461 | 0.9134 | 0.7963 | 0.7414 | 0.8151 | 0.8034 | 0.8647 | 0.2618 | 0.3367 | 0.8354 | 0.8383 | 0.8391 | 0.7779 | 0.8979 | 0.8456 | 0.8423 | 0.3069 | | 0.3391 | 130.0 | 65000 | 0.2906 | 0.7979 | 0.9464 | 0.907 | 0.7924 | 0.7358 | 0.8108 | 0.7991 | 0.8654 | 0.2943 | 0.3338 | 0.8334 | 0.8362 | 0.8347 | 0.7747 | 0.8991 | 0.8417 | 0.8401 | 0.3167 | | 0.3646 | 131.0 | 65500 | 0.2919 | 0.7963 | 0.9465 | 0.9108 | 0.7931 | 0.7348 | 0.8134 | 0.7972 | 0.8609 | 0.2881 | 0.3358 | 0.8321 | 0.8352 | 0.8387 | 0.7716 | 0.8952 | 0.8443 | 0.8375 | 0.3125 | | 0.2848 | 132.0 | 66000 | 0.2956 | 0.7931 | 0.9463 | 0.9052 | 0.7891 | 0.7323 | 0.8126 | 0.7939 | 0.858 | 0.2862 | 0.3338 | 0.8303 | 0.8332 | 0.8343 | 0.7726 | 0.8928 | 0.8431 | 0.8343 | 0.3083 | | 0.3886 | 133.0 | 66500 | 0.2907 | 0.7961 | 0.9465 | 0.909 | 0.7912 | 0.7344 | 0.8164 | 0.7946 | 0.8628 | 0.2898 | 0.3344 | 0.8316 | 0.8348 | 0.8347 | 0.7726 | 0.897 | 0.8467 | 0.834 | 0.3167 | | 0.379 | 134.0 | 67000 | 0.2910 | 0.7983 | 0.9464 | 0.911 | 0.791 | 0.7449 | 0.8212 | 0.7967 | 0.859 | 0.2815 | 0.3358 | 0.8336 | 0.8365 | 0.8335 | 0.7811 | 0.8949 | 0.8507 | 0.8353 | 0.3083 | | 0.4092 | 135.0 | 67500 | 0.2907 | 0.7989 | 0.9465 | 0.9144 | 0.7952 | 0.7387 | 0.8159 | 0.7999 | 0.8629 | 0.2875 | 0.3357 | 0.8345 | 0.8374 | 0.8387 | 0.7768 | 0.8967 | 0.8479 | 0.8384 | 0.3083 | | 0.3667 | 136.0 | 68000 | 0.2885 | 0.7981 | 0.9465 | 0.9149 | 0.794 | 0.7381 | 0.8203 | 0.7976 | 0.8623 | 0.2955 | 0.3357 | 0.8341 | 0.8372 | 0.8355 | 0.7789 | 0.8973 | 0.8501 | 0.8367 | 0.3222 | | 0.4176 | 137.0 | 68500 | 0.2865 | 0.8029 | 0.9467 | 0.9171 | 0.7996 | 0.7437 | 0.823 | 0.805 | 0.8654 | 0.2956 | 0.338 | 0.8377 | 0.8409 
| 0.8415 | 0.7811 | 0.9 | 0.8521 | 0.8424 | 0.3222 | | 0.3972 | 138.0 | 69000 | 0.2852 | 0.8013 | 0.9465 | 0.9172 | 0.7962 | 0.7451 | 0.8223 | 0.8014 | 0.8625 | 0.2955 | 0.3368 | 0.8364 | 0.8396 | 0.8387 | 0.7832 | 0.897 | 0.8521 | 0.841 | 0.3222 | | 0.4419 | 139.0 | 69500 | 0.2812 | 0.8044 | 0.9466 | 0.9174 | 0.8004 | 0.7458 | 0.8213 | 0.8066 | 0.867 | 0.2953 | 0.3376 | 0.8389 | 0.842 | 0.8435 | 0.7821 | 0.9003 | 0.8513 | 0.8443 | 0.3222 | | 0.3318 | 140.0 | 70000 | 0.2845 | 0.803 | 0.9466 | 0.915 | 0.7999 | 0.7453 | 0.8212 | 0.8056 | 0.864 | 0.2897 | 0.338 | 0.8377 | 0.8407 | 0.8419 | 0.7821 | 0.8982 | 0.8505 | 0.8437 | 0.3167 | | 0.3842 | 141.0 | 70500 | 0.2822 | 0.8037 | 0.9466 | 0.917 | 0.8013 | 0.745 | 0.8219 | 0.8066 | 0.8647 | 0.2897 | 0.3382 | 0.8381 | 0.8412 | 0.8427 | 0.7811 | 0.8997 | 0.8527 | 0.8437 | 0.3167 | | 0.3998 | 142.0 | 71000 | 0.2853 | 0.8013 | 0.9465 | 0.9149 | 0.7982 | 0.7444 | 0.8201 | 0.8031 | 0.8613 | 0.2953 | 0.3372 | 0.8369 | 0.8399 | 0.8411 | 0.7821 | 0.8964 | 0.8501 | 0.8416 | 0.3222 | | 0.3571 | 143.0 | 71500 | 0.2838 | 0.8009 | 0.9462 | 0.9147 | 0.2991 | 0.8028 | 0.8196 | 0.337 | 0.8369 | 0.84 | 0.3264 | 0.8411 | 0.8505 | 0.7977 | 0.8415 | 0.7436 | 0.7811 | 0.8615 | 0.8973 | | 0.3332 | 144.0 | 72000 | 0.2837 | 0.801 | 0.9464 | 0.9144 | 0.2932 | 0.803 | 0.8209 | 0.3369 | 0.8368 | 0.8398 | 0.3208 | 0.8413 | 0.8516 | 0.7989 | 0.8423 | 0.7419 | 0.7789 | 0.8623 | 0.8982 | | 0.3259 | 145.0 | 72500 | 0.2837 | 0.8018 | 0.9464 | 0.9147 | 0.2935 | 0.804 | 0.8209 | 0.3373 | 0.8375 | 0.8405 | 0.3208 | 0.8418 | 0.8516 | 0.7997 | 0.8435 | 0.7432 | 0.78 | 0.8623 | 0.8979 | | 0.3873 | 146.0 | 73000 | 0.2839 | 0.8016 | 0.9463 | 0.9145 | 0.2935 | 0.8032 | 0.8209 | 0.3367 | 0.8371 | 0.8401 | 0.3208 | 0.8413 | 0.8516 | 0.7991 | 0.8423 | 0.7434 | 0.78 | 0.8622 | 0.8979 | | 0.3318 | 147.0 | 73500 | 0.2836 | 0.8017 | 0.9463 | 0.9145 | 0.2935 | 0.8036 | 0.8209 | 0.3373 | 0.8374 | 0.8404 | 0.3208 | 0.842 | 0.8516 | 0.7987 | 0.8419 | 0.7437 | 0.7811 | 0.8627 | 0.8982 | | 0.3442 | 148.0 | 74000 | 0.2836 | 0.8014 | 0.9463 | 0.9145 | 0.2935 | 0.8031 | 0.8211 | 0.3373 | 0.8373 | 0.8403 | 0.3208 | 0.8417 | 0.8517 | 0.7989 | 0.8423 | 0.7423 | 0.78 | 0.8631 | 0.8985 | | 0.4478 | 149.0 | 74500 | 0.2836 | 0.8018 | 0.9463 | 0.9146 | 0.2935 | 0.8039 | 0.8211 | 0.3373 | 0.8376 | 0.8406 | 0.3208 | 0.8422 | 0.8517 | 0.7988 | 0.8423 | 0.7436 | 0.7811 | 0.8631 | 0.8985 | | 0.3662 | 150.0 | 75000 | 0.2836 | 0.8018 | 0.9463 | 0.9145 | 0.2935 | 0.8039 | 0.8211 | 0.3373 | 0.8376 | 0.8406 | 0.3208 | 0.8422 | 0.8517 | 0.7988 | 0.8423 | 0.7436 | 0.7811 | 0.8631 | 0.8985 | ### Framework versions - Transformers 4.46.0 - Pytorch 2.5.0+cu121 - Datasets 2.19.2 - Tokenizers 0.20.1
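Since the card identifies this checkpoint as a fine-tune of facebook/detr-resnet-50, a minimal inference sketch is given below. It assumes the repository loads with the standard `transformers` object-detection auto classes; the image path and the 0.5 confidence threshold are placeholder choices, not values taken from the card.

```python
# Minimal sketch, assuming the checkpoint works with the standard DETR API in transformers.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo = "joe611/chickens-composite-101818181818-150-epochs-w-transform"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForObjectDetection.from_pretrained(repo)

image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to labelled boxes in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, target_sizes=target_sizes, threshold=0.5
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```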
[ "chicken", "duck", "plant" ]
Garon16/rtdetr_r50vd_russia_plate_detector_lightning
## Model description

A detector for license plates on vehicles registered in the Russian Federation. At the moment it predicts 2 classes, n_p and p_p: regular plates and police plates.

## Intended uses & limitations

Usage example:
<pre>
from PIL import Image  # needed for Image.open below

from transformers import AutoModelForObjectDetection, AutoImageProcessor
import torch
import supervision as sv

DEVICE = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load the fine-tuned detector and its image processor
model = AutoModelForObjectDetection.from_pretrained('Garon16/rtdetr_r50vd_russia_plate_detector_lightning').to(DEVICE)
processor = AutoImageProcessor.from_pretrained('Garon16/rtdetr_r50vd_russia_plate_detector_lightning')

path = 'path/to/image'
image = Image.open(path)
inputs = processor(image, return_tensors="pt").to(DEVICE)

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to boxes in pixel coordinates and drop low-confidence detections
w, h = image.size
results = processor.post_process_object_detection(
    outputs, target_sizes=[(h, w)], threshold=0.3)
detections = sv.Detections.from_transformers(results[0]).with_nms(0.3)

labels = [
    model.config.id2label[class_id]
    for class_id
    in detections.class_id
]

# Draw the detections and display the result
annotated_image = image.copy()
annotated_image = sv.BoundingBoxAnnotator().annotate(annotated_image, detections)
annotated_image = sv.LabelAnnotator().annotate(annotated_image, detections, labels=labels)

grid = sv.create_tiles(
    [annotated_image],
    grid_size=(1, 1),
    single_tile_size=(512, 512),
    tile_padding_color=sv.Color.WHITE,
    tile_margin_color=sv.Color.WHITE
)

sv.plot_image(grid, size=(10, 10))
</pre>

## Training and evaluation data

Trained on my own dataset - https://universe.roboflow.com/testcarplate/russian-license-plates-classification-by-this-type

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- seed: 42
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 20

### Training results

I have not yet figured out how to push everything here automatically when fine-tuning with Lightning.

### Framework versions

- Transformers 4.46.0.dev0
- Pytorch 2.5.0+cu124
- Tokenizers 0.20.1
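The training-results note above says the Lightning fine-tuning outputs were not pushed to the Hub automatically. As an aside that is not part of the original card, one manual way to upload a saved checkpoint and its logs with `huggingface_hub` is sketched below; the local folder path is a hypothetical example, and authentication (`huggingface-cli login` or an `HF_TOKEN`) is assumed to be configured.

```python
# Sketch only: manually upload a locally saved checkpoint / training logs to this repo.
from huggingface_hub import HfApi

api = HfApi()
api.upload_folder(
    folder_path="outputs/rtdetr_r50vd_plates",  # hypothetical local output directory
    repo_id="Garon16/rtdetr_r50vd_russia_plate_detector_lightning",
    repo_type="model",
    commit_message="Add fine-tuned checkpoint and training logs",
)
```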
[ "car-plates-and-these-types", "n_p", "p_p" ]
conjuncts/ditr-e15
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a πŸ€— transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4" ]
joe611/chickens-composite-201616161616-150-epochs-wo-transform
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-composite-201616161616-150-epochs-wo-transform This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3090 - Map: 0.8062 - Map 50: 0.9477 - Map 75: 0.9049 - Map Small: 0.3781 - Map Medium: 0.8098 - Map Large: 0.8149 - Mar 1: 0.3198 - Mar 10: 0.8436 - Mar 100: 0.8464 - Mar Small: 0.42 - Mar Medium: 0.8519 - Mar Large: 0.8418 - Map Chicken: 0.8066 - Mar 100 Chicken: 0.848 - Map Duck: 0.7377 - Mar 100 Duck: 0.7907 - Map Plant: 0.8745 - Mar 100 Plant: 0.9006 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 150 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Chicken | Map Duck | Map Large | Map Medium | Map Plant | Map Small | Mar 1 | Mar 10 | Mar 100 | Mar 100 Chicken | Mar 100 Duck | Mar 100 Plant | Mar Large | Mar Medium | Mar Small | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:-----------:|:--------:|:---------:|:----------:|:---------:|:---------:|:------:|:------:|:-------:|:---------------:|:------------:|:-------------:|:---------:|:----------:|:---------:| | 1.4189 | 1.0 | 500 | 1.4184 | 0.0936 | 0.1495 | 0.1003 | 0.0658 | 0.0 | 0.1471 | 0.0451 | 0.2149 | 0.006 | 0.0902 | 0.243 | 0.3636 | 0.35 | 0.0 | 0.7409 | 0.4113 | 0.3359 | 0.1086 | | 1.2359 | 2.0 | 1000 | 1.1018 | 0.2288 | 0.3272 | 0.2595 | 0.1521 | 0.0 | 0.2598 | 0.1378 | 0.5344 | 0.0257 | 0.1116 | 0.3378 | 0.3641 | 0.3321 | 0.0 | 0.7603 | 0.3819 | 0.3438 | 0.189 | | 1.0059 | 3.0 | 1500 | 0.9680 | 0.2906 | 0.416 | 0.3265 | 0.2406 | 0.0 | 0.3274 | 0.2479 | 0.6313 | 0.0222 | 0.1195 | 0.4045 | 0.415 | 0.4813 | 0.0 | 0.7636 | 0.4578 | 0.3856 | 0.1486 | | 0.9673 | 4.0 | 2000 | 0.8671 | 0.3371 | 0.4672 | 0.3871 | 0.33 | 0.0 | 0.3723 | 0.2863 | 0.6812 | 0.0316 | 0.1249 | 0.4623 | 0.4733 | 0.6468 | 0.0 | 0.773 | 0.4815 | 0.4452 | 0.1624 | | 0.836 | 5.0 | 2500 | 0.7606 | 0.3693 | 0.5023 | 0.4341 | 0.4173 | 0.0 | 0.3973 | 0.3329 | 0.6905 | 0.0119 | 0.1271 | 0.4822 | 0.4903 | 0.7202 | 0.0 | 0.7506 | 0.5133 | 0.4517 | 0.1233 | | 0.807 | 6.0 | 3000 | 0.7029 | 0.4041 | 0.5463 | 0.4781 | 0.4873 | 0.0 | 0.4151 | 0.3707 | 0.7251 | 0.0446 | 0.134 | 0.4968 | 0.5011 | 0.7278 | 0.0 | 0.7755 | 0.5198 | 0.4743 | 0.1262 | | 0.7215 | 7.0 | 3500 | 0.6856 | 0.4068 | 0.5627 | 0.484 | 0.4951 | 0.0 | 0.4093 | 0.3931 | 0.7251 | 0.0591 | 0.1346 | 0.4909 | 0.4956 | 0.7131 | 0.0 | 0.7736 | 0.4966 | 0.482 | 0.2029 | | 0.702 | 8.0 | 4000 | 0.6720 | 0.405 | 0.5706 | 0.4678 | 0.4712 | 0.0079 | 0.4229 | 0.3853 | 0.7359 | 0.0624 | 0.1356 | 0.4891 | 0.4921 | 0.6885 | 0.0041 | 0.7836 | 0.5092 | 0.4683 | 0.1767 | | 0.6445 | 9.0 | 4500 | 0.6025 | 0.4383 | 0.5797 | 0.512 | 0.5491 | 0.0126 | 0.4464 | 0.4142 | 0.7532 | 0.0952 | 0.1438 | 0.5191 | 0.5217 | 0.7595 | 0.0093 | 0.7964 | 0.5261 | 0.5036 | 0.1843 | | 0.6489 | 10.0 | 5000 | 0.5862 | 0.441 | 0.5931 | 0.5114 | 
0.5704 | 0.0001 | 0.4544 | 0.4234 | 0.7525 | 0.0755 | 0.1472 | 0.513 | 0.5183 | 0.7579 | 0.001 | 0.7961 | 0.5302 | 0.4977 | 0.2205 | | 0.5365 | 11.0 | 5500 | 0.5721 | 0.4608 | 0.6083 | 0.5527 | 0.6097 | 0.0005 | 0.484 | 0.4265 | 0.7723 | 0.1472 | 0.146 | 0.5217 | 0.5258 | 0.7563 | 0.0113 | 0.8097 | 0.5324 | 0.4993 | 0.2176 | | 0.6133 | 12.0 | 6000 | 0.5765 | 0.4598 | 0.6266 | 0.5497 | 0.6036 | 0.002 | 0.4731 | 0.4342 | 0.7739 | 0.1462 | 0.1445 | 0.5082 | 0.5105 | 0.719 | 0.001 | 0.8115 | 0.5259 | 0.4883 | 0.2243 | | 0.5347 | 13.0 | 6500 | 0.5195 | 0.4942 | 0.6438 | 0.5793 | 0.6989 | 0.005 | 0.5058 | 0.4809 | 0.7788 | 0.099 | 0.1557 | 0.5283 | 0.5328 | 0.771 | 0.0082 | 0.8191 | 0.5426 | 0.5171 | 0.2148 | | 0.6003 | 14.0 | 7000 | 0.5105 | 0.6071 | 0.7808 | 0.7224 | 0.6902 | 0.3454 | 0.5728 | 0.6024 | 0.7858 | 0.1158 | 0.2228 | 0.6417 | 0.6466 | 0.7603 | 0.3598 | 0.8197 | 0.5991 | 0.6459 | 0.2476 | | 0.5419 | 15.0 | 7500 | 0.4853 | 0.6657 | 0.8721 | 0.7854 | 0.6882 | 0.521 | 0.6434 | 0.6656 | 0.7878 | 0.1688 | 0.267 | 0.7053 | 0.7094 | 0.7516 | 0.5474 | 0.8291 | 0.6811 | 0.7105 | 0.2462 | | 0.4817 | 16.0 | 8000 | 0.4858 | 0.6585 | 0.891 | 0.8108 | 0.6822 | 0.5003 | 0.6301 | 0.6567 | 0.7929 | 0.1987 | 0.262 | 0.6981 | 0.706 | 0.7456 | 0.5433 | 0.8291 | 0.6706 | 0.7095 | 0.3162 | | 0.4383 | 17.0 | 8500 | 0.4707 | 0.6835 | 0.9091 | 0.8175 | 0.6977 | 0.5722 | 0.7097 | 0.6701 | 0.7807 | 0.1384 | 0.2816 | 0.7329 | 0.7362 | 0.7556 | 0.633 | 0.82 | 0.7566 | 0.7288 | 0.2571 | | 0.4724 | 18.0 | 9000 | 0.4423 | 0.6925 | 0.9001 | 0.8185 | 0.72 | 0.5613 | 0.669 | 0.6895 | 0.7962 | 0.1634 | 0.2794 | 0.7328 | 0.7398 | 0.7742 | 0.6082 | 0.837 | 0.7044 | 0.7423 | 0.2729 | | 0.4734 | 19.0 | 9500 | 0.4489 | 0.6943 | 0.925 | 0.8334 | 0.6993 | 0.5869 | 0.7296 | 0.6902 | 0.7968 | 0.1019 | 0.2833 | 0.739 | 0.7457 | 0.7611 | 0.6402 | 0.8358 | 0.7828 | 0.7432 | 0.239 | | 0.4607 | 20.0 | 10000 | 0.4168 | 0.7182 | 0.9136 | 0.8371 | 0.7329 | 0.6123 | 0.6812 | 0.7224 | 0.8093 | 0.2114 | 0.2921 | 0.7557 | 0.7639 | 0.7885 | 0.6546 | 0.8485 | 0.7202 | 0.7688 | 0.3267 | | 0.4668 | 21.0 | 10500 | 0.4156 | 0.7174 | 0.9383 | 0.8641 | 0.7111 | 0.6483 | 0.7311 | 0.7122 | 0.7929 | 0.1959 | 0.2917 | 0.7626 | 0.7691 | 0.769 | 0.7 | 0.8382 | 0.7837 | 0.765 | 0.3 | | 0.4788 | 22.0 | 11000 | 0.4492 | 0.7012 | 0.9289 | 0.8502 | 0.7142 | 0.6032 | 0.7165 | 0.6948 | 0.7863 | 0.1819 | 0.283 | 0.7438 | 0.7507 | 0.7694 | 0.6546 | 0.8279 | 0.7597 | 0.7453 | 0.3648 | | 0.429 | 23.0 | 11500 | 0.3838 | 0.7329 | 0.9342 | 0.8464 | 0.7442 | 0.6438 | 0.7293 | 0.7296 | 0.8108 | 0.1733 | 0.2989 | 0.7737 | 0.7788 | 0.7893 | 0.6948 | 0.8521 | 0.7658 | 0.7804 | 0.3052 | | 0.3793 | 24.0 | 12000 | 0.4072 | 0.7195 | 0.9366 | 0.8658 | 0.7225 | 0.6405 | 0.7439 | 0.7176 | 0.7956 | 0.1725 | 0.2903 | 0.7611 | 0.7663 | 0.7694 | 0.699 | 0.8306 | 0.7797 | 0.7668 | 0.3271 | | 0.4216 | 25.0 | 12500 | 0.3804 | 0.7311 | 0.9449 | 0.851 | 0.7192 | 0.6578 | 0.7329 | 0.7272 | 0.8162 | 0.1829 | 0.296 | 0.7725 | 0.7797 | 0.7706 | 0.7175 | 0.8509 | 0.7732 | 0.7791 | 0.3624 | | 0.377 | 26.0 | 13000 | 0.3855 | 0.7378 | 0.9433 | 0.8847 | 0.731 | 0.6711 | 0.7461 | 0.7317 | 0.8113 | 0.2488 | 0.2972 | 0.7779 | 0.7838 | 0.7881 | 0.7165 | 0.8467 | 0.7808 | 0.7801 | 0.3995 | | 0.4127 | 27.0 | 13500 | 0.3796 | 0.7455 | 0.9505 | 0.8938 | 0.7453 | 0.6687 | 0.7799 | 0.7366 | 0.8225 | 0.204 | 0.2992 | 0.7836 | 0.7898 | 0.794 | 0.7227 | 0.8527 | 0.8125 | 0.7811 | 0.3662 | | 0.3795 | 28.0 | 14000 | 0.3757 | 0.74 | 0.9418 | 0.8831 | 0.7246 | 0.6699 | 0.7467 | 0.7416 | 0.8254 | 0.1898 | 0.2961 
| 0.7875 | 0.792 | 0.7758 | 0.7381 | 0.8621 | 0.7926 | 0.7914 | 0.3824 | | 0.3813 | 29.0 | 14500 | 0.3740 | 0.7575 | 0.9437 | 0.8934 | 0.7547 | 0.6932 | 0.7781 | 0.7456 | 0.8248 | 0.2114 | 0.3076 | 0.7965 | 0.8 | 0.798 | 0.7464 | 0.8555 | 0.8093 | 0.7965 | 0.3719 | | 0.378 | 30.0 | 15000 | 0.3861 | 0.7291 | 0.9369 | 0.8584 | 0.7424 | 0.6404 | 0.7846 | 0.7175 | 0.8043 | 0.099 | 0.2978 | 0.7677 | 0.7745 | 0.7913 | 0.6928 | 0.8394 | 0.8207 | 0.7666 | 0.2948 | | 0.373 | 31.0 | 15500 | 0.3934 | 0.7285 | 0.9401 | 0.873 | 0.7328 | 0.6574 | 0.7576 | 0.7209 | 0.7953 | 0.2318 | 0.2915 | 0.7745 | 0.7771 | 0.7798 | 0.7165 | 0.8352 | 0.7938 | 0.7758 | 0.3552 | | 0.3605 | 32.0 | 16000 | 0.3740 | 0.7465 | 0.9369 | 0.881 | 0.7505 | 0.6593 | 0.7891 | 0.7404 | 0.8296 | 0.2254 | 0.299 | 0.7867 | 0.7914 | 0.7968 | 0.7103 | 0.867 | 0.8212 | 0.7875 | 0.3819 | | 0.3834 | 33.0 | 16500 | 0.3616 | 0.752 | 0.9492 | 0.8881 | 0.7576 | 0.6801 | 0.7652 | 0.7524 | 0.8183 | 0.2199 | 0.3045 | 0.7944 | 0.7995 | 0.8024 | 0.7412 | 0.8548 | 0.8003 | 0.7999 | 0.3919 | | 0.3395 | 34.0 | 17000 | 0.3471 | 0.7568 | 0.9405 | 0.8877 | 0.7747 | 0.6601 | 0.7763 | 0.7639 | 0.8355 | 0.2609 | 0.309 | 0.797 | 0.802 | 0.8179 | 0.7175 | 0.8706 | 0.8061 | 0.8074 | 0.3986 | | 0.3524 | 35.0 | 17500 | 0.3641 | 0.75 | 0.9373 | 0.8923 | 0.7655 | 0.6642 | 0.7678 | 0.7448 | 0.8204 | 0.2799 | 0.3029 | 0.7948 | 0.7983 | 0.8163 | 0.7206 | 0.8579 | 0.8007 | 0.7961 | 0.4638 | | 0.3454 | 36.0 | 18000 | 0.3711 | 0.7352 | 0.9449 | 0.8728 | 0.7263 | 0.6544 | 0.7408 | 0.7412 | 0.8249 | 0.2424 | 0.2949 | 0.7796 | 0.7873 | 0.7762 | 0.7268 | 0.8588 | 0.7861 | 0.7925 | 0.3929 | | 0.3348 | 37.0 | 18500 | 0.3561 | 0.7587 | 0.9447 | 0.8864 | 0.7634 | 0.6775 | 0.7671 | 0.7575 | 0.8352 | 0.272 | 0.3012 | 0.8026 | 0.8067 | 0.8119 | 0.7402 | 0.8679 | 0.8109 | 0.806 | 0.4529 | | 0.3524 | 38.0 | 19000 | 0.3616 | 0.7429 | 0.9499 | 0.8843 | 0.7344 | 0.6575 | 0.7591 | 0.7393 | 0.8369 | 0.2417 | 0.2964 | 0.7901 | 0.7947 | 0.7861 | 0.7278 | 0.8703 | 0.8104 | 0.7909 | 0.4024 | | 0.339 | 39.0 | 19500 | 0.3373 | 0.7645 | 0.95 | 0.89 | 0.7729 | 0.6836 | 0.7782 | 0.7616 | 0.8372 | 0.2201 | 0.3067 | 0.8037 | 0.8089 | 0.8143 | 0.7412 | 0.8712 | 0.81 | 0.8081 | 0.3752 | | 0.3661 | 40.0 | 20000 | 0.3346 | 0.7663 | 0.9408 | 0.8873 | 0.7855 | 0.6746 | 0.7725 | 0.7683 | 0.8388 | 0.2942 | 0.3102 | 0.807 | 0.8105 | 0.8282 | 0.7309 | 0.8724 | 0.8038 | 0.8123 | 0.3962 | | 0.3243 | 41.0 | 20500 | 0.3424 | 0.7617 | 0.9504 | 0.8841 | 0.7744 | 0.6797 | 0.7761 | 0.755 | 0.8309 | 0.3182 | 0.3064 | 0.8035 | 0.8076 | 0.8183 | 0.7402 | 0.8642 | 0.8052 | 0.805 | 0.4267 | | 0.3512 | 42.0 | 21000 | 0.3359 | 0.7681 | 0.9478 | 0.892 | 0.7625 | 0.7064 | 0.7852 | 0.7658 | 0.8355 | 0.2748 | 0.3097 | 0.8089 | 0.8133 | 0.8087 | 0.7619 | 0.8694 | 0.8279 | 0.8132 | 0.3414 | | 0.349 | 43.0 | 21500 | 0.3473 | 0.7676 | 0.95 | 0.8899 | 0.7815 | 0.6877 | 0.7778 | 0.7687 | 0.8335 | 0.2164 | 0.3064 | 0.8034 | 0.8096 | 0.8226 | 0.7433 | 0.863 | 0.8093 | 0.8121 | 0.3619 | | 0.3235 | 44.0 | 22000 | 0.3531 | 0.7597 | 0.9479 | 0.8822 | 0.7625 | 0.6915 | 0.7771 | 0.7572 | 0.8251 | 0.2344 | 0.3031 | 0.7985 | 0.804 | 0.8008 | 0.7505 | 0.8606 | 0.8056 | 0.8052 | 0.3324 | | 0.3488 | 45.0 | 22500 | 0.3484 | 0.7676 | 0.9475 | 0.8917 | 0.774 | 0.699 | 0.7666 | 0.7697 | 0.83 | 0.2819 | 0.3019 | 0.8063 | 0.81 | 0.8135 | 0.7505 | 0.8661 | 0.8019 | 0.8138 | 0.3824 | | 0.2906 | 46.0 | 23000 | 0.3335 | 0.7655 | 0.9394 | 0.8864 | 0.7703 | 0.6799 | 0.7668 | 0.7713 | 0.8465 | 0.2598 | 0.3101 | 0.8062 | 0.8126 | 0.8242 | 0.733 | 0.8806 | 
0.8049 | 0.8191 | 0.3852 | | 0.331 | 47.0 | 23500 | 0.3373 | 0.7638 | 0.9499 | 0.8993 | 0.7562 | 0.6962 | 0.7724 | 0.7671 | 0.8389 | 0.2289 | 0.3021 | 0.8032 | 0.8093 | 0.798 | 0.7546 | 0.8752 | 0.8085 | 0.8137 | 0.3781 | | 0.3054 | 48.0 | 24000 | 0.3431 | 0.7687 | 0.9395 | 0.8871 | 0.7662 | 0.7065 | 0.7524 | 0.7732 | 0.8333 | 0.3018 | 0.3145 | 0.8102 | 0.8147 | 0.8171 | 0.7557 | 0.8715 | 0.7909 | 0.818 | 0.3662 | | 0.3285 | 49.0 | 24500 | 0.3563 | 0.7611 | 0.9392 | 0.8976 | 0.7612 | 0.694 | 0.7595 | 0.7609 | 0.8279 | 0.2504 | 0.3064 | 0.8007 | 0.805 | 0.8083 | 0.7454 | 0.8612 | 0.7994 | 0.8032 | 0.3229 | | 0.2877 | 50.0 | 25000 | 0.3290 | 0.7787 | 0.9515 | 0.9045 | 0.7896 | 0.7016 | 0.7985 | 0.7764 | 0.845 | 0.2946 | 0.3091 | 0.8117 | 0.8166 | 0.8266 | 0.7485 | 0.8748 | 0.8302 | 0.8115 | 0.4014 | | 0.2581 | 51.0 | 25500 | 0.3212 | 0.7772 | 0.952 | 0.9035 | 0.7854 | 0.6998 | 0.7957 | 0.7744 | 0.8463 | 0.3216 | 0.3086 | 0.8158 | 0.8204 | 0.827 | 0.7546 | 0.8797 | 0.8265 | 0.8181 | 0.4281 | | 0.3294 | 52.0 | 26000 | 0.3419 | 0.7549 | 0.9453 | 0.9045 | 0.7647 | 0.6722 | 0.7788 | 0.7531 | 0.8279 | 0.3165 | 0.3002 | 0.7961 | 0.8007 | 0.8123 | 0.7278 | 0.8618 | 0.8152 | 0.7967 | 0.4214 | | 0.3186 | 53.0 | 26500 | 0.3245 | 0.7788 | 0.9473 | 0.9007 | 0.7868 | 0.7016 | 0.7887 | 0.7819 | 0.8482 | 0.3224 | 0.3127 | 0.8179 | 0.8214 | 0.8286 | 0.7546 | 0.8809 | 0.8262 | 0.8225 | 0.4005 | | 0.2737 | 54.0 | 27000 | 0.3353 | 0.777 | 0.9517 | 0.9005 | 0.7757 | 0.7134 | 0.7899 | 0.7804 | 0.8419 | 0.3144 | 0.3101 | 0.8159 | 0.8196 | 0.8194 | 0.7649 | 0.8745 | 0.8271 | 0.821 | 0.4314 | | 0.284 | 55.0 | 27500 | 0.3392 | 0.7707 | 0.9467 | 0.9052 | 0.7855 | 0.6861 | 0.7925 | 0.7632 | 0.8406 | 0.3251 | 0.3073 | 0.811 | 0.8157 | 0.8246 | 0.7505 | 0.8721 | 0.8318 | 0.8098 | 0.4452 | | 0.2977 | 56.0 | 28000 | 0.3513 | 0.7644 | 0.9454 | 0.8968 | 0.7693 | 0.6974 | 0.783 | 0.7645 | 0.8264 | 0.2114 | 0.3089 | 0.8022 | 0.8065 | 0.8063 | 0.7546 | 0.8585 | 0.818 | 0.8088 | 0.2871 | | 0.2618 | 57.0 | 28500 | 0.3346 | 0.771 | 0.9458 | 0.9069 | 0.7792 | 0.6953 | 0.7945 | 0.7682 | 0.8384 | 0.2861 | 0.3087 | 0.8078 | 0.8125 | 0.8194 | 0.7464 | 0.8715 | 0.8314 | 0.81 | 0.3719 | | 0.2878 | 58.0 | 29000 | 0.3335 | 0.7698 | 0.9509 | 0.8967 | 0.7784 | 0.6958 | 0.7841 | 0.7729 | 0.8351 | 0.2766 | 0.3093 | 0.8106 | 0.8157 | 0.8155 | 0.7598 | 0.8718 | 0.8252 | 0.8189 | 0.351 | | 0.2697 | 59.0 | 29500 | 0.3205 | 0.7738 | 0.9504 | 0.8924 | 0.7715 | 0.71 | 0.7812 | 0.7748 | 0.8398 | 0.2981 | 0.3122 | 0.8146 | 0.8209 | 0.8147 | 0.7701 | 0.8779 | 0.8221 | 0.8203 | 0.3871 | | 0.2926 | 60.0 | 30000 | 0.3282 | 0.7727 | 0.9489 | 0.8882 | 0.7808 | 0.6969 | 0.7753 | 0.7787 | 0.8404 | 0.2269 | 0.3106 | 0.8129 | 0.8166 | 0.821 | 0.7567 | 0.8721 | 0.8186 | 0.8205 | 0.3119 | | 0.271 | 61.0 | 30500 | 0.3241 | 0.7802 | 0.954 | 0.9047 | 0.7814 | 0.7138 | 0.7901 | 0.7804 | 0.8456 | 0.2802 | 0.3123 | 0.8161 | 0.8209 | 0.8194 | 0.7649 | 0.8782 | 0.8275 | 0.8222 | 0.3771 | | 0.2992 | 62.0 | 31000 | 0.3143 | 0.7839 | 0.9492 | 0.907 | 0.7906 | 0.7102 | 0.7935 | 0.7871 | 0.8508 | 0.3109 | 0.3172 | 0.8238 | 0.8261 | 0.8333 | 0.7629 | 0.8821 | 0.8283 | 0.8309 | 0.3733 | | 0.284 | 63.0 | 31500 | 0.3144 | 0.7787 | 0.9491 | 0.8891 | 0.7871 | 0.699 | 0.7912 | 0.7844 | 0.8501 | 0.2787 | 0.308 | 0.821 | 0.8239 | 0.8294 | 0.7598 | 0.8824 | 0.8367 | 0.8267 | 0.35 | | 0.2684 | 64.0 | 32000 | 0.3237 | 0.7773 | 0.9516 | 0.8844 | 0.7782 | 0.7021 | 0.7707 | 0.78 | 0.8515 | 0.2919 | 0.3141 | 0.8152 | 0.8176 | 0.8202 | 0.7515 | 0.8809 | 0.808 | 0.8205 | 0.3395 | | 0.3161 | 
65.0 | 32500 | 0.3291 | 0.7801 | 0.9494 | 0.9059 | 0.7725 | 0.7196 | 0.7927 | 0.7753 | 0.8481 | 0.3206 | 0.3147 | 0.819 | 0.8223 | 0.8147 | 0.7722 | 0.88 | 0.8287 | 0.8181 | 0.4333 | | 0.301 | 66.0 | 33000 | 0.3339 | 0.7771 | 0.948 | 0.9046 | 0.7675 | 0.7173 | 0.7665 | 0.7793 | 0.8464 | 0.3027 | 0.3103 | 0.8188 | 0.8233 | 0.8163 | 0.7742 | 0.8794 | 0.8129 | 0.8233 | 0.4171 | | 0.2677 | 67.0 | 33500 | 0.3367 | 0.7712 | 0.9325 | 0.8993 | 0.7946 | 0.6735 | 0.7505 | 0.7776 | 0.8455 | 0.2573 | 0.3082 | 0.8126 | 0.8161 | 0.8381 | 0.7299 | 0.8803 | 0.7783 | 0.825 | 0.3586 | | 0.2908 | 68.0 | 34000 | 0.3247 | 0.7873 | 0.9527 | 0.9063 | 0.7874 | 0.7241 | 0.7972 | 0.7877 | 0.8504 | 0.3449 | 0.3129 | 0.824 | 0.8286 | 0.8286 | 0.7763 | 0.8809 | 0.8281 | 0.8258 | 0.4362 | | 0.2662 | 69.0 | 34500 | 0.3373 | 0.7799 | 0.9554 | 0.8942 | 0.7745 | 0.7265 | 0.7791 | 0.7805 | 0.8386 | 0.3077 | 0.3122 | 0.8172 | 0.8192 | 0.8135 | 0.7701 | 0.8739 | 0.8154 | 0.8189 | 0.4067 | | 0.2639 | 70.0 | 35000 | 0.3118 | 0.7864 | 0.9464 | 0.9049 | 0.7905 | 0.713 | 0.7836 | 0.7932 | 0.8558 | 0.2985 | 0.3097 | 0.8262 | 0.8293 | 0.8294 | 0.7722 | 0.8864 | 0.8215 | 0.8374 | 0.3795 | | 0.247 | 71.0 | 35500 | 0.3174 | 0.7923 | 0.9503 | 0.9119 | 0.7995 | 0.7269 | 0.7785 | 0.7963 | 0.8506 | 0.3615 | 0.3183 | 0.8297 | 0.8317 | 0.8389 | 0.7753 | 0.8809 | 0.8195 | 0.8348 | 0.4429 | | 0.2794 | 72.0 | 36000 | 0.3248 | 0.7729 | 0.9411 | 0.892 | 0.7716 | 0.691 | 0.7519 | 0.7761 | 0.8561 | 0.3142 | 0.3105 | 0.8073 | 0.8108 | 0.8135 | 0.734 | 0.8848 | 0.7818 | 0.8167 | 0.3781 | | 0.2885 | 73.0 | 36500 | 0.3279 | 0.78 | 0.9512 | 0.9064 | 0.7741 | 0.7185 | 0.775 | 0.7847 | 0.8476 | 0.3474 | 0.311 | 0.8175 | 0.82 | 0.8155 | 0.768 | 0.8764 | 0.8113 | 0.8242 | 0.3933 | | 0.2582 | 74.0 | 37000 | 0.3187 | 0.7877 | 0.9519 | 0.9094 | 0.7903 | 0.7278 | 0.796 | 0.7885 | 0.8449 | 0.3513 | 0.3134 | 0.8268 | 0.829 | 0.8341 | 0.7742 | 0.8788 | 0.8335 | 0.8315 | 0.3976 | | 0.2919 | 75.0 | 37500 | 0.3069 | 0.7886 | 0.9548 | 0.8997 | 0.7922 | 0.7164 | 0.7895 | 0.7928 | 0.8572 | 0.3415 | 0.3113 | 0.8252 | 0.8276 | 0.8317 | 0.7629 | 0.8882 | 0.8235 | 0.833 | 0.4029 | | 0.2486 | 76.0 | 38000 | 0.3251 | 0.783 | 0.9543 | 0.9069 | 0.7759 | 0.7245 | 0.7694 | 0.7902 | 0.8486 | 0.298 | 0.3132 | 0.8207 | 0.8253 | 0.823 | 0.7742 | 0.8788 | 0.8063 | 0.832 | 0.4029 | | 0.2511 | 77.0 | 38500 | 0.3145 | 0.7889 | 0.9538 | 0.9012 | 0.7925 | 0.7189 | 0.7716 | 0.7971 | 0.8552 | 0.3503 | 0.3147 | 0.8309 | 0.8331 | 0.8369 | 0.7732 | 0.8891 | 0.8134 | 0.8423 | 0.3986 | | 0.3084 | 78.0 | 39000 | 0.3236 | 0.7853 | 0.9503 | 0.907 | 0.7734 | 0.7293 | 0.7822 | 0.7932 | 0.8532 | 0.3216 | 0.3133 | 0.8231 | 0.8264 | 0.8187 | 0.7763 | 0.8842 | 0.8177 | 0.8328 | 0.3805 | | 0.275 | 79.0 | 39500 | 0.3250 | 0.7857 | 0.9519 | 0.8981 | 0.7818 | 0.7144 | 0.7705 | 0.7921 | 0.8609 | 0.3714 | 0.3128 | 0.8241 | 0.8297 | 0.8278 | 0.7691 | 0.8921 | 0.8058 | 0.837 | 0.451 | | 0.2193 | 80.0 | 40000 | 0.3163 | 0.7829 | 0.9493 | 0.889 | 0.7823 | 0.7046 | 0.761 | 0.7929 | 0.8619 | 0.3513 | 0.3125 | 0.8213 | 0.8241 | 0.8254 | 0.7557 | 0.8912 | 0.8027 | 0.8319 | 0.4143 | | 0.2683 | 81.0 | 40500 | 0.3104 | 0.7889 | 0.95 | 0.8996 | 0.7958 | 0.7088 | 0.7828 | 0.7896 | 0.8621 | 0.3496 | 0.3168 | 0.8271 | 0.8315 | 0.8349 | 0.7691 | 0.8906 | 0.821 | 0.8345 | 0.4219 | | 0.2724 | 82.0 | 41000 | 0.3037 | 0.7907 | 0.9498 | 0.8996 | 0.79 | 0.72 | 0.7878 | 0.7935 | 0.8619 | 0.3486 | 0.3183 | 0.8316 | 0.834 | 0.8353 | 0.7753 | 0.8915 | 0.8222 | 0.8409 | 0.4 | | 0.2345 | 83.0 | 41500 | 0.3133 | 0.7934 | 0.9525 | 0.8926 | 
0.7903 | 0.7297 | 0.8006 | 0.7955 | 0.8601 | 0.3134 | 0.317 | 0.8329 | 0.8377 | 0.8345 | 0.7866 | 0.8921 | 0.8309 | 0.8412 | 0.4057 | | 0.2466 | 84.0 | 42000 | 0.2988 | 0.7992 | 0.9529 | 0.8952 | 0.8088 | 0.7283 | 0.8006 | 0.8034 | 0.8606 | 0.35 | 0.32 | 0.8368 | 0.8416 | 0.8504 | 0.7835 | 0.8909 | 0.8303 | 0.846 | 0.4324 | | 0.2344 | 85.0 | 42500 | 0.3202 | 0.7833 | 0.9527 | 0.8915 | 0.7803 | 0.7027 | 0.7827 | 0.7852 | 0.8668 | 0.345 | 0.3115 | 0.8233 | 0.826 | 0.8254 | 0.7588 | 0.8939 | 0.8164 | 0.8297 | 0.4 | | 0.2443 | 86.0 | 43000 | 0.3031 | 0.7937 | 0.9511 | 0.8899 | 0.8005 | 0.713 | 0.7722 | 0.7997 | 0.8676 | 0.3561 | 0.3133 | 0.8328 | 0.837 | 0.844 | 0.7701 | 0.897 | 0.8057 | 0.8453 | 0.45 | | 0.2485 | 87.0 | 43500 | 0.3118 | 0.7918 | 0.9526 | 0.8946 | 0.7973 | 0.711 | 0.7906 | 0.7987 | 0.867 | 0.2793 | 0.3146 | 0.8277 | 0.8339 | 0.8377 | 0.768 | 0.8961 | 0.8185 | 0.841 | 0.3695 | | 0.2487 | 88.0 | 44000 | 0.3041 | 0.8016 | 0.9442 | 0.8979 | 0.8062 | 0.728 | 0.8026 | 0.8059 | 0.8706 | 0.3709 | 0.3214 | 0.8375 | 0.8418 | 0.8488 | 0.7784 | 0.8982 | 0.829 | 0.8476 | 0.4271 | | 0.2106 | 89.0 | 44500 | 0.3086 | 0.7961 | 0.9448 | 0.9041 | 0.7986 | 0.7239 | 0.7969 | 0.8003 | 0.8658 | 0.3213 | 0.3197 | 0.8317 | 0.8351 | 0.8397 | 0.7722 | 0.8933 | 0.8233 | 0.8415 | 0.3724 | | 0.2196 | 90.0 | 45000 | 0.2962 | 0.8039 | 0.95 | 0.9141 | 0.8086 | 0.7349 | 0.7961 | 0.8156 | 0.8683 | 0.3219 | 0.32 | 0.8412 | 0.8438 | 0.8524 | 0.7825 | 0.8967 | 0.8296 | 0.8547 | 0.3957 | | 0.2243 | 91.0 | 45500 | 0.3145 | 0.784 | 0.9513 | 0.8959 | 0.7894 | 0.7087 | 0.7916 | 0.7852 | 0.8539 | 0.315 | 0.3096 | 0.825 | 0.8292 | 0.8389 | 0.7619 | 0.887 | 0.8282 | 0.831 | 0.3962 | | 0.2163 | 92.0 | 46000 | 0.3038 | 0.7977 | 0.951 | 0.9016 | 0.7974 | 0.7306 | 0.7975 | 0.7983 | 0.865 | 0.3287 | 0.3156 | 0.8323 | 0.8364 | 0.8405 | 0.7784 | 0.8903 | 0.8277 | 0.8393 | 0.4029 | | 0.2511 | 93.0 | 46500 | 0.3074 | 0.8031 | 0.9517 | 0.9121 | 0.8081 | 0.7406 | 0.8117 | 0.8045 | 0.8605 | 0.3575 | 0.318 | 0.8375 | 0.8405 | 0.8472 | 0.7866 | 0.8876 | 0.8402 | 0.8412 | 0.4167 | | 0.2458 | 94.0 | 47000 | 0.3059 | 0.796 | 0.9544 | 0.8952 | 0.8036 | 0.7236 | 0.8028 | 0.7989 | 0.8607 | 0.306 | 0.3158 | 0.8339 | 0.8362 | 0.8425 | 0.7753 | 0.8909 | 0.8362 | 0.8418 | 0.359 | | 0.2377 | 95.0 | 47500 | 0.3181 | 0.7932 | 0.9521 | 0.9102 | 0.7881 | 0.7225 | 0.8033 | 0.7909 | 0.869 | 0.36 | 0.3138 | 0.8286 | 0.8312 | 0.827 | 0.7722 | 0.8945 | 0.8321 | 0.8321 | 0.4043 | | 0.246 | 96.0 | 48000 | 0.3054 | 0.8022 | 0.9518 | 0.899 | 0.7982 | 0.7449 | 0.7982 | 0.8054 | 0.8634 | 0.3107 | 0.3189 | 0.839 | 0.8412 | 0.8361 | 0.7979 | 0.8897 | 0.8278 | 0.8471 | 0.3605 | | 0.2296 | 97.0 | 48500 | 0.3185 | 0.803 | 0.9487 | 0.9088 | 0.804 | 0.7383 | 0.8065 | 0.8018 | 0.8667 | 0.3326 | 0.3199 | 0.8361 | 0.8401 | 0.8452 | 0.7835 | 0.8915 | 0.8337 | 0.8416 | 0.3905 | | 0.2347 | 98.0 | 49000 | 0.3141 | 0.8026 | 0.9509 | 0.9109 | 0.7949 | 0.7398 | 0.8008 | 0.8069 | 0.8731 | 0.3746 | 0.3188 | 0.839 | 0.8425 | 0.8385 | 0.7907 | 0.8982 | 0.8313 | 0.8485 | 0.4324 | | 0.2236 | 99.0 | 49500 | 0.3208 | 0.7992 | 0.954 | 0.8976 | 0.7993 | 0.7389 | 0.7965 | 0.802 | 0.8594 | 0.3036 | 0.3173 | 0.8356 | 0.8403 | 0.8401 | 0.7918 | 0.8891 | 0.8306 | 0.8421 | 0.3805 | | 0.251 | 100.0 | 50000 | 0.3113 | 0.7992 | 0.9506 | 0.9152 | 0.8017 | 0.7309 | 0.7951 | 0.8027 | 0.8651 | 0.324 | 0.3159 | 0.8349 | 0.8384 | 0.8464 | 0.7753 | 0.8936 | 0.8278 | 0.8439 | 0.3795 | | 0.2236 | 101.0 | 50500 | 0.3062 | 0.7996 | 0.9509 | 0.9067 | 0.7959 | 0.7331 | 0.8022 | 0.7999 | 0.8698 | 0.3927 | 
0.3171 | 0.8365 | 0.8395 | 0.8377 | 0.7825 | 0.8982 | 0.8363 | 0.8416 | 0.4352 | | 0.2107 | 102.0 | 51000 | 0.3047 | 0.8077 | 0.9496 | 0.8968 | 0.8137 | 0.7388 | 0.8168 | 0.8058 | 0.8707 | 0.3509 | 0.3212 | 0.8427 | 0.8453 | 0.8516 | 0.7876 | 0.8967 | 0.8447 | 0.8471 | 0.3819 | | 0.2278 | 103.0 | 51500 | 0.3027 | 0.8051 | 0.9499 | 0.8995 | 0.8026 | 0.7402 | 0.8149 | 0.8086 | 0.8725 | 0.3343 | 0.32 | 0.8406 | 0.8437 | 0.8413 | 0.7907 | 0.8991 | 0.8463 | 0.8501 | 0.36 | | 0.2291 | 104.0 | 52000 | 0.3096 | 0.8018 | 0.9515 | 0.9022 | 0.798 | 0.7387 | 0.817 | 0.8037 | 0.8686 | 0.3314 | 0.3189 | 0.8378 | 0.841 | 0.8389 | 0.7876 | 0.8964 | 0.8449 | 0.8446 | 0.3695 | | 0.2192 | 105.0 | 52500 | 0.3193 | 0.7944 | 0.9454 | 0.9046 | 0.7906 | 0.726 | 0.8022 | 0.7964 | 0.8667 | 0.3592 | 0.315 | 0.8306 | 0.8335 | 0.8317 | 0.7742 | 0.8945 | 0.831 | 0.8394 | 0.3829 | | 0.2611 | 106.0 | 53000 | 0.3094 | 0.8045 | 0.9514 | 0.9108 | 0.8057 | 0.7418 | 0.8094 | 0.8069 | 0.8661 | 0.3822 | 0.3185 | 0.8412 | 0.8437 | 0.848 | 0.7897 | 0.8933 | 0.8379 | 0.8482 | 0.419 | | 0.2206 | 107.0 | 53500 | 0.3141 | 0.7962 | 0.9489 | 0.9009 | 0.7921 | 0.7242 | 0.7917 | 0.7999 | 0.8723 | 0.36 | 0.3154 | 0.8334 | 0.8366 | 0.8373 | 0.7753 | 0.8973 | 0.8272 | 0.8417 | 0.3881 | | 0.2155 | 108.0 | 54000 | 0.2998 | 0.8062 | 0.948 | 0.9033 | 0.8094 | 0.7384 | 0.8133 | 0.8109 | 0.8708 | 0.355 | 0.3211 | 0.8446 | 0.8471 | 0.8496 | 0.7928 | 0.8988 | 0.8417 | 0.8543 | 0.3867 | | 0.2267 | 109.0 | 54500 | 0.3070 | 0.8047 | 0.9501 | 0.9123 | 0.805 | 0.7405 | 0.7991 | 0.8121 | 0.8686 | 0.342 | 0.3223 | 0.8414 | 0.8446 | 0.8476 | 0.7907 | 0.8955 | 0.8336 | 0.8523 | 0.3938 | | 0.2192 | 110.0 | 55000 | 0.3172 | 0.7927 | 0.9453 | 0.8968 | 0.7993 | 0.7154 | 0.7974 | 0.795 | 0.8635 | 0.3085 | 0.3155 | 0.8307 | 0.8347 | 0.8397 | 0.7732 | 0.8912 | 0.8279 | 0.84 | 0.3581 | | 0.2212 | 111.0 | 55500 | 0.3189 | 0.8013 | 0.9453 | 0.8992 | 0.8086 | 0.7247 | 0.7895 | 0.8062 | 0.8707 | 0.3336 | 0.3229 | 0.8357 | 0.8399 | 0.8488 | 0.7742 | 0.8967 | 0.8245 | 0.8465 | 0.3881 | | 0.2089 | 112.0 | 56000 | 0.3104 | 0.8021 | 0.9514 | 0.9063 | 0.8105 | 0.7278 | 0.8079 | 0.8046 | 0.868 | 0.3474 | 0.3175 | 0.8374 | 0.8407 | 0.8472 | 0.7763 | 0.8985 | 0.8355 | 0.8471 | 0.3886 | | 0.2108 | 113.0 | 56500 | 0.3109 | 0.8054 | 0.9499 | 0.9015 | 0.3634 | 0.8084 | 0.8011 | 0.3177 | 0.8407 | 0.8441 | 0.4129 | 0.8485 | 0.8296 | 0.8076 | 0.8468 | 0.7398 | 0.7887 | 0.8688 | 0.8967 | | 0.2148 | 114.0 | 57000 | 0.3066 | 0.803 | 0.9505 | 0.9011 | 0.3337 | 0.8068 | 0.8033 | 0.3192 | 0.8388 | 0.8421 | 0.3914 | 0.8466 | 0.8336 | 0.8097 | 0.8484 | 0.7335 | 0.7845 | 0.8657 | 0.8933 | | 0.2107 | 115.0 | 57500 | 0.3090 | 0.8052 | 0.9505 | 0.9017 | 0.359 | 0.8074 | 0.802 | 0.3198 | 0.8409 | 0.8438 | 0.4076 | 0.8487 | 0.8363 | 0.809 | 0.8456 | 0.7345 | 0.7866 | 0.8722 | 0.8991 | | 0.2204 | 116.0 | 58000 | 0.3104 | 0.8033 | 0.9505 | 0.9073 | 0.3617 | 0.8036 | 0.8085 | 0.3187 | 0.8387 | 0.8419 | 0.4143 | 0.8446 | 0.8359 | 0.8066 | 0.8448 | 0.7341 | 0.7845 | 0.8693 | 0.8964 | | 0.2192 | 117.0 | 58500 | 0.3098 | 0.8062 | 0.9504 | 0.8968 | 0.3382 | 0.8092 | 0.8052 | 0.321 | 0.8416 | 0.8454 | 0.3948 | 0.8511 | 0.8363 | 0.807 | 0.848 | 0.7388 | 0.7897 | 0.8726 | 0.8985 | | 0.2384 | 118.0 | 59000 | 0.3126 | 0.8051 | 0.9507 | 0.9077 | 0.3702 | 0.8085 | 0.8069 | 0.3187 | 0.8387 | 0.8421 | 0.4219 | 0.8476 | 0.8358 | 0.8043 | 0.8421 | 0.7379 | 0.7845 | 0.8732 | 0.8997 | | 0.2001 | 119.0 | 59500 | 0.3058 | 0.8046 | 0.9506 | 0.9047 | 0.3726 | 0.8089 | 0.8057 | 0.3203 | 0.8405 | 0.8437 | 0.4205 | 0.8507 | 
0.8301 | 0.804 | 0.8437 | 0.7374 | 0.7887 | 0.8725 | 0.8988 | | 0.205 | 120.0 | 60000 | 0.3107 | 0.803 | 0.9504 | 0.9039 | 0.3506 | 0.8072 | 0.8106 | 0.3193 | 0.8387 | 0.8423 | 0.4043 | 0.8474 | 0.8393 | 0.8033 | 0.8437 | 0.732 | 0.7845 | 0.8737 | 0.8988 | | 0.2247 | 121.0 | 60500 | 0.3104 | 0.8078 | 0.9503 | 0.9034 | 0.4008 | 0.8085 | 0.806 | 0.321 | 0.8437 | 0.847 | 0.4462 | 0.851 | 0.8377 | 0.8084 | 0.8464 | 0.7372 | 0.7928 | 0.8779 | 0.9018 | | 0.2042 | 122.0 | 61000 | 0.3158 | 0.8016 | 0.9466 | 0.9042 | 0.3568 | 0.8066 | 0.7924 | 0.3195 | 0.8391 | 0.8429 | 0.4157 | 0.8485 | 0.8266 | 0.8045 | 0.846 | 0.7261 | 0.7825 | 0.8743 | 0.9003 | | 0.2029 | 123.0 | 61500 | 0.3060 | 0.8059 | 0.9503 | 0.8976 | 0.336 | 0.808 | 0.8096 | 0.3202 | 0.8431 | 0.8466 | 0.39 | 0.8509 | 0.8403 | 0.802 | 0.846 | 0.7403 | 0.7938 | 0.8753 | 0.9 | | 0.2077 | 124.0 | 62000 | 0.3123 | 0.8028 | 0.9499 | 0.8959 | 0.3735 | 0.8068 | 0.802 | 0.3185 | 0.8399 | 0.8435 | 0.4319 | 0.85 | 0.8301 | 0.8023 | 0.8437 | 0.7324 | 0.7866 | 0.8737 | 0.9003 | | 0.2262 | 125.0 | 62500 | 0.3112 | 0.8043 | 0.9499 | 0.9056 | 0.3527 | 0.8063 | 0.8116 | 0.3196 | 0.8401 | 0.8434 | 0.3957 | 0.8486 | 0.8383 | 0.8 | 0.8401 | 0.739 | 0.7907 | 0.8741 | 0.8994 | | 0.2136 | 126.0 | 63000 | 0.3078 | 0.8045 | 0.9492 | 0.9035 | 0.3766 | 0.8086 | 0.8059 | 0.3189 | 0.8406 | 0.8439 | 0.4271 | 0.8497 | 0.8334 | 0.8033 | 0.844 | 0.7368 | 0.7887 | 0.8735 | 0.8991 | | 0.2446 | 127.0 | 63500 | 0.3092 | 0.8063 | 0.9502 | 0.9039 | 0.3843 | 0.8106 | 0.7998 | 0.3193 | 0.8417 | 0.8447 | 0.4267 | 0.85 | 0.8321 | 0.8081 | 0.8464 | 0.7388 | 0.7897 | 0.872 | 0.8979 | | 0.2116 | 128.0 | 64000 | 0.3041 | 0.809 | 0.9479 | 0.9041 | 0.3855 | 0.8146 | 0.8049 | 0.3241 | 0.844 | 0.8471 | 0.4233 | 0.8535 | 0.832 | 0.807 | 0.8476 | 0.7429 | 0.7918 | 0.8771 | 0.9018 | | 0.2194 | 129.0 | 64500 | 0.3099 | 0.809 | 0.9511 | 0.9049 | 0.3936 | 0.8112 | 0.8153 | 0.3202 | 0.8442 | 0.8471 | 0.4448 | 0.8508 | 0.8389 | 0.8081 | 0.8468 | 0.7421 | 0.7928 | 0.8767 | 0.9018 | | 0.2259 | 130.0 | 65000 | 0.3098 | 0.8074 | 0.951 | 0.9035 | 0.385 | 0.808 | 0.8103 | 0.323 | 0.8424 | 0.8457 | 0.4367 | 0.8483 | 0.8372 | 0.8069 | 0.846 | 0.7433 | 0.7928 | 0.872 | 0.8982 | | 0.1932 | 131.0 | 65500 | 0.3085 | 0.8098 | 0.9506 | 0.9102 | 0.373 | 0.8121 | 0.8153 | 0.3205 | 0.8444 | 0.8473 | 0.4119 | 0.8522 | 0.8406 | 0.8093 | 0.8476 | 0.7464 | 0.7948 | 0.8738 | 0.8994 | | 0.1961 | 132.0 | 66000 | 0.3085 | 0.8095 | 0.9505 | 0.9027 | 0.3761 | 0.8132 | 0.8094 | 0.3203 | 0.8439 | 0.8473 | 0.4219 | 0.8526 | 0.8352 | 0.8096 | 0.8484 | 0.7432 | 0.7928 | 0.8755 | 0.9006 | | 0.2426 | 133.0 | 66500 | 0.3092 | 0.8092 | 0.9509 | 0.9023 | 0.3815 | 0.8094 | 0.8172 | 0.3209 | 0.8446 | 0.8473 | 0.4271 | 0.8504 | 0.8398 | 0.811 | 0.8504 | 0.7433 | 0.7928 | 0.8734 | 0.8988 | | 0.2011 | 134.0 | 67000 | 0.3088 | 0.8087 | 0.9501 | 0.9024 | 0.3701 | 0.8093 | 0.8148 | 0.3204 | 0.8441 | 0.847 | 0.4205 | 0.8503 | 0.8407 | 0.8105 | 0.8496 | 0.7434 | 0.7928 | 0.8724 | 0.8985 | | 0.2408 | 135.0 | 67500 | 0.3120 | 0.8034 | 0.9497 | 0.9032 | 0.3635 | 0.8042 | 0.8049 | 0.3183 | 0.8406 | 0.8434 | 0.409 | 0.848 | 0.832 | 0.8028 | 0.8437 | 0.7355 | 0.7876 | 0.8718 | 0.8988 | | 0.2169 | 136.0 | 68000 | 0.3097 | 0.8066 | 0.9498 | 0.9037 | 0.3689 | 0.8068 | 0.8147 | 0.3194 | 0.8436 | 0.8464 | 0.4219 | 0.8503 | 0.8411 | 0.8083 | 0.8476 | 0.7384 | 0.7918 | 0.8732 | 0.8997 | | 0.2292 | 137.0 | 68500 | 0.3086 | 0.8046 | 0.9498 | 0.8996 | 0.3796 | 0.8068 | 0.8031 | 0.3184 | 0.8407 | 0.844 | 0.4314 | 0.849 | 0.8316 | 0.8058 | 0.8456 | 0.7339 | 
0.7866 | 0.874 | 0.8997 | | 0.2231 | 138.0 | 69000 | 0.3083 | 0.8073 | 0.9477 | 0.905 | 0.3733 | 0.8098 | 0.8085 | 0.3192 | 0.8433 | 0.8461 | 0.4119 | 0.8516 | 0.8356 | 0.8075 | 0.848 | 0.74 | 0.7897 | 0.8742 | 0.9006 | | 0.1987 | 139.0 | 69500 | 0.3096 | 0.8068 | 0.9492 | 0.8997 | 0.3756 | 0.8094 | 0.8041 | 0.319 | 0.8427 | 0.8454 | 0.4252 | 0.851 | 0.8318 | 0.8089 | 0.8488 | 0.7366 | 0.7876 | 0.8747 | 0.8997 | | 0.1917 | 140.0 | 70000 | 0.3101 | 0.8051 | 0.9476 | 0.9028 | 0.3669 | 0.8076 | 0.812 | 0.3184 | 0.8411 | 0.8441 | 0.4038 | 0.8494 | 0.8377 | 0.8057 | 0.8456 | 0.735 | 0.7866 | 0.8746 | 0.9 | | 0.2281 | 141.0 | 70500 | 0.3109 | 0.8049 | 0.9476 | 0.9091 | 0.3608 | 0.8076 | 0.8106 | 0.3194 | 0.8415 | 0.8444 | 0.4038 | 0.8499 | 0.8359 | 0.8048 | 0.846 | 0.7368 | 0.7887 | 0.8731 | 0.8985 | | 0.2052 | 142.0 | 71000 | 0.3089 | 0.8064 | 0.9478 | 0.9063 | 0.3715 | 0.8096 | 0.8141 | 0.3192 | 0.8427 | 0.8454 | 0.4052 | 0.8512 | 0.8407 | 0.8072 | 0.8468 | 0.7376 | 0.7897 | 0.8744 | 0.8997 | | 0.215 | 143.0 | 71500 | 0.3087 | 0.8073 | 0.9476 | 0.9062 | 0.3781 | 0.8087 | 0.815 | 0.3193 | 0.8434 | 0.8463 | 0.42 | 0.8508 | 0.8418 | 0.8082 | 0.8484 | 0.7384 | 0.7907 | 0.8752 | 0.8997 | | 0.2031 | 144.0 | 72000 | 0.3087 | 0.8065 | 0.9475 | 0.9062 | 0.3766 | 0.8091 | 0.8107 | 0.3191 | 0.843 | 0.8458 | 0.42 | 0.8511 | 0.8384 | 0.8062 | 0.8476 | 0.7383 | 0.7897 | 0.8752 | 0.9 | | 0.2042 | 145.0 | 72500 | 0.3093 | 0.8056 | 0.9476 | 0.9048 | 0.3781 | 0.8082 | 0.8105 | 0.3189 | 0.8425 | 0.8452 | 0.42 | 0.8506 | 0.8374 | 0.8045 | 0.846 | 0.737 | 0.7897 | 0.8752 | 0.9 | | 0.1978 | 146.0 | 73000 | 0.3089 | 0.8065 | 0.9475 | 0.9047 | 0.3781 | 0.8091 | 0.815 | 0.3197 | 0.8436 | 0.8463 | 0.42 | 0.8515 | 0.8418 | 0.8064 | 0.848 | 0.7372 | 0.7907 | 0.876 | 0.9003 | | 0.1969 | 147.0 | 73500 | 0.3089 | 0.8063 | 0.9477 | 0.9049 | 0.3766 | 0.8093 | 0.8145 | 0.3195 | 0.8434 | 0.8462 | 0.42 | 0.8515 | 0.8408 | 0.8065 | 0.8476 | 0.7377 | 0.7907 | 0.8748 | 0.9003 | | 0.2083 | 148.0 | 74000 | 0.3089 | 0.8062 | 0.9477 | 0.9049 | 0.3781 | 0.8098 | 0.8149 | 0.3198 | 0.8435 | 0.8463 | 0.42 | 0.8519 | 0.8417 | 0.8066 | 0.848 | 0.7377 | 0.7907 | 0.8744 | 0.9003 | | 0.1991 | 149.0 | 74500 | 0.3090 | 0.8062 | 0.9477 | 0.9049 | 0.3781 | 0.8098 | 0.8149 | 0.3198 | 0.8436 | 0.8464 | 0.42 | 0.8519 | 0.8418 | 0.8066 | 0.848 | 0.7377 | 0.7907 | 0.8745 | 0.9006 | | 0.2337 | 150.0 | 75000 | 0.3090 | 0.8062 | 0.9477 | 0.9049 | 0.3781 | 0.8098 | 0.8149 | 0.3198 | 0.8436 | 0.8464 | 0.42 | 0.8519 | 0.8418 | 0.8066 | 0.848 | 0.7377 | 0.7907 | 0.8745 | 0.9006 | ### Framework versions - Transformers 4.45.2 - Pytorch 2.5.0+cu121 - Datasets 2.19.2 - Tokenizers 0.20.1
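The card stops at the framework versions and gives no usage snippet, so below is a minimal inference sketch, assuming the checkpoint loads through the standard πŸ€— Transformers object-detection API used by the `facebook/detr-resnet-50` base model; the image path is a placeholder.

```python
# Hedged sketch, not from the original card: load the checkpoint with the generic
# object-detection classes and print the detected chickens, ducks, and plants.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "joe611/chickens-composite-201616161616-150-epochs-wo-transform"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("example.jpg")  # placeholder path; any RGB image works
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Rescale the predicted boxes to the original image size and keep confident hits.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```

The 0.5 score threshold is an arbitrary starting point; the card reports mAP/mAR figures, not a recommended confidence cut-off.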
[ "chicken", "duck", "plant" ]
joe611/chickens-composite-201616161616-150-epochs-w-transform
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-composite-201616161616-150-epochs-w-transform This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2864 - Map: 0.7992 - Map 50: 0.9637 - Map 75: 0.8989 - Map Small: 0.3428 - Map Medium: 0.8051 - Map Large: 0.8153 - Mar 1: 0.3162 - Mar 10: 0.8378 - Mar 100: 0.843 - Mar Small: 0.4381 - Mar Medium: 0.8463 - Mar Large: 0.8551 - Map Chicken: 0.7833 - Mar 100 Chicken: 0.8298 - Map Duck: 0.747 - Mar 100 Duck: 0.7979 - Map Plant: 0.8672 - Mar 100 Plant: 0.9012 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: cosine - num_epochs: 150 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Chicken | Map Duck | Map Large | Map Medium | Map Plant | Map Small | Mar 1 | Mar 10 | Mar 100 | Mar 100 Chicken | Mar 100 Duck | Mar 100 Plant | Mar Large | Mar Medium | Mar Small | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:-----------:|:--------:|:---------:|:----------:|:---------:|:---------:|:------:|:------:|:-------:|:---------------:|:------------:|:-------------:|:---------:|:----------:|:---------:| | 1.3747 | 1.0 | 500 | 1.3787 | 0.1018 | 0.1491 | 0.1155 | 0.0363 | 0.0 | 0.1826 | 0.0495 | 0.2691 | 0.006 | 0.0524 | 0.2694 | 0.355 | 0.3643 | 0.0 | 0.7006 | 0.3925 | 0.3288 | 0.0262 | | 1.2078 | 2.0 | 1000 | 1.2359 | 0.2048 | 0.2894 | 0.2385 | 0.0858 | 0.0 | 0.2865 | 0.1101 | 0.5287 | 0.0066 | 0.086 | 0.3365 | 0.4465 | 0.604 | 0.0 | 0.7355 | 0.4846 | 0.4144 | 0.0895 | | 1.0716 | 3.0 | 1500 | 1.0378 | 0.2591 | 0.3743 | 0.304 | 0.1358 | 0.0 | 0.2992 | 0.1831 | 0.6414 | 0.0233 | 0.1024 | 0.3601 | 0.3717 | 0.3921 | 0.0 | 0.723 | 0.3834 | 0.3414 | 0.0519 | | 1.0097 | 4.0 | 2000 | 0.9668 | 0.2911 | 0.4199 | 0.3426 | 0.2048 | 0.0 | 0.3254 | 0.2382 | 0.6684 | 0.0787 | 0.1131 | 0.3961 | 0.4091 | 0.4976 | 0.0 | 0.7297 | 0.4087 | 0.3777 | 0.1333 | | 0.6756 | 5.0 | 2500 | 0.8939 | 0.3274 | 0.4611 | 0.3732 | 0.2788 | 0.0 | 0.3693 | 0.2915 | 0.7034 | 0.0597 | 0.1245 | 0.449 | 0.4744 | 0.6635 | 0.0 | 0.7597 | 0.4895 | 0.453 | 0.1271 | | 0.814 | 6.0 | 3000 | 0.8398 | 0.3292 | 0.4681 | 0.3844 | 0.3025 | 0.0 | 0.373 | 0.2896 | 0.6851 | 0.0637 | 0.1184 | 0.4607 | 0.4753 | 0.6802 | 0.0 | 0.7458 | 0.5049 | 0.4418 | 0.1148 | | 0.8875 | 7.0 | 3500 | 1.0039 | 0.3382 | 0.5017 | 0.3967 | 0.3663 | 0.0 | 0.359 | 0.2988 | 0.6484 | 0.0382 | 0.1234 | 0.4309 | 0.4331 | 0.6056 | 0.0 | 0.6936 | 0.4519 | 0.3978 | 0.0867 | | 0.9457 | 8.0 | 4000 | 0.7726 | 0.3549 | 0.5102 | 0.4198 | 0.3821 | 0.0 | 0.392 | 0.3128 | 0.6827 | 0.0431 | 0.1237 | 0.4609 | 0.4649 | 0.6663 | 0.0 | 0.7285 | 0.4913 | 0.4323 | 0.08 | | 0.8339 | 9.0 | 4500 | 0.7188 | 0.3834 | 0.5328 | 0.4389 | 0.4271 | 0.0 | 0.4102 | 0.3461 | 0.7231 | 0.0449 | 0.1309 | 0.4861 | 0.4894 | 0.696 | 0.0 | 0.7721 | 0.5219 | 0.4577 | 0.1381 | | 
0.7813 | 10.0 | 5000 | 0.7378 | 0.3769 | 0.5485 | 0.4379 | 0.4384 | 0.0 | 0.3971 | 0.3526 | 0.6923 | 0.0362 | 0.124 | 0.4752 | 0.4803 | 0.6909 | 0.0 | 0.75 | 0.5162 | 0.4555 | 0.0833 | | 0.7526 | 11.0 | 5500 | 0.6691 | 0.4059 | 0.5667 | 0.457 | 0.4777 | 0.0 | 0.4276 | 0.3719 | 0.7398 | 0.0528 | 0.1346 | 0.4873 | 0.4944 | 0.6956 | 0.0 | 0.7876 | 0.5378 | 0.4618 | 0.1514 | | 0.7195 | 12.0 | 6000 | 0.6984 | 0.3983 | 0.5673 | 0.4621 | 0.4728 | 0.0 | 0.4143 | 0.3648 | 0.7222 | 0.0499 | 0.1274 | 0.4797 | 0.4874 | 0.6909 | 0.0 | 0.7712 | 0.5283 | 0.4558 | 0.1281 | | 0.6467 | 13.0 | 6500 | 0.6682 | 0.408 | 0.5632 | 0.4939 | 0.5153 | 0.0 | 0.4315 | 0.388 | 0.7087 | 0.0458 | 0.1372 | 0.4872 | 0.49 | 0.7143 | 0.0 | 0.7558 | 0.5251 | 0.4654 | 0.1333 | | 0.7253 | 14.0 | 7000 | 0.6210 | 0.4263 | 0.5778 | 0.5001 | 0.5356 | 0.0 | 0.4556 | 0.3959 | 0.7432 | 0.0782 | 0.1364 | 0.506 | 0.5088 | 0.7377 | 0.0 | 0.7888 | 0.5347 | 0.4813 | 0.139 | | 0.7234 | 15.0 | 7500 | 0.6613 | 0.406 | 0.5657 | 0.486 | 0.5085 | 0.0 | 0.4334 | 0.3667 | 0.7096 | 0.0572 | 0.1337 | 0.4851 | 0.4885 | 0.7083 | 0.0 | 0.7573 | 0.5213 | 0.4478 | 0.1267 | | 0.6467 | 16.0 | 8000 | 0.6621 | 0.4174 | 0.5704 | 0.4886 | 0.5214 | 0.0 | 0.4596 | 0.3702 | 0.7309 | 0.0454 | 0.1354 | 0.4926 | 0.496 | 0.7095 | 0.0 | 0.7785 | 0.539 | 0.4504 | 0.1133 | | 0.6227 | 17.0 | 8500 | 0.6304 | 0.4221 | 0.5839 | 0.4954 | 0.5342 | 0.0 | 0.4436 | 0.3929 | 0.732 | 0.0783 | 0.1338 | 0.495 | 0.4977 | 0.7139 | 0.0 | 0.7791 | 0.5317 | 0.4721 | 0.1643 | | 0.7302 | 18.0 | 9000 | 0.5794 | 0.4364 | 0.5848 | 0.5177 | 0.5726 | 0.0 | 0.4589 | 0.4131 | 0.7367 | 0.0589 | 0.1399 | 0.5078 | 0.5106 | 0.748 | 0.0 | 0.7836 | 0.5434 | 0.482 | 0.121 | | 0.665 | 19.0 | 9500 | 0.5931 | 0.4435 | 0.6047 | 0.5339 | 0.5862 | 0.0 | 0.4622 | 0.4084 | 0.7442 | 0.0897 | 0.1405 | 0.5052 | 0.5099 | 0.7349 | 0.0 | 0.7948 | 0.5396 | 0.4776 | 0.1667 | | 0.5947 | 20.0 | 10000 | 0.5701 | 0.4626 | 0.6084 | 0.5475 | 0.615 | 0.0 | 0.4907 | 0.4311 | 0.7728 | 0.0789 | 0.1413 | 0.5173 | 0.5248 | 0.7563 | 0.0 | 0.8182 | 0.5565 | 0.4942 | 0.2052 | | 0.5727 | 21.0 | 10500 | 0.5720 | 0.4511 | 0.604 | 0.5269 | 0.5865 | 0.0 | 0.4784 | 0.4265 | 0.7667 | 0.1067 | 0.1369 | 0.5125 | 0.5177 | 0.7389 | 0.0 | 0.8142 | 0.5344 | 0.4946 | 0.229 | | 0.5855 | 22.0 | 11000 | 0.5773 | 0.4519 | 0.6125 | 0.5447 | 0.5949 | 0.0 | 0.4783 | 0.4254 | 0.7608 | 0.1236 | 0.1383 | 0.5063 | 0.5118 | 0.7329 | 0.0 | 0.8024 | 0.5333 | 0.4872 | 0.2386 | | 0.5441 | 23.0 | 11500 | 0.5694 | 0.4636 | 0.62 | 0.5595 | 0.6198 | 0.0 | 0.4837 | 0.4296 | 0.7709 | 0.0867 | 0.1445 | 0.509 | 0.5137 | 0.7313 | 0.0 | 0.8097 | 0.5346 | 0.4852 | 0.161 | | 0.5504 | 24.0 | 12000 | 0.5569 | 0.4653 | 0.6191 | 0.5497 | 0.6305 | 0.0 | 0.4841 | 0.4365 | 0.7653 | 0.0995 | 0.1431 | 0.5149 | 0.5206 | 0.7556 | 0.0 | 0.8064 | 0.5405 | 0.4957 | 0.2229 | | 0.5802 | 25.0 | 12500 | 0.5488 | 0.4621 | 0.6168 | 0.5455 | 0.6366 | 0.0 | 0.4952 | 0.431 | 0.7497 | 0.0932 | 0.1458 | 0.5118 | 0.516 | 0.7536 | 0.0 | 0.7945 | 0.541 | 0.491 | 0.1814 | | 0.6644 | 26.0 | 13000 | 0.5489 | 0.4709 | 0.6259 | 0.564 | 0.637 | 0.0 | 0.5032 | 0.4382 | 0.7757 | 0.0979 | 0.1449 | 0.515 | 0.5187 | 0.7385 | 0.0 | 0.8176 | 0.5449 | 0.4914 | 0.2129 | | 0.5006 | 27.0 | 13500 | 0.5375 | 0.4817 | 0.6348 | 0.5852 | 0.676 | 0.0008 | 0.5099 | 0.4509 | 0.7683 | 0.0954 | 0.1491 | 0.5201 | 0.5249 | 0.7623 | 0.0021 | 0.8103 | 0.5427 | 0.4986 | 0.19 | | 0.5194 | 28.0 | 14000 | 0.5161 | 0.4872 | 0.6325 | 0.5795 | 0.6725 | 0.0015 | 0.5126 | 0.4629 | 0.7875 | 0.1579 | 0.1508 | 0.5289 | 0.5343 | 0.7655 | 0.0093 
| 0.8282 | 0.5508 | 0.5098 | 0.2424 | | 0.5253 | 29.0 | 14500 | 0.5392 | 0.4861 | 0.6461 | 0.5959 | 0.6739 | 0.0158 | 0.5018 | 0.4562 | 0.7685 | 0.1217 | 0.1511 | 0.5204 | 0.5248 | 0.7524 | 0.0144 | 0.8076 | 0.5371 | 0.5038 | 0.2138 | | 0.7139 | 30.0 | 15000 | 0.5087 | 0.4933 | 0.6447 | 0.5839 | 0.6846 | 0.0082 | 0.5172 | 0.468 | 0.7873 | 0.0989 | 0.156 | 0.5313 | 0.5363 | 0.7667 | 0.0124 | 0.83 | 0.5527 | 0.5146 | 0.2552 | | 0.5975 | 31.0 | 15500 | 0.5136 | 0.5044 | 0.6842 | 0.5915 | 0.6641 | 0.062 | 0.5448 | 0.4764 | 0.787 | 0.1557 | 0.1728 | 0.5681 | 0.575 | 0.7504 | 0.1433 | 0.8312 | 0.5871 | 0.5523 | 0.2838 | | 0.6357 | 32.0 | 16000 | 0.5031 | 0.506 | 0.6647 | 0.5959 | 0.7083 | 0.0387 | 0.5347 | 0.4748 | 0.7711 | 0.1292 | 0.166 | 0.5445 | 0.5521 | 0.7778 | 0.067 | 0.8115 | 0.5721 | 0.5279 | 0.3205 | | 0.4954 | 33.0 | 16500 | 0.4850 | 0.6026 | 0.7982 | 0.7113 | 0.7038 | 0.3072 | 0.5485 | 0.5987 | 0.7966 | 0.1613 | 0.2289 | 0.6542 | 0.6592 | 0.7687 | 0.3763 | 0.8327 | 0.601 | 0.6559 | 0.2952 | | 0.5608 | 34.0 | 17000 | 0.4956 | 0.6291 | 0.8291 | 0.7295 | 0.6904 | 0.4001 | 0.6004 | 0.6177 | 0.7969 | 0.1447 | 0.2403 | 0.6767 | 0.6807 | 0.7512 | 0.4588 | 0.8321 | 0.6434 | 0.6715 | 0.2195 | | 0.5545 | 35.0 | 17500 | 0.4593 | 0.6732 | 0.8781 | 0.7947 | 0.7125 | 0.5052 | 0.6504 | 0.6728 | 0.802 | 0.128 | 0.2712 | 0.719 | 0.7231 | 0.7679 | 0.566 | 0.8355 | 0.7029 | 0.7214 | 0.2586 | | 0.4638 | 36.0 | 18000 | 0.4485 | 0.6864 | 0.9007 | 0.7957 | 0.7238 | 0.5375 | 0.683 | 0.6833 | 0.7978 | 0.1103 | 0.2766 | 0.7337 | 0.7394 | 0.7798 | 0.6041 | 0.8342 | 0.7294 | 0.7441 | 0.1986 | | 0.4631 | 37.0 | 18500 | 0.4289 | 0.6983 | 0.8846 | 0.819 | 0.7339 | 0.5411 | 0.6778 | 0.6983 | 0.8198 | 0.1593 | 0.282 | 0.7377 | 0.7435 | 0.7877 | 0.5948 | 0.8479 | 0.7242 | 0.7472 | 0.26 | | 0.4801 | 38.0 | 19000 | 0.4302 | 0.7033 | 0.9186 | 0.8231 | 0.7085 | 0.596 | 0.7351 | 0.6893 | 0.8056 | 0.208 | 0.2852 | 0.7465 | 0.7534 | 0.7635 | 0.6608 | 0.8358 | 0.7803 | 0.7432 | 0.2871 | | 0.5169 | 39.0 | 19500 | 0.4603 | 0.6792 | 0.9211 | 0.8229 | 0.6854 | 0.5802 | 0.7224 | 0.6702 | 0.7719 | 0.0846 | 0.2782 | 0.7283 | 0.7346 | 0.746 | 0.6495 | 0.8082 | 0.7577 | 0.7324 | 0.191 | | 0.5702 | 40.0 | 20000 | 0.4284 | 0.7044 | 0.9409 | 0.8336 | 0.7053 | 0.6151 | 0.7657 | 0.6982 | 0.7928 | 0.1289 | 0.2882 | 0.7526 | 0.7597 | 0.7603 | 0.6845 | 0.8342 | 0.8075 | 0.7525 | 0.2648 | | 0.4602 | 41.0 | 20500 | 0.4185 | 0.7108 | 0.9349 | 0.8528 | 0.7103 | 0.6225 | 0.7698 | 0.7045 | 0.7996 | 0.1198 | 0.286 | 0.7573 | 0.7632 | 0.7627 | 0.6876 | 0.8394 | 0.8156 | 0.7583 | 0.209 | | 0.5054 | 42.0 | 21000 | 0.4112 | 0.7046 | 0.9386 | 0.8376 | 0.7135 | 0.6025 | 0.7612 | 0.6897 | 0.7979 | 0.1176 | 0.285 | 0.7544 | 0.7605 | 0.7679 | 0.6784 | 0.8352 | 0.8061 | 0.7498 | 0.2319 | | 0.4585 | 43.0 | 21500 | 0.4149 | 0.7019 | 0.9352 | 0.831 | 0.7039 | 0.5973 | 0.7528 | 0.6919 | 0.8043 | 0.1512 | 0.2842 | 0.746 | 0.7539 | 0.7587 | 0.6619 | 0.8412 | 0.8012 | 0.7437 | 0.2824 | | 0.4809 | 44.0 | 22000 | 0.4257 | 0.7114 | 0.946 | 0.8499 | 0.7028 | 0.6311 | 0.7628 | 0.6952 | 0.8004 | 0.1798 | 0.2847 | 0.7502 | 0.7581 | 0.7452 | 0.6918 | 0.8373 | 0.8051 | 0.7436 | 0.2981 | | 0.5096 | 45.0 | 22500 | 0.3866 | 0.7301 | 0.9409 | 0.8656 | 0.7337 | 0.6316 | 0.7604 | 0.723 | 0.825 | 0.2305 | 0.2934 | 0.768 | 0.7752 | 0.7853 | 0.6825 | 0.8579 | 0.7969 | 0.7703 | 0.3581 | | 0.3569 | 46.0 | 23000 | 0.3903 | 0.7354 | 0.9551 | 0.85 | 0.7441 | 0.6476 | 0.7816 | 0.7281 | 0.8146 | 0.1643 | 0.2973 | 0.7783 | 0.7852 | 0.7913 | 0.7113 | 0.853 | 0.8215 | 0.78 | 0.2905 | | 0.5786 
| 47.0 | 23500 | 0.3864 | 0.7324 | 0.9466 | 0.8595 | 0.7353 | 0.6618 | 0.7822 | 0.717 | 0.8 | 0.118 | 0.2988 | 0.7731 | 0.779 | 0.7889 | 0.7124 | 0.8358 | 0.82 | 0.7693 | 0.2252 | | 0.5832 | 48.0 | 24000 | 0.3837 | 0.7295 | 0.9548 | 0.8663 | 0.7363 | 0.6488 | 0.7473 | 0.7188 | 0.8036 | 0.2165 | 0.2953 | 0.7746 | 0.7835 | 0.7925 | 0.7144 | 0.8436 | 0.7958 | 0.775 | 0.3795 | | 0.4607 | 49.0 | 24500 | 0.3718 | 0.7349 | 0.952 | 0.86 | 0.7436 | 0.653 | 0.7486 | 0.7263 | 0.8081 | 0.2217 | 0.2972 | 0.7798 | 0.7852 | 0.7929 | 0.7134 | 0.8494 | 0.7947 | 0.78 | 0.3295 | | 0.4544 | 50.0 | 25000 | 0.3855 | 0.7337 | 0.9509 | 0.8708 | 0.7415 | 0.6595 | 0.7572 | 0.7273 | 0.8002 | 0.1915 | 0.2962 | 0.776 | 0.7831 | 0.7948 | 0.7155 | 0.8391 | 0.8071 | 0.7778 | 0.3548 | | 0.4856 | 51.0 | 25500 | 0.3908 | 0.7289 | 0.948 | 0.8855 | 0.7357 | 0.6467 | 0.7705 | 0.7117 | 0.8042 | 0.2033 | 0.2988 | 0.7699 | 0.7754 | 0.7821 | 0.7082 | 0.8358 | 0.8117 | 0.7603 | 0.3552 | | 0.525 | 52.0 | 26000 | 0.3737 | 0.7356 | 0.9475 | 0.8752 | 0.7445 | 0.661 | 0.7775 | 0.7244 | 0.8012 | 0.1072 | 0.298 | 0.7774 | 0.7847 | 0.7917 | 0.7268 | 0.8358 | 0.8254 | 0.7733 | 0.2576 | | 0.461 | 53.0 | 26500 | 0.3872 | 0.73 | 0.9538 | 0.8836 | 0.7342 | 0.6596 | 0.7643 | 0.7287 | 0.796 | 0.1453 | 0.2963 | 0.7755 | 0.7815 | 0.779 | 0.7309 | 0.8345 | 0.8016 | 0.7774 | 0.2843 | | 0.4168 | 54.0 | 27000 | 0.3672 | 0.7432 | 0.9508 | 0.8815 | 0.7403 | 0.6648 | 0.7746 | 0.7364 | 0.8247 | 0.1979 | 0.3004 | 0.7866 | 0.7924 | 0.7933 | 0.7237 | 0.8603 | 0.8117 | 0.7876 | 0.3281 | | 0.5283 | 55.0 | 27500 | 0.3803 | 0.7312 | 0.9393 | 0.8797 | 0.73 | 0.6559 | 0.7731 | 0.7226 | 0.8077 | 0.2027 | 0.2998 | 0.7742 | 0.778 | 0.7802 | 0.7093 | 0.8445 | 0.81 | 0.7706 | 0.2871 | | 0.4825 | 56.0 | 28000 | 0.3591 | 0.7475 | 0.9513 | 0.8948 | 0.7531 | 0.6794 | 0.7812 | 0.7373 | 0.8099 | 0.2344 | 0.304 | 0.7953 | 0.7992 | 0.8052 | 0.7412 | 0.8512 | 0.825 | 0.7872 | 0.361 | | 0.4286 | 57.0 | 28500 | 0.3636 | 0.7375 | 0.9587 | 0.8681 | 0.7436 | 0.6484 | 0.7447 | 0.7316 | 0.8204 | 0.2267 | 0.2952 | 0.7811 | 0.7873 | 0.7917 | 0.7103 | 0.86 | 0.7927 | 0.7801 | 0.3533 | | 0.505 | 58.0 | 29000 | 0.3713 | 0.7322 | 0.9479 | 0.8768 | 0.7316 | 0.6583 | 0.7717 | 0.7208 | 0.8067 | 0.24 | 0.2949 | 0.7722 | 0.7786 | 0.7782 | 0.7155 | 0.8421 | 0.8092 | 0.7716 | 0.32 | | 0.3802 | 59.0 | 29500 | 0.3628 | 0.7445 | 0.9469 | 0.8799 | 0.7358 | 0.678 | 0.7525 | 0.7393 | 0.8196 | 0.2474 | 0.2973 | 0.782 | 0.7885 | 0.781 | 0.7278 | 0.8567 | 0.7951 | 0.784 | 0.32 | | 0.3638 | 60.0 | 30000 | 0.3528 | 0.7432 | 0.9472 | 0.8783 | 0.7524 | 0.6658 | 0.7709 | 0.7413 | 0.8114 | 0.2418 | 0.2996 | 0.7839 | 0.7893 | 0.798 | 0.7175 | 0.8524 | 0.8165 | 0.7875 | 0.3619 | | 0.4559 | 61.0 | 30500 | 0.3543 | 0.7393 | 0.9569 | 0.8867 | 0.7432 | 0.6716 | 0.7413 | 0.7374 | 0.8031 | 0.2875 | 0.2987 | 0.7806 | 0.7855 | 0.7917 | 0.7227 | 0.8421 | 0.7821 | 0.7842 | 0.3805 | | 0.5254 | 62.0 | 31000 | 0.3775 | 0.7268 | 0.9492 | 0.8686 | 0.7326 | 0.6375 | 0.7406 | 0.7164 | 0.8103 | 0.2894 | 0.2896 | 0.7669 | 0.7745 | 0.7802 | 0.6938 | 0.8494 | 0.7812 | 0.7667 | 0.409 | | 0.3529 | 63.0 | 31500 | 0.3562 | 0.7523 | 0.9533 | 0.8955 | 0.7437 | 0.6887 | 0.7747 | 0.7458 | 0.8246 | 0.2767 | 0.3047 | 0.793 | 0.7998 | 0.7948 | 0.7443 | 0.8603 | 0.8215 | 0.7922 | 0.4081 | | 0.4234 | 64.0 | 32000 | 0.3625 | 0.7424 | 0.9439 | 0.8858 | 0.7355 | 0.6711 | 0.7568 | 0.7401 | 0.8207 | 0.2197 | 0.2969 | 0.7798 | 0.7861 | 0.7845 | 0.7175 | 0.8564 | 0.8095 | 0.7821 | 0.3205 | | 0.4396 | 65.0 | 32500 | 0.3512 | 0.7614 | 0.9564 | 0.891 | 
0.7586 | 0.7001 | 0.7852 | 0.7489 | 0.8255 | 0.2531 | 0.3058 | 0.8001 | 0.8069 | 0.8048 | 0.7557 | 0.8603 | 0.8285 | 0.7985 | 0.3833 | | 0.4173 | 66.0 | 33000 | 0.3434 | 0.77 | 0.9558 | 0.8952 | 0.7579 | 0.7253 | 0.8013 | 0.7656 | 0.8268 | 0.2052 | 0.3066 | 0.8065 | 0.8138 | 0.8091 | 0.7691 | 0.8633 | 0.8334 | 0.8101 | 0.3562 | | 0.4697 | 67.0 | 33500 | 0.3513 | 0.7545 | 0.9452 | 0.8796 | 0.7586 | 0.677 | 0.7904 | 0.7461 | 0.8279 | 0.2171 | 0.3026 | 0.7924 | 0.799 | 0.8087 | 0.7247 | 0.8636 | 0.8351 | 0.788 | 0.32 | | 0.4771 | 68.0 | 34000 | 0.3578 | 0.7577 | 0.9455 | 0.8704 | 0.7557 | 0.6993 | 0.8022 | 0.7413 | 0.818 | 0.1788 | 0.3074 | 0.7947 | 0.8017 | 0.8004 | 0.7464 | 0.8582 | 0.8384 | 0.7888 | 0.329 | | 0.4833 | 69.0 | 34500 | 0.3555 | 0.7502 | 0.9465 | 0.8737 | 0.7537 | 0.6766 | 0.7661 | 0.7317 | 0.8202 | 0.2681 | 0.2995 | 0.789 | 0.7955 | 0.7984 | 0.7299 | 0.8582 | 0.8117 | 0.7848 | 0.3676 | | 0.4091 | 70.0 | 35000 | 0.3746 | 0.7332 | 0.9476 | 0.8716 | 0.7196 | 0.6808 | 0.7409 | 0.728 | 0.7992 | 0.1368 | 0.3026 | 0.7781 | 0.7825 | 0.7726 | 0.7351 | 0.8397 | 0.7993 | 0.7767 | 0.2071 | | 0.3662 | 71.0 | 35500 | 0.3476 | 0.748 | 0.9477 | 0.8928 | 0.2295 | 0.7423 | 0.7833 | 0.3014 | 0.791 | 0.7964 | 0.3529 | 0.7918 | 0.8305 | 0.7374 | 0.7873 | 0.6861 | 0.7361 | 0.8204 | 0.8658 | | 0.4244 | 72.0 | 36000 | 0.3509 | 0.7518 | 0.9432 | 0.8724 | 0.2416 | 0.7471 | 0.7649 | 0.2982 | 0.792 | 0.797 | 0.3657 | 0.7909 | 0.8121 | 0.7548 | 0.8004 | 0.6835 | 0.7299 | 0.8172 | 0.8606 | | 0.4483 | 73.0 | 36500 | 0.3508 | 0.7472 | 0.9483 | 0.897 | 0.2306 | 0.753 | 0.7725 | 0.2979 | 0.7963 | 0.8027 | 0.409 | 0.8053 | 0.8304 | 0.738 | 0.7944 | 0.6835 | 0.7474 | 0.8201 | 0.8664 | | 0.4498 | 74.0 | 37000 | 0.3357 | 0.7632 | 0.9535 | 0.882 | 0.2094 | 0.7633 | 0.7862 | 0.3069 | 0.8033 | 0.8087 | 0.3448 | 0.8104 | 0.8324 | 0.7617 | 0.8091 | 0.7007 | 0.7485 | 0.8272 | 0.8685 | | 0.5208 | 75.0 | 37500 | 0.3492 | 0.7598 | 0.9506 | 0.8859 | 0.2379 | 0.7612 | 0.7963 | 0.3067 | 0.7977 | 0.8034 | 0.3476 | 0.8052 | 0.8349 | 0.7466 | 0.7917 | 0.6987 | 0.7454 | 0.834 | 0.873 | | 0.3542 | 76.0 | 38000 | 0.3492 | 0.7606 | 0.9431 | 0.8889 | 0.2385 | 0.7543 | 0.7958 | 0.3072 | 0.7975 | 0.8028 | 0.3724 | 0.7957 | 0.8427 | 0.7548 | 0.7988 | 0.7124 | 0.7526 | 0.8146 | 0.857 | | 0.439 | 77.0 | 38500 | 0.3485 | 0.7617 | 0.9583 | 0.8965 | 0.213 | 0.7633 | 0.7814 | 0.3039 | 0.7998 | 0.8063 | 0.3343 | 0.8052 | 0.8302 | 0.7599 | 0.8036 | 0.6995 | 0.7474 | 0.8257 | 0.8679 | | 0.4294 | 78.0 | 39000 | 0.3406 | 0.7562 | 0.947 | 0.8774 | 0.2508 | 0.7586 | 0.7739 | 0.3044 | 0.7915 | 0.7994 | 0.3657 | 0.8001 | 0.8081 | 0.7478 | 0.7917 | 0.6802 | 0.7268 | 0.8405 | 0.8797 | | 0.3643 | 79.0 | 39500 | 0.3285 | 0.7607 | 0.9492 | 0.8828 | 0.2242 | 0.7627 | 0.7663 | 0.3047 | 0.7998 | 0.8045 | 0.3529 | 0.8074 | 0.809 | 0.7602 | 0.804 | 0.6911 | 0.7402 | 0.8309 | 0.8694 | | 0.3089 | 80.0 | 40000 | 0.3194 | 0.7734 | 0.9514 | 0.8911 | 0.2526 | 0.773 | 0.808 | 0.3087 | 0.8092 | 0.8163 | 0.3805 | 0.8168 | 0.8501 | 0.773 | 0.8175 | 0.7036 | 0.7515 | 0.8436 | 0.88 | | 0.3825 | 81.0 | 40500 | 0.3217 | 0.7671 | 0.9532 | 0.8831 | 0.2602 | 0.7599 | 0.8157 | 0.3076 | 0.8079 | 0.8136 | 0.3638 | 0.8041 | 0.8615 | 0.7627 | 0.8147 | 0.7031 | 0.7557 | 0.8354 | 0.8703 | | 0.465 | 82.0 | 41000 | 0.3319 | 0.7729 | 0.9571 | 0.8953 | 0.2869 | 0.7677 | 0.812 | 0.3083 | 0.8123 | 0.8173 | 0.3862 | 0.8111 | 0.8534 | 0.7579 | 0.8024 | 0.7203 | 0.7711 | 0.8404 | 0.8785 | | 0.3699 | 83.0 | 41500 | 0.3355 | 0.7681 | 0.9404 | 0.8881 | 0.2056 | 0.7663 | 0.7947 | 0.3088 | 0.8062 
| 0.8112 | 0.2671 | 0.809 | 0.8387 | 0.7788 | 0.8246 | 0.7036 | 0.7443 | 0.8218 | 0.8645 | | 0.4712 | 84.0 | 42000 | 0.3503 | 0.7537 | 0.9542 | 0.8957 | 0.2904 | 0.7501 | 0.7776 | 0.3017 | 0.7948 | 0.8006 | 0.3829 | 0.7959 | 0.8247 | 0.7472 | 0.7984 | 0.6924 | 0.7423 | 0.8214 | 0.8612 | | 0.3711 | 85.0 | 42500 | 0.3334 | 0.7686 | 0.9549 | 0.8986 | 0.2611 | 0.7664 | 0.8073 | 0.3066 | 0.8066 | 0.8126 | 0.3724 | 0.8067 | 0.8506 | 0.7423 | 0.7905 | 0.7212 | 0.768 | 0.8425 | 0.8794 | | 0.4093 | 86.0 | 43000 | 0.3299 | 0.7711 | 0.9535 | 0.8948 | 0.2808 | 0.7678 | 0.8096 | 0.31 | 0.8091 | 0.8156 | 0.3848 | 0.8098 | 0.8489 | 0.7478 | 0.7996 | 0.7176 | 0.7629 | 0.8478 | 0.8842 | | 0.447 | 87.0 | 43500 | 0.3274 | 0.7718 | 0.9547 | 0.8992 | 0.2794 | 0.7699 | 0.8004 | 0.3086 | 0.8129 | 0.8177 | 0.3738 | 0.8106 | 0.8445 | 0.7686 | 0.8151 | 0.7125 | 0.7639 | 0.8343 | 0.8742 | | 0.3878 | 88.0 | 44000 | 0.3162 | 0.7836 | 0.9558 | 0.9025 | 0.2726 | 0.7785 | 0.8195 | 0.3138 | 0.8202 | 0.8262 | 0.3805 | 0.8201 | 0.8567 | 0.7758 | 0.8206 | 0.7307 | 0.7763 | 0.8442 | 0.8818 | | 0.3293 | 89.0 | 44500 | 0.3279 | 0.7753 | 0.9585 | 0.8908 | 0.2607 | 0.7729 | 0.8023 | 0.3112 | 0.8129 | 0.8193 | 0.3748 | 0.8182 | 0.8371 | 0.76 | 0.8095 | 0.7257 | 0.7732 | 0.8403 | 0.8752 | | 0.279 | 90.0 | 45000 | 0.3147 | 0.7774 | 0.9502 | 0.8862 | 0.2608 | 0.7753 | 0.8075 | 0.3091 | 0.8166 | 0.8217 | 0.351 | 0.8164 | 0.8512 | 0.7737 | 0.821 | 0.7143 | 0.7608 | 0.8442 | 0.8833 | | 0.339 | 91.0 | 45500 | 0.3120 | 0.7779 | 0.9532 | 0.8949 | 0.2683 | 0.7732 | 0.8047 | 0.3094 | 0.8169 | 0.8225 | 0.3881 | 0.8181 | 0.8504 | 0.7784 | 0.8262 | 0.7125 | 0.7598 | 0.8428 | 0.8815 | | 0.3912 | 92.0 | 46000 | 0.3251 | 0.7654 | 0.9549 | 0.9026 | 0.239 | 0.7613 | 0.7949 | 0.3052 | 0.8083 | 0.8145 | 0.4105 | 0.81 | 0.8352 | 0.7566 | 0.8115 | 0.7011 | 0.7536 | 0.8385 | 0.8785 | | 0.3807 | 93.0 | 46500 | 0.3135 | 0.775 | 0.9623 | 0.8789 | 0.3063 | 0.7761 | 0.8012 | 0.3088 | 0.8154 | 0.822 | 0.4376 | 0.8208 | 0.845 | 0.7674 | 0.821 | 0.7131 | 0.7608 | 0.8444 | 0.8842 | | 0.3656 | 94.0 | 47000 | 0.3086 | 0.7801 | 0.95 | 0.8789 | 0.2709 | 0.7843 | 0.8114 | 0.3144 | 0.8184 | 0.8227 | 0.3586 | 0.8282 | 0.8487 | 0.7752 | 0.8238 | 0.726 | 0.766 | 0.8391 | 0.8782 | | 0.4247 | 95.0 | 47500 | 0.3114 | 0.7796 | 0.9586 | 0.8881 | 0.3308 | 0.7744 | 0.7972 | 0.3095 | 0.8172 | 0.8224 | 0.4505 | 0.8143 | 0.8408 | 0.7644 | 0.8135 | 0.7272 | 0.7701 | 0.8473 | 0.8836 | | 0.4126 | 96.0 | 48000 | 0.3133 | 0.7738 | 0.9614 | 0.8988 | 0.3127 | 0.7708 | 0.8021 | 0.31 | 0.8124 | 0.8185 | 0.3962 | 0.8166 | 0.842 | 0.7608 | 0.8087 | 0.7137 | 0.7629 | 0.8468 | 0.8839 | | 0.359 | 97.0 | 48500 | 0.3201 | 0.7733 | 0.953 | 0.906 | 0.3088 | 0.7727 | 0.8001 | 0.3107 | 0.81 | 0.8155 | 0.391 | 0.8137 | 0.8409 | 0.7506 | 0.7964 | 0.7168 | 0.7629 | 0.8526 | 0.8873 | | 0.4638 | 98.0 | 49000 | 0.3107 | 0.782 | 0.9587 | 0.887 | 0.3189 | 0.783 | 0.8032 | 0.3128 | 0.8212 | 0.8258 | 0.3933 | 0.8258 | 0.8441 | 0.7726 | 0.8147 | 0.7132 | 0.767 | 0.8601 | 0.8958 | | 0.3504 | 99.0 | 49500 | 0.3072 | 0.7808 | 0.9538 | 0.9073 | 0.3213 | 0.7827 | 0.8113 | 0.3134 | 0.8181 | 0.823 | 0.4033 | 0.8219 | 0.8499 | 0.7752 | 0.823 | 0.7103 | 0.7546 | 0.8571 | 0.8912 | | 0.4122 | 100.0 | 50000 | 0.3071 | 0.7832 | 0.9591 | 0.9103 | 0.3515 | 0.7866 | 0.7982 | 0.3125 | 0.8203 | 0.8259 | 0.4233 | 0.8275 | 0.8408 | 0.7704 | 0.8179 | 0.7221 | 0.7691 | 0.8572 | 0.8909 | | 0.4066 | 101.0 | 50500 | 0.3091 | 0.7845 | 0.9595 | 0.8987 | 0.3408 | 0.7879 | 0.8005 | 0.3126 | 0.82 | 0.8259 | 0.4086 | 0.826 | 0.8379 | 0.7781 | 
0.823 | 0.7198 | 0.7649 | 0.8555 | 0.8897 | | 0.3207 | 102.0 | 51000 | 0.3127 | 0.7783 | 0.9531 | 0.8992 | 0.3268 | 0.7782 | 0.8049 | 0.312 | 0.8201 | 0.8238 | 0.4262 | 0.8221 | 0.8476 | 0.7715 | 0.8198 | 0.7081 | 0.7588 | 0.8555 | 0.8927 | | 0.3462 | 103.0 | 51500 | 0.3052 | 0.7911 | 0.957 | 0.9074 | 0.3095 | 0.7945 | 0.8058 | 0.3146 | 0.8252 | 0.83 | 0.41 | 0.8332 | 0.8423 | 0.7746 | 0.8194 | 0.7385 | 0.7763 | 0.8601 | 0.8942 | | 0.3938 | 104.0 | 52000 | 0.2955 | 0.793 | 0.9638 | 0.9043 | 0.3318 | 0.7971 | 0.8019 | 0.3167 | 0.8306 | 0.8352 | 0.4348 | 0.839 | 0.8371 | 0.7777 | 0.825 | 0.7434 | 0.7876 | 0.858 | 0.893 | | 0.3236 | 105.0 | 52500 | 0.2997 | 0.7909 | 0.9607 | 0.9079 | 0.3475 | 0.7935 | 0.8059 | 0.3149 | 0.8265 | 0.8323 | 0.4552 | 0.8326 | 0.8405 | 0.7767 | 0.8206 | 0.7373 | 0.7825 | 0.8588 | 0.8939 | | 0.3559 | 106.0 | 53000 | 0.2987 | 0.7918 | 0.9599 | 0.9093 | 0.3328 | 0.8003 | 0.8102 | 0.3167 | 0.8294 | 0.8358 | 0.4367 | 0.8392 | 0.8494 | 0.7782 | 0.8278 | 0.7367 | 0.7856 | 0.8606 | 0.8939 | | 0.39 | 107.0 | 53500 | 0.3079 | 0.7835 | 0.961 | 0.9054 | 0.3091 | 0.7855 | 0.7968 | 0.3115 | 0.8186 | 0.8253 | 0.409 | 0.8258 | 0.8377 | 0.7606 | 0.8079 | 0.7349 | 0.7784 | 0.8551 | 0.8897 | | 0.362 | 108.0 | 54000 | 0.2972 | 0.7924 | 0.9578 | 0.905 | 0.3031 | 0.7963 | 0.8132 | 0.3145 | 0.8291 | 0.8352 | 0.4043 | 0.8369 | 0.8524 | 0.7803 | 0.8234 | 0.7422 | 0.7918 | 0.8548 | 0.8903 | | 0.3628 | 109.0 | 54500 | 0.3163 | 0.7773 | 0.9624 | 0.9104 | 0.3353 | 0.7826 | 0.7896 | 0.3092 | 0.8181 | 0.8238 | 0.4433 | 0.8235 | 0.8397 | 0.7642 | 0.8131 | 0.7221 | 0.7732 | 0.8457 | 0.8852 | | 0.3574 | 110.0 | 55000 | 0.3100 | 0.781 | 0.9617 | 0.9053 | 0.3376 | 0.7829 | 0.806 | 0.3118 | 0.82 | 0.8256 | 0.4252 | 0.8243 | 0.8469 | 0.7664 | 0.8147 | 0.7253 | 0.7753 | 0.8511 | 0.887 | | 0.368 | 111.0 | 55500 | 0.2933 | 0.7928 | 0.9593 | 0.9002 | 0.3206 | 0.7956 | 0.8223 | 0.3186 | 0.8282 | 0.8351 | 0.4252 | 0.8349 | 0.8599 | 0.79 | 0.8365 | 0.7313 | 0.7753 | 0.857 | 0.8936 | | 0.3394 | 112.0 | 56000 | 0.2973 | 0.792 | 0.9547 | 0.9061 | 0.3368 | 0.7922 | 0.8306 | 0.3146 | 0.827 | 0.8325 | 0.4029 | 0.83 | 0.8643 | 0.7841 | 0.8294 | 0.7361 | 0.7763 | 0.8559 | 0.8918 | | 0.3677 | 113.0 | 56500 | 0.2919 | 0.7984 | 0.9604 | 0.9117 | 0.3672 | 0.7988 | 0.8247 | 0.3174 | 0.8343 | 0.8398 | 0.4433 | 0.837 | 0.8588 | 0.7855 | 0.8321 | 0.7524 | 0.7938 | 0.8574 | 0.8933 | | 0.3681 | 114.0 | 57000 | 0.3044 | 0.7833 | 0.9586 | 0.8961 | 0.326 | 0.786 | 0.8076 | 0.3116 | 0.8207 | 0.8256 | 0.3905 | 0.8261 | 0.8483 | 0.7709 | 0.8179 | 0.7311 | 0.7732 | 0.848 | 0.8858 | | 0.3562 | 115.0 | 57500 | 0.2904 | 0.7918 | 0.9632 | 0.8991 | 0.3676 | 0.793 | 0.8148 | 0.3128 | 0.8293 | 0.8339 | 0.4362 | 0.8345 | 0.8539 | 0.7843 | 0.827 | 0.7299 | 0.7804 | 0.8612 | 0.8942 | | 0.3524 | 116.0 | 58000 | 0.2993 | 0.7868 | 0.9596 | 0.8935 | 0.3494 | 0.7868 | 0.8053 | 0.3096 | 0.8241 | 0.8305 | 0.4462 | 0.8306 | 0.8468 | 0.7799 | 0.8258 | 0.7199 | 0.7691 | 0.8604 | 0.8967 | | 0.3553 | 117.0 | 58500 | 0.2957 | 0.7906 | 0.9596 | 0.8967 | 0.3272 | 0.7925 | 0.808 | 0.3134 | 0.8307 | 0.8368 | 0.4229 | 0.8383 | 0.8503 | 0.7817 | 0.8298 | 0.7306 | 0.7845 | 0.8596 | 0.8961 | | 0.3976 | 118.0 | 59000 | 0.2960 | 0.7898 | 0.9594 | 0.8957 | 0.3654 | 0.7936 | 0.8076 | 0.3122 | 0.8276 | 0.8337 | 0.451 | 0.8373 | 0.8466 | 0.7777 | 0.8262 | 0.7295 | 0.7763 | 0.8623 | 0.8985 | | 0.3359 | 119.0 | 59500 | 0.3019 | 0.787 | 0.9608 | 0.901 | 0.3603 | 0.7874 | 0.8063 | 0.3129 | 0.8247 | 0.83 | 0.4529 | 0.8295 | 0.8428 | 0.7713 | 0.8187 | 0.7276 | 0.7753 | 0.8622 
| 0.8961 | | 0.3539 | 120.0 | 60000 | 0.2955 | 0.791 | 0.9618 | 0.8983 | 0.346 | 0.7919 | 0.809 | 0.3134 | 0.828 | 0.8339 | 0.4557 | 0.8337 | 0.8467 | 0.7759 | 0.8234 | 0.7341 | 0.7804 | 0.8629 | 0.8979 | | 0.3807 | 121.0 | 60500 | 0.2925 | 0.7959 | 0.9593 | 0.894 | 0.3381 | 0.7973 | 0.8155 | 0.3171 | 0.834 | 0.8391 | 0.4519 | 0.8399 | 0.8493 | 0.7827 | 0.8321 | 0.7384 | 0.7856 | 0.8665 | 0.8997 | | 0.3657 | 122.0 | 61000 | 0.3006 | 0.7916 | 0.9634 | 0.8955 | 0.3356 | 0.7899 | 0.8187 | 0.3182 | 0.8276 | 0.8345 | 0.4352 | 0.834 | 0.8516 | 0.7768 | 0.8242 | 0.7377 | 0.7835 | 0.8604 | 0.8958 | | 0.3388 | 123.0 | 61500 | 0.2985 | 0.7894 | 0.9631 | 0.8973 | 0.3342 | 0.7913 | 0.8084 | 0.3131 | 0.8275 | 0.833 | 0.44 | 0.8346 | 0.8481 | 0.7735 | 0.8206 | 0.7316 | 0.7814 | 0.8629 | 0.897 | | 0.3234 | 124.0 | 62000 | 0.2949 | 0.7951 | 0.9601 | 0.8963 | 0.3344 | 0.7999 | 0.8059 | 0.3156 | 0.8322 | 0.8381 | 0.4452 | 0.8401 | 0.8458 | 0.7824 | 0.8306 | 0.7392 | 0.7845 | 0.8636 | 0.8991 | | 0.341 | 125.0 | 62500 | 0.2918 | 0.7977 | 0.964 | 0.9012 | 0.3369 | 0.8003 | 0.82 | 0.3165 | 0.836 | 0.8414 | 0.4419 | 0.8434 | 0.8564 | 0.783 | 0.8294 | 0.7453 | 0.7948 | 0.8647 | 0.9 | | 0.3289 | 126.0 | 63000 | 0.2900 | 0.8004 | 0.9627 | 0.904 | 0.3378 | 0.8034 | 0.823 | 0.3186 | 0.8403 | 0.8461 | 0.4452 | 0.8482 | 0.8596 | 0.7823 | 0.8329 | 0.7492 | 0.8031 | 0.8698 | 0.9021 | | 0.3619 | 127.0 | 63500 | 0.2918 | 0.7985 | 0.9625 | 0.8978 | 0.3304 | 0.804 | 0.8171 | 0.3174 | 0.8368 | 0.8421 | 0.4114 | 0.8459 | 0.855 | 0.7839 | 0.8294 | 0.745 | 0.7969 | 0.8668 | 0.9 | | 0.3725 | 128.0 | 64000 | 0.2907 | 0.7956 | 0.9634 | 0.8987 | 0.3324 | 0.8023 | 0.8105 | 0.3157 | 0.8345 | 0.8401 | 0.4267 | 0.8446 | 0.8508 | 0.7829 | 0.8266 | 0.7403 | 0.7948 | 0.8636 | 0.8988 | | 0.3488 | 129.0 | 64500 | 0.2899 | 0.7982 | 0.9623 | 0.8974 | 0.3311 | 0.8049 | 0.8123 | 0.3171 | 0.8372 | 0.8432 | 0.4352 | 0.8481 | 0.8514 | 0.7821 | 0.8298 | 0.7478 | 0.8 | 0.8647 | 0.8997 | | 0.2774 | 130.0 | 65000 | 0.2880 | 0.7985 | 0.9623 | 0.8972 | 0.3337 | 0.8056 | 0.8152 | 0.3174 | 0.8363 | 0.8424 | 0.4286 | 0.8472 | 0.8526 | 0.7842 | 0.8313 | 0.7457 | 0.7959 | 0.8656 | 0.9 | | 0.3268 | 131.0 | 65500 | 0.2950 | 0.7966 | 0.9645 | 0.904 | 0.3272 | 0.8028 | 0.8135 | 0.3162 | 0.836 | 0.8422 | 0.4352 | 0.8471 | 0.8531 | 0.7806 | 0.8294 | 0.7465 | 0.799 | 0.8628 | 0.8982 | | 0.327 | 132.0 | 66000 | 0.2854 | 0.8026 | 0.9658 | 0.8916 | 0.3174 | 0.8085 | 0.8186 | 0.3177 | 0.8391 | 0.845 | 0.4124 | 0.8497 | 0.8567 | 0.7927 | 0.8369 | 0.7447 | 0.7959 | 0.8704 | 0.9021 | | 0.3712 | 133.0 | 66500 | 0.2902 | 0.8018 | 0.9658 | 0.8912 | 0.3277 | 0.8063 | 0.8175 | 0.3174 | 0.8391 | 0.8449 | 0.4352 | 0.8477 | 0.8584 | 0.7912 | 0.8365 | 0.7457 | 0.7969 | 0.8686 | 0.9012 | | 0.3267 | 134.0 | 67000 | 0.2885 | 0.8009 | 0.964 | 0.8987 | 0.3322 | 0.8065 | 0.8133 | 0.3167 | 0.8384 | 0.8438 | 0.4252 | 0.8484 | 0.8523 | 0.7863 | 0.8321 | 0.7468 | 0.7969 | 0.8696 | 0.9024 | | 0.4273 | 135.0 | 67500 | 0.2911 | 0.7979 | 0.9638 | 0.9028 | 0.3267 | 0.8033 | 0.8095 | 0.3175 | 0.8352 | 0.8406 | 0.4205 | 0.8448 | 0.8494 | 0.7828 | 0.8298 | 0.7428 | 0.7918 | 0.8681 | 0.9003 | | 0.3564 | 136.0 | 68000 | 0.2915 | 0.797 | 0.9634 | 0.9013 | 0.3325 | 0.8023 | 0.818 | 0.3176 | 0.8349 | 0.8406 | 0.419 | 0.8449 | 0.8553 | 0.7825 | 0.8286 | 0.7411 | 0.7918 | 0.8674 | 0.9015 | | 0.358 | 137.0 | 68500 | 0.2883 | 0.8007 | 0.9635 | 0.9034 | 0.3399 | 0.8054 | 0.8189 | 0.3187 | 0.8379 | 0.8434 | 0.4367 | 0.8469 | 0.8564 | 0.7868 | 0.8333 | 0.7455 | 0.7948 | 0.8698 | 0.9021 | | 0.3715 | 138.0 | 
69000 | 0.2868 | 0.7973 | 0.9632 | 0.9007 | 0.3366 | 0.8025 | 0.8125 | 0.3172 | 0.8358 | 0.8413 | 0.4286 | 0.8446 | 0.8521 | 0.7847 | 0.8317 | 0.7397 | 0.7907 | 0.8676 | 0.9015 | | 0.4042 | 139.0 | 69500 | 0.2852 | 0.8022 | 0.9636 | 0.903 | 0.341 | 0.8065 | 0.8131 | 0.3187 | 0.84 | 0.8453 | 0.4381 | 0.849 | 0.853 | 0.7864 | 0.8333 | 0.7506 | 0.801 | 0.8695 | 0.9015 | | 0.3881 | 140.0 | 70000 | 0.2871 | 0.8016 | 0.9632 | 0.8966 | 0.3441 | 0.8064 | 0.8174 | 0.3176 | 0.8384 | 0.8437 | 0.4348 | 0.8472 | 0.8561 | 0.7864 | 0.831 | 0.7496 | 0.799 | 0.8687 | 0.9012 | | 0.3214 | 141.0 | 70500 | 0.2878 | 0.798 | 0.9632 | 0.8974 | 0.3421 | 0.8015 | 0.8145 | 0.316 | 0.8372 | 0.8423 | 0.4381 | 0.8449 | 0.8547 | 0.7844 | 0.8306 | 0.7425 | 0.7969 | 0.8671 | 0.8994 | | 0.3357 | 142.0 | 71000 | 0.2879 | 0.7978 | 0.9639 | 0.8906 | 0.342 | 0.8026 | 0.8151 | 0.3163 | 0.8364 | 0.8416 | 0.4348 | 0.8452 | 0.8546 | 0.7829 | 0.8278 | 0.7456 | 0.7969 | 0.865 | 0.9 | | 0.302 | 143.0 | 71500 | 0.2862 | 0.8007 | 0.9638 | 0.8928 | 0.3448 | 0.8046 | 0.8169 | 0.317 | 0.8381 | 0.8434 | 0.4348 | 0.8472 | 0.856 | 0.7869 | 0.831 | 0.7467 | 0.7979 | 0.8684 | 0.9012 | | 0.3504 | 144.0 | 72000 | 0.2856 | 0.801 | 0.9638 | 0.8912 | 0.3394 | 0.8055 | 0.8199 | 0.3177 | 0.8392 | 0.8445 | 0.4381 | 0.8476 | 0.8576 | 0.7856 | 0.831 | 0.7486 | 0.801 | 0.8689 | 0.9015 | | 0.3533 | 145.0 | 72500 | 0.2863 | 0.7992 | 0.9637 | 0.8912 | 0.3428 | 0.8055 | 0.8154 | 0.3162 | 0.8378 | 0.8431 | 0.4381 | 0.8467 | 0.8551 | 0.7858 | 0.831 | 0.7443 | 0.7969 | 0.8676 | 0.9015 | | 0.3648 | 146.0 | 73000 | 0.2861 | 0.7987 | 0.9638 | 0.8913 | 0.3428 | 0.8047 | 0.8153 | 0.3162 | 0.8375 | 0.8427 | 0.4381 | 0.8463 | 0.855 | 0.7855 | 0.831 | 0.7435 | 0.7959 | 0.8672 | 0.9012 | | 0.3381 | 147.0 | 73500 | 0.2867 | 0.7996 | 0.9637 | 0.899 | 0.3428 | 0.8053 | 0.8153 | 0.3163 | 0.8379 | 0.8431 | 0.4381 | 0.8464 | 0.8551 | 0.7846 | 0.8302 | 0.747 | 0.7979 | 0.8672 | 0.9012 | | 0.3483 | 148.0 | 74000 | 0.2864 | 0.7995 | 0.9637 | 0.8989 | 0.3457 | 0.8055 | 0.8153 | 0.3165 | 0.8381 | 0.8433 | 0.4381 | 0.8467 | 0.8551 | 0.7833 | 0.8298 | 0.748 | 0.799 | 0.8672 | 0.9012 | | 0.3674 | 149.0 | 74500 | 0.2864 | 0.7992 | 0.9637 | 0.8989 | 0.3428 | 0.8051 | 0.8153 | 0.3162 | 0.8378 | 0.843 | 0.4381 | 0.8463 | 0.8551 | 0.7833 | 0.8298 | 0.747 | 0.7979 | 0.8672 | 0.9012 | | 0.3838 | 150.0 | 75000 | 0.2864 | 0.7992 | 0.9637 | 0.8989 | 0.3428 | 0.8051 | 0.8153 | 0.3162 | 0.8378 | 0.843 | 0.4381 | 0.8463 | 0.8551 | 0.7833 | 0.8298 | 0.747 | 0.7979 | 0.8672 | 0.9012 | ### Framework versions - Transformers 4.46.0 - Pytorch 2.5.0+cu121 - Datasets 2.19.2 - Tokenizers 0.20.1
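As with the card above, no training script is published; the hyperparameters listed above can be approximated with the sketch below. This is an assumption-laden outline, not the author's actual code: the dataset, collate function, image processor pipeline, and whatever augmentations the "w-transform" run applied are not described in the card and are therefore omitted or stubbed.

```python
# Hedged sketch only: mirrors the reported hyperparameters; dataset loading,
# the data collator, and COCO-style evaluation are placeholders the card omits.
from transformers import AutoModelForObjectDetection, Trainer, TrainingArguments

base_checkpoint = "facebook/detr-resnet-50"
model = AutoModelForObjectDetection.from_pretrained(
    base_checkpoint,
    id2label={0: "chicken", 1: "duck", 2: "plant"},  # assumed index order
    label2id={"chicken": 0, "duck": 1, "plant": 2},
    ignore_mismatched_sizes=True,  # re-initialise the head for the 3 classes
)

args = TrainingArguments(
    output_dir="chickens-composite-201616161616-150-epochs-w-transform",
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="cosine",
    num_train_epochs=150,
    remove_unused_columns=False,  # keep pixel_values/labels for the model
)

# trainer = Trainer(
#     model=model,
#     args=args,
#     train_dataset=train_ds,      # not provided by the card
#     eval_dataset=eval_ds,        # not provided by the card
#     data_collator=collate_fn,    # user-supplied padding collator
# )
# trainer.train()
```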
[ "chicken", "duck", "plant" ]
madhutry/detr-finetuned-scrheadermask-2
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
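The "How to Get Started with the Model" section above was left as a placeholder. Purely as a hedged sketch — assuming this checkpoint is an object-detection model that loads through the generic `Auto*` classes (the repository name suggests a DETR fine-tune) and using a hypothetical local image path — inference would look roughly like this:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Checkpoint name taken from this entry; the image path is a placeholder.
checkpoint = "madhutry/detr-finetuned-scrheadermask-2"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/box predictions into (score, label, box) in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes  # threshold is illustrative
)[0]

for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    name = model.config.id2label[label.item()]
    print(f"{name}: {score.item():.2f} at {[round(v, 1) for v in box.tolist()]}")
```

The printed names come from whatever `id2label` mapping was saved with the checkpoint; for this entry that is the generic `label_0`–`label_2` set listed below.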
[ "label_0", "label_1", "label_2" ]
mahee12345/table_tr
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
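The label list recorded for this entry (`label_0` … `label_6`) is the default placeholder mapping, which suggests no human-readable class names were written into the checkpoint's config. A small sketch — assuming the repository loads with `AutoConfig`, and with purely hypothetical replacement names — of how that mapping can be inspected and, if desired, overridden:

```python
from transformers import AutoConfig

# Repository name taken from this entry.
config = AutoConfig.from_pretrained("mahee12345/table_tr")

# With auto-generated cards like this one, id2label typically still holds the
# default placeholders, e.g. {0: "LABEL_0", ..., 6: "LABEL_6"}.
print(config.id2label)

# If the real class names are known, the mapping can be replaced before the
# config (or model) is saved or pushed again. The names below are hypothetical.
names = ["class_0", "class_1", "class_2", "class_3", "class_4", "class_5", "class_6"]
config.id2label = {i: n for i, n in enumerate(names)}
config.label2id = {n: i for i, n in enumerate(names)}
config.save_pretrained("table_tr-with-names")
```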
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5", "label_6" ]
madhutry/yolo-finetuned-98samples
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5" ]
madhutry/yolo-finetuned-98samples-2
# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]
[ "label_0", "label_1", "label_2", "label_3", "label_4", "label_5" ]
joe611/chickens-60-epoch-1000-images-aug
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-60-epoch-1000-images-aug This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.2851 - Map: 0.8003 - Map 50: 0.9645 - Map 75: 0.9196 - Map Small: 0.2246 - Map Medium: 0.7999 - Map Large: 0.8772 - Mar 1: 0.3086 - Mar 10: 0.8346 - Mar 100: 0.8384 - Mar Small: 0.3614 - Mar Medium: 0.8496 - Mar Large: 0.918 - Map Chicken: 0.8072 - Mar 100 Chicken: 0.8413 - Map Duck: 0.7689 - Mar 100 Duck: 0.799 - Map Plant: 0.8248 - Mar 100 Plant: 0.8749 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: cosine - num_epochs: 60 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Chicken | Mar 100 Chicken | Map Duck | Mar 100 Duck | Map Plant | Mar 100 Plant | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------:|:---------------:|:--------:|:------------:|:---------:|:-------------:| | 1.4811 | 1.0 | 500 | 1.3713 | 0.0772 | 0.1135 | 0.0875 | 0.004 | 0.0389 | 0.3534 | 0.0527 | 0.1857 | 0.2513 | 0.0875 | 0.2276 | 0.7937 | 0.0069 | 0.0062 | 0.0 | 0.0 | 0.2247 | 0.7476 | | 1.093 | 2.0 | 1000 | 1.1411 | 0.1941 | 0.2637 | 0.221 | 0.0042 | 0.134 | 0.6715 | 0.0688 | 0.2331 | 0.2586 | 0.1167 | 0.23 | 0.8259 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5822 | 0.7758 | | 1.06 | 3.0 | 1500 | 1.5299 | 0.1865 | 0.2571 | 0.2063 | 0.0172 | 0.1497 | 0.6155 | 0.0695 | 0.2154 | 0.2187 | 0.0583 | 0.199 | 0.6971 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5595 | 0.6562 | | 0.866 | 4.0 | 2000 | 1.0132 | 0.2298 | 0.307 | 0.2534 | 0.0251 | 0.1936 | 0.7269 | 0.0897 | 0.2671 | 0.2739 | 0.0688 | 0.2449 | 0.8172 | 0.0329 | 0.0613 | 0.0 | 0.0 | 0.6564 | 0.7602 | | 0.8301 | 5.0 | 2500 | 0.9037 | 0.2895 | 0.3972 | 0.3375 | 0.0486 | 0.25 | 0.7397 | 0.1158 | 0.3495 | 0.3568 | 0.1063 | 0.3238 | 0.8305 | 0.2025 | 0.3062 | 0.0 | 0.0 | 0.6662 | 0.7643 | | 0.8386 | 6.0 | 3000 | 0.9659 | 0.3229 | 0.4625 | 0.3953 | 0.012 | 0.2944 | 0.7146 | 0.1161 | 0.3938 | 0.3977 | 0.0333 | 0.3785 | 0.7837 | 0.3161 | 0.4693 | 0.0 | 0.0 | 0.6527 | 0.7236 | | 0.9838 | 7.0 | 3500 | 0.7706 | 0.3831 | 0.5318 | 0.4649 | 0.0314 | 0.3525 | 0.7573 | 0.1347 | 0.4783 | 0.4829 | 0.0854 | 0.4636 | 0.8234 | 0.4609 | 0.6889 | 0.0 | 0.0 | 0.6886 | 0.7599 | | 0.8051 | 8.0 | 4000 | 0.7424 | 0.3909 | 0.5361 | 0.4744 | 0.0298 | 0.362 | 0.7506 | 0.1396 | 0.489 | 0.4921 | 0.1063 | 0.4762 | 0.8222 | 0.4864 | 0.712 | 0.0 | 0.0 | 0.6864 | 0.7643 | | 0.7114 | 9.0 | 4500 | 0.6860 | 0.425 | 0.5595 | 0.5062 | 0.0656 | 0.4023 | 0.7822 | 0.1429 | 0.5123 | 0.5168 | 0.1104 | 0.5014 | 0.849 | 0.5552 | 0.7609 | 0.0 | 0.0 | 0.7197 | 0.7896 | | 0.8088 | 10.0 | 5000 | 0.6922 | 0.4107 | 
0.5706 | 0.5008 | 0.0425 | 0.3953 | 0.7653 | 0.1375 | 0.4929 | 0.4981 | 0.1333 | 0.4833 | 0.8272 | 0.527 | 0.7218 | 0.0 | 0.0 | 0.705 | 0.7726 | | 0.7049 | 11.0 | 5500 | 0.6989 | 0.4204 | 0.5653 | 0.5121 | 0.0859 | 0.4022 | 0.7752 | 0.1439 | 0.4952 | 0.499 | 0.1458 | 0.4863 | 0.8293 | 0.5406 | 0.7182 | 0.0 | 0.0 | 0.7205 | 0.7787 | | 0.7244 | 12.0 | 6000 | 0.6311 | 0.4276 | 0.584 | 0.5016 | 0.0749 | 0.4104 | 0.7945 | 0.1445 | 0.4989 | 0.5035 | 0.1542 | 0.4876 | 0.8515 | 0.5447 | 0.7133 | 0.0 | 0.0 | 0.7382 | 0.7971 | | 0.683 | 13.0 | 6500 | 0.6244 | 0.4371 | 0.5962 | 0.5288 | 0.0877 | 0.4149 | 0.7935 | 0.1447 | 0.5002 | 0.5031 | 0.1312 | 0.486 | 0.8427 | 0.5784 | 0.7244 | 0.0 | 0.0 | 0.7329 | 0.7847 | | 0.6541 | 14.0 | 7000 | 0.5543 | 0.4719 | 0.6191 | 0.5555 | 0.0712 | 0.4525 | 0.8231 | 0.1494 | 0.5195 | 0.5249 | 0.1896 | 0.5025 | 0.8715 | 0.6567 | 0.7644 | 0.0 | 0.0 | 0.759 | 0.8104 | | 0.6219 | 15.0 | 7500 | 0.5368 | 0.4754 | 0.6197 | 0.5553 | 0.0764 | 0.4584 | 0.825 | 0.1528 | 0.5216 | 0.5265 | 0.1688 | 0.5038 | 0.8711 | 0.6691 | 0.7724 | 0.0 | 0.0 | 0.7571 | 0.8072 | | 0.5842 | 16.0 | 8000 | 0.5325 | 0.4778 | 0.6269 | 0.5668 | 0.1147 | 0.4558 | 0.8015 | 0.1501 | 0.5178 | 0.5218 | 0.1604 | 0.5071 | 0.8556 | 0.6922 | 0.7636 | 0.0 | 0.0 | 0.7412 | 0.8017 | | 0.5704 | 17.0 | 8500 | 0.5437 | 0.5192 | 0.6982 | 0.6149 | 0.0616 | 0.5014 | 0.8084 | 0.1798 | 0.558 | 0.5618 | 0.1521 | 0.5445 | 0.8644 | 0.6772 | 0.7449 | 0.1347 | 0.1412 | 0.7456 | 0.7994 | | 0.5683 | 18.0 | 9000 | 0.5068 | 0.6324 | 0.8451 | 0.7659 | 0.0963 | 0.6253 | 0.8208 | 0.225 | 0.6739 | 0.6793 | 0.175 | 0.6808 | 0.8628 | 0.6996 | 0.7573 | 0.4404 | 0.4753 | 0.7573 | 0.8052 | | 0.6402 | 19.0 | 9500 | 0.4682 | 0.6741 | 0.8823 | 0.8298 | 0.1357 | 0.6748 | 0.8274 | 0.2516 | 0.7135 | 0.7185 | 0.2104 | 0.7246 | 0.8728 | 0.7195 | 0.7698 | 0.5335 | 0.567 | 0.7691 | 0.8187 | | 0.5664 | 20.0 | 10000 | 0.4793 | 0.6841 | 0.9057 | 0.8277 | 0.135 | 0.6878 | 0.8164 | 0.2585 | 0.7299 | 0.7341 | 0.2396 | 0.7463 | 0.8649 | 0.7325 | 0.7853 | 0.5558 | 0.5979 | 0.7638 | 0.819 | | 0.4411 | 21.0 | 10500 | 0.4448 | 0.7042 | 0.932 | 0.8592 | 0.1098 | 0.7039 | 0.8287 | 0.2789 | 0.7527 | 0.7568 | 0.1718 | 0.7703 | 0.8749 | 0.7128 | 0.7658 | 0.6338 | 0.6845 | 0.766 | 0.8202 | | 0.6106 | 22.0 | 11000 | 0.4142 | 0.7307 | 0.9307 | 0.8797 | 0.0773 | 0.735 | 0.8381 | 0.2841 | 0.7736 | 0.7783 | 0.2062 | 0.7946 | 0.8866 | 0.7379 | 0.7853 | 0.6726 | 0.7134 | 0.7817 | 0.836 | | 0.5243 | 23.0 | 11500 | 0.4353 | 0.7183 | 0.9406 | 0.86 | 0.0901 | 0.7236 | 0.8416 | 0.2827 | 0.7615 | 0.767 | 0.1973 | 0.779 | 0.8879 | 0.7338 | 0.7827 | 0.6385 | 0.6845 | 0.7827 | 0.8337 | | 0.5184 | 24.0 | 12000 | 0.4077 | 0.7097 | 0.9464 | 0.854 | 0.1197 | 0.7156 | 0.8335 | 0.2741 | 0.757 | 0.7607 | 0.2553 | 0.7738 | 0.8828 | 0.7126 | 0.7667 | 0.6338 | 0.6784 | 0.7828 | 0.8372 | | 0.4849 | 25.0 | 12500 | 0.4043 | 0.7096 | 0.949 | 0.8366 | 0.1234 | 0.7084 | 0.8412 | 0.2739 | 0.7538 | 0.7611 | 0.258 | 0.7703 | 0.8891 | 0.7483 | 0.7902 | 0.596 | 0.6546 | 0.7843 | 0.8383 | | 0.5022 | 26.0 | 13000 | 0.3884 | 0.7394 | 0.9528 | 0.8847 | 0.1472 | 0.7337 | 0.8473 | 0.2918 | 0.7816 | 0.7876 | 0.2549 | 0.7888 | 0.8971 | 0.7466 | 0.7871 | 0.688 | 0.7402 | 0.7838 | 0.8354 | | 0.521 | 27.0 | 13500 | 0.4197 | 0.7177 | 0.9434 | 0.8715 | 0.132 | 0.7168 | 0.8353 | 0.2879 | 0.7639 | 0.7697 | 0.2623 | 0.7777 | 0.8799 | 0.7073 | 0.7649 | 0.6685 | 0.7165 | 0.7771 | 0.8277 | | 0.5433 | 28.0 | 14000 | 0.3886 | 0.7454 | 0.9508 | 0.8823 | 0.2083 | 0.7406 | 0.8448 | 0.292 | 0.7833 | 0.789 | 0.3064 | 0.7952 | 
0.8845 | 0.7573 | 0.8004 | 0.6941 | 0.733 | 0.785 | 0.8334 | | 0.3889 | 29.0 | 14500 | 0.3713 | 0.7492 | 0.9553 | 0.8998 | 0.2224 | 0.7492 | 0.8468 | 0.2891 | 0.7873 | 0.7936 | 0.3112 | 0.8026 | 0.8921 | 0.7677 | 0.8053 | 0.6849 | 0.7299 | 0.7951 | 0.8455 | | 0.5103 | 30.0 | 15000 | 0.3556 | 0.7584 | 0.9576 | 0.9014 | 0.219 | 0.7509 | 0.8517 | 0.2939 | 0.7954 | 0.8015 | 0.3089 | 0.8078 | 0.8979 | 0.7654 | 0.8022 | 0.7165 | 0.7557 | 0.7934 | 0.8467 | | 0.4458 | 31.0 | 15500 | 0.3681 | 0.7355 | 0.9518 | 0.8831 | 0.1416 | 0.7311 | 0.8569 | 0.292 | 0.7773 | 0.7814 | 0.2358 | 0.7892 | 0.9008 | 0.734 | 0.7742 | 0.6771 | 0.7247 | 0.7955 | 0.8452 | | 0.4369 | 32.0 | 16000 | 0.3523 | 0.7495 | 0.9499 | 0.8877 | 0.142 | 0.7447 | 0.8514 | 0.2935 | 0.7877 | 0.7923 | 0.2589 | 0.8026 | 0.895 | 0.765 | 0.8027 | 0.6903 | 0.7309 | 0.7932 | 0.8432 | | 0.447 | 33.0 | 16500 | 0.3665 | 0.7448 | 0.954 | 0.8912 | 0.1577 | 0.7412 | 0.8505 | 0.2922 | 0.7844 | 0.7879 | 0.2742 | 0.8005 | 0.8929 | 0.7453 | 0.7813 | 0.6933 | 0.7361 | 0.796 | 0.8464 | | 0.4692 | 34.0 | 17000 | 0.3455 | 0.7589 | 0.954 | 0.899 | 0.1729 | 0.7459 | 0.863 | 0.2949 | 0.798 | 0.8038 | 0.3123 | 0.8068 | 0.9033 | 0.7709 | 0.8093 | 0.7058 | 0.7546 | 0.7999 | 0.8476 | | 0.4272 | 35.0 | 17500 | 0.3381 | 0.767 | 0.9568 | 0.903 | 0.1734 | 0.7623 | 0.852 | 0.2948 | 0.802 | 0.8061 | 0.2907 | 0.8136 | 0.8962 | 0.7842 | 0.8204 | 0.7246 | 0.7546 | 0.7922 | 0.8432 | | 0.4021 | 36.0 | 18000 | 0.3323 | 0.7686 | 0.9551 | 0.8938 | 0.1892 | 0.7621 | 0.8616 | 0.2969 | 0.8025 | 0.8067 | 0.3049 | 0.8122 | 0.9025 | 0.7776 | 0.8133 | 0.7245 | 0.7577 | 0.8038 | 0.849 | | 0.4582 | 37.0 | 18500 | 0.3263 | 0.7732 | 0.9547 | 0.9023 | 0.1477 | 0.7748 | 0.8742 | 0.3015 | 0.8092 | 0.8132 | 0.2634 | 0.8251 | 0.9163 | 0.7729 | 0.812 | 0.7298 | 0.7619 | 0.817 | 0.8657 | | 0.3992 | 38.0 | 19000 | 0.3207 | 0.7799 | 0.956 | 0.9064 | 0.1767 | 0.7823 | 0.873 | 0.3014 | 0.8168 | 0.8209 | 0.3028 | 0.8312 | 0.9172 | 0.7833 | 0.8218 | 0.7384 | 0.7732 | 0.818 | 0.8677 | | 0.4286 | 39.0 | 19500 | 0.3194 | 0.7717 | 0.9567 | 0.8986 | 0.1626 | 0.7677 | 0.8779 | 0.3033 | 0.8101 | 0.8139 | 0.2835 | 0.82 | 0.918 | 0.7752 | 0.8204 | 0.7223 | 0.7598 | 0.8175 | 0.8614 | | 0.4488 | 40.0 | 20000 | 0.3184 | 0.7718 | 0.9566 | 0.9047 | 0.1921 | 0.7702 | 0.8809 | 0.2999 | 0.8092 | 0.8141 | 0.3002 | 0.8238 | 0.9192 | 0.7776 | 0.8187 | 0.7156 | 0.7546 | 0.8223 | 0.8689 | | 0.3763 | 41.0 | 20500 | 0.3055 | 0.7876 | 0.956 | 0.9186 | 0.1841 | 0.7824 | 0.8844 | 0.3033 | 0.8207 | 0.8254 | 0.3061 | 0.8339 | 0.9234 | 0.7973 | 0.8356 | 0.7394 | 0.7691 | 0.8262 | 0.8715 | | 0.5658 | 42.0 | 21000 | 0.3014 | 0.791 | 0.9594 | 0.9167 | 0.2095 | 0.7911 | 0.8786 | 0.3032 | 0.8244 | 0.8299 | 0.3403 | 0.8403 | 0.9213 | 0.805 | 0.8404 | 0.7421 | 0.7742 | 0.8259 | 0.8749 | | 0.4322 | 43.0 | 21500 | 0.2974 | 0.7974 | 0.9595 | 0.9169 | 0.1879 | 0.7927 | 0.8951 | 0.3078 | 0.8296 | 0.834 | 0.3085 | 0.8411 | 0.931 | 0.8005 | 0.8369 | 0.7586 | 0.7876 | 0.8331 | 0.8775 | | 0.7057 | 44.0 | 22000 | 0.3092 | 0.7822 | 0.9563 | 0.9171 | 0.1985 | 0.7813 | 0.8688 | 0.3003 | 0.8165 | 0.821 | 0.3663 | 0.8292 | 0.9117 | 0.7941 | 0.8307 | 0.7348 | 0.766 | 0.8177 | 0.8663 | | 0.4096 | 45.0 | 22500 | 0.2991 | 0.7899 | 0.9614 | 0.9121 | 0.2212 | 0.7852 | 0.8747 | 0.3031 | 0.8233 | 0.8286 | 0.3578 | 0.8351 | 0.9142 | 0.8016 | 0.8413 | 0.7502 | 0.7794 | 0.8179 | 0.8651 | | 0.4854 | 46.0 | 23000 | 0.3003 | 0.7815 | 0.9595 | 0.9068 | 0.2042 | 0.7791 | 0.8747 | 0.3016 | 0.8164 | 0.8197 | 0.3258 | 0.8255 | 0.9163 | 0.7816 | 0.8231 | 0.746 | 0.7722 | 
0.8169 | 0.8637 | | 0.4257 | 47.0 | 23500 | 0.2951 | 0.792 | 0.9625 | 0.9172 | 0.2075 | 0.7855 | 0.8802 | 0.3067 | 0.8262 | 0.8309 | 0.3468 | 0.836 | 0.9197 | 0.7961 | 0.8338 | 0.7572 | 0.7907 | 0.8226 | 0.8683 | | 0.4033 | 48.0 | 24000 | 0.2883 | 0.7988 | 0.9632 | 0.9194 | 0.2266 | 0.7984 | 0.8765 | 0.3082 | 0.8343 | 0.8382 | 0.3616 | 0.8477 | 0.9176 | 0.8069 | 0.8458 | 0.7649 | 0.7969 | 0.8246 | 0.872 | | 0.4932 | 49.0 | 24500 | 0.3022 | 0.7844 | 0.9617 | 0.9101 | 0.2231 | 0.7765 | 0.8762 | 0.3007 | 0.8216 | 0.8252 | 0.3396 | 0.8308 | 0.9176 | 0.7882 | 0.8293 | 0.7472 | 0.7794 | 0.8177 | 0.8669 | | 0.3758 | 50.0 | 25000 | 0.2959 | 0.7921 | 0.9609 | 0.9203 | 0.2432 | 0.7853 | 0.8779 | 0.3066 | 0.8273 | 0.8314 | 0.3655 | 0.8365 | 0.9197 | 0.7932 | 0.832 | 0.7619 | 0.7918 | 0.8212 | 0.8703 | | 0.4397 | 51.0 | 25500 | 0.2871 | 0.7983 | 0.9609 | 0.9128 | 0.2145 | 0.7966 | 0.8832 | 0.3086 | 0.833 | 0.8374 | 0.3409 | 0.8478 | 0.9251 | 0.802 | 0.8369 | 0.7648 | 0.7969 | 0.828 | 0.8784 | | 0.3917 | 52.0 | 26000 | 0.2907 | 0.7955 | 0.9645 | 0.9161 | 0.2316 | 0.7911 | 0.8796 | 0.308 | 0.8314 | 0.8352 | 0.375 | 0.8428 | 0.9192 | 0.7975 | 0.8356 | 0.7654 | 0.7969 | 0.8234 | 0.8732 | | 0.3362 | 53.0 | 26500 | 0.2885 | 0.7989 | 0.9644 | 0.92 | 0.2324 | 0.7958 | 0.8789 | 0.3075 | 0.8338 | 0.8379 | 0.3769 | 0.8465 | 0.9201 | 0.8012 | 0.8382 | 0.7703 | 0.8 | 0.8253 | 0.8755 | | 0.4004 | 54.0 | 27000 | 0.2869 | 0.7973 | 0.9644 | 0.9201 | 0.228 | 0.7957 | 0.8813 | 0.3069 | 0.8328 | 0.8368 | 0.3822 | 0.8456 | 0.9218 | 0.801 | 0.8373 | 0.7636 | 0.7948 | 0.8273 | 0.8781 | | 0.406 | 55.0 | 27500 | 0.2871 | 0.8004 | 0.9645 | 0.9194 | 0.2283 | 0.7986 | 0.8788 | 0.3084 | 0.8343 | 0.8384 | 0.3822 | 0.8476 | 0.9205 | 0.8069 | 0.8404 | 0.7679 | 0.7979 | 0.8265 | 0.8769 | | 0.3876 | 56.0 | 28000 | 0.2882 | 0.7985 | 0.9641 | 0.9197 | 0.2257 | 0.7974 | 0.8772 | 0.3084 | 0.834 | 0.838 | 0.3676 | 0.8474 | 0.918 | 0.8072 | 0.8436 | 0.7646 | 0.7969 | 0.8237 | 0.8735 | | 0.3939 | 57.0 | 28500 | 0.2845 | 0.8024 | 0.9645 | 0.9195 | 0.2291 | 0.8014 | 0.8782 | 0.3093 | 0.8367 | 0.8405 | 0.3697 | 0.8511 | 0.9197 | 0.8102 | 0.8431 | 0.7709 | 0.8021 | 0.8262 | 0.8764 | | 0.4218 | 58.0 | 29000 | 0.2852 | 0.8 | 0.9646 | 0.9196 | 0.2254 | 0.7993 | 0.8774 | 0.3085 | 0.8346 | 0.8384 | 0.3634 | 0.8488 | 0.9188 | 0.8067 | 0.8413 | 0.7689 | 0.799 | 0.8245 | 0.8749 | | 0.4046 | 59.0 | 29500 | 0.2851 | 0.8008 | 0.9645 | 0.9196 | 0.2283 | 0.8002 | 0.878 | 0.3087 | 0.835 | 0.8388 | 0.3655 | 0.8497 | 0.9188 | 0.8079 | 0.8418 | 0.7689 | 0.799 | 0.8256 | 0.8758 | | 0.4504 | 60.0 | 30000 | 0.2851 | 0.8003 | 0.9645 | 0.9196 | 0.2246 | 0.7999 | 0.8772 | 0.3086 | 0.8346 | 0.8384 | 0.3614 | 0.8496 | 0.918 | 0.8072 | 0.8413 | 0.7689 | 0.799 | 0.8248 | 0.8749 | ### Framework versions - Transformers 4.46.1 - Pytorch 2.5.0+cu121 - Datasets 2.19.2 - Tokenizers 0.20.1
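The hyperparameters listed above map almost one-to-one onto `transformers.TrainingArguments`. A minimal sketch of that configuration, using only the values stated in this card — the output directory is a placeholder, and everything not listed (evaluation/save strategy, the dataset and object-detection collator wiring, the Trainer itself) is omitted:

```python
from transformers import TrainingArguments

# Values copied from the "Training hyperparameters" section of this card;
# output_dir is a placeholder and eval/save settings are not specified there.
training_args = TrainingArguments(
    output_dir="chickens-60-epoch-1000-images-aug",
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=60,
)
```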
[ "chicken", "duck", "plant" ]
joe611/chickens-composite-02020202020-150-epochs-wo-transform-metrics-test
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chickens-composite-02020202020-150-epochs-wo-transform-metrics-test This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3002 - Map: 0.8045 - Map 50: 0.9501 - Map 75: 0.9065 - Map Small: 0.2866 - Map Medium: 0.7969 - Map Large: 0.7881 - Mar 1: 0.3575 - Mar 10: 0.8442 - Mar 100: 0.8477 - Mar Small: 0.331 - Mar Medium: 0.8432 - Mar Large: 0.8218 - Map Chicken: 0.7828 - Mar 100 Chicken: 0.8381 - Map Duck: 0.756 - Mar 100 Duck: 0.8011 - Map Plant: 0.8746 - Mar 100 Plant: 0.9038 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: cosine - num_epochs: 150 ### Training results | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Chicken | Map Duck | Map Large | Map Medium | Map Plant | Map Small | Mar 1 | Mar 10 | Mar 100 | Mar 100 Chicken | Mar 100 Duck | Mar 100 Plant | Mar Large | Mar Medium | Mar Small | |:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:-----------:|:--------:|:---------:|:----------:|:---------:|:---------:|:------:|:------:|:-------:|:---------------:|:------------:|:-------------:|:---------:|:----------:|:---------:| | 1.5433 | 1.0 | 500 | 1.4439 | 0.1424 | 0.2118 | 0.1634 | 0.0535 | 0.0 | 0.1983 | 0.0664 | 0.3736 | 0.0226 | 0.0883 | 0.2591 | 0.3216 | 0.2573 | 0.0 | 0.7074 | 0.3206 | 0.2916 | 0.0833 | | 1.1879 | 2.0 | 1000 | 1.0695 | 0.2704 | 0.3719 | 0.3153 | 0.186 | 0.0 | 0.3064 | 0.1835 | 0.6253 | 0.0209 | 0.1222 | 0.4049 | 0.4745 | 0.654 | 0.0 | 0.7695 | 0.4804 | 0.4347 | 0.1155 | | 1.2151 | 3.0 | 1500 | 1.0853 | 0.3175 | 0.4534 | 0.3765 | 0.2889 | 0.0 | 0.3149 | 0.269 | 0.6636 | 0.0068 | 0.1255 | 0.4281 | 0.4458 | 0.6088 | 0.0 | 0.7287 | 0.4399 | 0.4158 | 0.0238 | | 0.9586 | 4.0 | 2000 | 1.1220 | 0.3223 | 0.4774 | 0.3739 | 0.3264 | 0.0 | 0.3385 | 0.2769 | 0.6404 | 0.0113 | 0.1239 | 0.4238 | 0.4328 | 0.5962 | 0.0 | 0.7021 | 0.4443 | 0.4021 | 0.0381 | | 0.8535 | 5.0 | 2500 | 0.7944 | 0.3542 | 0.5066 | 0.421 | 0.3793 | 0.0 | 0.3752 | 0.306 | 0.6832 | 0.0336 | 0.1366 | 0.4696 | 0.4791 | 0.705 | 0.0 | 0.7322 | 0.5019 | 0.4416 | 0.0976 | | 0.7929 | 6.0 | 3000 | 0.6761 | 0.3885 | 0.523 | 0.4561 | 0.4218 | 0.0 | 0.4001 | 0.3461 | 0.7438 | 0.0331 | 0.1422 | 0.502 | 0.5106 | 0.7368 | 0.0 | 0.795 | 0.5306 | 0.4765 | 0.0631 | | 0.7237 | 7.0 | 3500 | 0.6439 | 0.4179 | 0.5709 | 0.4995 | 0.5084 | 0.0 | 0.4351 | 0.378 | 0.7453 | 0.1256 | 0.1505 | 0.5 | 0.507 | 0.7238 | 0.0 | 0.797 | 0.5195 | 0.4725 | 0.1667 | | 0.746 | 8.0 | 4000 | 0.6252 | 0.4436 | 0.6016 | 0.5232 | 0.5273 | 0.0559 | 0.436 | 0.4049 | 0.7475 | 0.088 | 0.1617 | 0.5187 | 0.5239 | 0.7188 | 0.0538 | 0.7991 | 0.5176 | 0.4907 | 0.1798 | | 0.6276 | 9.0 | 4500 | 0.6334 | 0.538 | 0.756 | 0.6543 | 0.5593 | 0.3244 | 0.4947 | 0.5153 | 0.7301 | 0.12 | 0.2214 | 0.6007 | 0.6027 | 0.682 | 0.3505 | 0.7754 | 0.547 | 0.5887 | 
0.1905 | | 0.665 | 10.0 | 5000 | 0.5830 | 0.5456 | 0.801 | 0.6166 | 0.547 | 0.3706 | 0.527 | 0.5291 | 0.7192 | 0.1414 | 0.2416 | 0.619 | 0.622 | 0.6715 | 0.4187 | 0.7757 | 0.5859 | 0.6061 | 0.2226 | | 0.6302 | 11.0 | 5500 | 0.5229 | 0.6375 | 0.8566 | 0.7753 | 0.6493 | 0.5039 | 0.6151 | 0.6127 | 0.7594 | 0.1425 | 0.2873 | 0.6888 | 0.6909 | 0.7289 | 0.544 | 0.8 | 0.6521 | 0.6764 | 0.2083 | | 0.4839 | 12.0 | 6000 | 0.5044 | 0.6421 | 0.8715 | 0.7866 | 0.6311 | 0.5108 | 0.6277 | 0.6212 | 0.7844 | 0.1109 | 0.288 | 0.6971 | 0.6994 | 0.7046 | 0.5637 | 0.8299 | 0.6733 | 0.6832 | 0.1679 | | 0.578 | 13.0 | 6500 | 0.4924 | 0.6527 | 0.8762 | 0.7858 | 0.6574 | 0.5284 | 0.6514 | 0.6299 | 0.7723 | 0.0973 | 0.295 | 0.7093 | 0.7119 | 0.7268 | 0.589 | 0.8198 | 0.7014 | 0.6921 | 0.1024 | | 0.4601 | 14.0 | 7000 | 0.4605 | 0.6773 | 0.9022 | 0.8281 | 0.6697 | 0.5782 | 0.675 | 0.6554 | 0.784 | 0.1184 | 0.3064 | 0.7322 | 0.7334 | 0.7322 | 0.6418 | 0.8263 | 0.7172 | 0.7142 | 0.144 | | 0.528 | 15.0 | 7500 | 0.4341 | 0.693 | 0.9056 | 0.84 | 0.6996 | 0.5918 | 0.7243 | 0.6557 | 0.7875 | 0.1591 | 0.3181 | 0.7515 | 0.7541 | 0.7619 | 0.6736 | 0.8266 | 0.7655 | 0.7223 | 0.2298 | | 0.4692 | 16.0 | 8000 | 0.4386 | 0.6883 | 0.9219 | 0.8197 | 0.6887 | 0.5856 | 0.7176 | 0.6483 | 0.7904 | 0.1065 | 0.3127 | 0.7451 | 0.7477 | 0.746 | 0.6681 | 0.829 | 0.7586 | 0.7186 | 0.1607 | | 0.4285 | 17.0 | 8500 | 0.4230 | 0.6912 | 0.916 | 0.8388 | 0.6936 | 0.5983 | 0.702 | 0.664 | 0.7817 | 0.1776 | 0.3171 | 0.7442 | 0.746 | 0.7561 | 0.6637 | 0.8183 | 0.742 | 0.7288 | 0.2167 | | 0.4812 | 18.0 | 9000 | 0.4115 | 0.7061 | 0.9261 | 0.8571 | 0.6799 | 0.6338 | 0.6832 | 0.6887 | 0.8045 | 0.205 | 0.3214 | 0.757 | 0.7591 | 0.7397 | 0.6945 | 0.8429 | 0.7254 | 0.7497 | 0.2488 | | 0.4333 | 19.0 | 9500 | 0.4013 | 0.7119 | 0.9328 | 0.8514 | 0.7044 | 0.6334 | 0.7176 | 0.6895 | 0.798 | 0.1156 | 0.3229 | 0.7644 | 0.7665 | 0.7623 | 0.7 | 0.8373 | 0.7541 | 0.7514 | 0.1881 | | 0.4147 | 20.0 | 10000 | 0.4035 | 0.7087 | 0.9166 | 0.86 | 0.6923 | 0.6323 | 0.7291 | 0.6814 | 0.8017 | 0.2092 | 0.3244 | 0.7598 | 0.7625 | 0.7477 | 0.7 | 0.8399 | 0.7656 | 0.7406 | 0.3119 | | 0.4404 | 21.0 | 10500 | 0.4038 | 0.7024 | 0.9252 | 0.8475 | 0.6986 | 0.6154 | 0.7196 | 0.6702 | 0.7931 | 0.1405 | 0.3222 | 0.7561 | 0.7582 | 0.7598 | 0.6802 | 0.8346 | 0.7569 | 0.7315 | 0.2321 | | 0.4717 | 22.0 | 11000 | 0.3903 | 0.7085 | 0.9194 | 0.8491 | 0.7167 | 0.6029 | 0.6919 | 0.6836 | 0.8059 | 0.1412 | 0.3187 | 0.7584 | 0.7616 | 0.7736 | 0.6681 | 0.8429 | 0.7277 | 0.7483 | 0.2036 | | 0.4235 | 23.0 | 11500 | 0.3880 | 0.724 | 0.9318 | 0.862 | 0.7178 | 0.646 | 0.731 | 0.7008 | 0.8081 | 0.19 | 0.3266 | 0.7723 | 0.7746 | 0.7741 | 0.7055 | 0.8444 | 0.7715 | 0.7558 | 0.2262 | | 0.5037 | 24.0 | 12000 | 0.4196 | 0.7127 | 0.911 | 0.849 | 0.7179 | 0.6236 | 0.7035 | 0.6933 | 0.7966 | 0.2335 | 0.3227 | 0.7598 | 0.7621 | 0.7674 | 0.678 | 0.8408 | 0.7417 | 0.7488 | 0.2524 | | 0.4372 | 25.0 | 12500 | 0.3617 | 0.7404 | 0.935 | 0.8782 | 0.7325 | 0.6685 | 0.7287 | 0.7236 | 0.8204 | 0.178 | 0.3305 | 0.787 | 0.79 | 0.7866 | 0.7286 | 0.8547 | 0.7676 | 0.78 | 0.2429 | | 0.4006 | 26.0 | 13000 | 0.3816 | 0.731 | 0.9129 | 0.8659 | 0.7333 | 0.6392 | 0.7193 | 0.7066 | 0.8204 | 0.2115 | 0.3264 | 0.7762 | 0.7787 | 0.7925 | 0.6912 | 0.8524 | 0.7518 | 0.7619 | 0.2798 | | 0.393 | 27.0 | 13500 | 0.4004 | 0.7101 | 0.9118 | 0.8549 | 0.7183 | 0.6108 | 0.685 | 0.6936 | 0.8013 | 0.2258 | 0.3213 | 0.7609 | 0.7653 | 0.7816 | 0.6747 | 0.8396 | 0.7235 | 0.7511 | 0.3643 | | 0.3427 | 28.0 | 14000 | 0.3796 | 0.7374 | 0.9264 | 0.8842 | 
0.7293 | 0.6712 | 0.7415 | 0.7129 | 0.8116 | 0.1027 | 0.3332 | 0.7831 | 0.7878 | 0.7845 | 0.7275 | 0.8515 | 0.7783 | 0.7707 | 0.2333 | | 0.3568 | 29.0 | 14500 | 0.3838 | 0.7398 | 0.9342 | 0.8798 | 0.7243 | 0.677 | 0.7407 | 0.7179 | 0.8179 | 0.1837 | 0.3315 | 0.7839 | 0.7872 | 0.7808 | 0.7286 | 0.8524 | 0.7724 | 0.7774 | 0.2726 | | 0.3688 | 30.0 | 15000 | 0.3541 | 0.7449 | 0.9338 | 0.8753 | 0.7283 | 0.667 | 0.7467 | 0.7257 | 0.8394 | 0.2352 | 0.3356 | 0.7976 | 0.8011 | 0.7921 | 0.7385 | 0.8728 | 0.7891 | 0.789 | 0.3524 | | 0.3239 | 31.0 | 15500 | 0.3797 | 0.7264 | 0.9278 | 0.8572 | 0.7022 | 0.6517 | 0.6941 | 0.7194 | 0.8253 | 0.1943 | 0.3267 | 0.7758 | 0.7787 | 0.7628 | 0.7143 | 0.8592 | 0.7362 | 0.7788 | 0.2774 | | 0.3569 | 32.0 | 16000 | 0.3646 | 0.7503 | 0.9375 | 0.8733 | 0.7383 | 0.6742 | 0.7447 | 0.7358 | 0.8384 | 0.2461 | 0.3367 | 0.8003 | 0.804 | 0.7937 | 0.7462 | 0.8722 | 0.7862 | 0.7932 | 0.3369 | | 0.3728 | 33.0 | 16500 | 0.3427 | 0.7504 | 0.9246 | 0.8818 | 0.7421 | 0.6712 | 0.7587 | 0.7338 | 0.8377 | 0.1733 | 0.3429 | 0.7966 | 0.8003 | 0.7975 | 0.733 | 0.8704 | 0.7875 | 0.7916 | 0.3595 | | 0.3339 | 34.0 | 17000 | 0.3636 | 0.7424 | 0.9326 | 0.8767 | 0.7235 | 0.6793 | 0.7492 | 0.7216 | 0.8245 | 0.2328 | 0.3354 | 0.7919 | 0.7963 | 0.782 | 0.7451 | 0.8618 | 0.79 | 0.7822 | 0.3512 | | 0.3418 | 35.0 | 17500 | 0.3566 | 0.7392 | 0.937 | 0.8777 | 0.7253 | 0.6703 | 0.7442 | 0.7165 | 0.8221 | 0.1854 | 0.3313 | 0.7894 | 0.7939 | 0.7854 | 0.7319 | 0.8645 | 0.7897 | 0.7772 | 0.306 | | 0.3315 | 36.0 | 18000 | 0.3576 | 0.7394 | 0.9315 | 0.89 | 0.7439 | 0.6524 | 0.7427 | 0.7183 | 0.822 | 0.304 | 0.3336 | 0.7905 | 0.7937 | 0.7979 | 0.7209 | 0.8624 | 0.784 | 0.7777 | 0.3857 | | 0.3434 | 37.0 | 18500 | 0.3463 | 0.75 | 0.9429 | 0.8802 | 0.7275 | 0.6974 | 0.7521 | 0.7308 | 0.8252 | 0.2276 | 0.3424 | 0.8005 | 0.8032 | 0.7841 | 0.7615 | 0.8639 | 0.7914 | 0.7873 | 0.35 | | 0.3343 | 38.0 | 19000 | 0.3435 | 0.742 | 0.9221 | 0.8714 | 0.752 | 0.6496 | 0.7322 | 0.7306 | 0.8245 | 0.2932 | 0.3369 | 0.7918 | 0.7963 | 0.805 | 0.7198 | 0.8642 | 0.7745 | 0.7896 | 0.3631 | | 0.3207 | 39.0 | 19500 | 0.3481 | 0.744 | 0.9242 | 0.8608 | 0.732 | 0.665 | 0.7186 | 0.7348 | 0.835 | 0.1829 | 0.3371 | 0.7979 | 0.8003 | 0.7962 | 0.733 | 0.8716 | 0.7629 | 0.7952 | 0.2417 | | 0.346 | 40.0 | 20000 | 0.3577 | 0.7471 | 0.9247 | 0.8751 | 0.756 | 0.6544 | 0.7399 | 0.7354 | 0.8309 | 0.3008 | 0.34 | 0.7975 | 0.7997 | 0.8071 | 0.7253 | 0.8666 | 0.7823 | 0.7898 | 0.3429 | | 0.3121 | 41.0 | 20500 | 0.3381 | 0.7548 | 0.9274 | 0.8829 | 0.7508 | 0.6698 | 0.7399 | 0.7395 | 0.8438 | 0.2959 | 0.3404 | 0.8018 | 0.8056 | 0.8071 | 0.7308 | 0.879 | 0.7783 | 0.7937 | 0.3964 | | 0.3101 | 42.0 | 21000 | 0.3581 | 0.7479 | 0.9365 | 0.8895 | 0.7421 | 0.6693 | 0.7475 | 0.728 | 0.8323 | 0.286 | 0.3371 | 0.7952 | 0.7974 | 0.7912 | 0.7319 | 0.8692 | 0.7871 | 0.7837 | 0.3286 | | 0.3322 | 43.0 | 21500 | 0.3496 | 0.7558 | 0.9344 | 0.8773 | 0.7651 | 0.6714 | 0.7579 | 0.7372 | 0.8309 | 0.2599 | 0.3381 | 0.8043 | 0.8087 | 0.8151 | 0.7407 | 0.8704 | 0.801 | 0.7949 | 0.3429 | | 0.4227 | 44.0 | 22000 | 0.3609 | 0.7485 | 0.9355 | 0.8857 | 0.7403 | 0.6948 | 0.7519 | 0.7351 | 0.8103 | 0.3153 | 0.3399 | 0.7983 | 0.8024 | 0.7958 | 0.7538 | 0.8574 | 0.7955 | 0.7938 | 0.3869 | | 0.3777 | 45.0 | 22500 | 0.3806 | 0.73 | 0.9205 | 0.8762 | 0.7211 | 0.6575 | 0.6998 | 0.7302 | 0.8115 | 0.2667 | 0.3261 | 0.7802 | 0.7837 | 0.7824 | 0.711 | 0.8577 | 0.7479 | 0.7813 | 0.3548 | | 0.3513 | 46.0 | 23000 | 0.3688 | 0.7398 | 0.9301 | 0.8685 | 0.7349 | 0.6666 | 0.7436 | 0.7279 | 0.8181 | 
0.1918 | 0.3361 | 0.7903 | 0.7937 | 0.79 | 0.7297 | 0.8615 | 0.7845 | 0.7853 | 0.275 | | 0.3247 | 47.0 | 23500 | 0.3789 | 0.7222 | 0.9251 | 0.8701 | 0.704 | 0.628 | 0.7144 | 0.7112 | 0.8346 | 0.2434 | 0.3256 | 0.7734 | 0.7766 | 0.7674 | 0.6901 | 0.8722 | 0.7583 | 0.7641 | 0.3143 | | 0.3788 | 48.0 | 24000 | 0.3830 | 0.7272 | 0.9288 | 0.8678 | 0.7065 | 0.6535 | 0.7169 | 0.7163 | 0.8216 | 0.2011 | 0.3296 | 0.7821 | 0.7845 | 0.772 | 0.7198 | 0.8618 | 0.7702 | 0.7744 | 0.2286 | | 0.2994 | 49.0 | 24500 | 0.3435 | 0.758 | 0.9418 | 0.8827 | 0.7422 | 0.7014 | 0.7461 | 0.7403 | 0.8303 | 0.2473 | 0.3404 | 0.8034 | 0.8072 | 0.7921 | 0.756 | 0.8734 | 0.7855 | 0.7984 | 0.3 | | 0.3535 | 50.0 | 25000 | 0.3532 | 0.7484 | 0.9415 | 0.8746 | 0.727 | 0.6762 | 0.7407 | 0.7321 | 0.8419 | 0.284 | 0.3354 | 0.7983 | 0.8025 | 0.7916 | 0.733 | 0.8828 | 0.7821 | 0.7932 | 0.3464 | | 0.2641 | 51.0 | 25500 | 0.3483 | 0.7608 | 0.937 | 0.8809 | 0.7508 | 0.69 | 0.7486 | 0.7497 | 0.8414 | 0.2486 | 0.3446 | 0.8062 | 0.8096 | 0.8013 | 0.7484 | 0.8793 | 0.7924 | 0.8036 | 0.2917 | | 0.3154 | 52.0 | 26000 | 0.3635 | 0.7383 | 0.9185 | 0.88 | 0.7406 | 0.6416 | 0.6931 | 0.7368 | 0.8328 | 0.287 | 0.3288 | 0.7815 | 0.7853 | 0.8017 | 0.6846 | 0.8695 | 0.7296 | 0.7842 | 0.3393 | | 0.2867 | 53.0 | 26500 | 0.3616 | 0.7577 | 0.9422 | 0.8912 | 0.7418 | 0.7055 | 0.7468 | 0.748 | 0.8259 | 0.2801 | 0.3396 | 0.8042 | 0.8075 | 0.7962 | 0.7615 | 0.8648 | 0.7955 | 0.8036 | 0.3095 | | 0.2801 | 54.0 | 27000 | 0.3426 | 0.7598 | 0.9402 | 0.8879 | 0.7551 | 0.6939 | 0.7498 | 0.7393 | 0.8303 | 0.1811 | 0.3391 | 0.8044 | 0.8078 | 0.8142 | 0.7418 | 0.8675 | 0.7968 | 0.7923 | 0.2976 | | 0.3072 | 55.0 | 27500 | 0.3435 | 0.7648 | 0.9431 | 0.8925 | 0.7451 | 0.7146 | 0.7656 | 0.7385 | 0.8348 | 0.264 | 0.3409 | 0.8102 | 0.814 | 0.8054 | 0.7648 | 0.8716 | 0.8123 | 0.7969 | 0.3476 | | 0.3089 | 56.0 | 28000 | 0.3334 | 0.7728 | 0.9385 | 0.9089 | 0.7564 | 0.7163 | 0.7751 | 0.76 | 0.8457 | 0.2881 | 0.3457 | 0.8178 | 0.8226 | 0.8075 | 0.7736 | 0.8867 | 0.8172 | 0.8157 | 0.3369 | | 0.3144 | 57.0 | 28500 | 0.3425 | 0.769 | 0.9404 | 0.9009 | 0.7671 | 0.6985 | 0.7677 | 0.75 | 0.8415 | 0.2612 | 0.344 | 0.8157 | 0.8205 | 0.8264 | 0.7549 | 0.8802 | 0.8177 | 0.8026 | 0.369 | | 0.2709 | 58.0 | 29000 | 0.3556 | 0.7552 | 0.9445 | 0.8971 | 0.7262 | 0.6976 | 0.7494 | 0.7374 | 0.8418 | 0.2443 | 0.3349 | 0.8056 | 0.8088 | 0.7858 | 0.7593 | 0.8814 | 0.7994 | 0.7936 | 0.344 | | 0.3058 | 59.0 | 29500 | 0.3646 | 0.7534 | 0.9373 | 0.9012 | 0.7405 | 0.6912 | 0.7465 | 0.7287 | 0.8286 | 0.243 | 0.3353 | 0.7986 | 0.8025 | 0.7962 | 0.7484 | 0.863 | 0.7877 | 0.7821 | 0.3452 | | 0.2852 | 60.0 | 30000 | 0.3423 | 0.763 | 0.9314 | 0.8899 | 0.7611 | 0.6874 | 0.734 | 0.7494 | 0.8406 | 0.2066 | 0.3417 | 0.8097 | 0.813 | 0.8155 | 0.7462 | 0.8775 | 0.7837 | 0.8021 | 0.2952 | | 0.2663 | 61.0 | 30500 | 0.3398 | 0.7715 | 0.9415 | 0.904 | 0.7526 | 0.7158 | 0.7485 | 0.7596 | 0.8462 | 0.2175 | 0.3458 | 0.8176 | 0.821 | 0.8008 | 0.7813 | 0.8808 | 0.7933 | 0.8135 | 0.2405 | | 0.2902 | 62.0 | 31000 | 0.3307 | 0.758 | 0.9354 | 0.875 | 0.7626 | 0.6731 | 0.7427 | 0.7483 | 0.8383 | 0.2193 | 0.3405 | 0.808 | 0.8112 | 0.8146 | 0.7429 | 0.876 | 0.7866 | 0.8013 | 0.3119 | | 0.2774 | 63.0 | 31500 | 0.3594 | 0.7562 | 0.9397 | 0.8708 | 0.2627 | 0.7365 | 0.7318 | 0.3427 | 0.8048 | 0.8073 | 0.3464 | 0.7941 | 0.7703 | 0.7404 | 0.7921 | 0.6881 | 0.7582 | 0.8401 | 0.8716 | | 0.3029 | 64.0 | 32000 | 0.3285 | 0.7706 | 0.937 | 0.8835 | 0.2552 | 0.7619 | 0.7267 | 0.347 | 0.8175 | 0.8207 | 0.3881 | 0.8175 | 0.7665 | 0.752 | 0.8088 | 
0.7099 | 0.7714 | 0.8498 | 0.882 | | 0.2941 | 65.0 | 32500 | 0.3317 | 0.7721 | 0.9444 | 0.8974 | 0.1749 | 0.7553 | 0.7631 | 0.3404 | 0.8166 | 0.8199 | 0.269 | 0.8065 | 0.8085 | 0.7593 | 0.8117 | 0.7099 | 0.7659 | 0.8471 | 0.882 | | 0.2601 | 66.0 | 33000 | 0.3450 | 0.7614 | 0.9386 | 0.8854 | 0.2197 | 0.7406 | 0.7648 | 0.3348 | 0.8035 | 0.8075 | 0.35 | 0.7943 | 0.8021 | 0.7452 | 0.7954 | 0.6823 | 0.7374 | 0.8568 | 0.8896 | | 0.2925 | 67.0 | 33500 | 0.3345 | 0.7721 | 0.945 | 0.8936 | 0.2889 | 0.7554 | 0.7532 | 0.3437 | 0.8183 | 0.8211 | 0.394 | 0.8098 | 0.7926 | 0.758 | 0.8096 | 0.7008 | 0.7648 | 0.8575 | 0.8888 | | 0.2762 | 68.0 | 34000 | 0.3585 | 0.7639 | 0.9246 | 0.8785 | 0.2512 | 0.7592 | 0.7207 | 0.337 | 0.8061 | 0.809 | 0.3929 | 0.8053 | 0.7558 | 0.7621 | 0.8163 | 0.6877 | 0.7319 | 0.842 | 0.8787 | | 0.2823 | 69.0 | 34500 | 0.3440 | 0.7706 | 0.9415 | 0.8968 | 0.162 | 0.7548 | 0.7594 | 0.3429 | 0.8109 | 0.8137 | 0.2643 | 0.799 | 0.7994 | 0.7557 | 0.8096 | 0.719 | 0.7593 | 0.8371 | 0.8722 | | 0.2657 | 70.0 | 35000 | 0.3297 | 0.7716 | 0.9439 | 0.8877 | 0.2672 | 0.7555 | 0.7621 | 0.3425 | 0.8175 | 0.8211 | 0.3488 | 0.8099 | 0.8005 | 0.7447 | 0.8059 | 0.7217 | 0.7725 | 0.8485 | 0.8849 | | 0.275 | 71.0 | 35500 | 0.3464 | 0.7689 | 0.9432 | 0.8797 | 0.2713 | 0.7449 | 0.7473 | 0.346 | 0.8132 | 0.816 | 0.381 | 0.7953 | 0.7936 | 0.7501 | 0.805 | 0.7128 | 0.7637 | 0.8439 | 0.8793 | | 0.2532 | 72.0 | 36000 | 0.3338 | 0.7744 | 0.939 | 0.9003 | 0.2674 | 0.7585 | 0.7502 | 0.3421 | 0.8151 | 0.8185 | 0.3464 | 0.8085 | 0.7851 | 0.7536 | 0.8092 | 0.7174 | 0.7604 | 0.8521 | 0.8858 | | 0.2514 | 73.0 | 36500 | 0.3411 | 0.7737 | 0.9365 | 0.8919 | 0.3184 | 0.7564 | 0.7582 | 0.3439 | 0.8162 | 0.8186 | 0.3655 | 0.8063 | 0.7953 | 0.7591 | 0.8088 | 0.7153 | 0.7648 | 0.8468 | 0.8822 | | 0.2557 | 74.0 | 37000 | 0.3308 | 0.7802 | 0.943 | 0.9052 | 0.3367 | 0.7644 | 0.7674 | 0.3438 | 0.8227 | 0.8254 | 0.4202 | 0.8139 | 0.7999 | 0.7655 | 0.8167 | 0.7139 | 0.767 | 0.8611 | 0.8923 | | 0.2419 | 75.0 | 37500 | 0.3403 | 0.7748 | 0.9436 | 0.9159 | 0.1904 | 0.7585 | 0.7727 | 0.3431 | 0.8156 | 0.8192 | 0.2679 | 0.808 | 0.8052 | 0.7488 | 0.7958 | 0.7232 | 0.778 | 0.8524 | 0.8837 | | 0.2654 | 76.0 | 38000 | 0.3485 | 0.7548 | 0.9316 | 0.8797 | 0.3081 | 0.7413 | 0.7389 | 0.338 | 0.8015 | 0.8047 | 0.3571 | 0.7968 | 0.7757 | 0.7521 | 0.8059 | 0.6702 | 0.7308 | 0.842 | 0.8775 | | 0.24 | 77.0 | 38500 | 0.3355 | 0.7754 | 0.9427 | 0.9017 | 0.2493 | 0.7592 | 0.7667 | 0.3458 | 0.8198 | 0.8223 | 0.2917 | 0.8112 | 0.805 | 0.7587 | 0.8159 | 0.712 | 0.7637 | 0.8554 | 0.8873 | | 0.2492 | 78.0 | 39000 | 0.3213 | 0.7925 | 0.9496 | 0.902 | 0.3124 | 0.7782 | 0.7797 | 0.3509 | 0.834 | 0.837 | 0.3774 | 0.8234 | 0.8223 | 0.7833 | 0.8364 | 0.7372 | 0.7835 | 0.857 | 0.8911 | | 0.2693 | 79.0 | 39500 | 0.3484 | 0.7627 | 0.9349 | 0.8789 | 0.2276 | 0.7386 | 0.7632 | 0.3439 | 0.8091 | 0.8122 | 0.2833 | 0.7908 | 0.8011 | 0.7534 | 0.8067 | 0.6983 | 0.756 | 0.8363 | 0.874 | | 0.2602 | 80.0 | 40000 | 0.3340 | 0.7738 | 0.9274 | 0.8749 | 0.3028 | 0.7676 | 0.7461 | 0.3472 | 0.8159 | 0.8187 | 0.3238 | 0.8149 | 0.7834 | 0.7715 | 0.8234 | 0.6925 | 0.7429 | 0.8575 | 0.8899 | | 0.2469 | 81.0 | 40500 | 0.3349 | 0.7796 | 0.9434 | 0.8844 | 0.2177 | 0.7655 | 0.7734 | 0.3472 | 0.823 | 0.8263 | 0.2893 | 0.8146 | 0.8172 | 0.7643 | 0.8151 | 0.7241 | 0.778 | 0.8506 | 0.8858 | | 0.2291 | 82.0 | 41000 | 0.3348 | 0.7788 | 0.9477 | 0.9016 | 0.2208 | 0.7647 | 0.7692 | 0.3446 | 0.819 | 0.8231 | 0.3131 | 0.8111 | 0.8118 | 0.7494 | 0.8021 | 0.7289 | 0.778 | 0.858 | 0.8891 | | 0.2422 | 
83.0 | 41500 | 0.3250 | 0.7828 | 0.9479 | 0.9009 | 0.2327 | 0.7647 | 0.7781 | 0.3473 | 0.8234 | 0.8266 | 0.3036 | 0.8131 | 0.8146 | 0.7638 | 0.8142 | 0.7259 | 0.7758 | 0.8587 | 0.8896 |
| 0.2722 | 84.0 | 42000 | 0.3219 | 0.78 | 0.9447 | 0.8887 | 0.2144 | 0.7704 | 0.7704 | 0.3515 | 0.8282 | 0.8313 | 0.2857 | 0.8232 | 0.8117 | 0.7587 | 0.8163 | 0.7308 | 0.7901 | 0.8506 | 0.8876 |
| 0.2537 | 85.0 | 42500 | 0.3292 | 0.7824 | 0.9378 | 0.8982 | 0.2416 | 0.7778 | 0.7604 | 0.343 | 0.8207 | 0.8237 | 0.2512 | 0.8206 | 0.7996 | 0.7726 | 0.8197 | 0.7191 | 0.7604 | 0.8555 | 0.8911 |
| 0.2408 | 86.0 | 43000 | 0.3392 | 0.7767 | 0.9366 | 0.8829 | 0.2707 | 0.7639 | 0.7566 | 0.3443 | 0.8161 | 0.8187 | 0.306 | 0.8103 | 0.7921 | 0.7685 | 0.8192 | 0.7064 | 0.7484 | 0.8552 | 0.8885 |
| 0.259 | 87.0 | 43500 | 0.3327 | 0.773 | 0.9418 | 0.8915 | 0.2571 | 0.7616 | 0.7594 | 0.3426 | 0.8189 | 0.8212 | 0.2774 | 0.8115 | 0.8076 | 0.755 | 0.8063 | 0.7167 | 0.7725 | 0.8473 | 0.8849 |
| 0.2155 | 88.0 | 44000 | 0.3365 | 0.7792 | 0.9447 | 0.9044 | 0.2575 | 0.7667 | 0.765 | 0.3468 | 0.823 | 0.8264 | 0.2869 | 0.8171 | 0.8023 | 0.7646 | 0.8163 | 0.7252 | 0.7791 | 0.8478 | 0.8837 |
| 0.2539 | 89.0 | 44500 | 0.3322 | 0.7795 | 0.9443 | 0.8923 | 0.2167 | 0.7628 | 0.7713 | 0.3439 | 0.8187 | 0.8227 | 0.2774 | 0.8106 | 0.8071 | 0.766 | 0.8176 | 0.7228 | 0.7659 | 0.8498 | 0.8846 |
| 0.2301 | 90.0 | 45000 | 0.3137 | 0.7858 | 0.9445 | 0.8961 | 0.228 | 0.7676 | 0.7728 | 0.3507 | 0.8259 | 0.8291 | 0.2643 | 0.8163 | 0.8119 | 0.7758 | 0.8255 | 0.7313 | 0.7769 | 0.8504 | 0.8849 |
| 0.2377 | 91.0 | 45500 | 0.3259 | 0.7816 | 0.9501 | 0.8876 | 0.2237 | 0.7613 | 0.7764 | 0.3477 | 0.8224 | 0.8249 | 0.2607 | 0.8101 | 0.812 | 0.7717 | 0.8184 | 0.7201 | 0.7703 | 0.8529 | 0.8861 |
| 0.2499 | 92.0 | 46000 | 0.3180 | 0.784 | 0.9507 | 0.8973 | 0.2433 | 0.7724 | 0.7663 | 0.348 | 0.8237 | 0.827 | 0.325 | 0.8184 | 0.8062 | 0.7753 | 0.8251 | 0.7221 | 0.7692 | 0.8545 | 0.8867 |
| 0.2575 | 93.0 | 46500 | 0.3056 | 0.7919 | 0.9452 | 0.8903 | 0.319 | 0.7783 | 0.7738 | 0.3519 | 0.8296 | 0.8332 | 0.344 | 0.8283 | 0.8055 | 0.7836 | 0.8347 | 0.7279 | 0.7692 | 0.8644 | 0.8956 |
| 0.2564 | 94.0 | 47000 | 0.3321 | 0.7836 | 0.9456 | 0.897 | 0.3027 | 0.7643 | 0.7708 | 0.3478 | 0.8197 | 0.8235 | 0.3512 | 0.8073 | 0.8092 | 0.7711 | 0.8213 | 0.7291 | 0.767 | 0.8504 | 0.8822 |
| 0.2389 | 95.0 | 47500 | 0.3317 | 0.7817 | 0.9396 | 0.8882 | 0.3276 | 0.7653 | 0.7544 | 0.3449 | 0.8198 | 0.8221 | 0.3857 | 0.8102 | 0.787 | 0.7735 | 0.8276 | 0.7093 | 0.7484 | 0.8624 | 0.8902 |
| 0.239 | 96.0 | 48000 | 0.3265 | 0.7897 | 0.945 | 0.894 | 0.2819 | 0.7779 | 0.7744 | 0.3508 | 0.8312 | 0.8336 | 0.331 | 0.8237 | 0.8109 | 0.7817 | 0.841 | 0.7289 | 0.7714 | 0.8587 | 0.8885 |
| 0.2333 | 97.0 | 48500 | 0.3178 | 0.7886 | 0.9474 | 0.8944 | 0.2981 | 0.772 | 0.7827 | 0.3514 | 0.83 | 0.8324 | 0.3464 | 0.8189 | 0.8183 | 0.7863 | 0.8364 | 0.7213 | 0.7725 | 0.8582 | 0.8882 |
| 0.2367 | 98.0 | 49000 | 0.2983 | 0.7946 | 0.954 | 0.8986 | 0.2573 | 0.7826 | 0.7832 | 0.3495 | 0.8364 | 0.8417 | 0.3655 | 0.8321 | 0.827 | 0.7863 | 0.8431 | 0.7349 | 0.7879 | 0.8626 | 0.8941 |
| 0.272 | 99.0 | 49500 | 0.3156 | 0.795 | 0.9501 | 0.9015 | 0.305 | 0.7821 | 0.7845 | 0.3508 | 0.8372 | 0.8402 | 0.3548 | 0.8302 | 0.8235 | 0.7865 | 0.8356 | 0.7279 | 0.7879 | 0.8706 | 0.897 |
| 0.233 | 100.0 | 50000 | 0.3199 | 0.7925 | 0.946 | 0.9048 | 0.2558 | 0.7794 | 0.7849 | 0.3536 | 0.8355 | 0.8378 | 0.2821 | 0.8273 | 0.8209 | 0.7872 | 0.8339 | 0.729 | 0.7879 | 0.8613 | 0.8917 |
| 0.2121 | 101.0 | 50500 | 0.3041 | 0.797 | 0.9526 | 0.9011 | 0.2896 | 0.7875 | 0.7864 | 0.3543 | 0.8381 | 0.8418 | 0.3429 | 0.8355 | 0.8275 | 0.785 | 0.8356 | 0.7351 | 0.7912 | 0.871 | 0.8985 |
| 0.2329 | 102.0 | 51000 | 0.3170 | 0.794 | 0.9534 | 0.9099 | 0.2068 | 0.7798 | 0.7951 | 0.3558 | 0.8361 | 0.8392 | 0.2833 | 0.8291 | 0.8331 | 0.777 | 0.8251 | 0.7501 | 0.8055 | 0.855 | 0.887 |
| 0.2267 | 103.0 | 51500 | 0.2948 | 0.8011 | 0.9545 | 0.9075 | 0.3122 | 0.789 | 0.7858 | 0.3536 | 0.8408 | 0.8445 | 0.3655 | 0.8362 | 0.8217 | 0.7868 | 0.8368 | 0.7464 | 0.7978 | 0.8701 | 0.8988 |
| 0.2608 | 104.0 | 52000 | 0.2983 | 0.7986 | 0.9538 | 0.8986 | 0.2755 | 0.7841 | 0.7908 | 0.3541 | 0.8403 | 0.8438 | 0.3619 | 0.8345 | 0.8293 | 0.7869 | 0.8389 | 0.7373 | 0.7923 | 0.8717 | 0.9003 |
| 0.2311 | 105.0 | 52500 | 0.3141 | 0.7971 | 0.9544 | 0.9108 | 0.2403 | 0.7878 | 0.7848 | 0.3549 | 0.8377 | 0.8418 | 0.3369 | 0.8353 | 0.8212 | 0.7887 | 0.8406 | 0.7411 | 0.7912 | 0.8614 | 0.8935 |
| 0.216 | 106.0 | 53000 | 0.3137 | 0.7985 | 0.9497 | 0.8872 | 0.2314 | 0.7861 | 0.7817 | 0.3553 | 0.8362 | 0.8404 | 0.3107 | 0.8347 | 0.8151 | 0.7861 | 0.8351 | 0.7397 | 0.789 | 0.8698 | 0.897 |
| 0.2478 | 107.0 | 53500 | 0.3097 | 0.7924 | 0.9494 | 0.9009 | 0.2116 | 0.7826 | 0.7808 | 0.3523 | 0.8333 | 0.8364 | 0.2833 | 0.8307 | 0.8168 | 0.7786 | 0.8285 | 0.7407 | 0.7923 | 0.8579 | 0.8885 |
| 0.2137 | 108.0 | 54000 | 0.3094 | 0.7924 | 0.9499 | 0.9027 | 0.229 | 0.7769 | 0.7855 | 0.3519 | 0.8341 | 0.8378 | 0.3238 | 0.8283 | 0.8194 | 0.7807 | 0.8335 | 0.7331 | 0.7879 | 0.8635 | 0.892 |
| 0.2181 | 109.0 | 54500 | 0.3044 | 0.7997 | 0.949 | 0.8936 | 0.3269 | 0.787 | 0.7881 | 0.354 | 0.8434 | 0.8469 | 0.3929 | 0.8394 | 0.825 | 0.791 | 0.8435 | 0.739 | 0.7967 | 0.869 | 0.9006 |
| 0.2231 | 110.0 | 55000 | 0.3186 | 0.7905 | 0.9498 | 0.9056 | 0.3006 | 0.7712 | 0.7822 | 0.3496 | 0.831 | 0.834 | 0.3429 | 0.821 | 0.8164 | 0.7778 | 0.8293 | 0.7237 | 0.7769 | 0.8701 | 0.8959 |
| 0.2268 | 111.0 | 55500 | 0.3018 | 0.7964 | 0.9524 | 0.9052 | 0.338 | 0.7837 | 0.7824 | 0.3531 | 0.8368 | 0.8404 | 0.3893 | 0.8346 | 0.8176 | 0.7814 | 0.8314 | 0.7392 | 0.7901 | 0.8685 | 0.8997 |
| 0.2079 | 112.0 | 56000 | 0.3093 | 0.7974 | 0.9467 | 0.9021 | 0.3048 | 0.7865 | 0.7822 | 0.3552 | 0.8386 | 0.8416 | 0.3429 | 0.8345 | 0.8232 | 0.7767 | 0.8301 | 0.7509 | 0.7989 | 0.8645 | 0.8959 |
| 0.2291 | 113.0 | 56500 | 0.3160 | 0.794 | 0.9489 | 0.9069 | 0.3151 | 0.7824 | 0.7782 | 0.3526 | 0.8357 | 0.8391 | 0.3512 | 0.8317 | 0.8139 | 0.7775 | 0.8331 | 0.74 | 0.789 | 0.8646 | 0.8953 |
| 0.2563 | 114.0 | 57000 | 0.3148 | 0.7959 | 0.949 | 0.9059 | 0.3205 | 0.7849 | 0.782 | 0.3528 | 0.837 | 0.84 | 0.3452 | 0.8335 | 0.8163 | 0.7848 | 0.8368 | 0.7379 | 0.7879 | 0.865 | 0.8953 |
| 0.1982 | 115.0 | 57500 | 0.3036 | 0.7948 | 0.9467 | 0.9025 | 0.2714 | 0.7812 | 0.7866 | 0.3523 | 0.836 | 0.8389 | 0.3298 | 0.8305 | 0.822 | 0.777 | 0.8305 | 0.743 | 0.7901 | 0.8644 | 0.8962 |
| 0.214 | 116.0 | 58000 | 0.3049 | 0.7979 | 0.9462 | 0.8906 | 0.2825 | 0.7879 | 0.7893 | 0.3548 | 0.84 | 0.8433 | 0.3298 | 0.8379 | 0.8263 | 0.7807 | 0.8347 | 0.7414 | 0.7934 | 0.8715 | 0.9018 |
| 0.225 | 117.0 | 58500 | 0.3092 | 0.7976 | 0.9504 | 0.9051 | 0.3303 | 0.7885 | 0.7864 | 0.3544 | 0.8382 | 0.8415 | 0.3679 | 0.8354 | 0.8202 | 0.777 | 0.8314 | 0.746 | 0.7934 | 0.8697 | 0.8997 |
| 0.2304 | 118.0 | 59000 | 0.3006 | 0.7997 | 0.9478 | 0.9027 | 0.2938 | 0.7904 | 0.7889 | 0.3551 | 0.8405 | 0.8442 | 0.3405 | 0.8365 | 0.8264 | 0.7762 | 0.8331 | 0.7514 | 0.8 | 0.8717 | 0.8994 |
| 0.2193 | 119.0 | 59500 | 0.3021 | 0.8 | 0.9467 | 0.911 | 0.2867 | 0.7901 | 0.7883 | 0.353 | 0.8405 | 0.8441 | 0.331 | 0.8392 | 0.8243 | 0.7783 | 0.8335 | 0.7423 | 0.7934 | 0.8793 | 0.9053 |
| 0.2032 | 120.0 | 60000 | 0.3083 | 0.8018 | 0.9474 | 0.9037 | 0.2596 | 0.7909 | 0.7871 | 0.3563 | 0.8413 | 0.8443 | 0.3048 | 0.837 | 0.821 | 0.7796 | 0.8322 | 0.7512 | 0.7989 | 0.8748 | 0.9018 |
| 0.2222 | 121.0 | 60500 | 0.3032 | 0.8024 | 0.9467 | 0.8945 | 0.3091 | 0.7964 | 0.786 | 0.3562 | 0.8414 | 0.8448 | 0.3381 | 0.8414 | 0.8227 | 0.7771 | 0.831 | 0.7545 | 0.8 | 0.8757 | 0.9036 |
| 0.2057 | 122.0 | 61000 | 0.3038 | 0.8016 | 0.9488 | 0.9039 | 0.3412 | 0.7929 | 0.7844 | 0.3554 | 0.8418 | 0.8457 | 0.4036 | 0.8409 | 0.8216 | 0.7839 | 0.8364 | 0.7474 | 0.7967 | 0.8735 | 0.9038 |
| 0.2014 | 123.0 | 61500 | 0.3099 | 0.8006 | 0.9466 | 0.8905 | 0.3145 | 0.7875 | 0.7903 | 0.3561 | 0.8413 | 0.8441 | 0.3512 | 0.8372 | 0.8247 | 0.7829 | 0.8343 | 0.7454 | 0.7967 | 0.8734 | 0.9012 |
| 0.2052 | 124.0 | 62000 | 0.3078 | 0.8017 | 0.9467 | 0.9002 | 0.3081 | 0.7905 | 0.7905 | 0.3572 | 0.8429 | 0.8456 | 0.3464 | 0.839 | 0.8242 | 0.7816 | 0.8356 | 0.7525 | 0.8022 | 0.871 | 0.8991 |
| 0.2297 | 125.0 | 62500 | 0.3098 | 0.7987 | 0.9477 | 0.9033 | 0.2972 | 0.7936 | 0.7828 | 0.3555 | 0.84 | 0.8433 | 0.3524 | 0.8406 | 0.8186 | 0.7794 | 0.836 | 0.7437 | 0.7934 | 0.873 | 0.9006 |
| 0.2139 | 126.0 | 63000 | 0.3029 | 0.8024 | 0.9471 | 0.8978 | 0.3109 | 0.7924 | 0.7864 | 0.3567 | 0.8417 | 0.8446 | 0.3524 | 0.8374 | 0.8249 | 0.7833 | 0.8385 | 0.7542 | 0.7967 | 0.8696 | 0.8985 |
| 0.2121 | 127.0 | 63500 | 0.3064 | 0.7994 | 0.9468 | 0.9069 | 0.3204 | 0.7879 | 0.7857 | 0.3553 | 0.8412 | 0.8439 | 0.356 | 0.8363 | 0.8207 | 0.7817 | 0.8368 | 0.7455 | 0.7945 | 0.871 | 0.9003 |
| 0.2102 | 128.0 | 64000 | 0.3080 | 0.8075 | 0.9506 | 0.9054 | 0.3314 | 0.7945 | 0.7909 | 0.358 | 0.8467 | 0.8497 | 0.3905 | 0.8397 | 0.8267 | 0.7861 | 0.8397 | 0.7597 | 0.8055 | 0.8766 | 0.9038 |
| 0.2361 | 129.0 | 64500 | 0.3010 | 0.8043 | 0.9505 | 0.9055 | 0.3322 | 0.7989 | 0.7804 | 0.3574 | 0.8443 | 0.8473 | 0.3845 | 0.8425 | 0.8169 | 0.7907 | 0.8448 | 0.7504 | 0.7956 | 0.8719 | 0.9015 |
| 0.2205 | 130.0 | 65000 | 0.3005 | 0.7999 | 0.9501 | 0.9066 | 0.293 | 0.791 | 0.7805 | 0.3559 | 0.8402 | 0.843 | 0.331 | 0.8384 | 0.8152 | 0.7785 | 0.8343 | 0.7495 | 0.7956 | 0.8718 | 0.8991 |
| 0.2252 | 131.0 | 65500 | 0.2984 | 0.7997 | 0.9502 | 0.9023 | 0.2882 | 0.7933 | 0.7799 | 0.3563 | 0.841 | 0.8436 | 0.3262 | 0.8395 | 0.814 | 0.78 | 0.8356 | 0.75 | 0.7967 | 0.8691 | 0.8985 |
| 0.2289 | 132.0 | 66000 | 0.3054 | 0.8041 | 0.9499 | 0.9015 | 0.2807 | 0.7978 | 0.7837 | 0.3578 | 0.8431 | 0.8466 | 0.3357 | 0.8427 | 0.819 | 0.7852 | 0.8389 | 0.7535 | 0.7989 | 0.8737 | 0.9021 |
| 0.2074 | 133.0 | 66500 | 0.3043 | 0.8019 | 0.9503 | 0.9069 | 0.2932 | 0.794 | 0.7837 | 0.3563 | 0.8417 | 0.8451 | 0.3452 | 0.8404 | 0.8193 | 0.7794 | 0.8347 | 0.7527 | 0.7989 | 0.8735 | 0.9018 |
| 0.2127 | 134.0 | 67000 | 0.3022 | 0.8044 | 0.9502 | 0.907 | 0.2874 | 0.7979 | 0.7879 | 0.3573 | 0.8436 | 0.8469 | 0.3405 | 0.8424 | 0.8216 | 0.7833 | 0.8385 | 0.7575 | 0.8011 | 0.8724 | 0.9012 |
| 0.2045 | 135.0 | 67500 | 0.3007 | 0.8029 | 0.9502 | 0.907 | 0.2961 | 0.7977 | 0.7838 | 0.3567 | 0.8434 | 0.8467 | 0.3357 | 0.844 | 0.8183 | 0.7816 | 0.8385 | 0.753 | 0.7989 | 0.8739 | 0.9027 |
| 0.2202 | 136.0 | 68000 | 0.2995 | 0.8047 | 0.9501 | 0.9066 | 0.2977 | 0.7965 | 0.7867 | 0.358 | 0.8439 | 0.8475 | 0.3452 | 0.8437 | 0.8199 | 0.7822 | 0.8372 | 0.7567 | 0.8011 | 0.8751 | 0.9041 |
| 0.1876 | 137.0 | 68500 | 0.3020 | 0.8037 | 0.9472 | 0.9106 | 0.2818 | 0.7936 | 0.7876 | 0.3575 | 0.8425 | 0.8461 | 0.331 | 0.8414 | 0.8211 | 0.7794 | 0.8356 | 0.7552 | 0.7989 | 0.8764 | 0.9038 |
| 0.1931 | 138.0 | 69000 | 0.3000 | 0.8056 | 0.95 | 0.9065 | 0.2881 | 0.7988 | 0.7855 | 0.3583 | 0.8446 | 0.8482 | 0.3405 | 0.8453 | 0.8194 | 0.7838 | 0.8381 | 0.7557 | 0.8011 | 0.8772 | 0.9053 |
| 0.2055 | 139.0 | 69500 | 0.2999 | 0.8049 | 0.947 | 0.9065 | 0.292 | 0.7972 | 0.7875 | 0.3579 | 0.8433 | 0.8467 | 0.331 | 0.8433 | 0.8216 | 0.7863 | 0.8393 | 0.7525 | 0.7956 | 0.8758 | 0.905 |
| 0.249 | 140.0 | 70000 | 0.3005 | 0.8038 | 0.947 | 0.9064 | 0.2918 | 0.7958 | 0.7881 | 0.3576 | 0.8427 | 0.8461 | 0.331 | 0.8424 | 0.8211 | 0.7836 | 0.8381 | 0.7501 | 0.7945 | 0.8777 | 0.9056 |
| 0.2086 | 141.0 | 70500 | 0.3004 | 0.8037 | 0.947 | 0.9065 | 0.2922 | 0.7939 | 0.7887 | 0.357 | 0.8421 | 0.8456 | 0.3357 | 0.8403 | 0.8221 | 0.7849 | 0.8381 | 0.749 | 0.7934 | 0.8773 | 0.9053 |
| 0.1979 | 142.0 | 71000 | 0.3018 | 0.8022 | 0.9469 | 0.9063 | 0.2922 | 0.7938 | 0.7849 | 0.3564 | 0.8415 | 0.845 | 0.3357 | 0.8406 | 0.8186 | 0.7819 | 0.8372 | 0.7493 | 0.7934 | 0.8754 | 0.9044 |
| 0.2272 | 143.0 | 71500 | 0.3005 | 0.8037 | 0.9501 | 0.9066 | 0.2865 | 0.795 | 0.787 | 0.3563 | 0.8429 | 0.8464 | 0.331 | 0.8419 | 0.8204 | 0.7811 | 0.8351 | 0.7555 | 0.8 | 0.8744 | 0.9041 |
| 0.2033 | 144.0 | 72000 | 0.2994 | 0.8043 | 0.9501 | 0.9065 | 0.2865 | 0.7972 | 0.7874 | 0.3575 | 0.8444 | 0.8479 | 0.331 | 0.8439 | 0.8212 | 0.7829 | 0.8381 | 0.756 | 0.8011 | 0.8742 | 0.9044 |
| 0.2226 | 145.0 | 72500 | 0.3010 | 0.8049 | 0.9501 | 0.9066 | 0.2866 | 0.7972 | 0.7879 | 0.3575 | 0.8444 | 0.8479 | 0.331 | 0.8439 | 0.8212 | 0.7829 | 0.8381 | 0.7561 | 0.8011 | 0.8757 | 0.9044 |
| 0.2126 | 146.0 | 73000 | 0.3000 | 0.8054 | 0.9501 | 0.9065 | 0.2866 | 0.7976 | 0.7894 | 0.3577 | 0.8449 | 0.8484 | 0.331 | 0.844 | 0.8229 | 0.7843 | 0.8393 | 0.7561 | 0.8011 | 0.8758 | 0.9047 |
| 0.2266 | 147.0 | 73500 | 0.3003 | 0.8044 | 0.95 | 0.9064 | 0.2866 | 0.7968 | 0.7881 | 0.3575 | 0.8442 | 0.8477 | 0.331 | 0.8432 | 0.8218 | 0.7828 | 0.8381 | 0.756 | 0.8011 | 0.8745 | 0.9038 |
| 0.2125 | 148.0 | 74000 | 0.3003 | 0.8045 | 0.95 | 0.9065 | 0.2866 | 0.7969 | 0.7881 | 0.3575 | 0.8442 | 0.8477 | 0.331 | 0.8432 | 0.8218 | 0.7828 | 0.8381 | 0.756 | 0.8011 | 0.8746 | 0.9038 |
| 0.2043 | 149.0 | 74500 | 0.3002 | 0.8045 | 0.9501 | 0.9065 | 0.2866 | 0.7969 | 0.7881 | 0.3575 | 0.8442 | 0.8477 | 0.331 | 0.8432 | 0.8218 | 0.7828 | 0.8381 | 0.756 | 0.8011 | 0.8746 | 0.9038 |
| 0.1994 | 150.0 | 75000 | 0.3002 | 0.8045 | 0.9501 | 0.9065 | 0.2866 | 0.7969 | 0.7881 | 0.3575 | 0.8442 | 0.8477 | 0.331 | 0.8432 | 0.8218 | 0.7828 | 0.8381 | 0.756 | 0.8011 | 0.8746 | 0.9038 |

### Framework versions

- Transformers 4.46.1
- Pytorch 2.5.0+cu121
- Datasets 2.19.2
- Tokenizers 0.20.1
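The per-epoch columns in the table above appear to follow the standard COCO-style detection metrics: overall mAP, mAP@0.50, mAP@0.75, mAP by object size, mAR at 1/10/100 detections per image, mAR by object size, and per-class mAP/mAR for the three labels. The exact evaluation backend is not stated in this card, so the sketch below is an assumption about how such numbers are typically produced, using `torchmetrics`' `MeanAveragePrecision`; the boxes, scores, and labels are made-up placeholder values.

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# One prediction dict and one target dict per image.
# Boxes are (x_min, y_min, x_max, y_max) in absolute pixel coordinates.
preds = [
    {
        "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),  # placeholder box
        "scores": torch.tensor([0.92]),
        "labels": torch.tensor([0]),  # placeholder class index
    }
]
targets = [
    {
        "boxes": torch.tensor([[12.0, 18.0, 115.0, 225.0]]),
        "labels": torch.tensor([0]),
    }
]

# class_metrics=True additionally reports per-class AP/AR alongside the
# aggregate map, map_50, map_75, map_small/medium/large and mar_1/10/100 keys.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)
metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["mar_100"])
```

With `class_metrics=True`, `compute()` also returns `map_per_class` and `mar_100_per_class`, which is the kind of output the per-label columns in the table would be drawn from.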
[ "chicken", "duck", "plant" ]