## Model Description
This DAMO-YOLO-M model is a medium-sized object detection model with fast inference speed and high accuracy, trained with DAMO-YOLO. DAMO-YOLO is a fast and accurate object detection method developed by the TinyML Team of the Alibaba DAMO Data Analytics and Intelligence Lab, and it achieves higher performance than the state-of-the-art YOLO series. DAMO-YOLO extends YOLO with several new techniques, including Neural Architecture Search (NAS) backbones, an efficient Reparameterized Generalized-FPN (RepGFPN), a lightweight head with AlignedOTA label assignment, and distillation enhancement. For more details, please refer to our arXiv report and GitHub code. There you can find not only powerful models, but also highly efficient training strategies and complete tools from training to deployment.
## Chinese Web Demo
- We also provide a Chinese web demo on ModelScope, covering DAMO-YOLO-T, DAMO-YOLO-S, and DAMO-YOLO-M.
## Datasets
The model is trained on COCO2017.
## Model Usage
The usage guideline can be found in our Quick Start Tutorial.
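As a minimal illustration of inference, here is a sketch using the ModelScope pipeline API (the same platform that hosts the web demo above). The model ID `damo/cv_tinynas_object-detection_damoyolo-m` and the image path are assumptions, so please confirm them against the Quick Start Tutorial.

```python
# Minimal inference sketch via the ModelScope pipeline API.
# NOTE: the model ID and image path below are assumptions; consult the
# Quick Start Tutorial for the exact identifiers used by this release.
from modelscope.pipelines import pipeline
from modelscope.utils.constant import Tasks

# Build an object-detection pipeline around DAMO-YOLO-M.
detector = pipeline(Tasks.image_object_detection,
                    model='damo/cv_tinynas_object-detection_damoyolo-m')

# Run inference on a local image (the path is a placeholder).
result = detector('path/to/image.jpg')

# The pipeline returns the detections (scores, labels, boxes).
print(result)
```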
## Model Evaluation
| Model | Size | mAP<sup>val</sup> 0.5:0.95 | Latency T4 TRT-FP16-BS1 (ms) | FLOPs (G) | Params (M) | Download |
| --- | --- | --- | --- | --- | --- | --- |
| DAMO-YOLO-T | 640 | 41.8 | 2.78 | 18.1 | 8.5 | torch, onnx |
| DAMO-YOLO-T* | 640 | 43.0 | 2.78 | 18.1 | 8.5 | torch, onnx |
| DAMO-YOLO-S | 640 | 45.6 | 3.83 | 37.8 | 16.3 | torch, onnx |
| DAMO-YOLO-S* | 640 | 46.8 | 3.83 | 37.8 | 16.3 | torch, onnx |
| DAMO-YOLO-M | 640 | 48.7 | 5.62 | 61.8 | 28.2 | torch, onnx |
| DAMO-YOLO-M* | 640 | 50.0 | 5.62 | 61.8 | 28.2 | torch, onnx |
- We report the mAP of the models on the COCO2017 validation set, with multi-class NMS (see the sketch after this list).
- The latency in this table is measured without post-processing.
- * denotes a model trained with distillation.
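Since the reported mAP depends on multi-class (per-class) NMS, the following sketch shows the idea using torchvision's `batched_nms`, which suppresses boxes only against boxes of the same class. The toy boxes and the 0.65 IoU threshold are illustrative assumptions, not our exact evaluation settings.

```python
# Illustrative multi-class NMS: a box is only suppressed by overlapping
# boxes of the SAME class. Inputs and the 0.65 IoU threshold are
# assumptions for demonstration, not the exact evaluation settings.
import torch
from torchvision.ops import batched_nms

# Toy detections: (x1, y1, x2, y2) boxes, confidence scores, class labels.
boxes = torch.tensor([[10., 10., 50., 50.],
                      [12., 12., 52., 52.],   # heavily overlaps box 0
                      [10., 10., 50., 50.]])  # same box, different class
scores = torch.tensor([0.9, 0.8, 0.7])
labels = torch.tensor([0, 0, 1])

# batched_nms runs NMS independently within each label group.
keep = batched_nms(boxes, scores, labels, iou_threshold=0.65)

# Box 1 is suppressed by box 0 (same class, IoU > 0.65); box 2 survives
# because it belongs to a different class despite overlapping box 0.
print(keep)  # tensor([0, 2])
```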
## Cite DAMO-YOLO
If you use DAMO-YOLO in your research, please cite our work by using the following BibTeX entry:
@article{damoyolo,
  title={DAMO-YOLO: A Report on Real-Time Object Detection Design},
  author={Xianzhe Xu and Yiqi Jiang and Weihua Chen and Yilun Huang and Yuan Zhang and Xiuyu Sun},
  journal={arXiv preprint arXiv:2211.15444v2},
  year={2022}
}