Yolo-v6: Optimized for Mobile Deployment

Real-time object detection optimized for mobile and edge

Yolo-v6 is a machine learning model that predicts bounding boxes and classes of objects in an image.

This model is an implementation of Yolo-v6 found here.

More details on model performance across various devices can be found here.

Model Details

  • Model Type: Object detection
  • Model Stats:
    • Model checkpoint: YoloV6-N
    • Input resolution: 640x640
    • Number of parameters: 4.68M
    • Model size: 17.9 MB

| Model | Device | Chipset | Target Runtime | Inference Time (ms) | Peak Memory Range (MB) | Precision | Primary Compute Unit | Target Model |
|---|---|---|---|---|---|---|---|---|
| Yolo-v6 | Samsung Galaxy S23 | Snapdragon® 8 Gen 2 | TFLITE | 10.657 | 0 - 9 | FP16 | NPU | -- |
| Yolo-v6 | Samsung Galaxy S23 | Snapdragon® 8 Gen 2 | QNN | 5.675 | 5 - 17 | FP16 | NPU | -- |
| Yolo-v6 | Samsung Galaxy S23 | Snapdragon® 8 Gen 2 | ONNX | 5.507 | 5 - 37 | FP16 | NPU | -- |
| Yolo-v6 | Samsung Galaxy S24 | Snapdragon® 8 Gen 3 | TFLITE | 7.294 | 0 - 49 | FP16 | NPU | -- |
| Yolo-v6 | Samsung Galaxy S24 | Snapdragon® 8 Gen 3 | QNN | 3.962 | 5 - 48 | FP16 | NPU | -- |
| Yolo-v6 | Samsung Galaxy S24 | Snapdragon® 8 Gen 3 | ONNX | 4.222 | 5 - 70 | FP16 | NPU | -- |
| Yolo-v6 | Snapdragon 8 Elite QRD | Snapdragon® 8 Elite | TFLITE | 7.206 | 0 - 43 | FP16 | NPU | -- |
| Yolo-v6 | Snapdragon 8 Elite QRD | Snapdragon® 8 Elite | QNN | 4.214 | 5 - 39 | FP16 | NPU | -- |
| Yolo-v6 | Snapdragon 8 Elite QRD | Snapdragon® 8 Elite | ONNX | 4.302 | 5 - 55 | FP16 | NPU | -- |
| Yolo-v6 | SA7255P ADP | SA7255P | TFLITE | 85.024 | 0 - 38 | FP16 | NPU | -- |
| Yolo-v6 | SA7255P ADP | SA7255P | QNN | 78.113 | 2 - 11 | FP16 | NPU | -- |
| Yolo-v6 | SA8255 (Proxy) | SA8255P Proxy | TFLITE | 10.574 | 0 - 11 | FP16 | NPU | -- |
| Yolo-v6 | SA8255 (Proxy) | SA8255P Proxy | QNN | 4.6 | 5 - 7 | FP16 | NPU | -- |
| Yolo-v6 | SA8295P ADP | SA8295P | TFLITE | 13.93 | 0 - 29 | FP16 | NPU | -- |
| Yolo-v6 | SA8295P ADP | SA8295P | QNN | 7.081 | 0 - 18 | FP16 | NPU | -- |
| Yolo-v6 | SA8650 (Proxy) | SA8650P Proxy | TFLITE | 10.602 | 0 - 8 | FP16 | NPU | -- |
| Yolo-v6 | SA8650 (Proxy) | SA8650P Proxy | QNN | 4.746 | 5 - 7 | FP16 | NPU | -- |
| Yolo-v6 | SA8775P ADP | SA8775P | TFLITE | 14.1 | 0 - 37 | FP16 | NPU | -- |
| Yolo-v6 | SA8775P ADP | SA8775P | QNN | 7.424 | 0 - 10 | FP16 | NPU | -- |
| Yolo-v6 | QCS8275 (Proxy) | QCS8275 Proxy | TFLITE | 85.024 | 0 - 38 | FP16 | NPU | -- |
| Yolo-v6 | QCS8275 (Proxy) | QCS8275 Proxy | QNN | 78.113 | 2 - 11 | FP16 | NPU | -- |
| Yolo-v6 | QCS8550 (Proxy) | QCS8550 Proxy | TFLITE | 10.491 | 0 - 8 | FP16 | NPU | -- |
| Yolo-v6 | QCS8550 (Proxy) | QCS8550 Proxy | QNN | 4.538 | 5 - 7 | FP16 | NPU | -- |
| Yolo-v6 | QCS9075 (Proxy) | QCS9075 Proxy | TFLITE | 14.1 | 0 - 37 | FP16 | NPU | -- |
| Yolo-v6 | QCS9075 (Proxy) | QCS9075 Proxy | QNN | 7.424 | 0 - 10 | FP16 | NPU | -- |
| Yolo-v6 | QCS8450 (Proxy) | QCS8450 Proxy | TFLITE | 13.048 | 0 - 39 | FP16 | NPU | -- |
| Yolo-v6 | QCS8450 (Proxy) | QCS8450 Proxy | QNN | 7.726 | 5 - 32 | FP16 | NPU | -- |
| Yolo-v6 | Snapdragon X Elite CRD | Snapdragon® X Elite | QNN | 4.842 | 5 - 5 | FP16 | NPU | -- |
| Yolo-v6 | Snapdragon X Elite CRD | Snapdragon® X Elite | ONNX | 6.095 | 7 - 7 | FP16 | NPU | -- |
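
As a rough illustration of how the 640x640 input listed under Model Stats is fed to the network, the sketch below runs an exported ONNX variant with ONNX Runtime. The model file name, the example image path, and the omitted detection decoding are assumptions for illustration only; the actual export from this card may use different input/output names and layouts.

```python
# Minimal sketch: run a Yolo-v6 ONNX export with ONNX Runtime.
# "yolov6n.onnx" and "example.jpg" are placeholder paths (assumptions).
import numpy as np
import onnxruntime as ort
from PIL import Image

MODEL_PATH = "yolov6n.onnx"   # placeholder: path to the exported model
INPUT_SIZE = 640              # matches the 640x640 input resolution above

def preprocess(image_path: str) -> np.ndarray:
    """Resize to 640x640, scale to [0, 1], and convert to NCHW float32."""
    img = Image.open(image_path).convert("RGB").resize((INPUT_SIZE, INPUT_SIZE))
    arr = np.asarray(img, dtype=np.float32) / 255.0        # HWC, values in [0, 1]
    return np.transpose(arr, (2, 0, 1))[np.newaxis, ...]   # shape 1x3x640x640

def main() -> None:
    session = ort.InferenceSession(MODEL_PATH, providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name
    outputs = session.run(None, {input_name: preprocess("example.jpg")})
    # The outputs hold raw detections; decoding (confidence filtering,
    # non-maximum suppression, rescaling boxes to the source image) depends
    # on how the model was exported and is omitted here.
    for i, out in enumerate(outputs):
        print(f"output[{i}] shape: {out.shape}")

if __name__ == "__main__":
    main()
```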

License

  • The license for the original implementation of Yolo-v6 can be found here.
  • The license for the compiled assets for on-device deployment can be found here.

Usage and Limitations

This model may not be used for or in connection with any of the following applications:

  • Accessing essential private and public services and benefits;
  • Administration of justice and democratic processes;
  • Assessing or recognizing the emotional state of a person;
  • Biometric and biometrics-based systems, including categorization of persons based on sensitive characteristics;
  • Education and vocational training;
  • Employment and workers management;
  • Exploitation of the vulnerabilities of persons resulting in harmful behavior;
  • General purpose social scoring;
  • Law enforcement;
  • Management and operation of critical infrastructure;
  • Migration, asylum and border control management;
  • Predictive policing;
  • Real-time remote biometric identification in public spaces;
  • Recommender systems of social media platforms;
  • Scraping of facial images (from the internet or otherwise); and/or
  • Subliminal manipulation