
Question concerning exporting to ONNX #750

@gillmac13

Description

@glenn-jocher

I have been using your repo time and again with excellent results for 1-class detection with yolov3-spp. I am now working on different ways to deploy some models, which require a .onnx format.
However, when I follow the suggested conversion method to the letter, I get unexpected output.

For example, if I download a yolov3-spp.weights model (80 classes) from https://pjreddie.com/darknet/yolo/ , and:

  • change to ONNX_EXPORT = True in models.py
  • run the command: python3 detect.py --cfg cfg/yolov3-spp.cfg --weights weights/yolov3-spp.weights

I get an export.onnx file in return with no warning or error.
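
For context, I assume the ONNX_EXPORT path in detect.py boils down to a torch.onnx.export call roughly like the sketch below (the dummy input size, output path, and opset version are my guesses, not the repo's exact code):

    import torch

    model.eval()                          # the Darknet model loaded from the cfg/weights above
    img = torch.zeros((1, 3, 416, 416))   # dummy input matching the inference size (assumed)
    torch.onnx.export(model, img, 'weights/export.onnx',
                      verbose=False, opset_version=10)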
Now, if I inspect this export.onnx with netron, I find that the model has two output tensors: one of shape 10647x80 and one of shape 10647x1. I was under the impression that an 80-class model would produce 85 columns, not 84.
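
To double-check what netron shows, I also read the output shapes straight from the graph with the onnx package (a minimal sketch, using the same file path as above):

    import onnx

    m = onnx.load("weights/export.onnx")
    for out in m.graph.output:
        dims = [d.dim_value for d in out.type.tensor_type.shape.dim]
        print(out.name, dims)   # matches what netron reports: [10647, 80] and [10647, 1]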

Using the code snippet:
    import onnxruntime

    model_onnx = onnxruntime.InferenceSession("weights/export.onnx")
    ort_inputs = {model_onnx.get_inputs()[0].name: img_t.numpy()}  # img_t: preprocessed input image tensor
    ort_outs = model_onnx.run(None, ort_inputs)
    x_out = ort_outs[0]
    print(x_out.shape)

confirms the output format.
Applying the same procedure to my 1-class models, I get 4+1 columns instead of the expected 6 (as in the "pred" variable in detect.py).
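
To see every tensor the session returns, not just ort_outs[0], I also print all of them (this assumes the session and ort_outs from the snippet above):

    for i, out in enumerate(ort_outs):
        print(i, out.shape)   # shape of each output tensor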

My questions are simple: is this output normal, and if so, how should I interpret the results, or is something wrong?
