Hey dusty,

how do you reach such good performance with the imageNet example? What is happening under the hood when you call
`predictions = net.Classify(img, topK=args.topK)`
in the following example: https://github.com/dusty-nv/jetson-inference/blob/master/python/examples/imagenet.py?
I have trouble understanding how you load the ONNX model and use it as a TensorRT model.
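For reference, here is a minimal sketch of how I understand a custom ONNX classifier is loaded and run with the jetson_inference Python bindings. The model path, labels file, and the `input_blob`/`output_blob` names are placeholders from the re-training docs, not something I have verified for my model, and I am assuming the first load builds and caches a TensorRT engine next to the .onnx file:

```python
#!/usr/bin/env python3
# Minimal sketch (my understanding, not verified): imageNet() parses the ONNX
# with TensorRT, builds an optimized engine on first load and caches it,
# then Classify() runs CUDA pre-processing + inference on the GPU.
from jetson_inference import imageNet
from jetson_utils import loadImage

# placeholder paths and blob names for my own model
net = imageNet(model="resnet18.onnx",
               labels="labels.txt",
               input_blob="input_0",
               output_blob="output_0")

img = loadImage("test.jpg")              # image is loaded into CUDA memory
predictions = net.Classify(img, topK=5)  # list of (classID, confidence) tuples

for (classID, confidence) in predictions:
    print(f"{net.GetClassDesc(classID)}  {confidence:.4f}")
```

Is this roughly what happens inside the example, or am I missing a step?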
I did try to use the same model together with the Triton server, but the performance is much worse (0.06 s inference time for your solution vs. 0.3 s for the Triton server solution).
It's quite hard for me, as a novice, to find the right information.
Any advice?