Quantize a .tm model into an int8 model #518
I have an ONNX model that I converted into a .tm model. How can I further quantize it into an int8 model?
After loading the model with mge.load, calling quantize_qat and quantize on the loaded TracedModule() has no effect.
Perhaps I am using the API incorrectly; I would appreciate some guidance.
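For reference, a minimal sketch of the flow described above, assuming MegEngine's quantization API (`quantize_qat` / `quantize` from `megengine.quantization`). The model path, the choice of `ema_fakequant_qconfig`, and the calibration step are illustrative assumptions, not details taken from this issue:

```python
# Hypothetical sketch, assuming MegEngine's quantization API.
import megengine as mge
import megengine.quantization as Q

# Load the previously converted TracedModule (path is illustrative).
net = mge.load("model.tm")

# Swap float modules for QAT modules. Note: if the TracedModule was
# converted from ONNX and its graph contains raw ops rather than
# quantizable Module instances, this step may find nothing to rewrite,
# which could explain why it appears to have no effect.
qat_net = Q.quantize_qat(net, qconfig=Q.ema_fakequant_qconfig)

# ... run calibration / QAT fine-tuning batches here ...

# Convert the QAT network into a true int8 quantized network.
q_net = Q.quantize(qat_net)
```

Whether this works on a TracedModule converted from ONNX depends on how the converter maps ops to MegEngine modules, which may be the root cause here.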