master_params and model_params #5
I think the project is right. Only in FP16 mode do we have master_params; otherwise we only have model_params, which are in FP32 precision.
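To make the relationship concrete, here is a minimal sketch of the usual mixed-precision pattern this answer describes: in FP16 mode the optimizer updates FP32 master copies, which are then copied back into the FP16 model parameters. The helper `make_master_params` is a hypothetical name for illustration and is not taken from this repository; `master_params_to_model_params` mirrors the function quoted from train.py below.

```python
import torch

def make_master_params(model_params):
    # Create FP32 "master" copies of the FP16 model parameters.
    # The optimizer steps on these full-precision copies.
    master_params = [p.detach().clone().float() for p in model_params]
    for p in master_params:
        p.requires_grad = True
    return master_params

def master_params_to_model_params(model_params, master_params):
    # Copy the updated FP32 master values back into the FP16 model params.
    for model, master in zip(model_params, master_params):
        model.data.copy_(master.data)

# Usage sketch: the model holds FP16 params, the masters stay FP32.
model_param = torch.randn(4, dtype=torch.float16)
masters = make_master_params([model_param])
# ... optimizer updates `masters` in FP32 ...
master_params_to_model_params([model_param], masters)
```

Note that `copy_` casts the FP32 master values down to FP16 on the way back, so the model parameters always remain 16-bit.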
Hi, I'm interested in your project.
By the way, I have a question about master_params and model_params.
I thought master_params are 32-bit and model_params are 16-bit, since you left a comment in train.py like this:
```python
def master_params_to_model_params(self, model_params, master_params):
    """
    Move FP32 master params to FP16 model params.
    """
    for model, master in zip(model_params, master_params):
        model.data.copy_(master.data)
```
However, in this code,
```
```
you use master_params in FP16 mode.
Which params are FP16: master or model?
Thanks for your kind reply.