
Potential bugs #19

Open
yxiao54 opened this issue Jun 7, 2023 · 0 comments
yxiao54 commented Jun 7, 2023

Line 71 of utils.py

```python
grads.append(0 if param.grad is None else param.grad + 0.)
```

should be rewritten as:

```python
grads.append(param - param if param.grad is None else param.grad + 0.)
```

The current implementation may cause bugs when the model contains unused layers. Specifically, when a parameter has requires_grad set to True but does not participate in the forward or backward pass, its gradient is appended as the Python float zero instead of a zero tensor. This triggers an error when torch.autograd checks the shapes of the gradients. Details can be seen in this discussion: #8
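For context, here is a minimal sketch of the failure mode and the fix; the model and the gradient-collection loop below are hypothetical stand-ins for the one in utils.py:

```python
import torch
import torch.nn as nn

# Hypothetical model: `unused` has requires_grad=True but never
# participates in the forward pass, so its .grad stays None
# after backward().
model = nn.Module()
model.used = nn.Linear(4, 1)
model.unused = nn.Linear(4, 1)

loss = model.used(torch.randn(2, 4)).sum()
loss.backward()

grads = []
for param in model.parameters():
    # Buggy version: appends the Python scalar 0, whose shape cannot
    # match `param`, so torch.autograd's later shape check fails:
    #   grads.append(0 if param.grad is None else param.grad + 0.)
    # Fixed version: append a zero tensor shaped like `param`
    # (torch.zeros_like(param) would work equally well here).
    grads.append(param - param if param.grad is None else param.grad + 0.)

# Every collected gradient now has the same shape as its parameter.
for g, p in zip(grads, model.parameters()):
    assert g.shape == p.shape
```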
