
[Usage] None of the inputs have requires_grad=True. Gradients will be None #1475

Open
hellangleZ opened this issue Apr 30, 2024 · 4 comments


hellangleZ commented Apr 30, 2024

Describe the issue

Issue:

The training log says the gradients will be None.

Command:


Just using the pretrain script.

Log:

/data22/llava/lib/python3.10/site-packages/torch/utils/checkpoint.py:429: UserWarning: torch.utils.checkpoint: please pass in use_reentrant=True or use_reentrant=False explicitly. The default value of use_reentrant will be updated to be False in the future. To maintain current behavior, pass use_reentrant=True. It is recommended that you use use_reentrant=False. Refer to docs for more details on the differences between the two variants.
  warnings.warn(
/data22/llava/lib/python3.10/site-packages/torch/utils/checkpoint.py:61: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
/data22/llava/lib/python3.10/site-packages/torch/utils/checkpoint.py:429: UserWarning: torch.utils.checkpoint: please pass in use_reentrant=True or use_reentrant=False explicitly. The default value of use_reentrant will be updated to be False in the future. To maintain current behavior, pass use_reentrant=True. It is recommended that you use use_reentrant=False. Refer to docs for more details on the differences between the two variants.
  warnings.warn(
/data22/llava/lib/python3.10/site-packages/torch/utils/checkpoint.py:61: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
/data22/llava/lib/python3.10/site-packages/torch/utils/checkpoint.py:429: UserWarning: torch.utils.checkpoint: please pass in use_reentrant=True or use_reentrant=False explicitly. The default value of use_reentrant will be updated to be False in the future. To maintain current behavior, pass use_reentrant=True. It is recommended that you use use_reentrant=False. Refer to docs for more details on the differences between the two variants.

Screenshots:
[screenshot of the training log attached in the original issue]
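
For context, both warnings come from torch.utils.checkpoint: the first appears because use_reentrant is not passed explicitly, and the second (from check_backward_validity) appears because none of the tensors entering a checkpointed segment require gradients, which leaves that segment disconnected from autograd. A minimal sketch that reproduces both messages with a toy module (this is not the LLaVA code path, just an illustration of the mechanism):

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Toy stand-in for a checkpointed transformer block.
block = nn.Linear(16, 16)

# Inputs produced by a frozen tower (or raw token ids) typically have
# requires_grad=False.
x = torch.randn(2, 16)  # requires_grad is False by default

# 1) Omitting use_reentrant triggers the "please pass in use_reentrant"
#    UserWarning on PyTorch versions where the default is still reentrant.
# 2) Because no input requires grad, check_backward_validity() emits
#    "None of the inputs have requires_grad=True. Gradients will be None".
out = checkpoint(block, x)

# The checkpointed output is cut off from autograd, so a later backward()
# would leave this segment's parameters without gradients.
print(out.requires_grad)  # False
```

Whether this state is harmless or means gradients are being silently dropped depends on which modules are actually meant to be trained, which this thread does not resolve.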

hellangleZ (Author) commented:

/data22/llava/lib/python3.10/site-packages/torch/utils/checkpoint.py:429: UserWarning: torch.utils.checkpoint: please pass in use_reentrant=True or use_reentrant=False explicitly. The default value of use_reentrant will be updated to be False in the future. To maintain current behavior, pass use_reentrant=True. It is recommended that you use use_reentrant=False. Refer to docs for more details on the differences between the two variants.
warnings.warn(
/data22/llava/lib/python3.10/site-packages/torch/utils/checkpoint.py:61: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
warnings.warn(
/data22/llava/lib/python3.10/site-packages/torch/utils/checkpoint.py:429: UserWarning: torch.utils.checkpoint: please pass in use_reentrant=True or use_reentrant=False explicitly. The default value of use_reentrant will be updated to be False in the future. To maintain current behavior, pass use_reentrant=True. It is recommended that you use use_reentrant=False. Refer to docs for more details on the differences between the two variants.
warnings.warn(
/data22/llava/lib/python3.10/site-packages/torch/utils/checkpoint.py:61: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
warnings.warn(
{'loss': 1.8577, 'learning_rate': 3.205128205128205e-07, 'epoch': 0.0}
{'loss': 1.7297, 'learning_rate': 6.41025641025641e-07, 'epoch': 0.0}
{'loss': 1.866, 'learning_rate': 9.615384615384617e-07, 'epoch': 0.0}
{'loss': 2.0846, 'learning_rate': 1.282051282051282e-06, 'epoch': 0.0}

Could anyone help take a look at this?

Thanks
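
For reference, the warning text itself recommends passing use_reentrant explicitly, and the non-reentrant implementation does not emit the requires_grad warning. A hedged sketch of how that is commonly wired up when training goes through a Hugging Face Trainer (the argument names below follow recent transformers releases and are assumptions about this training stack, not taken from the LLaVA scripts):

```python
from transformers import TrainingArguments

# Hypothetical Trainer configuration; "out" is a placeholder output directory.
args = TrainingArguments(
    output_dir="out",
    gradient_checkpointing=True,
    # Silences the "please pass in use_reentrant" warning and selects the
    # non-reentrant checkpoint implementation.
    gradient_checkpointing_kwargs={"use_reentrant": False},
)

# When the backbone is frozen and only a small head/projector is trained,
# PreTrainedModel.enable_input_require_grads() is sometimes used so the
# embedding output requires grad and checkpointed blocks stay connected
# to autograd:
# model.enable_input_require_grads()
```

With use_reentrant=False, autograd tracks the checkpointed region directly, so inputs that do not require gradients no longer sever the graph for the trainable parameters inside it.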

@hellangleZ hellangleZ changed the title [Usage] Log said gradient will be none [Usage] None of the inputs have requires_grad=True. Gradients will be None May 1, 2024
LijunZhang01 commented:

Have you solved it? I encountered the same problem.


PzWHU commented May 16, 2024

I have the same problem. Has it been solved?

xiaxiangzhou commented:

I have the same problem
