Contradiction in save_for_backward, what is permitted to be saved #2797

Open
ad8e opened this issue Mar 10, 2024 · 0 comments
Labels
core: Tutorials of any level of difficulty related to the core pytorch functionality

Comments


ad8e commented Mar 10, 2024

https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html
"ctx is a context object that can be used to stash information for backward computation. You can cache arbitrary objects for use in the backward pass using the ctx.save_for_backward method."

https://pytorch.org/docs/stable/generated/torch.autograd.function.FunctionCtx.save_for_backward.html
"save_for_backward should be called at most once, only from inside the forward() method, and only with tensors."

Most likely the second is correct, and the first is not. I haven't checked.

Suggestion: "You can cache tensors for use in the backward pass using the ctx.save_for_backward method. Other miscellaneous objects can be cached using ctx.my_object_name = object."
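
For concreteness, here is a minimal sketch of the pattern the suggested wording describes (a made-up `ScaledReLU` example, not code from the tutorial): the tensor argument goes through `ctx.save_for_backward`, while the non-tensor `scale` argument is stashed as a plain attribute on `ctx`.

```python
import torch

class ScaledReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input, scale):
        # Tensors must go through save_for_backward so autograd can
        # track them (and, e.g., detect in-place modification errors).
        ctx.save_for_backward(input)
        # Non-tensor objects (here a plain Python float) are stashed
        # directly as attributes on ctx.
        ctx.scale = scale
        return input.clamp(min=0) * scale

    @staticmethod
    def backward(ctx, grad_output):
        (input,) = ctx.saved_tensors
        grad_input = grad_output.clone()
        grad_input[input < 0] = 0
        # forward() took two arguments, so backward() returns two
        # gradients; the non-tensor scale argument gets None.
        return grad_input * ctx.scale, None

x = torch.randn(4, requires_grad=True)
y = ScaledReLU.apply(x, 2.0)
y.sum().backward()
```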

cc @albanD

@svekars added the core label Mar 11, 2024