
Warning: Detected call of lr_scheduler.step() before optimizer.step() #1100

hyecheol123 opened this issue Jul 21, 2020 · 5 comments

@hyecheol123

/home/jhyecheol/.local/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:118: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`.  Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  warnings.warn("Detected call of `lr_scheduler.step()` before `optimizer.step()`. "

I believe this warning is caused by the change in the required order of the learning-rate update calls. Since I thought it would be better to inform you, I opened this issue.
This can easily be fixed by adding one conditional variable. However, before I try to find a solution for this issue, I want to confirm whether all training starts at epoch 1. If not, I would like to know where the starting epoch can be found.

@hyecheol123 (Author)

I successfully located the variable and fixed it with a simple conditional.
The PR has been submitted.
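For readers following along, the guard described above might look roughly like this; a self-contained sketch in plain PyTorch, not the actual patch (the real change is in the PR, and `start_epoch` is a hypothetical stand-in for this repo's starting-epoch option):

```python
import torch

# Sketch of the "one conditional" idea: the scheduler is stepped at the
# top of the epoch loop, but skipped on the very first epoch so it never
# runs before the first optimizer.step(). start_epoch is a hypothetical
# stand-in for the repository's starting-epoch option.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.0002)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

start_epoch = 1
for epoch in range(start_epoch, 101):
    if epoch > start_epoch:   # the conditional guard: skip the first epoch
        scheduler.step()
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).mean()
    loss.backward()
    optimizer.step()
```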

@junyanz (Owner) commented Jul 21, 2020

I replied to your PR.

@LightingMc

I also got the same warning. I cloned the repo in Jan 2021, so I don't understand why it still shows up; it should have been resolved by now. Has the fix been merged?

@leggitta commented Apr 8, 2021

I'm also still getting this warning, and I noticed that for at least the first 25 training epochs the learning rate never updates (it remains at 0.0002000). I also noticed that the PR referenced above hasn't been merged:

#1101
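Worth noting: with this repository's default linear lr policy, the learning rate is meant to stay constant for the first n_epochs (100 by default) and only then decay, so a flat 0.0002000 over the first 25 epochs may be the intended schedule rather than a bug. A quick, self-contained way to watch what a scheduler actually does (standard PyTorch; StepLR is used here purely for illustration, not the repo's policy):

```python
import torch

# Print the learning rate each epoch to verify the schedule is moving.
# StepLR halves the lr after every 10th scheduler step, for illustration.
net = torch.nn.Linear(2, 2)
optim = torch.optim.Adam(net.parameters(), lr=0.0002)
sched = torch.optim.lr_scheduler.StepLR(optim, step_size=10, gamma=0.5)

for epoch in range(25):
    optim.step()                                # stand-in for one epoch of training
    sched.step()
    print(epoch, optim.param_groups[0]["lr"])   # drops after steps 10 and 20
```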

@simon-eda

Can you please describe where and how to put lr_scheduler.step() after optimizer.step()?

Many thanks.
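For reference, the generic placement the warning asks for is to step the scheduler at the end of each epoch, after all the optimizer.step() calls in the inner batch loop; a minimal sketch with standard PyTorch APIs, not this repository's training script (in this repo, the scheduler step appears to live inside model.update_learning_rate() in models/base_model.py, so the same idea amounts to calling that at the end of the epoch loop):

```python
import torch

# Generic placement: optimizer.step() per batch, scheduler.step() once
# per epoch, after the batch loop. Dummy model and data for illustration.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=0.0002)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(100):
    for _ in range(4):                      # inner loop over batches
        optimizer.zero_grad()
        loss = model(torch.randn(8, 10)).mean()
        loss.backward()
        optimizer.step()                    # weight updates come first
    scheduler.step()                        # schedule advances last
```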
