🐛 Bug
Here:
h2o-llmstudio/train.py
Line 562 in a9d72ff
Say the total number of data batches is 160 and gradient accumulation is 10. The optimizer then steps only 16 times per epoch.
But the scheduler is called on every batch here:
h2o-llmstudio/train.py
Lines 315 to 316 in a9d72ff
which can lead to the PyTorch warning discussed here:
https://discuss.pytorch.org/t/userwarning-detected-call-of-lr-scheduler-step-before-optimizer-step-in-pytorch-1-1-0-and-later-you-should-call-them-in-the-opposite-order-optimizer-step-before-lr-scheduler-step/88295
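A minimal sketch of the expected behavior (this is illustrative code, not the actual `train.py`; the function name `run_epoch` and its parameters are hypothetical): `scheduler.step()` should sit inside the same gradient-accumulation guard as `optimizer.step()`, so both fire once per accumulation window rather than once per batch.

```python
def run_epoch(num_batches: int, grad_acc_steps: int):
    """Count optimizer vs. scheduler calls when scheduler.step()
    is gated by the gradient-accumulation condition (the fix)."""
    optimizer_steps = 0
    scheduler_steps = 0
    for itr in range(num_batches):
        # loss.backward() would run on every batch here
        if (itr + 1) % grad_acc_steps == 0:
            optimizer_steps += 1   # optimizer.step(); optimizer.zero_grad()
            scheduler_steps += 1   # scheduler.step() belongs inside this guard
    return optimizer_steps, scheduler_steps

print(run_epoch(160, 10))  # → (16, 16): one scheduler step per optimizer step
```

With the current code the scheduler would instead be called 160 times against 16 optimizer steps, which is exactly the mismatch PyTorch warns about.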
To Reproduce
LLM Studio version