AttributeError: module 'torch.optim' has no attribute 'adam' #258
Labels: bug

Comments
Configuration is as follows: module configuration, LoRA enabled by default, enable_deepspeed = True

pip list | grep torch
pytorch-lightning 2.0.4
This was a bug in deep_training and it has been fixed; run pip uninstall deep_training and upgrade the package to resolve it.
Thanks a lot. |
With PTV2, setting the optimizer to adam raises AttributeError: module 'torch.optim' has no attribute 'adam'; adamw works fine.
LoRA does not have this problem.
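A likely cause of this class of error: if a training framework resolves the configured optimizer string with a plain getattr on torch.optim, the lookup is case-sensitive, and torch.optim exposes the classes Adam and AdamW, not lowercase adam. The sketch below is a hypothetical illustration of a case-insensitive lookup, not the actual deep_training fix; a stand-in namespace replaces torch.optim so it runs without torch installed.

```python
from types import SimpleNamespace


# Stand-in classes mimicking torch.optim.Adam / torch.optim.AdamW.
class Adam:
    pass


class AdamW:
    pass


# Stand-in for the torch.optim module namespace.
optim = SimpleNamespace(Adam=Adam, AdamW=AdamW)

# A naive getattr lookup is case-sensitive, reproducing the reported error:
assert getattr(optim, "adam", None) is None      # lowercase 'adam' is missing
assert getattr(optim, "Adam", None) is Adam      # the real class name works


def resolve_optimizer(name, namespace=optim):
    """Hypothetical helper: case-insensitive lookup of an optimizer class."""
    for attr in dir(namespace):
        if attr.lower() == name.lower():
            return getattr(namespace, attr)
    raise AttributeError(f"module has no attribute matching '{name}'")


# Finds the Adam class even when the config string is lowercase.
print(resolve_optimizer("adam").__name__)  # → Adam
```

With such a helper, both "adam" and "adamw" config strings would resolve, which matches the observation that only the exact-case adamw spelling happened to work.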