
Feasibility of fine-tuning CogVLM on two 3090s? #479

Closed

KDD2018 opened this issue May 16, 2024 · 1 comment

KDD2018 commented May 16, 2024

Quick question: has anyone tried fine-tuning CogVLM with two RTX 3090s? Is it feasible?

zRzRzRzRzRzRzR self-assigned this May 19, 2024

zRzRzRzRzRzRzR (Collaborator) commented May 19, 2024

BF16 is not possible: the model alone takes about 36 GB just for inference, and during fine-tuning the image padding is also added to the tokens, so the remaining 8–10 GB of headroom is not enough. With int4 and hacked-up code it is theoretically feasible, but quite difficult in practice.
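The arithmetic behind those figures can be sketched as a back-of-envelope estimate. The ~17.6B total parameter count (implied by the CogVLM-17B model name) is an assumption on my part, not stated in this thread:

```python
# Back-of-envelope VRAM estimate for CogVLM-17B on two RTX 3090s.
# Assumption (not from the thread): ~17.6e9 parameters total.
PARAMS = 17.6e9
GB = 1e9  # decimal gigabytes

bf16_weights = PARAMS * 2 / GB    # 2 bytes per parameter -> ~35.2 GB
int4_weights = PARAMS * 0.5 / GB  # 4 bits per parameter  ->  ~8.8 GB

total_vram = 2 * 24  # two RTX 3090s, 24 GB each
headroom_bf16 = total_vram - bf16_weights  # left over before activations,
                                           # optimizer state, image tokens

print(f"BF16 weights:  {bf16_weights:.1f} GB")
print(f"int4 weights:  {int4_weights:.1f} GB")
print(f"BF16 headroom: {headroom_bf16:.1f} GB")
```

This is why int4 (weights under ~9 GB) is the only theoretically workable route on 2×24 GB: in BF16 the weights alone consume nearly all the VRAM, leaving nothing for gradients, optimizer state, or the padded image tokens that fine-tuning adds.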
