
GPU not big enough? I'm using A5500 24GB RAM #15

Open
imessien opened this issue Feb 28, 2024 · 0 comments


imessien commented Feb 28, 2024

The paper uses an A6000 GPU with 48GB of RAM, but my workstation has four A5500 GPUs with 24GB of RAM each. Can I use the method suggested in the paper by separating the model editing from the model running? Or is there a way to run it in parallel across my GPUs? My current idea is to use a library called transformer-utils, which works with a smaller model. I'm getting an out-of-memory message when running the model editing.
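One generic way to approach the parallel-GPU idea above (not something this repo documents, just a sketch) is naive pipeline-style model parallelism: place different layers of the model on different GPUs and move activations between them. With Hugging Face `transformers` plus `accelerate`, `from_pretrained(..., device_map="auto")` can do this sharding automatically; the toy example below shows the underlying pattern by hand with a hypothetical two-block model, falling back to CPU when fewer than two GPUs are present:

```python
# Hedged sketch: naive pipeline-style model parallelism in PyTorch.
# A toy two-block MLP is split across two devices; activations are
# moved between devices in forward(). Falls back to CPU when fewer
# than two GPUs are available so the pattern still runs anywhere.
import torch
import torch.nn as nn

# Pick devices: cuda:0 / cuda:1 when present, otherwise CPU for both.
if torch.cuda.device_count() >= 2:
    dev0, dev1 = torch.device("cuda:0"), torch.device("cuda:1")
else:
    dev0 = dev1 = torch.device("cpu")

class TwoDeviceMLP(nn.Module):
    def __init__(self, d_in=16, d_hidden=32, d_out=8):
        super().__init__()
        # First half of the network lives on dev0, second half on dev1.
        self.block0 = nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU()).to(dev0)
        self.block1 = nn.Linear(d_hidden, d_out).to(dev1)

    def forward(self, x):
        h = self.block0(x.to(dev0))
        # Move the intermediate activations to the second device.
        return self.block1(h.to(dev1))

model = TwoDeviceMLP()
out = model(torch.randn(4, 16))
print(out.shape)  # torch.Size([4, 8])
```

This splits parameter memory roughly in half per GPU, but the GPUs run sequentially rather than concurrently, so it trades speed for capacity; whether the editing step itself can be sharded this way depends on how the editing code addresses layers.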
