Not enough space #146

Open
gruzilla opened this issue Feb 15, 2024 · 1 comment
@gruzilla

When sending large prompts, I get the following error:

ggml_allocr_alloc: not enough space in the buffer (needed 561259008, largest block available 102924288)
GGML_ASSERT: /private/var/folders/qj/w85147014pq9yp4k5j6g66tw0000gn/T/pip-install-5wl_01v9/llama-cpp-python_451859871b5d42ada35cd84d4620a385/vendor/llama.cpp/ggml-alloc.c:139: !"not enough space in the buffer"

Does this mean my machine does not have enough RAM (I can't verify this; there seems to be RAM left), or is there an option to give this process more memory (I can't find one)?

Thank you for your answer!
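For reference, the path in the assert points at llama-cpp-python's vendored llama.cpp, so the knobs that usually matter for prompt-processing buffers are the n_ctx and n_batch parameters on the Llama constructor rather than an explicit "give me more memory" option. A minimal sketch, assuming the llama-cpp-python API and a placeholder model path; this is not a confirmed fix for this specific assert:

    from llama_cpp import Llama

    # Placeholder model path; adjust to your local GGUF/GGML file.
    llm = Llama(
        model_path="./llama-2-7b-chat.Q4_K_M.gguf",
        n_ctx=4096,   # context window; must cover the full prompt plus generated tokens
        n_batch=512,  # prompt is evaluated in batches of this size, which sizes the compute buffer
    )

    # Send the long prompt and print the completion text.
    out = llm("...a long prompt...", max_tokens=128)
    print(out["choices"][0]["text"])

If the prompt is longer than the configured context window, the allocator can run out of buffer space even when plenty of system RAM is free.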

@Haui1112

Same issue here with the llama-2-7b-chat model.
