
Possible solution for Windows users - LLaMA not working. #450

Open
Gary6780 opened this issue May 20, 2023 · 4 comments

Comments

@Gary6780

Hello!

I had the issue that after the LLaMA installation, LLaMA didn't respond to any input. Alpaca worked fine. I found out that I had to copy the three files from C:\Users\USERNAME\dalai\llama\build\bin\Release to the directory C:\Users\USERNAME\dalai\llama\build\Release. I restarted the server and LLaMA worked.

In case someone runs into the same problem.
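For anyone who wants to script this workaround, here is a minimal sketch in Python. It assumes the default dalai install path under the current user's profile; the original comment does not name the three files, so the sketch simply copies every file in build\bin\Release up one level.

```python
# Minimal sketch of the workaround described above.
# Assumption: dalai lives at C:\Users\<you>\dalai (the default install path).
import shutil
from pathlib import Path

src = Path.home() / "dalai" / "llama" / "build" / "bin" / "Release"
dst = Path.home() / "dalai" / "llama" / "build" / "Release"

dst.mkdir(parents=True, exist_ok=True)
for f in src.iterdir():
    if f.is_file():
        shutil.copy2(f, dst / f.name)  # copy each built binary up one level
        print(f"copied {f.name} -> {dst}")
```

After copying, restart the dalai server and try the LLaMA prompt again.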

@neinja007

I have the completely opposite issue: Alpaca doesn't respond when I click "GO", and I haven't tried LLaMA on its own (I don't know how).

@mirek190

Stop using that ancient, dead project and switch to llama.cpp or koboldcpp.
Also, download the GGML versions of the models from https://huggingface.co/TheBloke
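If you go that route, a GGML file can be fetched with the huggingface_hub client; below is a hedged sketch. The repo_id and filename are illustrative assumptions only, so pick a real GGML repo and file from https://huggingface.co/TheBloke before running it.

```python
# Sketch: download one GGML model file from a TheBloke repo.
# repo_id and filename below are assumed examples, not verified names.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="TheBloke/LLaMa-7B-GGML",      # assumed example repo
    filename="llama-7b.ggmlv3.q4_0.bin",   # assumed example quantized file
)
print(f"Model downloaded to {model_path}")
```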

@junxian428

Mine works, but maybe because mine is the 7B model the outcome is not desirable. [image failed to upload]

@junxian428

[image]
