Local LLM could not connect #205

Open
shamin10 opened this issue Dec 19, 2023 · 5 comments

@shamin10

Hi, thank you for this wonderful code. I have downloaded the model from Hugging Face, but when I try to load it with prompt load, it fails. Can you please help me? I don't want to load the model through a Hugging Face API key.

@doberst
Contributor

doberst commented Dec 19, 2023

Thanks for the feedback. Sorry you have run into an issue. Which model are you trying to use?

@shamin10
Author

Thank you. I'm trying to use bling-sheared-llama-1.3b-0.1. I have downloaded this model to my PC, and I want to use:

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("llmware/bling-sheared-llama-1.3b-0.1")
model = AutoModelForCausalLM.from_pretrained("llmware/bling-sheared-llama-1.3b-0.1")

I changed the path to my C drive, but I'm getting an error. It seems like I have to have a Hugging Face API token?
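For reference, a minimal sketch of loading from a local folder instead of the Hub (the directory path below is hypothetical; the folder must contain the files downloaded from the model repo, i.e. `config.json`, the tokenizer files, and the weights). Passing a directory path plus `local_files_only=True` keeps `transformers` entirely on disk, so no API token comes into play:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Hypothetical local directory holding the downloaded repo files
# (config.json, tokenizer files, and the weight files).
local_path = "C:/models/bling-sheared-llama-1.3b-0.1"

# A directory path makes transformers read from disk; with
# local_files_only=True it will not contact the Hub at all.
tokenizer = AutoTokenizer.from_pretrained(local_path, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(local_path, local_files_only=True)
```

Note that a token is normally only required for gated or private repos, and this model repo is public, so a token prompt usually points to a wrong path or a partially downloaded folder.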

@philipkd
Contributor

philipkd commented Jan 9, 2024

Hmm, can you provide the error? I ran those three lines of code, and it seems to download the model fine.

@chair300
Contributor

Were you able to get the model running? Were you able to run the model outside the framework with the ollama command?

@JeremyBickel

JeremyBickel commented Mar 14, 2024

If I understand the OP correctly, then I want to know this, too. How do I load a model from a non-standard location on my local drive? It's a GGUF, and it's not in the huggingface cache system at all. load_model() seems to expect a huggingface model path.

Reference issue: #433
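Not an answer from the llmware side, but as a point of comparison: a GGUF file at an arbitrary path can be loaded directly with llama-cpp-python, which takes a plain filesystem path rather than a Hub identifier. This is a hedged sketch of a workaround, and the model path below is hypothetical:

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Hypothetical path to a GGUF file outside the Hugging Face cache.
llm = Llama(
    model_path="D:/models/bling-sheared-llama-1.3b-0.1.Q4_K_M.gguf",
    n_ctx=2048,  # context window size
)

# Simple completion call against the locally loaded GGUF.
output = llm("What is the capital of France?", max_tokens=64)
print(output["choices"][0]["text"])
```

Whether llmware's own load_model() can be pointed at such a path is exactly the open question in #433.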
