
Getting error while loading meta-llama/Llama-2-7b-chat-hf #3627

Answered by arnavgarg1
Shrijeeth asked this question in Q&A

Hi @Shrijeeth, are you able to check if your HUGGING_FACE_HUB_TOKEN environment variable is set?

This error usually happens when the token isn't set in the environment; a valid token is required to access the gated Llama-2 suite of models from Meta.

If it isn't set, you can either run

export HUGGING_FACE_HUB_TOKEN=<token>

in your shell/environment, or set it from Python before loading the model:

import os
os.environ["HUGGING_FACE_HUB_TOKEN"] = "<token>"  # replace <token> with your actual token

And then try to load your model.
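
For reference, here's a minimal sketch of the full flow, assuming you're loading the model directly with Hugging Face transformers (any library that downloads from the Hub via huggingface_hub picks up the same environment variable, so the same fix applies either way):

import os
os.environ["HUGGING_FACE_HUB_TOKEN"] = "<token>"  # placeholder; use your real token

from transformers import AutoModelForCausalLM, AutoTokenizer

# The token must be set before these calls, since they download gated files from the Hub
model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)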

Let me know if this fixes the issue!
