
Issue with HuggingFaceHubExperiment: Unable to Use Models Other Than gpt2 #110

shrijayan opened this issue Dec 11, 2023 · 6 comments

@shrijayan

⁉️ Discussion/Question

I'm running into an issue while using HuggingFaceHubExperiment to test models from Hugging Face. Specifically, only the gpt2 model appears to work in this experiment, while other Hugging Face models do not, and I would like to use other models available on the Hub.

I've tried inputting Model IDs from Hugging Face, and I've also explored the available models on https://huggingface.co/api/models. Unfortunately, none of the models, aside from gpt2, seem to be compatible with the HuggingFaceHubExperiment.

Is there a specific reason for this limitation, or am I overlooking something in the setup process? I would greatly appreciate any guidance or insights you could provide to help resolve this issue.
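
For reference, here is roughly the setup I am running. The parameter names follow the prompttools HuggingFaceHub example notebook as I remember it, so they may not match the current API exactly:

```python
# Rough sketch of my setup (parameter names assumed from the example notebook).
# HUGGINGFACEHUB_API_TOKEN is set in my environment.
from prompttools.experiment import HuggingFaceHubExperiment

models = ["gpt2", "gpt2-large"]          # only "gpt2" returns results for me
prompts = ["Who was the first president of the USA?"]
task = ["text-generation"]

experiment = HuggingFaceHubExperiment(models, prompts, task, temperature=[0.0, 1.0])
experiment.run()
experiment.visualize()
```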

@steventkrawczyk
Contributor

Thanks for bringing this issue up, I'm looking into it

steventkrawczyk self-assigned this Dec 11, 2023
@shrijayan
Author

> Thanks for bringing this issue up, I'm looking into it

Thanks for looking into it!

@shrijayan
Author

> Thanks for bringing this issue up, I'm looking into it

Is there any other way to use the rest of the Hugging Face models?

@shrijayan
Author

@steventkrawczyk I think it all comes down to the endpoint the experiment looks the models up from!

@steventkrawczyk
Contributor

Which model in particular are you trying to use? The Hugging Face Hub experiment uses the Inference API, so it will only support models that the API supports. Is there a model you can call through the Inference API but are running into problems with when experimenting in prompttools? Can you post your error here?
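
As a sanity check, independent of prompttools, you could try calling the Inference API for that model directly. Here is a rough sketch using huggingface_hub's InferenceClient; the exact call may differ depending on your huggingface_hub version:

```python
# Sketch: probe the Hugging Face Inference API directly for a given model.
# Assumes `huggingface_hub` is installed and HUGGINGFACEHUB_API_TOKEN is set.
import os
from huggingface_hub import InferenceClient

client = InferenceClient(token=os.environ["HUGGINGFACEHUB_API_TOKEN"])

# If this call errors out, the model likely isn't served by the Inference API,
# and the experiment would fail for the same reason.
print(client.text_generation("Who was the first president?", model="gpt2-large"))
```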

@shrijayan
Author

> Which model in particular are you trying to use? The Hugging Face Hub experiment uses the Inference API, so it will only support models that the API supports. Is there a model you can call through the Inference API but are running into problems with when experimenting in prompttools? Can you post your error here?

I have also searched https://huggingface.co/api/models, and the models listed there, including gpt2, do not meet my requirements. I am fairly certain that gpt2 is fetched from that endpoint, yet other models from the same list, such as gpt2-large, still do not work.
