60th example error with litellm LLM #702
yiouyou changed the title from "How to run with litellm + ollama" to "60th example error with litellm LLM" on Apr 30, 2024.
I would have to research this more and review the outlines code. The outlines integration in that notebook only works with Transformers-based models.
If the llama3 model from ollama is running on http://8.140.18.**:28275, the following code from the 60th example runs fine.
However, when I run the following code
it shows an error, caused by the line `tokenizer_or_pipe=llm.generator.llm.pipeline.tokenizer`:
How to fix this issue?
Thanks