feat: c4ai-command-r-v01 support #944
Comments
Running `docker run --rm --gpus all -p 3000:3000 -it ghcr.io/bentoml/openllm start CohereForAI/c4ai-command-r-v01 --backend vllm` currently fails with:

> ValueError: The checkpoint you are trying to load has a model type of `cohere`, which Transformers does not recognize. This may be due to a problem with the checkpoint or an outdated version of Transformers.

The same error occurs when installing from source, even though the vLLM version on the main branch is 0.4.0.
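That `ValueError` usually means the installed `transformers` predates Cohere support rather than a bad checkpoint. As a minimal sketch (the v4.39.0 threshold is an assumption; verify it against the transformers release notes), a version check like this distinguishes the two cases:

```python
# Sketch: decide whether an installed transformers version is new enough
# to recognize the `cohere` model type. The minimum version below is an
# assumption -- confirm against the transformers changelog.

MIN_COHERE_VERSION = "4.39.0"  # assumed first release shipping CohereConfig


def parse_version(version: str) -> tuple:
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split(".")[:3])


def supports_cohere(installed: str) -> bool:
    """True if `installed` should recognize model_type == "cohere"."""
    return parse_version(installed) >= parse_version(MIN_COHERE_VERSION)


print(supports_cohere("4.38.2"))  # older release: `cohere` unrecognized
print(supports_cohere("4.39.0"))  # new enough
```

If the check fails, upgrading `transformers` inside the image (or rebuilding it) should clear the error before vLLM versioning even comes into play.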
I think this should use the same prompting system; there is also
Feature request
Would be nice to have the ability to run Command-R (`CohereForAI/c4ai-command-r-v01`) using OpenLLM.

Motivation
No response
Other
The vLLM backend already supports Command-R as of v0.4.0: vllm-project/vllm#3330 (comment)
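For a local (non-docker) install, one way to get a compatible stack is to pin both packages at once; this is a sketch under the assumption that vLLM >= 0.4.0 and a Cohere-aware transformers release are the only missing pieces:

```shell
# Assumed minimum versions: vllm 0.4.0 (first release with Command-R
# support per vllm-project/vllm#3330) and a transformers release that
# recognizes model_type == "cohere".
pip install --upgrade "vllm>=0.4.0" "transformers>=4.39.0"

# Then retry the same start command from the report above:
openllm start CohereForAI/c4ai-command-r-v01 --backend vllm
```

Note that upgrading packages on the host does not affect the published `ghcr.io/bentoml/openllm` image; the image itself would need to be rebuilt with the newer pins.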