[CogVLM-chat-v1.1] LM weights are different from vicuna-7b-v1.5 #466
Comments
@antigone660 In text-only mode, the prompt template is different. Did you use the following prompt for your text-only query? (See CogVLM/basic_demo/cli_demo_hf.py, line 52 at b37f36b.)
In my case, text-only mode works well regardless of this issue.
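For reference, a minimal sketch of wrapping a text-only query in a chat template before tokenizing. The template string below (the standard Vicuna v1.5 chat format) is an assumption; the authoritative string is the one in cli_demo_hf.py itself.

```python
# Sketch: text-only prompting. The template assumes the demo follows the
# standard Vicuna v1.5 chat format; check cli_demo_hf.py line 52 for the
# authoritative string.
text_only_template = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's "
    "questions. USER: {} ASSISTANT:"
)

def build_text_only_query(question: str) -> str:
    # Wrap the raw question in the chat template before tokenizing;
    # skipping this wrapper is what degrades text-only generations.
    return text_only_template.format(question)

print(build_text_only_query("What does CogVLM stand for?"))
```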
@minostauros Thanks for your reply. I did not use the template before, and it works now :)
While CogVLM is trained, the LM weights are frozen.
From my observation, however, the LM weights of CogVLM differ from Vicuna's:
Vicuna: https://huggingface.co/lmsys/vicuna-7b-v1.5/tree/main
CogVLM: cogvlm-chat-v1.1 (both from HF and SAT)
Can I ask why, or what the proper source of the language model is?
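A minimal sketch of one way to check this, assuming both checkpoints load through transformers (this needs a lot of host memory). The CogVLM repo id "THUDM/cogvlm-chat-hf" is an assumption, as is the idea that some parameter names line up one-to-one; keys unique to CogVLM (e.g. its visual-expert weights) have no Vicuna counterpart and are skipped.

```python
# Sketch: compare overlapping language-model weights between Vicuna and
# CogVLM. Assumptions: "THUDM/cogvlm-chat-hf" is the intended checkpoint,
# and at least some state-dict keys match Vicuna's naming; unmatched or
# shape-mismatched keys are skipped rather than compared.
import torch
from transformers import AutoModelForCausalLM

vicuna = AutoModelForCausalLM.from_pretrained(
    "lmsys/vicuna-7b-v1.5", torch_dtype=torch.float16
)
cogvlm = AutoModelForCausalLM.from_pretrained(
    "THUDM/cogvlm-chat-hf", torch_dtype=torch.float16, trust_remote_code=True
)

v_sd, c_sd = vicuna.state_dict(), cogvlm.state_dict()
for name, v_param in v_sd.items():
    c_param = c_sd.get(name)
    if c_param is None or c_param.shape != v_param.shape:
        continue  # no comparable tensor on the CogVLM side
    max_diff = (c_param.float() - v_param.float()).abs().max().item()
    if max_diff > 0:
        print(f"{name}: max abs diff = {max_diff:.6g}")
```

If the shared keys print nonzero differences, the LM weights really do diverge from the released Vicuna checkpoint; if nothing prints, the apparent mismatch is more likely a naming or prompting artifact, as the comments above suggest.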