different inference result #453
Comments
You're correct! It seems like
Hi unslothai, thanks for fixing that! tinyllama-chat seems better now, but I found Qwen1.5-7B-Chat still doesn't work well. Here is the case, too:
Hi unslothai, I got different inference results when using Unsloth. I've tested qwen1.5-chat and tinyllama-chat and hit the same issue: generation with Unsloth always gives a worse result compared with transformers, and I don't know why. Here is my case:
https://colab.research.google.com/drive/1dxGKB-c3U8BYX-m2rQie8R12--0-JQMs?usp=sharing
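The notebook contents aren't visible here, so this is only a minimal sketch of how one might reproduce the comparison: load the same checkpoint through both transformers and Unsloth, generate with greedy decoding so sampling randomness can't explain the difference, and report where the two outputs first diverge. The model name, prompt, and 4-bit loading are assumptions, not taken from the issue's notebook.

```python
def first_divergence(a: str, b: str) -> int:
    """Return the index of the first differing character, or -1 if equal."""
    for i, (x, y) in enumerate(zip(a, b)):
        if x != y:
            return i
    return -1 if len(a) == len(b) else min(len(a), len(b))

def greedy_generate(model, tokenizer, prompt: str, max_new_tokens: int = 64) -> str:
    # Greedy decoding (do_sample=False) removes sampling randomness, so any
    # remaining difference points at the model/kernels, not the RNG.
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    # Requires a GPU plus the transformers and unsloth packages; the imports
    # live here so the helper functions above stay importable without them.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from unsloth import FastLanguageModel

    name = "Qwen/Qwen1.5-7B-Chat"  # assumed checkpoint, not confirmed by the issue
    prompt = "What is the capital of France?"

    hf_model = AutoModelForCausalLM.from_pretrained(name, device_map="auto")
    hf_tok = AutoTokenizer.from_pretrained(name)

    # Note: load_in_4bit quantizes the weights, which by itself changes the
    # numerics relative to the fp16 transformers baseline.
    us_model, us_tok = FastLanguageModel.from_pretrained(
        model_name=name, max_seq_length=2048, load_in_4bit=True
    )
    FastLanguageModel.for_inference(us_model)

    a = greedy_generate(hf_model, hf_tok, prompt)
    b = greedy_generate(us_model, us_tok, prompt)
    print("outputs diverge at char:", first_divergence(a, b))
```

If the two greedy outputs already differ at the first generated token, comparing the raw logits for that step usually narrows the cause down faster than eyeballing full completions.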