
[Bug] mlc_llm chat throws errors for model mlc-ai/Qwen1.5-1.8B-Chat-q4f16_1-MLC #2254

Closed
BodhiHu opened this issue Apr 30, 2024 · 1 comment

BodhiHu (Contributor) commented on Apr 30, 2024

🐛 Bug

Hello,

The HF://mlc-ai/Qwen1.5-1.8B-Chat-q4f16_1-MLC weights appear to be incomplete:

  • max_batch_size is missing from mlc-chat-config.json;
  • no tokenizer files are found under Qwen1.5-1.8B-Chat-q4f16_1-MLC/

These two omissions cause mlc_llm chat ... to throw errors. A quick local check is sketched below.
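The following is a minimal sketch (not from this issue) for inspecting a locally downloaded model folder for the two missing pieces. It assumes the folder path passed in, that max_batch_size may live either at the top level of mlc-chat-config.json or nested inside model_config, and that the tokenizer file names listed are the usual Hugging Face tokenizer artifacts; adjust these to your actual layout.

```python
import json
from pathlib import Path

def check_model_dir(model_dir: str) -> None:
    """Report whether an MLC model folder has max_batch_size and tokenizer files."""
    root = Path(model_dir)

    # 1. Does mlc-chat-config.json exist and mention max_batch_size anywhere?
    cfg_path = root / "mlc-chat-config.json"
    if not cfg_path.is_file():
        print("missing mlc-chat-config.json")
        return
    cfg = json.loads(cfg_path.read_text())
    # Crude check on the re-serialised config, since the key may sit at the
    # top level or inside a nested section such as model_config (assumption).
    if '"max_batch_size"' not in json.dumps(cfg):
        print("max_batch_size not found in mlc-chat-config.json")

    # 2. Are any of the typical tokenizer files present? (file names are assumptions)
    tokenizer_candidates = [
        "tokenizer.json", "tokenizer.model",
        "vocab.json", "merges.txt", "tokenizer_config.json",
    ]
    found = [name for name in tokenizer_candidates if (root / name).is_file()]
    if found:
        print("tokenizer files:", ", ".join(found))
    else:
        print("no tokenizer files found under", root)

check_model_dir("Qwen1.5-1.8B-Chat-q4f16_1-MLC")
```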

To Reproduce

Steps to reproduce the behavior:

  1. mlc_llm chat mlc-ai/Qwen1.5-1.8B-Chat-q4f16_1-MLC

Expected behavior

Environment

  • Platform (e.g. WebGPU/Vulkan/IOS/Android/CUDA): Mac M1, Metal
  • Operating system (e.g. Ubuntu/Windows/MacOS/...): MacOS
  • Device (e.g. iPhone 12 Pro, PC+RTX 3090, ...)
  • How you installed MLC-LLM (conda, source): yes
  • How you installed TVM-Unity (pip, source):
  • Python version (e.g. 3.10):
  • GPU driver version (if applicable):
  • CUDA/cuDNN version (if applicable):
  • TVM Unity Hash Tag (python -c "import tvm; print('\n'.join(f'{k}: {v}' for k, v in tvm.support.libinfo().items()))", applicable if you compile models):
  • Any other relevant information:

Additional context

BodhiHu added the bug (Confirmed bugs) label on Apr 30, 2024
mengshyu (Contributor) commented:

Hi @BodhiHu, I've updated the config file. Could you try again with the latest MLC LLM? Thanks.

https://huggingface.co/mlc-ai/Qwen1.5-1.8B-Chat-q4f16_1-MLC/commit/09f17e66c5fa19bb938898e23d7997ac06371abf

tqchen closed this as completed on Jun 7, 2024