Issues: mlc-ai/web-llm
[Announcement] Breaking changes regarding conversation template
#344
opened Mar 26, 2024 by
CharlieFRuan
Issues list
[Tracking] WebLLM: Frontend Compatibility Issues and CDN Delivery
#398
opened May 14, 2024 by
CharlieFRuan
In the Llama-2-7b-chat-hf-q4f32_1-1k model, the number of tokens in the prefill is 36 when inputting 'hello'.
#396
opened May 14, 2024 by
137591
Strange reply from Phi2-q4f32_1-1k model in running the Web-llm Chat Demo
#388
opened May 6, 2024 by
bennylam
Model output is scrambled in Safari Technology Preview, which has WebGPU support
#386
opened May 2, 2024 by
felladrin
Check failed: (!free_page_ids_.empty()) is false: The KV cache is full. No page can be allocated.
#385
opened Apr 29, 2024 by
pilosof
Error: Cannot find global function tvmjs.runtime.ArrayConcat
#381
opened Apr 23, 2024 by
DavidGOrtega
[MLC-LLM] Uncaught (in promise) LinkError: WebAssembly.instantiate(): Import #4 "env"
#373
opened Apr 21, 2024 by
DavidGOrtega
Error running the function calling example: Cannot find global function mlc.serve.BNFGrammarGetGrammarOfJSON
#367
opened Apr 18, 2024 by
m0o0scar
[Vue/Vite/Nuxt] Build Error: require is not defined in ES module scope, you can use import instead
#353
opened Apr 1, 2024 by
k2m5t2