ChatGLM-webui

A web UI for ChatGLM-6B, the chat model made by THUDM (chatglm-6b).

Features

  • Original chat mode like the chatglm-6b demo, but using Gradio's Chatbot component for a better user experience.
  • One-click install script (you still need to install Python yourself)
  • More parameters that can be freely adjusted
  • Convenient saving/loading of dialogue history and presets
  • Custom maximum context length
  • Save to Markdown
  • Command-line arguments to specify the model and calculation precision

Install

Requirements

Python 3.10

pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 --extra-index-url https://download.pytorch.org/whl/cu117
pip install --upgrade -r requirements.txt

or

bash install.sh

Run

python webui.py

Arguments

--model-path: specify the model path. If this argument is not set, the default is THUDM/chatglm-6b and Transformers will automatically download the model from Hugging Face.

--listen: launch Gradio with 0.0.0.0 as the server name, so the webui can respond to network requests from other machines

--port: the port the webui listens on

--share: create a public Gradio share link

--precision: calculation precision: fp32 (CPU only), fp16, int4 (CUDA GPU only), or int8 (CUDA GPU only); see the sketch at the end of this section for roughly how these map onto ChatGLM-6B's API

--cpu: run inference on the CPU

--path-prefix: URL root path. If this argument is not set, the default is /. A path prefix of /foo/bar makes ChatGLM-webui serve from http://$ip:$port/foo/bar/ instead of http://$ip:$port/.
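
For example, a single launch command combining several of the arguments above (the port value here is just an example):

python webui.py --model-path THUDM/chatglm-6b --precision int4 --listen --port 7860

For reference, here is a minimal sketch of roughly how the precision options map onto ChatGLM-6B's Transformers API. It is an illustration based on ChatGLM-6B's published usage examples, not the actual loading code of this webui:

from transformers import AutoModel, AutoTokenizer

MODEL_PATH = "THUDM/chatglm-6b"  # default model; overridden by --model-path

def load_model(precision: str = "fp16"):
    # ChatGLM-6B ships custom modeling code, so trust_remote_code is required.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH, trust_remote_code=True)
    model = AutoModel.from_pretrained(MODEL_PATH, trust_remote_code=True)
    if precision == "fp32":      # CPU only
        model = model.float()
    elif precision == "fp16":    # CUDA GPU
        model = model.half().cuda()
    elif precision == "int8":    # CUDA GPU, quantized (per ChatGLM-6B's quantize() helper)
        model = model.half().quantize(8).cuda()
    elif precision == "int4":    # CUDA GPU, quantized
        model = model.half().quantize(4).cuda()
    return tokenizer, model.eval()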