langchain-ChatGLM v0.1.14
imClumsyPanda released this 31 May 15:08 · 1379 commits to master since this release
What's Changed
New Features
- Added support for calling LLMs through the FastChat API, enabling support for more LLMs and noticeably improving LLM response speed by @glide-the
- Added an lru_cache function to speed up loading of local vector stores by @liunux4odoo in #496
- Added a "refresh knowledge base list" button to the Web UI, working around gradio's inability to automatically refresh the knowledge base option list after a page reload by @imClumsyPanda
- Added a streamlit UI by @liunux4odoo in #480
- Added a button to re-initialize the documents under uploadpath by @zhoutongqing in #418
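The lru_cache change above can be sketched roughly as follows. This is a minimal illustration of memoizing vector-store loads per path, assuming the real loader reads a FAISS index from disk; the function name, signature, and the dict stand-in for the index are hypothetical, not the project's actual API.

```python
from functools import lru_cache

load_count = 0  # counts how many times a store is actually "loaded" from disk

# Hypothetical loader: identical paths reuse the cached store object instead
# of re-reading the index, which is where the speedup comes from.
@lru_cache(maxsize=8)
def load_vector_store(path: str) -> dict:
    global load_count
    load_count += 1
    return {"path": path}  # stand-in for a loaded FAISS index

first = load_vector_store("kb/default")
second = load_vector_store("kb/default")  # served from the cache, no reload
print(load_count)  # 1
```

A second call with the same path returns the same cached object, so repeated queries against one knowledge base skip the disk I/O entirely.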
Bug Fixes
- Updated Dockerfile-cuda by @zhubao315 in #446
- Changed local knowledge file loading to read all files under a given directory, including files in its subdirectories by @DingJunyao in #471
- Fixed a bug in delete_docs in the api by @yihuaxiang in #499
- Fixed an issue where uploading a PDF file could trigger a "tmp.png file not found" error by @imClumsyPanda
- Fixed an issue where a missing local faiss.index after a file upload could cause the upload to the knowledge base to fail by @imClumsyPanda
New Contributors
- @zhubao315 made their first contribution in #446
- @liunux4odoo made their first contribution in #480
- @DingJunyao made their first contribution in #471
- @yihuaxiang made their first contribution in #499
- @zhoutongqing made their first contribution in #418
Full Changelog: v0.1.13...v0.1.14