Add chat memory #30

Open
2 tasks
sinedied opened this issue Apr 4, 2024 · 0 comments
Labels
backend, enhancement (New feature or request)

Comments

sinedied (Contributor) commented Apr 4, 2024

Currently, only the last chat message (the user's question) is fed to the chat model, even though the complete chat history is already sent to the server. We should use LangChain's Memory component to include the previous messages in the model invocation, with a limit on how many past messages are kept (see the sketch after the task list below).

Tasks

  • Add a BufferMemory limited to the last 5 messages to the existing chain in the /chat endpoint
  • Extract the 5-message window value into constants.ts
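A minimal sketch of how the two tasks above could fit together, assuming an handler that receives the full message list as `{ role, content }` objects. The names `CHAT_HISTORY_WINDOW_SIZE` and `chatHandler` are illustrative, not taken from the repository, and `ChatOpenAI` simply stands in for whatever chat model the existing chain already uses. Note also that plain `BufferMemory` in LangChain.js has no size limit, so the sketch uses `BufferWindowMemory` to get the "last N messages" behavior:

```ts
// constants.ts -- the constant name is an assumption; the issue only asks to extract the value
export const CHAT_HISTORY_WINDOW_SIZE = 5;
```

```ts
// chat.ts -- illustrative sketch, not the project's actual handler
import { ConversationChain } from "langchain/chains";
import { BufferWindowMemory } from "langchain/memory";
import { ChatMessageHistory } from "langchain/stores/message/in_memory";
import { AIMessage, HumanMessage } from "@langchain/core/messages";
import { ChatOpenAI } from "@langchain/openai";
import { CHAT_HISTORY_WINDOW_SIZE } from "./constants";

// Assumed shape of the messages the client already sends to the /chat endpoint.
type ChatMessage = { role: "user" | "assistant"; content: string };

export async function chatHandler(messages: ChatMessage[]): Promise<string> {
  // The last message carries the current question; everything before it is history.
  const history = messages.slice(0, -1);
  const question = messages[messages.length - 1].content;

  // Seed the memory with the history received from the client, keeping only a
  // limited window of past messages instead of the full conversation.
  const memory = new BufferWindowMemory({
    k: CHAT_HISTORY_WINDOW_SIZE,
    chatHistory: new ChatMessageHistory(
      history.map((m) =>
        m.role === "user" ? new HumanMessage(m.content) : new AIMessage(m.content),
      ),
    ),
  });

  // A plain ConversationChain stands in for the project's existing chain here;
  // the memory would be attached to that chain in the same way.
  const chain = new ConversationChain({
    llm: new ChatOpenAI({ temperature: 0 }),
    memory,
  });

  const result = await chain.invoke({ input: question });
  return result.response;
}
```

One caveat: in LangChain.js, `BufferWindowMemory`'s `k` counts human/AI exchanges rather than individual messages, so `k: 5` keeps the last 5 turns; the constant or the memory class may need adjusting if the intent is strictly 5 single messages.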
sinedied added the enhancement (New feature or request) and backend labels Apr 4, 2024