Support for Chat History? #8

Open

JacobGoldenArt opened this issue Jul 31, 2023 · 2 comments
Labels
enhancement New feature or request

Comments

@JacobGoldenArt

Is this your first time submitting a feature request?

  • I have searched the existing issues, and I could not find an existing issue for this feature
  • I am requesting a straightforward extension of existing functionality

Describe the feature

Thanks for the really nice tutorial. I was just wondering how we might add a chat history feature. I'm not sure whether this would be stored in Pinecone or maybe a Vercel database?

Describe alternatives you've considered

No response

Who will this benefit?

No response

Are you interested in contributing this feature?

No response

Anything else?

No response

@JacobGoldenArt added the enhancement label Jul 31, 2023
@zackproser
Contributor

Hi Jacob,

Thanks for the feedback and the great idea - I think this would certainly be helpful for others as well. There are also a number of ways to implement chat history via tools like LangChain and others - I'll discuss this with the team 👍

@athrael-soju

@JacobGoldenArt, as @zackproser suggests, LangChain has a pretty good toolkit for this, and it can also be done in a more 'raw' fashion.

If we keep the vector id from each upsert of our chat and insert a new chat record with that id into something like MongoDB Atlas (or any DB, really), then we can use the top-N matching rows returned from the DB to further reinforce RAG precision and provide chat history. A rough sketch of that flow is below.
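
A minimal sketch of that flow in TypeScript, assuming the current `@pinecone-database/pinecone` client and the official MongoDB Node.js driver. The index, database, and collection names, the env vars, and the `saveTurn`/`recallHistory` helpers are all placeholders, and embeddings are assumed to be computed elsewhere:

```ts
import { Pinecone } from "@pinecone-database/pinecone";
import { MongoClient } from "mongodb";

interface ChatMessage {
  _id: string; // the Pinecone vector id doubles as the Mongo document id
  role: "user" | "assistant";
  text: string;
  createdAt: Date;
}

const pinecone = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! });
const index = pinecone.index("chat-history");

// Recent versions of the Mongo driver connect lazily on first operation.
const mongo = new MongoClient(process.env.MONGODB_URI!);
const messages = mongo.db("chatbot").collection<ChatMessage>("messages");

// On every chat turn: upsert the embedding to Pinecone and store the full
// message in MongoDB under the same id.
async function saveTurn(
  id: string,
  role: ChatMessage["role"],
  text: string,
  embedding: number[]
) {
  await index.upsert([{ id, values: embedding }]);
  await messages.insertOne({ _id: id, role, text, createdAt: new Date() });
}

// At query time: find the top-N most similar past turns in Pinecone, then
// hydrate them from MongoDB to use as chat history / extra RAG context.
async function recallHistory(queryEmbedding: number[], topN = 5) {
  const { matches = [] } = await index.query({
    vector: queryEmbedding,
    topK: topN,
  });
  const ids = matches.map((m) => m.id);
  return messages.find({ _id: { $in: ids } }).toArray();
}
```

Reusing the Pinecone vector id as the Mongo `_id` keeps the two stores in sync without a separate mapping table.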

An alternative is to use combined memory classes with LangChain: https://js.langchain.com/docs/modules/memory/how_to/multiple_memory
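
For reference, this is roughly what the combined memory setup in that link looks like: a buffer of recent turns plus a rolling summary of older ones, merged into a single memory object. A minimal sketch assuming the JS `langchain` package as of mid-2023 (memory keys, model names, and the prompt are placeholders):

```ts
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationChain } from "langchain/chains";
import {
  BufferMemory,
  CombinedMemory,
  ConversationSummaryMemory,
} from "langchain/memory";
import { PromptTemplate } from "langchain/prompts";

// Recent turns kept verbatim...
const bufferMemory = new BufferMemory({
  memoryKey: "chat_history_lines",
  inputKey: "input",
});

// ...plus a running summary of everything older.
const summaryMemory = new ConversationSummaryMemory({
  llm: new ChatOpenAI({ modelName: "gpt-3.5-turbo", temperature: 0 }),
  inputKey: "input",
  memoryKey: "conversation_summary",
});

const memory = new CombinedMemory({
  memories: [bufferMemory, summaryMemory],
});

// The prompt must reference both memory keys.
const prompt = PromptTemplate.fromTemplate(
  `Summary of conversation:
{conversation_summary}
Current conversation:
{chat_history_lines}
Human: {input}
AI:`
);

const chain = new ConversationChain({
  llm: new ChatOpenAI({ temperature: 0.7 }),
  memory,
  prompt,
});

async function main() {
  const res = await chain.call({ input: "Hi! I'm Jacob." });
  console.log(res.response);
}

main();
```

This keeps the prompt bounded: old turns get compressed into the summary while the latest turns stay verbatim.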

The only setbacks are response time and cost for each call to the OpenAI API, which can grow if you include a large historical data set from your DB.
