LlamaIndex Chat

Create chat bots that know your data

LlamaIndex Chat Screen

Welcome to LlamaIndex Chat. You can create and share LLM chatbots that know your data (PDF or text documents).

Getting started with LlamaIndex Chat is a breeze. Visit https://chat.llamaindex.ai - a hosted version of LlamaIndex Chat that requires no user authentication and lets you start immediately.

🚀 Features

LlamaIndex Chat is an example chatbot application for LlamaIndexTS. You can:

  • Create bots using prompt engineering and share them with other users.
  • Modify the demo bots by using the UI or directly editing the ./app/bots/bot.data.ts file.
  • Integrate your data by uploading documents or generating new data sources.

⚡️ Quick start

Local Development

Requirement: Node.js 18

  • Clone the repository
git clone https://github.com/run-llama/chat-llamaindex
cd chat-llamaindex
  • Set the environment variables
cp .env.template .env.development.local

Edit environment variables in .env.development.local.
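At minimum, an OpenAI API key is typically required. A hedged example of setting it from the shell (`OPENAI_API_KEY` is an assumption here; check .env.template for the authoritative variable names used by this project):

```shell
# After `cp .env.template .env.development.local`, set your OpenAI key.
# (The value below is a placeholder, not a real credential.)
printf 'OPENAI_API_KEY=%s\n' 'sk-replace-with-your-key' >> .env.development.local
```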

  • Run the dev server
pnpm install
pnpm dev

🐳 Docker

You can use Docker for development and deployment of LlamaIndex Chat.

  • Clone the repository
git clone https://github.com/run-llama/chat-llamaindex
cd chat-llamaindex
  • Set the environment variables
cp .env.template .env.development.local

Edit environment variables in .env.development.local.

Building the Docker Image

docker build -t chat-llamaindex .

Running in a Docker Container

docker run -p 3000:3000 --env-file .env.development.local chat-llamaindex

Docker Compose

For those preferring Docker Compose, we've included a docker-compose.yml file. To run using Docker Compose:

docker compose up

Go to http://localhost:3000 in your web browser.

Note: By default, the Docker Compose setup maps the cache and datasources directories from your host machine to the Docker container, ensuring data persistence and accessibility between container restarts.
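If you prefer plain `docker run` but want the same persistence, the Compose volume mapping can be reproduced with host mounts. This is a sketch; the `/app/...` container paths are an assumption, so check docker-compose.yml for the exact paths:

```shell
# Mount cache/ and datasources/ from the host so indexes and documents
# survive container restarts (container paths assumed to be under /app).
docker run -p 3000:3000 --env-file .env.development.local \
  -v "$(pwd)/cache:/app/cache" \
  -v "$(pwd)/datasources:/app/datasources" \
  chat-llamaindex
```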

Vercel Deployment

Deploying to Vercel is simple; click the button below and follow the instructions:

Deploy with Vercel

If you're deploying to a Vercel Hobby account, change the serverless function running time to 10 seconds, as this is the limit for the free plan.

If you want to use the sharing functionality, you need to create a Vercel KV store and connect it to your project. Just follow the setup steps from the Vercel KV quickstart. No further configuration is necessary; the app automatically uses a connected KV store.

🔄 Sharing

LlamaIndex Chat supports sharing bots via URLs. Demo bots are read-only and can't be shared, but you can create new bots (or clone and modify a demo bot) and use the share functionality in the context menu. It creates a unique URL that you can share with others. Anyone opening the URL can directly use the shared bot.

📀 Data Sources

The app uses a ChatEngine for each bot, with a VectorStoreIndex attached. The cache folder in the root directory is used as storage for each VectorStoreIndex.

Each subfolder in the cache folder contains the data for one VectorStoreIndex. To set which VectorStoreIndex is used for a bot, use the subfolder's name as the datasource attribute in the bot's data.
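Concretely, the pairing between a datasource and its index storage might look like this (the folder name `my-docs` is a hypothetical example):

```shell
# A bot with datasource "my-docs" reads its VectorStoreIndex from
# cache/my-docs, which is generated from the files in datasources/my-docs.
ls datasources/my-docs   # source files (PDF, text)
ls cache/my-docs         # persisted VectorStoreIndex for that datasource
```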

Note: To pick up changes to the bots, you have to clear your browser's local storage; otherwise, the old bots are still used. To do so, open the developer tools, run localStorage.clear() in the console, and reload the page.

Generate Data Sources

To generate a new data source, create a new subfolder in the datasources directory and add the data files (e.g., PDFs). Then, create the `VectorStoreIndex` for the data source by running the following command:

pnpm run generate <datasource-name>

Where <datasource-name> is the name of the subfolder with your data files.
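For example, with a hypothetical datasource named `my-docs`:

```shell
# "my-docs" is an example name; use any folder name you like.
mkdir -p datasources/my-docs
# Copy your PDF or text files into datasources/my-docs, then:
pnpm run generate my-docs
```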

🙏 Thanks

Thanks go to @Yidadaa for his ChatGPT-Next-Web project, which was used as a starter template for this project.