diff --git a/README.md b/README.md
index 4897f42..09e7d44 100644
--- a/README.md
+++ b/README.md
@@ -1,17 +1,39 @@
-# 🤖🤖🤖 Multi AI Agent Systems using OpenAI's Assistants API
+# Multi AI Agent Systems using OpenAI's Assistants API (Experts.js)
-Experts.js is the easiest way to create [OpenAI's Assistants](https://platform.openai.com/docs/assistants/how-it-works) and link them together as Tools to create a Panel of Experts system with expanded memory and attention to detail.
+Experts.js is the easiest way to create and deploy [OpenAI's Assistants](https://platform.openai.com/docs/assistants/how-it-works) and link them together as Tools to create a Panel of Experts system with expanded memory and attention to detail.
## Overview
-...
+The new Assistants API from OpenAI sets a new industry standard, significantly advancing beyond the widely adopted Chat Completions API. It represents a major leap in the usability of AI agents and the way engineers interact with LLMs. Paired with the cutting-edge [GPT-4o](https://openai.com/index/hello-gpt-4o/) model, Assistants can now reference attached files & images as knowledge sources within a managed context window called a [Thread](#threads). Unlike [Custom GPTs](https://openai.com/index/introducing-gpts/), Assistants support instructions up to 256,000 characters, integrate with 128 tools, and utilize the innovative [Vector Store](https://platform.openai.com/docs/assistants/tools/file-search/vector-stores) API for efficient file search.
+
+Experts.js aims to simplify the usage of this new API by removing the complexity of managing [Run](https://platform.openai.com/docs/assistants/how-it-works/runs-and-run-steps) objects.
+
+```javascript
+const thread = await Thread.create();
+const assistant = await MyAssistant.create();
+const output = await assistant.ask("Say hello.", thread.id);
+console.log(output); // Hello
+```
+
+More importantly, Experts.js introduces Assistants as [Tools](#tools), enabling the creation of [Multi AI Agent Systems](https://twitter.com/AndrewYNg/status/1790769732146307308). Each Tool is an LLM-backed Assistant that can take on specialized roles or fulfill complex tasks on behalf of its parent [Assistant](#assistants) or Tool. This allows for complex orchestration workflows or the choreography of a series of tightly knit tasks. Shown here is an example of a company assistant with a product catalog tool, which itself has an LLM-backed tool to create OpenSearch queries.
+
+![Multi AI Agent Systems with OpenAI Assistants API](docs/images/panel-of-experts-company-apparel-after.webp)
## Installation
+Install via npm. Usage is simple; there are only three objects to import.
+```bash
+npm install experts
+```
+
+```javascript
+import { Assistant, Tool, Thread } from "experts";
+```
## Assistants
+The constructor of our [Assistant](https://platform.openai.com/docs/assistants/how-it-works/creating-assistants) facade object requires a name, description, and instructions. The third argument is a set of options which directly maps to all the request body options outlined in the [create assistant](https://platform.openai.com/docs/api-reference/assistants/createAssistant) documentation. All examples in Experts.js are written in ES6 classes for simplicity.
```javascript
class MyAssistant extends Assistant {
@@ -19,13 +41,67 @@ class MyAssistant extends Assistant {
const name = "My Assistant";
const description = "...";
const instructions = "..."
- super(name, description, instructions);
+ super(name, description, instructions, {
+ tools: [{ type: "file_search" }],
+ temperature: 0.1,
+ tool_resources: {
+ file_search: {
+ vector_store_ids: [process.env.VECTOR_STORE_ID],
+ },
+ },
+ });
}
}
-const thread = Thread.create();
const assistant = await MyAssistant.create();
-const output = assistant.ask("Hi, how are you?", thread.id);
+```
+
+The Experts.js create() function will:
+
+* Find or create your assistant by name.
+* Update the assistant's configuration to the latest. [(pending)](https://github.com/metaskills/experts/issues/2)
+
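The find-or-create behavior above can be sketched as follows. This is an illustrative stand-in only (a plain array mocks OpenAI's assistants list endpoint; `findOrCreateByName` and `remoteAssistants` are hypothetical names, not the library's internals):

```javascript
// Illustrative sketch of find-or-create by name. The `remoteAssistants`
// array stands in for OpenAI's assistants list endpoint; this is not the
// library's actual implementation.
const remoteAssistants = [];

async function findOrCreateByName(name, config) {
  const existing = remoteAssistants.find((a) => a.name === name);
  if (existing) return existing; // reuse the already-deployed assistant
  const created = { id: `asst_${remoteAssistants.length}`, name, ...config };
  remoteAssistants.push(created);
  return created;
}

const first = await findOrCreateByName("My Assistant", { temperature: 0.1 });
const second = await findOrCreateByName("My Assistant", { temperature: 0.1 });
console.log(first.id === second.id); // true
```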
+
+### Streaming & Events
+
+By default, Experts.js leverages the [Assistants Streaming Events](https://platform.openai.com/docs/api-reference/assistants-streaming/events). These allow your applications to receive text, image, and tool outputs via OpenAI's server-sent events. We leverage the [openai-node](https://github.com/openai/openai-node/blob/master/helpers.md) stream helpers and surface these events, and more, so you can be in control of all events in your agentic applications.
+
+
+```javascript
+const assistant = await MainAssistant.create();
+assistant.on("textDelta", (delta, _snapshot) => {
+  process.stdout.write(delta.value);
+});
+```
+
+All openai-node [streaming events](https://github.com/openai/openai-node/blob/master/helpers.md) are supported via our Assistant's `on()` function.
+
+* event
+* textDelta
+* textDone
+* imageFileDone
+* toolCallDelta
+* runStepDone
+* toolCallDone
+* end
+
+> [!IMPORTANT]
+> These events are not async/await friendly. If you need to do more async work, consider using our extensions to these events. They are called after the Run has completed, in the following order.
+
+* textDoneAsync
+* imageFileDoneAsync
+* runStepDoneAsync
+* toolCallDoneAsync
+* endAsync
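
To illustrate the ordering, here is a self-contained sketch built on Node's `EventEmitter` (this mirrors the behavior described above but is not the library's internals; `RunEmitter` and `emitAsync` are illustrative names):

```javascript
import { EventEmitter } from "node:events";

// Sketch only: a tiny emitter whose *Async handlers are awaited in sequence,
// mirroring how the "...Async" extensions fire after the Run has completed.
class RunEmitter extends EventEmitter {
  async emitAsync(event, ...args) {
    for (const handler of this.listeners(event)) {
      await handler(...args); // safe to do async work here
    }
  }
}

const order = [];
const run = new RunEmitter();
run.on("textDone", () => order.push("textDone")); // sync, fire-and-forget
run.on("textDoneAsync", async () => {
  await new Promise((resolve) => setTimeout(resolve, 10));
  order.push("textDoneAsync");
});

run.emit("textDone"); // during the run
await run.emitAsync("textDoneAsync"); // after the run completes
console.log(order.join(",")); // textDone,textDoneAsync
```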
+
+### Advanced Features
+
+If you want to lazily stand up additional resources when an assistant's `create()` function is called, implement the `beforeInit()` function in your class. This async method is called before the assistant is created.
+
+```javascript
+async beforeInit() {
+ await this.#createFileSearch();
+}
```
## Tools
@@ -34,14 +110,14 @@ const output = assistant.ask("Hi, how are you?", thread.id);
## Threads
-OpenAI's Assistants API introduces a new resource called [Threads](https://platform.openai.com/docs/assistants/how-it-works/managing-threads-and-messages) which messages & files are stored within. Essentially, threads are a managed context window or memory for your agent. Creating a new thread with Experts.js is as easy as:
+OpenAI's Assistants API introduces a new resource called [Threads](https://platform.openai.com/docs/assistants/how-it-works/managing-threads-and-messages) which messages & files are stored within. Essentially, threads are a managed context window (memory) for your agents. Creating a new thread with Experts.js is as easy as:
```javascript
const thread = await Thread.create();
console.log(thread.id); // thread_abc123
```
-You can also create a thread with messages to start a conversation. We support OpenAI's threads/create request body outlined in their [Threads API Reference](https://platform.openai.com/docs/api-reference/threads) documentation. For example:
+You can also create a thread with messages, files, or tool resources to start a conversation. We support OpenAI's thread create request body outlined in their [Threads API](https://platform.openai.com/docs/api-reference/threads) reference.
```javascript
const thread = await Thread.create({
@@ -56,11 +132,68 @@ console.log(output) // Ken Collins
### Thread Management & Locks
-By default, each [Tool](#tools) in Experts.js has its own thread & context. This avoids a potential [thread locking](https://platform.openai.com/docs/assistants/how-it-works/thread-locks) issue which happens if a [Tool](#tools) were to share an [Assistant's](#assistant) thread which would still be waiting for tool outputs to be submitted. The following diagram illustrates how Experts.js manages threads on your behalf:
+By default, each [Tool](#tools) in Experts.js has its own thread & context. This avoids a potential [thread locking](https://platform.openai.com/docs/assistants/how-it-works/thread-locks) issue which happens if a [Tool](#tools) were to share an [Assistant's](#assistants) thread still waiting for tool outputs to be submitted. The following diagram illustrates how Experts.js manages threads on your behalf to avoid this problem:
![Panel of Experts Thread Management](docs/images/panel-of-experts-thread-management.webp)
-All questions to your experts require a thread ID. For chat applications, the ID would be stored on the client. Such as a URL path parameter. With Expert.js, no other client-side IDs are needed. As each [Assistant](#assistants) calls an LLM backed [Tool](#tools), it will find or create a thread for that tool as needed. Experts.js stores this parent -> child thread relationship for you using OpenAI's [thread metadata](https://platform.openai.com/docs/api-reference/threads/modifyThread). An Experts.js [Tool](#tools) configured as a simple function via the `llm` false configuration will not create or use a thread.
+All questions to your experts require a thread ID. For chat applications, the ID would be stored on the client, such as in a URL path parameter. With Experts.js, no other client-side IDs are needed. As each [Assistant](#assistants) calls an LLM-backed [Tool](#tools), it will find or create a thread for that tool as needed. Experts.js stores this parent -> child thread relationship for you using OpenAI's [thread metadata](https://platform.openai.com/docs/api-reference/threads/modifyThread).
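
The find-or-create logic for a Tool's thread can be sketched as follows. This uses a mock in-memory Threads API; the helper names and metadata key are illustrative, not the library's actual code:

```javascript
// Mock in-memory stand-in for OpenAI's Threads API. Illustrative only.
const threads = new Map();
let nextId = 0;

function createThread() {
  const id = `thread_abc${nextId++}`;
  threads.set(id, { id, metadata: {} });
  return threads.get(id);
}

// A Tool's thread ID is remembered in the parent thread's metadata,
// keyed per tool, so each Tool keeps its own isolated context window.
function findOrCreateToolThread(parentThreadId, toolName) {
  const parent = threads.get(parentThreadId);
  let childId = parent.metadata[toolName];
  if (!childId) {
    childId = createThread().id;
    parent.metadata[toolName] = childId; // persisted via modifyThread in practice
  }
  return childId;
}

const parent = createThread();
const first = findOrCreateToolThread(parent.id, "ProductsTool");
const second = findOrCreateToolThread(parent.id, "ProductsTool");
console.log(first === second); // true — the tool reuses its own thread
```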
+
+## Examples
+
+### Product Catalog
+
+![Multi AI Agent Systems with OpenAI Assistants API](docs/images/panel-of-experts-company-apparel-before.webp)
+
+
+
+### Streaming From Express
+
+Basic example using the `textDelta` event to stream responses from an Express route.
+
+```javascript
+import express from "express";
+import { MainAssistant } from "../experts/main.js";
+
+const assistant = await MainAssistant.create();
+const messagesRouter = express.Router();
+
+messagesRouter.post("", async (req, res, next) => {
+ res.setHeader("Content-Type", "text/plain");
+ res.setHeader("Transfer-Encoding", "chunked");
+ assistant.on("textDelta", (delta, _snapshot) => {
+ res.write(delta.value);
+ });
+ await assistant.ask(req.body.message.content, req.body.threadID);
+ res.end();
+});
+```
+
+### Vector Store
+
+```javascript
+class VectorSearchAssistant extends Assistant {
+  constructor() {
+    const name = "Vector Search Assistant";
+    const description = "...";
+    const instructions = "...";
+    super(name, description, instructions, {
+      tools: [{ type: "file_search" }],
+      temperature: 0.1,
+      tool_resources: {
+        file_search: {
+          vector_store_ids: [process.env.VECTOR_STORE_ID],
+        },
+      },
+    });
+  }
+}
+
+const assistant = await VectorSearchAssistant.create();
+```
+
+
+
## TODO
@@ -104,3 +237,11 @@ Now you can run the following commands:
./bin/setup
./bin/test
```
+
+## Scratch
+
+```javascript
+const thread = await Thread.create();
+const assistant = await MyAssistant.create();
+const output = await assistant.ask("Hi, how are you?", thread.id);
+```
diff --git a/docs/images/panel-of-experts-company-apparel-after.webp b/docs/images/panel-of-experts-company-apparel-after.webp
new file mode 100644
index 0000000..e5c755c
Binary files /dev/null and b/docs/images/panel-of-experts-company-apparel-after.webp differ
diff --git a/docs/images/panel-of-experts-company-apparel-before.webp b/docs/images/panel-of-experts-company-apparel-before.webp
new file mode 100644
index 0000000..ed9368b
Binary files /dev/null and b/docs/images/panel-of-experts-company-apparel-before.webp differ
diff --git a/docs/images/panel-of-experts-thread-management.webp b/docs/images/panel-of-experts-thread-management.webp
new file mode 100644
index 0000000..e1c5054
Binary files /dev/null and b/docs/images/panel-of-experts-thread-management.webp differ