error when adding message to thread #331

Open
ZhiminHeGit opened this issue May 5, 2024 · 4 comments
@ZhiminHeGit

I got the following error when running the request from API.md:

import requests

print(requests.post(
    'http://127.0.0.1:8100/threads/266f60d1-ac42-43cd-ab93-f54b7e714971/state',
    cookies={"opengpts_user_id": "foo"},
    json={
        "values": [{
            "content": "hi! my name is bob",
            "type": "human",
        }]
    },
).content)

...
opengpts-backend | File "/backend/app/storage.py", line 122, in update_thread_state
opengpts-backend | await agent.aupdate_state(
opengpts-backend | File "/usr/local/lib/python3.11/site-packages/langgraph/pregel/__init__.py", line 479, in aupdate_state
opengpts-backend | raise InvalidUpdateError("Ambiguous update, specify as_node")

@Gitmaxd

Gitmaxd commented May 7, 2024

Same issue

Additionally, the issue is not present when checking the thread state.

Checking thread state...
Thread state: { values: [], next: [] }
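For anyone comparing against their own setup, the empty-state body above can be checked like this. A minimal offline sketch: the JSON string is taken verbatim from the log above, while the endpoint path (`GET /threads/{thread_id}/state`) and cookie name are assumptions based on this thread:

```python
import json

# Body returned by GET /threads/{thread_id}/state for a fresh thread,
# as shown in the log above. Fetching it would look roughly like:
#   requests.get(f"http://127.0.0.1:8100/threads/{thread_id}/state",
#                cookies={"user_id": "foo"})
body = '{"values": [], "next": []}'

state = json.loads(body)
# A fresh thread has no messages and no pending node to run.
is_fresh = not state["values"] and not state["next"]
print(is_fresh)
```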

Adding prompt to the thread...
Error: Request failed with status code 500
Response data: Internal Server Error
Response status: 500
Response headers: Object [AxiosHeaders] {
'content-length': '21',
'content-type': 'text/plain; charset=utf-8',
date: 'Tue, 07 May 2024 22:21:06 GMT',
server: 'uvicorn'
}

On the endpoint:

INFO: 10.0.1.4:52084 - "POST /threads/2949af24-9653-404e-8d2d-5210e8c0ffff/state HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 292, in __call__
await super().__call__(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 122, in __call__
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 184, in __call__
raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 162, in __call__
await self.app(scope, receive, _send)
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
await self.app(scope, receive, sender)
File "/usr/local/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
raise e
File "/usr/local/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
await self.app(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 718, in __call__
await route.handle(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 276, in handle
await self.app(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 66, in app
response = await func(request)
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 273, in app
raw_response = await run_endpoint_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 190, in run_endpoint_function
return await dependant.call(**values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/backend/app/api/threads.py", line 64, in add_thread_state
return await storage.update_thread_state(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/backend/app/storage.py", line 122, in update_thread_state
await agent.aupdate_state(
File "/usr/local/lib/python3.11/site-packages/langgraph/pregel/__init__.py", line 479, in aupdate_state
raise InvalidUpdateError("Ambiguous update, specify as_node")
langgraph.channels.base.InvalidUpdateError: Ambiguous update, specify as_node
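For context, the error originates in LangGraph's `aupdate_state`: when it cannot infer which graph node an update should be attributed to (for example, on a fresh thread with no prior checkpoint), it requires an explicit `as_node` argument. A toy sketch of that check follows; this is hypothetical illustration, not LangGraph's actual code:

```python
# Toy illustration (NOT LangGraph's real implementation) of why the
# update is "ambiguous": with zero or several candidate writer nodes,
# the framework cannot pick one, so it demands an explicit as_node.

class InvalidUpdateError(Exception):
    pass

def update_state(candidate_nodes, values, as_node=None):
    """Attribute `values` to a node, mimicking the ambiguity check."""
    if as_node is None:
        if len(candidate_nodes) != 1:
            raise InvalidUpdateError("Ambiguous update, specify as_node")
        as_node = candidate_nodes[0]
    return (as_node, values)

# Exactly one candidate node: the update is unambiguous.
print(update_state(["agent"], [{"type": "human", "content": "hi"}]))

# A fresh thread has no candidates, so the call fails without as_node.
try:
    update_state([], [{"type": "human", "content": "hi"}])
except InvalidUpdateError as e:
    print(e)
```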

@ptgoetz
Collaborator

ptgoetz commented May 10, 2024

Confirming that this is indeed a valid issue.

The OpenGPTs API has changed, and existing scripts/programs that use it might break.

The payloads the API expects are largely just untyped dict objects. That means any existing scripts, or clients generated from the OpenAPI spec, could break at any time. I think we badly need a more strongly typed API.

In the short term, I'll work on updating API.md so it better reflects what the backend actually expects.

Longer term, I feel the OpenGPTs community needs to decide upon and implement a more strongly-typed API. This is important since the untyped nature of the current API makes it inherently unstable.
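As a sketch of what "more strongly typed" could look like, here is a hypothetical request model built with stdlib dataclasses. The field names mirror the payloads seen in this thread, not any agreed schema, and a real implementation would more likely use Pydantic models in the FastAPI handlers:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    content: str
    type: str  # e.g. "human" or "ai"

@dataclass
class ThreadStateUpdate:
    values: list[Message] = field(default_factory=list)

def parse_update(payload: dict) -> ThreadStateUpdate:
    """Validate an untyped dict into a typed update, failing loudly."""
    try:
        values = [Message(content=m["content"], type=m["type"])
                  for m in payload["values"]]
    except (KeyError, TypeError) as exc:
        raise ValueError(f"malformed thread-state payload: {exc}") from exc
    return ThreadStateUpdate(values=values)

update = parse_update(
    {"values": [{"content": "hi! my name is bob", "type": "human"}]}
)
print(update.values[0].content)
```

With a model like this, a malformed payload fails at the boundary with a clear error instead of surfacing as a 500 deep inside the agent code.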

@ptgoetz
Collaborator

ptgoetz commented May 14, 2024

It looks like the recent changes that decouple agents from threads changed the API a bit. You now need to add messages to a run instead.

Here's a sample of how to use the API now:

import json
import requests

user_id = "foo"
cookies = {"user_id": user_id}

# Create a RAG-enabled assistant
resp = requests.post('http://127.0.0.1:8100/assistants', json={
  "name": "My Assistant",
  "config": {
      "configurable": {
        "type": "agent",
        "type==agent/agent_type": "GPT 3.5 Turbo",
        "type==agent/system_message": "You are a helpful assistant",
        "type==agent/tools": [
            # {"type": "wikipedia"},
            # enable RAG
            {"type": "retrieval"}
        ]
      }
  },
  "public": False
}, cookies=cookies).content

assistant = json.loads(resp)

# Create a thread
thread = json.loads(requests.post('http://127.0.0.1:8100/threads', cookies=cookies, json={
    "name": "My Thread",
    "assistant_id": assistant["assistant_id"]
}).content)

# Upload files for RAG
files = {
    'files': ("opengpt_blog_1.txt", open("opengpt_blog_1.txt", 'rb'), 'text/plain'),
}

config = {
    'configurable': {
        # RAG files can be attached to a thread or an assistant, but not both
        'thread_id': thread['thread_id'],
        # 'assistant_id': assistant['assistant_id'],
    }
}

data = {"config": json.dumps(config)}

response = requests.post('http://localhost:8100/ingest', files=files, cookies=cookies, data=data, headers={'accept': 'application/json'})

# Send a message to the thread
payload = {
  "input": [
    {
      "content": "Tell me about OpenGPTs.",
      "type": "human",
    }
  ],
  "thread_id": thread["thread_id"]
}

response = requests.post(
    "http://127.0.0.1:8100/runs/stream",
    cookies=cookies,
    json=payload,
)

print(response.content)

I'll try to get the API.md file updated ASAP. But for now this should give you some ammunition to fix your client apps.
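If your client needs the individual streamed messages rather than the raw response bytes, note that `/runs/stream` responds with a server-sent-event stream. A minimal offline parsing sketch; the sample events below are illustrative only, and the real event names and data shapes may differ:

```python
import json

# Illustrative text/event-stream body, NOT captured from a real run.
sample = (
    "event: data\n"
    'data: [{"content": "OpenGPTs is an open-source project.", "type": "ai"}]\n'
    "\n"
    "event: end\n"
    "data: \n"
    "\n"
)

messages = []
# SSE events are blank-line-separated blocks of "field: value" lines.
for block in sample.strip().split("\n\n"):
    fields = dict(
        line.split(": ", 1) for line in block.splitlines() if ": " in line
    )
    if fields.get("event") == "data":
        messages.extend(json.loads(fields["data"]))

print(messages[0]["content"])
```

With a real request you would pass `stream=True` to `requests.post` and feed the response lines into the same kind of loop instead of a fixed string.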

@Gitmaxd

Gitmaxd commented May 20, 2024

Thanks, I can confirm the example works.
