Assistants API import error #94

Open
anthonyrs06 opened this issue Mar 17, 2024 · 7 comments

Comments

@anthonyrs06

I keep getting this error when trying to run the assistants API example. I have already run pip install openai. Any tips?

ImportError: cannot import name 'MessageContentImageFile' from 'openai.types.beta.threads'

@Mitthat

Mitthat commented Mar 19, 2024

I am getting the same error; let me know if you figure this out.

@tioans

tioans commented Mar 21, 2024

@anthonyrs06 @Mitthat, it looks like an openai library issue. As a workaround, downgrade to version 1.6.1 and the issue disappears.
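
For reference, the exact pin would be something like this (assuming pip; adjust for poetry or conda as needed):

    pip install openai==1.6.1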

@anthonyrs06
Author

That did the trick!

I think the recipe needs to be updated for the latest library versions, in case anyone from Chainlit sees this.

@leonardonhesi

openai/openai-python@5429f69
release: 1.14.0 (openai/openai-python#1234)

See the reference docs for more information:
https://platform.openai.com/docs/api-reference/assistants-streaming
We've also improved some of the names for the types in the assistants beta. A non-exhaustive list:

  • CodeToolCall -> CodeInterpreterToolCall
  • MessageContentImageFile -> ImageFileContentBlock
  • MessageContentText -> TextContentBlock
  • ThreadMessage -> Message
  • ThreadMessageDeleted -> MessageDeleted
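
So for the cookbook recipe, the imports just need to follow those renames. A minimal compatibility sketch, assuming the recipe only needs the types involved in the error above (the aliases are illustrative, not part of the official recipe):

    # Sketch: keep the recipe's old names working on openai >= 1.14.0.
    try:
        # openai >= 1.14.0 (post-rename)
        from openai.types.beta.threads import (
            ImageFileContentBlock as MessageContentImageFile,
            TextContentBlock as MessageContentText,
            Message as ThreadMessage,
        )
    except ImportError:
        # openai <= 1.13.x (pre-rename), e.g. the 1.6.1 workaround above
        from openai.types.beta.threads import (
            MessageContentImageFile,
            MessageContentText,
            ThreadMessage,
        )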

@aliss77777

Thanks for catching this! Super helpful.

@hayescode

I assume the best approach to stream assistant steps into Chainlit objects is directly via the EventHandler. This works inside app.py, but I'd love to move it into its own module to clean things up. How can we pass the Chainlit context in that case?
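
Roughly what I mean, as a sketch (assuming openai >= 1.14.0 with the async streaming helpers and a recent Chainlit; ChainlitEventHandler, stream_run, and the ids are placeholder names, not cookbook code):

    from typing import Optional

    import chainlit as cl
    from openai import AsyncAssistantEventHandler, AsyncOpenAI
    from typing_extensions import override

    client = AsyncOpenAI()

    class ChainlitEventHandler(AsyncAssistantEventHandler):
        """Streams assistant text deltas into a single Chainlit message."""

        def __init__(self) -> None:
            super().__init__()
            self.msg: Optional[cl.Message] = None

        @override
        async def on_text_created(self, text) -> None:
            # New assistant reply -> open a fresh Chainlit message.
            self.msg = cl.Message(content="")
            await self.msg.send()

        @override
        async def on_text_delta(self, delta, snapshot) -> None:
            # Push each delta straight into the open message.
            if self.msg is not None:
                await self.msg.stream_token(delta.value or "")

        @override
        async def on_end(self) -> None:
            if self.msg is not None:
                await self.msg.update()

    async def stream_run(thread_id: str, assistant_id: str) -> None:
        """Call from an @cl.on_message handler; the ids come from your session state."""
        async with client.beta.threads.runs.stream(
            thread_id=thread_id,
            assistant_id=assistant_id,
            event_handler=ChainlitEventHandler(),
        ) as stream:
            await stream.until_done()

Since the handler is created and awaited inside the @cl.on_message coroutine, the Chainlit context (a contextvar) should still resolve even when the class lives in another module, but I haven't verified that.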

For streaming input, like the code interpreter's code, there's no stream_token for a step's input, only for its output. Doing step.input += token and then update() on every token is inefficient.

One design struggle I'm curious whether anyone has solved is submitting tool outputs for custom functions, since that starts a new stream.
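
The best I've come up with so far (untested) is to answer the requires_action event from inside the handler and give the follow-up stream a fresh handler instance. A sketch building on the ChainlitEventHandler above, reusing its imports and client, with run_custom_function as a placeholder for your own dispatch logic:

    def run_custom_function(name: str, arguments: str) -> str:
        """Placeholder: dispatch to your own function and return a string result."""
        raise NotImplementedError

    class ToolEventHandler(ChainlitEventHandler):
        """Extends the sketch above to submit tool outputs for custom functions."""

        @override
        async def on_event(self, event) -> None:
            if event.event != "thread.run.requires_action":
                return
            run = event.data
            tool_outputs = []
            for call in run.required_action.submit_tool_outputs.tool_calls:
                output = run_custom_function(call.function.name, call.function.arguments)
                tool_outputs.append({"tool_call_id": call.id, "output": output})
            # Submitting the outputs starts the second stream; a fresh handler keeps
            # the continuation's text flowing into Chainlit as well.
            async with client.beta.threads.runs.submit_tool_outputs_stream(
                thread_id=run.thread_id,
                run_id=run.id,
                tool_outputs=tool_outputs,
                event_handler=ToolEventHandler(),
            ) as stream:
                await stream.until_done()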

@scot

scot commented May 15, 2024

@hayescode I'm struggling with the same issue with custom functions...any luck on your side?
