Replies: 3 comments
-
Hi @vswraith, thanks for mentioning this issue.
-
The same lines should work. Alternatively, you can also pass them in the …
-
How do you do this? Please show me an example. I need to set `cache_prompt` to false from `initiate_chat` for Ollama.
-
Is it possible to add extra OpenAI parameters to connect to Azure endpoints?
E.g., our endpoint is deployed at
https://.azure-api.net/deployments/gpt-4/chat/completions?api-version=2023-05-01.
But by default, the code appends /openai to the path, so AutoGen hits
https://.azure-api.net/openai/deployments/gpt-4/chat/completions?api-version=2023-05-01, which does not work.
In Python, using the OpenAI library, there is a way to override this by setting …
Can we do something similar for AutoGen?
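For illustration, here is a minimal sketch of one way this might be configured, assuming a recent pyautogen where `config_list` entries accept a `base_url` field; the host name and API key below are placeholders, not values from this thread:

```python
# Hedged sketch, not a confirmed fix: pyautogen's documented config_list
# fields include "base_url", "api_type", and "api_version", which may let
# AutoGen target a custom Azure API Management host instead of the default
# "<resource>/openai/..." path. "https://example.azure-api.net" and
# "YOUR_KEY" are placeholder values.
config_list = [
    {
        "model": "gpt-4",
        "api_type": "azure",
        "api_key": "YOUR_KEY",
        "base_url": "https://example.azure-api.net",
        "api_version": "2023-05-01",
    }
]
```

For comparison, the plain OpenAI Python client (>= 1.0) accepts a `base_url` argument on construction, which is the override the question alludes to.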