What Model are you using?
Describe the bug
Instructor fails to work with OpenRouter when using Anthropic models.
```python
import instructor
from openai import AsyncOpenAI

base_url = 'https://openrouter.ai/api/v1'
client = instructor.from_openai(AsyncOpenAI(base_url=base_url, api_key=openrouter_api_key))

await client.chat.completions.create(**params, model='anthropic/claude-3-sonnet', extra_headers=headers, response_model=response_model)
```
The error we get is:

```
    ) -> BaseModel:
        message = completion.choices[0].message
        assert (
            len(message.tool_calls or []) == 1
        ), "Instructor does not support multiple tool calls, use List[Model] instead."
E   AssertionError: Instructor does not support multiple tool calls, use List[Model] instead.
```
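For context, the assertion that fires is in Instructor's response parser, which requires exactly one tool call per completion. A minimal stand-alone sketch of why a response with multiple tool calls trips it (the `SimpleNamespace` objects below are hypothetical stand-ins for the real completion object returned through OpenRouter):

```python
from types import SimpleNamespace

# Hypothetical stand-in for a completion where the model returned
# two tool calls instead of one.
message = SimpleNamespace(tool_calls=[object(), object()])
completion = SimpleNamespace(choices=[SimpleNamespace(message=message)])

def parse(completion):
    # Mirrors the check quoted in the traceback above.
    message = completion.choices[0].message
    assert (
        len(message.tool_calls or []) == 1
    ), "Instructor does not support multiple tool calls, use List[Model] instead."

try:
    parse(completion)
except AssertionError as exc:
    print(exc)
```

With exactly one element in `tool_calls`, `parse` returns without raising; with two, it reproduces the assertion message shown in the traceback.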
The same code as above works perfectly with 'openai/gpt-4o' (still via OpenRouter), and we get a response model as expected:

```python
await client.chat.completions.create(**params, model='openai/gpt-4o', extra_headers=headers, response_model=response_model)
```
If you try to use AsyncAnthropic instead of AsyncOpenAI, the error gets even weirder, with a long stack trace. This is the outermost error; it appears the request never reaches the proper endpoint:

```
    raise self._make_status_error_from_response(err.response) from None
E   anthropic.NotFoundError
```
Expected behavior
Allow Anthropic models to be used over OpenRouter, whether via the OpenAI client or the Anthropic client (the latter would be better, since the two APIs are not identical).
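A possible workaround in the meantime, assuming OpenRouter's OpenAI-compatible endpoint: switch Instructor to JSON mode so it does not rely on tool-call parsing at all (a sketch, untested against OpenRouter):

```python
import instructor
from openai import AsyncOpenAI

# Sketch of a possible workaround: Mode.JSON asks the model for raw JSON
# output instead of tool calls, bypassing the single-tool-call assertion.
client = instructor.from_openai(
    AsyncOpenAI(base_url='https://openrouter.ai/api/v1', api_key=openrouter_api_key),
    mode=instructor.Mode.JSON,
)
```

Whether JSON-mode extraction is as reliable as tool calling for Claude models would need to be verified case by case.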