
Logfire Integration #3444

Merged 9 commits into BerriAI:main on May 14, 2024
Conversation

@elisalimli (Contributor) commented May 4, 2024

Tasks

  • Add tests
  • Documentation

Fixes #3414

[screenshot attached]
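For readers skimming the thread, here is a hedged sketch of how a string-registered litellm callback like this one is typically enabled. The "logfire" callback name and the LOGFIRE_TOKEN environment variable are assumptions about the final integration, not details confirmed in this conversation.

```python
# Hedged usage sketch of enabling the new Logfire callback in litellm.
# The "logfire" callback string and the LOGFIRE_TOKEN variable are assumptions
# based on the integration's purpose, not details quoted from this PR's diff.
import os
import litellm

os.environ["LOGFIRE_TOKEN"] = "your-logfire-write-token"  # assumed env var name

litellm.success_callback = ["logfire"]  # log successful LLM calls
litellm.failure_callback = ["logfire"]  # log failed LLM calls

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello from litellm + Logfire"}],
)
print(response.choices[0].message.content)
```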


@elisalimli (Contributor, Author):

@krrishdholakia What if users set multiple callbacks for the failure_handler? Shouldn't these check statements in failure_handler be if instead of elif?
https://github.com/BerriAI/litellm/blob/main/litellm/utils.py#L2362

@krrishdholakia (Contributor):

It's inside a for-loop, so it would work either way; let me know if you hit any issues.

@elisalimli (Contributor, Author):

> It's inside a for-loop, so it would work either way; let me know if you hit any issues.

Oh yeah, please review this PR. I have changed the elifs to ifs; it should not be an issue, as you just said. Let me know if you want me to revert that change.
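For context, a minimal sketch of the dispatch pattern under discussion; the callback list and handler names are illustrative stand-ins, not litellm's actual code. It shows why, inside the loop, if and elif behave the same: each iteration inspects a single callback name.

```python
# Minimal sketch (not litellm's actual code) of the failure-callback dispatch:
# callbacks are iterated one name at a time, so `if` and `elif` produce the
# same result per iteration; `if` is simply the more defensive choice.
failure_callback = ["logfire", "sentry"]  # stand-in for litellm.failure_callback


def log_failure_to_logfire(exc):  # hypothetical handler
    print(f"[logfire] {exc}")


def log_failure_to_sentry(exc):  # hypothetical handler
    print(f"[sentry] {exc}")


def failure_handler(exc):
    for callback in failure_callback:
        # Each iteration looks at exactly one callback name, so at most one
        # branch can match; across the whole loop every configured integration
        # still fires, whether the checks are `if` or `elif`.
        if callback == "logfire":
            log_failure_to_logfire(exc)
        if callback == "sentry":
            log_failure_to_sentry(exc)


failure_handler(RuntimeError("simulated provider error"))
```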

@krrishdholakia (Contributor):

@elisalimli can you add a screenshot of this passing your testing?

@elisalimli (Contributor, Author):

> @elisalimli can you add a screenshot of this passing your testing?

pytest ./tests/test_logfire.py

[screenshot of the passing test run attached]

litellm/utils.py (review comment on an outdated diff):

    # this only logs streaming once, complete_streaming_response exists i.e when stream ends
    if self.stream:
        if "complete_streaming_response" not in kwargs:
            break
@krrishdholakia (Contributor):

Can we use 'continue' instead of 'break' here, so other logging integrations aren't impacted? @elisalimli

@elisalimli (Contributor, Author):

Good catch! I realized that we were using the break statement not only for the Logfire callback but also for the others, so I pushed a fix in ed7c9e4.
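To make the break-versus-continue point concrete, a minimal sketch with made-up callback names rather than the actual litellm loop: break abandons the whole dispatch loop while a stream is still in flight, skipping every remaining integration, whereas continue only skips the current one.

```python
# Minimal sketch (made-up callback names, not the actual litellm loop) of why
# `continue` is preferred over `break` inside the per-callback logging loop.
success_callbacks = ["logfire", "langfuse", "helicone"]  # illustrative names


def log_success(kwargs, stream=True):
    for callback in success_callbacks:
        if callback == "logfire":
            if stream and "complete_streaming_response" not in kwargs:
                # `break` here would stop the whole loop and silently skip the
                # remaining integrations; `continue` only defers logfire until
                # the stream has completed.
                continue
            print("logfire: logging complete streaming response")
        elif callback == "langfuse":
            print("langfuse: logging event")
        elif callback == "helicone":
            print("helicone: logging event")


# Mid-stream call: logfire is skipped for now, the other integrations still log.
log_success(kwargs={})
```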

litellm/utils.py (review comment on an outdated diff):

    @@ -2355,7 +2381,9 @@ def _failure_handler_helper_fn(
        def failure_handler(
            self, exception, traceback_exception, start_time=None, end_time=None
        ):
    -       print_verbose(f"Logging Details LiteLLM-Failure Call")
    +       print_verbose(
    +           f"eLogging Details LiteLLM-Failure Call: {litellm.failure_callback}"
@krrishdholakia (Contributor):

Cleanup: change 'eLogging' to 'Logging'.

@krrishdholakia (Contributor):

Thanks @elisalimli

LGTM - just some cleanup, and we should be good to merge. Appreciate you adding the testing for this

@krrishdholakia merged commit e1d6536 into BerriAI:main on May 14, 2024. 1 check passed.
@krrishdholakia (Contributor):

Thanks @elisalimli

Successfully merging this pull request may close these issues.

[Feature]: Add support for Pydantic Logfire observability
2 participants