
feat(llmobs): only patch llm integrations if enabled #9369

Closed

wants to merge 7 commits into main from yunkim/llmobs-patch-all

Conversation


@Yun-Kim Yun-Kim commented May 23, 2024

This PR does two things:

  • Changes the LLMObs.enable() integrations argument (a list of integrations to enable) to integrations_enabled, a boolean flag controlling whether the LLM integrations should be patched.
  • Restructures the openai/langchain/bedrock test suites to move LLMObs tests into separate files (e.g. test_bedrock_llmobs.py), which will help maintainability. No test functionality is changed other than reflecting the above change.

Previously, users could call LLMObs.enable(integrations=[...]) to select which LLM integrations to patch, but this duplicates functionality that ddtrace already provides. Users should use ddtrace.patch_all()/ddtrace.patch() instead.
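A minimal sketch of the usage change, assuming the argument names described above (the integrations_enabled kwarg comes from this PR's description; the default behavior shown in the comments is an assumption, not documented API):

```python
from ddtrace import patch
from ddtrace.llmobs import LLMObs

# Before this PR (old behavior): a list of integrations was passed directly.
# LLMObs.enable(integrations=["openai", "bedrock"])

# After this PR: a single boolean flag controls whether the LLM integrations
# are patched at all when LLMObs is enabled.
LLMObs.enable(integrations_enabled=True)

# Fine-grained selection of integrations now goes through ddtrace itself,
# e.g. patching only the openai integration:
patch(openai=True)
```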

Checklist

  • Change(s) are motivated and described in the PR description
  • Testing strategy is described if automated tests are not included in the PR
  • Risks are described (performance impact, potential for breakage, maintainability)
  • Change is maintainable (easy to change, telemetry, documentation)
  • Library release note guidelines are followed or label changelog/no-changelog is set
  • Documentation is included (in-code, generated user docs, public corp docs)
  • Backport labels are set (if applicable)
  • If this PR changes the public interface, I've notified @DataDog/apm-tees.

Reviewer Checklist

  • Title is accurate
  • All changes are related to the pull request's stated goal
  • Description motivates each change
  • Avoids breaking API changes
  • Testing strategy adequately addresses listed risks
  • Change is maintainable (easy to change, telemetry, documentation)
  • Release note makes sense to a user of the library
  • Author has acknowledged and discussed the performance implications of this PR as reported in the benchmarks PR comment
  • Backport labels are set in a manner that is consistent with the release branch maintenance policy

@Yun-Kim Yun-Kim added the changelog/no-changelog (a changelog entry is not required for this PR) and backport 2.9 labels May 23, 2024
@Yun-Kim Yun-Kim requested a review from a team as a code owner May 23, 2024 19:46
@Yun-Kim Yun-Kim enabled auto-merge (squash) May 23, 2024 19:47
pr-commenter bot commented May 23, 2024

Benchmarks

Benchmark execution time: 2024-05-27 21:04:58

Comparing candidate commit 040bf38 in PR branch yunkim/llmobs-patch-all with baseline commit 3de0cf5 in branch main.

Found 0 performance improvements and 0 performance regressions! Performance is the same for 209 metrics, 9 unstable metrics.

datadog-dd-trace-py-rkomorn bot commented May 23, 2024

Datadog Report

Branch report: yunkim/llmobs-patch-all
Commit report: dd34406
Test service: dd-trace-py

❌ 62 Failed (0 Known Flaky), 1876 Passed, 1290 Skipped, 33m 8.73s Total duration (20m 52.53s time saved)

❌ Failed Tests (62)

This report shows up to 5 failed tests.

  • test_llmobs_ai21_invoke - test_bedrock_llmobs.py
    Error: ('bedrock_client', <FixtureRequest for <Function test_llmobs_ai21_invoke>>)

  • test_llmobs_ai21_invoke - test_bedrock_llmobs.py
    Error: ('bedrock_client', <FixtureRequest for <Function test_llmobs_ai21_invoke>>)

  • test_llmobs_ai21_invoke - test_bedrock_llmobs.py
    Error: ('bedrock_client', <FixtureRequest for <Function test_llmobs_ai21_invoke>>)

  • test_llmobs_ai21_invoke - test_bedrock_llmobs.py
    Error: ('bedrock_client', <FixtureRequest for <Function test_llmobs_ai21_invoke>>)

  • test_llmobs_amazon_invoke - test_bedrock_llmobs.py

@Yun-Kim Yun-Kim requested review from a team as code owners May 24, 2024 22:27
@Yun-Kim Yun-Kim requested a review from brettlangdon May 24, 2024 22:27
@Yun-Kim Yun-Kim changed the title from "feat(llmobs): use patch_all() instead of patch()" to "feat(llmobs): only patch llm integrations if enabled" May 24, 2024
@Yun-Kim Yun-Kim force-pushed the yunkim/llmobs-patch-all branch 3 times, most recently from 41948d5 to 0d682ae on May 24, 2024 23:05

Yun-Kim commented May 28, 2024

Closing in favor of splitting this into smaller PRs (#9398, #9397)

@Yun-Kim Yun-Kim closed this May 28, 2024
auto-merge was automatically disabled May 28, 2024 14:49

