Add OpenTelemetry integration for OpenAI Agents #1286
Open: tconley1428 wants to merge 18 commits into main from openai/otel
Changes from all commits (18 commits):
- daf60db WIP
- 2785bd3 Add OpenTelemetry integration for OpenAI Agents
- 9a767c4 Remove opentelemetryv2 additions - moved to separate branch
- f79e320 Revert opentelemetry.py changes - no longer needed
- a455df8 Remove debug prints from OpenAI agents OTEL integration
- dde9237 Linting fixes
- b247152 Fix linting
- 242ddce Fix OpenAI Agents tracing to require explicit trace context for custo…
- 3e2a94f Merge branch 'main' into openai/otel
- 9a91ddd Cleanup
- a087b2d Ensure telemetry interceptor is added to replayer
- 54ead57 Merge branch 'main' into openai/otel
- 63911f9 Update character to work on windows
- 9d8f9c3 Merge branch 'main' into openai/otel
- a7238dc Add support for direct OpenTelemetry API calls in workflows
- a9ac2ab Merge branch 'main' into openai/otel
- c5dfee1 Fix linting errors in OpenAI agents OpenTelemetry integration
- 251a0e4 Fix test_sdk_trace_to_otel_span_parenting flaky test failure
New file (+106 lines):

```python
"""OpenTelemetry integration for OpenAI Agents in Temporal workflows.

This module provides utilities for properly exporting OpenAI agent telemetry
to OpenTelemetry endpoints from within Temporal workflows, handling workflow
replay semantics correctly.
"""

from opentelemetry.sdk.trace import ReadableSpan
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.sdk.trace.id_generator import IdGenerator
from opentelemetry.trace import INVALID_SPAN_ID, INVALID_TRACE_ID

from temporalio import workflow


class TemporalIdGenerator(IdGenerator):
    """OpenTelemetry ID generator that provides deterministic IDs for Temporal workflows.

    This generator ensures that span and trace IDs are deterministic when running
    within Temporal workflows by using the workflow's deterministic random source.
    This is crucial for maintaining consistency across workflow replays.
    """

    def __init__(self):
        """Initialize the ID generator with empty trace and span pools."""
        self.traces = []
        self.spans = []

    def generate_span_id(self) -> int:
        """Generate a deterministic span ID.

        Uses the workflow's deterministic random source when in a workflow context,
        otherwise falls back to system random.

        Returns:
            A 64-bit span ID that is guaranteed not to be INVALID_SPAN_ID.
        """
        if workflow.in_workflow():
            get_rand_bits = workflow.random().getrandbits
        else:
            import random

            get_rand_bits = random.getrandbits

        if len(self.spans) > 0:
            return self.spans.pop()

        span_id = get_rand_bits(64)
        while span_id == INVALID_SPAN_ID:
            span_id = get_rand_bits(64)
        return span_id

    def generate_trace_id(self) -> int:
        """Generate a deterministic trace ID.

        Uses the workflow's deterministic random source when in a workflow context,
        otherwise falls back to system random.

        Returns:
            A 128-bit trace ID that is guaranteed not to be INVALID_TRACE_ID.
        """
        if workflow.in_workflow():
            get_rand_bits = workflow.random().getrandbits
        else:
            import random

            get_rand_bits = random.getrandbits
        if len(self.traces) > 0:
            return self.traces.pop()

        trace_id = get_rand_bits(128)
        while trace_id == INVALID_TRACE_ID:
            trace_id = get_rand_bits(128)
        return trace_id


class TemporalSpanProcessor(SimpleSpanProcessor):
    """A span processor that handles Temporal workflow replay semantics.

    This processor ensures that spans are only exported when workflows actually
    complete, not during intermediate replays. This is crucial for maintaining
    correct telemetry data when using OpenAI agents within Temporal workflows.

    Example usage:
        from opentelemetry.sdk import trace as trace_sdk
        from opentelemetry.sdk.trace.export.in_memory_span_exporter import InMemorySpanExporter
        from temporalio.contrib.openai_agents._temporal_trace_provider import TemporalIdGenerator
        from temporalio.contrib.openai_agents._otel import TemporalSpanProcessor
        from openinference.instrumentation.openai_agents import OpenAIAgentsInstrumentor

        exporter = InMemorySpanExporter()
        provider = trace_sdk.TracerProvider(id_generator=TemporalIdGenerator())
        provider.add_span_processor(TemporalSpanProcessor(exporter))
        OpenAIAgentsInstrumentor().instrument(tracer_provider=provider)
    """

    def on_end(self, span: ReadableSpan) -> None:
        """Handle span end events, skipping export during workflow replay.

        Args:
            span: The span that has ended.
        """
        if workflow.in_workflow() and workflow.unsafe.is_replaying():
            # Skip exporting spans during workflow replay to avoid duplicate telemetry
            return
        super().on_end(span)
```
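The ID-generation pattern in this file is rejection sampling: draw random bits and re-draw until the result is not the all-zero invalid sentinel defined by the OpenTelemetry spec, using a deterministic random source so replays produce the same IDs. A minimal, self-contained sketch of that pattern (the `generate_id` helper and the hard-coded sentinels are illustrative stand-ins, not part of the PR's API):

```python
import random

# Per the OpenTelemetry spec, the all-zero span/trace IDs are invalid.
INVALID_SPAN_ID = 0
INVALID_TRACE_ID = 0


def generate_id(bits: int, invalid: int, rand=random) -> int:
    """Draw `bits` random bits, re-drawing until the value is not `invalid`.

    Passing a seeded random.Random as `rand` stands in for the workflow's
    deterministic random source: the same seed yields the same IDs on replay.
    """
    value = rand.getrandbits(bits)
    while value == invalid:
        value = rand.getrandbits(bits)
    return value


span_id = generate_id(64, INVALID_SPAN_ID)
trace_id = generate_id(128, INVALID_TRACE_ID)
```

With a seeded `random.Random`, two runs of `generate_id` produce identical IDs, which is the property the workflow-backed generator relies on across replays.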
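The replay-skip logic in `TemporalSpanProcessor.on_end` can be sketched without Temporal or OpenTelemetry installed. Here `ReplayAwareProcessor` and `FakeExporter` are hypothetical stand-ins, and the `is_replaying` callable plays the role of `workflow.unsafe.is_replaying()`; the point is only that spans ended during replay are dropped so each span is exported once:

```python
class FakeExporter:
    """Collects exported spans in memory (stand-in for a real exporter)."""

    def __init__(self):
        self.exported = []

    def export(self, span):
        self.exported.append(span)


class ReplayAwareProcessor:
    """Drops spans while a replay flag is set, exports them otherwise."""

    def __init__(self, exporter, is_replaying):
        self.exporter = exporter
        self.is_replaying = is_replaying  # stands in for workflow.unsafe.is_replaying()

    def on_end(self, span):
        if self.is_replaying():
            return  # skip export during replay to avoid duplicate telemetry
        self.exporter.export(span)


exporter = FakeExporter()
replaying = True
proc = ReplayAwareProcessor(exporter, lambda: replaying)
proc.on_end("span-from-replay")    # dropped: workflow is replaying
replaying = False
proc.on_end("span-from-live-run")  # exported: workflow is executing live
```

Combined with deterministic IDs, this means a span dropped during replay carries the same IDs as its original export, so downstream traces stay consistent.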