96 commits
e7bd8d9
Initial APM-only openai-java instrumentation with a unit test.
ygree Oct 24, 2025
6feb61e
Start llmobs system in tests and create llmobs span.
ygree Oct 25, 2025
d541adb
streamed request completion test
ygree Oct 29, 2025
c209bd0
README
ygree Oct 29, 2025
a82856d
Instrument sync streamed completion
ygree Nov 5, 2025
3d3def9
Mock Streamed Completion in Tests
ygree Nov 6, 2025
3542b9e
Add failing test for Async Completion
ygree Nov 6, 2025
c12cd0a
Async Single Completion. Simplify tests. Remove println. Cleanup.
ygree Nov 6, 2025
2af4e5f
Extract DDHttpResponseFor to intercept when the response is parsed in…
ygree Nov 6, 2025
cec33af
Instrument and test Async Stream Completion
ygree Nov 6, 2025
b2b4141
More tests "streamed request completion test with withRawResponse" an…
ygree Nov 6, 2025
b1b6db2
Fix muzzle check
ygree Nov 6, 2025
baa687d
Add openai APM tags, assert tags, todo tags
ygree Nov 7, 2025
b9ad963
Wrap HttpResponseFor instead of forcing parsing. Add TODOs
ygree Nov 8, 2025
5bf16f4
Set response model tag
ygree Nov 10, 2025
be20fa1
Set "openai.organization.name"
ygree Nov 10, 2025
dd230b7
Set ratelimit metrics
ygree Nov 10, 2025
c3f94b0
api_base
ygree Nov 10, 2025
ddf0e09
"openai.request.method" & "openai.request.endpoint"
ygree Nov 11, 2025
fb9fd04
createChatCompletion instrumentation
ygree Nov 11, 2025
a4358a8
Reorder tests single then stream
ygree Nov 11, 2025
99dedfe
Async Single Chat Completion
ygree Nov 11, 2025
d66a5ee
Async Streamed Chat Completion
ygree Nov 11, 2025
f8af3ae
Rename assertChatCompletionTrace
ygree Nov 11, 2025
846e945
Instrument Embeddings
ygree Nov 11, 2025
b14bc4b
ResponseService WIP
ygree Nov 12, 2025
e8daf08
ResponseService synch
ygree Nov 12, 2025
c8a9e6e
ResponseServiceAsyncInstrumentation
ygree Nov 12, 2025
9754cd8
Setup httpClient for tests WIP
ygree Nov 12, 2025
aeaae2a
Intercept Http req/resp with TestOpenAiHttpClient
ygree Nov 12, 2025
7e50d1d
Implement req/resp recorder and mock backend for tests
ygree Nov 13, 2025
8e85447
Add lockfile
ygree Nov 13, 2025
ea2f465
Minor changes to the records format
ygree Nov 13, 2025
1f3aa6c
OpenAiHttpClient for tests. Record only if the record doesn't exist. …
ygree Nov 13, 2025
0c28997
Rename TestOpenAiHttpClient
ygree Nov 13, 2025
6a41720
Do not dump a record if already exists
ygree Nov 13, 2025
6050515
Fix format
ygree Nov 14, 2025
c8a1e94
Fix linter errors
ygree Nov 14, 2025
20c88d7
Fix unused imports
ygree Nov 14, 2025
001ce0e
Fix format
ygree Nov 14, 2025
96f4353
Fix format
ygree Nov 14, 2025
46c2d3a
Fix format
ygree Nov 14, 2025
29ae26c
Merge branch 'master' into ygree/openai-java
ygree Nov 14, 2025
8de5108
Fix format
ygree Nov 14, 2025
fb74ed2
Remove unexisting helper class that failed the test
ygree Nov 14, 2025
9a2bc61
Extract response model.
ygree Nov 15, 2025
637f8e5
Extract response model.
ygree Nov 15, 2025
6bf8a77
Fix llmObsSpanName in LLMObsSpanMapper
ygree Nov 17, 2025
a204bab
LLMObsState -> LLMObsContext (internal-api) to be shared with auto-in…
ygree Nov 17, 2025
8a6a0d8
Experimental use of LLMObsContext in the openai-java completion instr…
ygree Nov 18, 2025
277bd12
Fix bug in the mapper to write input/output fields as maps
ygree Nov 18, 2025
8f2f83c
Add necessary tags to pass TestOpenAiLlmObs::test_completion
ygree Nov 18, 2025
135e42a
Fix assertion when expect a class but it's null
ygree Nov 18, 2025
60d2f06
Fix unit tests
ygree Nov 18, 2025
5cf2669
Fix format
ygree Nov 18, 2025
201e61d
Implement proper extractResponseModel
ygree Nov 19, 2025
6d11609
Fix format
ygree Nov 19, 2025
848d46a
Add note about instrumented code change
ygree Nov 19, 2025
85dbbf2
Enable tests for 0.45.0
ygree Nov 19, 2025
57212c6
:TestOpenAiLlmObs::test_chat_completion[java-test-ml-app-tcp-False] P…
ygree Nov 19, 2025
81295b9
TestOpenAiLlmObs::test_chat_completion[java-test-ml-app-tcp-True] PASSED
ygree Nov 19, 2025
ec29979
Fix format
ygree Nov 19, 2025
c26909b
Fix ChatCompletionServiceTest
ygree Nov 19, 2025
1850e79
Reorg/rename decorator functions
ygree Nov 20, 2025
d6110fc
TestOpenAiLlmObs::test_embedding[java-test-ml-app-tcp] PASSED
ygree Nov 20, 2025
f74d8b6
chat/completion tool call for openai-java <v3.0+
ygree Nov 22, 2025
f2d4428
Change the naming of HTTP records to keep the scanner quiet during a …
ygree Nov 22, 2025
08b0bab
Support openai-java v3.0.0+ only because of breaking changes in tool …
ygree Dec 8, 2025
f718f22
Rename to openai-java-3.0
ygree Dec 8, 2025
a384464
Merge branch 'master' into ygree/openai-java
ygree Dec 8, 2025
ffe043a
fix format
ygree Dec 8, 2025
6e654e5
Extract and assert chatCompletion toolCalls single and streamed
ygree Dec 9, 2025
c783318
Use ChatCompletionAccumulator to simplify streamed chat/response deco…
ygree Dec 9, 2025
f5a9bea
TestOpenAiLlmObs::test_responses_create_tool_call
ygree Dec 9, 2025
23c1a09
TestOpenAiLlmObs::test_responses_create
ygree Dec 10, 2025
ac39251
reasoning
ygree Dec 13, 2025
c25d3e2
Remove the unused record file and add SET_RECORD_FILE_ATTR_ON_READ to…
ygree Dec 15, 2025
c29ee56
Test both JSON and typed parameters because they are accessed differe…
ygree Dec 15, 2025
0f0b0a1
Split to modules b/o exceeded Muzzle.create limit
ygree Dec 16, 2025
55423a0
extract startSpan
ygree Dec 17, 2025
4282f65
rename Response wrappers to minimize confusion with ResponseService
ygree Dec 17, 2025
972b2ac
test_responses_create_tool_input WIP
ygree Dec 16, 2025
b726b1f
Responses create tool input support
ygree Dec 17, 2025
7476caa
withHttpResponse clean up
ygree Dec 18, 2025
854d7b0
Fix unused import
ygree Dec 18, 2025
2d127f2
Handle possible response parse errors
ygree Dec 19, 2025
8718f94
Clean up HttpResponse wrappers
ygree Dec 19, 2025
a4d0421
Properly close scope, finish span, and handle errors in HttpResponse …
ygree Dec 20, 2025
1ca2ae0
Refactor LLMObs getting current LLMObs parentSpanID
ygree Dec 20, 2025
f8d8ce5
Do not decorate LLMObs specific tags when it's disabled
ygree Dec 20, 2025
1154c16
Finish span in HttpResponseWrapper when closing, if the response was …
ygree Dec 20, 2025
6e5cf86
Clean up the unnecessary test subsystem for now because the unit test…
ygree Dec 22, 2025
b19cbb9
remove leftover
ygree Dec 22, 2025
b341414
Implement LLMObs "span.finished" metric
ygree Dec 23, 2025
9dedc4e
Merge branch 'master' into ygree/openai-java
ygree Dec 23, 2025
ff22fa8
update lockfile
ygree Dec 23, 2025
Original file line number Diff line number Diff line change
@@ -5,6 +5,7 @@
import datadog.trace.api.DDTraceId;
import datadog.trace.api.WellKnownTags;
import datadog.trace.api.llmobs.LLMObs;
import datadog.trace.api.llmobs.LLMObsContext;
import datadog.trace.api.llmobs.LLMObsSpan;
import datadog.trace.api.llmobs.LLMObsTags;
import datadog.trace.bootstrap.instrumentation.api.AgentSpan;
@@ -81,8 +82,8 @@ public DDLLMObsSpan(
this.span.setTag(LLMOBS_TAG_PREFIX + LLMObsTags.SESSION_ID, sessionId);
}

-    AgentSpanContext parent = LLMObsState.getLLMObsParentContext();
-    String parentSpanID = LLMObsState.ROOT_SPAN_ID;
+    AgentSpanContext parent = LLMObsContext.current();
+    String parentSpanID = LLMObsContext.ROOT_SPAN_ID;
if (null != parent) {
if (parent.getTraceId() != this.span.getTraceId()) {
LOGGER.error(
@@ -96,8 +97,7 @@ public DDLLMObsSpan(
}
}
this.span.setTag(LLMOBS_TAG_PREFIX + PARENT_ID_TAG_INTERNAL, parentSpanID);
-    this.scope = LLMObsState.attach();
-    LLMObsState.setLLMObsParentContext(this.span.context());
+    this.scope = LLMObsContext.attach(this.span.context());
}

@Override
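The change above collapses the old two-step `LLMObsState.attach()` + `setLLMObsParentContext(...)` into a single `LLMObsContext.attach(ctx)` that both installs the new parent and returns a scope restoring the previous one. A minimal sketch of that pattern, using hypothetical simplified types (`String` standing in for `AgentSpanContext`, and a placeholder `ROOT_SPAN_ID` value; the real `LLMObsContext` lives in `internal-api`):

```java
// Hedged sketch of the single-call attach pattern; all names are simplified
// stand-ins, not the real internal-api signatures.
final class LLMObsContextSketch {
  // Placeholder value; the real ROOT_SPAN_ID constant may differ.
  static final String ROOT_SPAN_ID = "undefined";
  private static final ThreadLocal<String> PARENT = new ThreadLocal<>();

  // Scope without a checked exception, so try-with-resources stays clean.
  interface Scope extends AutoCloseable {
    @Override
    void close();
  }

  /** Mirrors LLMObsContext.current(): parent for the next LLMObs span, or null at root. */
  static String current() {
    return PARENT.get();
  }

  /** One call replaces the old attach() + setLLMObsParentContext(ctx) pair. */
  static Scope attach(String newParent) {
    final String previous = PARENT.get();
    PARENT.set(newParent);
    // Closing the scope restores whatever parent was current before attach.
    return () -> {
      if (previous == null) {
        PARENT.remove();
      } else {
        PARENT.set(previous);
      }
    };
  }
}
```

Bundling the restore into the returned scope makes nesting safe by construction: each span's constructor attaches itself as parent, and closing scopes in LIFO order unwinds to the correct ancestor.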

This file was deleted.

@@ -108,7 +108,12 @@ public static ReferenceMatcher loadStaticMuzzleReferences(
}

/**
-   * @return Class names of helpers to inject into the user's classloader
+   * @return Class names of helpers to inject into the user's classloader.
+   *     <blockquote>
+   *     <p><b>NOTE:</b> The order matters. If the muzzle check fails with a NoClassDefFoundError
+   *     (as seen in build/reports/muzzle-*.txt), it may be because some helper classes depend on
+   *     each other. In this case, the order must be adjusted accordingly.
+   *     </blockquote>
*/
public String[] helperClassNames() {
return NO_HELPERS;
@@ -0,0 +1,25 @@
apply from: "$rootDir/gradle/java.gradle"
apply plugin: 'idea'

def minVer = '3.0.0'

muzzle {
pass {
group = "com.openai"
module = "openai-java"
versions = "[$minVer,)"
}
}

addTestSuiteForDir('latestDepTest', 'test')

dependencies {
compileOnly group: 'com.openai', name: 'openai-java', version: minVer
implementation project(':internal-api')

testImplementation group: 'com.openai', name: 'openai-java', version: minVer
latestDepTestImplementation group: 'com.openai', name: 'openai-java', version: '+'

testImplementation project(':dd-java-agent:instrumentation:okhttp:okhttp-3.0')
}

Large diffs are not rendered by default.

@@ -0,0 +1,157 @@
package datadog.trace.instrumentation.openai_java;

import static datadog.trace.instrumentation.openai_java.OpenAiDecorator.REQUEST_MODEL;
import static datadog.trace.instrumentation.openai_java.OpenAiDecorator.RESPONSE_MODEL;

import com.openai.helpers.ChatCompletionAccumulator;
import com.openai.models.chat.completions.ChatCompletion;
import com.openai.models.chat.completions.ChatCompletionChunk;
import com.openai.models.chat.completions.ChatCompletionCreateParams;
import com.openai.models.chat.completions.ChatCompletionMessage;
import com.openai.models.chat.completions.ChatCompletionMessageParam;
import com.openai.models.chat.completions.ChatCompletionMessageToolCall;
import datadog.trace.api.Config;
import datadog.trace.api.llmobs.LLMObs;
import datadog.trace.bootstrap.instrumentation.api.AgentSpan;
import datadog.trace.bootstrap.instrumentation.api.Tags;
import datadog.trace.bootstrap.instrumentation.api.UTF8BytesString;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.stream.Collectors;

public class ChatCompletionDecorator {
public static final ChatCompletionDecorator DECORATE = new ChatCompletionDecorator();
private static final CharSequence CHAT_COMPLETIONS_CREATE =
UTF8BytesString.create("createChatCompletion");

private final boolean llmObsEnabled = Config.get().isLlmObsEnabled();

public void withChatCompletionCreateParams(
AgentSpan span, ChatCompletionCreateParams params, boolean stream) {
span.setResourceName(CHAT_COMPLETIONS_CREATE);
span.setTag("openai.request.endpoint", "v1/chat/completions");
span.setTag("openai.request.method", "POST");
if (!llmObsEnabled) {
return;
}

span.setTag("_ml_obs_tag.span.kind", Tags.LLMOBS_LLM_SPAN_KIND);
if (params == null) {
return;
}
params.model()._value().asString().ifPresent(str -> span.setTag(REQUEST_MODEL, str));

span.setTag(
"_ml_obs_tag.input",
params.messages().stream()
.map(ChatCompletionDecorator::llmMessage)
.collect(Collectors.toList()));

Map<String, Object> metadata = new HashMap<>();
// maxTokens is deprecated, but the integration tests do not yet provide maxCompletionTokens
params.maxTokens().ifPresent(v -> metadata.put("max_tokens", v));
params.temperature().ifPresent(v -> metadata.put("temperature", v));
if (stream) {
metadata.put("stream", true);
}
params
.streamOptions()
.ifPresent(
v -> {
if (v.includeUsage().orElse(false)) {
metadata.put("stream_options", Collections.singletonMap("include_usage", true));
}
});
span.setTag("_ml_obs_tag.metadata", metadata);
}

private static LLMObs.LLMMessage llmMessage(ChatCompletionMessageParam m) {
String role = "unknown";
String content = null;
if (m.isAssistant()) {
role = "assistant";
content = m.asAssistant().content().map(v -> v.text().orElse(null)).orElse(null);
} else if (m.isDeveloper()) {
role = "developer";
content = m.asDeveloper().content().text().orElse(null);
} else if (m.isSystem()) {
role = "system";
content = m.asSystem().content().text().orElse(null);
} else if (m.isTool()) {
role = "tool";
content = m.asTool().content().text().orElse(null);
} else if (m.isUser()) {
role = "user";
content = m.asUser().content().text().orElse(null);
}
return LLMObs.LLMMessage.from(role, content);
}

public void withChatCompletion(AgentSpan span, ChatCompletion completion) {
if (!llmObsEnabled) {
return;
}
String modelName = completion.model();
span.setTag(RESPONSE_MODEL, modelName);
span.setTag("_ml_obs_tag.model_name", modelName);
span.setTag("_ml_obs_tag.model_provider", "openai");

List<LLMObs.LLMMessage> output =
completion.choices().stream()
.map(ChatCompletionDecorator::llmMessage)
.collect(Collectors.toList());
span.setTag("_ml_obs_tag.output", output);

completion
.usage()
.ifPresent(
usage -> {
span.setTag("_ml_obs_metric.input_tokens", usage.promptTokens());
span.setTag("_ml_obs_metric.output_tokens", usage.completionTokens());
span.setTag("_ml_obs_metric.total_tokens", usage.totalTokens());
});
}

private static LLMObs.LLMMessage llmMessage(ChatCompletion.Choice choice) {
ChatCompletionMessage msg = choice.message();
Optional<?> roleOpt = msg._role().asString();
String role = "unknown";
if (roleOpt.isPresent()) {
role = String.valueOf(roleOpt.get());
}
String content = msg.content().orElse(null);

Optional<List<ChatCompletionMessageToolCall>> toolCallsOpt = msg.toolCalls();
if (toolCallsOpt.isPresent() && !toolCallsOpt.get().isEmpty()) {
List<LLMObs.ToolCall> toolCalls = new ArrayList<>();
for (ChatCompletionMessageToolCall toolCall : toolCallsOpt.get()) {
LLMObs.ToolCall llmObsToolCall = ToolCallExtractor.getToolCall(toolCall);
if (llmObsToolCall != null) {
toolCalls.add(llmObsToolCall);
}
}

if (!toolCalls.isEmpty()) {
return LLMObs.LLMMessage.from(role, content, toolCalls);
}
}

return LLMObs.LLMMessage.from(role, content);
}

public void withChatCompletionChunks(AgentSpan span, List<ChatCompletionChunk> chunks) {
if (!llmObsEnabled) {
return;
}
ChatCompletionAccumulator accumulator = ChatCompletionAccumulator.create();
for (ChatCompletionChunk chunk : chunks) {
accumulator.accumulate(chunk);
}
ChatCompletion chatCompletion = accumulator.chatCompletion();
withChatCompletion(span, chatCompletion);
}
}
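`withChatCompletionChunks` relies on the SDK's `ChatCompletionAccumulator` to fold a streamed response back into a single `ChatCompletion` before tagging. A hedged sketch of the content-merging part of that idea, using a hypothetical simplified chunk representation (content deltas as nullable strings; the real accumulator also merges roles, tool calls, and usage):

```java
import java.util.List;

// Hedged sketch: streamed chat completions arrive as chunks carrying content
// deltas, and the decorator needs them folded into one final message. This is
// a simplified stand-in for ChatCompletionAccumulator, not its real API.
final class ChunkAccumulatorSketch {
  /** Concatenates non-null content deltas in arrival order. */
  static String accumulateContent(List<String> contentDeltas) {
    StringBuilder content = new StringBuilder();
    for (String delta : contentDeltas) {
      // Chunks without a content delta (e.g. role-only or usage-only chunks)
      // contribute nothing to the accumulated text.
      if (delta != null) {
        content.append(delta);
      }
    }
    return content.toString();
  }
}
```

Accumulating first and reusing `withChatCompletion` on the result keeps the streamed and non-streamed paths tagging spans identically, which is the design choice the commit history above converges on.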
@@ -0,0 +1,34 @@
package datadog.trace.instrumentation.openai_java;

import com.google.auto.service.AutoService;
import datadog.trace.agent.tooling.Instrumenter;
import datadog.trace.agent.tooling.InstrumenterModule;
import java.util.Arrays;
import java.util.List;

@AutoService(InstrumenterModule.class)
public class ChatCompletionModule extends InstrumenterModule.Tracing {
public ChatCompletionModule() {
super("openai-java");
}

@Override
public String[] helperClassNames() {
return new String[] {
packageName + ".ChatCompletionDecorator",
packageName + ".OpenAiDecorator",
packageName + ".HttpResponseWrapper",
packageName + ".HttpStreamResponseWrapper",
packageName + ".HttpStreamResponseStreamWrapper",
packageName + ".ToolCallExtractor",
packageName + ".ToolCallExtractor$1"
};
}

@Override
public List<Instrumenter> typeInstrumentations() {
return Arrays.asList(
new ChatCompletionServiceAsyncInstrumentation(),
new ChatCompletionServiceInstrumentation());
}
}