feat(instrumentation-aws-sdk): add gen ai conventions for converse stream span #2769
base: main
Conversation
if (override) {
  response.output = override;
  normalizedResponse.data = override;
}
self._callUserResponseHook(span, normalizedResponse);
I considered whether this should be done for the user hook too but didn't think there's enough use case for it. Currently the change is only internal since AFAIK, users can't define service extensions
Codecov Report
All modified and coverable lines are covered by tests ✅
Additional details and impacted files
@@ Coverage Diff @@
## main #2769 +/- ##
==========================================
+ Coverage 89.69% 89.71% +0.02%
==========================================
Files 184 184
Lines 8966 8988 +22
Branches 1835 1839 +4
==========================================
+ Hits 8042 8064 +22
Misses 924 924
Streaming is tricky, but you handled it concisely. Thanks!
Some initial review. I haven't looked carefully at the implementation yet.
// isStream - if true, then the response is a stream so the span should not be ended by the middleware.
// the ServiceExtension must end the span itself, generally by wrapping the stream and ending after it is
// consumed.
isStream?: boolean;
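For context, a minimal sketch of what "wrapping the stream and ending after it is consumed" could look like. wrapStreamAndEndSpan and onChunk are illustrative names, not the actual helpers in this PR; only the @opentelemetry/api Span interface is assumed.

import type { Span } from '@opentelemetry/api';

// Sketch only: yield every chunk through to the caller and end the span once
// the stream has been fully consumed (or has failed).
async function* wrapStreamAndEndSpan<T>(
  stream: AsyncIterable<T>,
  span: Span,
  onChunk?: (chunk: T) => void
) {
  try {
    for await (const chunk of stream) {
      onChunk?.(chunk); // e.g. accumulate token usage from metadata events
      yield chunk;
    }
  } finally {
    span.end(); // the span duration then covers the whole stream
  }
}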
Do you know if the GenAI SIG discussed/documented wanting this behaviour of ending the span after the full stream is consumed? I've seen opinions vary when discussing HTTP streaming. See the guidance for HTTP client span duration here: https://github.com/open-telemetry/semantic-conventions/blob/main/docs/http/http-spans.md#http-client-span-duration
Is there any equivalent in the Python GenAI-related instrumentations?
Thanks for the link, that's interesting indeed in terms of response streaming. There isn't an explicit guideline for gen ai spans, but I feel it's sort of implied by the conventions for token usage - there really isn't a way to populate them without keeping the span open until the end of the stream. Leaving the streaming out of the duration would require overriding the span's end time with an earlier timestamp, which I don't think the JS SDK supports.
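To make that concrete, here is a hedged sketch (not code from this PR) of recording token usage at the very end of the stream, assuming the GenAI semantic-convention attribute names and a usage object shaped like Bedrock's ConverseStream metadata event:

import type { Span } from '@opentelemetry/api';

// Sketch only: token usage typically arrives in the final metadata event of a
// streamed response, so it can only be recorded just before ending the span.
function recordUsageAndEndSpan(
  span: Span,
  usage: { inputTokens?: number; outputTokens?: number }
): void {
  if (usage.inputTokens !== undefined) {
    span.setAttribute('gen_ai.usage.input_tokens', usage.inputTokens);
  }
  if (usage.outputTokens !== undefined) {
    span.setAttribute('gen_ai.usage.output_tokens', usage.outputTokens);
  }
  span.end();
}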
Leaving the streaming out of the duration would require overriding the span's end time with an earlier timestamp, which I don't think the JS SDK supports.
Correct, it does not support that. The (documented) behaviour then would be that the Span duration would be just up until the first response from the server, effectively TTFB.
Ah do you mean that we should use TTFB here? That would mean we couldn't record usage information though.
BTW, I realized that this might be closer to RPC than HTTP, and the RPC conventions do have some specification for streaming:
https://opentelemetry.io/docs/specs/semconv/rpc/rpc-spans/#message-event
an event for each message sent/received on client and server spans SHOULD be created
I think this also implies the overall span is for the whole stream. FWIW python keeps the span for the entire stream too.
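As an illustration of that RPC convention (again not code from this PR), a per-message span event could be recorded roughly like this using the @opentelemetry/api Span interface:

import type { Span } from '@opentelemetry/api';

// Sketch only: one "message" event per streamed chunk, with the span itself
// left open until the stream completes. Attribute names follow the RPC
// semantic conventions linked above.
function recordReceivedMessage(span: Span, messageId: number): void {
  span.addEvent('message', {
    'rpc.message.type': 'RECEIVED',
    'rpc.message.id': messageId,
  });
}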
Ah do you mean that we should use TTFB here?
No, I did not mean to imply what this "should" do. I don't have a strong opinion one way or the other.
The HTTP span guidance from https://github.com/open-telemetry/semantic-conventions/blob/main/docs/http/http-spans.md#http-client-span-duration says:
Because of the potential for confusion around this, HTTP client library instrumentations SHOULD document their behavior around ending HTTP client spans.
I only meant to say that if it is decided to handle this stream by ending the span on the initial response, then that behaviour should be documented.
FWIW python keeps the span for the entire stream too.
Sounds good to me to have the intention for the JS instrumentation to be the same.
plugins/node/opentelemetry-instrumentation-aws-sdk/src/services/bedrock-runtime.ts (review thread resolved)
…y-js-contrib into bedrock-runtime-stream
…y-js-contrib into bedrock-runtime-stream
import { AwsInstrumentation } from '../src';

export const instrumentation = new AwsInstrumentation();
export const metricReader = initMeterProvider(instrumentation);
I realized this pattern seemed to not work for multiple tests. I did try passing DELTA temporality in the MetricReader constructor since I thought it would fix it, but it didn't. So I changed to a pattern inspired by what's used in Java.
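For reference, a sketch of what the DELTA-temporality attempt could look like with the public @opentelemetry/sdk-metrics API; initMeterProvider and the repository's actual test wiring are not shown and may differ:

import {
  AggregationTemporality,
  InMemoryMetricExporter,
  MeterProvider,
  PeriodicExportingMetricReader,
} from '@opentelemetry/sdk-metrics';

// Sketch only: an in-memory exporter configured for DELTA temporality, driven
// by a periodic reader. Recent SDK versions accept readers in the
// MeterProvider constructor; older versions use meterProvider.addMetricReader().
const exporter = new InMemoryMetricExporter(AggregationTemporality.DELTA);
const metricReader = new PeriodicExportingMetricReader({ exporter });
const meterProvider = new MeterProvider({ readers: [metricReader] });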
plugins/node/opentelemetry-instrumentation-aws-sdk/src/services/bedrock-runtime.ts (outdated review thread, resolved)
plugins/node/opentelemetry-instrumentation-aws-sdk/test/bedrock-runtime.test.ts (outdated review thread, resolved)
) {
  return {
    ...response.data,
    stream: this.wrapConverseStreamResponse(
Does stream: this.wrapConverseStreamResponse(...) overwrite the stream from ...response.data? I think it would be nice to have a comment to call this out.
Yup - added a comment about it
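For readers unfamiliar with the spread behaviour being discussed: a property written after a spread overwrites the same-named property coming from the spread, which is why the wrapped stream wins. A tiny standalone illustration (not repository code):

// Object-literal semantics: the later `stream` replaces the one from the spread.
const original = { stream: 'original-stream', other: 'untouched' };
const result = { ...original, stream: 'wrapped-stream' };
// result.stream === 'wrapped-stream'; result.other === 'untouched'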
The instrumentation is done for the Claude, Titan and Nova models.
LGTM. Waiting for an approval from one of the code owners before merging.
Which problem is this PR solving?
Currently only Converse populates gen ai conventions.

Short description of the changes
Instrument ConverseStream in the bedrock extension using the above.

/cc @trentm @codefromthecrypt