Background
The LLO Handler in the ADOT Python instrumentation supports LLO patterns based on OpenTelemetry Semantic Conventions for GenAI v1.36.0:
# OTel GenAI Semantic Convention used by the latest Strands SDK
# References:
# - OTel GenAI SemConv: https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-events/
# - Strands SDK PR (introduced in v0.1.9): https://github.com/strands-agents/sdk-python/pull/319
"gen_ai.user.message": {
"type": PatternType.DIRECT,
"role": ROLE_USER,
"source": "prompt",
},
"gen_ai.assistant.message": {
"type": PatternType.DIRECT,
"role": ROLE_ASSISTANT,
"source": "output",
},
"gen_ai.system.message": {
"type": PatternType.DIRECT,
"role": ROLE_SYSTEM,
"source": "prompt",
},
"gen_ai.tool.message": {
"type": PatternType.DIRECT,
"role": ROLE_TOOL,
"source": "prompt",
},
"gen_ai.choice": {
"type": PatternType.DIRECT,
"role": ROLE_ASSISTANT,
"source": "output",
},
The ADOT JS instrumentation does not support the patterns above:
https://github.com/aws-observability/aws-otel-js-instrumentation/blob/v0.8.0/aws-distro-opentelemetry-node-autoinstrumentation/src/llo-handler.ts
Furthermore, these events were deprecated in OpenTelemetry Semantic Conventions for GenAI v1.37.0, and as of the latest release, v1.38.0, the gen_ai.input.messages and gen_ai.output.messages attributes should be used instead (a sketch of the two shapes follows the deprecation notes below).
🛑 Breaking changes 🛑
Deprecations:
- gen_ai.system.message event: use the gen_ai.system_instructions or gen_ai.input.messages attributes instead.
- gen_ai.user.message, gen_ai.assistant.message, gen_ai.tool.message events: use the gen_ai.input.messages attribute instead.
- gen_ai.choice event: use the gen_ai.output.messages attribute instead.
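For context, a message that used to be emitted as a standalone event is now one entry in a structured array carried on the span. A minimal sketch of the two shapes (values are illustrative; the attribute structure follows the message layout visible in the captured span under Issue below):

# Deprecated (<= v1.36.0): one event per message, with the role implied by the event name.
deprecated_event = {
    "name": "gen_ai.user.message",
    "body": {"content": "What's the weather in Tokyo?"},
}

# v1.38.0: the whole conversation is carried as structured span attributes.
span_attributes = {
    "gen_ai.input.messages": [
        {"role": "system", "parts": [{"type": "text", "content": "You are a helpful weather assistant."}]},
        {"role": "user", "parts": [{"type": "text", "content": "What's the weather in Tokyo?"}]},
    ],
    "gen_ai.output.messages": [
        {"role": "assistant", "parts": [{"type": "text", "content": "Clear sky, 9.4 °C in Tokyo."}]},
    ],
}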
Request
Could you please add gen_ai.input.messages and gen_ai.output.messages to the LLO patterns in the LLO Handler to support the breaking changes introduced in v1.37.0 and later?
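A rough sketch of what such an addition could look like, assuming the v1.38.0 message layout shown above. The pattern-table entries, the "structured" type placeholder, and the iter_structured_messages helper are hypothetical names for illustration only; the real handler may model the new array-valued attributes differently than the existing DIRECT patterns:

import json

# Hypothetical additions to the LLO pattern table; since each attribute carries a
# JSON array of messages with per-message roles, a new pattern type (rather than
# PatternType.DIRECT with a fixed role) would likely be needed.
NEW_LLO_PATTERNS = {
    "gen_ai.input.messages": {
        "type": "structured",  # placeholder for a hypothetical new pattern type
        "source": "input",
    },
    "gen_ai.output.messages": {
        "type": "structured",  # placeholder
        "source": "output",
    },
}


def iter_structured_messages(value):
    """Expand a gen_ai.input.messages / gen_ai.output.messages attribute value
    ([{"role": ..., "parts": [{"type": "text", "content": ...}]}, ...])
    into (role, text) pairs that the handler could emit."""
    messages = json.loads(value) if isinstance(value, str) else value
    for message in messages:
        role = message.get("role", "unknown")
        for part in message.get("parts", []):
            if part.get("type") == "text":
                yield role, part.get("content", "")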
Issue
When using Amazon Bedrock AgentCore Observability with Mastra's OpenTelemetry Bridge, the LLM input/output ends up in span metadata instead of being viewable as Events in GenAI Observability, because the OpenTelemetry Bridge follows the v1.38.0 specification.
https://mastra.ai/docs/v1/observability/tracing/bridges/otel#semantic-conventions
The OtelBridge exports Mastra spans using OpenTelemetry Semantic Conventions for GenAI v1.38.0.
- Span metadata (as shown in the CloudWatch GenAI Observability console):
{
"traceId": "6948f0595ebc000e547fcb8750dfe67b",
"spanId": "ea07a6bc2a05097e",
"flags": 0,
"name": "chat openai.gpt-oss-safeguard-120b",
"kind": "CLIENT",
"startTimeUnixNano": 1766387801569000000,
"endTimeUnixNano": 1766387807560000000,
"durationNano": 5991000000,
"attributes": {
"aws.local.service": "mastra_hono_agent_o11y.DEFAULT",
"gen_ai.input.messages": [
{
"role": "system",
"parts": [
{
"type": "text",
"content": "\n You are a helpful weather assistant that provides accurate weather information and can help planning activities based on the weather.\n\n Your primary function is to help users get weather details for specific locations. When responding:\n - Always ask for a location if none is provided\n - If the location name isn't in English, please translate it\n - If giving a location with multiple parts (e.g. \"New York, NY\"), use the most relevant part (e.g. \"New York\")\n - Include relevant details like humidity, wind conditions, and precipitation\n - Keep responses concise but informative\n - If the user asks for activities and provides the weather forecast, suggest activities based on the weather forecast.\n - If the user asks for activities, respond in the format they request.\n\n Use the weatherTool to fetch current weather data.\n"
}
]
},
{
"role": "user",
"parts": [
{
"type": "text",
"content": "東京の天気は?"
}
]
}
],
"telemetry.extended": true,
"gen_ai.usage.output_tokens": 498,
"mastra.metadata.runId": "694b884d-94eb-4350-92f6-cb268adc8b5c",
"aws.remote.resource.identifier": "openai.gpt-oss-safeguard-120b",
"gen_ai.agent.name": "Weather Agent",
"aws.remote.service": "UnknownRemoteService",
"gen_ai.provider.name": "aws.bedrock",
"aws.local.environment": "bedrock-agentcore:default",
"aws.remote.operation": "UnknownRemoteOperation",
"aws.local.operation": "UnmappedOperation",
"mastra.span.type": "model_generation",
"gen_ai.request.temperature": 0,
"aws.span.kind": "CLIENT",
"aws.remote.resource.type": "GenAI::Model",
"gen_ai.operation.name": "chat",
"gen_ai.usage.input_tokens": 772,
"gen_ai.response.finish_reasons": [
"stop"
],
"gen_ai.request.model": "openai.gpt-oss-safeguard-120b",
"gen_ai.output.messages": [
{
"role": "assistant",
"parts": [
{
"type": "text",
"content": "**Current weather in Tokyo**\n\n- **Temperature:** 9.4 °C (feels like 6.1 °C) \n- **Humidity:** 47 % \n- **Wind:** Light breeze from the west at 6.2 km/h, gusts up to 18.4 km/h \n- **Conditions:** Clear sky (no precipitation) \n\nIt’s a cool, clear day—great for a brisk walk or indoor activities if you prefer staying warm."
}
]
}
],
"PlatformType": "AWS::BedrockAgentCore",
"session.id": "fbb60a6a-df68-4104-8d24-d9eaf0f57511",
"gen_ai.agent.id": "weather-agent"
},
"status": {
"code": "OK"
},
"resource": {
"attributes": {
"deployment.environment.name": "bedrock-agentcore:default",
"service.name": "mastra_hono_agent_o11y.DEFAULT",
"cloud.region": "ap-northeast-1",
"aws.log.stream.names": "otel-rt-logs",
"telemetry.sdk.name": "opentelemetry",
"aws.service.type": "gen_ai_agent",
"telemetry.sdk.language": "nodejs",
"cloud.provider": "aws",
"cloud.resource_id": "arn:aws:bedrock-agentcore:ap-northeast-1:<My AWS Account>:runtime/mastra_hono_agent_o11y-Ods4eLGhdQ/runtime-endpoint/DEFAULT:DEFAULT",
"aws.log.group.names": "/aws/bedrock-agentcore/runtimes/mastra_hono_agent_o11y-Ods4eLGhdQ-DEFAULT",
"telemetry.sdk.version": "1.30.1",
"telemetry.auto.version": "0.8.0-aws",
"cloud.platform": "aws_bedrock_agentcore"
}
},
"scope": {
"name": "@mastra/otel-bridge",
"version": "1.0.0"
},
"parentSpanId": "9f758daee370bd20"
}