Description
The StreamWrapper class in opentelemetry-instrumentation-openai-v2 does not proxy the .headers attribute to the underlying stream when using with_raw_response.create(stream=True). This causes an AttributeError when code tries to access response headers.
This is related to #4032, which fixed the missing .parse() method, but .headers is still not proxied.
Steps to Reproduce
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
OpenAIInstrumentor().instrument()

import asyncio
import openai

client = openai.AsyncAzureOpenAI(
    api_key="...",
    azure_endpoint="...",
    api_version="2024-02-15-preview",
)

async def reproducer():
    raw_response = await client.chat.completions.with_raw_response.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say hello"}],
        max_tokens=10,
        stream=True,
    )
    # This fails:
    headers = raw_response.headers  # AttributeError: 'StreamWrapper' object has no attribute 'headers'

asyncio.run(reproducer())
Expected Behavior
raw_response.headers should return the HTTP response headers, same as when not instrumented.
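For illustration, this is how the headers are read when the instrumentor is not active; .headers is typically an httpx.Headers mapping in the openai SDK, and x-request-id is just an example header name here:

headers = raw_response.headers
request_id = headers.get("x-request-id")  # example lookup; any response header works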
Actual Behavior
AttributeError: 'StreamWrapper' object has no attribute 'headers'
Root Cause
StreamWrapper in patch.py doesn't implement __getattr__ to proxy unknown attributes to self.stream.
Suggested Fix
Add __getattr__ to StreamWrapper so unknown attributes are proxied to the wrapped stream:

def __getattr__(self, name):
    return getattr(self.stream, name)
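As a minimal sketch of how that could look in context, assuming the wrapper keeps the wrapped object in self.stream as the root cause above describes (the constructor below is illustrative, not the real signature in patch.py):

class StreamWrapper:
    def __init__(self, stream):
        # Illustrative only; the real wrapper also carries span/telemetry state.
        self.stream = stream

    def __getattr__(self, name):
        # __getattr__ is called only when normal attribute lookup fails, so
        # attributes defined on the wrapper still take priority and unknown
        # ones (e.g. .headers) fall through to the underlying stream.
        return getattr(self.stream, name)

One caveat: if self.stream is ever unset (for example during __init__ before assignment, or after unpickling), this delegation recurses into itself; explicitly raising AttributeError when name == "stream" avoids that.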
Environment
opentelemetry-instrumentation-openai-v2: 2.3b0
openai: 1.89.0
Python: 3.11