feat(openinference-vercel): AI sdk v6 (#2684)
@arizeai/openinference-core
@arizeai/openinference-genai
@arizeai/openinference-instrumentation-anthropic
@arizeai/openinference-instrumentation-bedrock
@arizeai/openinference-instrumentation-bedrock-agent-runtime
@arizeai/openinference-instrumentation-beeai
@arizeai/openinference-instrumentation-langchain
@arizeai/openinference-instrumentation-langchain-v0
@arizeai/openinference-instrumentation-mcp
@arizeai/openinference-instrumentation-openai
@arizeai/openinference-mastra
@arizeai/openinference-semantic-conventions
@arizeai/openinference-vercel
js/packages/openinference-vercel/src/OpenInferenceSpanProcessor.ts
Code review: No issues found. Checked for bugs and CLAUDE.md compliance.
Force-pushed from 3e3d33f to 5614c10.
Update test to use ai.embed (which maps to CHAIN) instead of ai.generateText (which now maps to AGENT) to properly test that token counts are not mapped for CHAIN spans.
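The span-kind mapping this commit relies on (per the commit message and the PR overview: top-level `generate*`/`stream*` now map to `AGENT`, while `ai.embed` maps to `CHAIN`) can be illustrated with a small sketch. This is a hypothetical lookup for illustration only, not the package's actual implementation, and the function name `spanKindFor` is invented here:

```typescript
// Hypothetical sketch of the AI SDK v6 span-kind mapping described in the PR.
// Top-level generate*/stream* operations now map to AGENT; embedding
// operations map to CHAIN (so token counts are not mapped for them).
type OISpanKind = "AGENT" | "CHAIN";

function spanKindFor(operationName: string): OISpanKind | undefined {
  if (/^ai\.(generateText|streamText|generateObject|streamObject)$/.test(operationName)) {
    return "AGENT"; // previously tested via ai.generateText, which no longer maps to CHAIN
  }
  if (operationName === "ai.embed" || operationName === "ai.embedMany") {
    return "CHAIN"; // the commit switches the test to ai.embed for this reason
  }
  return undefined;
}

spanKindFor("ai.generateText"); // "AGENT"
spanKindFor("ai.embed");        // "CHAIN"
```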
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Bugbot Autofix is OFF. To automatically fix reported issues with Cloud Agents, enable Autofix in the Cursor dashboard.
```typescript
async forceFlush(): Promise<void> {
  this.aggregateManager.clear();
  return super.forceFlush();
}
```
forceFlush clears in-flight trace aggregate state
Medium Severity
forceFlush() calls this.aggregateManager.clear(), which destroys all in-flight trace aggregate state. Unlike shutdown(), forceFlush is a non-destructive operation that can be called any time (e.g., periodic intervals, Lambda shutdown hooks). If called while a trace is in progress — after a child span with an error has ended but before the root span ends — the root span's status will remain UNSET instead of being set to ERROR or OK, because the aggregate error tracking data was wiped. The clear() call belongs only in shutdown().
Additional Locations (1)
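A minimal sketch of the split Bugbot suggests: `clear()` moves out of `forceFlush()` and lives only in `shutdown()`. The base processor is stubbed so the example is self-contained (in the real code it would be an OTel span processor), and the `TraceAggregateManager` internals here are hypothetical:

```typescript
// Stub standing in for the real OTel base span processor, so this sketch runs
// without @opentelemetry/sdk-trace-base.
class StubBaseProcessor {
  async forceFlush(): Promise<void> {}
  async shutdown(): Promise<void> {}
}

// Hypothetical minimal aggregate manager: tracks per-trace error state used
// to set the root span's final status.
class TraceAggregateManager {
  private aggregates = new Map<string, { sawError: boolean }>();
  record(traceId: string, sawError: boolean): void {
    const agg = this.aggregates.get(traceId) ?? { sawError: false };
    agg.sawError = agg.sawError || sawError;
    this.aggregates.set(traceId, agg);
  }
  sawError(traceId: string): boolean {
    return this.aggregates.get(traceId)?.sawError ?? false;
  }
  clear(): void {
    this.aggregates.clear();
  }
}

class OpenInferenceSpanProcessor extends StubBaseProcessor {
  aggregateManager = new TraceAggregateManager();

  // forceFlush is non-destructive: export what is buffered, but keep the
  // in-flight aggregate state so a mid-trace flush cannot lose error tracking.
  async forceFlush(): Promise<void> {
    return super.forceFlush();
  }

  // shutdown is terminal: only here is it safe to drop aggregate state.
  async shutdown(): Promise<void> {
    this.aggregateManager.clear();
    return super.shutdown();
  }
}
```

With this split, a periodic `forceFlush()` after a failed child span but before the root span ends no longer wipes the data needed to set the root span's status to `ERROR`.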
Code review: Found 2 CLAUDE.md compliance issues.

1. npm lockfile in scripts directory: this file is an npm lockfile. Per CLAUDE.md:81 and js/CLAUDE.md:14, it should be deleted, and dependencies for the scripts directory should be managed using pnpm instead.
2. npx usage in script comment: the comment instructs users to use `npx`. Per CLAUDE.md:81 and js/CLAUDE.md:14, the comment should be changed.

No bugs found in the code changes.


Note (Medium Risk): Changes core span attribute conversion and span processor behavior (status/renaming) for AI SDK traces, which could affect exported telemetry shape and error reporting across versions.
Overview
Adds AI SDK v6 telemetry support to `@arizeai/openinference-vercel` by preferring standard `gen_ai.*` attributes (converted via the new dependency `@arizeai/openinference-genai`) and falling back to Vercel `ai.*` attributes for span kind detection, embeddings, tool calls, metadata, streaming metrics, and message IO.

Introduces trace-level aggregation (`TraceAggregateManager`) so AI SDK root spans are renamed to `operation.name` and get an `OK`/`ERROR` status derived from child spans, finish reasons, and exception events, while keeping the existing status untouched when explicitly set.

Updates Vercel semantic convention constants (adds v6 fields plus a `VercelAISemanticConventions` re-export), adjusts span-kind mappings (top-level `generate*`/`stream*` now map to `AGENT`), refreshes tests to use real AI SDK v6 span fixtures, and adds scripts/examples for capturing and demoing v6 telemetry. Also standardizes package test scripts to `vitest run`, pins `openinference-mastra` deps, and works around OTel v1 vs v2 typing when calling Vercel utils.

Written by Cursor Bugbot for commit a7c61c9. This will update automatically on new commits. Configure here.
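The prefer-`gen_ai.*`-then-fall-back-to-`ai.*` behavior described in this overview can be sketched as follows. This is a simplified illustration under stated assumptions: the real conversion lives in `@arizeai/openinference-genai`, and the `pickAttribute` helper here is hypothetical, not the package's API:

```typescript
// Hypothetical sketch of the attribute-preference logic: use the standard
// gen_ai.* attribute when the AI SDK (v6) emits it, otherwise fall back to
// the legacy Vercel ai.* attribute.
type Attributes = Record<string, string | number | undefined>;

function pickAttribute(
  attrs: Attributes,
  genAiKey: string,
  vercelKey: string,
): string | number | undefined {
  return attrs[genAiKey] ?? attrs[vercelKey];
}

// A v6-style span carrying the standard GenAI attribute.
const v6Span: Attributes = { "gen_ai.request.model": "gpt-4o" };
// An older span carrying only the Vercel-specific attribute.
const legacySpan: Attributes = { "ai.model.id": "gpt-4o-mini" };

pickAttribute(v6Span, "gen_ai.request.model", "ai.model.id");     // "gpt-4o"
pickAttribute(legacySpan, "gen_ai.request.model", "ai.model.id"); // "gpt-4o-mini"
```

Keeping the `ai.*` fallback means spans from older AI SDK versions still convert, while v6 spans flow through the standard `gen_ai.*` path.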