
Commit ff0cee1

ref(core): Unify .do* span ops to gen_ai.generate_content (#20074)
The Vercel AI integration had four distinct span op values for its inner `.do*` spans (`gen_ai.generate_text`, `gen_ai.stream_text`, `gen_ai.generate_object`, `gen_ai.stream_object`), even though all of them share `generate_content` as their operation name. Every other AI integration derives the span op as `gen_ai.${operationName}`, so these should all be `gen_ai.generate_content`.
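The derivation rule described above can be sketched as follows. This is a minimal, hypothetical TypeScript illustration; `deriveSpanOp` and the lookup table are assumptions for the example, not the integration's actual code.

```typescript
// Hypothetical sketch of the rule the commit message describes:
// every AI integration derives the span op as `gen_ai.${operationName}`.
function deriveSpanOp(operationName: string): string {
  return `gen_ai.${operationName}`;
}

// All four inner `.do*` operations share the operation name
// 'generate_content', so they all resolve to the same span op.
const operationNames: Record<string, string> = {
  'ai.generateText.doGenerate': 'generate_content',
  'ai.streamText.doStream': 'generate_content',
  'ai.generateObject.doGenerate': 'generate_content',
  'ai.streamObject.doStream': 'generate_content',
};

const ops = new Set(Object.values(operationNames).map(deriveSpanOp));
console.log([...ops]); // [ 'gen_ai.generate_content' ]
```

Applying the shared rule collapses the four previously distinct op values into one, which is exactly what the diffs below change in the test expectations.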
1 parent d0bbc1a commit ff0cee1

File tree

12 files changed: +62 −79 lines changed


CHANGELOG.md

Lines changed: 4 additions & 0 deletions

@@ -4,6 +4,10 @@
 
 - "You miss 100 percent of the chances you don't take. — Wayne Gretzky" — Michael Scott
 
+- **ref(core): Unify .do\* span ops to `gen_ai.generate_content` ([#20074](https://github.com/getsentry/sentry-javascript/pull/20074))**
+
+  All Vercel AI `do*` spans (`ai.generateText.doGenerate`, `ai.streamText.doStream`, `ai.generateObject.doGenerate`, `ai.streamObject.doStream`) now use a single unified span op `gen_ai.generate_content` instead of separate ops like `gen_ai.generate_text`, `gen_ai.stream_text`, `gen_ai.generate_object`, and `gen_ai.stream_object`.
+
 - **ref(core): Remove provider-specific AI span attributes in favor of `gen_ai` attributes in sentry conventions ([#20011](https://github.com/getsentry/sentry-javascript/pull/20011))**
 
 The following provider-specific span attributes have been removed from the OpenAI and Anthropic AI integrations. Use the standardized `gen_ai.*` equivalents instead:
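For code that filters captured spans by op (as the updated e2e tests in this commit do), the migration looks roughly like this. The `Span` shape here is a minimal assumption for illustration, not Sentry's actual span type.

```typescript
// Assumed minimal span shape for this example.
type Span = { op?: string; description?: string };

// Before this commit, matching the inner .do* spans required checking
// several ops (gen_ai.generate_text, gen_ai.stream_text, ...).
// After it, a single unified op covers all of them.
function filterDoGenerateSpans(spans: Span[]): Span[] {
  return spans.filter(span => span.op === 'gen_ai.generate_content');
}

const spans: Span[] = [
  { op: 'gen_ai.invoke_agent' },
  { op: 'gen_ai.generate_content', description: 'generate_content mock-model-id' },
  { op: 'gen_ai.execute_tool' },
];
console.log(filterDoGenerateSpans(spans).length); // 1
```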

dev-packages/e2e-tests/test-applications/deno/tests/ai.test.ts

Lines changed: 1 addition & 1 deletion

@@ -31,7 +31,7 @@ test('should create AI pipeline spans with Vercel AI SDK', async ({ baseURL }) =
   const aiSpans = spans.filter(
     (span: any) =>
       span.op === 'gen_ai.invoke_agent' ||
-      span.op === 'gen_ai.generate_text' ||
+      span.op === 'gen_ai.generate_content' ||
       span.op === 'otel.span' ||
       span.description?.includes('ai.generateText'),
   );

dev-packages/e2e-tests/test-applications/nextjs-15/tests/ai-error.test.ts

Lines changed: 1 addition & 1 deletion

@@ -26,7 +26,7 @@ test('should create AI spans with correct attributes and error linking', async (
   // because of this, only spans that are manually opted-in at call time will be captured
   // this may be fixed by https://github.com/vercel/ai/pull/6716 in the future
   const aiPipelineSpans = spans.filter(span => span.op === 'gen_ai.invoke_agent');
-  const aiGenerateSpans = spans.filter(span => span.op === 'gen_ai.generate_text');
+  const aiGenerateSpans = spans.filter(span => span.op === 'gen_ai.generate_content');
   const toolCallSpans = spans.filter(span => span.op === 'gen_ai.execute_tool');
 
   expect(aiPipelineSpans.length).toBeGreaterThanOrEqual(1);

dev-packages/e2e-tests/test-applications/nextjs-15/tests/ai-test.test.ts

Lines changed: 1 addition & 1 deletion

@@ -22,7 +22,7 @@ test('should create AI spans with correct attributes', async ({ page }) => {
   // because of this, only spans that are manually opted-in at call time will be captured
   // this may be fixed by https://github.com/vercel/ai/pull/6716 in the future
   const aiPipelineSpans = spans.filter(span => span.op === 'gen_ai.invoke_agent');
-  const aiGenerateSpans = spans.filter(span => span.op === 'gen_ai.generate_text');
+  const aiGenerateSpans = spans.filter(span => span.op === 'gen_ai.generate_content');
   const toolCallSpans = spans.filter(span => span.op === 'gen_ai.execute_tool');
 
   expect(aiPipelineSpans.length).toBeGreaterThanOrEqual(1);

dev-packages/e2e-tests/test-applications/nextjs-16/tests/ai-error.test.ts

Lines changed: 1 addition & 1 deletion

@@ -26,7 +26,7 @@ test('should create AI spans with correct attributes and error linking', async (
   // because of this, only spans that are manually opted-in at call time will be captured
   // this may be fixed by https://github.com/vercel/ai/pull/6716 in the future
   const aiPipelineSpans = spans.filter(span => span.op === 'gen_ai.invoke_agent');
-  const aiGenerateSpans = spans.filter(span => span.op === 'gen_ai.generate_text');
+  const aiGenerateSpans = spans.filter(span => span.op === 'gen_ai.generate_content');
   const toolCallSpans = spans.filter(span => span.op === 'gen_ai.execute_tool');
 
   expect(aiPipelineSpans.length).toBeGreaterThanOrEqual(1);

dev-packages/e2e-tests/test-applications/nextjs-16/tests/ai-test.test.ts

Lines changed: 1 addition & 1 deletion

@@ -22,7 +22,7 @@ test('should create AI spans with correct attributes', async ({ page }) => {
   // because of this, only spans that are manually opted-in at call time will be captured
   // this may be fixed by https://github.com/vercel/ai/pull/6716 in the future
   const aiPipelineSpans = spans.filter(span => span.op === 'gen_ai.invoke_agent');
-  const aiGenerateSpans = spans.filter(span => span.op === 'gen_ai.generate_text');
+  const aiGenerateSpans = spans.filter(span => span.op === 'gen_ai.generate_content');
   const toolCallSpans = spans.filter(span => span.op === 'gen_ai.execute_tool');
 
   expect(aiPipelineSpans.length).toBeGreaterThanOrEqual(1);

dev-packages/node-integration-tests/suites/tracing/vercelai/test-generate-object.ts

Lines changed: 2 additions & 2 deletions

@@ -37,7 +37,7 @@ describe('Vercel AI integration - generateObject', () => {
       expect.objectContaining({
         data: expect.objectContaining({
           'sentry.origin': 'auto.vercelai.otel',
-          'sentry.op': 'gen_ai.generate_object',
+          'sentry.op': 'gen_ai.generate_content',
           'gen_ai.operation.name': 'generate_content',
           'vercel.ai.operationId': 'ai.generateObject.doGenerate',
           'vercel.ai.model.provider': 'mock-provider',
@@ -52,7 +52,7 @@ describe('Vercel AI integration - generateObject', () => {
           'gen_ai.usage.total_tokens': 40,
         }),
         description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_object',
+        op: 'gen_ai.generate_content',
        origin: 'auto.vercelai.otel',
        status: 'ok',
      }),

dev-packages/node-integration-tests/suites/tracing/vercelai/test.ts

Lines changed: 19 additions & 19 deletions

@@ -70,7 +70,7 @@ describe('Vercel AI integration', () => {
           [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 20,
           [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 30,
           [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
           [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
           'vercel.ai.model.provider': 'mock-provider',
           'vercel.ai.operationId': 'ai.generateText.doGenerate',
@@ -83,7 +83,7 @@ describe('Vercel AI integration', () => {
           'vercel.ai.streaming': false,
         },
         description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
         origin: 'auto.vercelai.otel',
         status: 'ok',
       }),
@@ -132,7 +132,7 @@ describe('Vercel AI integration', () => {
           [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 20,
           [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 30,
           [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
           [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
           'vercel.ai.model.provider': 'mock-provider',
           'vercel.ai.operationId': 'ai.generateText.doGenerate',
@@ -146,7 +146,7 @@ describe('Vercel AI integration', () => {
           'vercel.ai.streaming': false,
         },
         description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
         origin: 'auto.vercelai.otel',
         status: 'ok',
       }),
@@ -186,7 +186,7 @@ describe('Vercel AI integration', () => {
           [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 25,
           [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 40,
           [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
           [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
           'vercel.ai.model.provider': 'mock-provider',
           'vercel.ai.operationId': 'ai.generateText.doGenerate',
@@ -199,7 +199,7 @@ describe('Vercel AI integration', () => {
           'vercel.ai.streaming': false,
         },
         description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
         origin: 'auto.vercelai.otel',
         status: 'ok',
       }),
@@ -280,7 +280,7 @@ describe('Vercel AI integration', () => {
           [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 20,
           [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 30,
           [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
           [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
           'vercel.ai.model.provider': 'mock-provider',
           'vercel.ai.operationId': 'ai.generateText.doGenerate',
@@ -294,7 +294,7 @@ describe('Vercel AI integration', () => {
           'vercel.ai.streaming': false,
         },
         description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
         origin: 'auto.vercelai.otel',
         status: 'ok',
         parent_span_id: expect.any(String),
@@ -353,7 +353,7 @@ describe('Vercel AI integration', () => {
           [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 20,
           [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 30,
           [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
           [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
           'vercel.ai.model.provider': 'mock-provider',
           'vercel.ai.operationId': 'ai.generateText.doGenerate',
@@ -367,7 +367,7 @@ describe('Vercel AI integration', () => {
           'vercel.ai.streaming': false,
         },
         description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
         origin: 'auto.vercelai.otel',
         status: 'ok',
         parent_span_id: expect.any(String),
@@ -427,7 +427,7 @@ describe('Vercel AI integration', () => {
           [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 25,
           [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 40,
           [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
           [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
           'vercel.ai.model.provider': 'mock-provider',
           'vercel.ai.operationId': 'ai.generateText.doGenerate',
@@ -442,7 +442,7 @@ describe('Vercel AI integration', () => {
           'vercel.ai.streaming': false,
         },
         description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
         origin: 'auto.vercelai.otel',
         status: 'ok',
         parent_span_id: expect.any(String),
@@ -528,7 +528,7 @@ describe('Vercel AI integration', () => {
           [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 25,
           [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 40,
           [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
           [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
           'vercel.ai.model.provider': 'mock-provider',
           'vercel.ai.operationId': 'ai.generateText.doGenerate',
@@ -541,7 +541,7 @@ describe('Vercel AI integration', () => {
           'vercel.ai.streaming': false,
         },
         description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
         origin: 'auto.vercelai.otel',
         status: 'ok',
       }),
@@ -648,7 +648,7 @@ describe('Vercel AI integration', () => {
           [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 25,
           [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 40,
           [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
           [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
           'vercel.ai.model.provider': 'mock-provider',
           'vercel.ai.operationId': 'ai.generateText.doGenerate',
@@ -661,7 +661,7 @@ describe('Vercel AI integration', () => {
           'vercel.ai.streaming': false,
         },
         description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
         origin: 'auto.vercelai.otel',
         status: 'ok',
       }),
@@ -757,11 +757,11 @@ describe('Vercel AI integration', () => {
       // The doGenerate span - name stays as 'generateText.doGenerate' since model ID is missing
       expect.objectContaining({
         description: 'generateText.doGenerate',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
         origin: 'auto.vercelai.otel',
         status: 'ok',
         data: expect.objectContaining({
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
           [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
           [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
         }),
@@ -938,7 +938,7 @@ describe('Vercel AI integration', () => {
         }),
       }),
      expect.objectContaining({
-       op: 'gen_ai.generate_text',
+       op: 'gen_ai.generate_content',
        data: expect.objectContaining({
          'gen_ai.conversation.id': 'conv-a',
        }),

dev-packages/node-integration-tests/suites/tracing/vercelai/v5/test.ts

Lines changed: 14 additions & 14 deletions

@@ -59,7 +59,7 @@ describe('Vercel AI integration (V5)', () => {
      expect.objectContaining({
        data: {
          [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
          [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
          'vercel.ai.operationId': 'ai.generateText.doGenerate',
          'vercel.ai.model.provider': 'mock-provider',
@@ -80,7 +80,7 @@ describe('Vercel AI integration (V5)', () => {
          [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 30,
        },
        description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
        origin: 'auto.vercelai.otel',
        status: 'ok',
      }),
@@ -116,7 +116,7 @@ describe('Vercel AI integration (V5)', () => {
      expect.objectContaining({
        data: {
          [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
          [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
          'vercel.ai.operationId': 'ai.generateText.doGenerate',
          'vercel.ai.model.provider': 'mock-provider',
@@ -141,7 +141,7 @@ describe('Vercel AI integration (V5)', () => {
          [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 30,
        },
        description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
        origin: 'auto.vercelai.otel',
        status: 'ok',
      }),
@@ -189,11 +189,11 @@ describe('Vercel AI integration (V5)', () => {
          [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 25,
          [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 40,
          [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
          [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
        },
        description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
        origin: 'auto.vercelai.otel',
        status: 'ok',
      }),
@@ -277,11 +277,11 @@ describe('Vercel AI integration (V5)', () => {
          [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 20,
          [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 30,
          [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
          [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
        },
        description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
        origin: 'auto.vercelai.otel',
        status: 'ok',
      }),
@@ -317,7 +317,7 @@ describe('Vercel AI integration (V5)', () => {
      expect.objectContaining({
        data: {
          [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
          [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
          'vercel.ai.operationId': 'ai.generateText.doGenerate',
          'vercel.ai.model.provider': 'mock-provider',
@@ -342,7 +342,7 @@ describe('Vercel AI integration (V5)', () => {
          [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 30,
        },
        description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
        origin: 'auto.vercelai.otel',
        status: 'ok',
      }),
@@ -401,11 +401,11 @@ describe('Vercel AI integration (V5)', () => {
          [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 25,
          [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 40,
          [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
          [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
        }),
        description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
        origin: 'auto.vercelai.otel',
        status: 'ok',
      }),
@@ -513,11 +513,11 @@ describe('Vercel AI integration (V5)', () => {
          [GEN_AI_USAGE_OUTPUT_TOKENS_ATTRIBUTE]: 25,
          [GEN_AI_USAGE_TOTAL_TOKENS_ATTRIBUTE]: 40,
          [GEN_AI_OPERATION_NAME_ATTRIBUTE]: 'generate_content',
-          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_text',
+          [SEMANTIC_ATTRIBUTE_SENTRY_OP]: 'gen_ai.generate_content',
          [SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN]: 'auto.vercelai.otel',
        },
        description: 'generate_content mock-model-id',
-        op: 'gen_ai.generate_text',
+        op: 'gen_ai.generate_content',
        origin: 'auto.vercelai.otel',
        status: 'ok',
      }),
