Problem
llm.prompt_template.template and llm.prompt_template.variables (OpenInference convention) are not emitted on backend spans today.
A component-level approach was prototyped in PR #1036 but withdrawn following review from @jakelorocco: capturing template state on Instruction and GenerativeStub is the wrong layer — it only covers two component types, captures pre-substitution values, and puts telemetry concerns inside domain objects.
The correct extraction point is the formatter render path (TemplateFormatter._render_representation), which is reached for every component type and holds the actual template and variables used at generation time.
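Concretely, the goal is for backend generation spans to carry these two attributes. A minimal illustration, assuming an OpenTelemetry tracer: the span name, template text, and variable values below are placeholders, and only the attribute keys come from the OpenInference convention cited above.

```python
import json

from opentelemetry import trace

tracer = trace.get_tracer("mellea.backend")

# Illustrative only: the attribute keys are the OpenInference names referenced above;
# the span name, template text, and variables are made-up placeholder data.
with tracer.start_as_current_span("backend.generate") as span:
    span.set_attribute(
        "llm.prompt_template.template",
        "Summarize the following document:\n{{document}}",
    )
    # Variables are typically serialized to a JSON string for the span attribute.
    span.set_attribute(
        "llm.prompt_template.variables",
        json.dumps({"document": "...user-provided text..."}),
    )
```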
Approach
After #1045 lands and BackendTracingPlugin exists:
Capture (template_str, variables_dict) at the point where _render_representation renders an action — one place, covering all components (see the sketch after this list).
In BackendTracingPlugin, emit:
llm.prompt_template.template — always (template text is not user content)
llm.prompt_template.variables — only when is_content_tracing_enabled() returns true (variables may contain user data)
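A minimal sketch of the shape this could take, assuming a contextvar handoff between the formatter and the plugin. Only _render_representation, BackendTracingPlugin, is_content_tracing_enabled(), and the two attribute keys come from this issue; the hook name on_generate_span, the helpers _template_and_vars_for and _substitute, and the plumbing are invented for illustration, not the actual mellea APIs.

```python
import json
from contextvars import ContextVar

# is_content_tracing_enabled lives in mellea/telemetry/tracing.py (see Prerequisites).
from mellea.telemetry.tracing import is_content_tracing_enabled

# Assumed handoff: the formatter stashes the last rendered template here so the
# tracing plugin can pick it up when it decorates the backend span.
_last_prompt_template: ContextVar[tuple[str, dict] | None] = ContextVar(
    "last_prompt_template", default=None
)


class TemplateFormatter:
    def _render_representation(self, action):
        # Assumed helper: resolve the template and its variables for this action.
        template_str, variables_dict = self._template_and_vars_for(action)
        # Single capture point: every component type passes through here at generation time.
        _last_prompt_template.set((template_str, variables_dict))
        # Assumed helper: perform the actual substitution.
        return self._substitute(template_str, variables_dict)


class BackendTracingPlugin:
    def on_generate_span(self, span):  # assumed hook name and signature
        captured = _last_prompt_template.get()
        if captured is None:
            return
        template_str, variables_dict = captured
        # Template text is not user content, so it is always emitted.
        span.set_attribute("llm.prompt_template.template", template_str)
        # Variables may carry user data, so they are gated on the content-tracing flag.
        if is_content_tracing_enabled():
            span.set_attribute(
                "llm.prompt_template.variables",
                json.dumps(variables_dict, default=str),
            )
```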
Prerequisites
is_content_tracing_enabled() and the MELLEA_TRACE_CONTENT/OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT flag are already wired in mellea/telemetry/tracing.py by PR #1036 (feat(telemetry): close five OTel GenAI semantic convention emission gaps, #1035).
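For reference, such a gate typically reduces to an environment-variable check along the following lines. This is a sketch of the assumed shape, not the actual contents of mellea/telemetry/tracing.py; only the function name and the two flag names come from this issue.

```python
import os

_TRUTHY = {"1", "true", "yes", "on"}


def is_content_tracing_enabled() -> bool:
    """Assumed shape of the existing gate: content capture is opt-in via either flag."""
    value = os.getenv(
        "MELLEA_TRACE_CONTENT",
        os.getenv("OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT", ""),
    )
    return value.strip().lower() in _TRUTHY
```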
Related
Epic #444 Phase 2 — alongside the existing #1035 entry.
PR #1036 (where the component-level prototype was withdrawn).