feat: add cekura_metrics_python extension for observability#2148

Open
frank005 wants to merge 1 commit into TEN-framework:main from frank005:feat/cekura-metrics-extension

Conversation

@frank005

Summary

Adds an optional observability extension, cekura_metrics_python, that forwards conversational data (user + assistant transcripts, per-module latency, session lifecycle) to Cekura — with zero changes to core framework files.

  • New addon: ai_agents/agents/ten_packages/extension/cekura_metrics_python/
  • Wired into the stock voice_assistant example as an opt-in node (disabled unless CEKURA_API_KEY is set, so existing demos are unaffected).
  • Ships with unit tests, a detailed README, and requirements.txt (adds aiohttp).

What the extension captures (via graph wiring only)

Because the ten_ai_base base classes emit these signals via send_data without calling set_dests, they are graph-routable, and any extension can subscribe to them through fan-out connections in property.json:

| Signal | Upstream source | Cekura row |
| --- | --- | --- |
| User speech | STT `asr_result` | transcript (role: Testing Agent) |
| Assistant speech | TTS `text_data` | transcript (role: Main Agent) |
| Session start/end | agora_rtc `on_user_joined` / `on_user_left` | session lifecycle + auto-flush |
| Per-module latency | STT / TTS / LLM `metrics` | latency breakdown |

Routing proof (from core/src/ten_runtime/extension/extension.c:ten_extension_determine_out_msgs): when dest_cnt == 0 the runtime consults graph connections and clones the message to every subscriber; when dest_cnt > 0 it bypasses graph fan-out. Since TTS/STT/metrics paths in ten_ai_base never call set_dests, this extension can subscribe alongside main_control without any framework changes.
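To illustrate the fan-out subscription model, here is a minimal sketch of how a passive observer can classify the graph-routed signals from the table above. The data names (`asr_result`, `text_data`, `metrics`) come from ten_ai_base; the `classify_signal` helper and the row labels are illustrative, not the extension's actual code.

```python
# Hypothetical sketch: classifying graph-routed data names into the
# Cekura rows listed in the table above. A real TEN extension would
# do this dispatch inside its on_data handler; this pure function is
# just the routing decision, extracted for clarity.

def classify_signal(data_name: str) -> str:
    """Map a graph-routed data name to the Cekura row it feeds."""
    routes = {
        "asr_result": "transcript/testing-agent",  # user speech from STT
        "text_data": "transcript/main-agent",      # assistant speech from TTS
        "metrics": "latency-breakdown",            # per-module STT/TTS/LLM latency
    }
    # Anything not in the routing table is simply ignored by the observer.
    return routes.get(data_name, "ignored")
```

Because the observer only reads what the graph already delivers, it can sit alongside main_control without touching any upstream extension.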

What is deliberately out of scope for this PR

main_python routes LLM tool-call invocations and reasoning deltas through explicit set_dests / return_result paths (to message_collector and tool providers). Those streams are not graph-fan-out and therefore not captured here. A future follow-up could add an opt-in observability hook in main_python if the maintainers want that, but this PR intentionally leaves main_python untouched. Basic observability (transcripts + latency + session) works fully without it.

Files changed

New extension

  • ai_agents/agents/ten_packages/extension/cekura_metrics_python/ (manifest, addon, extension, config, session, client, helpers, tests, README, requirements)

Example wiring (voice-assistant)

  • tenapp/manifest.json — registers the extension path
  • tenapp/manifest-lock.json — adds the lock entry (v0.1.0)
  • tenapp/property.json — adds a cekura_metrics node (opt-in via ${env:CEKURA_API_KEY|}) and connection block
  • README.md — documents env vars (CEKURA_API_KEY, CEKURA_ASSISTANT_ID, CEKURA_METRIC_IDS) and that the demo still runs unchanged when those are unset
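For orientation, the opt-in node entry in tenapp/property.json might look roughly like the fragment below. This is a hedged sketch: only the addon name and the `${env:...|}` placeholders are stated in this PR; the surrounding field names follow common TEN property.json conventions and may differ from the actual diff.

```json
{
  "type": "extension",
  "name": "cekura_metrics",
  "addon": "cekura_metrics_python",
  "property": {
    "api_key": "${env:CEKURA_API_KEY|}",
    "assistant_id": "${env:CEKURA_ASSISTANT_ID|}"
  }
}
```

With the empty-default placeholders, the node resolves to blank values when the env vars are unset, which is what keeps the stock demo unaffected.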

Safety

  • No API keys or secrets committed. All sensitive values reference environment variables with empty defaults (e.g. "${env:CEKURA_API_KEY|}").
  • Extension stays idle with a warning if config is missing — no crash, no network calls.

Test plan

  • tman install resolves the new addon in the voice-assistant example
  • Run the example without CEKURA_API_KEY — verify the warning logs and conversation flows normally
  • Run the example with CEKURA_API_KEY + CEKURA_ASSISTANT_ID — verify a call log appears in the Cekura dashboard with user + assistant transcripts and latency metrics
  • pytest ai_agents/agents/ten_packages/extension/cekura_metrics_python/tests/ passes

CLA

I have read and agreed to the TEN Framework Contributor License Agreement.

Made with Cursor

Adds an optional observability extension that forwards transcripts,
per-module latency, and session lifecycle events to Cekura
(https://cekura.ai) with zero changes to core framework files.

Extension: ai_agents/agents/ten_packages/extension/cekura_metrics_python/
  - Subscribes to graph-routed signals emitted by ten_ai_base:
    asr_result (user transcript), text_data (assistant transcript),
    and metrics (STT/TTS/LLM latency).
  - Consumes on_user_joined / on_user_left from agora_rtc for session
    lifecycle, auto-flushing snapshots on an interval while open.
  - Stays idle with a warning if CEKURA_API_KEY is unset so demos
    still run unmodified.
  - Includes unit tests, a detailed README with routing notes, and
    requirements.txt (aiohttp).

voice-assistant example wiring:
  - Adds the cekura_metrics node to tenapp/property.json with
    ${env:CEKURA_API_KEY|} / ${env:CEKURA_ASSISTANT_ID|} placeholders;
    no hardcoded credentials.
  - Registers the extension path in manifest.json / manifest-lock.json.
  - Documents the env vars and opt-in behavior in the example README.

Notes for reviewers:
  - No changes to main_python, message_collector2, or any framework
    code. Tool-call and LLM reasoning streams (routed via explicit
    set_dests/return_result in main_python) remain out of scope for
    this PR and can be added later behind an opt-in hook if needed.

Made-with: Cursor