manual_slop/conductor/tracks/event_driven_metrics_20260223/plan.md
Implementation Plan: Event-Driven API Metrics Updates

Phase 1: Event Infrastructure & Test Setup

Define the event mechanism and create baseline tests to ensure we don't break data accuracy.

  • Task: Create tests/test_api_events.py to verify the new event emission logic in isolation.
  • Task: Implement a simple EventEmitter or Signal class (if not already present) to handle decoupled communication.
  • Task: Wire the event system into ai_client.py, adding placeholder emit calls for the key lifecycle events.
  • Task: Conductor - User Manual Verification 'Phase 1: Event Infrastructure & Test Setup' (Protocol in workflow.md)
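The event mechanism from the tasks above could look like the following minimal sketch. The class name `EventEmitter` matches the plan; the `subscribe`/`emit` method names and payload-as-kwargs convention are assumptions, not an existing API.

```python
from collections import defaultdict
from typing import Any, Callable

class EventEmitter:
    """Minimal publish/subscribe hub for decoupled communication."""

    def __init__(self) -> None:
        # Map event name -> list of handler callables.
        self._handlers: dict[str, list[Callable[..., None]]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable[..., None]) -> None:
        self._handlers[event].append(handler)

    def emit(self, event: str, **payload: Any) -> None:
        # Deliver the payload to every subscriber of this event.
        for handler in self._handlers[event]:
            handler(**payload)
```

A test in tests/test_api_events.py could then subscribe a recording handler, call `emit`, and assert on the captured payload, exercising the emission logic in isolation as the first task describes.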

Phase 2: Client Instrumentation (API Lifecycle)

Update the AI client to emit events during actual API interactions.

  • Task: Implement event emission for Gemini and Anthropic request/response cycles in ai_client.py.
  • Task: Implement event emission for tool/function calls and stream processing.
  • Task: Verify via tests that events carry the correct payload (token counts, session metadata).
  • Task: Conductor - User Manual Verification 'Phase 2: Client Instrumentation (API Lifecycle)' (Protocol in workflow.md)
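One way the request/response instrumentation could be shaped is sketched below. All names here (`call_model`, `_send_request`, the event names, the payload fields) are illustrative assumptions, not taken from ai_client.py; the point is that each lifecycle boundary emits one event carrying the metrics the UI needs (token counts, latency).

```python
import time

def _send_request(prompt: str) -> dict:
    # Stand-in for the real Gemini/Anthropic call; returns fake usage numbers.
    return {"text": "ok", "input_tokens": len(prompt.split()), "output_tokens": 3}

def call_model(emitter, prompt: str) -> dict:
    # Emit at the start of the cycle so the UI can show in-flight state.
    emitter.emit("api.request_started", provider="anthropic", prompt_chars=len(prompt))
    started = time.monotonic()
    response = _send_request(prompt)
    # Emit on completion with the payload the metrics tests should assert on.
    emitter.emit(
        "api.response_received",
        provider="anthropic",
        latency_s=time.monotonic() - started,
        input_tokens=response.get("input_tokens", 0),
        output_tokens=response.get("output_tokens", 0),
    )
    return response
```

Tool/function calls and stream chunks would follow the same pattern with their own event names, so the payload-verification tests reduce to recording emitted events and asserting on their fields.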

Phase 3: GUI Integration & Decoupling

Connect the UI to the event system and remove polling logic.

  • Task: Update gui.py to subscribe to API events and trigger metrics UI refreshes only upon event receipt.
  • Task: Audit the gui.py render loop and remove all per-frame metrics calculations or display updates.
  • Task: Verify that UI performance improves (reduced CPU/frame time) while metrics remain accurate.
  • Task: Conductor - User Manual Verification 'Phase 3: GUI Integration & Decoupling' (Protocol in workflow.md)
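The subscribe-and-mark-dirty pattern the Phase 3 tasks call for can be sketched as below. `MetricsPanel`, the event name, and the payload fields are hypothetical; the real gui.py structure will differ, but the key property is the same: the per-frame `render` path does no metrics work unless an event has arrived.

```python
class MetricsPanel:
    """Sketch of an event-driven metrics widget: refresh only on event
    receipt instead of recomputing metrics every frame."""

    def __init__(self, emitter) -> None:
        self.total_tokens = 0
        self._dirty = False
        # Subscribe once at construction; no polling thereafter.
        emitter.subscribe("api.response_received", self._on_response)

    def _on_response(self, **payload) -> None:
        self.total_tokens += payload.get("input_tokens", 0) + payload.get("output_tokens", 0)
        self._dirty = True  # schedule exactly one redraw

    def render(self):
        # Called every frame by the render loop; cheap no-op when clean.
        if not self._dirty:
            return None
        self._dirty = False
        return f"tokens: {self.total_tokens}"
```

This is also what makes the Phase 3 performance check testable: the dirty flag lets a test assert that consecutive `render` calls with no intervening events do nothing, which is the measurable form of "remove all per-frame metrics calculations".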