docs: Update workflow rules, create new async tool track, and log journal

This commit is contained in:
2026-03-03 01:49:04 -05:00
parent 2d3820bc76
commit 2b15bfb1c1
6 changed files with 72 additions and 4 deletions


@@ -0,0 +1,20 @@
# Track Specification: Asynchronous Tool Execution Engine (async_tool_execution_20260303)
## Overview
Currently, AI tool calls are executed synchronously on a background thread: if an AI turn requests multiple tool calls (e.g., parallel file reads or parallel grep searches), the execution engine runs them one at a time. This track will refactor the MCP tool dispatch system to execute independent tool calls concurrently using `asyncio.gather` or a `ThreadPoolExecutor`, significantly reducing latency during the research phase.
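A minimal sketch of the concurrent-dispatch idea using `asyncio.gather`. The `call_tool` coroutine and its signature are hypothetical stand-ins for a real MCP tool invocation, not names from the codebase:

```python
import asyncio

async def call_tool(name: str, args: dict) -> str:
    # Hypothetical stand-in for a real MCP tool invocation;
    # the sleep simulates I/O latency (file read, grep, network).
    await asyncio.sleep(0.1)
    return f"{name} done"

async def dispatch_parallel(tool_calls: list[tuple[str, dict]]) -> list[str]:
    # Independent tool calls run concurrently instead of one after another;
    # gather preserves the order of results to match the request order.
    tasks = [asyncio.create_task(call_tool(name, args)) for name, args in tool_calls]
    return await asyncio.gather(*tasks)

results = asyncio.run(dispatch_parallel([("read_file", {}), ("grep", {})]))
print(results)  # ['read_file done', 'grep done']
```

With three 0.1 s tools, sequential dispatch costs roughly 0.3 s while this version costs roughly 0.1 s, since the waits overlap.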
## Functional Requirements
- **Concurrent Dispatch**: Refactor `ai_client.py` and `mcp_client.py` to support asynchronous execution of multiple parallel tool calls.
- **Thread Safety**: Ensure that concurrent access to the file system or UI event queue does not cause race conditions.
- **Cancellation**: If an AI request is cancelled (e.g., via user interruption), all running background tools should be safely cancelled.
- **UI Progress Updates**: Ensure that the UI stream correctly reflects the progress of concurrent tools (e.g., "Tool 1 finished, Tool 2 still running...").
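The cancellation requirement above can be sketched with `Task.cancel` plus `gather(..., return_exceptions=True)`, so the dispatcher can observe which tools were interrupted without an exception escaping. `long_tool` and the timings are illustrative, not part of the existing codebase:

```python
import asyncio

async def long_tool(name: str) -> str:
    try:
        await asyncio.sleep(10)  # simulated long-running tool
        return f"{name} done"
    except asyncio.CancelledError:
        # Clean-up hook: release file handles, sockets, etc., then re-raise
        # so the task is recorded as cancelled rather than completed.
        raise

async def run_with_cancellation() -> list[str]:
    tasks = [asyncio.create_task(long_tool(n)) for n in ("tool1", "tool2")]
    await asyncio.sleep(0.05)  # user hits "stop" shortly after dispatch
    for t in tasks:
        t.cancel()
    # return_exceptions=True collects CancelledError objects instead of raising.
    results = await asyncio.gather(*tasks, return_exceptions=True)
    return [type(r).__name__ for r in results]

outcome = asyncio.run(run_with_cancellation())
print(outcome)  # ['CancelledError', 'CancelledError']
```

The same pattern lets the UI stream report per-task progress: each task can push status events to the UI event queue as it finishes or is cancelled.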
## Non-Functional Requirements
- Maintain complete parity with existing tool functionality.
- Ensure all automated simulation tests continue to pass.
## Acceptance Criteria
- [ ] Multiple tool calls requested in a single AI turn are executed in parallel.
- [ ] End-to-end latency for multi-tool requests is demonstrably reduced.
- [ ] No threading deadlocks or race conditions are introduced.
- [ ] All integration tests pass.
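The "demonstrably reduced latency" criterion could be checked with a timing test along these lines; the delays and helper names are illustrative:

```python
import asyncio
import time

async def fake_tool(delay: float) -> None:
    # Stand-in for a tool call whose cost is dominated by I/O wait.
    await asyncio.sleep(delay)

async def run_sequential(delays: list[float]) -> None:
    for d in delays:
        await fake_tool(d)

async def run_parallel(delays: list[float]) -> None:
    await asyncio.gather(*(fake_tool(d) for d in delays))

delays = [0.1, 0.1, 0.1]

t0 = time.perf_counter()
asyncio.run(run_sequential(delays))
seq_elapsed = time.perf_counter() - t0

t0 = time.perf_counter()
asyncio.run(run_parallel(delays))
par_elapsed = time.perf_counter() - t0

# Parallel execution should take about as long as the slowest tool,
# not the sum of all tool delays.
print(par_elapsed < seq_elapsed)
```

A real integration test would drive the actual dispatcher rather than `fake_tool`, but the assertion shape is the same.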