fix(app_controller): lazy load rag_engine to avoid blocking startup
Before this change, app_controller imported rag_engine at module level, which pulled in chromadb (~0.45s). Now rag_engine is imported only when RAG is actually enabled and needed, cutting roughly that 0.45s from startup.
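The pattern this commit applies can be sketched as follows. Names here are illustrative stand-ins, not the repo's real `AppController` or `rag_engine` (a stdlib module stands in for the heavy dependency):

```python
from typing import Any, Optional
import importlib

class LazyController:
    """Sketch of the lazy-load pattern; not the repo's actual class."""

    def __init__(self) -> None:
        # Optional[Any] instead of Optional[rag_engine.RAGEngine]: the
        # annotation no longer forces a module-level import of the engine.
        self.engine: Optional[Any] = None

    def enable(self) -> None:
        # The heavy module is imported only when the feature is turned on.
        heavy = importlib.import_module("json")  # stand-in for src.rag_engine
        self.engine = heavy

c = LazyController()
assert c.engine is None      # construction pays no import cost
c.enable()
assert c.engine is not None  # dependency loaded on first use
```

The `Optional[Any]` annotation is the price of the deferral: the precise type would require importing the module the commit is trying to avoid.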
@@ -0,0 +1,58 @@
# Track: Fix Test Patches for ai_client_stub Integration

## Context

After the refactor to use `ai_client_stub` as the module alias for `app_controller`, several tests fail because they use `patch('src.ai_client.X')`, which doesn't reach the stub's module-level functions. This is a pre-existing architectural issue that needs fixing.

## Root Cause Analysis

When tests use `patch('src.ai_client.get_current_tier', return_value='Tier 3')`:

1. `patch` creates/overrides `src.ai_client` in the `src` package namespace
2. But `app_controller` does `from src import ai_client_stub as ai_client` at module load time
3. The `ai_client` local reference in `app_controller` points directly to the `ai_client_stub` module
4. The patch modifies `src.ai_client` (a different object), not `src.ai_client_stub`
5. Result: functions in `ai_client_stub` aren't patched
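The steps above can be demonstrated in isolation. This sketch fakes the package layout with `types.ModuleType` (the real repo has actual modules; everything here is a stand-in):

```python
import sys
import types
from unittest import mock

# Fake package layout mirroring the repo: src.ai_client_stub holds the
# function, src.ai_client is a different module object entirely.
src = types.ModuleType("src")
stub = types.ModuleType("src.ai_client_stub")
stub.get_current_tier = lambda: "Tier 1"
real = types.ModuleType("src.ai_client")
real.get_current_tier = lambda: "Tier 1"
src.ai_client_stub, src.ai_client = stub, real
sys.modules["src"] = src
sys.modules["src.ai_client_stub"] = stub
sys.modules["src.ai_client"] = real

# app_controller effectively does: from src import ai_client_stub as ai_client
ai_client = src.ai_client_stub

# Patching 'src.ai_client' touches a different module object, so the
# function app_controller actually calls is unchanged.
with mock.patch("src.ai_client.get_current_tier", return_value="Tier 3"):
    assert ai_client.get_current_tier() == "Tier 1"

# Patching the stub module itself reaches the right function.
with mock.patch("src.ai_client_stub.get_current_tier", return_value="Tier 3"):
    assert ai_client.get_current_tier() == "Tier 3"
```

This is the standard "where to patch" rule: patch the name in the namespace where it is looked up, not where it was defined.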

## Solution Applied

Changed all patches from `patch('src.ai_client.X')` to `patch('src.ai_client_stub.X')` wherever the stub was the actual target.

Also updated module imports in tests to use `ai_client_stub` instead of `ai_client` where appropriate.
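An alternative worth noting (not what this track used, just a sketch with stand-in names): `mock.patch.object` targets the exact object a module holds, which sidesteps the alias problem entirely:

```python
import types
from unittest import mock

# Stand-ins for the stub module and app_controller's alias to it.
stub = types.ModuleType("ai_client_stub")
stub.get_current_tier = lambda: "Tier 1"
app_controller = types.SimpleNamespace(ai_client=stub)

# patch.object patches the reference app_controller actually holds, so it
# works no matter which module name the `as ai_client` alias came from.
with mock.patch.object(app_controller.ai_client, "get_current_tier",
                       return_value="Tier 3"):
    assert app_controller.ai_client.get_current_tier() == "Tier 3"

# The original function is restored once the context manager exits.
assert app_controller.ai_client.get_current_tier() == "Tier 1"
```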

## Tasks Completed

1. [x] Fix AIProxyClient - add a `_pending_lock` threading.Lock in `__init__`
2. [x] Fix test_discussion_takes_gui.py - proper mocking for tab_item via imscope
3. [x] Fix test_on_tool_log_offloading - patch path to `src.ai_client_stub.get_current_tier`
4. [x] Fix test_redundant_calls_in_process_pending_gui_tasks - patch paths to `src.ai_client_stub`
5. [x] Fix test_gcli_path_updates_adapter - use the `ai_client_stub` module reference
6. [x] Fix test_telemetry_data_updates_correctly - patch path to `src.ai_client_stub.get_token_stats`
7. [x] Fix test_gui_updates_on_event - patch path to `src.ai_client_stub.get_token_stats`
8. [x] Run batch tests to verify all fixes
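Task 1's fix follows the usual lock-in-`__init__` pattern; a hypothetical sketch (the internals of the real AIProxyClient are assumed, only `_pending_lock` comes from the task list):

```python
import threading

class AIProxyClient:
    """Illustrative only: a pending queue guarded by a lock from __init__."""

    def __init__(self) -> None:
        self._pending: list = []
        self._pending_lock = threading.Lock()  # created before any thread runs

    def enqueue(self, item) -> None:
        with self._pending_lock:   # writers serialize on the lock
            self._pending.append(item)

    def drain(self) -> list:
        # Swap the list under the lock, process the items outside it.
        with self._pending_lock:
            items, self._pending = self._pending, []
        return items

client = AIProxyClient()
client.enqueue("task-a")
client.enqueue("task-b")
assert client.drain() == ["task-a", "task-b"]
assert client.drain() == []
```

Creating the lock in `__init__` matters: a lock added lazily on first use would itself be subject to the race it is meant to prevent.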

## Files Modified

- src/ai_client_proxy.py - added `_pending_lock`
- src/ai_client_stub.py - added module-level import for GeminiCliAdapter
- tests/test_app_controller_offloading.py
- tests/test_process_pending_gui_tasks.py
- tests/test_gui_updates.py
- tests/test_discussion_takes_gui.py

## Test Results

All previously failing tests now pass:

- test_ai_client_proxy_run.py::test_initial_state_variables ✅
- test_discussion_takes_gui.py (both tests) ✅
- test_on_tool_log_offloading ✅
- test_redundant_calls_in_process_pending_gui_tasks ✅
- test_gcli_path_updates_adapter ✅
- test_telemetry_data_updates_correctly ✅
- test_gui_updates_on_event ✅

## Checkpoints

- 169fe520 - fix(ai_client_stub): add module-level import for GeminiCliAdapter
- 12f16e9a - fix(ai_client_proxy): add _pending_lock threading.Lock
- db69e3cb - fix(tests): update discussion takes GUI tests with proper mocking
+10
-5
@@ -28,7 +28,6 @@ from src import orchestrator_pm
 from src import paths
 from src import performance_monitor
 from src import project_manager
-from src import rag_engine
 from src import session_logger
 from src import workspace_manager
 from src import presets
@@ -225,7 +224,7 @@ class AppController:
         self.mcp_config: models.MCPConfiguration = models.MCPConfiguration()
         self.view_presets: list[models.NamedViewPreset] = []
         self.rag_config: Optional[models.RAGConfig] = None
-        self.rag_engine: Optional[rag_engine.RAGEngine] = None
+        self.rag_engine: Optional[Any] = None
         self.rag_status: str = 'idle'
         # AI settings state
         self._current_provider: str = "gemini"
@@ -571,6 +570,7 @@ class AppController:
     def rag_enabled(self, value: bool) -> None:
         if self.rag_config:
             self.rag_config.enabled = value
+            from src import rag_engine
             self.rag_engine = rag_engine.RAGEngine(self.rag_config, self.active_project_root)

     @property
@@ -580,6 +580,7 @@ class AppController:
     def rag_source(self, value: str) -> None:
         if self.rag_config:
             self.rag_config.vector_store.provider = value
+            from src import rag_engine
             if self.rag_engine: self.rag_engine = rag_engine.RAGEngine(self.rag_config, self.active_project_root)

     @property
@@ -589,6 +590,7 @@ class AppController:
     def rag_emb_provider(self, value: str) -> None:
         if self.rag_config:
             self.rag_config.embedding_provider = value
+            from src import rag_engine
             if self.rag_engine: self.rag_engine = rag_engine.RAGEngine(self.rag_config, self.active_project_root)

     @property
@@ -1230,9 +1232,12 @@ class AppController:
         else:
             self.rag_config = models.RAGConfig()

-        self.rag_engine = rag_engine.RAGEngine(self.rag_config, self.active_project_root)
-        if self.rag_config.enabled and self.rag_engine.is_empty():
-            self._rebuild_rag_index()
+        self.rag_engine = None
+        if self.rag_config.enabled:
+            from src import rag_engine
+            self.rag_engine = rag_engine.RAGEngine(self.rag_config, self.active_project_root)
+            if self.rag_engine.is_empty():
+                self._rebuild_rag_index()

         from src.personas import PersonaManager
         self.persona_manager = PersonaManager(Path(self.active_project_path).parent if self.active_project_path else None)