chore(conductor): Add new track 'Zhipu AI (GLM) Provider Integration'
conductor/tracks/zhipu_integration_20260308/spec.md
# Specification: Zhipu AI (GLM) Provider Integration
## Overview
This track introduces support for Zhipu AI (z.ai) as a first-class model provider. It involves implementing a dedicated client in `src/ai_client.py` for the GLM series of models, updating configuration models, enhancing the GUI for provider selection, and integrating the provider into the tiered MMA architecture.
## Functional Requirements
### 1. Core AI Client (`src/ai_client.py`)
- **Zhipu AI Integration:**
- Implement `_send_zhipu()` to handle communication with Zhipu AI's API.
- Implement a tool-call loop similar to `_send_gemini` and `_send_anthropic`.
- Support function calling using Zhipu's native tool format.
- Support multi-modal (vision) input with `glm-4v` by encoding screenshots as base64 data.
- Implement response streaming support compatible with the existing GUI mechanism.
- **State Management:**
- Define module-level `_zhipu_client` and `_zhipu_history`.
- Ensure `ai_client.reset_session()` clears Zhipu-specific history.
- **Error Handling:**
- Implement `_classify_zhipu_error()` to map Zhipu API exceptions to `ProviderError` types (`quota`, `rate_limit`, `auth`, `balance`, `network`).
- **Model Discovery:**
- Implement `_list_zhipu_models()` to fetch available models (GLM-4, GLM-4-Flash, GLM-4V, etc.) from the API.
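The tool-call loop described above can be sketched as follows. Everything here is hypothetical scaffolding — the message shapes, `send_fn`, and the tool registry are stand-ins, not the real `_send_zhipu()` plumbing — and it only illustrates the round-trip structure shared with `_send_gemini` and `_send_anthropic`:

```python
import json
from typing import Any, Callable

# Hypothetical response shape: a dict with either "content" (final text) or
# "tool_calls" (a list of {"id", "name", "arguments"} dicts, arguments
# JSON-encoded). The real client in src/ai_client.py will differ.

def run_tool_call_loop(
    send_fn: Callable[[list[dict[str, Any]]], dict[str, Any]],
    tools: dict[str, Callable[..., str]],
    messages: list[dict[str, Any]],
    max_rounds: int = 8,
) -> str:
    """Drive a Zhipu-style tool-call loop until the model returns plain text."""
    for _ in range(max_rounds):
        response = send_fn(messages)
        calls = response.get("tool_calls")
        if not calls:
            return response.get("content", "")
        # Echo the assistant turn, then append one tool result per call.
        messages.append({"role": "assistant", "tool_calls": calls})
        for call in calls:
            handler = tools.get(call["name"])
            args = json.loads(call["arguments"] or "{}")
            result = handler(**args) if handler else f"unknown tool: {call['name']}"
            messages.append(
                {"role": "tool", "tool_call_id": call["id"], "content": result}
            )
    raise RuntimeError("tool-call loop exceeded max_rounds")
```

Injecting `send_fn` keeps the loop testable with a fake client and leaves streaming and error classification to the surrounding `_send_zhipu()` code.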
### 2. Configuration & Authentication
- **Credentials:**
- Update `src/ai_client.py:_load_credentials()` to support a `[z_ai]` section in `credentials.toml`.
- Update the `FileNotFoundError` message in `_load_credentials()` to include an example `[z_ai]` entry.
- **Cost Tracking:**
- Update `src/cost_tracker.py` with pricing for major GLM models (GLM-4, GLM-4-Flash).
### 3. GUI & Controller Integration
- **Provider Lists:**
- Add `z_ai` to the `PROVIDERS` list in `src/gui_2.py` and `src/app_controller.py`.
- **Model Fetching:**
- Ensure `AppController._fetch_models()` correctly dispatches to `ai_client.list_models("z_ai")`.
- **Settings UI:**
- The AI Settings panel should pick up the new provider and its models automatically once `z_ai` is added to the provider lists.
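The dispatch this section asks of `AppController._fetch_models()` can be sketched as a registry lookup; the names below are illustrative stand-ins, and only the `z_ai` lister is wired up in this fragment:

```python
from typing import Callable

# Hypothetical provider list mirroring PROVIDERS in src/gui_2.py and
# src/app_controller.py once "z_ai" is added.
PROVIDERS = ["gemini", "anthropic", "z_ai"]

def _list_zhipu_models() -> list[str]:
    # Stand-in for the API-backed _list_zhipu_models() in src/ai_client.py.
    return ["glm-4", "glm-4-flash", "glm-4v"]

# Per-provider listers; the real controller calls ai_client.list_models(provider).
MODEL_LISTERS: dict[str, Callable[[], list[str]]] = {
    "z_ai": _list_zhipu_models,
}

def fetch_models(provider: str) -> list[str]:
    """Dispatch to the matching lister; unknown providers fail loudly."""
    if provider not in PROVIDERS:
        raise ValueError(f"unknown provider: {provider}")
    return MODEL_LISTERS[provider]()
```

Because the GUI iterates `PROVIDERS` to build its dropdown, adding `"z_ai"` to the list is the only UI change the panel itself needs.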
### 4. MMA Orchestration
- **Tier Support:**
- Verify that agents in all tiers (1-4) can be configured to use Zhipu AI models via the `mma_tier_usage` dict in `AppController`.
- Ensure `run_worker_lifecycle` in `src/multi_agent_conductor.py` correctly passes Zhipu model names to `ai_client.send()`.
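A rough sketch of the tier routing, assuming `mma_tier_usage` maps tier numbers to provider/model pairs; the dict's real shape in `AppController` may differ, and the model names here are illustrative:

```python
# Hypothetical shape of AppController.mma_tier_usage: tier -> provider/model.
mma_tier_usage = {
    1: {"provider": "z_ai", "model": "glm-4"},
    2: {"provider": "z_ai", "model": "glm-4-flash"},
    3: {"provider": "anthropic", "model": "claude-sonnet"},
    4: {"provider": "z_ai", "model": "glm-4-flash"},
}

def model_for_tier(tier: int) -> tuple[str, str]:
    """Resolve the (provider, model) pair a worker passes to ai_client.send()."""
    entry = mma_tier_usage[tier]
    return entry["provider"], entry["model"]
```

`run_worker_lifecycle` would call something like `model_for_tier(tier)` and forward the result, so no tier-specific Zhipu logic should be needed in `src/multi_agent_conductor.py` itself.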
## Non-Functional Requirements
- **Consistency:** Follow the established pattern of module-level singleton state in `ai_client.py`.
- **Latency:** The tool-call loop should add minimal overhead beyond the provider's own response time.
- **Security:** Rigorously protect the Zhipu AI API key; ensure it is never logged or exposed in the GUI.
## Acceptance Criteria
- [ ] Zhipu AI can be selected as a provider in the AI Settings panel.
- [ ] Models like `glm-4` and `glm-4-flash` are listed and selectable.
- [ ] Agents can successfully use tools (e.g., `read_file`, `run_powershell`) using Zhipu AI models.
- [ ] Screenshots are correctly processed and described by vision-capable models (`glm-4v`).
- [ ] Response streaming is functional in the Discussion panel.
- [ ] Estimated costs for Zhipu AI calls are displayed in the MMA Dashboard.
## Out of Scope
- Support for Zhipu AI's knowledge base or vector store features.
- Support for Zhipu AI's Batch API.