chore(conductor): Add new track 'DeepSeek API Support'
conductor/tracks/deepseek_support_20260225/spec.md (new file, +31 lines)

# Specification: DeepSeek API Provider Support
## Overview
Implement a new AI provider module to support the DeepSeek API within the Manual Slop application. This integration will leverage a dedicated SDK to provide access to high-performance models (DeepSeek-V3 and DeepSeek-R1) with support for streaming, tool calling, and detailed reasoning traces.
## Functional Requirements
- **Dedicated SDK Integration:** Utilize a DeepSeek-specific Python client for API interactions.
- **Model Support:** Initial support for `deepseek-v3` (general-purpose) and `deepseek-r1` (reasoning).
- **Core Features:**
- **Streaming:** Support real-time response generation for a better user experience.
- **Tool Calling:** Integrate with Manual Slop's existing tool/function execution framework.
- **Reasoning Traces:** Capture and display reasoning paths if provided by the model (e.g., DeepSeek-R1).
- **Configuration Management:**
- Add `[deepseek]` section to `credentials.toml` for `api_key`.
- Update `config.toml` to allow selecting DeepSeek as the active provider.
## Non-Functional Requirements
- **Parity:** Maintain consistency with existing Gemini and Anthropic provider implementations in `ai_client.py`.
- **Error Handling:** Robust handling of API rate limits and connection issues specific to DeepSeek.
- **Observability:** Track token usage and costs according to DeepSeek's pricing model.
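Rate-limit handling could follow the same retry-with-backoff pattern presumably used by the existing providers; a minimal sketch (the `RateLimitError` name and the retry parameters are assumptions for illustration, not part of any DeepSeek SDK):

```python
import time


class RateLimitError(Exception):
    """Placeholder for whatever rate-limit exception the DeepSeek client raises."""


def with_backoff(call, max_retries: int = 3, base_delay: float = 1.0):
    """Invoke `call`, retrying on RateLimitError with exponential backoff."""
    for attempt in range(max_retries + 1):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries:
                raise  # out of retries: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
```

Connection errors could be folded into the same loop, or handled separately if they should fail fast rather than retry.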
## Acceptance Criteria
- [ ] User can select "DeepSeek" as a provider in the GUI.
- [ ] Successful completion of prompts using both DeepSeek-V3 and DeepSeek-R1 models.
- [ ] Tool calling works correctly for standard operations (e.g., `read_file`).
- [ ] Reasoning traces from R1 are captured and visible in the discussion history.
- [ ] Streaming responses function correctly without blocking the GUI.
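One common way to satisfy the last criterion is to read the stream on a worker thread and hand chunks to the GUI loop through a thread-safe queue; a provider-agnostic sketch (the chunk iterator is a stand-in, since the spec does not fix the SDK's streaming interface):

```python
import queue
import threading


def stream_to_queue(chunks, out: queue.Queue) -> None:
    """Worker thread: drain the provider's chunk iterator into a queue."""
    for chunk in chunks:
        out.put(chunk)
    out.put(None)  # sentinel: stream finished


def consume(chunks) -> str:
    """Collect streamed text without doing network I/O on the calling thread."""
    q: queue.Queue = queue.Queue()
    worker = threading.Thread(target=stream_to_queue, args=(chunks, q), daemon=True)
    worker.start()
    parts = []
    while True:
        # In a real GUI this would be q.get_nowait() inside a timer callback,
        # so the event loop keeps running between chunks.
        chunk = q.get()
        if chunk is None:
            break
        parts.append(chunk)
    worker.join()
    return "".join(parts)
```

The same worker/queue split would also carry R1 reasoning-trace chunks, tagged so the GUI can route them to the discussion history.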
## Out of Scope
- Support for OpenAI-compatible proxies for DeepSeek in this initial track.
- Automated fine-tuning or custom model endpoints.