Specification: DeepSeek API Provider Support
Overview
Implement a new AI provider module to support the DeepSeek API within the Manual Slop application. This integration will leverage a dedicated SDK to provide access to high-performance models (DeepSeek-V3 and DeepSeek-R1) with support for streaming, tool calling, and detailed reasoning traces.
Functional Requirements
- Dedicated SDK Integration: Utilize a DeepSeek-specific Python client for API interactions.
- Model Support: Initial support for `deepseek-v3` (general performance) and `deepseek-r1` (reasoning).
- Core Features:
  - Streaming: Support real-time response generation for a better user experience.
  - Tool Calling: Integrate with Manual Slop's existing tool/function execution framework.
  - Reasoning Traces: Capture and display reasoning paths if provided by the model (e.g., DeepSeek-R1).
- Configuration Management:
  - Add a `[deepseek]` section to `credentials.toml` for the `api_key`.
  - Update `config.toml` to allow selecting DeepSeek as the active provider.
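The configuration additions above might look like the following. This is a hypothetical layout: the exact section and key names (`[deepseek]`, `api_key`, `provider`, `model`) are assumptions modeled on the existing provider sections, not confirmed file contents.

```toml
# credentials.toml — hypothetical [deepseek] section alongside existing providers
[deepseek]
api_key = "sk-..."   # placeholder; real key supplied by the user
```

```toml
# config.toml — hypothetical provider-selection keys
[ai]
provider = "deepseek"     # e.g. "gemini" | "anthropic" | "deepseek"
model = "deepseek-v3"     # or "deepseek-r1" for reasoning
```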
Non-Functional Requirements
- Parity: Maintain consistency with the existing Gemini and Anthropic provider implementations in `ai_client.py`.
- Error Handling: Robust handling of API rate limits and connection issues specific to DeepSeek.
- Observability: Track token usage and costs according to DeepSeek's pricing model.
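The error-handling and observability requirements above can be sketched together. This is a minimal illustration, not the actual implementation: `RateLimitError` stands in for whatever exception the DeepSeek client raises, and the per-million-token prices are placeholder values to be replaced with DeepSeek's real pricing.

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for the SDK's rate-limit exception (hypothetical name)."""

def with_backoff(call, max_retries=5, base_delay=0.5, sleep=time.sleep):
    """Retry `call` on rate-limit errors with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))

class UsageTracker:
    """Accumulate token counts and estimated cost per request."""
    # Placeholder USD prices per million (input, output) tokens —
    # substitute DeepSeek's actual published pricing.
    PRICES = {"deepseek-v3": (1.00, 2.00), "deepseek-r1": (2.00, 4.00)}

    def __init__(self):
        self.prompt_tokens = 0
        self.completion_tokens = 0
        self.cost_usd = 0.0

    def record(self, model, prompt_tokens, completion_tokens):
        in_price, out_price = self.PRICES[model]
        self.prompt_tokens += prompt_tokens
        self.completion_tokens += completion_tokens
        self.cost_usd += (prompt_tokens * in_price
                          + completion_tokens * out_price) / 1_000_000
```

Injecting `sleep` as a parameter keeps the backoff loop testable without real delays.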
Acceptance Criteria
- User can select "DeepSeek" as a provider in the GUI.
- Successful completion of prompts using both DeepSeek-V3 and DeepSeek-R1 models.
- Tool calling works correctly for standard operations (e.g., `read_file`).
- Reasoning traces from R1 are captured and visible in the discussion history.
- Streaming responses function correctly without blocking the GUI.
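The last two criteria — reasoning traces captured, streaming without blocking the GUI — can be sketched with a worker thread feeding a queue that the GUI polls. This is an illustration only: `chunks` stands in for the SDK's streaming iterator, and the `("reasoning" | "text", value)` chunk shape is an assumption, not a real API.

```python
import queue
import threading

def stream_worker(chunks, out_queue):
    """Consume a streaming response off the GUI thread, forwarding each
    chunk — reasoning trace or answer text — to a thread-safe queue."""
    for kind, value in chunks:
        out_queue.put((kind, value))
    out_queue.put(None)  # sentinel: stream finished

def collect(chunks):
    """Drive the worker and gather results, as a GUI event loop might do
    by draining the queue between repaints instead of blocking on the API."""
    q = queue.Queue()
    t = threading.Thread(target=stream_worker, args=(chunks, q), daemon=True)
    t.start()
    text, reasoning = [], []
    while True:
        item = q.get()
        if item is None:
            break
        kind, value = item
        (reasoning if kind == "reasoning" else text).append(value)
    t.join()
    return "".join(text), "".join(reasoning)
```

Keeping reasoning chunks in a separate buffer lets the discussion history render the R1 trace distinctly from the final answer.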
Out of Scope
- Support for OpenAI-compatible proxies for DeepSeek in this initial track.
- Automated fine-tuning or custom model endpoints.