docs
ai_client.py | 12 ++++++++++++
@@ -1,3 +1,15 @@
+# ai_client.py
+"""
+Note(Gemini):
+Acts as the unified interface for multiple LLM providers (Anthropic, Gemini).
+Abstracts away the differences in how they handle tool schemas, history, and caching.
+
+For Anthropic: aggressively manages the ~200k token limit by manually culling
+stale [FILES UPDATED] entries and dropping the oldest message pairs.
+
+For Gemini: injects the initial context directly into system_instruction
+during chat creation to avoid massive history bloat.
+"""
 # ai_client.py
 import tomllib
 import json
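The Anthropic-side pruning the docstring describes can be sketched roughly as below. This is a minimal illustration, not code from ai_client.py: the function names, the message shape, and the character-based token estimate are all assumptions.

```python
# Hypothetical sketch of the pruning strategy described in the docstring:
# cull stale "[FILES UPDATED]" entries, then drop the oldest user/assistant
# pairs until the estimated token count fits the budget.

def estimate_tokens(messages):
    # Crude heuristic: ~4 characters per token. A real client would use
    # the provider's own token counter instead.
    return sum(len(m["content"]) for m in messages) // 4

def prune_history(messages, budget=200_000):
    # 1. Walk newest-to-oldest, keeping only the most recent
    #    [FILES UPDATED] snapshot; older ones are stale.
    kept, seen_update = [], False
    for m in reversed(messages):
        if m["content"].startswith("[FILES UPDATED]"):
            if seen_update:
                continue  # stale snapshot, drop it
            seen_update = True
        kept.append(m)
    kept.reverse()

    # 2. Drop the oldest message pair until the history fits the budget.
    while len(kept) > 2 and estimate_tokens(kept) > budget:
        kept = kept[2:]  # remove one user/assistant pair
    return kept
```

The two-phase order matters: culling snapshots first often frees enough of the budget that fewer conversational pairs need to be sacrificed.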
Reference in New Issue
Block a user
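The Gemini-side approach amounts to folding the initial context into the system instruction once, at chat creation, rather than replaying it as history on every turn. A minimal sketch, with a hypothetical helper name and `[INITIAL CONTEXT]` delimiter not taken from this commit:

```python
# Hypothetical sketch of the Gemini-side approach from the docstring:
# inject the initial context into system_instruction once, at chat
# creation, so it never accumulates in the per-turn history.

def build_system_instruction(base_prompt, initial_context):
    # Context placed here is sent once with chat creation instead of
    # being re-counted against every subsequent request's history.
    if not initial_context:
        return base_prompt
    return f"{base_prompt}\n\n[INITIAL CONTEXT]\n{initial_context}"
```

The resulting string would then be supplied as the chat's `system_instruction` when the Gemini chat session is created; the exact SDK call shape is not shown in this commit.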