wip
@@ -98,9 +98,17 @@ history = [
]
[discussion.discussions."Make sure it works well with gemini"]
git_commit = "f126cdcb21f57e0fdead682ff6bba7bf6cc29356"
last_updated = "2026-02-21T22:23:46"
git_commit = "32af96b3658b813cf4fd7e270b20d661187394cf"
last_updated = "2026-02-21T22:35:01"
history = [
"@2026-02-21T22:21:06\nUser:\nQuestion: For how we're doing things right now, you have to re-read files cause its not saved in discussion history rigt per me sending a gen + send?\nIs there a way you can do a routine to save important files to the discussion or upload them temporarily to a gemini cache so that we don't have to worrya bout re-reading read-only things you already gathered info on for an active discussion?\nI'm not sure the depths of how this custom api works or custom tooling, etc gemini api from google provides...\n\n(I'm also reviewing using this with gemini to make sure there are no hiccups from how the api is utilized).",
"@2026-02-21T22:21:08\nSystem:\nNo response from ai... when done. Other gemini in the anything llm discussion help!!!",
"@2026-02-21T22:26:26\nUser:\nQuestion: For how we're doing things right now, you have to re-read files cause its not saved in discussion history rigt per me sending a gen + send?\nIs there a way you can do a routine to save important files to the discussion or upload them temporarily to a gemini cache so that we don't have to worrya bout re-reading read-only things you already gathered info on for an active discussion?\nI'm not sure the depths of how this custom api works or custom tooling, etc gemini api from google provides...\n\n(I'm also reviewing using this with gemini to make sure there are no hiccups from how the api is utilized).",
"@2026-02-21T22:26:32\nAI:\n(No text returned by the model)",
"@2026-02-21T22:27:47\nUser:\nUm... So can you answer my question? I want to know what we should do about preserving as much context ideally with how gemini's api works...",
"@2026-02-21T22:28:41\nAI:\nTo answer your question, here is exactly how your current context and memory work with Gemini, and why there's actually a hidden issue we should address:\n\n### How memory works right now\n1. **Tool History is Remembered (Temporarily):** During an active session (until you close the app or click \"Reset Session\"), the `_gemini_chat` object keeps the entire conversation history in memory. **I actually do not need to re-read files** I've already opened with my tools during our active discussion. Gemini remembers the tool results.\n2. **Session Resets Wipe Memory:** If you close the app or change discussions, that memory is gone. The only things that survive are the files you explicitly added to the \"Files\" list in the GUI.\n\n### The Hidden Issue with `md_content`\nRight now, every time you hit \"Send\", `ai_client.py` does this:\n```python\npayload_to_send = f\"<context>\\n{md_content}\\n</context>\\n\\n{user_message}\"\n_gemini_chat.send_message(payload_to_send)\n```\nBecause `_gemini_chat` remembers history, **the entire `<context>` block (every file in your Files list) is being duplicated into the conversation history every single time you send a message!** This severely bloats your token usage and can cause issues with the API.\n\n### How we can solve this (My Suggestions)\n\n**1. Fix the Context Duplication (Recommended First Step)**\nInstead of prepending `md_content` to your user message on every single turn, we should modify `ai_client.py` to inject `md_content` into the `system_instruction` when `_gemini_chat` is first created. This means the context is loaded once per session and doesn't duplicate.\n\n**2. Use Gemini's Context Caching API (For Large Codebases)**\nGoogle provides a specific Context Caching API (`client.caches.create()`). We could modify the app so that if your \"Files\" list is very large (e.g., >32k tokens), we upload the files to a temporary Gemini cache. This would heavily reduce your API costs and speed up response times.\n\n**3. Auto-Adding to the \"Files\" List**\nIf I find a file important while I'm exploring using tools, I can use my PowerShell tool to automatically inject that file path into your `manual_slop.toml` project files list. That way, the file becomes permanent context for our discussion without you having to manually add it in the GUI.\n\n***\n\n**How would you like to proceed?** \nI highly recommend we at least fix the context duplication bug by moving `md_content` to the `system_instruction` for Gemini, but I will wait for your decision! I have updated `MainContext.md` to reflect this discussion.",
"@2026-02-21T22:30:28\nUser:\nLets do these fixes. Also on the MainContext.md lets make that project configurable in the project's toml config.",
"@2026-02-21T22:30:29\nAI:\n(No text returned by the model)",
"@2026-02-21T22:30:58\nUser:\ncontinue if your not done.",
"@2026-02-21T22:32:22\nAI:\n(No text returned by the model)",
"@2026-02-21T22:34:01\nUser:\ncontinue.",
"@2026-02-21T22:34:48\nAI:\n(No text returned by the model)",
]
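The context-duplication fix and the caching option discussed in the exchange above could look roughly like the sketch below. This is a minimal illustration assuming the google-genai Python SDK; the function names (`start_discussion_chat`, `send_user_message`, `cache_large_context`), the model name, and the way this would plug into `ai_client.py` are assumptions for illustration, not the project's actual code.

```python
# Sketch only: load the Files-list context once per session instead of
# prepending it to every user message (the duplication bug described above).
from google import genai
from google.genai import types

client = genai.Client(api_key="YOUR_API_KEY")  # placeholder key


def start_discussion_chat(md_content: str, model: str = "gemini-2.0-flash"):
    """Create the chat once; the file context lives in the system instruction."""
    return client.chats.create(
        model=model,
        config=types.GenerateContentConfig(
            # Sent once when the chat is created, not duplicated on every turn.
            system_instruction=f"<context>\n{md_content}\n</context>",
        ),
    )


def send_user_message(chat, user_message: str) -> str:
    # Only the raw user text is appended to the conversation history now.
    response = chat.send_message(user_message)
    return response.text or "(No text returned by the model)"


def cache_large_context(md_content: str, model: str = "gemini-2.0-flash"):
    """Optional: for very large Files lists, use explicit context caching
    (client.caches.create) so the context is uploaded and billed once."""
    return client.caches.create(
        model=model,
        config=types.CreateCachedContentConfig(
            contents=[md_content],
            ttl="3600s",  # keep the cache alive for one hour
        ),
    )
```

A later generation call would then pass `cached_content=cache.name` in its `GenerateContentConfig` so the cached files are reused instead of being re-sent with every request.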