2026-02-21 14:44:06 -05:00
parent fa78c1d08a
commit 4b4c4d96ad
2 changed files with 51 additions and 2 deletions

MainContext.md Normal file

@@ -0,0 +1,42 @@
**manual_slop** is a local GUI tool for manually curating and sending context to AI APIs. It aggregates files, screenshots, and discussion history into a structured markdown file and sends it, together with a user-written message, to the chosen AI provider.
**Stack:**
- `dearpygui` - GUI with docking/floating/resizable panels
- `google-genai` - Gemini API
- `anthropic` - Anthropic API
- `tomli-w` - TOML writing
- `uv` - package/env management
**Files:**
- `gui.py` - main GUI, `App` class, all panels, all callbacks
- `ai_client.py` - unified provider wrapper, model listing, session management, send
- `aggregate.py` - reads config, collects files/screenshots/discussion, writes numbered `.md` files to `output_dir/md_gen/`
- `config.toml` - namespace, output_dir, files paths+base_dir, screenshots paths+base_dir, discussion history array, ai provider+model (loading sketched after this list)
- `credentials.toml` - gemini api_key, anthropic api_key
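
As a minimal sketch (assuming Python 3.11+'s stdlib `tomllib` for reading; the actual loading code in `gui.py`/`aggregate.py` may differ), the two TOML files above could be read like this:

```python
# Hypothetical loader sketch; section names follow config.toml / credentials.toml above.
import tomllib          # stdlib reader (3.11+); tomli-w from the stack is only for writing
from pathlib import Path

def load_toml(path: str) -> dict:
    with Path(path).open("rb") as f:
        return tomllib.load(f)

config = load_toml("config.toml")        # [output], [files], [screenshots], [discussion], [ai]
creds = load_toml("credentials.toml")    # e.g. creds["gemini"]["api_key"]
provider = config["ai"]["provider"]      # "gemini" or "anthropic"
api_key = creds[provider]["api_key"]
```
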
**GUI Panels:**
- **Config** - namespace, output dir, save
- **Files** - base_dir, scrollable path list with remove, add file(s), add wildcard
- **Screenshots** - base_dir, scrollable path list with remove, add screenshot(s)
- **Discussion History** - multiline text box, `---` as separator between excerpts, save splits on `---` back into toml array
- **Provider** - provider combo (gemini/anthropic), model listbox populated from API, fetch models button, status line
- **Message** - multiline input, Gen+Send button, MD Only button, Reset session button
- **Response** - readonly multiline displaying last AI response
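
As an illustration of how one of these panels is put together in dearpygui (a sketch only: tags, sizes, and the callback below are hypothetical, not the ones in `gui.py`), a Message/Response pair could look like:

```python
# Illustrative dearpygui sketch of the Message and Response panels.
import dearpygui.dearpygui as dpg

def cb_generate_send(sender, app_data, user_data):
    message = dpg.get_value("msg_input")          # read the multiline input
    dpg.set_value("response_box", f"(would send: {message!r})")

dpg.create_context()
dpg.configure_app(docking=True, docking_space=True)   # docking/floating/resizable panels

with dpg.window(label="Message", tag="win_message"):
    dpg.add_input_text(tag="msg_input", multiline=True, width=-1, height=150)
    dpg.add_button(label="Gen+Send", callback=cb_generate_send)

with dpg.window(label="Response", tag="win_response"):
    dpg.add_input_text(tag="response_box", multiline=True, readonly=True,
                       width=-1, height=300)

dpg.create_viewport(title="manual_slop", width=1100, height=700)
dpg.setup_dearpygui()
dpg.show_viewport()
dpg.start_dearpygui()
dpg.destroy_context()
```
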
**Data flow:**
1. GUI edits are held in `App` state lists (`self.files`, `self.screenshots`, `self.history`) and dpg widget values
2. `_flush_to_config()` pulls all widget values into `self.config` dict
3. `_do_generate()` calls `_flush_to_config()`, saves `config.toml`, calls `aggregate.run(config)` which writes the md and returns `(markdown_str, path)`
4. `cb_generate_send()` calls `_do_generate()` then threads a call to `ai_client.send(md, message)`
5. `ai_client.send()` prepends the md as a `<context>` block to the user message and sends via the active provider chat session
6. Sessions are stateful within a run (chat history maintained), `Reset` clears them
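
In condensed form, steps 2-6 might look roughly like this (method names follow the list above; the bodies are an illustrative guess rather than the actual implementation, and widget tags like `msg_input` are made up):

```python
# Illustrative sketch of the send path (steps 2-6); not the real gui.py / ai_client.py code.
import threading
import dearpygui.dearpygui as dpg
import aggregate, ai_client

class App:
    def _flush_to_config(self):
        # Step 2: pull App state lists and widget values into the config dict.
        self.config["files"]["paths"] = list(self.files)
        self.config["screenshots"]["paths"] = list(self.screenshots)
        self.config["discussion"]["history"] = list(self.history)
        self.config["ai"]["provider"] = dpg.get_value("provider_combo")   # hypothetical tag

    def _do_generate(self):
        # Step 3: persist config, then let aggregate build the markdown.
        self._flush_to_config()
        self.save_config()                        # writes config.toml
        return aggregate.run(self.config)         # -> (markdown_str, path)

    def cb_generate_send(self, sender, app_data):
        # Step 4: generate, then send on a worker thread so the GUI stays responsive.
        md, _path = self._do_generate()
        message = dpg.get_value("msg_input")      # hypothetical tag
        threading.Thread(
            target=lambda: dpg.set_value("response_box", ai_client.send(md, message)),
            daemon=True,
        ).start()

# Step 5, inside ai_client.send(): wrap the markdown in a <context> block.
def send(md: str, message: str) -> str:
    prompt = f"<context>\n{md}\n</context>\n\n{message}"
    # _active_session stands in for the stateful provider chat (cleared by Reset, step 6).
    return _active_session.send(prompt)
```
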
**Config persistence:**
- Every send and save writes `config.toml` with current state including selected provider and model under `[ai]`
- Discussion history is stored as a TOML array of strings in `[discussion] history`
- File and screenshot paths are stored as TOML arrays and support absolute paths, paths relative to `base_dir`, and `**/*` wildcards
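
A sketch of what that persistence and path handling could look like (using `tomli_w` from the stack; `resolve_paths` is a hypothetical helper, not necessarily how `aggregate.py` does it):

```python
# Sketch of config persistence and path resolution; helper names are hypothetical.
from pathlib import Path
import tomli_w

def save_config(config: dict, path: str = "config.toml") -> None:
    # Called on every send and save; [ai] carries the selected provider/model.
    with open(path, "wb") as f:
        tomli_w.dump(config, f)

def resolve_paths(entries: list[str], base_dir: str) -> list[Path]:
    """Expand absolute paths, base_dir-relative paths, and **/* wildcards."""
    base = Path(base_dir)
    resolved: list[Path] = []
    for entry in entries:
        if any(ch in entry for ch in "*?["):
            resolved.extend(sorted(base.glob(entry)))   # e.g. "src/**/*.py"
        else:
            p = Path(entry)
            resolved.append(p if p.is_absolute() else base / p)
    return resolved
```
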
**Known extension points:**
- Add more providers by adding a section to `credentials.toml`, `_list_*` and `_send_*` functions in `ai_client.py`, and the provider name to the `PROVIDERS` list in `gui.py` (sketched after this list)
- System prompt support could be added as a field in `config.toml` and passed in `ai_client.send()`
- Discussion history excerpts could be individually toggleable for inclusion in the generated md
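
For the first point, the wiring for a hypothetical third provider (here called "acme", purely as a placeholder) would roughly be:

```python
# Sketch of wiring in a hypothetical provider named "acme".
# credentials.toml would gain:
#   [acme]
#   api_key = "..."

# ai_client.py -- the two provider hooks:
def _list_acme(api_key: str) -> list[str]:
    """Return available model names for the Provider panel's listbox."""
    ...  # call the provider's model-listing endpoint here

def _send_acme(api_key: str, model: str, prompt: str) -> str:
    """Send prompt on a stateful chat session and return the reply text."""
    ...  # call the provider's chat endpoint here

# gui.py -- make it selectable in the provider combo:
PROVIDERS = ["gemini", "anthropic", "acme"]
```
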

config.toml

@@ -1,10 +1,17 @@
 [output]
-namespace = "colorforth_bootslop"
-output_dir = "."
+namespace = "manual_slop"
+output_dir = "./md_gen"
 [files]
 base_dir = "C:/projects/manual_slop"
 paths = [
+"config.toml",
+"ai_client.py",
+"aggregate.py",
+"gemini.py",
+"gui.py",
+"pyproject.toml",
+"MainContext.md"
 ]
 [screenshots]