manual_slop is a local GUI tool for manually curating and sending context to AI APIs. It aggregates files, screenshots, and discussion history into a structured markdown file and sends it to a chosen AI provider with a user-written message.
Stack:
- `dearpygui` - GUI with docking/floating/resizable panels
- `google-genai` - Gemini API
- `anthropic` - Anthropic API
- `tomli-w` - TOML writing
- `uv` - package/env management
Files:
- `gui.py` - main GUI, `App` class, all panels, all callbacks
- `ai_client.py` - unified provider wrapper, model listing, session management, send
- `aggregate.py` - reads config, collects files/screenshots/discussion, writes numbered `.md` files to `output_dir/md_gen/`
- `config.toml` - namespace, output_dir, files paths + base_dir, screenshots paths + base_dir, discussion history array, ai provider + model
- `credentials.toml` - gemini api_key, anthropic api_key
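The numbered output described for `aggregate.py` can be sketched roughly as below; only the `output_dir/md_gen/` location comes from the notes, while the `<namespace>_NNN.md` zero-padded counter scheme is an assumption for illustration:

```python
from pathlib import Path

def next_md_path(output_dir: str, namespace: str) -> Path:
    """Pick the next free numbered .md path under output_dir/md_gen/.

    Assumption: files are named "<namespace>_NNN.md"; the notes only
    say aggregate.py writes numbered .md files to output_dir/md_gen/.
    """
    md_dir = Path(output_dir) / "md_gen"
    md_dir.mkdir(parents=True, exist_ok=True)
    existing = sorted(md_dir.glob(f"{namespace}_*.md"))
    if existing:
        n = int(existing[-1].stem.rsplit("_", 1)[-1]) + 1
    else:
        n = 1
    return md_dir / f"{namespace}_{n:03d}.md"
```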
GUI Panels:
- Config - namespace, output dir, save
- Files - base_dir, scrollable path list with remove, add file(s), add wildcard
- Screenshots - base_dir, scrollable path list with remove, add screenshot(s)
- Discussion History - multiline text box; `---` acts as a separator between excerpts, and save splits on `---` back into the TOML array
- Provider - provider combo (gemini/anthropic), model listbox populated from the API, fetch models button, status line
- Message - multiline input, Gen+Send button, MD Only button, Reset session button
- Response - readonly multiline displaying last AI response
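The Discussion History round-trip described above is essentially a split/join on the `---` separator; a minimal sketch (function names are illustrative, not the actual callbacks):

```python
def text_to_history(text: str) -> list[str]:
    """Split the multiline text-box contents on standalone '---' lines
    into a list of excerpts, dropping empty entries."""
    parts = text.split("\n---\n")
    return [p.strip() for p in parts if p.strip()]

def history_to_text(history: list[str]) -> str:
    """Join the TOML-array form back into the text-box form."""
    return "\n---\n".join(history)
```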
Data flow:
- GUI edits are held in `App` state lists (`self.files`, `self.screenshots`, `self.history`) and dpg widget values
- `_flush_to_config()` pulls all widget values into the `self.config` dict
- `_do_generate()` calls `_flush_to_config()`, saves `config.toml`, then calls `aggregate.run(config)`, which writes the md and returns `(markdown_str, path)`
- `cb_generate_send()` calls `_do_generate()`, then threads a call to `ai_client.send(md, message)`
- `ai_client.send()` prepends the md as a `<context>` block to the user message and sends it via the active provider's chat session
- Sessions are stateful within a run (chat history is maintained); Reset clears them
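The prompt assembly in `ai_client.send()` can be sketched as follows; the exact tag layout is an assumption, since the notes only say the md is prepended as a `<context>` block:

```python
def build_prompt(md: str, message: str) -> str:
    """Prepend the aggregated markdown as a <context> block to the
    user message. The precise wrapping format here is an assumption."""
    return f"<context>\n{md}\n</context>\n\n{message}"
```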
Config persistence:
- Every send and save writes `config.toml` with the current state, including the selected provider and model under `[ai]`
- Discussion history is stored as a TOML array of strings in `[discussion] history`
- File and screenshot paths are stored as TOML arrays; they support absolute paths, paths relative to base_dir, and `**/*` wildcards
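Putting the described keys together, a `config.toml` might look like the sketch below; only the key names listed in the notes (namespace, output_dir, base_dir, paths, history, provider, model) are from the project, and all values are illustrative:

```toml
namespace = "myproject"
output_dir = "out"

[files]
base_dir = "~/src/myproject"
paths = ["gui.py", "src/**/*.py", "/abs/path/notes.md"]

[screenshots]
base_dir = "~/Pictures"
paths = ["shot1.png"]

[discussion]
history = ["first excerpt", "second excerpt"]

[ai]
provider = "gemini"
model = "example-model-name"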
Known extension points:
- Add more providers by adding a section to `credentials.toml`, a `_list_*` and `_send_*` function in `ai_client.py`, and the provider name to the `PROVIDERS` list in `gui.py`
- System prompt support could be added as a field in `config.toml` and passed in `ai_client.send()`
- Discussion history excerpts could be individually toggleable for inclusion in the generated md
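Under the `_list_*`/`_send_*` naming convention above, provider dispatch can be sketched as a name-based lookup; the stub bodies below are placeholders, not the real google-genai or anthropic calls:

```python
# Hypothetical stubs following the _list_*/_send_* convention from the notes.
def _list_gemini(creds: dict) -> list[str]:
    return []  # would query available models via the google-genai client

def _send_gemini(creds: dict, prompt: str) -> str:
    return ""  # would send prompt through a google-genai chat session

PROVIDERS = ["gemini"]  # gui.py's provider combo would be built from this

def list_models(provider: str, creds: dict) -> list[str]:
    """Resolve _list_<provider> by name; adding a provider then means
    adding matching _list_*/_send_* functions and extending PROVIDERS."""
    return globals()[f"_list_{provider}"](creds)
```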