Compare commits

...

16 Commits

Author SHA1 Message Date
Ed_  9b6d16b4e0  update progress snapshot  2026-03-11 00:38:21 -04:00
Ed_  847096d192  checkpoint done with ux refinement for the night  2026-03-11 00:32:35 -04:00
Ed_  7ee50f979a  fix(gui): fix tool presets and biases panel and cache analytics section layout  2026-03-11 00:25:04 -04:00
Ed_  3870bf086c  refactor(gui): redesign ai settings layout and fix model fetching sync  2026-03-11 00:18:45 -04:00
Ed_  747b810fe1  refactor(gui): redesign AI settings and usage analytics UI  2026-03-11 00:07:11 -04:00
Ed_  3ba05b8a6a  refactor(gui): improve persona preferred models UI and remove embedded preset managers  2026-03-10 23:50:29 -04:00
Ed_  94598b605a  checkpoint dealing with personal manager/editor  2026-03-10 23:47:53 -04:00
Ed_  26e03d2c9f  refactor(gui): redesign persona modal as non-blocking window and embed sub-managers  2026-03-10 23:28:20 -04:00
Ed_  6da3d95c0e  refactor(gui): redesign persona editor UI and replace popup modals with standard windows  2026-03-10 23:21:14 -04:00
Ed_  6ae8737c1a  fix bug  2026-03-10 22:54:24 -04:00
Ed_  92e7352d37  feat(gui): implement persona manager two-pane layout and dynamic model preference list  2026-03-10 22:45:35 -04:00
Ed_  ca8e33837b  refactor(gui): streamline preset manager and improve tool bias ui  2026-03-10 22:29:43 -04:00
Ed_  fa5ead2c69  docs(conductor): Synchronize docs for track 'Agent Personas: Unified Profiles & Tool Presets'  2026-03-10 21:28:05 -04:00
Ed_  67a269b05d  test: align tests with new Persona system  2026-03-10 21:26:31 -04:00
Ed_  ee3a811cc9  fix(gui): render persona editor modal correctly and align with Persona model attributes  2026-03-10 21:24:57 -04:00
Ed_  6b587d76a7  fix(gui): render persona editor modal correctly and align with Persona model attributes  2026-03-10 21:20:05 -04:00
52 changed files with 1440 additions and 635 deletions

View File

@@ -10,7 +10,7 @@ A high-density GUI orchestrator for local LLM-driven coding sessions. Manual Slo
 **Providers**: Gemini API, Anthropic API, DeepSeek, Gemini CLI (headless), MiniMax
 **Platform**: Windows (PowerShell) — single developer, local use
-![img](./gallery/python_2026-03-07_14-32-50.png)
+![img](./gallery/python_2026-03-11_00-37-21.png)
 ---

View File

@@ -73,6 +73,10 @@ For deep implementation details when planning or implementing tracks, consult `d
 - **Scoped Inheritance:** Supports **Global** (application-wide) and **Project-Specific** presets. Project presets with the same name automatically override global counterparts, allowing for fine-tuned context tailoring.
 - **Full AI Profiles:** Presets capture not only the system prompt text but also critical model parameters like **Temperature**, **Top-P**, and **Max Output Tokens**.
 - **Preset Manager Modal:** A dedicated high-density GUI for creating, editing, and deleting presets with real-time validation and instant application to the active session.
+- **Agent Personas & Unified Profiles:** Consolidates model settings, provider routing, system prompts, tool presets, and bias profiles into named "Persona" entities.
+- **Single Configuration Entity:** Switch models, tool weights, and system prompts simultaneously using a single Persona selection.
+- **Persona Editor Modal:** A dedicated high-density GUI for creating, editing, and deleting Personas.
+- **MMA Granular Assignment:** Allows assigning specific Personas to individual agents within the 4-Tier Hierarchical MMA.
 - **Agent Tool Weighting & Bias:** Influences agent tool selection via a weighting system.
 - **Semantic Nudging:** Automatically prefixes tool and parameter descriptions with priority tags (e.g., [HIGH PRIORITY], [PREFERRED]) to bias model selection.
 - **Dynamic Tooling Strategy:** Automatically appends a Markdown "Tooling Strategy" section to system instructions based on the active preset and global bias profile.
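The weight-to-tag mapping behind "Semantic Nudging" can be sketched as follows. This is an illustrative reconstruction, not the actual `ToolBiasEngine` API: the function name is hypothetical, and the thresholds are assumed from the tag rendering visible in the GUI code later in this diff (`>= 5` high, `== 4` preferred, `== 2` discouraged, `<= 1` low).

```python
def nudge_description(description: str, weight: int) -> str:
    """Prefix a tool description with a priority tag derived from its weight.

    Hypothetical sketch; thresholds mirror the [HIGH]/[PREF]/[REJECT]/[LOW]
    bands used by the tool panel rendering in this diff.
    """
    if weight >= 5:
        tag = "[HIGH PRIORITY]"
    elif weight == 4:
        tag = "[PREFERRED]"
    elif weight == 2:
        tag = "[DISCOURAGED]"
    elif weight <= 1:
        tag = "[LOW PRIORITY]"
    else:
        tag = ""  # weight 3: neutral, description passes through unchanged
    return f"{tag} {description}" if tag else description
```

The nudged strings would then replace the plain descriptions in the tool schema sent to the model, which is the whole trick: no API-level steering, just biased wording.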

View File

@@ -33,6 +33,8 @@
 - **src/presets.py:** Implements `PresetManager` for high-performance CRUD operations on system prompt presets stored in TOML format (`presets.toml`, `project_presets.toml`). Supports dynamic path resolution and scope-based inheritance.
+- **src/personas.py:** Implements `PersonaManager` for high-performance CRUD operations on unified agent personas stored in TOML format (`personas.toml`, `project_personas.toml`). Handles consolidation of model settings, prompts, and tool biases.
 - **src/tool_bias.py:** Implements the `ToolBiasEngine` for semantic tool description nudging and dynamic tooling strategy generation.
 - **src/tool_presets.py:** Extends `ToolPresetManager` to handle nested `Tool` models, weights, and global `BiasProfile` persistence within `tool_presets.toml`.
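The scope-based inheritance both managers implement reduces to a name-keyed merge where project-scoped entries shadow global ones. A minimal sketch of that merge rule (illustrative function name; the real `PresetManager`/`PersonaManager` load these dicts from the TOML files listed above):

```python
def merge_scopes(global_items: dict, project_items: dict) -> dict:
    """Merge global and project-scoped entries by name.

    A project entry with the same name overrides its global counterpart,
    matching the "Scoped Inheritance" behavior described in the README.
    """
    merged = dict(global_items)   # start from the global scope
    merged.update(project_items)  # project scope wins on name collisions
    return merged
```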

View File

@@ -1,11 +1,11 @@
 [ai]
-provider = "gemini_cli"
-model = "gemini-2.5-flash-lite"
-temperature = 0.30000001192092896
+provider = "minimax"
+model = "MiniMax-M2.5"
+temperature = 0.0
 max_tokens = 32000
 history_trunc_limit = 900000
-active_preset = "TestGlobal"
-system_prompt = "Overridden Prompt"
+active_preset = "Default"
+system_prompt = ""
 [projects]
 paths = [
@@ -16,7 +16,7 @@ paths = [
 "C:\\projects\\manual_slop\\tests\\artifacts\\temp_liveexecutionsim.toml",
 "C:\\projects\\manual_slop\\tests\\artifacts\\temp_project.toml",
 ]
-active = "C:\\projects\\manual_slop\\tests\\artifacts\\live_gui_workspace\\manual_slop.toml"
+active = "C:/projects/gencpp/gencpp_sloppy.toml"
 [gui]
 separate_message_panel = false
@@ -24,11 +24,11 @@ separate_response_panel = false
 separate_tool_calls_panel = false
 bg_shader_enabled = true
 crt_filter_enabled = false
-separate_task_dag = true
-separate_usage_analytics = true
+separate_task_dag = false
+separate_usage_analytics = false
 separate_tier1 = false
 separate_tier2 = false
-separate_tier3 = true
+separate_tier3 = false
 separate_tier4 = false
 [gui.show_windows]
@@ -36,7 +36,7 @@ separate_tier4 = false
 "Files & Media" = true
 "AI Settings" = true
 "MMA Dashboard" = true
-"Task DAG" = true
+"Task DAG" = false
 "Usage Analytics" = false
 "Tier 1" = false
 "Tier 2" = false
@@ -44,7 +44,7 @@ separate_tier4 = false
 "Tier 4" = false
 "Tier 1: Strategy" = false
 "Tier 2: Tech Lead" = false
-"Tier 3: Workers" = true
+"Tier 3: Workers" = false
 "Tier 4: QA" = false
 "Discussion Hub" = true
 "Operations Hub" = true
@@ -53,15 +53,15 @@ Response = false
 "Tool Calls" = false
 Theme = true
 "Log Management" = true
-Diagnostics = true
+Diagnostics = false
 [theme]
 palette = "Nord Dark"
 font_path = "C:/projects/manual_slop/assets/fonts/Inter-Regular.ttf"
 font_size = 14.0
-scale = 1.0
-transparency = 0.5099999904632568
-child_transparency = 0.699999988079071
+scale = 1.0099999904632568
+transparency = 1.0
+child_transparency = 1.0
 [mma]
 max_workers = 4
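Incidentally, values like `temperature = 0.30000001192092896` and `scale = 1.0099999904632568` in this config are float32-to-float64 round-trip noise from the ImGui slider widgets. A hedged sketch of one possible mitigation, assuming the app wanted to quantize before persisting (`quantize` is hypothetical, not part of this codebase):

```python
def quantize(value: float, decimals: int = 4) -> float:
    """Round a widget-sourced float before writing it to TOML.

    Illustrative only: ImGui sliders hand back float32 values, which pick up
    trailing noise when widened to Python's float64 and serialized verbatim.
    Rounding to a few decimals keeps the config file stable and readable.
    """
    return round(value, decimals)
```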

Binary file not shown. (After: 607 KiB)

View File

@@ -73,8 +73,8 @@ Collapsed=0
 DockId=0xAFC85805,2
 [Window][Theme]
-Pos=0,1016
-Size=623,401
+Pos=0,1670
+Size=840,467
 Collapsed=0
 DockId=0x00000002,2
@@ -84,14 +84,14 @@ Size=900,700
 Collapsed=0
 [Window][Diagnostics]
-Pos=2833,28
-Size=1007,2109
+Pos=2641,22
+Size=1199,2115
 Collapsed=0
 DockId=0x0000000C,2
 [Window][Context Hub]
-Pos=0,1016
-Size=623,401
+Pos=0,1670
+Size=840,467
 Collapsed=0
 DockId=0x00000002,1
@@ -102,26 +102,26 @@ Collapsed=0
 DockId=0x0000000D,0
 [Window][Discussion Hub]
-Pos=1296,22
-Size=659,1395
+Pos=1668,22
+Size=971,2115
 Collapsed=0
 DockId=0x00000013,0
 [Window][Operations Hub]
-Pos=625,22
-Size=669,1395
+Pos=842,22
+Size=824,2115
 Collapsed=0
 DockId=0x00000012,0
 [Window][Files & Media]
-Pos=0,1016
-Size=623,401
+Pos=0,1670
+Size=840,467
 Collapsed=0
 DockId=0x00000002,0
 [Window][AI Settings]
 Pos=0,22
-Size=623,992
+Size=840,1646
 Collapsed=0
 DockId=0x00000001,0
@@ -131,14 +131,14 @@ Size=416,325
 Collapsed=0
 [Window][MMA Dashboard]
-Pos=1957,22
-Size=603,1395
+Pos=2641,22
+Size=1199,2115
 Collapsed=0
 DockId=0x0000000C,0
 [Window][Log Management]
-Pos=1957,22
-Size=603,1395
+Pos=2641,22
+Size=1199,2115
 Collapsed=0
 DockId=0x0000000C,1
@@ -166,8 +166,8 @@ Collapsed=0
 DockId=0x0000000F,0
 [Window][Tier 3: Workers]
-Pos=2905,1238
-Size=935,899
+Pos=2641,1235
+Size=1199,902
 Collapsed=0
 DockId=0x0000000F,0
@@ -322,12 +322,12 @@ Size=420,966
 Collapsed=0
 [Window][Preset Manager]
-Pos=403,396
-Size=956,958
+Pos=1693,826
+Size=933,839
 Collapsed=0
 [Window][Task DAG]
-Pos=1700,1199
+Pos=1661,1181
 Size=1079,662
 Collapsed=0
@@ -337,13 +337,13 @@ Size=275,375
 Collapsed=0
 [Window][Tool Preset Manager]
-Pos=192,90
-Size=1066,1324
+Pos=1014,522
+Size=1155,1011
 Collapsed=0
 [Window][Persona Editor]
-Pos=956,447
-Size=549,447
+Pos=995,597
+Size=1868,1434
 Collapsed=0
 [Table][0xFB6E3870,4]
@@ -377,11 +377,11 @@ Column 3 Width=20
 Column 4 Weight=1.0000
 [Table][0x2A6000B6,4]
-RefScale=17
-Column 0 Width=51
-Column 1 Width=76
+RefScale=14
+Column 0 Width=42
+Column 1 Width=62
 Column 2 Weight=1.0000
-Column 3 Width=128
+Column 3 Width=105
 [Table][0x8BCC69C7,6]
 RefScale=13
@@ -393,18 +393,18 @@ Column 4 Weight=1.0000
 Column 5 Width=50
 [Table][0x3751446B,4]
-RefScale=20
-Column 0 Width=60
-Column 1 Width=91
+RefScale=17
+Column 0 Width=51
+Column 1 Width=77
 Column 2 Weight=1.0000
-Column 3 Width=151
+Column 3 Width=128
 [Table][0x2C515046,4]
-RefScale=20
-Column 0 Width=63
+RefScale=14
+Column 0 Width=43
 Column 1 Weight=1.0000
-Column 2 Width=152
-Column 3 Width=60
+Column 2 Width=106
+Column 3 Width=42
 [Table][0xD99F45C5,4]
 Column 0 Sort=0v
@@ -425,28 +425,28 @@ Column 1 Width=100
 Column 2 Weight=1.0000
 [Table][0xA02D8C87,3]
-RefScale=20
-Column 0 Width=227
-Column 1 Width=150
+RefScale=14
+Column 0 Width=158
+Column 1 Width=105
 Column 2 Weight=1.0000
 [Docking][Data]
 DockNode ID=0x00000008 Pos=3125,170 Size=593,1157 Split=Y
 DockNode ID=0x00000009 Parent=0x00000008 SizeRef=1029,147 Selected=0x0469CA7A
 DockNode ID=0x0000000A Parent=0x00000008 SizeRef=1029,145 Selected=0xDF822E02
-DockSpace ID=0xAFC85805 Window=0x079D3A04 Pos=0,22 Size=2560,1395 Split=X
-DockNode ID=0x00000003 Parent=0xAFC85805 SizeRef=1955,1183 Split=X
+DockSpace ID=0xAFC85805 Window=0x079D3A04 Pos=0,22 Size=3840,2115 Split=X
+DockNode ID=0x00000003 Parent=0xAFC85805 SizeRef=2639,1183 Split=X
 DockNode ID=0x0000000B Parent=0x00000003 SizeRef=404,1186 Split=X Selected=0xF4139CA2
-DockNode ID=0x00000007 Parent=0x0000000B SizeRef=623,858 Split=Y Selected=0x8CA2375C
-DockNode ID=0x00000001 Parent=0x00000007 SizeRef=824,989 CentralNode=1 Selected=0x7BD57D6A
-DockNode ID=0x00000002 Parent=0x00000007 SizeRef=824,401 Selected=0x1DCB2623
-DockNode ID=0x0000000E Parent=0x0000000B SizeRef=1330,858 Split=X Selected=0x418C7449
-DockNode ID=0x00000012 Parent=0x0000000E SizeRef=669,402 Selected=0x418C7449
-DockNode ID=0x00000013 Parent=0x0000000E SizeRef=659,402 Selected=0x6F2B5B04
+DockNode ID=0x00000007 Parent=0x0000000B SizeRef=840,858 Split=Y Selected=0x8CA2375C
+DockNode ID=0x00000001 Parent=0x00000007 SizeRef=824,1646 CentralNode=1 Selected=0x7BD57D6A
+DockNode ID=0x00000002 Parent=0x00000007 SizeRef=824,467 Selected=0x1DCB2623
+DockNode ID=0x0000000E Parent=0x0000000B SizeRef=1797,858 Split=X Selected=0x418C7449
+DockNode ID=0x00000012 Parent=0x0000000E SizeRef=824,402 Selected=0x418C7449
+DockNode ID=0x00000013 Parent=0x0000000E SizeRef=971,402 Selected=0x6F2B5B04
 DockNode ID=0x0000000D Parent=0x00000003 SizeRef=435,1186 Selected=0x363E93D6
-DockNode ID=0x00000004 Parent=0xAFC85805 SizeRef=603,1183 Split=Y Selected=0x3AEC3498
-DockNode ID=0x0000000C Parent=0x00000004 SizeRef=1074,1208 Selected=0x2C0206CE
-DockNode ID=0x0000000F Parent=0x00000004 SizeRef=1074,899 Selected=0x5CDB7A4B
+DockNode ID=0x00000004 Parent=0xAFC85805 SizeRef=1199,1183 Split=Y Selected=0x3AEC3498
+DockNode ID=0x0000000C Parent=0x00000004 SizeRef=1074,1208 Selected=0x3AEC3498
+DockNode ID=0x0000000F Parent=0x00000004 SizeRef=1074,899 Selected=0x655BC6E9
 ;;;<<<Layout_655921752_Default>>>;;;
 ;;;<<<HelloImGui_Misc>>>;;;

View File

@@ -1,10 +1,19 @@
 [personas.Default]
 system_prompt = ""
-provider = "minimax"
+tool_preset = "Default"
+bias_profile = "Balanced"
+[[personas.Default.preferred_models]]
 model = "MiniMax-M2.5"
-preferred_models = [
-    "MiniMax-M2.5",
-]
+provider = "minimax"
 temperature = 0.0
 top_p = 1.0
 max_output_tokens = 32000
+history_trunc_limit = 900000
+[[personas.Default.preferred_models]]
+provider = "gemini_cli"
+model = "gemini-3-flash-preview"
+temperature = -1.4901161193847656e-08
+max_output_tokens = 32000
+history_trunc_limit = 900000
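The new `[[personas.Default.preferred_models]]` array-of-tables gives each persona an ordered fallback list of provider/model entries, each carrying its own sampling parameters. A minimal sketch of how such a list could be resolved, assuming first-available-provider-wins semantics (`resolve_model` is illustrative, not the actual resolution logic in the codebase):

```python
def resolve_model(preferred_models: list, available_providers: set):
    """Return the first preferred_models entry whose provider is available.

    Sketch of ordered-fallback resolution over the persona's preferred_models
    array of tables; returns None if no listed provider is usable.
    """
    for entry in preferred_models:
        if entry.get("provider") in available_providers:
            return entry
    return None
```

Because each entry bundles its own `temperature`/`max_output_tokens`, switching providers via fallback would also switch sampling settings atomically, which matches the "single configuration entity" idea behind Personas.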

View File

@@ -34,13 +34,8 @@ def migrate():
         preset = models.Preset.from_dict(name, data)
         persona = models.Persona(
             name=name,
-            provider=provider,
-            model=model,
-            preferred_models=[model] if model else [],
-            system_prompt=preset.system_prompt,
-            temperature=preset.temperature,
-            top_p=preset.top_p,
-            max_output_tokens=preset.max_output_tokens
+            preferred_models=[{"provider": provider, "model": model}],
+            system_prompt=preset.system_prompt
         )
         persona_manager.save_persona(persona, scope="global")
         print(f"Migrated global preset to persona: {name}")
@@ -50,12 +45,13 @@ def migrate():
     if active_preset and active_preset not in persona_manager.load_all():
         persona = models.Persona(
             name=active_preset,
-            provider=provider,
-            model=model,
-            preferred_models=[model] if model else [],
-            system_prompt=ai_cfg.get("system_prompt", ""),
-            temperature=ai_cfg.get("temperature"),
-            max_output_tokens=ai_cfg.get("max_tokens")
+            preferred_models=[{
+                "provider": provider,
+                "model": model,
+                "temperature": ai_cfg.get("temperature"),
+                "max_output_tokens": ai_cfg.get("max_tokens")
+            }],
+            system_prompt=ai_cfg.get("system_prompt", "")
         )
         persona_manager.save_persona(persona, scope="global")
         print(f"Created Initial Legacy persona from active_preset: {active_preset}")

View File

@@ -0,0 +1,252 @@
import sys
with open("src/gui_2.py", "r", encoding="utf-8") as f:
content = f.read()
# 1. In _render_provider_panel, remove Fetch Models
old_fetch = """ imgui.text("Model")
imgui.same_line()
if imgui.button("Fetch Models"):
self._fetch_models(self.current_provider)
if imgui.begin_list_box("##models", imgui.ImVec2(-1, 120)):"""
new_fetch = """ imgui.text("Model")
if imgui.begin_list_box("##models", imgui.ImVec2(-1, 120)):"""
content = content.replace(old_fetch, new_fetch)
# 2. Extract Persona block
# We need to find the start of 'imgui.text("Persona")' and end of 'self._editing_persona_is_new = True'
# Let's be very careful.
old_persona_block = """ imgui.text("Persona")
if not hasattr(self, 'ui_active_persona'):
self.ui_active_persona = ""
personas = getattr(self.controller, 'personas', {})
if imgui.begin_combo("##persona", self.ui_active_persona or "None"):
if imgui.selectable("None", not self.ui_active_persona)[0]:
self.ui_active_persona = ""
for pname in sorted(personas.keys()):
if imgui.selectable(pname, pname == self.ui_active_persona)[0]:
self.ui_active_persona = pname
if pname in personas:
persona = personas[pname]
self._editing_persona_name = persona.name
self._editing_persona_provider = persona.provider or ""
self._editing_persona_model = persona.model or ""
self._editing_persona_system_prompt = persona.system_prompt or ""
self._editing_persona_temperature = persona.temperature or 0.7
self._editing_persona_max_tokens = persona.max_output_tokens or 4096
self._editing_persona_tool_preset_id = persona.tool_preset or ""
self._editing_persona_bias_profile_id = persona.bias_profile or ""
import json
self._editing_persona_preferred_models = json.dumps(persona.preferred_models) if persona.preferred_models else "[]"
self._editing_persona_is_new = False
if persona.provider and persona.provider in self.controller.PROVIDERS:
self.current_provider = persona.provider
if persona.model:
self.current_model = persona.model
if persona.temperature is not None:
ai_client.temperature = persona.temperature
if persona.max_output_tokens:
ai_client.max_output_tokens = persona.max_output_tokens
if persona.system_prompt:
ai_client.system_instruction = persona.system_prompt
if persona.tool_preset:
self.ui_active_tool_preset = persona.tool_preset
ai_client.set_tool_preset(persona.tool_preset)
if persona.bias_profile:
self.ui_active_bias_profile = persona.bias_profile
ai_client.set_bias_profile(persona.bias_profile)
imgui.end_combo()
imgui.same_line()
if imgui.button("Manage Personas"):
self.show_persona_editor_window = True
if self.ui_active_persona and self.ui_active_persona in personas:
persona = personas[self.ui_active_persona]
self._editing_persona_name = persona.name
self._editing_persona_provider = persona.provider or ""
self._editing_persona_model = persona.model or ""
self._editing_persona_system_prompt = persona.system_prompt or ""
self._editing_persona_temperature = persona.temperature if persona.temperature is not None else 0.7
self._editing_persona_max_tokens = persona.max_output_tokens if persona.max_output_tokens is not None else 4096
self._editing_persona_tool_preset_id = persona.tool_preset or ""
self._editing_persona_bias_profile_id = persona.bias_profile or ""
self._editing_persona_preferred_models_list = list(persona.preferred_models) if persona.preferred_models else []
self._editing_persona_scope = self.controller.persona_manager.get_persona_scope(persona.name)
self._editing_persona_is_new = False
else:
self._editing_persona_name = ""
self._editing_persona_provider = self.current_provider
self._editing_persona_model = self.current_model
self._editing_persona_system_prompt = ""
self._editing_persona_temperature = 0.7
self._editing_persona_max_tokens = 4096
self._editing_persona_tool_preset_id = ""
self._editing_persona_bias_profile_id = ""
self._editing_persona_preferred_models_list = []
self._editing_persona_scope = "project"
self._editing_persona_is_new = True"""
# We need to extract the bias profile block as well
old_bias_block = """ imgui.text("Bias Profile")
if imgui.begin_combo("##bias", self.ui_active_bias_profile or "None"):
if imgui.selectable("None", not self.ui_active_bias_profile)[0]:
self.ui_active_bias_profile = ""
ai_client.set_bias_profile(None)
for bname in sorted(self.bias_profiles.keys()):
if imgui.selectable(bname, bname == self.ui_active_bias_profile)[0]:
self.ui_active_bias_profile = bname
ai_client.set_bias_profile(bname)
imgui.end_combo()"""
# Remove them from their original spots
content = content.replace(old_bias_block, "")
content = content.replace(old_persona_block, "")
# Insert Persona block at the top of _render_provider_panel
old_provider_start = """ def _render_provider_panel(self) -> None:
if self.perf_profiling_enabled: self.perf_monitor.start_component("_render_provider_panel")
imgui.text("Provider")"""
new_provider_start = f""" def _render_provider_panel(self) -> None:
if self.perf_profiling_enabled: self.perf_monitor.start_component("_render_provider_panel")
{old_persona_block}
imgui.separator()
imgui.text("Provider")"""
content = content.replace(old_provider_start, new_provider_start)
# Update _render_agent_tools_panel
old_agent_tools_start = """ def _render_agent_tools_panel(self) -> None:
imgui.text_colored(C_LBL, 'Active Tool Preset')"""
new_agent_tools_start = f""" def _render_agent_tools_panel(self) -> None:
if imgui.collapsing_header("Active Tool Presets & Biases", imgui.TreeNodeFlags_.default_open):
imgui.text("Tool Preset")"""
content = content.replace(old_agent_tools_start, new_agent_tools_start)
# Wait, if I do collapsing header, I need to indent the rest of the function.
# Instead of indenting the whole function, I can just use the header.
# But wait, ImGui collapsing_header doesn't require indenting, it just returns true if open.
# So I should write it properly:
old_agent_tools_func = """ def _render_agent_tools_panel(self) -> None:
imgui.text_colored(C_LBL, 'Active Tool Preset')
presets = self.controller.tool_presets
preset_names = [""] + sorted(list(presets.keys()))
# Gracefully handle None or missing preset
active = getattr(self, "ui_active_tool_preset", "")
if active is None: active = ""
try:
idx = preset_names.index(active)
except ValueError:
idx = 0
ch, new_idx = imgui.combo("##tool_preset_select", idx, preset_names)
if ch:
self.ui_active_tool_preset = preset_names[new_idx]
imgui.same_line()
if imgui.button("Manage Presets##tools"):
self.show_tool_preset_manager_window = True
if imgui.is_item_hovered():
imgui.set_tooltip("Configure tool availability and default modes.")
imgui.dummy(imgui.ImVec2(0, 8))
active_name = self.ui_active_tool_preset
if active_name and active_name in presets:
preset = presets[active_name]
for cat_name, tools in preset.categories.items():
if imgui.tree_node(cat_name):
for tool in tools:
if tool.weight >= 5:
imgui.text_colored(vec4(255, 100, 100), "[HIGH]")
imgui.same_line()
elif tool.weight == 4:
imgui.text_colored(vec4(255, 255, 100), "[PREF]")
imgui.same_line()
elif tool.weight == 2:
imgui.text_colored(vec4(255, 150, 50), "[REJECT]")
imgui.same_line()
elif tool.weight <= 1:
imgui.text_colored(vec4(180, 180, 180), "[LOW]")
imgui.same_line()
imgui.text(tool.name)
imgui.same_line(180)
mode = tool.approval
if imgui.radio_button(f"Auto##{cat_name}_{tool.name}", mode == "auto"):
tool.approval = "auto"
imgui.same_line()
if imgui.radio_button(f"Ask##{cat_name}_{tool.name}", mode == "ask"):
tool.approval = "ask"
imgui.tree_pop()"""
new_agent_tools_func = """ def _render_agent_tools_panel(self) -> None:
if imgui.collapsing_header("Active Tool Presets & Biases", imgui.TreeNodeFlags_.default_open):
imgui.text("Tool Preset")
presets = self.controller.tool_presets
preset_names = [""] + sorted(list(presets.keys()))
# Gracefully handle None or missing preset
active = getattr(self, "ui_active_tool_preset", "")
if active is None: active = ""
try:
idx = preset_names.index(active)
except ValueError:
idx = 0
ch, new_idx = imgui.combo("##tool_preset_select", idx, preset_names)
if ch:
self.ui_active_tool_preset = preset_names[new_idx]
imgui.same_line()
if imgui.button("Manage Tools##tools"):
self.show_tool_preset_manager_window = True
if imgui.is_item_hovered():
imgui.set_tooltip("Configure tool availability and default modes.")
imgui.dummy(imgui.ImVec2(0, 4))
""" + "\n ".join(old_bias_block.split("\n")) + """
imgui.dummy(imgui.ImVec2(0, 8))
active_name = self.ui_active_tool_preset
if active_name and active_name in presets:
preset = presets[active_name]
for cat_name, tools in preset.categories.items():
if imgui.tree_node(cat_name):
for tool in tools:
if tool.weight >= 5:
imgui.text_colored(vec4(255, 100, 100), "[HIGH]")
imgui.same_line()
elif tool.weight == 4:
imgui.text_colored(vec4(255, 255, 100), "[PREF]")
imgui.same_line()
elif tool.weight == 2:
imgui.text_colored(vec4(255, 150, 50), "[REJECT]")
imgui.same_line()
elif tool.weight <= 1:
imgui.text_colored(vec4(180, 180, 180), "[LOW]")
imgui.same_line()
imgui.text(tool.name)
imgui.same_line(180)
mode = tool.approval
if imgui.radio_button(f"Auto##{cat_name}_{tool.name}", mode == "auto"):
tool.approval = "auto"
imgui.same_line()
if imgui.radio_button(f"Ask##{cat_name}_{tool.name}", mode == "ask"):
tool.approval = "ask"
imgui.tree_pop()"""
content = content.replace(old_agent_tools_func, new_agent_tools_func)
# Fix cache text display in Usage Analytics
content = content.replace('self._gemini_cache_text = f"Gemini Caches: {count} ({size_bytes / 1024:.1f} KB)"', 'self._gemini_cache_text = f"Cache Usage: {count} ({size_bytes / 1024:.1f} KB)"')
content = content.replace('imgui.text_colored(C_LBL, f"Gemini Cache: ACTIVE | Age: {age:.0f}s / {ttl}s | Renews at: {ttl * 0.9:.0f}s")', 'imgui.text_colored(C_LBL, f"Cache Usage: ACTIVE | Age: {age:.0f}s / {ttl}s | Renews at: {ttl * 0.9:.0f}s")')
content = content.replace('imgui.text_disabled("Gemini Cache: INACTIVE")', 'imgui.text_disabled("Cache Usage: INACTIVE")')
# Also, user requested: "The persona should problably just mess with the project system prompt for now."
# Currently in persona selection: `ai_client.system_instruction = persona.system_prompt`
# Let's change that to `self.ui_project_system_prompt = persona.system_prompt` and remove ai_client direct injection
content = content.replace('ai_client.system_instruction = persona.system_prompt', 'self.ui_project_system_prompt = persona.system_prompt')
with open("src/gui_2.py", "w", encoding="utf-8") as f:
f.write(content)
print("done")

View File

@@ -0,0 +1,228 @@
import sys
with open("src/gui_2.py", "r", encoding="utf-8") as f:
content = f.read()
# 1. Update _gui_func:
# Extract Persona out of Provider panel. I will create a new method _render_persona_selector_panel
old_gui_settings = """ if self.show_windows.get("AI Settings", False):
exp, opened = imgui.begin("AI Settings", self.show_windows["AI Settings"])
self.show_windows["AI Settings"] = bool(opened)
if exp:
if imgui.collapsing_header("Provider & Model"):
self._render_provider_panel()
if imgui.collapsing_header("System Prompts"):
self._render_system_prompts_panel()
self._render_agent_tools_panel()
self._render_cache_panel()
imgui.end()
if self.ui_separate_usage_analytics and self.show_windows.get("Usage Analytics", False):
exp, opened = imgui.begin("Usage Analytics", self.show_windows["Usage Analytics"])
self.show_windows["Usage Analytics"] = bool(opened)
if exp:
self._render_usage_analytics_panel()
imgui.end()"""
new_gui_settings = """ if self.show_windows.get("AI Settings", False):
exp, opened = imgui.begin("AI Settings", self.show_windows["AI Settings"])
self.show_windows["AI Settings"] = bool(opened)
if exp:
self._render_persona_selector_panel()
if imgui.collapsing_header("Provider & Model"):
self._render_provider_panel()
if imgui.collapsing_header("System Prompts"):
self._render_system_prompts_panel()
self._render_agent_tools_panel()
imgui.end()
if self.ui_separate_usage_analytics and self.show_windows.get("Usage Analytics", False):
exp, opened = imgui.begin("Usage Analytics", self.show_windows["Usage Analytics"])
self.show_windows["Usage Analytics"] = bool(opened)
if exp:
self._render_usage_analytics_panel()
imgui.end()"""
content = content.replace(old_gui_settings, new_gui_settings)
# Update _render_usage_analytics_panel
old_usage = """ def _render_usage_analytics_panel(self) -> None:
if self.perf_profiling_enabled: self.perf_monitor.start_component("_render_usage_analytics_panel")
self._render_token_budget_panel()
imgui.separator()
self._render_tool_analytics_panel()
imgui.separator()
self._render_session_insights_panel()
if self.perf_profiling_enabled: self.perf_monitor.end_component("_render_usage_analytics_panel")"""
new_usage = """ def _render_usage_analytics_panel(self) -> None:
if self.perf_profiling_enabled: self.perf_monitor.start_component("_render_usage_analytics_panel")
self._render_token_budget_panel()
imgui.separator()
self._render_cache_panel()
imgui.separator()
self._render_tool_analytics_panel()
imgui.separator()
self._render_session_insights_panel()
if self.perf_profiling_enabled: self.perf_monitor.end_component("_render_usage_analytics_panel")"""
content = content.replace(old_usage, new_usage)
# Remove the persona block from _render_provider_panel and put it in _render_persona_selector_panel
old_persona_block = """ def _render_provider_panel(self) -> None:
if self.perf_profiling_enabled: self.perf_monitor.start_component("_render_provider_panel")
imgui.text("Persona")
if not hasattr(self, 'ui_active_persona'):
self.ui_active_persona = ""
personas = getattr(self.controller, 'personas', {})
if imgui.begin_combo("##persona", self.ui_active_persona or "None"):
if imgui.selectable("None", not self.ui_active_persona)[0]:
self.ui_active_persona = ""
for pname in sorted(personas.keys()):
if imgui.selectable(pname, pname == self.ui_active_persona)[0]:
self.ui_active_persona = pname
if pname in personas:
persona = personas[pname]
self._editing_persona_name = persona.name
self._editing_persona_provider = persona.provider or ""
self._editing_persona_model = persona.model or ""
self._editing_persona_system_prompt = persona.system_prompt or ""
self._editing_persona_temperature = persona.temperature or 0.7
self._editing_persona_max_tokens = persona.max_output_tokens or 4096
self._editing_persona_tool_preset_id = persona.tool_preset or ""
self._editing_persona_bias_profile_id = persona.bias_profile or ""
import json
self._editing_persona_preferred_models = json.dumps(persona.preferred_models) if persona.preferred_models else "[]"
self._editing_persona_is_new = False
if persona.provider and persona.provider in self.controller.PROVIDERS:
self.current_provider = persona.provider
if persona.model:
self.current_model = persona.model
if persona.temperature is not None:
ai_client.temperature = persona.temperature
if persona.max_output_tokens:
ai_client.max_output_tokens = persona.max_output_tokens
if persona.system_prompt:
self.ui_project_system_prompt = persona.system_prompt
if persona.tool_preset:
self.ui_active_tool_preset = persona.tool_preset
ai_client.set_tool_preset(persona.tool_preset)
if persona.bias_profile:
self.ui_active_bias_profile = persona.bias_profile
ai_client.set_bias_profile(persona.bias_profile)
imgui.end_combo()
imgui.same_line()
if imgui.button("Manage Personas"):
self.show_persona_editor_window = True
if self.ui_active_persona and self.ui_active_persona in personas:
persona = personas[self.ui_active_persona]
self._editing_persona_name = persona.name
self._editing_persona_provider = persona.provider or ""
self._editing_persona_model = persona.model or ""
self._editing_persona_system_prompt = persona.system_prompt or ""
self._editing_persona_temperature = persona.temperature if persona.temperature is not None else 0.7
self._editing_persona_max_tokens = persona.max_output_tokens if persona.max_output_tokens is not None else 4096
self._editing_persona_tool_preset_id = persona.tool_preset or ""
self._editing_persona_bias_profile_id = persona.bias_profile or ""
self._editing_persona_preferred_models_list = list(persona.preferred_models) if persona.preferred_models else []
self._editing_persona_scope = self.controller.persona_manager.get_persona_scope(persona.name)
self._editing_persona_is_new = False
else:
self._editing_persona_name = ""
self._editing_persona_provider = self.current_provider
self._editing_persona_model = self.current_model
self._editing_persona_system_prompt = ""
self._editing_persona_temperature = 0.7
self._editing_persona_max_tokens = 4096
self._editing_persona_tool_preset_id = ""
self._editing_persona_bias_profile_id = ""
self._editing_persona_preferred_models_list = []
self._editing_persona_scope = "project"
self._editing_persona_is_new = True
imgui.separator()
imgui.text("Provider")"""
new_persona_block = """ def _render_persona_selector_panel(self) -> None:
if self.perf_profiling_enabled: self.perf_monitor.start_component("_render_persona_selector_panel")
imgui.text("Persona")
if not hasattr(self, 'ui_active_persona'):
self.ui_active_persona = ""
personas = getattr(self.controller, 'personas', {})
if imgui.begin_combo("##persona", self.ui_active_persona or "None"):
if imgui.selectable("None", not self.ui_active_persona)[0]:
self.ui_active_persona = ""
for pname in sorted(personas.keys()):
if imgui.selectable(pname, pname == self.ui_active_persona)[0]:
self.ui_active_persona = pname
if pname in personas:
persona = personas[pname]
self._editing_persona_name = persona.name
self._editing_persona_system_prompt = persona.system_prompt or ""
self._editing_persona_tool_preset_id = persona.tool_preset or ""
self._editing_persona_bias_profile_id = persona.bias_profile or ""
import copy
self._editing_persona_preferred_models_list = copy.deepcopy(persona.preferred_models) if persona.preferred_models else []
self._editing_persona_is_new = False
# Apply persona to current state immediately
if persona.preferred_models and len(persona.preferred_models) > 0:
first_model = persona.preferred_models[0]
if first_model.get("provider"):
self.current_provider = first_model.get("provider")
if first_model.get("model"):
self.current_model = first_model.get("model")
if first_model.get("temperature") is not None:
ai_client.temperature = first_model.get("temperature")
self.temperature = first_model.get("temperature")
if first_model.get("max_output_tokens"):
ai_client.max_output_tokens = first_model.get("max_output_tokens")
self.max_tokens = first_model.get("max_output_tokens")
if first_model.get("history_trunc_limit"):
self.history_trunc_limit = first_model.get("history_trunc_limit")
if persona.system_prompt:
self.ui_project_system_prompt = persona.system_prompt
if persona.tool_preset:
self.ui_active_tool_preset = persona.tool_preset
ai_client.set_tool_preset(persona.tool_preset)
if persona.bias_profile:
self.ui_active_bias_profile = persona.bias_profile
ai_client.set_bias_profile(persona.bias_profile)
imgui.end_combo()
imgui.same_line()
if imgui.button("Manage Personas"):
self.show_persona_editor_window = True
if self.ui_active_persona and self.ui_active_persona in personas:
persona = personas[self.ui_active_persona]
self._editing_persona_name = persona.name
self._editing_persona_system_prompt = persona.system_prompt or ""
self._editing_persona_tool_preset_id = persona.tool_preset or ""
self._editing_persona_bias_profile_id = persona.bias_profile or ""
import copy
self._editing_persona_preferred_models_list = copy.deepcopy(persona.preferred_models) if persona.preferred_models else []
self._editing_persona_scope = self.controller.persona_manager.get_persona_scope(persona.name)
self._editing_persona_is_new = False
else:
self._editing_persona_name = ""
self._editing_persona_system_prompt = ""
self._editing_persona_tool_preset_id = ""
self._editing_persona_bias_profile_id = ""
self._editing_persona_preferred_models_list = [{
"provider": self.current_provider,
"model": self.current_model,
"temperature": getattr(self, "temperature", 0.7),
"max_output_tokens": getattr(self, "max_tokens", 4096),
"history_trunc_limit": getattr(self, "history_trunc_limit", 900000)
}]
self._editing_persona_scope = "project"
self._editing_persona_is_new = True
imgui.separator()
if self.perf_profiling_enabled: self.perf_monitor.end_component("_render_persona_selector_panel")
def _render_provider_panel(self) -> None:
if self.perf_profiling_enabled: self.perf_monitor.start_component("_render_provider_panel")
imgui.text("Provider")"""
content = content.replace(old_persona_block, new_persona_block)
with open("src/gui_2.py", "w", encoding="utf-8") as f:
f.write(content)
print("done gui updates")
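The codemod above rewrites `src/gui_2.py` with bare `content.replace(old, new)` calls, which silently do nothing when a needle no longer matches the file. A minimal sketch of a checked replacement helper (the name `replace_block` is hypothetical, not part of this repo):

```python
def replace_block(content: str, old: str, new: str, label: str = "") -> str:
    """Replace old with new, failing loudly if the needle is absent or ambiguous."""
    count = content.count(old)
    if count == 0:
        raise ValueError(f"codemod needle not found: {label or old[:60]!r}")
    if count > 1:
        raise ValueError(f"codemod needle matched {count} times: {label or old[:60]!r}")
    return content.replace(old, new)
```

Using this in place of the raw `str.replace` calls would surface drifted needles at codemod time instead of leaving the target file half-migrated.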

@@ -363,3 +363,4 @@ def main() -> None:
 if __name__ == "__main__":
     main()
+

@@ -2400,3 +2400,4 @@ def get_history_bleed_stats(md_content: Optional[str] = None) -> dict[str, Any]:
         "current": 0,
         "percentage": 0,
     })
+

@@ -223,3 +223,4 @@ class ApiHookClient:
     def get_patch_status(self) -> dict[str, Any]:
         """Gets the current patch modal status."""
         return self._make_request('GET', '/api/patch/status') or {}
+

@@ -523,3 +523,4 @@ class HookServer:
         if self.thread:
             self.thread.join()
         logging.info("Hook server stopped")
+

@@ -300,7 +300,9 @@ class AppController:
         self._inject_mode: str = "skeleton"
         self._inject_preview: str = ""
         self._show_inject_modal: bool = False
-        self.show_preset_manager_modal: bool = False
+        self.show_preset_manager_window: bool = False
+        self.show_tool_preset_manager_window: bool = False
+        self.show_persona_editor_window: bool = False
         self._editing_preset_name: str = ""
         self._editing_preset_content: str = ""
         self._editing_preset_temperature: float = 0.0
@@ -342,7 +344,9 @@ class AppController:
            'ui_active_tool_preset': 'ui_active_tool_preset',
            'temperature': 'temperature',
            'max_tokens': 'max_tokens',
-           'show_preset_manager_modal': 'show_preset_manager_modal',
+           'show_preset_manager_window': 'show_preset_manager_window',
+           'show_tool_preset_manager_window': 'show_tool_preset_manager_window',
+           'show_persona_editor_window': 'show_persona_editor_window',
            '_editing_preset_name': '_editing_preset_name',
            '_editing_preset_content': '_editing_preset_content',
            '_editing_preset_temperature': '_editing_preset_temperature',
@@ -390,7 +394,9 @@ class AppController:
            'ui_active_tool_preset': 'ui_active_tool_preset',
            'temperature': 'temperature',
            'max_tokens': 'max_tokens',
-           'show_preset_manager_modal': 'show_preset_manager_modal',
+           'show_preset_manager_window': 'show_preset_manager_window',
+           'show_tool_preset_manager_window': 'show_tool_preset_manager_window',
+           'show_persona_editor_window': 'show_persona_editor_window',
            '_editing_preset_name': '_editing_preset_name',
            '_editing_preset_content': '_editing_preset_content',
            '_editing_preset_temperature': '_editing_preset_temperature',
@@ -877,6 +883,8 @@ class AppController:
         self.persona_manager = PersonaManager(Path(self.active_project_path).parent if self.active_project_path else None)
         self.personas = self.persona_manager.load_all()
+        self._fetch_models(self.current_provider)
         self.ui_active_tool_preset = os.environ.get('SLOP_TOOL_PRESET') or ai_cfg.get("active_tool_preset")
         self.ui_active_bias_profile = ai_cfg.get("active_bias_profile")
         ai_client.set_tool_preset(self.ui_active_tool_preset)
@@ -1490,6 +1498,9 @@ class AppController:
         self._current_provider = value
         ai_client.reset_session()
         ai_client.set_provider(value, self.current_model)
+        self.available_models = self.all_available_models.get(value, [])
+        if not self.available_models:
+            self._fetch_models(value)
         self._token_stats = {}
         self._token_stats_dirty = True
@@ -1859,20 +1870,13 @@ class AppController:
         else:
             self.ui_project_system_prompt = preset.system_prompt
         self.ui_project_preset_name = name
-        if preset.temperature is not None:
-            self.temperature = preset.temperature
-        if preset.max_output_tokens is not None:
-            self.max_tokens = preset.max_output_tokens
-    def _cb_save_preset(self, name, content, temp, top_p, max_tok, scope):
+    def _cb_save_preset(self, name, content, scope):
         if not name or not name.strip():
            raise ValueError("Preset name cannot be empty or whitespace.")
        preset = models.Preset(
            name=name,
-           system_prompt=content,
-           temperature=temp,
-           top_p=top_p,
-           max_output_tokens=max_tok
+           system_prompt=content
        )
        self.preset_manager.save_preset(preset, scope)
        self.presets = self.preset_manager.load_all()
@@ -1900,11 +1904,12 @@ class AppController:
     def _cb_save_persona(self, persona: models.Persona, scope: str = "project") -> None:
         self.persona_manager.save_persona(persona, scope)
-        self.personas = self.persona_manager.load_all_personas()
-    def _cb_delete_persona(self, persona_id: str, scope: str = "project") -> None:
-        self.persona_manager.delete_persona(persona_id, scope)
-        self.personas = self.persona_manager.load_all_personas()
+        self.personas = self.persona_manager.load_all()
+    def _cb_delete_persona(self, name: str, scope: str = "project") -> None:
+        self.persona_manager.delete_persona(name, scope)
+        self.personas = self.persona_manager.load_all()
     def _cb_load_track(self, track_id: str) -> None:
         state = project_manager.load_track_state(track_id, self.ui_files_base_dir)
@@ -2573,3 +2578,4 @@ class AppController:
             tasks=self.active_track.tickets
         )
         project_manager.save_track_state(self.active_track.id, state, self.ui_files_base_dir)
+

@@ -63,3 +63,4 @@ def get_bg():
     if _bg is None:
         _bg = BackgroundShader()
     return _bg
+

@@ -118,3 +118,4 @@ if __name__ == "__main__":
     test_skeletons = "class NewFeature: pass"
     tickets = generate_tickets(test_brief, test_skeletons)
     print(json.dumps(tickets, indent=2))
+

@@ -59,3 +59,4 @@ def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
         return input_cost + output_cost
     return 0.0
+

@@ -193,3 +193,4 @@ class ExecutionEngine:
         ticket = self.dag.ticket_map.get(task_id)
         if ticket:
             ticket.status = status
+

@@ -1,4 +1,4 @@
 from typing import List, Dict, Optional, Tuple
 from dataclasses import dataclass
 import shutil
 import os

@@ -126,3 +126,4 @@ class UserRequestEvent:
             "disc_text": self.disc_text,
             "base_dir": self.base_dir
         }
+

@@ -376,3 +376,4 @@ def evict(path: Path) -> None:
 def list_cached() -> List[Dict[str, Any]]:
     return []
+

@@ -189,3 +189,4 @@ class GeminiCliAdapter:
     """
     total_chars = len("\n".join(contents))
     return total_chars // 4
+

File diff suppressed because it is too large

@@ -115,3 +115,4 @@ class LogPruner:
             sys.stderr.write(f"[LogPruner] Error removing {resolved_path}: {e}\n")
         self.log_registry.save_registry()
+

@@ -301,3 +301,4 @@ class LogRegistry:
         })
         return old_sessions
+

@@ -166,3 +166,4 @@ def render_unindented(text: str) -> None:
 def render_code(code: str, lang: str = "", context_id: str = "default", block_idx: int = 0) -> None:
     get_renderer().render_code(code, lang, context_id, block_idx)
+

@@ -1,4 +1,4 @@
 # mcp_client.py
 """
 MCP Client - Multi-tool filesystem and network operations with sandboxing.
@@ -782,10 +782,10 @@ def get_tree(path: str, max_depth: int = 2) -> str:
             entries = [e for e in entries if not e.name.startswith('.') and e.name not in ('__pycache__', 'venv', 'env') and e.name != "history.toml" and not e.name.endswith("_history.toml")]
             for i, entry in enumerate(entries):
                 is_last = (i == len(entries) - 1)
                 connector = "└── " if is_last else "├── "
                 lines.append(f"{prefix}{connector}{entry.name}")
                 if entry.is_dir():
-                    extension = "    " if is_last else "│   "
+                    extension = "    " if is_last else "    "
                     lines.extend(_build_tree(entry, current_depth + 1, prefix + extension))
             return lines
         tree_lines = [f"{p.name}/"] + _build_tree(p, 1)
@@ -1466,3 +1466,4 @@ MCP_TOOL_SPECS: list[dict[str, Any]] = [
+
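The `get_tree` hunk above hinges on two choices per entry: a `├── `/`└── ` connector and a prefix extension for recursing into directories. A standalone sketch of that recursion, operating on a nested dict instead of the filesystem (the helper name `render_tree` and the dict shape are illustrative, not from this repo):

```python
def render_tree(name, children, prefix=""):
    # children: dict mapping entry name -> sub-dict (empty dict means a file)
    lines = [name + "/"] if prefix == "" else []
    entries = sorted(children)
    for i, entry in enumerate(entries):
        is_last = (i == len(entries) - 1)
        # Last sibling gets the corner connector; others get the tee.
        connector = "└── " if is_last else "├── "
        lines.append(f"{prefix}{connector}{entry}")
        sub = children[entry]
        if sub:
            # Continue the vertical rail only while siblings remain below.
            extension = "    " if is_last else "│   "
            lines.extend(render_tree(entry, sub, prefix + extension))
    return lines
```

Note the diff above appears to swap the `"│   "` extension for plain spaces in the new version; this sketch keeps the conventional rail for illustration.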

@@ -178,3 +178,4 @@ RULES:
 Analyze this error and generate the patch:
 """
+

@@ -349,30 +349,17 @@ class FileItem:
 class Preset:
     name: str
     system_prompt: str
-    temperature: Optional[float] = None
-    top_p: Optional[float] = None
-    max_output_tokens: Optional[int] = None
     def to_dict(self) -> Dict[str, Any]:
-        res = {
+        return {
             "system_prompt": self.system_prompt,
         }
-        if self.temperature is not None:
-            res["temperature"] = self.temperature
-        if self.top_p is not None:
-            res["top_p"] = self.top_p
-        if self.max_output_tokens is not None:
-            res["max_output_tokens"] = self.max_output_tokens
-        return res
     @classmethod
     def from_dict(cls, name: str, data: Dict[str, Any]) -> "Preset":
         return cls(
             name=name,
             system_prompt=data.get("system_prompt", ""),
-            temperature=data.get("temperature"),
-            top_p=data.get("top_p"),
-            max_output_tokens=data.get("max_output_tokens"),
         )
 @dataclass
@@ -447,32 +434,48 @@ class BiasProfile:
 @dataclass
 class Persona:
     name: str
-    provider: Optional[str] = None
-    model: Optional[str] = None
-    preferred_models: List[str] = field(default_factory=list)
+    preferred_models: List[Dict[str, Any]] = field(default_factory=list)
     system_prompt: str = ''
-    temperature: Optional[float] = None
-    top_p: Optional[float] = None
-    max_output_tokens: Optional[int] = None
     tool_preset: Optional[str] = None
     bias_profile: Optional[str] = None
+    @property
+    def provider(self) -> Optional[str]:
+        if not self.preferred_models: return None
+        return self.preferred_models[0].get("provider")
+    @property
+    def model(self) -> Optional[str]:
+        if not self.preferred_models: return None
+        return self.preferred_models[0].get("model")
+    @property
+    def temperature(self) -> Optional[float]:
+        if not self.preferred_models: return None
+        return self.preferred_models[0].get("temperature")
+    @property
+    def top_p(self) -> Optional[float]:
+        if not self.preferred_models: return None
+        return self.preferred_models[0].get("top_p")
+    @property
+    def max_output_tokens(self) -> Optional[int]:
+        if not self.preferred_models: return None
+        return self.preferred_models[0].get("max_output_tokens")
     def to_dict(self) -> Dict[str, Any]:
         res = {
             "system_prompt": self.system_prompt,
         }
-        if self.provider is not None:
-            res["provider"] = self.provider
-        if self.model is not None:
-            res["model"] = self.model
         if self.preferred_models:
-            res["preferred_models"] = self.preferred_models
-        if self.temperature is not None:
-            res["temperature"] = self.temperature
-        if self.top_p is not None:
-            res["top_p"] = self.top_p
-        if self.max_output_tokens is not None:
-            res["max_output_tokens"] = self.max_output_tokens
+            processed = []
+            for m in self.preferred_models:
+                if isinstance(m, str):
+                    processed.append({"model": m})
+                else:
+                    processed.append(m)
+            res["preferred_models"] = processed
         if self.tool_preset is not None:
             res["tool_preset"] = self.tool_preset
         if self.bias_profile is not None:
@@ -481,15 +484,34 @@ class Persona:
     @classmethod
     def from_dict(cls, name: str, data: Dict[str, Any]) -> "Persona":
+        raw_models = data.get("preferred_models", [])
+        parsed_models = []
+        for m in raw_models:
+            if isinstance(m, str):
+                parsed_models.append({"model": m})
+            else:
+                parsed_models.append(m)
+        # Migration logic: merge legacy fields if they exist
+        legacy = {}
+        for k in ["provider", "model", "temperature", "top_p", "max_output_tokens"]:
+            if data.get(k) is not None:
+                legacy[k] = data[k]
+        if legacy:
+            if not parsed_models:
+                parsed_models.append(legacy)
+            else:
+                # Merge into first item if it's missing these specific legacy fields
+                for k, v in legacy.items():
+                    if k not in parsed_models[0] or parsed_models[0][k] is None:
+                        parsed_models[0][k] = v
         return cls(
             name=name,
-            provider=data.get("provider"),
-            model=data.get("model"),
-            preferred_models=data.get("preferred_models", []),
+            preferred_models=parsed_models,
             system_prompt=data.get("system_prompt", ""),
-            temperature=data.get("temperature"),
-            top_p=data.get("top_p"),
-            max_output_tokens=data.get("max_output_tokens"),
             tool_preset=data.get("tool_preset"),
             bias_profile=data.get("bias_profile"),
         )
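The `from_dict` migration above does two things: it normalizes bare string entries into `{"model": ...}` dicts, and it folds legacy top-level fields (`provider`, `temperature`, ...) into the first preferred model. A condensed, standalone sketch of that normalization step (the function name `normalize_preferred_models` is illustrative, not part of `models.py`):

```python
def normalize_preferred_models(data):
    # Strings become {"model": ...}; dict entries pass through as copies.
    parsed = [{"model": m} if isinstance(m, str) else dict(m)
              for m in data.get("preferred_models", [])]
    # Collect legacy top-level fields from the old persona format.
    legacy = {k: data[k]
              for k in ("provider", "model", "temperature", "top_p", "max_output_tokens")
              if data.get(k) is not None}
    if legacy:
        if not parsed:
            # Pure legacy config: the legacy fields become the only entry.
            parsed.append(legacy)
        else:
            # Mixed config: fill gaps in the first entry without clobbering it.
            for k, v in legacy.items():
                if parsed[0].get(k) is None:
                    parsed[0][k] = v
    return parsed
```

Round-tripping a legacy config through this (and back out via `to_dict`) yields the new nested format, which is what the updated tests at the bottom of this diff assert.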

@@ -614,3 +614,4 @@ def run_worker_lifecycle(ticket: Ticket, context: WorkerContext, context_files:
         if event_queue:
             _queue_put(event_queue, "ticket_completed", {"ticket_id": ticket.id, "timestamp": time.time()})
         return response
+

@@ -86,3 +86,4 @@ class NativeOrchestrator:
         """Tier 4: Generate patch for error"""
         from src import ai_client
         return ai_client.run_tier4_patch_generation(error, file_context)
+

@@ -125,3 +125,4 @@ if __name__ == "__main__":
     history = get_track_history_summary()
     tracks = generate_tracks("Implement a basic unit test for the ai_client.py module.", flat, file_items, history_summary=history)
     print(json.dumps(tracks, indent=2))
+

@@ -87,3 +87,4 @@ def get_outline(path: Path, code: str) -> str:
         return outliner.outline(code)
     else:
         return f"Outlining not supported for {suffix} files yet."
+

@@ -1,4 +1,4 @@
 from typing import Optional, Callable, List
 from dataclasses import dataclass, field
 @dataclass
View File

@@ -110,3 +110,4 @@ def get_archive_dir() -> Path:
def reset_resolved() -> None: def reset_resolved() -> None:
"""For testing only - clear cached resolutions.""" """For testing only - clear cached resolutions."""
_RESOLVED.clear() _RESOLVED.clear()

@@ -232,3 +232,4 @@ class PerformanceMonitor:
         self._stop_event.set()
         if self._cpu_thread.is_alive():
             self._cpu_thread.join(timeout=2.0)
+

@@ -47,6 +47,21 @@ class PersonaManager:
         data["personas"][persona.name] = persona.to_dict()
         self._save_file(path, data)
+    def get_persona_scope(self, name: str) -> str:
+        """Returns the scope ('global' or 'project') of a persona by name."""
+        if self.project_root:
+            project_path = paths.get_project_personas_path(self.project_root)
+            project_data = self._load_file(project_path)
+            if name in project_data.get("personas", {}):
+                return "project"
+        global_path = paths.get_global_personas_path()
+        global_data = self._load_file(global_path)
+        if name in global_data.get("personas", {}):
+            return "global"
+        return "project"
     def delete_persona(self, name: str, scope: str = "project") -> None:
         path = self._get_path(scope)
         data = self._load_file(path)
@@ -67,3 +82,4 @@ class PersonaManager:
         path.parent.mkdir(parents=True, exist_ok=True)
         with open(path, "wb") as f:
             tomli_w.dump(data, f)
+
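The new `get_persona_scope` above resolves name collisions between the project and global persona files, with project scope winning. The precedence rule, reduced to a sketch over two plain dicts (the helper `resolve_scope` is hypothetical, for illustration only):

```python
def resolve_scope(name, project_personas, global_personas):
    # A persona defined in both files is reported as project-scoped,
    # mirroring how the project file shadows the global one on load.
    if name in project_personas:
        return "project"
    if name in global_personas:
        return "global"
    # Unknown names default to "project", the scope new personas are saved to.
    return "project"
```

Defaulting unknown names to `"project"` means the editor's scope dropdown always has a sensible value even for a persona that has not been saved yet.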

@@ -89,3 +89,4 @@ class PresetManager:
         path.parent.mkdir(parents=True, exist_ok=True)
         with open(path, "wb") as f:
             f.write(tomli_w.dumps(data).encode("utf-8"))
+

@@ -391,3 +391,4 @@ def calculate_track_progress(tickets: list) -> dict:
         "blocked": blocked,
         "todo": todo
     }
+

@@ -217,3 +217,4 @@ def log_cli_call(command: str, stdin_content: Optional[str], stdout_content: Opt
             _cli_fh.flush()
     except Exception:
         pass
+

@@ -70,3 +70,4 @@ def apply_faux_acrylic_glass(draw_list: imgui.ImDrawList, p_min: imgui.ImVec2, p
         flags=imgui.ImDrawFlags_.round_corners_all if rounding > 0 else imgui.ImDrawFlags_.none,
         thickness=1.0
     )
+

@@ -90,3 +90,4 @@ def run_powershell(script: str, base_dir: str, qa_callback: Optional[Callable[[s
         if 'process' in locals() and process:
             subprocess.run(["taskkill", "/F", "/T", "/PID", str(process.pid)], capture_output=True)
         return f"ERROR: {e}"
+

@@ -190,3 +190,4 @@ def build_summary_markdown(file_items: list[dict[str, Any]]) -> str:
         summary = item.get("summary", "")
         parts.append(f"### `{path}`\n\n{summary}")
     return "\n\n---\n\n".join(parts)
+

@@ -388,3 +388,4 @@ def load_from_config(config: dict[str, Any]) -> None:
     if font_path:
         apply_font(font_path, font_size)
     set_scale(scale)
+

@@ -392,3 +392,4 @@ def get_tweaked_theme() -> hello_imgui.ImGuiTweakedTheme:
     # Sync tweaks
     tt.tweaks.rounding = 6.0
     return tt
+

@@ -82,3 +82,4 @@ def apply_nerv() -> None:
     style.popup_border_size = 1.0
     style.child_border_size = 1.0
     style.tab_border_size = 1.0
+

@@ -83,3 +83,4 @@ class AlertPulsing:
         alpha = 0.05 + 0.15 * ((math.sin(time.time() * 4.0) + 1.0) / 2.0)
         color = imgui.get_color_u32((1.0, 0.0, 0.0, alpha))
         draw_list.add_rect((0.0, 0.0), (width, height), color, 0.0, 0, 10.0)
+

@@ -63,3 +63,4 @@ class ToolBiasEngine:
             lines.append(f"- {cat}: {mult}x")
         return "\n\n".join(lines)
+

@@ -108,3 +108,4 @@ class ToolPresetManager:
         if "bias_profiles" in data and name in data["bias_profiles"]:
             del data["bias_profiles"][name]
         self._write_raw(path, data)
+

@@ -59,7 +59,7 @@ def test_load_all_merged(temp_paths):
 def test_save_persona(temp_paths):
     manager = PersonaManager(project_root=temp_paths["project_root"])
-    persona = Persona(name="New", provider="gemini", system_prompt="Test")
+    persona = Persona(name="New", preferred_models=[{"provider": "gemini"}], system_prompt="Test")
     manager.save_persona(persona, scope="project")
     loaded = manager.load_all()

@@ -4,30 +4,38 @@ from src.models import Persona
 def test_persona_serialization():
     persona = Persona(
         name="SecuritySpecialist",
-        provider="anthropic",
-        model="claude-3-7-sonnet-20250219",
-        preferred_models=["claude-3-7-sonnet-20250219", "claude-3-5-sonnet-20241022"],
+        preferred_models=[
+            {"provider": "anthropic", "model": "claude-3-7-sonnet-20250219", "temperature": 0.2, "top_p": 0.9, "max_output_tokens": 4000},
+            "claude-3-5-sonnet-20241022"
+        ],
         system_prompt="You are a security expert.",
-        temperature=0.2,
-        top_p=0.9,
-        max_output_tokens=4000,
         tool_preset="SecurityTools",
         bias_profile="Execution-Focused"
     )
+    assert persona.provider == "anthropic"
+    assert persona.model == "claude-3-7-sonnet-20250219"
+    assert persona.temperature == 0.2
+    assert persona.top_p == 0.9
+    assert persona.max_output_tokens == 4000
     data = persona.to_dict()
-    assert data["provider"] == "anthropic"
-    assert data["model"] == "claude-3-7-sonnet-20250219"
-    assert "claude-3-5-sonnet-20241022" in data["preferred_models"]
+    # data should NOT have top-level provider/model anymore, it's in preferred_models
+    assert "provider" not in data
+    assert "model" not in data
+    assert data["preferred_models"][0]["provider"] == "anthropic"
+    assert data["preferred_models"][0]["model"] == "claude-3-7-sonnet-20250219"
+    assert data["preferred_models"][1] == {"model": "claude-3-5-sonnet-20241022"}
     assert data["system_prompt"] == "You are a security expert."
-    assert data["temperature"] == 0.2
-    assert data["top_p"] == 0.9
-    assert data["max_output_tokens"] == 4000
+    assert data["preferred_models"][0]["temperature"] == 0.2
+    assert data["preferred_models"][0]["top_p"] == 0.9
+    assert data["preferred_models"][0]["max_output_tokens"] == 4000
     assert data["tool_preset"] == "SecurityTools"
     assert data["bias_profile"] == "Execution-Focused"
 def test_persona_deserialization():
+    # Old config format (legacy)
     data = {
         "provider": "gemini",
         "model": "gemini-2.5-flash",
@@ -45,7 +53,9 @@ def test_persona_deserialization():
     assert persona.name == "Assistant"
     assert persona.provider == "gemini"
     assert persona.model == "gemini-2.5-flash"
-    assert persona.preferred_models == ["gemini-2.5-flash"]
+    # Migration logic should have put legacy fields into preferred_models since it only had a string
+    assert persona.preferred_models[0]["provider"] == "gemini"
+    assert persona.preferred_models[0]["model"] == "gemini-2.5-flash"
    assert persona.system_prompt == "You are a helpful assistant."
    assert persona.temperature == 0.5
    assert persona.top_p == 1.0