add agent template config panel#439
Conversation
Review Summary by Qodo (Agentic describe, updated until commit b30a55d)

Add agent template configuration panel with LLM settings
Walkthrough

Description

- Add agent template configuration panel with LLM settings
- Support response format, provider, model, and reasoning level configuration
- Implement collapsible config panel with toggle button in template editor
- Refactor template deduplication logic and improve data structure handling

Diagram

```mermaid
flowchart LR
    A["Agent Template Component"] -->|imports| B["Template Config Component"]
    B -->|manages| C["LLM Configuration"]
    C -->|includes| D["Provider & Model Selection"]
    C -->|includes| E["Response Format & Reasoning Level"]
    A -->|displays| F["Collapsible Config Panel"]
    F -->|toggles| B
```
File Changes

1. src/routes/page/agent/[agentId]/agent-components/templates/agent-template-config.svelte
Code Review by Qodo
1. Cleared LLM override persists

```js
    template.llm_config = { provider: null, model: null };
  }
  const value = Number(e.target.value) || 0;
  template.llm_config.max_output_tokens = value;
```
max_output_tokens is a nullable type; when the number conversion fails here, the default should be null, not 0.
No worries here. We validate the max output token value when updating the agent: only a value greater than zero is saved; otherwise it is saved as null.
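The validation described in the reply can be sketched as a small normalization helper. The function name is hypothetical; the actual check lives in the agent-update path, which is not shown in this diff:

```javascript
// Hypothetical sketch of the described validation: keep only a positive
// max_output_tokens value, and fall back to null for anything else
// (empty input, NaN, zero, or a negative number).
function normalizeMaxOutputTokens(raw) {
  const value = Number(raw);
  return Number.isFinite(value) && value > 0 ? value : null;
}
```

With this shape, a cleared or invalid input serializes as null rather than 0, which matches the nullable type of the field.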
…eatures/refine-template-config

Persistent review updated to latest commit b30a55d
```js
function changeProvider(e) {
  const provider = e.target.value;
  if (!template.llm_config) {
    template.llm_config = { provider: null, model: null };
  }
  template.llm_config.provider = provider || null;
  if (!provider) {
    models = [];
    template.llm_config.model = null;
    template.llm_config.reasoning_effort_level = null;
    reasoningLevelOptions = defaultReasonLevelOptions;
    handleAgentChange();
    return;
  }
  models = getLlmModels(provider);
  template.llm_config.model = models[0]?.name || null;
  template.llm_config.reasoning_effort_level = null;
  reasoningLevelOptions = getReasoningLevelOptions(template.llm_config.model);
  handleAgentChange();
}
```
1. Cleared LLM override persists 🐞 Bug ≡ Correctness

In agent-template-config.svelte, clearing the Provider only nulls provider/model/reasoning but keeps template.llm_config as an object (and may retain max_output_tokens), so fetchTemplates still serializes llm_config because it only checks object truthiness. As a result, "clearing" the template override is ineffective, and stale per-template config fields can be saved back into the agent payload.
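The truthiness problem the review describes can be demonstrated in a few lines of plain JavaScript (variable names here are illustrative, not from the codebase):

```javascript
// After clearing the Provider, the override object still exists and may
// retain stale fields such as max_output_tokens.
const clearedConfig = { provider: null, model: null, max_output_tokens: 512 };

// A truthiness-only check (as described for fetchTemplates) still treats
// the cleared override as present, so it gets serialized.
const keptByTruthiness = Boolean(clearedConfig); // true

// Requiring a meaningful field such as provider correctly drops it.
const keptByProviderCheck = Boolean(clearedConfig && clearedConfig.provider); // false
```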
Agent Prompt
### Issue description
Clearing a template’s LLM Provider does not remove the per-template override from the saved payload because `template.llm_config` remains a non-null object (and may retain `max_output_tokens`). `fetchTemplates()` then serializes `llm_config` based on object truthiness, so an “empty override” is still sent.
### Issue Context
Users expect clearing Provider to mean “no per-template override”. Current behavior can persist stale `max_output_tokens` (and other fields) even when provider is unset.
### Fix Focus Areas
- src/routes/page/agent/[agentId]/agent-components/templates/agent-template-config.svelte[116-135]
- src/routes/page/agent/[agentId]/agent-components/templates/agent-template-config.svelte[148-156]
- src/routes/page/agent/[agentId]/agent-components/templates/agent-template.svelte[39-50]
### Suggested fix
1. In `changeProvider`, when provider is cleared, either:
- set `template.llm_config = null` (preferred), or
- clear **all** fields (`model`, `reasoning_effort_level`, `max_output_tokens`) and ensure serialization treats this as null.
2. In `fetchTemplates`, only include `llm_config` if it contains meaningful values (e.g., `provider` is set; optionally require `model` too).
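The two suggested changes can be sketched with simplified stand-ins for the real functions. The names and shapes below are illustrative, assuming the preferred option of dropping the whole override object:

```javascript
// Option from step 1 (preferred): clearing the Provider drops the entire
// per-template override instead of leaving an "empty" object behind.
function clearProviderOverride(template) {
  template.llm_config = null;
}

// Step 2: only serialize llm_config when it carries a meaningful value,
// here requiring provider to be set.
function serializeLlmConfig(template) {
  const cfg = template.llm_config;
  return cfg && cfg.provider ? cfg : null;
}
```

With both changes, a cleared override neither survives in component state nor leaks stale fields like max_output_tokens into the saved agent payload.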