feat: support all model providers in JSON agent configuration#2109
```python
# Provider name to factory function mapping — populated at module level, lazy imports at call time
PROVIDER_MAP: dict[str, str] = {
```
Issue: `PROVIDER_MAP` maps provider names to string function names, which are then resolved via `globals()[factory_name]` on line 438. This pattern is fragile — it breaks silently if a function is renamed or removed, and it bypasses static analysis tools and IDE navigation.

Suggestion: Map directly to function references instead:

```python
PROVIDER_MAP: dict[str, Callable[[dict[str, Any]], Any]] = {
    "bedrock": _create_bedrock_model,
    "anthropic": _create_anthropic_model,
    ...
}
```

This requires moving `PROVIDER_MAP` below the factory function definitions, but it gives you type safety, IDE go-to-definition, and eliminates the `globals()` lookup.
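To show the suggested pattern end to end, here is a minimal self-contained sketch; the factory bodies and the `create_model` dispatcher are hypothetical stand-ins for illustration, not the module's real code:

```python
from typing import Any, Callable

# Hypothetical stand-ins for the real provider factories; in the actual
# module each would lazily import its provider class at call time.
def _create_bedrock_model(config: dict[str, Any]) -> Any:
    return ("bedrock", config)

def _create_anthropic_model(config: dict[str, Any]) -> Any:
    return ("anthropic", config)

# Function references instead of string names: a rename now breaks loudly
# at import time, and IDE go-to-definition works on each entry.
PROVIDER_MAP: dict[str, Callable[[dict[str, Any]], Any]] = {
    "bedrock": _create_bedrock_model,
    "anthropic": _create_anthropic_model,
}

def create_model(provider: str, config: dict[str, Any]) -> Any:
    try:
        factory = PROVIDER_MAP[provider]
    except KeyError:
        raise ValueError(f"unknown model provider: {provider!r}") from None
    return factory(config)
```

Unknown provider names surface as a `ValueError` at the dispatch point rather than a bare `KeyError` from a `globals()` lookup.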
```python
def config_to_agent(config: str | dict[str, Any], **kwargs: dict[str, Any]) -> Any:
```
Issue: The type annotation for **kwargs is dict[str, Any], but **kwargs already captures keyword arguments as a dict — the annotation should be the value type, not the full dict type.
Suggestion:
```python
def config_to_agent(config: str | dict[str, Any], **kwargs: Any) -> Any:
```

```python
import jsonschema
from jsonschema import ValidationError

logger = logging.getLogger(__name__)
```
Issue: `logging` is imported and `logger` is defined but never used anywhere in the module.

Suggestion: Either add logging calls (e.g., in `_create_model_from_config` for debugging provider instantiation) or remove the import and logger definition. Per the style guide, a useful log would be:

```python
logger.debug("provider=<%s> | creating model from config", provider)
```

```python
if client_args is not None:
    kwargs["client_args"] = client_args
kwargs.update(config)
return AnthropicModel(**kwargs)
```
Issue: 7 of the 12 factory functions (`_create_anthropic_model`, `_create_openai_model`, `_create_gemini_model`, `_create_litellm_model`, `_create_llamaapi_model`, `_create_writer_model`, `_create_openai_responses_model`) share identical logic: pop `client_args`, build kwargs, update with the remaining config, call the constructor.

Suggestion: Extract a shared helper to eliminate the duplication:

```python
def _create_client_args_model(model_cls: type, config: dict[str, Any]) -> Any:
    """Common factory for providers that accept client_args + **model_config."""
    client_args = config.pop("client_args", None)
    kwargs: dict[str, Any] = {}
    if client_args is not None:
        kwargs["client_args"] = client_args
    kwargs.update(config)
    return model_cls(**kwargs)
```

Then each common provider becomes a one-liner with just the import and call. The unique providers (bedrock, ollama, mistral, llamacpp, sagemaker) keep their custom logic.
```python
if "boto_client_config" in config:
    raw = config.pop("boto_client_config")
    kwargs["boto_client_config"] = BotocoreConfig(**raw) if isinstance(raw, dict) else raw
return SageMakerAIModel(**kwargs)
```
Issue: The SageMaker factory silently drops any extra config keys after popping `endpoint_config`, `payload_config`, and `boto_client_config`. For example, if a user passes `model_id` (a common pattern for other providers), it's silently ignored. All other factories pass remaining config through via `kwargs.update(config)`.

Suggestion: Either pass the remaining config to the constructor (if SageMaker accepts `**kwargs`) or explicitly warn/raise on unexpected keys so users get feedback when their config is wrong:

```python
if config:
    logger.warning("provider=<sagemaker> | ignoring unsupported config keys: %s", list(config.keys()))
```

```python
_VALIDATOR = jsonschema.Draft7Validator(AGENT_CONFIG_SCHEMA)
```
```python
# Pattern for matching environment variable references
_ENV_VAR_PATTERN = re.compile(r"^\$\{([^}]+)\}$|^\$([A-Za-z_][A-Za-z0-9_]*)$")
```
Issue: The regex `^\$\{([^}]+)\}$|^\$([A-Za-z_][A-Za-z0-9_]*)$` only matches full-string env var references (anchored with `^` and `$`). This means `"prefix-$VAR-suffix"` won't be resolved, which may surprise users coming from shell-like environments.

Suggestion: This is a reasonable design choice for security and simplicity, but it should be explicitly documented — either in the module docstring or as a comment near the pattern. Something like:

```python
# Only full-string env var references are resolved (no inline interpolation).
# "prefix-$VAR" is NOT resolved; use the object format to construct values programmatically.
```
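To make the anchored behavior concrete, here is a self-contained sketch; `resolve_value` is a hypothetical helper for illustration, not the module's actual resolution code:

```python
import os
import re

# Same pattern as the module's _ENV_VAR_PATTERN: it matches only when the
# ENTIRE string is "${NAME}" or "$NAME" (no inline interpolation).
_ENV_VAR_PATTERN = re.compile(r"^\$\{([^}]+)\}$|^\$([A-Za-z_][A-Za-z0-9_]*)$")

def resolve_value(value: str) -> str:
    """Replace a full-string $VAR / ${VAR} reference with its environment value."""
    m = _ENV_VAR_PATTERN.match(value)
    if m is None:
        return value  # not a full-string reference: left untouched
    return os.environ[m.group(1) or m.group(2)]
```

With `os.environ["API_KEY"] = "secret"`, both `"${API_KEY}"` and `"$API_KEY"` resolve to `"secret"`, while `"prefix-$API_KEY"` comes back unchanged, which is exactly the behavior worth documenting.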
Assessment: Solid implementation that extends the experimental JSON agent configuration to support all 12 SDK model providers with backward compatibility. The main areas for improvement are reducing code duplication in the provider factory functions and replacing the fragile `globals()`-based provider lookup.

Good backward compatibility preservation, thorough test coverage, and a clean lazy-import pattern across all providers.
Motivation
The experimental `agent_config.py` feature currently only accepts a simple string for the `model` field, which is always interpreted as a Bedrock `model_id`. This limits JSON-based agent configuration to a single provider, preventing use cases like agent-builder tools and the `use_agent` tool from working with any of the SDK's 12 model providers.

Resolves #1064
Public API Changes
The `model` field in agent JSON configuration now supports two formats: the existing plain string (interpreted as a Bedrock `model_id`, preserving backward compatibility) and an object form that selects a provider and its configuration.

Environment variable references (`$VAR` or `${VAR}`) in model config values are resolved automatically before provider instantiation, enabling secure configuration without embedding secrets.

All 12 SDK providers are supported:
`bedrock`, `anthropic`, `openai`, `gemini`, `ollama`, `litellm`, `mistral`, `llamaapi`, `llamacpp`, `sagemaker`, `writer`, `openai_responses`. Each provider's constructor parameters are correctly routed — for example, `boto_client_config` dicts are converted to `BotocoreConfig` objects for Bedrock/SageMaker, Ollama's `client_args` maps to `ollama_client_args`, and Mistral's `api_key` is extracted as a separate parameter.

All provider imports are lazy to avoid requiring optional dependencies that aren't installed. Non-serializable parameters (`boto_session`, `client`, `gemini_tools`) cannot be specified from JSON and are documented as such.

Use Cases