
feat: support all model providers in JSON agent configuration#2109

Draft
Unshure wants to merge 1 commit into main from agent-tasks/1064

Conversation


@Unshure Unshure commented Apr 10, 2026

Motivation

The experimental agent_config.py feature currently only accepts a simple string for the model field, which is always interpreted as a Bedrock model_id. This limits JSON-based agent configuration to a single provider, preventing use cases like agent-builder tools and the use_agent tool from working with any of the SDK's 12 model providers.

Resolves #1064

Public API Changes

The model field in agent JSON configuration now supports two formats:

# Before: string only (Bedrock model_id)
config = {"model": "us.anthropic.claude-sonnet-4-20250514-v1:0"}
agent = config_to_agent(config)

# After: the same string config still works (backward compatible)
agent = config_to_agent(config)

# After: object format for any provider
config = {
    "model": {
        "provider": "openai",
        "model_id": "gpt-4o",
        "client_args": {"api_key": "$OPENAI_API_KEY"}
    }
}
agent = config_to_agent(config)

Environment variable references ($VAR or ${VAR}) in model config values are resolved automatically before provider instantiation, enabling secure configuration without embedding secrets.
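The resolution step can be sketched with the PR's own regex (shown later in the diff). The helper name `resolve_env_ref` and the fall-back-to-original behavior for unset variables are assumptions for illustration, not the SDK's actual implementation:

```python
import os
import re

# Regex from the PR: matches only full-string references, "$VAR" or "${VAR}".
_ENV_VAR_PATTERN = re.compile(r"^\$\{([^}]+)\}$|^\$([A-Za-z_][A-Za-z0-9_]*)$")

def resolve_env_ref(value: str) -> str:
    """Return the env var's value for a full-string reference; otherwise pass through.

    Hypothetical helper: the real resolver's name and unset-variable
    behavior may differ.
    """
    match = _ENV_VAR_PATTERN.match(value)
    if match is None:
        return value
    name = match.group(1) or match.group(2)
    return os.environ.get(name, value)

os.environ["OPENAI_API_KEY"] = "sk-test"
print(resolve_env_ref("$OPENAI_API_KEY"))    # sk-test
print(resolve_env_ref("${OPENAI_API_KEY}"))  # sk-test
print(resolve_env_ref("prefix-$VAR"))        # prefix-$VAR (no inline interpolation)
```

Note that because the pattern is anchored, only whole-string references resolve; a value like "prefix-$VAR" passes through untouched.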

All 12 SDK providers are supported: bedrock, anthropic, openai, gemini, ollama, litellm, mistral, llamaapi, llamacpp, sagemaker, writer, openai_responses. Each provider's constructor parameters are correctly routed — for example, boto_client_config dicts are converted to BotocoreConfig objects for Bedrock/SageMaker, Ollama's client_args maps to ollama_client_args, and Mistral's api_key is extracted as a separate parameter.

All provider imports are lazy to avoid requiring optional dependencies that aren't installed. Non-serializable parameters (boto_session, client, gemini_tools) cannot be specified from JSON and are documented as such.

Use Cases

  • Agent-builder tools: Create agents from JSON templates with any model provider, not just Bedrock
  • Multi-provider configs: Store agent configurations in files that use OpenAI, Anthropic, or local models
  • Secure credential handling: Reference API keys via environment variables in JSON configs rather than hardcoding them


codecov bot commented Apr 10, 2026

Codecov Report

❌ Patch coverage is 95.78947% with 4 lines in your changes missing coverage. Please review.

Files with missing lines:
  • src/strands/models/llamacpp.py: 77.77% patch coverage (0 missing, 2 partials) ⚠️
  • src/strands/models/mistral.py: 81.81% patch coverage (0 missing, 2 partials) ⚠️


_ENV_VAR_PATTERN = re.compile(r"^\$\{([^}]+)\}$|^\$([A-Za-z_][A-Za-z0-9_]*)$")

# Provider name to factory function mapping — populated at module level, lazy imports at call time
PROVIDER_MAP: dict[str, str] = {

Issue: PROVIDER_MAP maps provider names to string function names, which are then resolved via globals()[factory_name] on line 438. This pattern is fragile — it breaks silently if a function is renamed or removed, and bypasses static analysis tools and IDE navigation.

Suggestion: Map directly to function references instead:

PROVIDER_MAP: dict[str, Callable[[dict[str, Any]], Any]] = {
    "bedrock": _create_bedrock_model,
    "anthropic": _create_anthropic_model,
    ...
}

This requires moving PROVIDER_MAP below the factory function definitions, but it gives you type safety, IDE go-to-definition, and eliminates the globals() lookup.
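The direct-reference dispatch can be exercised with stand-in factories (these are illustrative, not the SDK's real ones, which construct actual model classes):

```python
from typing import Any, Callable

# Stand-in factories; the real ones lazily import and construct SDK model classes.
def _create_bedrock_model(config: dict[str, Any]) -> str:
    return f"bedrock:{config['model_id']}"

def _create_openai_model(config: dict[str, Any]) -> str:
    return f"openai:{config['model_id']}"

# Direct references: a rename breaks at import time instead of at dispatch
# time, and IDEs can go-to-definition on each entry.
PROVIDER_MAP: dict[str, Callable[[dict[str, Any]], Any]] = {
    "bedrock": _create_bedrock_model,
    "openai": _create_openai_model,
}

print(PROVIDER_MAP["openai"]({"model_id": "gpt-4o"}))  # openai:gpt-4o
```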

return factory_fn(config)


def config_to_agent(config: str | dict[str, Any], **kwargs: dict[str, Any]) -> Any:

Issue: The type annotation for **kwargs is dict[str, Any], but **kwargs already captures keyword arguments as a dict — the annotation should be the value type, not the full dict type.

Suggestion:

def config_to_agent(config: str | dict[str, Any], **kwargs: Any) -> Any:

import jsonschema
from jsonschema import ValidationError

logger = logging.getLogger(__name__)

Issue: logging is imported and logger is defined but never used anywhere in the module.

Suggestion: Either add logging calls (e.g., in _create_model_from_config for debugging provider instantiation) or remove the import and logger definition. Per the style guide, a useful log would be:

logger.debug("provider=<%s> | creating model from config", provider)

if client_args is not None:
    kwargs["client_args"] = client_args
kwargs.update(config)
return AnthropicModel(**kwargs)

Issue: 7 of the 12 factory functions (_create_anthropic_model, _create_openai_model, _create_gemini_model, _create_litellm_model, _create_llamaapi_model, _create_writer_model, _create_openai_responses_model) share identical logic: pop client_args, build kwargs, update with remaining config, call constructor.

Suggestion: Extract a shared helper to eliminate the duplication:

def _create_client_args_model(model_cls: type, config: dict[str, Any]) -> Any:
    """Common factory for providers that accept client_args + **model_config."""
    client_args = config.pop("client_args", None)
    kwargs: dict[str, Any] = {}
    if client_args is not None:
        kwargs["client_args"] = client_args
    kwargs.update(config)
    return model_cls(**kwargs)

Then each common provider becomes a one-liner with just the import and call. The unique providers (bedrock, ollama, mistral, llamacpp, sagemaker) keep their custom logic.

if "boto_client_config" in config:
    raw = config.pop("boto_client_config")
    kwargs["boto_client_config"] = BotocoreConfig(**raw) if isinstance(raw, dict) else raw
return SageMakerAIModel(**kwargs)

Issue: The SageMaker factory silently drops any extra config keys after popping endpoint_config, payload_config, and boto_client_config. For example, if a user passes model_id (a common pattern for other providers), it's silently ignored. All other factories pass remaining config through via kwargs.update(config).

Suggestion: Either pass remaining config to the constructor (if SageMaker accepts **kwargs) or explicitly warn/raise on unexpected keys so users get feedback when their config is wrong:

if config:
    logger.warning("provider=<sagemaker> | ignoring unsupported config keys: %s", list(config.keys()))

_VALIDATOR = jsonschema.Draft7Validator(AGENT_CONFIG_SCHEMA)

# Pattern for matching environment variable references
_ENV_VAR_PATTERN = re.compile(r"^\$\{([^}]+)\}$|^\$([A-Za-z_][A-Za-z0-9_]*)$")

Issue: The regex ^\$\{([^}]+)\}$|^\$([A-Za-z_][A-Za-z0-9_]*)$ only matches full-string env var references (anchored with ^ and $). This means "prefix-$VAR-suffix" won't be resolved, which may surprise users coming from shell-like environments.

Suggestion: This is a reasonable design choice for security and simplicity, but it should be explicitly documented — either in the module docstring or as a comment near the pattern. Something like:

# Only full-string env var references are resolved (no inline interpolation).
# "prefix-$VAR" is NOT resolved; use the object format to construct values programmatically.

@github-actions

Assessment: Comment

Solid implementation that extends the experimental JSON agent configuration to support all 12 SDK model providers with backward compatibility. The main areas for improvement are reducing code duplication in the provider factory functions and replacing the fragile globals() dispatch pattern.

Review Categories
  • Code Duplication: 7 of 12 factory functions are identical — a shared helper would significantly reduce boilerplate and make adding future providers trivial.
  • Dispatch Pattern: PROVIDER_MAP stores string names resolved via globals(), which is fragile and opaque to static analysis. Direct function references are more idiomatic and safer.
  • Silent Failures: The SageMaker factory silently discards unrecognized config keys, unlike all other factories that pass them through. This inconsistency could silently swallow user config errors.
  • Unused Code: logging import and logger are defined but never called — should either add debug logging for provider instantiation or remove.
  • Type Annotation: **kwargs: dict[str, Any] on config_to_agent is incorrect — should be **kwargs: Any.
  • API Review: This PR modifies experimental public API surface (new object model format). Consider adding the needs-api-review label per the API bar raising process.
  • Documentation PR: The PR adds a new public configuration format (object model). Since this is within the experimental module, a docs PR is suggested but non-blocking. Consider adding a "Documentation PR" section to the PR description.

Good backward compatibility preservation, thorough test coverage, and clean lazy-import pattern across all providers.


Successfully merging this pull request may close these issues.

[FEATURE] Support Every model provider in Agent Json feature
