
[agentserver] Always include model field in response payload#46302

Open
ankitbko wants to merge 5 commits into main from fix/responses-model-always-present

Conversation

@ankitbko
Member

Problem

When the request payload doesn't include a model field, the response payload omits model entirely. The OpenAI SDK requires model to be present to deserialize the response object, resulting in an empty object being returned to the caller.

Fix

Default model to empty string ("") instead of None when not provided in the request. This ensures model is always stamped on the response payload via apply_common_defaults() (which guards with if model is not None).

One-line change in _endpoint_handler.py:

```python
# Before:
model = getattr(parsed, "model", None)
# After:
model = getattr(parsed, "model", None) or ""
```
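The effect of the one-liner can be sketched as follows. Here `apply_common_defaults` is a stand-in that mirrors the `if model is not None` guard described above, not the repo's actual implementation:

```python
from types import SimpleNamespace

def apply_common_defaults(payload: dict, model) -> dict:
    # Mirrors the guard described above: model is stamped only when not None.
    if model is not None:
        payload["model"] = model
    return payload

parsed = SimpleNamespace()  # request payload with no "model" field

# Before the fix: None falls through the guard, so "model" is omitted entirely.
before = apply_common_defaults({"object": "response"},
                               getattr(parsed, "model", None))

# After the fix: "" passes the guard, so "model" is always present.
after = apply_common_defaults({"object": "response"},
                              getattr(parsed, "model", None) or "")

print("model" in before)  # False
print("model" in after)   # True
```

With the defaulted empty string, the SDK can always deserialize the response object even when the caller never specified a model.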

Testing

All 678 responses tests pass.

Default model to empty string when not provided in the request,
ensuring the field is always present in the response payload.
The OpenAI SDK requires model to be present to deserialize the
response object.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Copilot AI review requested due to automatic review settings April 14, 2026 06:25
@github-actions github-actions bot added the Hosted Agents sdk/agentserver/* label Apr 14, 2026
@ankitbko ankitbko changed the title Always include model field in response payload [agentserver] Always include model field in response payload Apr 14, 2026
Contributor

Copilot AI left a comment


Pull request overview

Ensures the Responses hosting layer always stamps a model field into response payloads (even when omitted from the request), preventing downstream clients (notably the OpenAI SDK) from failing to deserialize responses when model is missing.

Changes:

  • Default model to "" when building the per-request execution context so apply_common_defaults() will always include model in lifecycle snapshots.

@RaviPidaparthi
Member

Can we have an e2e test for this? Maybe update an existing one.

Address PR review feedback: add contract tests verifying the model
field is present in the response payload when omitted from the request,
for both sync (stream=False) and streaming (stream=True) modes.
…eleted'

The OpenAI spec returns {id, object: 'response', deleted: true} for
DELETE /responses/{id}.  Our handler was returning 'response.deleted'
which doesn't match.  Fixed the handler and updated all 5 test
assertions.
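The handler fix described above can be sketched as follows (illustrative only; the function name and signature are assumptions, not the handler's actual code):

```python
def delete_response(response_id: str) -> dict:
    # Per the OpenAI spec as described above, DELETE /responses/{id}
    # returns the plain "response" object type with deleted: true,
    # not the "response.deleted" type the handler previously emitted.
    return {"id": response_id, "object": "response", "deleted": True}

payload = delete_response("resp_abc")  # "resp_abc" is a made-up example id
print(payload["object"])  # response
```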
ResponseExecution now carries agent_session_id and conversation_id so
that _RuntimeState.to_snapshot can forcibly stamp them (S-038/S-040)
on both the response.as_dict() path and the minimal fallback dict.
All four orchestrator ResponseExecution creation sites pass both
fields from the execution context.
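The stamping behavior above can be sketched as follows. This is a minimal sketch under assumptions: the field names mirror the description, but the real `_RuntimeState.to_snapshot` internals are not shown here:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ResponseExecution:
    # Carried on the execution so the snapshot path can stamp them (S-038/S-040).
    agent_session_id: Optional[str]
    conversation_id: Optional[str]

def to_snapshot(execution: ResponseExecution, response_dict: dict) -> dict:
    # Works the same whether response_dict came from response.as_dict()
    # or from the minimal fallback dict: copy, then stamp both ids.
    snapshot = dict(response_dict)
    if execution.agent_session_id is not None:
        snapshot["agent_session_id"] = execution.agent_session_id
    if execution.conversation_id is not None:
        snapshot["conversation_id"] = execution.conversation_id
    return snapshot
```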
The manual _patch.py override of ResponseObject.output erased the
element type (list instead of list[OutputItem]), preventing the model
framework from deserializing nested dicts into OutputItem instances.
This caused get_history to return plain dicts instead of typed models.

Changes:
- Remove output:list override; use generated list[OutputItem]
- Remove ToolChoiceAllowed override (generated type is identical)
- Move Sphinx docstring fixes into models_patch.py shim so
  make generate-models preserves them instead of overwriting
- Accept emitter upgrade to model_base.py (XML refactor)
- Regenerate _validators.py from current TypeSpec sources
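Why the `list` vs `list[OutputItem]` annotation matters can be shown with a tiny hint-driven deserializer. This is a minimal illustration, not the repo's model framework: any deserializer that recurses based on type hints loses the ability to build typed elements once the annotation erases the element type:

```python
from dataclasses import dataclass
from typing import get_type_hints, get_origin, get_args

@dataclass
class OutputItem:
    type: str
    text: str

@dataclass
class Typed:
    output: list[OutputItem]  # element type preserved (generated model)

@dataclass
class Erased:
    output: list              # element type erased (the old _patch.py override)

def deserialize(cls, data: dict):
    """Tiny hint-driven deserializer: recurse into list elements only
    when the annotation records an element type."""
    kwargs = {}
    for name, hint in get_type_hints(cls).items():
        value = data[name]
        if get_origin(hint) is list and get_args(hint):
            (elem,) = get_args(hint)
            value = [elem(**v) for v in value]
        kwargs[name] = value
    return cls(**kwargs)

raw = {"output": [{"type": "message", "text": "hi"}]}
typed = deserialize(Typed, raw)
erased = deserialize(Erased, raw)
print(type(typed.output[0]).__name__)   # OutputItem
print(type(erased.output[0]).__name__)  # dict
```

This is the failure mode described above: with the bare `list` override, nested dicts stay plain dicts, so `get_history` returned untyped data.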
@RaviPidaparthi RaviPidaparthi force-pushed the fix/responses-model-always-present branch from 6fa2b47 to a141311 Compare April 15, 2026 00:13
