
feat(responses): add prompt_cache_key to CreateModelResponseQuery #412

Merged
nezhyborets merged 1 commit into main from claude/issue-386-prompt-cache-key
Apr 30, 2026

Conversation

@Krivoblotsky
Contributor

Summary

Closes #386.

OpenAI's Responses API accepts a prompt_cache_key parameter that buckets similar requests together for improved cache hit rates and routing. Previously there was no way to set it from CreateModelResponseQuery, forcing users to fork or fall back to lower-level workarounds.

This PR adds a single optional promptCacheKey: String? property, a matching init parameter (default nil), and a CodingKey mapping to "prompt_cache_key". It is a pure addition; no existing call sites need to change.
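
A minimal sketch of the shape of the change, assuming synthesized Codable conformance. The `model` and `input` fields here are illustrative stand-ins, not the library's full `CreateModelResponseQuery` definition:

```swift
public struct CreateModelResponseQuery: Codable {
    // Illustrative subset of existing fields; the real type has many more.
    public let model: String
    public let input: String

    /// Buckets similar requests together for improved cache hit rates and routing.
    /// Optional; synthesized Encodable uses encodeIfPresent, so a nil value is
    /// omitted from the encoded JSON entirely.
    public let promptCacheKey: String?

    public enum CodingKeys: String, CodingKey {
        case model
        case input
        case promptCacheKey = "prompt_cache_key"
    }

    public init(model: String, input: String, promptCacheKey: String? = nil) {
        self.model = model
        self.input = input
        self.promptCacheKey = promptCacheKey
    }
}
```

Because the new parameter defaults to nil, existing initializer calls continue to compile and encode exactly as before.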

Test plan

  • testCreateResponseQueryEncodesPromptCacheKey — when set, the encoded JSON contains "prompt_cache_key": "user-1234".
  • testCreateResponseQueryOmitsPromptCacheKeyWhenNil — when not provided, the encoded JSON has no prompt_cache_key field (verifies we don't accidentally send null and pollute requests).
  • swift test — all 177 tests pass locally.
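
A sketch of what the two encode tests above could look like, written against the illustrative struct shown earlier (the test class name and the "gpt-4o" / "Hello" values are assumptions, not the repository's actual fixtures):

```swift
import Foundation
import XCTest

final class CreateModelResponseQueryEncodingTests: XCTestCase {
    func testCreateResponseQueryEncodesPromptCacheKey() throws {
        let query = CreateModelResponseQuery(
            model: "gpt-4o",
            input: "Hello",
            promptCacheKey: "user-1234"
        )
        let data = try JSONEncoder().encode(query)
        let json = try XCTUnwrap(JSONSerialization.jsonObject(with: data) as? [String: Any])
        // The custom CodingKey maps promptCacheKey to the snake_case wire name.
        XCTAssertEqual(json["prompt_cache_key"] as? String, "user-1234")
    }

    func testCreateResponseQueryOmitsPromptCacheKeyWhenNil() throws {
        let query = CreateModelResponseQuery(model: "gpt-4o", input: "Hello")
        let data = try JSONEncoder().encode(query)
        let json = try XCTUnwrap(JSONSerialization.jsonObject(with: data) as? [String: Any])
        // No key at all, rather than an explicit null, keeps the request clean.
        XCTAssertNil(json["prompt_cache_key"])
    }
}
```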

🤖 Generated with Claude Code

OpenAI's Responses API accepts a `prompt_cache_key` parameter that
groups requests for cache-hit-rate reporting and routing. The library
had no way to set it, so users had to fork or work around it.

Add `promptCacheKey: String?` as an optional init parameter, defaulting
to nil so the field is omitted from the request when unset (verified
by a new encode test). Existing call sites are unaffected.

Closes #386

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
@nezhyborets merged commit bede880 into main Apr 30, 2026
3 checks passed
@nezhyborets deleted the claude/issue-386-prompt-cache-key branch April 30, 2026 10:00


Development

Successfully merging this pull request may close these issues.

Add prompt_cache_key to the Responses API Query
