CacheChat is a C# library that mediates between software prompts and user responses via an LLM, caching only the initial enhanced prompts.
This scaffold now includes:

- Core orchestration library (`CacheChat`)
- Provider library (`CacheChat.Providers`) with:
  - in-memory stores
  - default policy/preference resolvers
  - real `ILlmProvider` implementations for Semantic Kernel and OpenRouter
- Unit test project (`test/CacheChat.Tests`)
- Interactive demo app (`test/CacheChat.Demo`) simulating software and user sessions
- Software prompt enters CacheChat with metadata.
- Cache lookup uses a pluggable key strategy (v1 default: exact prompt text).
- Cache miss triggers LLM enhancement and caches only the enhanced initial prompt.
- User responses are evaluated by deterministic rules first.
- LLM classifies each response as `answer`, `question`, or `error`:
  - `answer` returns to the software.
  - `question` generates a clarifying prompt (always anchored to the original enhanced prompt).
  - `error` generates a brief error prompt with previous prompt context.
- Loop continues until an answer is received or `MaxNonAnswerLoops` is reached.
- On max loops, the last user response is returned with an unresolved flag.
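The loop above can be sketched in C#. Note that `CacheChatEngine`, `StartAsync`, `SubmitUserResponseAsync`, and the session properties shown here are illustrative assumptions for this sketch, not the scaffold's confirmed API:

```csharp
// Hypothetical sketch of the prompt/response loop; the real engine type,
// method names, and result shape may differ from this scaffold's API.
var engine = new CacheChatEngine(cacheStore, conversationStore, llmProvider, rulePolicy);

// 1. Software prompt enters CacheChat; a cache hit skips LLM enhancement.
var session = await engine.StartAsync(new SoftwarePrompt("Enter the deployment region", metadata));

// 2. Feed user responses until one is classified as an answer
//    or MaxNonAnswerLoops is exceeded.
while (!session.IsComplete)
{
    var userText = Console.ReadLine();
    session = await engine.SubmitUserResponseAsync(session, userText);
}

// 3. On max loops the last response comes back flagged unresolved.
Console.WriteLine(session.Unresolved ? "unresolved" : session.FinalAnswer);
```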
- `PassThroughLlmProvider`: local heuristic provider for offline/demo use.
- `OpenRouterLlmProvider`: real OpenRouter chat completions integration (`/chat/completions`).
- `SemanticKernelLlmProvider`: real Semantic Kernel-backed implementation via `IChatCompletionService`.
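Selecting between providers might look like the following. `OpenRouterLlmProviderOptions` appears in the project tree, but the option property names and constructor shapes below are assumptions:

```csharp
// Pick a real provider when an API key is available, otherwise fall back
// to the offline heuristic provider. Option names are illustrative.
ILlmProvider provider =
    Environment.GetEnvironmentVariable("OPENROUTER_API_KEY") is { Length: > 0 } key
        ? new OpenRouterLlmProvider(new OpenRouterLlmProviderOptions
          {
              ApiKey = key,
              Model = Environment.GetEnvironmentVariable("OPENROUTER_MODEL")
                      ?? "openai/gpt-4o-mini",
          })
        : new PassThroughLlmProvider(); // offline/demo heuristic fallback
```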
C:\Source\cachechat
|-- CacheChat.sln
|-- NuGet.Config
|-- .gitignore
|-- CACHECHAT_LIBRARY.md
|-- src
| |-- CacheChat
| | |-- CacheChat.csproj
| | |-- Abstractions
| | |-- Caching
| | |-- Domain
| | |-- Engine
| | `-- Policies
| `-- CacheChat.Providers
| |-- CacheChat.Providers.csproj
| |-- Defaults
| |-- InMemory
| `-- Llm
| |-- PassThroughLlmProvider.cs
| |-- OpenRouterLlmProvider.cs
| |-- OpenRouterLlmProviderOptions.cs
| |-- SemanticKernelLlmProvider.cs
| |-- SemanticKernelLlmProviderOptions.cs
| `-- Common
| |-- LlmPromptTemplates.cs
| `-- LlmResponseParser.cs
`-- test
|-- CacheChat.Tests
| |-- CacheChat.Tests.csproj
| `-- CacheChatEngineContractTests.cs
`-- CacheChat.Demo
|-- CacheChat.Demo.csproj
`-- Program.cs
- .NET SDK 9.0+
- Target framework: `net9.0`
- Language: C#
- NuGet packages used:
  - `Microsoft.SemanticKernel` (provider implementation)
  - `xunit`, `Microsoft.NET.Test.Sdk`, `coverlet.collector` (tests)
From `C:\Source\cachechat`:

- `dotnet restore CacheChat.sln`
- `dotnet build CacheChat.sln`
- `dotnet test test/CacheChat.Tests/CacheChat.Tests.csproj`

Run the demo app:

- `dotnet run --project test/CacheChat.Demo/CacheChat.Demo.csproj`

The demo app:

- Supports software prompt input as guided text or JSON.
- Supports user response input as text or JSON.
- Simulates the full CacheChat loop between software and user.
- For OpenRouter mode, set env vars or enter interactively:
  - `OPENROUTER_API_KEY`
  - `OPENROUTER_MODEL` (optional, default `openai/gpt-4o-mini`)
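For example, launching the demo in OpenRouter mode from a Windows command prompt might look like this (the key value is a placeholder you must supply):

```
set OPENROUTER_API_KEY=<your-key>
set OPENROUTER_MODEL=openai/gpt-4o-mini
dotnet run --project test/CacheChat.Demo/CacheChat.Demo.csproj
```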
- Matching logic: `ICacheKeyStrategy`
- Cache backend: `IPromptCacheStore`
- Conversation persistence: `IConversationStore`
- LLM backend: `ILlmProvider`
- Validation rules: `IResponseRulePolicy`
- Per-tenant runtime behavior: `ITenantPolicyProvider`
- Preference merge logic: `IUserPreferenceResolver`
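A custom matching strategy could be plugged in roughly as follows. The member signature of `ICacheKeyStrategy` shown here is an assumption (the scaffold only states that the v1 default keys on exact prompt text):

```csharp
using System;

// Hypothetical shape of ICacheKeyStrategy; the real interface may differ.
public interface ICacheKeyStrategy
{
    string GetKey(string promptText);
}

// Example strategy: normalize casing and whitespace so trivially different
// prompts share one cache entry, instead of the v1 exact-text default.
public sealed class NormalizedTextKeyStrategy : ICacheKeyStrategy
{
    public string GetKey(string promptText) =>
        string.Join(' ', promptText.Trim().ToLowerInvariant()
            .Split(' ', StringSplitOptions.RemoveEmptyEntries));
}
```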