mgrady/cachechat
CacheChat Library Scaffold

Beware: agentically generated (duh).

Overview

CacheChat is a C# library that mediates between software-generated prompts and user responses via an LLM, caching only the initial enhanced prompt for each software prompt.

This scaffold now includes:

  • Core orchestration library (CacheChat)
  • Provider library (CacheChat.Providers) with:
    • in-memory stores
    • default policy/preference resolvers
    • real ILlmProvider implementations for Semantic Kernel and OpenRouter
  • Unit test project (test/CacheChat.Tests)
  • Interactive demo app (test/CacheChat.Demo) simulating software and user sessions

Implemented Behavior

  1. Software prompt enters CacheChat with metadata.
  2. Cache lookup uses pluggable key strategy (v1 default: exact prompt text).
  3. Cache miss triggers LLM enhancement and caches only the enhanced initial prompt.
  4. User responses are evaluated by deterministic rules first.
  5. LLM classifies response into answer, question, error.
  6. answer returns to software.
  7. question generates clarifying prompt (always anchored to original enhanced prompt).
  8. error generates brief error prompt with previous prompt context.
  9. Loop continues until answer or MaxNonAnswerLoops reached.
  10. On max loops, last user response is returned with unresolved flag.
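The loop above can be sketched in miniature. All type and member names below are illustrative, not CacheChat's real API; the stand-in classifier replaces the LLM call so the sketch is self-contained.

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of the mediation loop described in steps 4-10.
enum ResponseKind { Answer, Question, Error }

class LoopSketch
{
    const int MaxNonAnswerLoops = 3;

    // Stand-in for the LLM classifier (step 5): empty responses are errors,
    // responses ending in '?' are questions, anything else is an answer.
    static ResponseKind Classify(string response) =>
        string.IsNullOrWhiteSpace(response) ? ResponseKind.Error
        : response.TrimEnd().EndsWith("?") ? ResponseKind.Question
        : ResponseKind.Answer;

    // Runs the loop over a scripted sequence of user responses and returns
    // (finalResponse, unresolved).
    static (string Response, bool Unresolved) Mediate(IEnumerable<string> userResponses)
    {
        string last = "";
        int loops = 0;
        foreach (var response in userResponses)
        {
            last = response;
            switch (Classify(response))
            {
                case ResponseKind.Answer:
                    return (response, false);   // step 6: answer returns to software
                case ResponseKind.Question:
                case ResponseKind.Error:
                    loops++;                    // steps 7-8: generate a follow-up prompt
                    if (loops >= MaxNonAnswerLoops)
                        return (last, true);    // step 10: unresolved flag
                    break;
            }
        }
        return (last, true);
    }

    static void Main()
    {
        var (result, unresolved) = Mediate(new[] { "What do you mean?", "42" });
        Console.WriteLine($"{result} unresolved={unresolved}");
    }
}
```

In the real library the classifier and the clarifying/error prompt generation are LLM calls through ILlmProvider; the control flow is what this sketch preserves.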

Provider Implementations

  • PassThroughLlmProvider: local heuristic provider for offline/demo use.
  • OpenRouterLlmProvider: real OpenRouter chat completions integration (/chat/completions).
  • SemanticKernelLlmProvider: real Semantic Kernel-backed implementation via IChatCompletionService.
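For orientation, the request body a /chat/completions call carries looks roughly like the following. This only constructs and prints the JSON (no network call); field names follow the OpenAI-style chat schema that OpenRouter accepts, and the model name mirrors the demo's default, but the exact payload OpenRouterLlmProvider builds may differ.

```csharp
using System;
using System.Text.Json;

// Sketch of an OpenAI-style chat completions request body, as sent to
// POST https://openrouter.ai/api/v1/chat/completions.
class OpenRouterRequestSketch
{
    static void Main()
    {
        var body = new
        {
            // Falls back to the demo's default model when the env var is unset.
            model = Environment.GetEnvironmentVariable("OPENROUTER_MODEL")
                    ?? "openai/gpt-4o-mini",
            messages = new[]
            {
                new { role = "system", content = "Classify the user response as answer, question, or error." },
                new { role = "user", content = "What format should the date be in?" }
            }
        };
        Console.WriteLine(JsonSerializer.Serialize(body));
    }
}
```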

Directory Structure

C:\Source\cachechat
|-- CacheChat.sln
|-- NuGet.Config
|-- .gitignore
|-- CACHECHAT_LIBRARY.md
|-- src
|   |-- CacheChat
|   |   |-- CacheChat.csproj
|   |   |-- Abstractions
|   |   |-- Caching
|   |   |-- Domain
|   |   |-- Engine
|   |   `-- Policies
|   `-- CacheChat.Providers
|       |-- CacheChat.Providers.csproj
|       |-- Defaults
|       |-- InMemory
|       `-- Llm
|           |-- PassThroughLlmProvider.cs
|           |-- OpenRouterLlmProvider.cs
|           |-- OpenRouterLlmProviderOptions.cs
|           |-- SemanticKernelLlmProvider.cs
|           |-- SemanticKernelLlmProviderOptions.cs
|           `-- Common
|               |-- LlmPromptTemplates.cs
|               `-- LlmResponseParser.cs
`-- test
    |-- CacheChat.Tests
    |   |-- CacheChat.Tests.csproj
    |   `-- CacheChatEngineContractTests.cs
    `-- CacheChat.Demo
        |-- CacheChat.Demo.csproj
        `-- Program.cs

Required Frameworks and Packages

  • .NET SDK 9.0+
  • Target framework: net9.0
  • C#
  • NuGet packages used:
    • Microsoft.SemanticKernel (provider implementation)
    • xunit, Microsoft.NET.Test.Sdk, coverlet.collector (tests)

Build, Test, and Run

From C:\Source\cachechat:

dotnet restore CacheChat.sln
dotnet build CacheChat.sln
dotnet test test/CacheChat.Tests/CacheChat.Tests.csproj

Run the demo app:

dotnet run --project test/CacheChat.Demo/CacheChat.Demo.csproj

Demo App Notes

  • Supports software prompt input as guided text or JSON.
  • Supports user response input as text or JSON.
  • Simulates full CacheChat loop between software and user.
  • For OpenRouter mode, set env vars or enter interactively:
    • OPENROUTER_API_KEY
    • OPENROUTER_MODEL (optional, default openai/gpt-4o-mini)

Key Modification Points

  • Matching logic: ICacheKeyStrategy
  • Cache backend: IPromptCacheStore
  • Conversation persistence: IConversationStore
  • LLM backend: ILlmProvider
  • Validation rules: IResponseRulePolicy
  • Per-tenant runtime behavior: ITenantPolicyProvider
  • Preference merge logic: IUserPreferenceResolver
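As an example of one of these extension points, here is a sketch of a replacement matching strategy. The interface shape is assumed (CacheChat's real ICacheKeyStrategy may differ); the point is that a normalizing strategy lets trivially different prompts share a cache entry, unlike the v1 default of exact prompt text.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Assumed shape of the matching-logic extension point.
interface ICacheKeyStrategy
{
    string GetKey(string promptText);
}

// Case- and whitespace-insensitive key: normalize the prompt, then hash it
// to produce a compact, fixed-length cache key.
class NormalizedPromptKeyStrategy : ICacheKeyStrategy
{
    public string GetKey(string promptText)
    {
        var normalized = string.Join(' ',
            promptText.Trim().ToLowerInvariant()
                .Split(' ', StringSplitOptions.RemoveEmptyEntries));
        var hash = SHA256.HashData(Encoding.UTF8.GetBytes(normalized));
        return Convert.ToHexString(hash);
    }
}

class KeyStrategyDemo
{
    static void Main()
    {
        var strategy = new NormalizedPromptKeyStrategy();
        var a = strategy.GetKey("  Enter a DATE  ");
        var b = strategy.GetKey("enter a date");
        Console.WriteLine(a == b);  // both normalize to "enter a date"
    }
}
```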

About

LLM-assisted interface layer between software and people.
