.NET: Implement Microsoft.Agents.AI.FoundryLocal package for C# #5170
Open
DavidLuong98 wants to merge 6 commits into microsoft:main from
Conversation
…l inference

Adds AI Foundry Local support to the .NET Agent Framework, enabling on-device model inference via the Microsoft.AI.Foundry.Local SDK. The package creates an OpenAI-compatible IChatClient pointed at the Foundry Local HTTP endpoint (localhost:5272), following the same pattern as the Python foundry_local package.

New files:
- FoundryLocalChatClient: DelegatingChatClient with async CreateAsync factory
- FoundryLocalClientOptions: Configuration (model, bootstrap, web service)
- FoundryLocalChatClientExtensions: .AsAIAgent() extension methods
- Unit tests (12 passing) and smoke test
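The wiring described above — the official OpenAI client pointed at Foundry Local's OpenAI-compatible endpoint — might look like the following sketch. This is illustrative, not the PR's actual code; the model alias is a placeholder, and the `AsIChatClient()` bridge from the Microsoft.Extensions.AI.OpenAI package is assumed to be available in the referenced versions.

```csharp
using System;
using System.ClientModel;
using Microsoft.Extensions.AI;
using OpenAI;

// Sketch only: point the official OpenAI NuGet client at Foundry Local's
// OpenAI-compatible endpoint. The API key is ignored by the local service.
var openAiClient = new OpenAIClient(
    new ApiKeyCredential("unused"),
    new OpenAIClientOptions { Endpoint = new Uri("http://localhost:5272/v1") });

// "phi-3.5-mini" is a placeholder model alias, not a name from the PR.
IChatClient chatClient = openAiClient
    .GetChatClient("phi-3.5-mini")
    .AsIChatClient(); // bridge to Microsoft.Extensions.AI.IChatClient

var response = await chatClient.GetResponseAsync("Say hello in one word.");
Console.WriteLine(response.Text);
```

This is the same shape the Python foundry_local package uses: the SDK manages the model lifecycle, while the chat traffic goes through a standard OpenAI client against the local HTTP endpoint.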
Contributor
Pull request overview
Note
Copilot was unable to run its full agentic suite in this review.
Adds a new .NET package (Microsoft.Agents.AI.FoundryLocal) to enable AI Foundry Local on-device inference by managing models via Microsoft.AI.Foundry.Local while using the official OpenAI client against the local OpenAI-compatible HTTP endpoint.
Changes:
- Introduces FoundryLocalChatClient, FoundryLocalClientOptions, and FoundryLocalChatClientExtensions for Foundry Local-backed IChatClient / agent creation.
- Adds unit tests for options validation and basic factory/extension argument checks.
- Adds a standalone smoke test console project and wires the new package into the .NET solution + dependency versions.
Reviewed changes
Copilot reviewed 12 out of 12 changed files in this pull request and generated 6 comments.
Show a summary per file
| File | Description |
|---|---|
| dotnet/src/Microsoft.Agents.AI.FoundryLocal/Microsoft.Agents.AI.FoundryLocal.csproj | New package project targeting net9.0 and referencing Foundry Local + OpenAI dependencies |
| dotnet/src/Microsoft.Agents.AI.FoundryLocal/FoundryLocalClientOptions.cs | Adds options object with env-var model resolution |
| dotnet/src/Microsoft.Agents.AI.FoundryLocal/FoundryLocalChatClient.cs | Implements async factory that bootstraps manager, resolves/downloads/loads model, starts service, and builds OpenAI chat client |
| dotnet/src/Microsoft.Agents.AI.FoundryLocal/FoundryLocalChatClientExtensions.cs | Adds .AsAIAgent() convenience extensions for FoundryLocalChatClient |
| dotnet/tests/Microsoft.Agents.AI.FoundryLocal.UnitTests/Microsoft.Agents.AI.FoundryLocal.UnitTests.csproj | Adds new unit test project for FoundryLocal package |
| dotnet/tests/Microsoft.Agents.AI.FoundryLocal.UnitTests/FoundryLocalClientOptionsTests.cs | Unit tests for model resolution and default option values |
| dotnet/tests/Microsoft.Agents.AI.FoundryLocal.UnitTests/FoundryLocalChatClientTests.cs | Unit tests for CreateAsync argument/option validation |
| dotnet/tests/Microsoft.Agents.AI.FoundryLocal.UnitTests/FoundryLocalChatClientExtensionsTests.cs | Unit tests for extension method null argument behavior |
| dotnet/tests/FoundryLocal.SmokeTest/FoundryLocal.SmokeTest.csproj | Adds a non-test executable project for manual local integration verification |
| dotnet/tests/FoundryLocal.SmokeTest/Program.cs | Smoke test that creates a client/agent and runs a prompt against a local model |
| dotnet/agent-framework-dotnet.slnx | Adds the new FoundryLocal project to the solution |
| dotnet/Directory.Packages.props | Pins Microsoft.AI.Foundry.Local dependency version (0.9.0) |
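The options tests listed above mutate process environment variables, and a later commit message notes they are serialized with an xUnit collection attribute. A hypothetical sketch of such a test follows; the options property name (`Model`) and the collection name are assumptions, not taken from the PR.

```csharp
using System;
using Xunit;

// Hypothetical sketch of an env-var resolution test. The property name
// ("Model") and the collection name are assumptions for illustration.
[Collection("EnvironmentVariables")] // serialize tests that mutate env vars
public class FoundryLocalClientOptionsEnvVarTests
{
    [Fact]
    public void ModelFallsBackToFoundryLocalModelEnvVar()
    {
        Environment.SetEnvironmentVariable("FOUNDRY_LOCAL_MODEL", "phi-3.5-mini");
        try
        {
            var options = new FoundryLocalClientOptions();
            Assert.Equal("phi-3.5-mini", options.Model);
        }
        finally
        {
            // Always restore state so other tests see a clean environment.
            Environment.SetEnvironmentVariable("FOUNDRY_LOCAL_MODEL", null);
        }
    }
}
```

Putting every env-var-mutating test in one xUnit collection prevents parallel test runs from racing on the shared process environment.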
Review threads (resolved):
- dotnet/tests/Microsoft.Agents.AI.FoundryLocal.UnitTests/FoundryLocalClientOptionsTests.cs
- ...et/tests/Microsoft.Agents.AI.FoundryLocal.UnitTests/FoundryLocalChatClientExtensionsTests.cs (outdated)
- dotnet/src/Microsoft.Agents.AI.FoundryLocal/FoundryLocalChatClient.cs (outdated)
- dotnet/src/Microsoft.Agents.AI.FoundryLocal/Microsoft.Agents.AI.FoundryLocal.csproj (outdated)
Add explicit guard for uninitialized FoundryLocalManager and serialize env-var-mutating tests with xUnit collection attribute. Co-Authored-By: David Luong <davidluong98@gmail.com>
Motivation and Context
The existing Microsoft.Agents.AI.Foundry package only supports cloud-based Azure AI Foundry; there is no .NET support for AI Foundry Local, Microsoft's on-device model inference product that runs models locally via an OpenAI-compatible HTTP API. A Python implementation already exists in this repo (python/packages/foundry_local/), but C# developers have no equivalent. This PR adds that parity.

Description
Adds a new Microsoft.Agents.AI.FoundryLocal package that enables on-device model inference through the Microsoft.AI.Foundry.Local SDK (v0.9.0).

Architecture: The Foundry Local SDK internally uses Betalgo.Ranul.OpenAI, which is incompatible with Microsoft.Extensions.AI.IChatClient. To solve this, we use the SDK for model management (catalog, download, load, web service lifecycle) but create the chat client by pointing the official OpenAI NuGet package at the SDK's local HTTP endpoint (localhost:5272/v1), the same approach the Python implementation uses.

Key classes:
- FoundryLocalChatClient - DelegatingChatClient with an async CreateAsync factory that handles manager bootstrapping, model download/load, and web service startup
- FoundryLocalClientOptions - Configuration (model alias, bootstrap, prepare model, web service URL, env var support via FOUNDRY_LOCAL_MODEL)
- FoundryLocalChatClientExtensions - .AsAIAgent() extension methods matching the OpenAIChatClientExtensions pattern

Design decisions:
- No dependency on Azure.AI.Projects or Azure.Identity
- Targets net9.0 only, matching the Microsoft.AI.Foundry.Local SDK's target framework
- Async factory (CreateAsync) because SDK initialization requires async operations (model download, load, web service start)
- Local HTTP endpoint (http://localhost:5272) always configured, matching the Python SDK's default endpoint

Contribution Checklist
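Given the key classes and design decisions described in this PR, end-to-end usage might look like the following sketch. CreateAsync and AsAIAgent are named in the PR; the options property name, the instructions parameter, and the shape of the agent call are assumptions for illustration.

```csharp
using System;
using Microsoft.Agents.AI;

// Sketch: bootstrap Foundry Local, obtain a chat client, wrap it as an agent.
// FoundryLocalClientOptions.Model and the RunAsync result shape are assumed.
var options = new FoundryLocalClientOptions
{
    Model = "phi-3.5-mini", // placeholder alias; FOUNDRY_LOCAL_MODEL also works
};

// The async factory downloads/loads the model and starts the web service
// before any chat traffic flows, which is why there is no sync constructor.
await using var chatClient = await FoundryLocalChatClient.CreateAsync(options);

AIAgent agent = chatClient.AsAIAgent(instructions: "You are a concise assistant.");

var response = await agent.RunAsync("What is 2 + 2?");
Console.WriteLine(response);
```

The async factory is the notable design point here: because the first call may download a multi-gigabyte model and start a local web service, surfacing that work in CreateAsync keeps it awaitable and cancellable rather than hiding it in a blocking constructor.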