Support multiple models like DeepSeek and OpenAI at the same time in one application #2221
Comments
Agree! Many tools can use multiple models simultaneously to address a single workflow. In Spring AI, however, such tasks can currently only be accomplished through lower-level approaches. It would be highly beneficial for software development if there were a class that could easily switch between multiple sets of configurations. |
Yes, I really need this feature. I do believe this is a real case for many Spring AI applications as well. |
I think you can follow the doc https://docs.spring.io/spring-ai/reference/api/chatclient.html#_create_a_chatclient_programmatically, or create beans for each model. |
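To illustrate the "create beans for each model" suggestion, here is a minimal sketch assuming Spring AI 1.x builder APIs (exact builder method names may differ between versions; the bean names, URLs, and model IDs are illustrative, and DeepSeek is assumed to expose an OpenAI-compatible endpoint):

```java
// Sketch only: one ChatClient bean per provider, assuming Spring AI 1.x APIs.
@Configuration
class MultiModelConfig {

    @Bean
    ChatClient openAiChatClient() {
        OpenAiApi api = OpenAiApi.builder()
                .baseUrl("https://api.openai.com")
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .build();
        OpenAiChatModel model = OpenAiChatModel.builder()
                .openAiApi(api)
                .defaultOptions(OpenAiChatOptions.builder().model("gpt-4o").build())
                .build();
        return ChatClient.create(model);
    }

    @Bean
    ChatClient deepSeekChatClient() {
        // DeepSeek's endpoint is OpenAI-compatible, so the same client
        // classes can simply be pointed at a different base URL.
        OpenAiApi api = OpenAiApi.builder()
                .baseUrl("https://api.deepseek.com")
                .apiKey(System.getenv("DEEPSEEK_API_KEY"))
                .build();
        OpenAiChatModel model = OpenAiChatModel.builder()
                .openAiApi(api)
                .defaultOptions(OpenAiChatOptions.builder().model("deepseek-chat").build())
                .build();
        return ChatClient.create(model);
    }
}
```

Consumers would then inject the client they want by name, e.g. `@Qualifier("deepSeekChatClient") ChatClient client`.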
I use oneAPI as a service, and Spring AI then talks to it through the OpenAI package; in this way, different models can be utilized. I'm not sure if this is correct:

```java
this.chatClient.prompt(
        new Prompt(
            p,
            OpenAiChatOptions.builder()
                .model("xxxx")
                .build()
        )
    )
    .call()
    .content();
```
|
I agree this is a gap. The issue in the example linked to by @FakeTrader is that the underlying chat models in this use case need to point to different OpenAI-API-compatible URLs. I am looking to fix this for RC1, at least for OpenAI, by letting users define multiple client endpoints in application.yml via a map. Also note that there is a slippery slope with models that claim to be OpenAI-API compatible but then add extra fields to the JSON. This is polluting the OpenAI implementation, so I think we need to have a dedicated DeepSeek module. |
As an example of what I'm thinking,
and some basic usage (not chat client based, but you get the idea)
|
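The map-based application.yml idea above might look something like the following. This is a hypothetical layout only: the property names under `connections` are illustrative, not an actual Spring AI configuration schema.

```yaml
# Hypothetical sketch of a multi-endpoint layout; not a real Spring AI schema.
spring:
  ai:
    openai:
      connections:
        gpt4:
          base-url: https://api.openai.com
          api-key: ${OPENAI_API_KEY}
          chat:
            options:
              model: gpt-4o
        deepseek:
          base-url: https://api.deepseek.com
          api-key: ${DEEPSEEK_API_KEY}
          chat:
            options:
              model: deepseek-chat
```

Each named entry would presumably produce its own autoconfigured model bean that applications could inject by qualifier.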
I believe a better approach would be to offer a way to customise the API. The current OpenAiApi is a single class file, and I have to duplicate large sections of code just to modify a single field. |
+1 for this feature. I have done a similar hack with this kind of configuration layout before. |
We have a custom endpoint which acts as an orchestrator for multiple LLMs (including ChatGPT). It follows the OpenAI API spec. Now, if we need to consume multiple models we need to configure different base URLs under the OpenAI spec, and we cannot use the Spring AI configuration as it stands. I was thinking of an array under the openai properties, but the setup @markpollack described above works too (provided instance names are given by developers / are customizable). |
You can refer to: https://docs.spring.io/spring-ai/reference/api/chat/openai-chat.html#_manual_configuration |
I'm trying to collect all the different issues around this. It is true that one can do
but the issue is that OpenAiChatModel doesn't have such a simple constructor as in the docs. Now it is
So let's assume we start with two autoconfigured beans.
Suppose you want to create a GPT-4 and a Llama instance, each with their own endpoint and options:
While there can be declarative means, I think at a programmatic level this would work? Trying to get this into RC1. I'll make a spike and a PR to discuss. |
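The "GPT-4 plus Llama, each with their own endpoint" scenario can be sketched programmatically as below. This assumes Spring AI 1.x builders (method names may vary by version); the Llama base URL is an assumed example of a local OpenAI-compatible server, and the model IDs are illustrative.

```java
// Sketch: two independently configured models, assuming Spring AI 1.x builders.
OpenAiApi gpt4Api = OpenAiApi.builder()
        .baseUrl("https://api.openai.com")
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .build();
OpenAiChatModel gpt4 = OpenAiChatModel.builder()
        .openAiApi(gpt4Api)
        .defaultOptions(OpenAiChatOptions.builder().model("gpt-4o").build())
        .build();

OpenAiApi llamaApi = OpenAiApi.builder()
        .baseUrl("http://localhost:11434/v1")   // e.g. a local OpenAI-compatible server
        .apiKey("unused-locally")
        .build();
OpenAiChatModel llama = OpenAiChatModel.builder()
        .openAiApi(llamaApi)
        .defaultOptions(OpenAiChatOptions.builder().model("llama3").build())
        .build();

// Each model can then back its own ChatClient.
String answer = ChatClient.create(llama)
        .prompt()
        .user("Hello")
        .call()
        .content();
```

The key point is that each `OpenAiApi` instance carries its own base URL and key, so nothing forces the two models to share a single global configuration.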
I've made a WIP PR for people to review: #3037. The flow in the tests is
|
Also note that DeepSeek now has its own model implementation, as it is starting to differ significantly from OpenAI in terms of options. |
See #3037. Closing this issue for now. We should revisit a more comprehensive declarative solution post-GA in another issue. |
Expected Behavior
Spring AI could support both DeepSeek and OpenAI configurations at the same time in one application.
Current Behavior
I failed to find a way to support both models at the same time in Spring AI.
Context
For now, DeepSeek reuses the configuration defined for OpenAI, which creates a problem: if I want to integrate DeepSeek, there is no configuration left for OpenAI. Is there any possibility that Spring AI could support both models at the same time? That way, I could dynamically switch the model from OpenAI to DeepSeek per client request.
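Given two preconfigured models, the per-request switching described above could be sketched like this. The provider field, variable names, and model beans are all hypothetical, assuming both models were already wired up separately:

```java
// Sketch: pick a model per incoming request (names are illustrative).
ChatModel selected = "deepseek".equals(request.getProvider())
        ? deepSeekModel
        : openAiModel;

String reply = ChatClient.create(selected)
        .prompt()
        .user(request.getText())
        .call()
        .content();
```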