openai api behind Oauth2? #2425
Replies: 3 comments
You can do this without subclassing the SDK client by passing a custom `httpx.Auth` implementation. Here's a clean approach:

```python
import httpx
from openai import OpenAI


class OAuth2Auth(httpx.Auth):
    def __init__(self, token_url, client_id, client_secret):
        self.token_url = token_url
        self.client_id = client_id
        self.client_secret = client_secret
        self._token = None

    def auth_flow(self, request):
        if not self._token:
            self._fetch_token()
        request.headers["Authorization"] = f"Bearer {self._token}"
        yield request

    def _fetch_token(self):
        resp = httpx.post(
            self.token_url,
            data={
                "grant_type": "client_credentials",
                "client_id": self.client_id,
                "client_secret": self.client_secret,
            },
        )
        resp.raise_for_status()
        self._token = resp.json()["access_token"]


auth = OAuth2Auth(
    token_url="https://your-idp.com/oauth/token",
    client_id="your-client-id",
    client_secret="your-client-secret",
)

http_client = httpx.Client(auth=auth)

client = OpenAI(
    base_url="https://your-proxy-endpoint.com/v1",
    api_key="unused",  # set to any string; your proxy handles real auth
    http_client=http_client,
)

# Now use it normally
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Hello"}],
)
```

Key points:
If your proxy expects the OpenAI API key in addition to the OAuth2 token, keep `api_key` set to the real key rather than a placeholder. Note that `auth_flow` above overwrites the `Authorization` header, so in that case the OAuth token would need to travel in a separate header your proxy reads.
Using the OpenAI client with a custom OAuth2 proxy

You can achieve this with Approach 1: Custom
You don't actually need to subclass anything. An `httpx` request event hook plus a small token manager does the job:

```python
import time
from typing import Optional

import httpx
from openai import OpenAI


class OAuthTokenManager:
    """Caches the bearer token and refreshes it on demand."""

    def __init__(self, token_endpoint: str, client_id: str, client_secret: str):
        self._endpoint = token_endpoint
        self._client_id = client_id
        self._client_secret = client_secret
        self._token: Optional[str] = None
        self._expires_at: float = 0

    def get(self) -> str:
        if self._token and time.time() < self._expires_at - 30:
            return self._token
        # Client-credentials flow shown — adapt to your IdP
        r = httpx.post(self._endpoint, data={
            "grant_type": "client_credentials",
            "client_id": self._client_id,
            "client_secret": self._client_secret,
        })
        r.raise_for_status()
        body = r.json()
        self._token = body["access_token"]
        self._expires_at = time.time() + body.get("expires_in", 3600)
        return self._token


tokens = OAuthTokenManager(
    token_endpoint="https://idp.example.com/oauth2/token",
    client_id="...",
    client_secret="...",
)


def add_oauth_header(request: httpx.Request) -> None:
    request.headers["Authorization"] = f"Bearer {tokens.get()}"


http_client = httpx.Client(
    event_hooks={"request": [add_oauth_header]},
    timeout=httpx.Timeout(60.0, connect=10.0),
)

client = OpenAI(
    base_url="https://your-oauth-proxy.example.com/v1",
    api_key="placeholder",  # required by the SDK; your proxy ignores it
    http_client=http_client,
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "hi"}],
)
```

Notes:
Ref: https://github.com/openai/openai-python#configuring-the-http-client |
I'm working on a project where we have OpenAI protected behind an OAuth2 endpoint, which works like a proxy: it receives API calls, we authenticate with OAuth2, and the requests are forwarded to (I believe) Azure OpenAI.
I would like to use the official OpenAI Python client instead of writing my own httpx wrapper around the API. So I basically need to override the OpenAI class and implement my own authentication flow.
Can someone tell me how to do this? Which methods should I override? Are there examples of this I can use as a reference? All help would be appreciated.