fix: remove overly broad "codex" keyword from openai_codex provider
The bare keyword "codex" causes false positive matches when any model name happens to contain "codex" (e.g. "gpt-5.3-codex" on a custom provider). This incorrectly routes the request through the OAuth-based OpenAI Codex provider, producing "OAuth credentials not found" errors even when a valid custom api_key and api_base are configured.

Keep only the explicit "openai-codex" keyword so that auto-detection requires the canonical prefix. Users can still set provider: "custom" to force the custom endpoint, but auto-detection should not collide.

Closes #1311
@@ -201,7 +201,7 @@ PROVIDERS: tuple[ProviderSpec, ...] = (
     # OpenAI Codex: uses OAuth, not API key.
     ProviderSpec(
         name="openai_codex",
-        keywords=("openai-codex", "codex"),
+        keywords=("openai-codex",),
         env_key="",  # OAuth-based, no API key
         display_name="OpenAI Codex",
         litellm_prefix="",  # Not routed through LiteLLM
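The collision the commit fixes can be sketched as follows. This is a hypothetical reconstruction of keyword-based auto-detection (the `detect_provider` helper and the trimmed-down `ProviderSpec` are assumptions, not the project's actual code); it shows why a bare substring keyword like "codex" would misroute custom model names:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProviderSpec:
    name: str
    keywords: tuple[str, ...]

# Mirrors the diff: only the canonical "openai-codex" prefix remains.
PROVIDERS = (
    ProviderSpec(name="openai_codex", keywords=("openai-codex",)),
)

def detect_provider(model: str) -> str:
    """Return the first provider whose keyword occurs in the model name,
    falling back to the custom endpoint when nothing matches."""
    for spec in PROVIDERS:
        if any(kw in model for kw in spec.keywords):
            return spec.name
    return "custom"

# A custom model name that merely contains "codex" no longer routes
# through the OAuth-based provider:
print(detect_provider("gpt-5.3-codex"))    # → custom
print(detect_provider("openai-codex/gpt"))  # → openai_codex
```

Had `keywords` still contained the bare `"codex"`, the first call would have returned `openai_codex` and triggered the OAuth credential lookup despite a valid custom `api_key`/`api_base`.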