The bare keyword "codex" causes false-positive matches whenever a model
name happens to contain "codex" (e.g. "gpt-5.3-codex" on a custom
provider). This incorrectly routes the request through the OAuth-based
OpenAI Codex provider, producing "OAuth credentials not found" errors
even when a valid custom api_key and api_base are configured.
Keep only the explicit "openai-codex" keyword so that auto-detection
requires the canonical prefix. Users can still set provider: "custom"
to force the custom endpoint, but auto-detection should not collide.
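A minimal sketch of the intended behavior (function and provider names
here are illustrative, not the project's actual identifiers):

```python
def detect_provider(model: str) -> str:
    """Hypothetical auto-detection: only the canonical "openai-codex"
    prefix routes to the OAuth-based Codex provider; a bare "codex"
    substring (e.g. "gpt-5.3-codex") no longer matches."""
    if model.startswith("openai-codex"):
        return "openai-codex"  # OAuth-based OpenAI Codex provider
    # Everything else falls through to the configured custom endpoint
    # (api_key / api_base), or whatever provider the user set explicitly.
    return "custom"
```

Under this scheme "gpt-5.3-codex" resolves to the custom provider
instead of colliding with the Codex OAuth path.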
Closes #1311