feat(agents): add GPT-5.3 Codex model and improve model resolution #184

Closed

amdmax wants to merge 25 commits into main from feat/add-gpt-5.3-codex-model

Conversation

amdmax (Collaborator) commented Mar 4, 2026

  • Add gpt-5.3-codex-2026-02-24 model config with 400K context window
  • Change default model to claude-sonnet-4-6 for both opencode and codemie-code plugins
  • Extract toOpenCodeConfig() utility for reusable model config stripping
  • Ensure fallback-resolved models are injected into provider model list
  • Remove unnecessary save_history flag from chat message payload
  • Fix test mock to include new toOpenCodeConfig export
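
The toOpenCodeConfig() extraction described above can be sketched roughly as follows. This is a minimal illustration, not the actual plugin code: the field names (use_responses_api, family, limit) and the sample output limit are assumptions based on this PR's commit messages.

```typescript
// Hypothetical internal model config shape; field names are illustrative
// guesses from the PR description, not the real codemie-code types.
interface InternalModelConfig {
  id: string;
  name: string;
  limit: { context: number; output: number };
  // Internal routing/grouping flags that OpenCode's config does not need:
  use_responses_api?: boolean;
  family?: string;
}

// Strip the internal-only fields so the remaining object can be written
// straight into the OpenCode provider model list.
function toOpenCodeConfig(model: InternalModelConfig) {
  const { use_responses_api, family, ...openCodeConfig } = model;
  return openCodeConfig;
}

// Sample entry mirroring the 400K-context model this PR adds
// (the output limit here is invented for the example).
const gpt53Codex: InternalModelConfig = {
  id: "gpt-5.3-codex-2026-02-24",
  name: "GPT-5.3 Codex",
  limit: { context: 400_000, output: 128_000 },
  use_responses_api: true,
  family: "gpt",
};
```

Having one shared stripping utility also makes the "inject fallback-resolved models into the provider model list" bullet straightforward: the fallback model goes through the same function before insertion, so it cannot leak internal flags into the emitted config.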

Generated with AI

Checklist

  • Self-reviewed
  • Manual testing performed
  • Documentation updated (if needed)
  • No breaking changes (or clearly documented)

amdmax requested a review from TarasSpashchenko, March 4, 2026 11:25
Maksym Diabin and others added 24 commits March 4, 2026 15:33
Add kimi-k2.5 model config with 262K context/output, reasoning,
structured output, and kimi family defaults for future model variants.

Generated with AI

Co-Authored-By: codemie-ai <codemie.ai@gmail.com>
- Use correct model ID `moonshotai.kimi-k2.5` matching Bedrock naming
- Fix model capabilities (attachment: true, temperature: true)
- Update family default key from `kimi` to `moonshotai` for prefix matching
- Adjust GPT-5.3 Codex knowledge cutoff date
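
The "family default key" fix above hinges on prefix matching: `moonshotai.kimi-k2.5` never starts with `kimi`, so defaults keyed under `kimi` were unreachable. A minimal sketch of that resolution, with invented key and field names:

```typescript
// Illustrative family-defaults table keyed by model-ID prefix; the shape
// { attachment, temperature } comes from this PR's capability bullets, but
// the table structure itself is an assumption for the example.
const familyDefaults: Record<string, { attachment: boolean; temperature: boolean }> = {
  moonshotai: { attachment: true, temperature: true },
};

// Resolve defaults by checking whether the model ID starts with a known
// family key; Bedrock-style IDs like "moonshotai.kimi-k2.5" match the
// vendor prefix, not the model name.
function resolveFamilyDefaults(modelId: string) {
  const key = Object.keys(familyDefaults).find((prefix) =>
    modelId.startsWith(prefix),
  );
  return key ? familyDefaults[key] : undefined;
}
```

With the old `kimi` key, `"moonshotai.kimi-k2.5".startsWith("kimi")` is false and no defaults apply; keying by `moonshotai` makes the prefix match succeed for future model variants in the same family.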

Generated with AI

Co-Authored-By: codemie-ai <codemie.ai@gmail.com>
…ound command (#186)

Co-authored-by: codemie-ai <codemie.ai@gmail.com>
Co-authored-by: Sviatoslav Likhtarchyk <Sviatoslav_Likhtarchyk@epam.com>
Co-authored-by: Maksym Diabin <maksym_diabin@epam.com>
🤖 Generated with release script
- Split models into chatCompletionsModels and responsesApiModels groups
- Responses API models (use_responses_api=true) are excluded from
  codemie-proxy/litellm to prevent Chat Completions routing on model switch
- Always include the openai CUSTOM_LOADER when Responses API models exist,
  regardless of the initial model selected at startup
- Name the openai provider 'CodeMie SSO' so all models appear under the
  same group in OpenCode's model picker UI
- Add getChatCompletionsModelConfigs() and getResponsesApiModelConfigs()
  helpers to opencode-model-configs.ts
- Add debug logging for Responses API decision path
- Update test mocks to include new model config functions
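
The split described in this commit can be sketched as a simple partition over the use_responses_api flag. This is a hedged illustration of the grouping logic only; the function and type names are assumptions, not the actual getChatCompletionsModelConfigs()/getResponsesApiModelConfigs() implementations.

```typescript
// Minimal model shape for the example; only the flag matters here.
interface ModelConfig {
  id: string;
  use_responses_api?: boolean;
}

// Partition one model list into the two groups the commit describes:
// Chat Completions models are routed via codemie-proxy/litellm, while
// Responses API models are kept out of it and served by the openai
// CUSTOM_LOADER so a model switch cannot fall back to Chat Completions.
function partitionModels(models: ModelConfig[]) {
  const responsesApiModels = models.filter((m) => m.use_responses_api === true);
  const chatCompletionsModels = models.filter((m) => !m.use_responses_api);
  return { chatCompletionsModels, responsesApiModels };
}
```

Because the openai loader is registered whenever `responsesApiModels` is non-empty (not only when a Responses API model is the startup selection), switching to one of these models mid-session still finds a loader to route through.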

Generated with AI

Co-Authored-By: codemie-ai <codemie.ai@gmail.com>
All GPT models now use use_responses_api=true, routing them through
the OpenAI CUSTOM_LOADER (sdk.responses() → /v1/responses) and
displaying under the 'CodeMie SSO' provider group in OpenCode UI.

Models updated: gpt-5-2-2025-12-11, gpt-5.1-codex, gpt-5.1-codex-mini,
gpt-5.1-codex-max, gpt-5.2-chat (gpt-5.3-codex-2026-02-24 already set)
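
The routing effect of flipping use_responses_api can be shown with a tiny endpoint-selection sketch. The function name is hypothetical; the two endpoint paths mirror the commit message (`/v1/responses` via sdk.responses(), Chat Completions otherwise).

```typescript
// Hypothetical routing decision: models flagged use_responses_api go to
// the Responses API endpoint through the OpenAI loader; everything else
// goes through Chat Completions.
function endpointFor(model: { id: string; use_responses_api?: boolean }): string {
  return model.use_responses_api ? "/v1/responses" : "/v1/chat/completions";
}
```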

Generated with AI

Co-Authored-By: codemie-ai <codemie.ai@gmail.com>
8nevil8 closed this Mar 9, 2026