
feat(agents): dynamic model list from /v1/llm_models at agent startup#208

Merged
TarasSpashchenko merged 2 commits into main from feat/add-new-models-gpt-kimi
Mar 9, 2026

Conversation

@TarasSpashchenko
Collaborator

Summary

Replace hardcoded OPENCODE_MODEL_CONFIGS with a live model catalogue fetched from the CodeMie API (/v1/llm_models?include_all=true) every time codemie-code or codemie-opencode starts. Static configs remain as a silent fallback on any auth or network error.

Changes

  • sso.http-client.ts – Add LlmModel interface (full model descriptor with cost and features) and fetchCodeMieLlmModels() with cookie/JWT overloads
  • opencode-dynamic-models.ts (new) – fetchDynamicModelConfigs() fetches and converts API models to OpenCodeModelConfig; includes RESPONSES_API_MODEL_PATTERNS for Responses-API model detection (the /v1/llm_models endpoint has no mode field); falls back to static config on error
  • opencode-model-configs.ts – getChatCompletionsModelConfigs() and getResponsesApiModelConfigs() accept an optional source map (backward-compatible; default = static config)
  • opencode.plugin.ts / codemie-code.plugin.ts – beforeRun calls fetchDynamicModelConfigs(baseUrl, codeMieUrl, jwtToken) and threads the result through model resolution and the chat/responses-api split
  • Plugin tests – Mock opencode-dynamic-models.js to prevent real API calls in unit tests
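
A rough sketch of how the fetch-with-fallback could fit together. The exported names (fetchDynamicModelConfigs, convertApiModelToOpenCodeConfig, RESPONSES_API_MODEL_PATTERNS) come from this PR; the LlmModel fields, the regex patterns, and the config shape are assumptions for illustration only:

```typescript
// Illustrative shapes only — the real LlmModel/OpenCodeModelConfig fields
// are defined in sso.http-client.ts and opencode-model-configs.ts.
interface LlmModel {
  id: string;                 // e.g. "gpt-5.3-codex-2026-02-24"
  supports_tools?: boolean;
  supports_vision?: boolean;
}

interface OpenCodeModelConfig {
  id: string;
  useResponsesApi: boolean;
  toolCall: boolean;
  vision: boolean;
}

// The /v1/llm_models endpoint has no "mode" field, so Responses-API models
// are detected by id pattern (patterns shown here are hypothetical).
const RESPONSES_API_MODEL_PATTERNS: RegExp[] = [/codex/i, /^o\d/i];

function needsResponsesApi(modelId: string): boolean {
  return RESPONSES_API_MODEL_PATTERNS.some((p) => p.test(modelId));
}

function convertApiModelToOpenCodeConfig(m: LlmModel): OpenCodeModelConfig {
  return {
    id: m.id,
    useResponsesApi: needsResponsesApi(m.id),
    toolCall: m.supports_tools ?? true,
    vision: m.supports_vision ?? false,
  };
}

async function fetchDynamicModelConfigs(
  codeMieUrl: string,
  jwtToken: string,
  staticFallback: OpenCodeModelConfig[],
): Promise<OpenCodeModelConfig[]> {
  try {
    const res = await fetch(`${codeMieUrl}/v1/llm_models?include_all=true`, {
      headers: { Authorization: `Bearer ${jwtToken}` },
    });
    if (!res.ok) return staticFallback;        // auth error → silent fallback
    const models = (await res.json()) as LlmModel[];
    return models.map(convertApiModelToOpenCodeConfig);
  } catch {
    return staticFallback;                     // network error → silent fallback
  }
}
```

The key property is that every failure path returns the static configs, so the CLI never starts with an empty model picker.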

Also on this branch (prior commit):

  • Add gpt-5.3-codex-2026-02-24, claude-haiku-4-5-20251001, moonshotai.kimi-k2.5 to static model configs
  • Fix SessionStore to handle completed_ prefixed session files on SSO sync race
  • Add node:sqlite native driver as primary SQLite strategy + type declarations

Impact

Before: Model list shown to users in OpenCode's model picker was fixed at compile time. New models deployed to the CodeMie platform required a CLI release to appear.

After: Model list is fetched from the live API at startup. Any model enabled in the CodeMie platform immediately appears without a CLI update. Cost/feature metadata (temperature, tool calling, vision) is also sourced from the API.

Checklist

  • Self-reviewed
  • Manual testing performed
  • Documentation updated (if needed)
  • No breaking changes (or clearly documented)

TarasSpashchenko and others added 2 commits March 9, 2026 17:27
…ls with Responses API routing

- Add gpt-5.3-codex-2026-02-24, claude-haiku-4-5-20251001, and moonshotai/kimi-k2.5 to model configs
- Split model registry into chat-completions vs responses-api groups (use_responses_api flag)
- Route Responses API models through OpenCode's built-in openai CUSTOM_LOADER (sdk.responses())
- Fix SessionStore to fall back to completed_{sessionId}.json on SSO sync race condition
- Add node:sqlite native driver as primary SQLite query strategy (Node.js 22.5+)
- Add node-sqlite.d.ts type declarations for compile-time safety
- Update test mocks: replace getAllOpenCodeModelConfigs with getChatCompletionsModelConfigs/getResponsesApiModelConfigs

Generated with AI

Co-Authored-By: codemie-ai <codemie.ai@gmail.com>
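
The SessionStore fallback described in this commit could be sketched like this (readSessionFile and the on-disk layout are hypothetical; only the completed_{sessionId}.json naming comes from the commit message):

```typescript
// Hypothetical sketch: if the live session file is missing (SSO sync race),
// fall back to the completed_-prefixed file before giving up.
import * as fs from "node:fs";
import * as path from "node:path";

function readSessionFile(dir: string, sessionId: string): string | null {
  const candidates = [
    path.join(dir, `${sessionId}.json`),
    path.join(dir, `completed_${sessionId}.json`), // fallback on sync race
  ];
  for (const file of candidates) {
    if (fs.existsSync(file)) return fs.readFileSync(file, "utf8");
  }
  return null;
}
```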
…t startup

Instead of relying on the hardcoded OPENCODE_MODEL_CONFIGS, codemie-code and
codemie-opencode now call /v1/llm_models?include_all=true at startup and build
the model catalogue dynamically.

- Add LlmModel interface + fetchCodeMieLlmModels() to sso.http-client.ts
- Add opencode-dynamic-models.ts with fetchDynamicModelConfigs() and
  convertApiModelToOpenCodeConfig(); silently falls back to static configs
  on any auth/network error
- Add RESPONSES_API_MODEL_PATTERNS to detect models that need the OpenAI
  Responses API when the /v1/llm_models endpoint provides no "mode" field
- Update getChatCompletionsModelConfigs / getResponsesApiModelConfigs to
  accept an optional source map (backward-compatible default = static config)
- Wire fetchDynamicModelConfigs into beforeRun of both plugins (JWT → SSO →
  static fallback auth chain)
- Mock opencode-dynamic-models in plugin unit tests to avoid real API calls

Generated with AI

Co-Authored-By: codemie-ai <codemie.ai@gmail.com>
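
The JWT → SSO → static-fallback chain mentioned above might reduce to a small helper like this (resolveAuthHeaders is a hypothetical stand-in; the PR's actual wiring lives in the plugins' beforeRun hooks):

```typescript
// Hypothetical helper for the auth chain: prefer an explicit JWT, then an SSO
// cookie; a null result tells the caller to skip the API call entirely and
// use the static model configs.
type AuthHeaders = Record<string, string>;

function resolveAuthHeaders(
  jwtToken: string | undefined,
  ssoCookie: string | undefined,
): AuthHeaders | null {
  if (jwtToken) return { Authorization: `Bearer ${jwtToken}` };
  if (ssoCookie) return { Cookie: ssoCookie };
  return null; // no credentials → caller falls back to static configs
}
```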
@TarasSpashchenko TarasSpashchenko requested a review from 8nevil8 March 9, 2026 17:26
@TarasSpashchenko TarasSpashchenko merged commit 7add395 into main Mar 9, 2026
5 checks passed
@TarasSpashchenko TarasSpashchenko deleted the feat/add-new-models-gpt-kimi branch March 9, 2026 18:13
