feat(agents): dynamic model list from /v1/llm_models at agent startup #208
Merged
TarasSpashchenko merged 2 commits into main on Mar 9, 2026
Conversation
…ls with Responses API routing
- Add gpt-5.3-codex-2026-02-24, claude-haiku-4-5-20251001, and moonshotai/kimi-k2.5 to model configs
- Split model registry into chat-completions vs responses-api groups (use_responses_api flag)
- Route Responses API models through OpenCode's built-in openai CUSTOM_LOADER (sdk.responses())
- Fix SessionStore to fall back to completed_{sessionId}.json on SSO sync race condition
- Add node:sqlite native driver as primary SQLite query strategy (Node.js 22.5+)
- Add node-sqlite.d.ts type declarations for compile-time safety
- Update test mocks: replace getAllOpenCodeModelConfigs with getChatCompletionsModelConfigs/getResponsesApiModelConfigs
Generated with AI
Co-Authored-By: codemie-ai <codemie.ai@gmail.com>
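The chat-completions vs responses-api split described in the commit bullets could look roughly like this. A minimal TypeScript sketch: the `use_responses_api` flag and the two getter names come from the commit message, but the config shape and actual signatures are assumptions.

```typescript
// Hypothetical OpenCodeModelConfig shape; only the use_responses_api
// flag is taken from the commit message, the rest is illustrative.
interface OpenCodeModelConfig {
  id: string;
  use_responses_api?: boolean;
}

const STATIC_MODEL_CONFIGS: OpenCodeModelConfig[] = [
  { id: "claude-haiku-4-5-20251001" },
  { id: "gpt-5.3-codex-2026-02-24", use_responses_api: true },
  { id: "moonshotai/kimi-k2.5" },
];

// Chat-completions group: every model without the flag.
export function getChatCompletionsModelConfigs(
  source: OpenCodeModelConfig[] = STATIC_MODEL_CONFIGS,
): OpenCodeModelConfig[] {
  return source.filter((m) => !m.use_responses_api);
}

// Responses-API group: models routed through sdk.responses().
export function getResponsesApiModelConfigs(
  source: OpenCodeModelConfig[] = STATIC_MODEL_CONFIGS,
): OpenCodeModelConfig[] {
  return source.filter((m) => m.use_responses_api === true);
}
```

The optional `source` parameter mirrors the backward-compatible default described in the PR: callers that pass nothing still get the static registry.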
…t startup

Instead of relying on the hardcoded OPENCODE_MODEL_CONFIGS, codemie-code and codemie-opencode now call /v1/llm_models?include_all=true at startup and build the model catalogue dynamically.

- Add LlmModel interface + fetchCodeMieLlmModels() to sso.http-client.ts
- Add opencode-dynamic-models.ts with fetchDynamicModelConfigs() and convertApiModelToOpenCodeConfig(); silently falls back to static configs on any auth/network error
- Add RESPONSES_API_MODEL_PATTERNS to detect models that need the OpenAI Responses API when the /v1/llm_models endpoint provides no "mode" field
- Update getChatCompletionsModelConfigs / getResponsesApiModelConfigs to accept an optional source map (backward-compatible default = static config)
- Wire fetchDynamicModelConfigs into beforeRun of both plugins (JWT → SSO → static fallback auth chain)
- Mock opencode-dynamic-models in plugin unit tests to avoid real API calls

Generated with AI
Co-Authored-By: codemie-ai <codemie.ai@gmail.com>
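The fetch-then-fall-back flow in this commit can be sketched as below. Note the real `fetchDynamicModelConfigs(baseUrl, codeMieUrl, jwtToken)` signature differs; here the fetcher is injected so the silent-fallback behavior is visible in isolation.

```typescript
// Illustrative sketch, not the actual implementation: fetch the live
// model list, map it to configs, and on ANY error quietly return the
// compiled-in static configs instead.
interface ModelConfig {
  id: string;
}

export async function fetchDynamicModelConfigs(
  fetchModels: () => Promise<{ name: string }[]>, // e.g. GET /v1/llm_models?include_all=true
  staticConfigs: ModelConfig[],
): Promise<ModelConfig[]> {
  try {
    const models = await fetchModels();
    return models.map((m) => ({ id: m.name }));
  } catch {
    // Silent fallback on any auth/network error, per the commit message.
    return staticConfigs;
  }
}
```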
8nevil8 approved these changes on Mar 9, 2026
codemie-ai approved these changes on Mar 9, 2026
Summary
Replace hardcoded `OPENCODE_MODEL_CONFIGS` with a live model catalogue fetched from the CodeMie API (`/v1/llm_models?include_all=true`) every time `codemie-code` or `codemie-opencode` starts. Static configs remain as a silent fallback on any auth or network error.

Changes
- `sso.http-client.ts` – Add `LlmModel` interface (full model descriptor with `cost` and `features`) and `fetchCodeMieLlmModels()` with cookie/JWT overloads
- `opencode-dynamic-models.ts` (new) – `fetchDynamicModelConfigs()` fetches and converts API models to `OpenCodeModelConfig`; includes `RESPONSES_API_MODEL_PATTERNS` for Responses-API model detection (the `/v1/llm_models` endpoint has no `mode` field); falls back to static config on error
- `opencode-model-configs.ts` – `getChatCompletionsModelConfigs()` and `getResponsesApiModelConfigs()` accept an optional `source` map (backward-compatible; default = static config)
- `opencode.plugin.ts` / `codemie-code.plugin.ts` – `beforeRun` calls `fetchDynamicModelConfigs(baseUrl, codeMieUrl, jwtToken)` and threads the result through model resolution and the chat/responses-api split
- Unit tests mock `opencode-dynamic-models.js` to prevent real API calls

Also on this branch (prior commit):
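Since the `/v1/llm_models` payload carries no `mode` field, the Responses-API detection presumably matches model ids against a pattern list. A sketch of the mechanism; the concrete regexes here are guesses, not the real `RESPONSES_API_MODEL_PATTERNS` entries:

```typescript
// Hypothetical pattern list; the actual patterns are not shown in the
// PR text, so this regex is only an example of the matching mechanism.
const RESPONSES_API_MODEL_PATTERNS: RegExp[] = [
  /codex/i, // e.g. gpt-5.3-codex-2026-02-24
];

export function needsResponsesApi(modelId: string): boolean {
  return RESPONSES_API_MODEL_PATTERNS.some((p) => p.test(modelId));
}
```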
- Add `gpt-5.3-codex-2026-02-24`, `claude-haiku-4-5-20251001`, `moonshotai/kimi-k2.5` to static model configs
- Update `SessionStore` to handle `completed_`-prefixed session files on SSO sync race
- Add `node:sqlite` native driver as primary SQLite strategy + type declarations

Impact
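The `SessionStore` race fix can be pictured as a two-step file lookup: try the normal session file first, then the `completed_`-prefixed variant. Directory layout and file naming below are assumptions; only the fallback order follows the PR text.

```typescript
import { existsSync, readFileSync } from "node:fs";
import { join } from "node:path";

// Sketch of the fallback read: {sessionId}.json first, then
// completed_{sessionId}.json (written during the SSO sync race),
// returning null when neither exists.
export function readSessionFile(dir: string, sessionId: string): string | null {
  for (const name of [`${sessionId}.json`, `completed_${sessionId}.json`]) {
    const file = join(dir, name);
    if (existsSync(file)) {
      return readFileSync(file, "utf8");
    }
  }
  return null;
}
```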
Before: Model list shown to users in OpenCode's model picker was fixed at compile time. New models deployed to the CodeMie platform required a CLI release to appear.
After: Model list is fetched from the live API at startup. Any model enabled in the CodeMie platform immediately appears without a CLI update. Cost/feature metadata (temperature, tool calling, vision) is also sourced from the API.
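The cost/feature mapping mentioned above suggests a small conversion step from the API's `LlmModel` descriptor to an OpenCode config. A hedged sketch of `convertApiModelToOpenCodeConfig`; the field names (`cost`, `features`, `tool_calling`, `vision`) are guesses based on the PR summary, not the real schema:

```typescript
// Assumed LlmModel descriptor shape ("full model descriptor with cost
// and features" per the PR summary); field names are illustrative.
interface LlmModel {
  name: string;
  cost?: { input: number; output: number };
  features?: { tool_calling?: boolean; vision?: boolean };
}

export function convertApiModelToOpenCodeConfig(model: LlmModel) {
  return {
    id: model.name,
    cost: model.cost ?? { input: 0, output: 0 },
    toolCall: model.features?.tool_calling ?? false,
    vision: model.features?.vision ?? false,
  };
}
```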
Checklist