[feat] [model] add MiniMax as LLM provider protocol #466

Open
octo-patch wants to merge 1 commit into coze-dev:main from
octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add MiniMax as the 10th LLM provider protocol in the coze-loop model system
  • MiniMax provides an OpenAI-compatible API, so it is integrated via the existing eino-ext openai component with the default base URL https://api.minimax.io/v1
  • Supported models: MiniMax-M2.7, MiniMax-M2.5, MiniMax-M2.5-highspeed (up to 1M-token context)

Changes

Backend (Go)

  • entity/manage.go: Add ProtocolMiniMax constant and ProtocolConfigMiniMax struct
  • eino/init.go: Add miniMaxBuilder function using OpenAI-compatible eino-ext component
  • kitex_gen manage.go: Add ProtocolMinimax constant

IDL

  • manage.thrift: Add protocol_minimax and ProtocolConfigMiniMax struct

Frontend (TypeScript)

  • manage.ts: Add protocol_minimax enum value and ProtocolConfigMiniMax interface

Test plan

  • 10 unit tests for the MiniMax builder (various configs, error cases, model variants)
  • 1 factory integration test (eino_minimax case in TestFactoryImpl_CreateLLM)
  • 3 integration tests against the real MiniMax API (generate, highspeed, stream)
  • All existing tests pass without regression
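
A test matrix like the one above is typically table-driven in Go. A self-contained sketch of the shape such tests take (`validateConfig` is a stand-in for the kind of checks a builder performs, not the real coze-loop test code):

```go
package main

import (
	"errors"
	"fmt"
)

// validateConfig mimics the pre-flight checks a provider builder might
// run before constructing a client; the function is illustrative.
func validateConfig(model, apiKey string) error {
	if model == "" {
		return errors.New("model is required")
	}
	if apiKey == "" {
		return errors.New("api key is required")
	}
	return nil
}

func main() {
	// Table-driven cases covering valid model variants and error paths.
	cases := []struct {
		name, model, apiKey string
		wantErr             bool
	}{
		{"valid M2.5", "MiniMax-M2.5", "sk-test", false},
		{"valid highspeed", "MiniMax-M2.5-highspeed", "sk-test", false},
		{"missing model", "", "sk-test", true},
		{"missing key", "MiniMax-M2.7", "", true},
	}
	for _, c := range cases {
		err := validateConfig(c.model, c.apiKey)
		fmt.Printf("%s: gotErr=%v wantErr=%v\n", c.name, err != nil, c.wantErr)
	}
}
```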

Add MiniMax as the 10th LLM provider in the coze-loop model protocol system.
MiniMax offers an OpenAI-compatible API and is integrated via the existing
eino-ext openai component with the MiniMax default base URL.

Changes:
- Add ProtocolMiniMax constant and ProtocolConfigMiniMax struct in entity
- Add miniMaxBuilder in eino/init.go using openai component with MiniMax defaults
- Update thrift IDL with protocol_minimax and ProtocolConfigMiniMax
- Update kitex_gen constants and frontend TypeScript types
- Add 10 unit tests and 3 integration tests

Supported models: MiniMax-M2.7, MiniMax-M2.5, MiniMax-M2.5-highspeed
@mocayo mocayo changed the title feat(llm): add MiniMax as LLM provider protocol [feat] [model] add MiniMax as LLM provider protocol Mar 23, 2026
