feat: add MiniMax as first-class LLM provider#44

Open
octo-patch wants to merge 1 commit into UnicomAI:main from octo-patch:feature/add-minimax-provider
Conversation

@octo-patch
Summary

  • Add MiniMax as a first-class LLM provider in the model-provider framework
  • Implement mp-minimax package following the existing provider pattern (same as DeepSeek)
  • Register MiniMax in provider constants, model config factory, and model params factory
  • Support chat completions (unary + streaming), tool calling, vision, and thinking tags
  • Clamp temperature to MiniMax's supported [0, 1.0] range
  • Automatically set stream_options.include_usage on streaming requests so token usage is reported
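
The two provider-specific behaviors above can be sketched as follows. This is a hypothetical illustration, not the PR's actual code: the names `clampTemperature` and `streamOptions` are assumptions, and only the clamping range and the `stream_options.include_usage` field come from the description.

```go
package main

import "fmt"

// clampTemperature restricts temperature to MiniMax's [0, 1.0] range
// (narrower than the [0, 2.0] range many OpenAI-compatible APIs accept).
func clampTemperature(t float64) float64 {
	if t < 0 {
		return 0
	}
	if t > 1.0 {
		return 1.0
	}
	return t
}

// streamOptions mirrors the OpenAI-compatible stream_options request field;
// per the PR, it is always included on streaming requests.
type streamOptions struct {
	IncludeUsage bool `json:"include_usage"`
}

func main() {
	fmt.Println(clampTemperature(1.7)) // out-of-range value is clamped to the MiniMax maximum
	fmt.Printf("%+v\n", streamOptions{IncludeUsage: true})
}
```

Clamping silently, rather than returning an error, matches how such providers typically normalize parameters shared across backends.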

Changes

  • pkg/model-provider/mp-minimax/mp-minimax-llm.go - Core MiniMax LLM provider implementing ILLM interface
  • pkg/model-provider/common.go - Added ProviderMiniMax constant
  • pkg/model-provider/model_config_api.go - Registered MiniMax in ToModelTags(), ToModelConfig(), added ProviderModelByMiniMax struct
  • pkg/model-provider/model_params_api.go - Registered MiniMax in ToModelParams(), added AppModelParamsMiniMax struct
  • README.md / README_CN.md - Added MiniMax to supported providers list
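
The registration described above follows a factory pattern: a provider constant plus a dispatch in each factory function. A minimal sketch of that shape, with assumed names and URLs except for `ProviderMiniMax` and https://api.minimax.io which appear in the PR:

```go
package main

import (
	"errors"
	"fmt"
)

// Provider constants; ProviderMiniMax is the one this PR adds.
const (
	ProviderDeepSeek = "deepseek"
	ProviderMiniMax  = "minimax"
)

// ModelConfig is a stand-in for the framework's per-provider config type.
type ModelConfig struct {
	Provider string
	BaseURL  string
}

// ToModelConfig dispatches on the provider constant, the pattern the PR
// extends with a MiniMax case (field names here are assumptions).
func ToModelConfig(provider string) (*ModelConfig, error) {
	switch provider {
	case ProviderDeepSeek:
		return &ModelConfig{Provider: provider, BaseURL: "https://api.deepseek.com"}, nil
	case ProviderMiniMax:
		return &ModelConfig{Provider: provider, BaseURL: "https://api.minimax.io"}, nil
	default:
		return nil, errors.New("unknown provider: " + provider)
	}
}

func main() {
	cfg, err := ToModelConfig(ProviderMiniMax)
	if err != nil {
		panic(err)
	}
	fmt.Println(cfg.BaseURL)
}
```

The same switch-per-factory shape would repeat in ToModelTags() and ToModelParams(), which is why adding a provider touches all three files.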

Test plan

  • 14 unit tests pass: Tags, TemperatureClamping, MaxTokensValidation, StreamOptions, ChatCompletionsUrl, JSONUnmarshal
  • 3 integration tests pass (with MINIMAX_API_KEY): Unary, Stream, ToolCall
  • go build ./... succeeds with no errors

Add MiniMax (https://api.minimax.io) as a dedicated model provider
alongside DeepSeek, Qwen, and the other existing providers. MiniMax offers
an OpenAI-compatible chat-completion API with models such as M2.7 and
M2.7-highspeed (1M context window).

Changes:
- New mp-minimax package with LLM implementation
- Temperature clamping to MiniMax's [0, 1.0] range
- Tool calling and vision support capability tags
- Provider registration in factory pattern (ToModelConfig/ToModelTags/ToModelParams)
- README updates (EN + CN) listing MiniMax as supported provider

Co-Authored-By: octopus <octopus@github.com>