
feat: add MiniMax as first-class LLM provider #155

Open
octo-patch wants to merge 1 commit into OpenBMB:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add MiniMax AI as a new LLM provider alongside OpenAI and Azure OpenAI
  • MiniMax offers an OpenAI-compatible API with models: MiniMax-M2.7 (1M context), MiniMax-M2.5, and MiniMax-M2.5-highspeed (204K context)
  • New MiniMaxChat class extending BaseChatModel with temperature clamping, think-tag stripping, and cost tracking
  • Registered minimax, MiniMax-M2.7, MiniMax-M2.5, MiniMax-M2.5-highspeed in llm_registry
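The two text-handling behaviors named above can be sketched as standalone helpers. This is an illustrative sketch, not the code from agentverse/llms/minimax.py: the function names and the [0.01, 1.0] clamping bounds are assumptions for the example.

```python
import re

# Hypothetical helpers illustrating the behaviors described above;
# the real MiniMaxChat implementation may differ.
THINK_TAG_RE = re.compile(r"<think>.*?</think>", re.DOTALL)

def strip_think_tags(text: str) -> str:
    """Remove <think>...</think> reasoning blocks from a model reply."""
    return THINK_TAG_RE.sub("", text).strip()

def clamp_temperature(temperature: float, low: float = 0.01, high: float = 1.0) -> float:
    """Clamp temperature into an accepted range (bounds assumed for illustration)."""
    return max(low, min(high, temperature))
```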

Changes

File                               Description
agentverse/llms/minimax.py         New MiniMax LLM provider module
agentverse/llms/__init__.py        Import MiniMaxChat
README.md                          MiniMax environment variables and config docs
tests/test_minimax_unit.py         43 unit tests
tests/test_minimax_integration.py  3 integration tests

Usage

export MINIMAX_API_KEY="your_key"
# In task config YAML
llm_type: minimax
model: MiniMax-M2.7
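How `llm_type: minimax` resolves to a provider class can be sketched with a decorator-style registry. This is an assumption for illustration: the actual `llm_registry` in agentverse may have a different shape.

```python
# Hypothetical registry sketch; the real llm_registry may differ.
class LLMRegistry:
    def __init__(self):
        self._registry = {}

    def register(self, *names):
        """Register a class under one or more lookup names."""
        def decorator(cls):
            for name in names:
                self._registry[name] = cls
            return cls
        return decorator

    def build(self, name, **kwargs):
        """Instantiate the class registered under `name`."""
        return self._registry[name](**kwargs)

llm_registry = LLMRegistry()

@llm_registry.register("minimax", "MiniMax-M2.7", "MiniMax-M2.5", "MiniMax-M2.5-highspeed")
class MiniMaxChat:
    def __init__(self, model="MiniMax-M2.7", **kwargs):
        self.model = model
```

With this wiring, a config's `llm_type` or `model` string is enough to construct the provider: `llm_registry.build("minimax")`.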

Test plan

  • 43 unit tests passing (mocked API, registry, cost tracking, think-tag stripping, temperature clamping)
  • 3 integration tests passing (basic generation, history, async)
  • Verify existing OpenAI tests still pass
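The cost tracking covered by the unit tests can be sketched as a per-token price lookup. The prices below are placeholders, not MiniMax's actual pricing, and the table/function names are assumptions for the example.

```python
# Hypothetical cost-tracking sketch; prices are illustrative placeholders.
PRICE_PER_1K_TOKENS = {
    "MiniMax-M2.7": {"prompt": 0.001, "completion": 0.002},
}

def estimate_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Return the USD cost of one call, given token counts and per-1K prices."""
    price = PRICE_PER_1K_TOKENS[model]
    return (prompt_tokens * price["prompt"]
            + completion_tokens * price["completion"]) / 1000.0
```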

Add MiniMax AI (https://www.minimax.io/) as a new LLM provider alongside
OpenAI and Azure OpenAI. MiniMax offers an OpenAI-compatible API with
models MiniMax-M2.7 (1M context), MiniMax-M2.5, and MiniMax-M2.5-highspeed
(204K context).

Changes:
- New agentverse/llms/minimax.py: MiniMaxChat class extending BaseChatModel
  with temperature clamping, think-tag stripping, and cost tracking
- Register minimax/MiniMax-M2.7/M2.5/M2.5-highspeed in llm_registry
- Update __init__.py to import MiniMaxChat
- Add MiniMax environment variables and config docs to README.md
- Add 43 unit tests and 3 integration tests

Usage:
  export MINIMAX_API_KEY="your_key"
  # In config YAML:
  llm_type: minimax
  model: MiniMax-M2.7