
feat: add MiniMax as first-class LLM provider#112

Open
octo-patch wants to merge 1 commit into 79E:master from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a first-class LLM provider alongside OpenAI. MiniMax offers OpenAI-compatible API endpoints with models featuring up to 1M token context windows.

Changes

  • Backend model utilities (server/utils/modelUtils.ts): Model classification, temperature clamping (0-1), penalty parameter removal, GPTTokens fallback
  • Chat helper (server/helpers/chat/index.ts): Automatic option clamping before API calls
  • Chat router (server/routers/apis/chat.ts): Safe token counting, premium billing routing, VIP access gating
  • Key validation (server/helpers/keyUsage/index.ts): MiniMax API key health check
  • Frontend defaults (src/store/config/slice.ts): MiniMax-M2.7 and MiniMax-M2.7-highspeed in the model dropdown
  • Config UI (src/components/ConfigModal/index.tsx): Generic model selector label
  • README.md: MiniMax configuration guide
  • Tests: 31 unit + integration tests via Vitest
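To make the clamping behavior concrete, here is a minimal sketch of the option sanitization described above. The function and type names (`clampOptions`, `isMiniMaxModel`, `ChatOptions`) are illustrative assumptions, not the exact implementation in `server/utils/modelUtils.ts`:

```typescript
export interface ChatOptions {
  temperature?: number;
  presence_penalty?: number;
  frequency_penalty?: number;
  [key: string]: unknown;
}

// Assumed classifier: treat any model whose name starts with "minimax"
// as a MiniMax model.
export function isMiniMaxModel(model: string): boolean {
  return model.toLowerCase().startsWith("minimax");
}

// Clamp temperature into the 0-1 range MiniMax accepts (OpenAI allows 0-2)
// and strip the penalty parameters MiniMax does not support. Options for
// other providers pass through untouched.
export function clampOptions(model: string, options: ChatOptions): ChatOptions {
  if (!isMiniMaxModel(model)) return options;
  const clamped: ChatOptions = { ...options };
  if (typeof clamped.temperature === "number") {
    clamped.temperature = Math.min(Math.max(clamped.temperature, 0), 1);
  }
  delete clamped.presence_penalty;
  delete clamped.frequency_penalty;
  return clamped;
}
```

Doing this once in the chat helper, before any API call, keeps every call site free of per-provider special cases.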

MiniMax Admin Configuration

| Field  | Value                                |
| ------ | ------------------------------------ |
| Key    | Your MiniMax API key                 |
| Host   | https://api.minimax.io               |
| Models | MiniMax-M2.7, MiniMax-M2.7-highspeed |
| Type   | minimax-chat                         |
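Assuming admin keys are stored as simple records, the table above could correspond to an entry like the following (field names are illustrative, not the actual schema):

```json
{
  "key": "<your-minimax-api-key>",
  "host": "https://api.minimax.io",
  "models": ["MiniMax-M2.7", "MiniMax-M2.7-highspeed"],
  "type": "minimax-chat"
}
```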

Test plan

  • All 31 unit + integration tests pass
  • TypeScript compiles cleanly
  • Manual: Add a MiniMax API key via the admin panel
  • Manual: Send chat with MiniMax model, verify streaming

Add MiniMax (M2.7 + M2.7-highspeed) as a supported LLM provider alongside
OpenAI, with automatic temperature clamping (0-1), penalty parameter removal,
GPTTokens fallback for billing, premium model classification, key validation,
and frontend model presets.

Changes:
- server/utils/modelUtils.ts: Model classification, temp clamping, option sanitization
- server/helpers/chat/index.ts: Apply clampOptions before API calls
- server/routers/apis/chat.ts: GPTTokens fallback, premium billing/VIP routing
- server/helpers/keyUsage/index.ts: MiniMax key validation endpoint
- src/store/config/slice.ts: MiniMax models in frontend defaults
- src/components/ConfigModal/index.tsx: Generic model selector label
- README.md: MiniMax configuration guide and model documentation
- vitest.config.ts + tests/: 31 unit + integration tests
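The "safe token counting" mentioned above can be sketched as a try/fallback wrapper: if the tokenizer does not recognize the requested model (as with MiniMax models, which GPTTokens has no encoding for), counting retries with a tokenizer model it does know. `countFn` stands in for the real GPTTokens call, and all names here (`safeCountTokens`, `FALLBACK_TOKENIZER`) are illustrative assumptions:

```typescript
type CountFn = (model: string, text: string) => number;

// Assumed fallback: bill MiniMax traffic using a tokenizer GPTTokens supports.
const FALLBACK_TOKENIZER = "gpt-3.5-turbo";

// Try counting with the requested model; on an unknown-model error,
// retry with the fallback tokenizer so billing never throws mid-request.
function safeCountTokens(countFn: CountFn, model: string, text: string): number {
  try {
    return countFn(model, text);
  } catch {
    return countFn(FALLBACK_TOKENIZER, text);
  }
}
```

A rough approximation like this is acceptable for billing because the fallback tokenizer's counts are close enough for metering, while an uncaught tokenizer error would fail the whole chat request.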

Co-Authored-By: octopus <octopus@github.com>