feat: add MiniMax as LLM provider #444

Open

octo-patch wants to merge 1 commit into InternLM:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a supported LLM backend via its OpenAI-compatible API.

  • Register MiniMax in `backend2url` and `backend2model` with `https://api.minimax.io/v1`
  • Add automatic model selection for MiniMax-M1 (supports up to a 1M-token context window)
  • Update `config.ini` with MiniMax documentation and model examples
  • Add MiniMax to the LLM support table in both README.md and README_zh.md
  • Add a config example to the setup instructions in both READMEs
  • Add a `tests/test_minimax.py` demo script, 20 unit tests, and 3 integration tests
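The registration could look like the following sketch. The dict names `backend2url` and `backend2model` come from the PR description; the body of `choose_model` is an assumption, not the actual huixiangdou source:

```python
# Sketch of registering MiniMax alongside other providers.
# Only the dict names, URL, and model name come from the PR;
# the selection logic below is illustrative.
backend2url = {
    "minimax": "https://api.minimax.io/v1",
}
backend2model = {
    "minimax": "MiniMax-M1",
}

def choose_model(backend: str, text_length: int) -> str:
    """Pick a model for the backend; MiniMax-M1 accepts up to 1M tokens."""
    model = backend2model[backend]
    if backend == "minimax" and text_length > 1_000_000:
        raise ValueError("input exceeds MiniMax-M1's 1M-token context window")
    return model

print(choose_model("minimax", 500_000))  # MiniMax-M1
```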

Configuration Example

```ini
[llm.server]
remote_type = "minimax"
remote_api_key = "sk-xxxxxxxxxxxxx"
remote_llm_model = "MiniMax-M1"
remote_llm_max_text_length = 1000000
```
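Because MiniMax exposes an OpenAI-compatible endpoint, the config above maps directly onto a standard chat-completion request. A minimal sketch of that mapping (only the base URL and model name come from the PR; the helper name and payload assembly are illustrative):

```python
# Build an OpenAI-compatible chat-completion request from the config values.
# build_request is a hypothetical helper, not part of huixiangdou.
base_url = "https://api.minimax.io/v1"
api_key = "sk-xxxxxxxxxxxxx"  # placeholder key, as in the config example

def build_request(prompt: str, model: str = "MiniMax-M1") -> dict:
    return {
        "url": f"{base_url}/chat/completions",
        "headers": {"Authorization": f"Bearer {api_key}"},
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_request("hello")
print(req["url"])  # https://api.minimax.io/v1/chat/completions
```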

Test Plan

  • 20 unit tests passing (`pytest tests/test_minimax_unit.py -v`)
  • 3 integration tests passing against the real MiniMax API (`pytest tests/test_minimax_integration.py -v`)
  • Verify `remote_type = "minimax"` works end-to-end with `python3 -m huixiangdou.main`
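The end-to-end check depends on `remote_llm_max_text_length` being honored before text is sent to the remote backend. A sketch of how such a limit might be enforced (the helper name is hypothetical; only the 1000000 limit comes from the config example):

```python
# Hypothetical helper: clip input text to the configured maximum before
# sending it to the remote backend (MiniMax-M1 accepts up to ~1M tokens).
REMOTE_LLM_MAX_TEXT_LENGTH = 1_000_000  # from the config example

def clip_prompt(text: str, limit: int = REMOTE_LLM_MAX_TEXT_LENGTH) -> str:
    """Truncate text to the configured length limit."""
    return text if len(text) <= limit else text[:limit]

print(len(clip_prompt("a" * 2_000_000)))  # 1000000
```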

Files Changed (7 files, ~433 additions)

| File | Change |
| --- | --- |
| `huixiangdou/services/llm.py` | Add `minimax` to `backend2url`, `backend2model`, `choose_model()` |
| `config.ini` | Document `minimax` as a supported provider |
| `README.md` | Add MiniMax to LLM support table + config example |
| `README_zh.md` | Add MiniMax to LLM support table + config example |
| `tests/test_minimax.py` | Demo script for the MiniMax API |
| `tests/test_minimax_unit.py` | 20 unit tests |
| `tests/test_minimax_integration.py` | 3 integration tests |

Commit Message

Add MiniMax (https://platform.minimaxi.com) as a supported LLM backend via its OpenAI-compatible API endpoint (https://api.minimax.io/v1).

Changes:
- Register MiniMax in backend2url and backend2model dicts
- Add auto model selection for MiniMax-M1 (up to 1M context)
- Update config.ini with MiniMax documentation and model examples
- Add MiniMax to LLM support table in README.md and README_zh.md
- Add MiniMax config example in setup instructions
- Add test_minimax.py demo script, 20 unit tests, 3 integration tests
