diff --git a/README.md b/README.md
index 3eb3a73..1595bac 100644
--- a/README.md
+++ b/README.md
@@ -2,7 +2,7 @@
 
 English | [中文](./README_CN.md)
 
-**Mini Agent** is a minimal yet professional demo project that showcases the best practices for building agents with the MiniMax M2 model. Leveraging an Anthropic-compatible API, it fully supports interleaved thinking to unlock M2's powerful reasoning capabilities for long, complex tasks.
+**Mini Agent** is a minimal yet professional demo project that showcases the best practices for building agents with the MiniMax M2.1 model. Leveraging an Anthropic-compatible API, it fully supports interleaved thinking to unlock M2.1's powerful reasoning capabilities for long, complex tasks.
 
 This project comes packed with features designed for a robust and intelligent agent development experience:
 
@@ -115,7 +115,7 @@ Fill in your API Key and corresponding API Base:
 api_key: "YOUR_API_KEY_HERE" # API Key from step 1
 api_base: "https://api.minimax.io" # Global
 # api_base: "https://api.minimaxi.com" # China
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
 ```
 
 **Start Using:**
@@ -182,7 +182,7 @@ Fill in your API Key and corresponding API Base:
 api_key: "YOUR_API_KEY_HERE" # API Key from step 1
 api_base: "https://api.minimax.io" # Global
 # api_base: "https://api.minimaxi.com" # China
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
 max_steps: 100
 workspace_dir: "./workspace"
 ```
diff --git a/README_CN.md b/README_CN.md
index 7512bae..810cd99 100644
--- a/README_CN.md
+++ b/README_CN.md
@@ -2,7 +2,7 @@
 
 [English](./README.md) | 中文
 
-**Mini Agent** 是一个极简但专业的演示项目,旨在展示使用 MiniMax M2 模型构建 Agent 的最佳实践。项目通过兼容 Anthropic 的 API,完全支持交错思维(interleaved thinking),从而解锁 M2 模型在处理长而复杂的任务时强大的推理能力。
+**Mini Agent** 是一个极简但专业的演示项目,旨在展示使用 MiniMax M2.1 模型构建 Agent 的最佳实践。项目通过兼容 Anthropic 的 API,完全支持交错思维(interleaved thinking),从而解锁 M2.1 模型在处理长而复杂的任务时强大的推理能力。
 
 该项目具备一系列为稳健、智能的 Agent 开发而设计的特性:
 
@@ -115,7 +115,7 @@ nano ~/.mini-agent/config/config.yaml
 api_key: "YOUR_API_KEY_HERE" # 填入第 1 步获取的 API Key
 api_base: "https://api.minimaxi.com" # 国内版
 # api_base: "https://api.minimax.io" # 海外版(如使用海外平台,请取消本行注释)
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
 ```
 
 **开始使用:**
@@ -182,7 +182,7 @@ vim mini_agent/config/config.yaml # 或使用您偏好的编辑器
 api_key: "YOUR_API_KEY_HERE" # 填入第 1 步获取的 API Key
 api_base: "https://api.minimaxi.com" # 国内版
 # api_base: "https://api.minimax.io" # 海外版(如使用海外平台,请修改此行)
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
 max_steps: 100
 workspace_dir: "./workspace"
 ```
diff --git a/docs/PRODUCTION_GUIDE.md b/docs/PRODUCTION_GUIDE.md
index 0a1c7bb..33dc011 100644
--- a/docs/PRODUCTION_GUIDE.md
+++ b/docs/PRODUCTION_GUIDE.md
@@ -34,7 +34,7 @@ This project is a **teaching-level demo** that demonstrates the core concepts an
 
 ### 2.2 Model Fallback Mechanism
 
-Currently using a single fixed model (MiniMax-M2), which will directly report errors on failure.
+Currently using a single fixed model (MiniMax-M2.1), which will directly report errors on failure.
 
 - Introduce a model pool by configuring multiple model accounts to improve availability
 - Introduce automatic health checks, failure removal, circuit breaker strategies for the model pool
diff --git a/docs/PRODUCTION_GUIDE_CN.md b/docs/PRODUCTION_GUIDE_CN.md
index be15113..98c9c50 100644
--- a/docs/PRODUCTION_GUIDE_CN.md
+++ b/docs/PRODUCTION_GUIDE_CN.md
@@ -34,7 +34,7 @@
 
 ### 2.2 模型回退机制
 
-当前 Demo 固定使用单一模型(MiniMax-M2),调用失败时会直接报错。
+当前 Demo 固定使用单一模型(MiniMax-M2.1),调用失败时会直接报错。
 
 - **建立模型池**:配置多个模型账号,建立模型池以提高服务可用性。
 - **引入高可用策略**:为模型池引入自动健康检测、故障节点切换、熔断等高可用策略。
diff --git a/examples/05_provider_selection.py b/examples/05_provider_selection.py
index 66e8d0c..6a2910c 100644
--- a/examples/05_provider_selection.py
+++ b/examples/05_provider_selection.py
@@ -28,7 +28,7 @@ async def demo_anthropic_provider():
     client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.ANTHROPIC,  # Specify Anthropic provider
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     print(f"Provider: {client.provider}")
@@ -63,7 +63,7 @@ async def demo_openai_provider():
     client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.OPENAI,  # Specify OpenAI provider
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     print(f"Provider: {client.provider}")
@@ -97,7 +97,7 @@ async def demo_default_provider():
     # Initialize client without specifying provider (defaults to Anthropic)
     client = LLMClient(
         api_key=config["api_key"],
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     print(f"Provider (default): {client.provider}")
@@ -130,13 +130,13 @@ async def demo_provider_comparison():
     anthropic_client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.ANTHROPIC,
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     openai_client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.OPENAI,
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     # Same question for both
diff --git a/examples/06_tool_schema_demo.py b/examples/06_tool_schema_demo.py
index 5bcf7d8..fc610eb 100644
--- a/examples/06_tool_schema_demo.py
+++ b/examples/06_tool_schema_demo.py
@@ -166,7 +166,7 @@ async def demo_tool_schemas():
     client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.ANTHROPIC,
-        model="MiniMax-M2",
+        model="MiniMax-M2.1",
     )
 
     # Test with a query that should trigger weather tool
@@ -215,7 +215,7 @@ async def demo_multiple_tools():
     client = LLMClient(
         api_key=config["api_key"],
         provider=LLMProvider.ANTHROPIC,
-        model="MiniMax-M2",
+        model="MiniMax-M2.1",
    )
 
     messages = [Message(role="user", content="Calculate 15 * 23 for me")]
diff --git a/mini_agent/cli.py b/mini_agent/cli.py
index 0280b7d..80d04c8 100644
--- a/mini_agent/cli.py
+++ b/mini_agent/cli.py
@@ -418,7 +418,7 @@ def on_retry(exception: Exception, attempt: int):
         system_prompt = system_prompt_path.read_text(encoding="utf-8")
         print(f"{Colors.GREEN}✅ Loaded system prompt (from: {system_prompt_path}){Colors.RESET}")
     else:
-        system_prompt = "You are Mini-Agent, an intelligent assistant powered by MiniMax M2 that can help users complete various tasks."
+        system_prompt = "You are Mini-Agent, an intelligent assistant powered by MiniMax M2.1 that can help users complete various tasks."
         print(f"{Colors.YELLOW}⚠️ System prompt not found, using default{Colors.RESET}")
 
     # 6. Inject Skills Metadata into System Prompt (Progressive Disclosure - Level 1)
diff --git a/mini_agent/config.py b/mini_agent/config.py
index bcd86e6..c46c5af 100644
--- a/mini_agent/config.py
+++ b/mini_agent/config.py
@@ -24,7 +24,7 @@ class LLMConfig(BaseModel):
 
     api_key: str
     api_base: str = "https://api.minimax.io"
-    model: str = "MiniMax-M2"
+    model: str = "MiniMax-M2.1"
     provider: str = "anthropic"  # "anthropic" or "openai"
     retry: RetryConfig = Field(default_factory=RetryConfig)
 
@@ -116,7 +116,7 @@ def from_yaml(cls, config_path: str | Path) -> "Config":
         llm_config = LLMConfig(
             api_key=data["api_key"],
             api_base=data.get("api_base", "https://api.minimax.io"),
-            model=data.get("model", "MiniMax-M2"),
+            model=data.get("model", "MiniMax-M2.1"),
             provider=data.get("provider", "anthropic"),
             retry=retry_config,
         )
diff --git a/mini_agent/config/config-example.yaml b/mini_agent/config/config-example.yaml
index 9d259ce..ef8a4eb 100644
--- a/mini_agent/config/config-example.yaml
+++ b/mini_agent/config/config-example.yaml
@@ -19,7 +19,7 @@
 api_key: "YOUR_API_KEY_HERE" # Replace with your MiniMax API Key
 api_base: "https://api.minimax.io" # Global users (default)
 # api_base: "https://api.minimaxi.com" # China users
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
 # LLM provider: "anthropic" or "openai"
 # The LLMClient will automatically append /anthropic or /v1 to api_base based on provider
 provider: "anthropic" # Default: anthropic
diff --git a/mini_agent/llm/anthropic_client.py b/mini_agent/llm/anthropic_client.py
index a8c3af3..8b5929c 100644
--- a/mini_agent/llm/anthropic_client.py
+++ b/mini_agent/llm/anthropic_client.py
@@ -25,7 +25,7 @@ def __init__(
         self,
         api_key: str,
         api_base: str = "https://api.minimaxi.com/anthropic",
-        model: str = "MiniMax-M2",
+        model: str = "MiniMax-M2.1",
         retry_config: RetryConfig | None = None,
     ):
         """Initialize Anthropic client.
@@ -33,7 +33,7 @@ def __init__(
         Args:
             api_key: API key for authentication
             api_base: Base URL for the API (default: MiniMax Anthropic endpoint)
-            model: Model name to use (default: MiniMax-M2)
+            model: Model name to use (default: MiniMax-M2.1)
             retry_config: Optional retry configuration
         """
         super().__init__(api_key, api_base, model, retry_config)
diff --git a/mini_agent/llm/llm_wrapper.py b/mini_agent/llm/llm_wrapper.py
index 8250fb7..7037b57 100644
--- a/mini_agent/llm/llm_wrapper.py
+++ b/mini_agent/llm/llm_wrapper.py
@@ -32,7 +32,7 @@ def __init__(
         api_key: str,
         provider: LLMProvider = LLMProvider.ANTHROPIC,
         api_base: str = "https://api.minimaxi.com",
-        model: str = "MiniMax-M2",
+        model: str = "MiniMax-M2.1",
         retry_config: RetryConfig | None = None,
     ):
         """Initialize LLM client with specified provider.
diff --git a/mini_agent/llm/openai_client.py b/mini_agent/llm/openai_client.py
index 6b3e326..c0ffcc7 100644
--- a/mini_agent/llm/openai_client.py
+++ b/mini_agent/llm/openai_client.py
@@ -26,7 +26,7 @@ def __init__(
         self,
         api_key: str,
         api_base: str = "https://api.minimaxi.com/v1",
-        model: str = "MiniMax-M2",
+        model: str = "MiniMax-M2.1",
         retry_config: RetryConfig | None = None,
     ):
         """Initialize OpenAI client.
@@ -34,7 +34,7 @@ def __init__(
         Args:
             api_key: API key for authentication
             api_base: Base URL for the API (default: MiniMax OpenAI endpoint)
-            model: Model name to use (default: MiniMax-M2)
+            model: Model name to use (default: MiniMax-M2.1)
             retry_config: Optional retry configuration
         """
         super().__init__(api_key, api_base, model, retry_config)
diff --git a/tests/test_llm_clients.py b/tests/test_llm_clients.py
index 2691118..0ebe322 100644
--- a/tests/test_llm_clients.py
+++ b/tests/test_llm_clients.py
@@ -33,7 +33,7 @@ async def test_anthropic_simple_completion():
     client = AnthropicClient(
         api_key=config["api_key"],
         api_base="https://api.minimaxi.com/anthropic",
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
         retry_config=RetryConfig(enabled=True, max_retries=2),
     )
 
@@ -74,7 +74,7 @@ async def test_openai_simple_completion():
     client = OpenAIClient(
         api_key=config["api_key"],
         api_base="https://api.minimaxi.com/v1",
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
         retry_config=RetryConfig(enabled=True, max_retries=2),
     )
 
@@ -115,7 +115,7 @@ async def test_anthropic_tool_calling():
     client = AnthropicClient(
         api_key=config["api_key"],
         api_base="https://api.minimaxi.com/anthropic",
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     # Define tool using dict format
@@ -175,7 +175,7 @@ async def test_openai_tool_calling():
     client = OpenAIClient(
         api_key=config["api_key"],
         api_base="https://api.minimaxi.com/v1",
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     # Define tool using dict format (will be converted internally for OpenAI)
@@ -235,7 +235,7 @@ async def test_multi_turn_conversation():
     client = AnthropicClient(
         api_key=config["api_key"],
         api_base="https://api.minimaxi.com/anthropic",
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
     )
 
     # Define tool using dict format