Merged
6 changes: 3 additions & 3 deletions README.md
@@ -2,7 +2,7 @@

English | [中文](./README_CN.md)

-**Mini Agent** is a minimal yet professional demo project that showcases the best practices for building agents with the MiniMax M2 model. Leveraging an Anthropic-compatible API, it fully supports interleaved thinking to unlock M2's powerful reasoning capabilities for long, complex tasks.
+**Mini Agent** is a minimal yet professional demo project that showcases the best practices for building agents with the MiniMax M2.1 model. Leveraging an Anthropic-compatible API, it fully supports interleaved thinking to unlock M2's powerful reasoning capabilities for long, complex tasks.

This project comes packed with features designed for a robust and intelligent agent development experience:

@@ -115,7 +115,7 @@ Fill in your API Key and corresponding API Base:
api_key: "YOUR_API_KEY_HERE" # API Key from step 1
api_base: "https://api.minimax.io" # Global
# api_base: "https://api.minimaxi.com" # China
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
```

**Start Using:**
@@ -182,7 +182,7 @@ Fill in your API Key and corresponding API Base:
api_key: "YOUR_API_KEY_HERE" # API Key from step 1
api_base: "https://api.minimax.io" # Global
# api_base: "https://api.minimaxi.com" # China
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
max_steps: 100
workspace_dir: "./workspace"
```
6 changes: 3 additions & 3 deletions README_CN.md
@@ -2,7 +2,7 @@

[English](./README.md) | 中文

-**Mini Agent** 是一个极简但专业的演示项目,旨在展示使用 MiniMax M2 模型构建 Agent 的最佳实践。项目通过兼容 Anthropic 的 API,完全支持交错思维(interleaved thinking),从而解锁 M2 模型在处理长而复杂的任务时强大的推理能力。
+**Mini Agent** 是一个极简但专业的演示项目,旨在展示使用 MiniMax M2.1 模型构建 Agent 的最佳实践。项目通过兼容 Anthropic 的 API,完全支持交错思维(interleaved thinking),从而解锁 M2 模型在处理长而复杂的任务时强大的推理能力。

该项目具备一系列为稳健、智能的 Agent 开发而设计的特性:

@@ -115,7 +115,7 @@ nano ~/.mini-agent/config/config.yaml
api_key: "YOUR_API_KEY_HERE" # 填入第 1 步获取的 API Key
api_base: "https://api.minimaxi.com" # 国内版
# api_base: "https://api.minimax.io" # 海外版(如使用海外平台,请取消本行注释)
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
```

**开始使用:**
@@ -182,7 +182,7 @@ vim mini_agent/config/config.yaml  # 或使用您偏好的编辑器
api_key: "YOUR_API_KEY_HERE" # 填入第 1 步获取的 API Key
api_base: "https://api.minimaxi.com" # 国内版
# api_base: "https://api.minimax.io" # 海外版(如使用海外平台,请修改此行)
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
max_steps: 100
workspace_dir: "./workspace"
```
2 changes: 1 addition & 1 deletion docs/PRODUCTION_GUIDE.md
@@ -34,7 +34,7 @@ This project is a **teaching-level demo** that demonstrates the core concepts an

### 2.2 Model Fallback Mechanism

-Currently using a single fixed model (MiniMax-M2), which will directly report errors on failure.
+Currently using a single fixed model (MiniMax-M2.1), which will directly report errors on failure.

- Introduce a model pool by configuring multiple model accounts to improve availability
- Introduce automatic health checks, failure removal, circuit breaker strategies for the model pool
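The model-pool idea sketched in the bullets above (multiple models, failure removal, circuit breaking) can be illustrated in a few lines. This is a hypothetical sketch, not code from Mini Agent: the names `ModelPool` and `call_with_fallback` are illustrative, and a real implementation would catch specific API errors and re-admit tripped models after a cooldown.

```python
class ModelPool:
    """Hypothetical model pool with fallback and a simple circuit breaker."""

    def __init__(self, models, max_failures=3):
        self.models = list(models)                # ordered by preference
        self.failures = {m: 0 for m in self.models}
        self.max_failures = max_failures          # breaker threshold

    def healthy_models(self):
        # Skip models whose breaker has tripped
        return [m for m in self.models if self.failures[m] < self.max_failures]

    def call_with_fallback(self, call):
        last_error = None
        for model in self.healthy_models():
            try:
                result = call(model)
                self.failures[model] = 0          # success resets the counter
                return result
            except Exception as exc:              # in practice: API errors only
                self.failures[model] += 1
                last_error = exc
        raise RuntimeError("all models unavailable") from last_error
```

On each request the pool walks the healthy models in preference order, so a single model failure degrades to the next account instead of surfacing an error to the agent loop.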
2 changes: 1 addition & 1 deletion docs/PRODUCTION_GUIDE_CN.md
@@ -34,7 +34,7 @@

### 2.2 模型回退机制

-当前 Demo 固定使用单一模型(MiniMax-M2),调用失败时会直接报错。
+当前 Demo 固定使用单一模型(MiniMax-M2.1),调用失败时会直接报错。

- **建立模型池**:配置多个模型账号,建立模型池以提高服务可用性。
- **引入高可用策略**:为模型池引入自动健康检测、故障节点切换、熔断等高可用策略。
10 changes: 5 additions & 5 deletions examples/05_provider_selection.py
@@ -28,7 +28,7 @@ async def demo_anthropic_provider():
client = LLMClient(
api_key=config["api_key"],
provider=LLMProvider.ANTHROPIC, # Specify Anthropic provider
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
)

print(f"Provider: {client.provider}")
@@ -63,7 +63,7 @@ async def demo_openai_provider():
client = LLMClient(
api_key=config["api_key"],
provider=LLMProvider.OPENAI, # Specify OpenAI provider
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
)

print(f"Provider: {client.provider}")
@@ -97,7 +97,7 @@ async def demo_default_provider():
# Initialize client without specifying provider (defaults to Anthropic)
client = LLMClient(
api_key=config["api_key"],
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
)

print(f"Provider (default): {client.provider}")
@@ -130,13 +130,13 @@ async def demo_provider_comparison():
anthropic_client = LLMClient(
api_key=config["api_key"],
provider=LLMProvider.ANTHROPIC,
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
)

openai_client = LLMClient(
api_key=config["api_key"],
provider=LLMProvider.OPENAI,
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
)

# Same question for both
4 changes: 2 additions & 2 deletions examples/06_tool_schema_demo.py
@@ -166,7 +166,7 @@ async def demo_tool_schemas():
client = LLMClient(
api_key=config["api_key"],
provider=LLMProvider.ANTHROPIC,
-        model="MiniMax-M2",
+        model="MiniMax-M2.1",
)

# Test with a query that should trigger weather tool
@@ -215,7 +215,7 @@ async def demo_multiple_tools():
client = LLMClient(
api_key=config["api_key"],
provider=LLMProvider.ANTHROPIC,
-        model="MiniMax-M2",
+        model="MiniMax-M2.1",
)

messages = [Message(role="user", content="Calculate 15 * 23 for me")]
2 changes: 1 addition & 1 deletion mini_agent/cli.py
@@ -418,7 +418,7 @@ def on_retry(exception: Exception, attempt: int):
system_prompt = system_prompt_path.read_text(encoding="utf-8")
print(f"{Colors.GREEN}✅ Loaded system prompt (from: {system_prompt_path}){Colors.RESET}")
else:
-        system_prompt = "You are Mini-Agent, an intelligent assistant powered by MiniMax M2 that can help users complete various tasks."
+        system_prompt = "You are Mini-Agent, an intelligent assistant powered by MiniMax M2.1 that can help users complete various tasks."
print(f"{Colors.YELLOW}⚠️ System prompt not found, using default{Colors.RESET}")

# 6. Inject Skills Metadata into System Prompt (Progressive Disclosure - Level 1)
4 changes: 2 additions & 2 deletions mini_agent/config.py
@@ -24,7 +24,7 @@ class LLMConfig(BaseModel):

api_key: str
api_base: str = "https://api.minimax.io"
-    model: str = "MiniMax-M2"
+    model: str = "MiniMax-M2.1"
provider: str = "anthropic" # "anthropic" or "openai"
retry: RetryConfig = Field(default_factory=RetryConfig)

@@ -116,7 +116,7 @@ def from_yaml(cls, config_path: str | Path) -> "Config":
llm_config = LLMConfig(
api_key=data["api_key"],
api_base=data.get("api_base", "https://api.minimax.io"),
-            model=data.get("model", "MiniMax-M2"),
+            model=data.get("model", "MiniMax-M2.1"),
provider=data.get("provider", "anthropic"),
retry=retry_config,
)
2 changes: 1 addition & 1 deletion mini_agent/config/config-example.yaml
@@ -19,7 +19,7 @@
api_key: "YOUR_API_KEY_HERE" # Replace with your MiniMax API Key
api_base: "https://api.minimax.io" # Global users (default)
# api_base: "https://api.minimaxi.com" # China users
-model: "MiniMax-M2"
+model: "MiniMax-M2.1"
# LLM provider: "anthropic" or "openai"
# The LLMClient will automatically append /anthropic or /v1 to api_base based on provider
provider: "anthropic" # Default: anthropic
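The config comment above notes that the LLMClient appends `/anthropic` or `/v1` to `api_base` depending on the provider. A minimal sketch of that routing rule, assuming the two path suffixes stated in the comment (`resolve_endpoint` is an illustrative name, not the repo's actual function):

```python
def resolve_endpoint(api_base: str, provider: str) -> str:
    # Suffixes taken from the config comment: anthropic -> /anthropic, openai -> /v1
    suffix = {"anthropic": "/anthropic", "openai": "/v1"}[provider]
    return api_base.rstrip("/") + suffix

resolve_endpoint("https://api.minimax.io", "anthropic")
# → "https://api.minimax.io/anthropic"
```

This is why the YAML only needs the bare `api_base`; the provider setting selects the protocol-specific path at client construction time.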
4 changes: 2 additions & 2 deletions mini_agent/llm/anthropic_client.py
@@ -25,15 +25,15 @@ def __init__(
self,
api_key: str,
api_base: str = "https://api.minimaxi.com/anthropic",
-        model: str = "MiniMax-M2",
+        model: str = "MiniMax-M2.1",
retry_config: RetryConfig | None = None,
):
"""Initialize Anthropic client.

Args:
api_key: API key for authentication
api_base: Base URL for the API (default: MiniMax Anthropic endpoint)
-            model: Model name to use (default: MiniMax-M2)
+            model: Model name to use (default: MiniMax-M2.1)
retry_config: Optional retry configuration
"""
super().__init__(api_key, api_base, model, retry_config)
2 changes: 1 addition & 1 deletion mini_agent/llm/llm_wrapper.py
@@ -32,7 +32,7 @@ def __init__(
api_key: str,
provider: LLMProvider = LLMProvider.ANTHROPIC,
api_base: str = "https://api.minimaxi.com",
-        model: str = "MiniMax-M2",
+        model: str = "MiniMax-M2.1",
retry_config: RetryConfig | None = None,
):
"""Initialize LLM client with specified provider.
4 changes: 2 additions & 2 deletions mini_agent/llm/openai_client.py
@@ -26,15 +26,15 @@ def __init__(
self,
api_key: str,
api_base: str = "https://api.minimaxi.com/v1",
-        model: str = "MiniMax-M2",
+        model: str = "MiniMax-M2.1",
retry_config: RetryConfig | None = None,
):
"""Initialize OpenAI client.

Args:
api_key: API key for authentication
api_base: Base URL for the API (default: MiniMax OpenAI endpoint)
-            model: Model name to use (default: MiniMax-M2)
+            model: Model name to use (default: MiniMax-M2.1)
retry_config: Optional retry configuration
"""
super().__init__(api_key, api_base, model, retry_config)
10 changes: 5 additions & 5 deletions tests/test_llm_clients.py
@@ -33,7 +33,7 @@ async def test_anthropic_simple_completion():
client = AnthropicClient(
api_key=config["api_key"],
api_base="https://api.minimaxi.com/anthropic",
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
retry_config=RetryConfig(enabled=True, max_retries=2),
)

@@ -74,7 +74,7 @@ async def test_openai_simple_completion():
client = OpenAIClient(
api_key=config["api_key"],
api_base="https://api.minimaxi.com/v1",
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
retry_config=RetryConfig(enabled=True, max_retries=2),
)

@@ -115,7 +115,7 @@ async def test_anthropic_tool_calling():
client = AnthropicClient(
api_key=config["api_key"],
api_base="https://api.minimaxi.com/anthropic",
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
)

# Define tool using dict format
@@ -175,7 +175,7 @@ async def test_openai_tool_calling():
client = OpenAIClient(
api_key=config["api_key"],
api_base="https://api.minimaxi.com/v1",
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
)

# Define tool using dict format (will be converted internally for OpenAI)
@@ -235,7 +235,7 @@ async def test_multi_turn_conversation():
client = AnthropicClient(
api_key=config["api_key"],
api_base="https://api.minimaxi.com/anthropic",
-        model=config.get("model", "MiniMax-M2"),
+        model=config.get("model", "MiniMax-M2.1"),
)

# Define tool using dict format
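The tests above construct clients with `RetryConfig(enabled=True, max_retries=2)`, and the CLI hunk shows an `on_retry(exception, attempt)` callback. A hedged sketch of what bounded retry with an attempt counter looks like; `with_retries` is an illustrative helper, not the repo's actual retry utility, and passing the attempt index to the callable is done here only for demonstration:

```python
def with_retries(call, max_retries=2):
    """Run call() up to 1 + max_retries times, re-raising the last error."""
    last_error = None
    for attempt in range(1 + max_retries):  # initial try plus max_retries retries
        try:
            return call(attempt)
        except Exception as exc:            # in practice: transient errors only
            last_error = exc
    raise last_error
```

With `max_retries=2` a transient failure on the first two attempts still succeeds on the third, which matches the bounded-retry behavior the tests configure.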