@MervinPraison

Summary

Add PraisonAI to the "📚 Frameworks → For clients" section as an AI Agents framework with native MCP client support.

About PraisonAI

MCP Integration

PraisonAI provides a native MCP class that allows AI agents to consume any MCP server as tools:

from praisonaiagents import Agent
from praisonaiagents.mcp import MCP

# MCP wraps the server's launch command and exposes the tools it serves to the agent
agent = Agent(
    instructions="You are a helpful assistant",
    tools=MCP("npx -y @modelcontextprotocol/server-filesystem /tmp")
)
agent.start("List files in /tmp")

Supported MCP Features

  • All transports: stdio, SSE, WebSocket, Streamable HTTP
  • Environment variables: Pass API keys and config to MCP servers
  • Multiple servers: Use multiple MCP servers simultaneously (see the sketch after this list)
  • Async support: Full async/await compatibility
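
A minimal sketch combining the environment-variable and multiple-server bullets in one agent. It assumes the MCP constructor accepts an env mapping and that Agent(tools=...) accepts a list of MCP instances; the Brave Search server and the API key value are placeholders, not part of this PR.

from praisonaiagents import Agent
from praisonaiagents.mcp import MCP

# Placeholder key for illustration; pass real secrets via your own environment/secret store
search = MCP(
    "npx -y @modelcontextprotocol/server-brave-search",
    env={"BRAVE_API_KEY": "your-api-key"}  # forwarded to the MCP server process (assumed keyword)
)
files = MCP("npx -y @modelcontextprotocol/server-filesystem /tmp")

agent = Agent(
    instructions="You are a research assistant with web search and file access",
    tools=[search, files]  # tools from both servers, assuming a list is accepted here
)
agent.start("Find recent MCP news and save a short summary to /tmp/mcp-news.md")

For SSE, WebSocket, or Streamable HTTP servers, the PR states those transports are supported; presumably the MCP class can be pointed at a server URL instead of a launch command, though that call signature is not shown here.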

Why This Belongs in Frameworks (For clients)

  1. Client-side MCP integration - PraisonAI is an MCP client that consumes MCP servers
  2. Framework, not a server - It's a framework for building AI agents, not an MCP server implementation
  3. Already in MCP Registry - Published at io.github.MervinPraison/praisonai
  4. Active maintenance - Regular releases, 10K+ GitHub stars, 2M+ PyPI downloads

PraisonAI is a production-ready AI Agents framework with native MCP client support.
- GitHub: https://github.com/MervinPraison/PraisonAI
- PyPI: 2M+ downloads
- MCP Registry: io.github.MervinPraison/praisonai
- Supports all MCP transports: stdio, SSE, WebSocket, Streamable HTTP