
Conversation

@anandnk24

Description

This PR adds the Patronus AI MCP Server to the official list of MCP implementations, linking to Patronus AI's repository. The update provides visibility for users who want to integrate Patronus AI's API for LLM evaluation and optimization.

Server Details

• Server: Patronus AI MCP
• Changes to: README (added Patronus AI MCP Server to the list of available servers)

Motivation and Context

The Patronus AI MCP Server enables AI engineers to scalably and reliably test and optimize LLM systems such as AI agents and RAG apps. It is challenging to improve LLM system performance and to reduce hallucinations and other unexpected behavior. To address this, we developed evaluation models and optimization techniques that help companies ship higher-quality AI products. This MCP Server provides an interface over the Patronus API, enabling anyone to ask questions and run evals without writing any code at all, which democratizes the process of optimizing AI products within product teams.
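
As an illustration only, the sketch below shows how an MCP-compatible client could connect to the server over stdio and invoke an evaluation tool using the MCP Python SDK. The launch command (`uvx patronus-mcp-server`), the `PATRONUS_API_KEY` environment variable, and the `evaluate` tool name and arguments are assumptions for this example rather than the server's documented interface; see the linked repository for the actual setup.

```python
# Minimal sketch: connecting an MCP client to the Patronus MCP Server.
# The launch command, environment variable, tool name, and arguments are
# illustrative assumptions; consult the patronus-mcp-server repository for
# the real configuration.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uvx",                 # assumed launcher
    args=["patronus-mcp-server"],  # assumed package / entry point
    env={"PATRONUS_API_KEY": os.environ["PATRONUS_API_KEY"]},  # assumed env var
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the evaluation tools the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Hypothetical call to an evaluation tool.
            result = await session.call_tool(
                "evaluate",
                arguments={
                    "task_input": "What is the capital of France?",
                    "task_output": "Paris is the capital of France.",
                },
            )
            print(result.content)

asyncio.run(main())
```

In practice, MCP clients such as desktop chat apps register the server declaratively in their configuration rather than through code like this; the snippet only illustrates the request flow.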

How Has This Been Tested?

This change affects documentation only, so no functional testing was required. However, the Patronus AI MCP Server itself has been tested against a variety of functional and E2E tests (see the repository), and it has been validated for use with MCP-compatible clients.

Breaking Changes

No breaking changes.

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation update

Checklist

  • I have read the MCP Protocol Documentation
  • My changes follow MCP security best practices
  • I have updated the server's README accordingly
  • I have tested this with an LLM client
  • My code follows the repository's style guidelines
  • New and existing tests pass locally
  • I have added appropriate error handling
  • I have documented all environment variables and configuration options

Additional context

- <img height="12" width="12" src="https://avatars.githubusercontent.com/u/82347605?s=48&v=4" alt="OceanBase Logo" /> **[OceanBase](https://github.com/oceanbase/mcp-oceanbase)** - MCP Server for OceanBase database and its tools
- <img height="12" width="12" src="https://docs.octagonagents.com/logo.svg" alt="Octagon Logo" /> **[Octagon](https://github.com/OctagonAI/octagon-mcp-server)** - Deliver real-time investment research with extensive private and public market data.
- <img height="12" width="12" src="https://oxylabs.io/favicon.ico" alt="Oxylabs Logo" /> **[Oxylabs](https://github.com/oxylabs/oxylabs-mcp)** - Scrape websites with Oxylabs Web API, supporting dynamic rendering and parsing for structured data extraction.
- <img height="12" width="12" src="https://www.patronus.ai/favicon-ico" alt="Patronus AI Logo" /> **[Patronus AI](https://github.com/patronus-ai/patronus-mcp-server)** - Test, evaluate, and optimize AI agents and RAG apps
Member


For some reason this icon isn't rendering for me when I preview the rendered README with these changes. Could you double-check that this image is loading correctly?

@olaservo added the waiting for submitter (Waiting for the submitter to provide more info) label on May 13, 2025
@olaservo
Member

Thanks for your contribution to the servers list. This has been merged in this combined PR: #2075

This is a new process we're trying out, so if you see any issues feel free to re-open the PR and tag me.

@olaservo closed this on Jun 12, 2025
