26 changes: 6 additions & 20 deletions docs/mint.json
@@ -17,7 +17,6 @@
     "anchors": {
       "from": "#D1E5F7",
       "to": "#a9bed4"
-
     }
   },
   "topbarCtaButton": {
@@ -60,25 +59,16 @@
       "url": "https://agentops.ai/contact"
     }
   ],
-  "versions": [
-    "v2",
-    "v1",
-    "v0"
-  ],
+  "versions": ["v2", "v1", "v0"],
   "navigation": [
     {
       "group": "",
-      "pages": [
-        "v1/introduction"
-      ],
+      "pages": ["v1/introduction"],
       "version": "v1"
     },
     {
       "group": "Getting Started",
-      "pages": [
-        "v1/quickstart",
-        "v1/examples/examples"
-      ],
+      "pages": ["v1/quickstart", "v1/examples/examples"],
       "version": "v1"
     },
     {
@@ -140,17 +130,12 @@
     },
     {
       "group": "",
-      "pages": [
-        "v2/introduction"
-      ],
+      "pages": ["v2/introduction"],
       "version": "v2"
     },
     {
       "group": "Getting Started",
-      "pages": [
-        "v2/quickstart",
-        "v2/examples/examples"
-      ],
+      "pages": ["v2/quickstart", "v2/examples/examples"],
       "version": "v2"
     },
     {
@@ -193,6 +178,7 @@
         "v2/usage/dashboard-info",
         "v2/usage/sdk-reference",
         "v2/usage/typescript-sdk",
+        "v2/usage/mcp-server",
         "v2/usage/advanced-configuration",
         "v2/usage/context-managers",
         "v2/usage/tracking-llm-calls",
8 changes: 4 additions & 4 deletions docs/v2/examples/openai.mdx
@@ -1,6 +1,6 @@
 ---
 title: 'OpenAI'
-description: 'Load the dataset (ensure you're logged in with huggingface-cli if needed)'
+description: "Load the dataset (ensure you're logged in with huggingface-cli if needed)"
 ---
 {/* SOURCE_FILE: examples/openai/multi_tool_orchestration.ipynb */}
 
@@ -322,7 +322,7 @@ Finally, the tool call and its output are appended to the conversation, and the
 
 ### Multi-tool orchestration flow
 
-Now let us try to modify the input query and the system instructions to the responses API in order to follow a tool calling sequence and generate the output. 
+Now let us try to modify the input query and the system instructions to the responses API in order to follow a tool calling sequence and generate the output.
 
 
 ```python
@@ -427,10 +427,10 @@ agentops.end_trace(tracer, end_state="Success")
 ```
 
 
-Here, we have seen how to utilize OpenAI's Responses API to implement a Retrieval-Augmented Generation (RAG) approach with multi-tool calling capabilities. It showcases an example where the model selects the appropriate tool based on the input query: general questions may be handled by built-in tools such as web-search, while specific medical inquiries related to internal knowledge are addressed by retrieving context from a vector database (such as Pinecone) via function calls. Additonally, we have showcased how multiple tool calls can be sequentially combined to generate a final response based on our instructions provided to responses API. Happy coding! 
+Here, we have seen how to utilize OpenAI's Responses API to implement a Retrieval-Augmented Generation (RAG) approach with multi-tool calling capabilities. It showcases an example where the model selects the appropriate tool based on the input query: general questions may be handled by built-in tools such as web-search, while specific medical inquiries related to internal knowledge are addressed by retrieving context from a vector database (such as Pinecone) via function calls. Additonally, we have showcased how multiple tool calls can be sequentially combined to generate a final response based on our instructions provided to responses API. Happy coding!
 
 
 <script type="module" src="/scripts/github_stars.js"></script>
 <script type="module" src="/scripts/scroll-img-fadein-animation.js"></script>
 <script type="module" src="/scripts/button_heartbeat_animation.js"></script>
-<script type="module" src="/scripts/adjust_api_dynamically.js"></script>
+<script type="module" src="/scripts/adjust_api_dynamically.js"></script>
88 changes: 88 additions & 0 deletions docs/v2/usage/mcp-server.mdx
@@ -0,0 +1,88 @@
---
title: "MCP Server"
description: "MCP server for accessing AgentOps trace and span data"
---

<iframe
width="100%"
height="300"
src="https://www.youtube.com/embed/lTa3Sk8C4f0?si=3r7GO8N1Csh0P9C5RR"
title="AgentOps MCP Server"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowFullScreen
></iframe>

# MCP Server

AgentOps provides a [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) server that exposes the Public API as a set of tools for AI assistants. This lets an assistant query your AgentOps data directly during a conversation and debug your agents with richer context.

### Configuration & Installation

Add the AgentOps MCP to your MCP client's configuration file.

**npx configuration:**
```json
{
"mcpServers": {
"agentops": {
"command": "npx",
"args": [
"agentops-mcp"
],
"env": {
"AGENTOPS_API_KEY": ""
}
}
}
}
```
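
The `env` block accepts any of the environment variables listed under Environment Variables below. As a minimal sketch (the empty `AGENTOPS_API_KEY` is a placeholder for your own key, and `HOST` only needs to be included if you want to override the default endpoint):

```json
{
  "mcpServers": {
    "agentops": {
      "command": "npx",
      "args": ["agentops-mcp"],
      "env": {
        "AGENTOPS_API_KEY": "",
        "HOST": "https://api.agentops.ai"
      }
    }
  }
}
```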

**Cursor Deeplink:**

Add the AgentOps MCP server to Cursor using the deeplink below.

[![Install MCP Server](https://cursor.com/deeplink/mcp-install-dark.svg)](https://cursor.com/install-mcp?name=agentops&config=eyJjb21tYW5kIjoibnB4IGFnZW50b3BzLW1jcCIsImVudiI6eyJBR0VOVE9QU19BUElfS0VZIjoiIn19)

**Smithery:**

To install agentops-mcp for Claude Desktop automatically via [Smithery](https://smithery.ai/server/@AgentOps-AI/agentops-mcp):

```bash
npx -y @smithery/cli install @AgentOps-AI/agentops-mcp --client claude
```

### Available Tools

The MCP server exposes the following tools that mirror the Public API endpoints:

#### `auth`
Authorize using an AgentOps project API key.
- **Parameters**: `api_key` (string) - Your AgentOps project API key
- **Usage**: The server will automatically prompt for authentication when needed

#### `get_project`
Get details about the current project.
- **Parameters**: None
- **Returns**: Project information including ID, name, and environment

#### `get_trace`
Get trace information by ID.
- **Parameters**: `trace_id` (string) - The trace identifier
- **Returns**: Trace details and metrics

#### `get_span`
Get span information by ID.
- **Parameters**: `span_id` (string) - The span identifier
- **Returns**: Span attributes and metrics

#### `get_complete_trace`
Get complete trace information by ID.
- **Parameters**: `span_id` (string) - The trace identifier
- **Returns**: Complete trace and associated span details
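
Clients invoke these tools through the standard MCP `tools/call` request, so you normally never construct the call yourself. Purely as an illustrative sketch of what an MCP client sends when it uses `get_trace` (the trace ID below is a placeholder, and the exact payload returned depends on your project's data):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_trace",
    "arguments": { "trace_id": "<your-trace-id>" }
  }
}
```

The result is returned as MCP tool content, typically a text block containing the trace details and metrics described above.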

### Environment Variables

The MCP server supports the following environment variables:

- `AGENTOPS_API_KEY`: Your AgentOps project API key
- `HOST`: API endpoint (defaults to `https://api.agentops.ai`)
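
For example, to launch the server manually from a shell with both variables set (a quick sketch for testing outside of an MCP client; `HOST` can be omitted to use the default):

```bash
# Launch the AgentOps MCP server with explicit configuration.
# HOST is optional and shown here only to illustrate overriding the default.
AGENTOPS_API_KEY="your-agentops-api-key" \
HOST="https://api.agentops.ai" \
npx agentops-mcp
```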
57 changes: 0 additions & 57 deletions mcp/.dockerignore

This file was deleted.

5 changes: 0 additions & 5 deletions mcp/.env.example

This file was deleted.
