@@ -73,7 +73,7 @@ The Model Context Protocol allows applications to provide context for LLMs in a
 
 ### Adding MCP to your Python project
 
-We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects.
+We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects.
 
 If you haven't created a uv-managed project yet, create one:
 
@@ -89,6 +89,7 @@ If you haven't created a uv-managed project yet, create one:
 ```
 
 Alternatively, for projects using pip for dependencies:
+
 ```bash
 pip install "mcp[cli]"
 ```
@@ -128,11 +129,13 @@ def get_greeting(name: str) -> str:
 ```
 
 You can install this server in [Claude Desktop](https://claude.ai/download) and interact with it right away by running:
+
 ```bash
 mcp install server.py
 ```
 
 Alternatively, you can test it with the MCP Inspector:
+
 ```bash
 mcp dev server.py
 ```
@@ -245,6 +248,67 @@ async def fetch_weather(city: str) -> str:
         return response.text
 ```
 
+#### Output Schemas
+
+Tools automatically generate JSON Schema definitions for their return types, helping LLMs understand the structure of the data they'll receive:
+
+```python
+from pydantic import BaseModel
+
+# Tools with primitive return types
+@mcp.tool()
+def get_temperature(city: str) -> float:
+    """Get the current temperature for a city"""
+    # In a real implementation, this would fetch actual weather data
+    return 72.5
+
+# Tools with dictionary return types
+@mcp.tool()
+def get_user(user_id: int) -> dict:
+    """Get user information by ID"""
+    return {"id": user_id, "name": "John Doe", "email": "john@example.com"}
+
+# Using Pydantic models for structured output
+class WeatherData(BaseModel):
+    temperature: float
+    humidity: float
+    conditions: str
+
+@mcp.tool()
+def get_weather_data(city: str) -> WeatherData:
+    """Get structured weather data for a city"""
+    # In a real implementation, this would fetch actual weather data
+    return WeatherData(
+        temperature=72.5,
+        humidity=65.0,
+        conditions="Partly cloudy"
+    )
+
+# Complex nested models
+class Location(BaseModel):
+    city: str
+    country: str
+    coordinates: tuple[float, float]
+
+class WeatherForecast(BaseModel):
+    current: WeatherData
+    location: Location
+    forecast: list[WeatherData]
+
+@mcp.tool()
+def get_weather_forecast(city: str) -> WeatherForecast:
+    """Get detailed weather forecast for a city"""
+    # In a real implementation, this would fetch actual forecast data
+    return WeatherForecast(
+        current=WeatherData(temperature=72.5, humidity=65.0, conditions="Partly cloudy"),
+        location=Location(city=city, country="USA", coordinates=(37.7749, -122.4194)),
+        forecast=[
+            WeatherData(temperature=75.0, humidity=62.0, conditions="Sunny"),
+            WeatherData(temperature=68.0, humidity=80.0, conditions="Rainy")
+        ]
+    )
+```
+
 ### Prompts
 
 Prompts are reusable templates that help LLMs interact with your server effectively:
@@ -381,6 +445,7 @@ if __name__ == "__main__":
 ```
 
 Run it with:
+
 ```bash
 python server.py
 # or
@@ -458,18 +523,17 @@ app.mount("/math", math.mcp.streamable_http_app())
 ```
 
 For low-level server implementations using Streamable HTTP, see:
+
 - Stateful server: [`examples/servers/simple-streamablehttp/`](examples/servers/simple-streamablehttp/)
 - Stateless server: [`examples/servers/simple-streamablehttp-stateless/`](examples/servers/simple-streamablehttp-stateless/)
 
-
-
 The streamable HTTP transport supports:
+
 - Stateful and stateless operation modes
 - Resumability with event stores
-- JSON or SSE response formats
+- JSON or SSE response formats
 - Better scalability for multi-node deployments
 
-
 ### Mounting to an Existing ASGI Server
 
 > **Note**: SSE transport is being superseded by [Streamable HTTP transport](https://modelcontextprotocol.io/specification/2025-03-26/basic/transports#streamable-http).
@@ -637,6 +701,7 @@ async def query_db(name: str, arguments: dict) -> list:
 ```
 
 The lifespan API provides:
+
 - A way to initialize resources when the server starts and clean them up when it stops
 - Access to initialized resources through the request context in handlers
 - Type-safe context passing between lifespan and request handlers