2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
 {
-  ".": "0.3.1-alpha.2"
+  ".": "0.4.0-alpha.1"
 }
4 changes: 2 additions & 2 deletions .stats.yml
@@ -1,4 +1,4 @@
-configured_endpoints: 104
+configured_endpoints: 111
 openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/llamastack%2Fllama-stack-client-35c6569e5e9fcc85084c9728eb7fc7c5908297fcc77043d621d25de3c850a990.yml
 openapi_spec_hash: 0f95bbeee16f3205d36ec34cfa62c711
-config_hash: a3829dbdaa491194d01f399784d532cd
+config_hash: ef275cc002a89629459fd73d0cf9cba9
27 changes: 27 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,32 @@
# Changelog

## 0.4.0-alpha.1 (2025-10-30)

Full Changelog: [v0.3.1-alpha.2...v0.4.0-alpha.1](https://github.com/llamastack/llama-stack-client-python/compare/v0.3.1-alpha.2...v0.4.0-alpha.1)

### ⚠ BREAKING CHANGES

* **api:** /v1/inspect only lists v1 apis by default
* **api:** /v1/inspect only lists v1 apis by default

### Features

* **api:** Adding prompts API to stainless config ([114198b](https://github.com/llamastack/llama-stack-client-python/commit/114198bef4244ec27f7e163beb2e554da0dbd213))
* **api:** manual updates??! ([d8ab6cb](https://github.com/llamastack/llama-stack-client-python/commit/d8ab6cb77267af53f3f2e9ff3ebaab9364a754c7))


### Bug Fixes

* clean pre-commit ([799b908](https://github.com/llamastack/llama-stack-client-python/commit/799b9084266c390604829dd1eef483bf3b941134))
* **client:** close streams without requiring full consumption ([d861708](https://github.com/llamastack/llama-stack-client-python/commit/d8617084062acbb81c26b6c22ea613e397aa969b))
* **headers:** add a newline ([55a8efc](https://github.com/llamastack/llama-stack-client-python/commit/55a8efc0a60f44c8c93e18b2b60215f051405be4))


### Chores

* **api:** /v1/inspect only lists v1 apis by default ([209de45](https://github.com/llamastack/llama-stack-client-python/commit/209de45599de19183a1cd14bc3567e34d2374184))
* **api:** /v1/inspect only lists v1 apis by default ([b36e2ab](https://github.com/llamastack/llama-stack-client-python/commit/b36e2ab8661e4913838c2cb4501156b290876da0))

## 0.3.1-alpha.2 (2025-10-27)

Full Changelog: [v0.3.1-alpha.1...v0.3.1-alpha.2](https://github.com/llamastack/llama-stack-client-python/compare/v0.3.1-alpha.1...v0.3.1-alpha.2)
23 changes: 23 additions & 0 deletions api.md
@@ -102,6 +102,29 @@ Methods:

- <code title="get /v1/responses/{response_id}/input_items">client.responses.input_items.<a href="./src/llama_stack_client/resources/responses/input_items.py">list</a>(response_id, \*\*<a href="src/llama_stack_client/types/responses/input_item_list_params.py">params</a>) -> <a href="./src/llama_stack_client/types/responses/input_item_list_response.py">InputItemListResponse</a></code>

# Prompts

Types:

```python
from llama_stack_client.types import ListPromptsResponse, Prompt, PromptListResponse
```

Methods:

- <code title="post /v1/prompts">client.prompts.<a href="./src/llama_stack_client/resources/prompts/prompts.py">create</a>(\*\*<a href="src/llama_stack_client/types/prompt_create_params.py">params</a>) -> <a href="./src/llama_stack_client/types/prompt.py">Prompt</a></code>
- <code title="get /v1/prompts/{prompt_id}">client.prompts.<a href="./src/llama_stack_client/resources/prompts/prompts.py">retrieve</a>(prompt_id, \*\*<a href="src/llama_stack_client/types/prompt_retrieve_params.py">params</a>) -> <a href="./src/llama_stack_client/types/prompt.py">Prompt</a></code>
- <code title="post /v1/prompts/{prompt_id}">client.prompts.<a href="./src/llama_stack_client/resources/prompts/prompts.py">update</a>(prompt_id, \*\*<a href="src/llama_stack_client/types/prompt_update_params.py">params</a>) -> <a href="./src/llama_stack_client/types/prompt.py">Prompt</a></code>
- <code title="get /v1/prompts">client.prompts.<a href="./src/llama_stack_client/resources/prompts/prompts.py">list</a>() -> <a href="./src/llama_stack_client/types/prompt_list_response.py">PromptListResponse</a></code>
- <code title="delete /v1/prompts/{prompt_id}">client.prompts.<a href="./src/llama_stack_client/resources/prompts/prompts.py">delete</a>(prompt_id) -> None</code>
- <code title="post /v1/prompts/{prompt_id}/set-default-version">client.prompts.<a href="./src/llama_stack_client/resources/prompts/prompts.py">set_default_version</a>(prompt_id, \*\*<a href="src/llama_stack_client/types/prompt_set_default_version_params.py">params</a>) -> <a href="./src/llama_stack_client/types/prompt.py">Prompt</a></code>

## Versions

Methods:

- <code title="get /v1/prompts/{prompt_id}/versions">client.prompts.versions.<a href="./src/llama_stack_client/resources/prompts/versions.py">list</a>(prompt_id) -> <a href="./src/llama_stack_client/types/prompt_list_response.py">PromptListResponse</a></code>

# Conversations

Types:
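For orientation, here is a minimal usage sketch of the Prompts endpoints added to api.md above; the server URL, the `prompt` payload field, the `prompt_id` attribute, and the `version` argument are illustrative assumptions, not taken from this PR.

```python
# Illustrative sketch only — argument and attribute names are assumptions, not confirmed by this PR.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")  # assumed local Llama Stack server

# POST /v1/prompts — create a prompt (the `prompt` keyword is a guess at the payload field)
created = client.prompts.create(prompt="Summarize the following text: {{ text }}")

# GET /v1/prompts and GET /v1/prompts/{prompt_id}
all_prompts = client.prompts.list()
fetched = client.prompts.retrieve(created.prompt_id)  # `prompt_id` attribute assumed

# GET /v1/prompts/{prompt_id}/versions and POST /v1/prompts/{prompt_id}/set-default-version
versions = client.prompts.versions.list(created.prompt_id)
client.prompts.set_default_version(created.prompt_id, version=1)  # `version` parameter assumed

# DELETE /v1/prompts/{prompt_id}
client.prompts.delete(created.prompt_id)
```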
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
 [project]
 name = "llama_stack_client"
-version = "0.3.1-alpha.2"
+version = "0.4.0-alpha.1"
 description = "The official Python library for the llama-stack-client API"
 dynamic = ["readme"]
 license = "MIT"
38 changes: 38 additions & 0 deletions src/llama_stack_client/_client.py
@@ -48,6 +48,7 @@
routes,
safety,
inspect,
prompts,
scoring,
shields,
providers,
@@ -80,6 +81,7 @@
from .resources.completions import CompletionsResource, AsyncCompletionsResource
from .resources.moderations import ModerationsResource, AsyncModerationsResource
from .resources.models.models import ModelsResource, AsyncModelsResource
from .resources.prompts.prompts import PromptsResource, AsyncPromptsResource
from .resources.scoring_functions import ScoringFunctionsResource, AsyncScoringFunctionsResource
from .resources.responses.responses import ResponsesResource, AsyncResponsesResource
from .resources.synthetic_data_generation import (
@@ -183,6 +185,12 @@ def responses(self) -> ResponsesResource:

         return ResponsesResource(self)

+    @cached_property
+    def prompts(self) -> PromptsResource:
+        from .resources.prompts import PromptsResource
+
+        return PromptsResource(self)
+
     @cached_property
     def conversations(self) -> ConversationsResource:
         from .resources.conversations import ConversationsResource
@@ -493,6 +501,12 @@ def responses(self) -> AsyncResponsesResource:

         return AsyncResponsesResource(self)

+    @cached_property
+    def prompts(self) -> AsyncPromptsResource:
+        from .resources.prompts import AsyncPromptsResource
+
+        return AsyncPromptsResource(self)
+
     @cached_property
     def conversations(self) -> AsyncConversationsResource:
         from .resources.conversations import AsyncConversationsResource
@@ -752,6 +766,12 @@ def responses(self) -> responses.ResponsesResourceWithRawResponse:

         return ResponsesResourceWithRawResponse(self._client.responses)

+    @cached_property
+    def prompts(self) -> prompts.PromptsResourceWithRawResponse:
+        from .resources.prompts import PromptsResourceWithRawResponse
+
+        return PromptsResourceWithRawResponse(self._client.prompts)
+
     @cached_property
     def conversations(self) -> conversations.ConversationsResourceWithRawResponse:
         from .resources.conversations import ConversationsResourceWithRawResponse
@@ -897,6 +917,12 @@ def responses(self) -> responses.AsyncResponsesResourceWithRawResponse:

         return AsyncResponsesResourceWithRawResponse(self._client.responses)

+    @cached_property
+    def prompts(self) -> prompts.AsyncPromptsResourceWithRawResponse:
+        from .resources.prompts import AsyncPromptsResourceWithRawResponse
+
+        return AsyncPromptsResourceWithRawResponse(self._client.prompts)
+
     @cached_property
     def conversations(self) -> conversations.AsyncConversationsResourceWithRawResponse:
         from .resources.conversations import AsyncConversationsResourceWithRawResponse
@@ -1044,6 +1070,12 @@ def responses(self) -> responses.ResponsesResourceWithStreamingResponse:

         return ResponsesResourceWithStreamingResponse(self._client.responses)

+    @cached_property
+    def prompts(self) -> prompts.PromptsResourceWithStreamingResponse:
+        from .resources.prompts import PromptsResourceWithStreamingResponse
+
+        return PromptsResourceWithStreamingResponse(self._client.prompts)
+
     @cached_property
     def conversations(self) -> conversations.ConversationsResourceWithStreamingResponse:
         from .resources.conversations import ConversationsResourceWithStreamingResponse
@@ -1191,6 +1223,12 @@ def responses(self) -> responses.AsyncResponsesResourceWithStreamingResponse:

         return AsyncResponsesResourceWithStreamingResponse(self._client.responses)

+    @cached_property
+    def prompts(self) -> prompts.AsyncPromptsResourceWithStreamingResponse:
+        from .resources.prompts import AsyncPromptsResourceWithStreamingResponse
+
+        return AsyncPromptsResourceWithStreamingResponse(self._client.prompts)
+
     @cached_property
     def conversations(self) -> conversations.AsyncConversationsResourceWithStreamingResponse:
         from .resources.conversations import AsyncConversationsResourceWithStreamingResponse
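The `_client.py` hunks above all follow the same wiring pattern; condensed into a standalone sketch (simplified from the diff, not the full generated class), it looks like this:

```python
# Condensed sketch of the resource-wiring pattern added above (simplified from the diff).
from functools import cached_property


class LlamaStackClient:
    @cached_property
    def prompts(self) -> "PromptsResource":
        # The import happens on first attribute access, so constructing the client does
        # not pay for resources the caller never uses; cached_property then memoizes the
        # constructed resource object for every later access.
        from .resources.prompts import PromptsResource

        return PromptsResource(self)
```

The raw-response and streaming wrappers repeat the same pattern, swapping in the `*WithRawResponse` and `*WithStreamingResponse` variants.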
10 changes: 4 additions & 6 deletions src/llama_stack_client/_streaming.py
@@ -63,9 +63,8 @@ def __stream__(self) -> Iterator[_T]:
         for sse in iterator:
             yield process_data(data=sse.json(), cast_to=cast_to, response=response)

-        # Ensure the entire stream is consumed
-        for _sse in iterator:
-            ...
+        # As we might not fully consume the response stream, we need to close it explicitly
+        response.close()

     def __enter__(self) -> Self:
         return self
@@ -127,9 +126,8 @@ async def __stream__(self) -> AsyncIterator[_T]:
         async for sse in iterator:
             yield process_data(data=sse.json(), cast_to=cast_to, response=response)

-        # Ensure the entire stream is consumed
-        async for _sse in iterator:
-            ...
+        # As we might not fully consume the response stream, we need to close it explicitly
+        await response.aclose()

     async def __aenter__(self) -> Self:
         return self
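This `_streaming.py` change matches the changelog entry "close streams without requiring full consumption": instead of draining leftover SSE events, the generator now closes the HTTP response explicitly. A brief usage sketch of a stream that is not fully consumed (the resource, method, and arguments below are placeholders, not from this PR):

```python
# Illustrative only — the method and its arguments are placeholders, not from this PR.
with client.responses.create(model="llama-3", input="hello", stream=True) as stream:
    for event in stream:
        handle(event)  # hypothetical event handler
        break  # stop early; the stream/response is closed without reading the remaining events
```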
14 changes: 14 additions & 0 deletions src/llama_stack_client/resources/__init__.py
@@ -78,6 +78,14 @@
     InspectResourceWithStreamingResponse,
     AsyncInspectResourceWithStreamingResponse,
 )
+from .prompts import (
+    PromptsResource,
+    AsyncPromptsResource,
+    PromptsResourceWithRawResponse,
+    AsyncPromptsResourceWithRawResponse,
+    PromptsResourceWithStreamingResponse,
+    AsyncPromptsResourceWithStreamingResponse,
+)
 from .scoring import (
     ScoringResource,
     AsyncScoringResource,
@@ -216,6 +224,12 @@
     "AsyncResponsesResourceWithRawResponse",
     "ResponsesResourceWithStreamingResponse",
     "AsyncResponsesResourceWithStreamingResponse",
+    "PromptsResource",
+    "AsyncPromptsResource",
+    "PromptsResourceWithRawResponse",
+    "AsyncPromptsResourceWithRawResponse",
+    "PromptsResourceWithStreamingResponse",
+    "AsyncPromptsResourceWithStreamingResponse",
     "ConversationsResource",
     "AsyncConversationsResource",
     "ConversationsResourceWithRawResponse",
33 changes: 33 additions & 0 deletions src/llama_stack_client/resources/prompts/__init__.py
@@ -0,0 +1,33 @@
# File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

from .prompts import (
    PromptsResource,
    AsyncPromptsResource,
    PromptsResourceWithRawResponse,
    AsyncPromptsResourceWithRawResponse,
    PromptsResourceWithStreamingResponse,
    AsyncPromptsResourceWithStreamingResponse,
)
from .versions import (
    VersionsResource,
    AsyncVersionsResource,
    VersionsResourceWithRawResponse,
    AsyncVersionsResourceWithRawResponse,
    VersionsResourceWithStreamingResponse,
    AsyncVersionsResourceWithStreamingResponse,
)

__all__ = [
    "VersionsResource",
    "AsyncVersionsResource",
    "VersionsResourceWithRawResponse",
    "AsyncVersionsResourceWithRawResponse",
    "VersionsResourceWithStreamingResponse",
    "AsyncVersionsResourceWithStreamingResponse",
    "PromptsResource",
    "AsyncPromptsResource",
    "PromptsResourceWithRawResponse",
    "AsyncPromptsResourceWithRawResponse",
    "PromptsResourceWithStreamingResponse",
    "AsyncPromptsResourceWithStreamingResponse",
]