forked from llamastack/llama-stack
feat: OpenAPI Generator for client SDK #2
Open
aegeiger wants to merge 6 commits into release-0.4.x from feature/release-0.4.x-openapi
Conversation
force-pushed from d2a9f4e to 3c192d9
force-pushed from 382b324 to 455510d
force-pushed from a1dfb7f to 4a60452
Add comprehensive build tooling for generating the Python SDK from OpenAPI specs, with support for hierarchical API structures:

- build-hierarchical-sdk.sh: main build script orchestrating the full pipeline
- generate-python-sdk.sh: OpenAPI Generator wrapper with custom config
- process_openapi_hierarchy.py: extracts tag hierarchies and adds x-child-tags
- patch_api_hierarchy.py: post-generation patching for the nested API structure
- merge_stainless_to_openapi.py: merges the Stainless spec into the OpenAPI Generator spec
- openapi-config.json: OpenAPI Generator configuration
- patches.yml: API hierarchy patches for LlamaStackClient wiring

This infrastructure enables generating SDKs with nested API access patterns like client.chat.completions.create() while maintaining OpenAPI Generator compatibility.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
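The tag-hierarchy step described above can be sketched with a small helper. This is an illustrative sketch only, not the actual process_openapi_hierarchy.py: it assumes child tags use dotted names like "chat.completions" and records them on the parent tag under x-child-tags.

```python
# Illustrative sketch: group dotted OpenAPI tags (e.g. "chat.completions")
# under their parent tag via an x-child-tags extension. The function name,
# spec layout, and dotted-tag convention are assumptions for this example.
from collections import defaultdict

def add_child_tags(spec: dict) -> dict:
    children = defaultdict(list)
    for tag in spec.get("tags", []):
        name = tag["name"]
        if "." in name:
            parent, _, child = name.partition(".")
            children[parent].append(child)
    for tag in spec.get("tags", []):
        if tag["name"] in children:
            # Attach the sorted child names to the parent tag.
            tag["x-child-tags"] = sorted(children[tag["name"]])
    return spec

spec = {"tags": [{"name": "chat"}, {"name": "chat.completions"}]}
print(add_child_tags(spec)["tags"][0])  # {'name': 'chat', 'x-child-tags': ['completions']}
```

A post-generation patcher can then read x-child-tags to decide which generated API classes to nest under a namespace attribute.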
Enhance OpenAPI templates to support the hierarchical API structure and improved streaming.

Template Improvements:
- Add a pascal_to_snake_case utility to deduplicate class-to-module conversion
- Add a LlamaStackClient wrapper with nested API access (chat.completions.*)
- Add x-child-tags documentation showing nested API attributes
- Update examples to use LlamaStackClient instead of individual API classes
- Improve _create_event_stream for better SSE handling with proper typing

Model Template Enhancements:
- Enhance anyOf/oneOf deserialization with discriminator support
- Add a fallback to from_dict for streaming chunks
- Use pascal_to_snake_case for consistent module resolution

Streaming Improvements:
- Add a Stream template for server-sent events
- Fix return types and parameter passing in _create_event_stream
- Add a proper decoder for streaming response types

This enables the SDK to support both flat (client.chat_completions.create) and hierarchical (client.chat.completions.create) access patterns.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
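The pascal_to_snake_case utility mentioned above typically looks something like the following. This is a minimal sketch of the conversion it performs; the template's actual implementation may differ.

```python
import re

# Illustrative pascal_to_snake_case helper: converts a generated class name
# like "ChatCompletionChunk" into its module name "chat_completion_chunk".
def pascal_to_snake_case(name: str) -> str:
    # Insert an underscore before a capital that starts a new word,
    # then before any capital following a lowercase letter or digit.
    s = re.sub(r"(.)([A-Z][a-z]+)", r"\1_\2", name)
    s = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", s)
    return s.lower()

print(pascal_to_snake_case("ChatCompletionChunk"))  # chat_completion_chunk
```

Centralizing this in one helper keeps model deserialization and module resolution consistent, which matters when anyOf/oneOf handling has to locate a model's module from its class name.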
Add async/await support to the generated SDK with full async versions of all API clients:

- async_api_client.mustache: async HTTP client built on httpx
- async_api_response.mustache: async response wrapper
- async_stream.mustache: async streaming support for SSE

This allows users to use the SDK in async contexts with proper async/await patterns while maintaining the same API surface as the synchronous client.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
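The core of the async SSE streaming pattern can be sketched with the standard library alone. This is not the async_stream.mustache output (which wraps an httpx response); it only illustrates parsing "data:" lines from an async line source, with the common "[DONE]" terminator convention assumed.

```python
# Stdlib-only sketch of async SSE parsing: consume "data:" lines from an
# async iterator and yield decoded JSON events. Names here are illustrative.
import asyncio
import json
from typing import AsyncIterator

async def iter_sse_events(lines: AsyncIterator[str]) -> AsyncIterator[dict]:
    async for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload == "[DONE]":  # assumed terminator convention
                return
            yield json.loads(payload)

async def fake_stream() -> AsyncIterator[str]:
    # Stand-in for an async HTTP response's line iterator.
    for raw in ['data: {"delta": "Hel"}', 'data: {"delta": "lo"}', "data: [DONE]"]:
        yield raw

async def main() -> list[dict]:
    return [event async for event in iter_sse_events(fake_stream())]

print(asyncio.run(main()))  # [{'delta': 'Hel'}, {'delta': 'lo'}]
```

In the generated SDK the line source would be the async HTTP client's response stream, and each decoded event would be deserialized into the chunk model rather than left as a dict.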
- Export Agent and AgentEventLogger from lib/__init__.py
- Add tools/__init__.py to export get_oauth_token_for_mcp_server

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Complete the SDK template suite and fix related server/test issues.

Template Additions:
- Add _exceptions.mustache, _types.mustache, _version.mustache
- Update README templates with hierarchical API examples
- Enhance configuration.mustache with better defaults
- Update partial templates for consistency

Server Fixes:
- Fix the error response format to match the OpenAPI spec (remove wrapper)
- Update library_client for the new SDK structure

Test Updates:
- Update integration tests to use the new LlamaStackClient
- Fix imports and client initialization patterns
- Update embeddings, rerank, tools, and vector_io tests

Stainless Config:
- Update the config for compatibility with OpenAPI Generator output

These changes complete the migration to the hierarchical SDK structure while maintaining backward compatibility.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
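The dual access pattern this migration preserves can be illustrated with a thin namespace object: the hierarchical path and the flat attribute resolve to the same underlying API instance. The class bodies below are stand-ins, not the generated code.

```python
# Sketch of flat + hierarchical access on one client. The real generated
# classes differ; this only shows the wiring idea behind patches.yml.
class _Completions:
    def create(self, **kwargs):
        return {"called_with": kwargs}  # stand-in for the real API call

class _ChatNamespace:
    def __init__(self, completions: _Completions):
        self.completions = completions

class LlamaStackClient:
    def __init__(self):
        self.chat_completions = _Completions()              # flat access
        self.chat = _ChatNamespace(self.chat_completions)   # hierarchical access

client = LlamaStackClient()
# Both paths hit the same object, so behavior cannot drift between them.
print(client.chat.completions is client.chat_completions)  # True
```

Sharing one instance between the two attribute paths is what makes backward compatibility cheap: there is no second code path to keep in sync.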
Linting fixes:
- UP047: Use modern type-parameter syntax for generic functions
- UP038/UP040: Use X | Y syntax instead of Union/tuple in type hints
- C414: Remove unnecessary list() calls in sorted()
- RUF001: Replace an ambiguous Unicode character with ASCII
- F841: Remove an unused variable
- F821: Add a missing statistics import
- W291: Remove trailing whitespace

Also removed templates/lib (a duplicate of templates/python/lib).

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
force-pushed from 455510d to 6340d30
What does this PR do?
Includes all the code needed to generate a llama_stack_client SDK with openapi-generator-cli directly from the llama-stack repo.
Test Plan
After generating a client, uninstall the llama_stack_client package, run uv pip install ./client-sdks/openapi/sdks/python, and then run the integration tests.