@hardikjshah hardikjshah commented Jun 12, 2025

Add client-side resources for:

  • files
  • vector_stores
  • embeddings

Installed this package and ran the llama-stack integration tests:

# files 
pytest -sv --stack-config=http://localhost:8321  tests/integration/files/test_files.py

# embeddings 
pytest -sv --stack-config=http://localhost:8321  tests/integration/inference/test_openai_embeddings.py --embedding-model all-MiniLM-L6-v2

# vector stores 
pytest -sv --stack-config=http://localhost:8321 tests/integration/vector_io/test_openai_vector_stores.py --embedding-model all-MiniLM-L6-v2
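For reference, a minimal sketch of the OpenAI-compatible request bodies these three resources exercise. Field names follow the public OpenAI API; the endpoint paths in the docstrings and the helper names are hypothetical illustrations, not taken from the client source:

```python
# Hypothetical request-body builders for the three new client resources.
# Field names mirror the OpenAI API; paths/helpers are illustrative only.

def embeddings_request(model: str, texts: list[str]) -> dict:
    """Body for POST /v1/embeddings."""
    return {"model": model, "input": texts}

def vector_store_request(name: str) -> dict:
    """Body for POST /v1/vector_stores."""
    return {"name": name}

def file_upload_fields(purpose: str) -> dict:
    """Multipart form fields for POST /v1/files (the file content
    itself is sent as a separate multipart part)."""
    return {"purpose": purpose}

print(embeddings_request("all-MiniLM-L6-v2", ["hello"]))
```

The `--embedding-model all-MiniLM-L6-v2` flag in the commands above selects the model that would fill the `model` field here.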

@hardikjshah hardikjshah merged commit 7997543 into main Jun 12, 2025
2 checks passed
@hardikjshah hardikjshah deleted the openai_api_updates4 branch June 12, 2025 22:52
hardikjshah added a commit to llamastack/llama-stack that referenced this pull request Jun 12, 2025
llamastack/llama-stack-client-python#238 updated
llama-stack-client to also support OpenAI endpoints for embeddings,
files, and vector stores. This updates the tests to cover all configs:
openai sdk, llama stack sdk, and library-as-client.