From 68b5e0a7b71bc2848dc1cc760c65268fd88d57b8 Mon Sep 17 00:00:00 2001 From: Xi Yan Date: Mon, 10 Mar 2025 11:28:00 -0700 Subject: [PATCH 01/10] autogen doc --- docs/cli_reference.md | 1063 +++++++++++++++++++++++++++++++++------- scripts/gen_cli_doc.py | 80 +++ 2 files changed, 969 insertions(+), 174 deletions(-) create mode 100644 scripts/gen_cli_doc.py diff --git a/docs/cli_reference.md b/docs/cli_reference.md index 20c06029..18b44d13 100644 --- a/docs/cli_reference.md +++ b/docs/cli_reference.md @@ -1,252 +1,967 @@ -# CLI Reference +# cli -You may use the `llama-stack-client` to query information about the distribution. +Welcome to the LlamaStackClient CLI -#### `llama-stack-client` -```bash -$ llama-stack-client -h +### Usage -usage: llama-stack-client [-h] {models,memory_banks,shields} ... +``` +Usage: cli [OPTIONS] COMMAND [ARGS]... +``` -Welcome to the LlamaStackClient CLI +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--version**: Show the version and exit. [default: False] + +* **--endpoint**: Llama Stack distribution endpoint [default: ] + +* **--api-key**: Llama Stack distribution API key [default: ] + +* **--config**: Path to config file + +### Commands + +* **configure**: Configure Llama Stack Client CLI. + +* **datasets**: Manage datasets. + +* **eval**: Run evaluation tasks. + +* **eval_tasks**: Manage evaluation tasks. + +* **inference**: Inference (chat). + +* **inspect**: Inspect server configuration. + +* **models**: Manage GenAI models. + +* **post_training**: Post-training. + +* **providers**: Manage API providers. + +* **scoring_functions**: Manage scoring functions. + +* **shields**: Manage safety shield services. + +* **toolgroups**: Manage available tool groups. + +* **vector_dbs**: Manage vector databases. + + + +## configure + +Configure Llama Stack Client CLI. + +### Usage + +``` +Usage: cli configure [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--endpoint**: Llama Stack distribution endpoint [default: ] + +* **--api-key**: Llama Stack distribution API key [default: ] + + + +## datasets + +Manage datasets. + +### Usage + +``` +Usage: cli datasets [OPTIONS] COMMAND [ARGS]... +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **list**: Show available datasets on distribution... + +* **register**: Create a new dataset + + + +### list + +Show available datasets on distribution endpoint + +### Usage + +``` +Usage: cli datasets list [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + + + +### register + +Create a new dataset + +### Usage + +``` +Usage: cli datasets register [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--dataset-id**: Id of the dataset + +* **--provider-id**: Provider ID for the dataset + +* **--provider-dataset-id**: Provider's dataset ID + +* **--metadata**: Metadata of the dataset + +* **--url**: URL of the dataset + +* **--dataset-path**: Local file path to the dataset. If specified, upload dataset via URL + +* **--schema**: JSON schema of the dataset + + + +## eval + +Run evaluation tasks. + +### Usage + +``` +Usage: cli eval [OPTIONS] COMMAND [ARGS]... +``` + +### Options + +* **-h, --help**: Show this message and exit. 
[default: False] + +### Commands + +* **run-benchmark**: Run a evaluation benchmark task + +* **run-scoring**: Run scoring from application datasets + + + +### run-benchmark + +Run a evaluation benchmark task + +### Usage + +``` +Usage: cli eval run-benchmark [OPTIONS] BENCHMARK_IDS... +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--model-id**: model id to run the benchmark eval on + +* **--output-dir**: Path to the dump eval results output directory + +* **--num-examples**: Number of examples to evaluate on, useful for debugging + +* **--temperature**: temperature in the sampling params to run generation [default: 0.0] + +* **--max-tokens**: max-tokens in the sampling params to run generation [default: 4096] + +* **--top-p**: top-p in the sampling params to run generation [default: 0.9] + +* **--repeat-penalty**: repeat-penalty in the sampling params to run generation [default: 1.0] + +* **--visualize**: Visualize evaluation results after completion [default: False] + +### Arguments + +* **BENCHMARK_IDS** + + + +### run-scoring + +Run scoring from application datasets + +### Usage + +``` +Usage: cli eval run-scoring [OPTIONS] SCORING_FUNCTION_IDS... +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--dataset-id**: Pre-registered dataset_id to score (from llama-stack-client datasets list) + +* **--dataset-path**: Path to the dataset file to score + +* **--scoring-params-config**: Path to the scoring params config file in JSON format + +* **--num-examples**: Number of examples to evaluate on, useful for debugging + +* **--output-dir**: Path to the dump eval results output directory -options: - -h, --help show this help message and exit +* **--visualize**: Visualize evaluation results after completion [default: False] -subcommands: - {models,memory_banks,shields} +### Arguments + +* **SCORING_FUNCTION_IDS** + + + +## eval-tasks + +Manage evaluation tasks. + +### Usage + +``` +Usage: cli eval-tasks [OPTIONS] COMMAND [ARGS]... ``` -#### `llama-stack-client configure` -```bash -$ llama-stack-client configure -> Enter the host name of the Llama Stack distribution server: localhost -> Enter the port number of the Llama Stack distribution server: 5000 -Done! You can now use the Llama Stack Client CLI with endpoint http://localhost:5000 +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **list**: Show available eval tasks on distribution... + +* **register**: Register a new eval task + + + +### list + +Show available eval tasks on distribution endpoint + +### Usage + ``` +Usage: cli eval-tasks list [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + + + +### register + +Register a new eval task + +### Usage + +``` +Usage: cli eval-tasks register [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] +* **--eval-task-id**: ID of the eval task + +* **--dataset-id**: ID of the dataset to evaluate + +* **--scoring-functions**: Scoring functions to use for evaluation + +* **--provider-id**: Provider ID for the eval task + +* **--provider-eval-task-id**: Provider's eval task ID + +* **--metadata**: Metadata for the eval task in JSON format + + + +## inference + +Inference (chat). + +### Usage -#### `llama-stack-client providers list` -```bash -$ llama-stack-client providers list ``` +Usage: cli inference [OPTIONS] COMMAND [ARGS]... 
``` -+-----------+----------------+-----------------+ -| API | Provider ID | Provider Type | -+===========+================+=================+ -| scoring | meta0 | meta-reference | -+-----------+----------------+-----------------+ -| datasetio | meta0 | meta-reference | -+-----------+----------------+-----------------+ -| inference | tgi0 | remote::tgi | -+-----------+----------------+-----------------+ -| memory | meta-reference | meta-reference | -+-----------+----------------+-----------------+ -| agents | meta-reference | meta-reference | -+-----------+----------------+-----------------+ -| telemetry | meta-reference | meta-reference | -+-----------+----------------+-----------------+ -| safety | meta-reference | meta-reference | -+-----------+----------------+-----------------+ + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **chat-completion**: Show available inference chat completion... + + + +### chat-completion + +Show available inference chat completion endpoints on distribution endpoint + +### Usage + ``` +Usage: cli inference chat-completion [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--message**: Message + +* **--stream**: Streaming [default: False] + +* **--session**: Start a Chat Session [default: False] + +* **--model-id**: Model ID + + + +## inspect + +Inspect server configuration. -#### `llama-stack-client models list` -```bash -$ llama-stack-client models list +### Usage + +``` +Usage: cli inspect [OPTIONS] COMMAND [ARGS]... +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **version**: Show available providers on distribution... + + + +### version + +Show available providers on distribution endpoint + +### Usage + +``` +Usage: cli inspect version [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + + + +## models + +Manage GenAI models. + +### Usage + +``` +Usage: cli models [OPTIONS] COMMAND [ARGS]... +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **get**: Show available llama models at distribution... + +* **list**: Show available llama models at distribution... + +* **register**: Register a new model at distribution endpoint + +* **unregister**: Unregister a model from distribution endpoint + + + +### get + +Show available llama models at distribution endpoint + +### Usage + +``` +Usage: cli models get [OPTIONS] MODEL_ID +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Arguments + +* **MODEL_ID** + + + +### list + +Show available llama models at distribution endpoint + +### Usage + +``` +Usage: cli models list [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + + + +### register + +Register a new model at distribution endpoint + +### Usage + +``` +Usage: cli models register [OPTIONS] MODEL_ID ``` + +### Options + +* **-h, --help**: Show this message and exit. 
[default: False] + +* **--provider-id**: Provider ID for the model + +* **--provider-model-id**: Provider's model ID + +* **--metadata**: JSON metadata for the model + +### Arguments + +* **MODEL_ID** + + + +### unregister + +Unregister a model from distribution endpoint + +### Usage + ``` -+----------------------+----------------------+---------------+----------------------------------------------------------+ -| identifier | llama_model | provider_id | metadata | -+======================+======================+===============+==========================================================+ -| Llama3.1-8B-Instruct | Llama3.1-8B-Instruct | tgi0 | {'huggingface_repo': 'meta-llama/Llama-3.1-8B-Instruct'} | -+----------------------+----------------------+---------------+----------------------------------------------------------+ +Usage: cli models unregister [OPTIONS] MODEL_ID ``` -#### `llama-stack-client models get` -```bash -$ llama-stack-client models get Llama3.1-8B-Instruct +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Arguments + +* **MODEL_ID** + + + +## post-training + +Post-training. + +### Usage + ``` +Usage: cli post-training [OPTIONS] COMMAND [ARGS]... +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **artifacts**: Get the training artifacts of a specific post... + +* **cancel**: Cancel the training job + +* **list**: Show the list of available post training jobs + +* **status**: Show the status of a specific post training... + +* **supervised_fine_tune**: Kick off a supervised fine tune job + + + +### artifacts + +Get the training artifacts of a specific post training job + +### Usage ``` -+----------------------+----------------------+----------------------------------------------------------+---------------+ -| identifier | llama_model | metadata | provider_id | -+======================+======================+==========================================================+===============+ -| Llama3.1-8B-Instruct | Llama3.1-8B-Instruct | {'huggingface_repo': 'meta-llama/Llama-3.1-8B-Instruct'} | tgi0 | -+----------------------+----------------------+----------------------------------------------------------+---------------+ +Usage: cli post-training artifacts [OPTIONS] ``` +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--job-uuid**: Job UUID + -```bash -$ llama-stack-client models get Random-Model -Model RandomModel is not found at distribution endpoint host:port. Please ensure endpoint is serving specified model. +### cancel + +Cancel the training job + +### Usage + +``` +Usage: cli post-training cancel [OPTIONS] ``` -#### `llama-stack-client models register` +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--job-uuid**: Job UUID + + -```bash -$ llama-stack-client models register [--provider-id ] [--provider-model-id ] [--metadata ] +### list + +Show the list of available post training jobs + +### Usage + +``` +Usage: cli post-training list [OPTIONS] ``` -#### `llama-stack-client models update` +### Options + +* **-h, --help**: Show this message and exit. [default: False] + + + +### status + +Show the status of a specific post training job + +### Usage -```bash -$ llama-stack-client models update [--provider-id ] [--provider-model-id ] [--metadata ] ``` +Usage: cli post-training status [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. 
[default: False] + +* **--job-uuid**: Job UUID + -#### `llama-stack-client models delete` -```bash -$ llama-stack-client models delete +### supervised_fine_tune + +Kick off a supervised fine tune job + +### Usage + +``` +Usage: cli post-training supervised_fine_tune [OPTIONS] ``` -#### `llama-stack-client vector_dbs list` -```bash -$ llama-stack-client vector_dbs list +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--job-uuid**: Job UUID + +* **--model**: Model ID + +* **--algorithm-config**: Algorithm Config + +* **--training-config**: Training Config + +* **--checkpoint-dir**: Checkpoint Config + + + +## providers + +Manage API providers. + +### Usage + ``` +Usage: cli providers [OPTIONS] COMMAND [ARGS]... ``` -+--------------+----------------+---------------------+---------------+------------------------+ -| identifier | provider_id | provider_resource_id| vector_db_type| params | -+==============+================+=====================+===============+========================+ -| test_bank | meta-reference | test_bank | vector | embedding_model: all-MiniLM-L6-v2 - embedding_dimension: 384| -+--------------+----------------+---------------------+---------------+------------------------+ + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **list**: Show available providers on distribution... + + + +### list + +Show available providers on distribution endpoint + +### Usage + ``` +Usage: cli providers list [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + + -#### `llama-stack-client vector_dbs register` -```bash -$ llama-stack-client vector_dbs register [--provider-id ] [--provider-vector-db-id ] [--embedding-model ] [--embedding-dimension ] +## scoring-functions + +Manage scoring functions. + +### Usage + +``` +Usage: cli scoring-functions [OPTIONS] COMMAND [ARGS]... ``` -Options: -- `--provider-id`: Optional. Provider ID for the vector db -- `--provider-vector-db-id`: Optional. Provider's vector db ID -- `--embedding-model`: Optional. Embedding model to use. Default: "all-MiniLM-L6-v2" -- `--embedding-dimension`: Optional. Dimension of embeddings. Default: 384 +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **list**: Show available scoring functions on... + +* **register**: Register a new scoring function -#### `llama-stack-client vector_dbs unregister` -```bash -$ llama-stack-client vector_dbs unregister + + +### list + +Show available scoring functions on distribution endpoint + +### Usage + +``` +Usage: cli scoring-functions list [OPTIONS] ``` -#### `llama-stack-client shields list` -```bash -$ llama-stack-client shields list +### Options + +* **-h, --help**: Show this message and exit. [default: False] + + + +### register + +Register a new scoring function + +### Usage + ``` +Usage: cli scoring-functions register [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--scoring-fn-id**: Id of the scoring function + +* **--description**: Description of the scoring function + +* **--return-type**: Return type of the scoring function + +* **--provider-id**: Provider ID for the scoring function + +* **--provider-scoring-fn-id**: Provider's scoring function ID + +* **--params**: Parameters for the scoring function in JSON format + + + +## shields + +Manage safety shield services. 
+ +### Usage ``` -+--------------+----------+----------------+-------------+ -| identifier | params | provider_id | type | -+==============+==========+================+=============+ -| llama_guard | {} | meta-reference | llama_guard | -+--------------+----------+----------------+-------------+ +Usage: cli shields [OPTIONS] COMMAND [ARGS]... ``` -#### `llama-stack-client shields register` -```bash -$ llama-stack-client shields register --shield-id [--provider-id ] [--provider-shield-id ] [--params ] +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **list**: Show available safety shields on distribution... + +* **register**: Register a new safety shield + + + +### list + +Show available safety shields on distribution endpoint + +### Usage + +``` +Usage: cli shields list [OPTIONS] ``` -Options: -- `--shield-id`: Required. ID of the shield -- `--provider-id`: Optional. Provider ID for the shield -- `--provider-shield-id`: Optional. Provider's shield ID -- `--params`: Optional. JSON configuration parameters for the shield +### Options + +* **-h, --help**: Show this message and exit. [default: False] + + + +### register + +Register a new safety shield + +### Usage -#### `llama-stack-client eval_tasks list` -```bash -$ llama-stack-client eval_tasks list ``` +Usage: cli shields register [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--shield-id**: Id of the shield + +* **--provider-id**: Provider ID for the shield + +* **--provider-shield-id**: Provider's shield ID + +* **--params**: JSON configuration parameters for the shield + + -#### `llama-stack-client eval_tasks register` -```bash -$ llama-stack-client eval_tasks register --eval-task-id --dataset-id --scoring-functions [ ...] [--provider-id ] [--provider-eval-task-id ] [--metadata ] +## toolgroups + +Manage available tool groups. + +### Usage + +``` +Usage: cli toolgroups [OPTIONS] COMMAND [ARGS]... ``` -Options: -- `--eval-task-id`: Required. ID of the eval task -- `--dataset-id`: Required. ID of the dataset to evaluate -- `--scoring-functions`: Required. One or more scoring functions to use for evaluation -- `--provider-id`: Optional. Provider ID for the eval task -- `--provider-eval-task-id`: Optional. Provider's eval task ID -- `--metadata`: Optional. Metadata for the eval task in JSON format +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **get**: Show available llama toolgroups at... + +* **list**: Show available llama toolgroups at... -#### `llama-stack-client eval run-benchmark` -```bash -$ llama-stack-client eval run-benchmark [ ...] --eval-task-config --output-dir [--num-examples ] [--visualize] +* **register**: Register a new toolgroup at distribution... + +* **unregister**: Unregister a toolgroup from distribution... + + + +### get + +Show available llama toolgroups at distribution endpoint + +### Usage + +``` +Usage: cli toolgroups get [OPTIONS] TOOLGROUP_ID ``` -Options: -- `--eval-task-config`: Required. Path to the eval task config file in JSON format -- `--output-dir`: Required. Path to the directory where evaluation results will be saved -- `--num-examples`: Optional. Number of examples to evaluate (useful for debugging) -- `--visualize`: Optional flag. If set, visualizes evaluation results after completion +### Options + +* **-h, --help**: Show this message and exit. 
[default: False] + +### Arguments + +* **TOOLGROUP_ID** + + + +### list + +Show available llama toolgroups at distribution endpoint + +### Usage -Example eval_benchmark_config.json: -```json -{ - "type": "benchmark", - "eval_candidate": { - "type": "model", - "model": "Llama3.1-405B-Instruct", - "sampling_params": { - "strategy": "greedy", - "temperature": 0, - "top_p": 0.95, - "top_k": 0, - "max_tokens": 0, - "repetition_penalty": 1.0 - } - } -} ``` +Usage: cli toolgroups list [OPTIONS] +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + + + +### register + +Register a new toolgroup at distribution endpoint -#### `llama-stack-client eval run-scoring` -```bash -$ llama-stack-client eval run-scoring --eval-task-config --output-dir [--num-examples ] [--visualize] +### Usage + +``` +Usage: cli toolgroups register [OPTIONS] TOOLGROUP_ID ``` -Options: -- `--eval-task-config`: Required. Path to the eval task config file in JSON format -- `--output-dir`: Required. Path to the directory where scoring results will be saved -- `--num-examples`: Optional. Number of examples to evaluate (useful for debugging) -- `--visualize`: Optional flag. If set, visualizes scoring results after completion +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +* **--provider-id**: Provider ID for the toolgroup + +* **--provider-toolgroup-id**: Provider's toolgroup ID + +* **--mcp-config**: JSON mcp_config for the toolgroup + +* **--args**: JSON args for the toolgroup + +### Arguments + +* **TOOLGROUP_ID** + + + +### unregister + +Unregister a toolgroup from distribution endpoint + +### Usage -#### `llama-stack-client toolgroups list` -```bash -$ llama-stack-client toolgroups list ``` +Usage: cli toolgroups unregister [OPTIONS] TOOLGROUP_ID ``` -+---------------------------+------------------+------+---------------+ -| identifier | provider_id | args | mcp_endpoint | -+===========================+==================+======+===============+ -| builtin::code_interpreter | code-interpreter | None | None | -+---------------------------+------------------+------+---------------+ -| builtin::rag | rag-runtime | None | None | -+---------------------------+------------------+------+---------------+ -| builtin::websearch | tavily-search | None | None | -+---------------------------+------------------+------+---------------+ + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Arguments + +* **TOOLGROUP_ID** + + + +## vector-dbs + +Manage vector databases. + +### Usage + ``` +Usage: cli vector-dbs [OPTIONS] COMMAND [ARGS]... +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Commands + +* **list**: Show available vector dbs on distribution... + +* **register**: Create a new vector db + +* **unregister**: Delete a vector db -#### `llama-stack-client toolgroups get` -```bash -$ llama-stack-client toolgroups get + + +### list + +Show available vector dbs on distribution endpoint + +### Usage + +``` +Usage: cli vector-dbs list [OPTIONS] ``` -Shows detailed information about a specific toolgroup. If the toolgroup is not found, displays an error message. +### Options + +* **-h, --help**: Show this message and exit. 
[default: False] + + + +### register + +Create a new vector db + +### Usage -#### `llama-stack-client toolgroups register` -```bash -$ llama-stack-client toolgroups register [--provider-id ] [--provider-toolgroup-id ] [--mcp-config ] [--args ] ``` +Usage: cli vector-dbs register [OPTIONS] VECTOR_DB_ID +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] -Options: -- `--provider-id`: Optional. Provider ID for the toolgroup -- `--provider-toolgroup-id`: Optional. Provider's toolgroup ID -- `--mcp-config`: Optional. JSON configuration for the MCP endpoint -- `--args`: Optional. JSON arguments for the toolgroup +* **--provider-id**: Provider ID for the vector db + +* **--provider-vector-db-id**: Provider's vector db ID + +* **--embedding-model**: Embedding model (for vector type) [default: all-MiniLM-L6-v2] + +* **--embedding-dimension**: Embedding dimension (for vector type) [default: 384] + +### Arguments + +* **VECTOR_DB_ID** + + + +### unregister + +Delete a vector db + +### Usage -#### `llama-stack-client toolgroups unregister` -```bash -$ llama-stack-client toolgroups unregister ``` +Usage: cli vector-dbs unregister [OPTIONS] VECTOR_DB_ID +``` + +### Options + +* **-h, --help**: Show this message and exit. [default: False] + +### Arguments + +* **VECTOR_DB_ID** diff --git a/scripts/gen_cli_doc.py b/scripts/gen_cli_doc.py new file mode 100644 index 00000000..346c1b39 --- /dev/null +++ b/scripts/gen_cli_doc.py @@ -0,0 +1,80 @@ +# Copyright (c) Meta Platforms, Inc. and affiliates. +# All rights reserved. +# +# This source code is licensed under the terms described in the LICENSE file in +# the root directory of this source tree. + +import os +from pathlib import Path + +import click +from llama_stack_client.lib.cli.llama_stack_client import cli + + +def generate_markdown_docs(command, parent=None, level=1): + """Generate markdown documentation for a click command.""" + ctx = click.Context(command, info_name=command.name, parent=parent) + + # Start with the command name as a header + prefix = "#" * level + doc = [f"{prefix} {command.name or 'CLI Reference'}\n"] + + # Add command help docstring + if command.help: + doc.append(f"{command.help}\n") + + # Add usage + doc.append("### Usage\n") + doc.append(f"```\n{command.get_usage(ctx)}\n```\n") + + # Add options if present + has_options = False + for param in command.get_params(ctx): + if isinstance(param, click.Option): + if not has_options: + doc.append("### Options\n") + has_options = True + opts = ", ".join(param.opts) + help_text = param.help or "" + default = f" [default: {param.default}]" if param.default is not None else "" + doc.append(f"* **{opts}**: {help_text}{default}\n") + + # Add arguments if present + has_arguments = False + for param in command.get_params(ctx): + if isinstance(param, click.Argument): + if not has_arguments: + doc.append("### Arguments\n") + has_arguments = True + name = param.name.upper() + doc.append(f"* **{name}**\n") + + # If this is a group with commands, add subcommands + if isinstance(command, click.Group): + doc.append("### Commands\n") + for cmd_name in command.list_commands(ctx): + cmd = command.get_command(ctx, cmd_name) + cmd_help = cmd.get_short_help_str() if cmd else "" + doc.append(f"* **{cmd_name}**: {cmd_help}\n") + + # Add detailed subcommand documentation + for cmd_name in command.list_commands(ctx): + cmd = command.get_command(ctx, cmd_name) + if cmd: + doc.append("\n") + doc.extend(generate_markdown_docs(cmd, ctx, level + 1)) + + return doc + + +if __name__ 
== "__main__": + # Generate the docs + markdown_lines = generate_markdown_docs(cli) + markdown = "\n".join(markdown_lines) + + # Write to file + file_path = Path(__file__).parent.parent / "docs" / "cli_reference.md" + with open(file_path, "w") as f: + f.write(markdown) + + print(f"Documentation generated in {file_path}") From fe97992b050a036683d2cecfd768afb91e5c9a2b Mon Sep 17 00:00:00 2001 From: Xi Yan Date: Mon, 10 Mar 2025 11:29:56 -0700 Subject: [PATCH 02/10] update --- docs/cli_reference.md | 86 +++++++++++++++++++++--------------------- scripts/gen_cli_doc.py | 2 +- 2 files changed, 44 insertions(+), 44 deletions(-) diff --git a/docs/cli_reference.md b/docs/cli_reference.md index 18b44d13..ef56afca 100644 --- a/docs/cli_reference.md +++ b/docs/cli_reference.md @@ -1,4 +1,4 @@ -# cli +# 'CLI Reference' Welcome to the LlamaStackClient CLI @@ -50,7 +50,7 @@ Usage: cli [OPTIONS] COMMAND [ARGS]... -## configure +## 'CLI Reference' Configure Llama Stack Client CLI. @@ -70,7 +70,7 @@ Usage: cli configure [OPTIONS] -## datasets +## 'CLI Reference' Manage datasets. @@ -92,7 +92,7 @@ Usage: cli datasets [OPTIONS] COMMAND [ARGS]... -### list +### 'CLI Reference' Show available datasets on distribution endpoint @@ -108,7 +108,7 @@ Usage: cli datasets list [OPTIONS] -### register +### 'CLI Reference' Create a new dataset @@ -138,7 +138,7 @@ Usage: cli datasets register [OPTIONS] -## eval +## 'CLI Reference' Run evaluation tasks. @@ -160,7 +160,7 @@ Usage: cli eval [OPTIONS] COMMAND [ARGS]... -### run-benchmark +### 'CLI Reference' Run a evaluation benchmark task @@ -196,7 +196,7 @@ Usage: cli eval run-benchmark [OPTIONS] BENCHMARK_IDS... -### run-scoring +### 'CLI Reference' Run scoring from application datasets @@ -228,7 +228,7 @@ Usage: cli eval run-scoring [OPTIONS] SCORING_FUNCTION_IDS... -## eval-tasks +## 'CLI Reference' Manage evaluation tasks. @@ -250,7 +250,7 @@ Usage: cli eval-tasks [OPTIONS] COMMAND [ARGS]... -### list +### 'CLI Reference' Show available eval tasks on distribution endpoint @@ -266,7 +266,7 @@ Usage: cli eval-tasks list [OPTIONS] -### register +### 'CLI Reference' Register a new eval task @@ -294,7 +294,7 @@ Usage: cli eval-tasks register [OPTIONS] -## inference +## 'CLI Reference' Inference (chat). @@ -314,7 +314,7 @@ Usage: cli inference [OPTIONS] COMMAND [ARGS]... -### chat-completion +### 'CLI Reference' Show available inference chat completion endpoints on distribution endpoint @@ -338,7 +338,7 @@ Usage: cli inference chat-completion [OPTIONS] -## inspect +## 'CLI Reference' Inspect server configuration. @@ -358,7 +358,7 @@ Usage: cli inspect [OPTIONS] COMMAND [ARGS]... -### version +### 'CLI Reference' Show available providers on distribution endpoint @@ -374,7 +374,7 @@ Usage: cli inspect version [OPTIONS] -## models +## 'CLI Reference' Manage GenAI models. @@ -400,7 +400,7 @@ Usage: cli models [OPTIONS] COMMAND [ARGS]... -### get +### 'CLI Reference' Show available llama models at distribution endpoint @@ -420,7 +420,7 @@ Usage: cli models get [OPTIONS] MODEL_ID -### list +### 'CLI Reference' Show available llama models at distribution endpoint @@ -436,7 +436,7 @@ Usage: cli models list [OPTIONS] -### register +### 'CLI Reference' Register a new model at distribution endpoint @@ -462,7 +462,7 @@ Usage: cli models register [OPTIONS] MODEL_ID -### unregister +### 'CLI Reference' Unregister a model from distribution endpoint @@ -482,7 +482,7 @@ Usage: cli models unregister [OPTIONS] MODEL_ID -## post-training +## 'CLI Reference' Post-training. 
@@ -510,7 +510,7 @@ Usage: cli post-training [OPTIONS] COMMAND [ARGS]... -### artifacts +### 'CLI Reference' Get the training artifacts of a specific post training job @@ -528,7 +528,7 @@ Usage: cli post-training artifacts [OPTIONS] -### cancel +### 'CLI Reference' Cancel the training job @@ -546,7 +546,7 @@ Usage: cli post-training cancel [OPTIONS] -### list +### 'CLI Reference' Show the list of available post training jobs @@ -562,7 +562,7 @@ Usage: cli post-training list [OPTIONS] -### status +### 'CLI Reference' Show the status of a specific post training job @@ -580,7 +580,7 @@ Usage: cli post-training status [OPTIONS] -### supervised_fine_tune +### 'CLI Reference' Kick off a supervised fine tune job @@ -606,7 +606,7 @@ Usage: cli post-training supervised_fine_tune [OPTIONS] -## providers +## 'CLI Reference' Manage API providers. @@ -626,7 +626,7 @@ Usage: cli providers [OPTIONS] COMMAND [ARGS]... -### list +### 'CLI Reference' Show available providers on distribution endpoint @@ -642,7 +642,7 @@ Usage: cli providers list [OPTIONS] -## scoring-functions +## 'CLI Reference' Manage scoring functions. @@ -664,7 +664,7 @@ Usage: cli scoring-functions [OPTIONS] COMMAND [ARGS]... -### list +### 'CLI Reference' Show available scoring functions on distribution endpoint @@ -680,7 +680,7 @@ Usage: cli scoring-functions list [OPTIONS] -### register +### 'CLI Reference' Register a new scoring function @@ -708,7 +708,7 @@ Usage: cli scoring-functions register [OPTIONS] -## shields +## 'CLI Reference' Manage safety shield services. @@ -730,7 +730,7 @@ Usage: cli shields [OPTIONS] COMMAND [ARGS]... -### list +### 'CLI Reference' Show available safety shields on distribution endpoint @@ -746,7 +746,7 @@ Usage: cli shields list [OPTIONS] -### register +### 'CLI Reference' Register a new safety shield @@ -770,7 +770,7 @@ Usage: cli shields register [OPTIONS] -## toolgroups +## 'CLI Reference' Manage available tool groups. @@ -796,7 +796,7 @@ Usage: cli toolgroups [OPTIONS] COMMAND [ARGS]... -### get +### 'CLI Reference' Show available llama toolgroups at distribution endpoint @@ -816,7 +816,7 @@ Usage: cli toolgroups get [OPTIONS] TOOLGROUP_ID -### list +### 'CLI Reference' Show available llama toolgroups at distribution endpoint @@ -832,7 +832,7 @@ Usage: cli toolgroups list [OPTIONS] -### register +### 'CLI Reference' Register a new toolgroup at distribution endpoint @@ -860,7 +860,7 @@ Usage: cli toolgroups register [OPTIONS] TOOLGROUP_ID -### unregister +### 'CLI Reference' Unregister a toolgroup from distribution endpoint @@ -880,7 +880,7 @@ Usage: cli toolgroups unregister [OPTIONS] TOOLGROUP_ID -## vector-dbs +## 'CLI Reference' Manage vector databases. @@ -904,7 +904,7 @@ Usage: cli vector-dbs [OPTIONS] COMMAND [ARGS]... 
-### list +### 'CLI Reference' Show available vector dbs on distribution endpoint @@ -920,7 +920,7 @@ Usage: cli vector-dbs list [OPTIONS] -### register +### 'CLI Reference' Create a new vector db @@ -948,7 +948,7 @@ Usage: cli vector-dbs register [OPTIONS] VECTOR_DB_ID -### unregister +### 'CLI Reference' Delete a vector db diff --git a/scripts/gen_cli_doc.py b/scripts/gen_cli_doc.py index 346c1b39..af4fac02 100644 --- a/scripts/gen_cli_doc.py +++ b/scripts/gen_cli_doc.py @@ -17,7 +17,7 @@ def generate_markdown_docs(command, parent=None, level=1): # Start with the command name as a header prefix = "#" * level - doc = [f"{prefix} {command.name or 'CLI Reference'}\n"] + doc = [f"{prefix} 'CLI Reference'\n"] # Add command help docstring if command.help: From ef9b46826b24d2a056420f3296f876992cd4bc6a Mon Sep 17 00:00:00 2001 From: Xi Yan Date: Mon, 10 Mar 2025 11:33:36 -0700 Subject: [PATCH 03/10] update name --- docs/cli_reference.md | 87 ++++++++++--------- scripts/gen_cli_doc.py | 5 +- .../lib/cli/llama_stack_client.py | 30 +++---- 3 files changed, 62 insertions(+), 60 deletions(-) diff --git a/docs/cli_reference.md b/docs/cli_reference.md index ef56afca..4b21c657 100644 --- a/docs/cli_reference.md +++ b/docs/cli_reference.md @@ -5,7 +5,7 @@ Welcome to the LlamaStackClient CLI ### Usage ``` -Usage: cli [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client [OPTIONS] COMMAND [ARGS]... ``` ### Options @@ -57,7 +57,7 @@ Configure Llama Stack Client CLI. ### Usage ``` -Usage: cli configure [OPTIONS] +Usage: llama-stack-client configure [OPTIONS] ``` ### Options @@ -77,7 +77,7 @@ Manage datasets. ### Usage ``` -Usage: cli datasets [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client datasets [OPTIONS] COMMAND [ARGS]... ``` ### Options @@ -99,7 +99,7 @@ Show available datasets on distribution endpoint ### Usage ``` -Usage: cli datasets list [OPTIONS] +Usage: llama-stack-client datasets list [OPTIONS] ``` ### Options @@ -115,7 +115,7 @@ Create a new dataset ### Usage ``` -Usage: cli datasets register [OPTIONS] +Usage: llama-stack-client datasets register [OPTIONS] ``` ### Options @@ -145,7 +145,7 @@ Run evaluation tasks. ### Usage ``` -Usage: cli eval [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client eval [OPTIONS] COMMAND [ARGS]... ``` ### Options @@ -167,7 +167,7 @@ Run a evaluation benchmark task ### Usage ``` -Usage: cli eval run-benchmark [OPTIONS] BENCHMARK_IDS... +Usage: llama-stack-client eval run-benchmark [OPTIONS] BENCHMARK_IDS... ``` ### Options @@ -203,7 +203,7 @@ Run scoring from application datasets ### Usage ``` -Usage: cli eval run-scoring [OPTIONS] SCORING_FUNCTION_IDS... +Usage: llama-stack-client eval run-scoring [OPTIONS] SCORING_FUNCTION_IDS... ``` ### Options @@ -235,7 +235,7 @@ Manage evaluation tasks. ### Usage ``` -Usage: cli eval-tasks [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client eval-tasks [OPTIONS] COMMAND [ARGS]... ``` ### Options @@ -257,7 +257,7 @@ Show available eval tasks on distribution endpoint ### Usage ``` -Usage: cli eval-tasks list [OPTIONS] +Usage: llama-stack-client eval-tasks list [OPTIONS] ``` ### Options @@ -273,7 +273,7 @@ Register a new eval task ### Usage ``` -Usage: cli eval-tasks register [OPTIONS] +Usage: llama-stack-client eval-tasks register [OPTIONS] ``` ### Options @@ -301,7 +301,7 @@ Inference (chat). ### Usage ``` -Usage: cli inference [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client inference [OPTIONS] COMMAND [ARGS]... 
``` ### Options @@ -321,7 +321,7 @@ Show available inference chat completion endpoints on distribution endpoint ### Usage ``` -Usage: cli inference chat-completion [OPTIONS] +Usage: llama-stack-client inference chat-completion [OPTIONS] ``` ### Options @@ -345,7 +345,7 @@ Inspect server configuration. ### Usage ``` -Usage: cli inspect [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client inspect [OPTIONS] COMMAND [ARGS]... ``` ### Options @@ -365,7 +365,7 @@ Show available providers on distribution endpoint ### Usage ``` -Usage: cli inspect version [OPTIONS] +Usage: llama-stack-client inspect version [OPTIONS] ``` ### Options @@ -381,7 +381,7 @@ Manage GenAI models. ### Usage ``` -Usage: cli models [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client models [OPTIONS] COMMAND [ARGS]... ``` ### Options @@ -407,7 +407,7 @@ Show available llama models at distribution endpoint ### Usage ``` -Usage: cli models get [OPTIONS] MODEL_ID +Usage: llama-stack-client models get [OPTIONS] MODEL_ID ``` ### Options @@ -427,7 +427,7 @@ Show available llama models at distribution endpoint ### Usage ``` -Usage: cli models list [OPTIONS] +Usage: llama-stack-client models list [OPTIONS] ``` ### Options @@ -443,7 +443,7 @@ Register a new model at distribution endpoint ### Usage ``` -Usage: cli models register [OPTIONS] MODEL_ID +Usage: llama-stack-client models register [OPTIONS] MODEL_ID ``` ### Options @@ -469,7 +469,7 @@ Unregister a model from distribution endpoint ### Usage ``` -Usage: cli models unregister [OPTIONS] MODEL_ID +Usage: llama-stack-client models unregister [OPTIONS] MODEL_ID ``` ### Options @@ -489,7 +489,7 @@ Post-training. ### Usage ``` -Usage: cli post-training [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client post-training [OPTIONS] COMMAND [ARGS]... ``` ### Options @@ -517,7 +517,7 @@ Get the training artifacts of a specific post training job ### Usage ``` -Usage: cli post-training artifacts [OPTIONS] +Usage: llama-stack-client post-training artifacts [OPTIONS] ``` ### Options @@ -535,7 +535,7 @@ Cancel the training job ### Usage ``` -Usage: cli post-training cancel [OPTIONS] +Usage: llama-stack-client post-training cancel [OPTIONS] ``` ### Options @@ -553,7 +553,7 @@ Show the list of available post training jobs ### Usage ``` -Usage: cli post-training list [OPTIONS] +Usage: llama-stack-client post-training list [OPTIONS] ``` ### Options @@ -569,7 +569,7 @@ Show the status of a specific post training job ### Usage ``` -Usage: cli post-training status [OPTIONS] +Usage: llama-stack-client post-training status [OPTIONS] ``` ### Options @@ -587,7 +587,8 @@ Kick off a supervised fine tune job ### Usage ``` -Usage: cli post-training supervised_fine_tune [OPTIONS] +Usage: llama-stack-client post-training supervised_fine_tune + [OPTIONS] ``` ### Options @@ -613,7 +614,7 @@ Manage API providers. ### Usage ``` -Usage: cli providers [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client providers [OPTIONS] COMMAND [ARGS]... ``` ### Options @@ -633,7 +634,7 @@ Show available providers on distribution endpoint ### Usage ``` -Usage: cli providers list [OPTIONS] +Usage: llama-stack-client providers list [OPTIONS] ``` ### Options @@ -649,7 +650,7 @@ Manage scoring functions. ### Usage ``` -Usage: cli scoring-functions [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client scoring-functions [OPTIONS] COMMAND [ARGS]... 
``` ### Options @@ -671,7 +672,7 @@ Show available scoring functions on distribution endpoint ### Usage ``` -Usage: cli scoring-functions list [OPTIONS] +Usage: llama-stack-client scoring-functions list [OPTIONS] ``` ### Options @@ -687,7 +688,7 @@ Register a new scoring function ### Usage ``` -Usage: cli scoring-functions register [OPTIONS] +Usage: llama-stack-client scoring-functions register [OPTIONS] ``` ### Options @@ -715,7 +716,7 @@ Manage safety shield services. ### Usage ``` -Usage: cli shields [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client shields [OPTIONS] COMMAND [ARGS]... ``` ### Options @@ -737,7 +738,7 @@ Show available safety shields on distribution endpoint ### Usage ``` -Usage: cli shields list [OPTIONS] +Usage: llama-stack-client shields list [OPTIONS] ``` ### Options @@ -753,7 +754,7 @@ Register a new safety shield ### Usage ``` -Usage: cli shields register [OPTIONS] +Usage: llama-stack-client shields register [OPTIONS] ``` ### Options @@ -777,7 +778,7 @@ Manage available tool groups. ### Usage ``` -Usage: cli toolgroups [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client toolgroups [OPTIONS] COMMAND [ARGS]... ``` ### Options @@ -803,7 +804,7 @@ Show available llama toolgroups at distribution endpoint ### Usage ``` -Usage: cli toolgroups get [OPTIONS] TOOLGROUP_ID +Usage: llama-stack-client toolgroups get [OPTIONS] TOOLGROUP_ID ``` ### Options @@ -823,7 +824,7 @@ Show available llama toolgroups at distribution endpoint ### Usage ``` -Usage: cli toolgroups list [OPTIONS] +Usage: llama-stack-client toolgroups list [OPTIONS] ``` ### Options @@ -839,7 +840,7 @@ Register a new toolgroup at distribution endpoint ### Usage ``` -Usage: cli toolgroups register [OPTIONS] TOOLGROUP_ID +Usage: llama-stack-client toolgroups register [OPTIONS] TOOLGROUP_ID ``` ### Options @@ -867,7 +868,7 @@ Unregister a toolgroup from distribution endpoint ### Usage ``` -Usage: cli toolgroups unregister [OPTIONS] TOOLGROUP_ID +Usage: llama-stack-client toolgroups unregister [OPTIONS] TOOLGROUP_ID ``` ### Options @@ -887,7 +888,7 @@ Manage vector databases. ### Usage ``` -Usage: cli vector-dbs [OPTIONS] COMMAND [ARGS]... +Usage: llama-stack-client vector-dbs [OPTIONS] COMMAND [ARGS]... 
``` ### Options @@ -911,7 +912,7 @@ Show available vector dbs on distribution endpoint ### Usage ``` -Usage: cli vector-dbs list [OPTIONS] +Usage: llama-stack-client vector-dbs list [OPTIONS] ``` ### Options @@ -927,7 +928,7 @@ Create a new vector db ### Usage ``` -Usage: cli vector-dbs register [OPTIONS] VECTOR_DB_ID +Usage: llama-stack-client vector-dbs register [OPTIONS] VECTOR_DB_ID ``` ### Options @@ -955,7 +956,7 @@ Delete a vector db ### Usage ``` -Usage: cli vector-dbs unregister [OPTIONS] VECTOR_DB_ID +Usage: llama-stack-client vector-dbs unregister [OPTIONS] VECTOR_DB_ID ``` ### Options diff --git a/scripts/gen_cli_doc.py b/scripts/gen_cli_doc.py index af4fac02..73ca5f35 100644 --- a/scripts/gen_cli_doc.py +++ b/scripts/gen_cli_doc.py @@ -8,11 +8,12 @@ from pathlib import Path import click -from llama_stack_client.lib.cli.llama_stack_client import cli +from llama_stack_client.lib.cli.llama_stack_client import llama_stack_client def generate_markdown_docs(command, parent=None, level=1): """Generate markdown documentation for a click command.""" + print("command.name", command.name) ctx = click.Context(command, info_name=command.name, parent=parent) # Start with the command name as a header @@ -69,7 +70,7 @@ def generate_markdown_docs(command, parent=None, level=1): if __name__ == "__main__": # Generate the docs - markdown_lines = generate_markdown_docs(cli) + markdown_lines = generate_markdown_docs(llama_stack_client) markdown = "\n".join(markdown_lines) # Write to file diff --git a/src/llama_stack_client/lib/cli/llama_stack_client.py b/src/llama_stack_client/lib/cli/llama_stack_client.py index d2b86528..3631094b 100644 --- a/src/llama_stack_client/lib/cli/llama_stack_client.py +++ b/src/llama_stack_client/lib/cli/llama_stack_client.py @@ -35,7 +35,7 @@ @click.option("--api-key", type=str, help="Llama Stack distribution API key", default="") @click.option("--config", type=str, help="Path to config file", default=None) @click.pass_context -def cli(ctx, endpoint: str, api_key: str, config: str | None): +def llama_stack_client(ctx, endpoint: str, api_key: str, config: str | None): """Welcome to the LlamaStackClient CLI""" ctx.ensure_object(dict) @@ -80,23 +80,23 @@ def cli(ctx, endpoint: str, api_key: str, config: str | None): # Register all subcommands -cli.add_command(models, "models") -cli.add_command(vector_dbs, "vector_dbs") -cli.add_command(shields, "shields") -cli.add_command(eval_tasks, "eval_tasks") -cli.add_command(providers, "providers") -cli.add_command(datasets, "datasets") -cli.add_command(configure, "configure") -cli.add_command(scoring_functions, "scoring_functions") -cli.add_command(eval, "eval") -cli.add_command(inference, "inference") -cli.add_command(post_training, "post_training") -cli.add_command(inspect, "inspect") -cli.add_command(toolgroups, "toolgroups") +llama_stack_client.add_command(models, "models") +llama_stack_client.add_command(vector_dbs, "vector_dbs") +llama_stack_client.add_command(shields, "shields") +llama_stack_client.add_command(eval_tasks, "eval_tasks") +llama_stack_client.add_command(providers, "providers") +llama_stack_client.add_command(datasets, "datasets") +llama_stack_client.add_command(configure, "configure") +llama_stack_client.add_command(scoring_functions, "scoring_functions") +llama_stack_client.add_command(eval, "eval") +llama_stack_client.add_command(inference, "inference") +llama_stack_client.add_command(post_training, "post_training") +llama_stack_client.add_command(inspect, "inspect") +llama_stack_client.add_command(toolgroups, 
"toolgroups") def main(): - cli() + llama_stack_client() if __name__ == "__main__": From 2b1bcbc7cac7f7e89134648dd0a0136f463757bd Mon Sep 17 00:00:00 2001 From: Xi Yan Date: Mon, 10 Mar 2025 11:35:06 -0700 Subject: [PATCH 04/10] update --- scripts/gen_cli_doc.py | 1 - 1 file changed, 1 deletion(-) diff --git a/scripts/gen_cli_doc.py b/scripts/gen_cli_doc.py index 73ca5f35..42e4cb60 100644 --- a/scripts/gen_cli_doc.py +++ b/scripts/gen_cli_doc.py @@ -13,7 +13,6 @@ def generate_markdown_docs(command, parent=None, level=1): """Generate markdown documentation for a click command.""" - print("command.name", command.name) ctx = click.Context(command, info_name=command.name, parent=parent) # Start with the command name as a header From d52ce1eadcac30558ecb3df86ea070f369db7c73 Mon Sep 17 00:00:00 2001 From: Xi Yan Date: Mon, 10 Mar 2025 11:35:26 -0700 Subject: [PATCH 05/10] update --- docs/cli_reference.md | 86 +++++++++++++++++++++--------------------- scripts/gen_cli_doc.py | 2 +- 2 files changed, 44 insertions(+), 44 deletions(-) diff --git a/docs/cli_reference.md b/docs/cli_reference.md index 4b21c657..25130c43 100644 --- a/docs/cli_reference.md +++ b/docs/cli_reference.md @@ -1,4 +1,4 @@ -# 'CLI Reference' +# CLI Reference Welcome to the LlamaStackClient CLI @@ -50,7 +50,7 @@ Usage: llama-stack-client [OPTIONS] COMMAND [ARGS]... -## 'CLI Reference' +## CLI Reference Configure Llama Stack Client CLI. @@ -70,7 +70,7 @@ Usage: llama-stack-client configure [OPTIONS] -## 'CLI Reference' +## CLI Reference Manage datasets. @@ -92,7 +92,7 @@ Usage: llama-stack-client datasets [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Show available datasets on distribution endpoint @@ -108,7 +108,7 @@ Usage: llama-stack-client datasets list [OPTIONS] -### 'CLI Reference' +### CLI Reference Create a new dataset @@ -138,7 +138,7 @@ Usage: llama-stack-client datasets register [OPTIONS] -## 'CLI Reference' +## CLI Reference Run evaluation tasks. @@ -160,7 +160,7 @@ Usage: llama-stack-client eval [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Run a evaluation benchmark task @@ -196,7 +196,7 @@ Usage: llama-stack-client eval run-benchmark [OPTIONS] BENCHMARK_IDS... -### 'CLI Reference' +### CLI Reference Run scoring from application datasets @@ -228,7 +228,7 @@ Usage: llama-stack-client eval run-scoring [OPTIONS] SCORING_FUNCTION_IDS... -## 'CLI Reference' +## CLI Reference Manage evaluation tasks. @@ -250,7 +250,7 @@ Usage: llama-stack-client eval-tasks [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Show available eval tasks on distribution endpoint @@ -266,7 +266,7 @@ Usage: llama-stack-client eval-tasks list [OPTIONS] -### 'CLI Reference' +### CLI Reference Register a new eval task @@ -294,7 +294,7 @@ Usage: llama-stack-client eval-tasks register [OPTIONS] -## 'CLI Reference' +## CLI Reference Inference (chat). @@ -314,7 +314,7 @@ Usage: llama-stack-client inference [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Show available inference chat completion endpoints on distribution endpoint @@ -338,7 +338,7 @@ Usage: llama-stack-client inference chat-completion [OPTIONS] -## 'CLI Reference' +## CLI Reference Inspect server configuration. @@ -358,7 +358,7 @@ Usage: llama-stack-client inspect [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Show available providers on distribution endpoint @@ -374,7 +374,7 @@ Usage: llama-stack-client inspect version [OPTIONS] -## 'CLI Reference' +## CLI Reference Manage GenAI models. 
@@ -400,7 +400,7 @@ Usage: llama-stack-client models [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Show available llama models at distribution endpoint @@ -420,7 +420,7 @@ Usage: llama-stack-client models get [OPTIONS] MODEL_ID -### 'CLI Reference' +### CLI Reference Show available llama models at distribution endpoint @@ -436,7 +436,7 @@ Usage: llama-stack-client models list [OPTIONS] -### 'CLI Reference' +### CLI Reference Register a new model at distribution endpoint @@ -462,7 +462,7 @@ Usage: llama-stack-client models register [OPTIONS] MODEL_ID -### 'CLI Reference' +### CLI Reference Unregister a model from distribution endpoint @@ -482,7 +482,7 @@ Usage: llama-stack-client models unregister [OPTIONS] MODEL_ID -## 'CLI Reference' +## CLI Reference Post-training. @@ -510,7 +510,7 @@ Usage: llama-stack-client post-training [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Get the training artifacts of a specific post training job @@ -528,7 +528,7 @@ Usage: llama-stack-client post-training artifacts [OPTIONS] -### 'CLI Reference' +### CLI Reference Cancel the training job @@ -546,7 +546,7 @@ Usage: llama-stack-client post-training cancel [OPTIONS] -### 'CLI Reference' +### CLI Reference Show the list of available post training jobs @@ -562,7 +562,7 @@ Usage: llama-stack-client post-training list [OPTIONS] -### 'CLI Reference' +### CLI Reference Show the status of a specific post training job @@ -580,7 +580,7 @@ Usage: llama-stack-client post-training status [OPTIONS] -### 'CLI Reference' +### CLI Reference Kick off a supervised fine tune job @@ -607,7 +607,7 @@ Usage: llama-stack-client post-training supervised_fine_tune -## 'CLI Reference' +## CLI Reference Manage API providers. @@ -627,7 +627,7 @@ Usage: llama-stack-client providers [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Show available providers on distribution endpoint @@ -643,7 +643,7 @@ Usage: llama-stack-client providers list [OPTIONS] -## 'CLI Reference' +## CLI Reference Manage scoring functions. @@ -665,7 +665,7 @@ Usage: llama-stack-client scoring-functions [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Show available scoring functions on distribution endpoint @@ -681,7 +681,7 @@ Usage: llama-stack-client scoring-functions list [OPTIONS] -### 'CLI Reference' +### CLI Reference Register a new scoring function @@ -709,7 +709,7 @@ Usage: llama-stack-client scoring-functions register [OPTIONS] -## 'CLI Reference' +## CLI Reference Manage safety shield services. @@ -731,7 +731,7 @@ Usage: llama-stack-client shields [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Show available safety shields on distribution endpoint @@ -747,7 +747,7 @@ Usage: llama-stack-client shields list [OPTIONS] -### 'CLI Reference' +### CLI Reference Register a new safety shield @@ -771,7 +771,7 @@ Usage: llama-stack-client shields register [OPTIONS] -## 'CLI Reference' +## CLI Reference Manage available tool groups. @@ -797,7 +797,7 @@ Usage: llama-stack-client toolgroups [OPTIONS] COMMAND [ARGS]... 
-### 'CLI Reference' +### CLI Reference Show available llama toolgroups at distribution endpoint @@ -817,7 +817,7 @@ Usage: llama-stack-client toolgroups get [OPTIONS] TOOLGROUP_ID -### 'CLI Reference' +### CLI Reference Show available llama toolgroups at distribution endpoint @@ -833,7 +833,7 @@ Usage: llama-stack-client toolgroups list [OPTIONS] -### 'CLI Reference' +### CLI Reference Register a new toolgroup at distribution endpoint @@ -861,7 +861,7 @@ Usage: llama-stack-client toolgroups register [OPTIONS] TOOLGROUP_ID -### 'CLI Reference' +### CLI Reference Unregister a toolgroup from distribution endpoint @@ -881,7 +881,7 @@ Usage: llama-stack-client toolgroups unregister [OPTIONS] TOOLGROUP_ID -## 'CLI Reference' +## CLI Reference Manage vector databases. @@ -905,7 +905,7 @@ Usage: llama-stack-client vector-dbs [OPTIONS] COMMAND [ARGS]... -### 'CLI Reference' +### CLI Reference Show available vector dbs on distribution endpoint @@ -921,7 +921,7 @@ Usage: llama-stack-client vector-dbs list [OPTIONS] -### 'CLI Reference' +### CLI Reference Create a new vector db @@ -949,7 +949,7 @@ Usage: llama-stack-client vector-dbs register [OPTIONS] VECTOR_DB_ID -### 'CLI Reference' +### CLI Reference Delete a vector db diff --git a/scripts/gen_cli_doc.py b/scripts/gen_cli_doc.py index 42e4cb60..c2862434 100644 --- a/scripts/gen_cli_doc.py +++ b/scripts/gen_cli_doc.py @@ -17,7 +17,7 @@ def generate_markdown_docs(command, parent=None, level=1): # Start with the command name as a header prefix = "#" * level - doc = [f"{prefix} 'CLI Reference'\n"] + doc = [f"{prefix} CLI Reference\n"] # Add command help docstring if command.help: From 0110e4eb207a7f79fac962eb3d27f3f41e4d4e45 Mon Sep 17 00:00:00 2001 From: Xi Yan Date: Mon, 10 Mar 2025 11:40:26 -0700 Subject: [PATCH 06/10] improve --- docs/cli_reference.md | 2 +- src/llama_stack_client/lib/cli/llama_stack_client.py | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/cli_reference.md b/docs/cli_reference.md index 25130c43..1d90f8b5 100644 --- a/docs/cli_reference.md +++ b/docs/cli_reference.md @@ -1,6 +1,6 @@ # CLI Reference -Welcome to the LlamaStackClient CLI +Welcome to the llama-stack-client CLI - a command-line interface for interacting with Llama Stack ### Usage diff --git a/src/llama_stack_client/lib/cli/llama_stack_client.py b/src/llama_stack_client/lib/cli/llama_stack_client.py index 3631094b..54c46aaa 100644 --- a/src/llama_stack_client/lib/cli/llama_stack_client.py +++ b/src/llama_stack_client/lib/cli/llama_stack_client.py @@ -36,7 +36,7 @@ @click.option("--config", type=str, help="Path to config file", default=None) @click.pass_context def llama_stack_client(ctx, endpoint: str, api_key: str, config: str | None): - """Welcome to the LlamaStackClient CLI""" + """Welcome to the llama-stack-client CLI - a command-line interface for interacting with Llama Stack""" ctx.ensure_object(dict) # If no config provided, check default location From ca045d8474a8ef3987e4864e8f91a5e32cb40800 Mon Sep 17 00:00:00 2001 From: Xi Yan Date: Mon, 10 Mar 2025 11:46:01 -0700 Subject: [PATCH 07/10] fix doc --- docs/cli_reference.md | 84 +++++++++++++++++++++--------------------- scripts/gen_cli_doc.py | 5 ++- 2 files changed, 46 insertions(+), 43 deletions(-) diff --git a/docs/cli_reference.md b/docs/cli_reference.md index 1d90f8b5..b1251740 100644 --- a/docs/cli_reference.md +++ b/docs/cli_reference.md @@ -50,7 +50,7 @@ Usage: llama-stack-client [OPTIONS] COMMAND [ARGS]... -## CLI Reference +## configure Configure Llama Stack Client CLI. 
@@ -70,7 +70,7 @@ Usage: llama-stack-client configure [OPTIONS] -## CLI Reference +## datasets Manage datasets. @@ -92,7 +92,7 @@ Usage: llama-stack-client datasets [OPTIONS] COMMAND [ARGS]... -### CLI Reference +### list Show available datasets on distribution endpoint @@ -108,7 +108,7 @@ Usage: llama-stack-client datasets list [OPTIONS] -### CLI Reference +### register Create a new dataset @@ -138,7 +138,7 @@ Usage: llama-stack-client datasets register [OPTIONS] -## CLI Reference +## eval Run evaluation tasks. @@ -160,7 +160,7 @@ Usage: llama-stack-client eval [OPTIONS] COMMAND [ARGS]... -### CLI Reference +### run-benchmark Run a evaluation benchmark task @@ -196,7 +196,7 @@ Usage: llama-stack-client eval run-benchmark [OPTIONS] BENCHMARK_IDS... -### CLI Reference +### run-scoring Run scoring from application datasets @@ -228,7 +228,7 @@ Usage: llama-stack-client eval run-scoring [OPTIONS] SCORING_FUNCTION_IDS... -## CLI Reference +## eval-tasks Manage evaluation tasks. @@ -250,7 +250,7 @@ Usage: llama-stack-client eval-tasks [OPTIONS] COMMAND [ARGS]... -### CLI Reference +### list Show available eval tasks on distribution endpoint @@ -266,7 +266,7 @@ Usage: llama-stack-client eval-tasks list [OPTIONS] -### CLI Reference +### register Register a new eval task @@ -294,7 +294,7 @@ Usage: llama-stack-client eval-tasks register [OPTIONS] -## CLI Reference +## inference Inference (chat). @@ -314,7 +314,7 @@ Usage: llama-stack-client inference [OPTIONS] COMMAND [ARGS]... -### CLI Reference +### chat-completion Show available inference chat completion endpoints on distribution endpoint @@ -338,7 +338,7 @@ Usage: llama-stack-client inference chat-completion [OPTIONS] -## CLI Reference +## inspect Inspect server configuration. @@ -358,7 +358,7 @@ Usage: llama-stack-client inspect [OPTIONS] COMMAND [ARGS]... -### CLI Reference +### version Show available providers on distribution endpoint @@ -374,7 +374,7 @@ Usage: llama-stack-client inspect version [OPTIONS] -## CLI Reference +## models Manage GenAI models. @@ -400,7 +400,7 @@ Usage: llama-stack-client models [OPTIONS] COMMAND [ARGS]... -### CLI Reference +### get Show available llama models at distribution endpoint @@ -420,7 +420,7 @@ Usage: llama-stack-client models get [OPTIONS] MODEL_ID -### CLI Reference +### list Show available llama models at distribution endpoint @@ -436,7 +436,7 @@ Usage: llama-stack-client models list [OPTIONS] -### CLI Reference +### register Register a new model at distribution endpoint @@ -462,7 +462,7 @@ Usage: llama-stack-client models register [OPTIONS] MODEL_ID -### CLI Reference +### unregister Unregister a model from distribution endpoint @@ -482,7 +482,7 @@ Usage: llama-stack-client models unregister [OPTIONS] MODEL_ID -## CLI Reference +## post-training Post-training. @@ -510,7 +510,7 @@ Usage: llama-stack-client post-training [OPTIONS] COMMAND [ARGS]... 
-### CLI Reference +### artifacts Get the training artifacts of a specific post training job @@ -528,7 +528,7 @@ Usage: llama-stack-client post-training artifacts [OPTIONS] -### CLI Reference +### cancel Cancel the training job @@ -546,7 +546,7 @@ Usage: llama-stack-client post-training cancel [OPTIONS] -### CLI Reference +### list Show the list of available post training jobs @@ -562,7 +562,7 @@ Usage: llama-stack-client post-training list [OPTIONS] -### CLI Reference +### status Show the status of a specific post training job @@ -580,7 +580,7 @@ Usage: llama-stack-client post-training status [OPTIONS] -### CLI Reference +### supervised_fine_tune Kick off a supervised fine tune job @@ -607,7 +607,7 @@ Usage: llama-stack-client post-training supervised_fine_tune -## CLI Reference +## providers Manage API providers. @@ -627,7 +627,7 @@ Usage: llama-stack-client providers [OPTIONS] COMMAND [ARGS]... -### CLI Reference +### list Show available providers on distribution endpoint @@ -643,7 +643,7 @@ Usage: llama-stack-client providers list [OPTIONS] -## CLI Reference +## scoring-functions Manage scoring functions. @@ -665,7 +665,7 @@ Usage: llama-stack-client scoring-functions [OPTIONS] COMMAND [ARGS]... -### CLI Reference +### list Show available scoring functions on distribution endpoint @@ -681,7 +681,7 @@ Usage: llama-stack-client scoring-functions list [OPTIONS] -### CLI Reference +### register Register a new scoring function @@ -709,7 +709,7 @@ Usage: llama-stack-client scoring-functions register [OPTIONS] -## CLI Reference +## shields Manage safety shield services. @@ -731,7 +731,7 @@ Usage: llama-stack-client shields [OPTIONS] COMMAND [ARGS]... -### CLI Reference +### list Show available safety shields on distribution endpoint @@ -747,7 +747,7 @@ Usage: llama-stack-client shields list [OPTIONS] -### CLI Reference +### register Register a new safety shield @@ -771,7 +771,7 @@ Usage: llama-stack-client shields register [OPTIONS] -## CLI Reference +## toolgroups Manage available tool groups. @@ -797,7 +797,7 @@ Usage: llama-stack-client toolgroups [OPTIONS] COMMAND [ARGS]... -### CLI Reference +### get Show available llama toolgroups at distribution endpoint @@ -817,7 +817,7 @@ Usage: llama-stack-client toolgroups get [OPTIONS] TOOLGROUP_ID -### CLI Reference +### list Show available llama toolgroups at distribution endpoint @@ -833,7 +833,7 @@ Usage: llama-stack-client toolgroups list [OPTIONS] -### CLI Reference +### register Register a new toolgroup at distribution endpoint @@ -861,7 +861,7 @@ Usage: llama-stack-client toolgroups register [OPTIONS] TOOLGROUP_ID -### CLI Reference +### unregister Unregister a toolgroup from distribution endpoint @@ -881,7 +881,7 @@ Usage: llama-stack-client toolgroups unregister [OPTIONS] TOOLGROUP_ID -## CLI Reference +## vector-dbs Manage vector databases. @@ -905,7 +905,7 @@ Usage: llama-stack-client vector-dbs [OPTIONS] COMMAND [ARGS]... 
-### CLI Reference +### list Show available vector dbs on distribution endpoint @@ -921,7 +921,7 @@ Usage: llama-stack-client vector-dbs list [OPTIONS] -### CLI Reference +### register Create a new vector db @@ -949,7 +949,7 @@ Usage: llama-stack-client vector-dbs register [OPTIONS] VECTOR_DB_ID -### CLI Reference +### unregister Delete a vector db diff --git a/scripts/gen_cli_doc.py b/scripts/gen_cli_doc.py index c2862434..6d507744 100644 --- a/scripts/gen_cli_doc.py +++ b/scripts/gen_cli_doc.py @@ -17,7 +17,10 @@ def generate_markdown_docs(command, parent=None, level=1): # Start with the command name as a header prefix = "#" * level - doc = [f"{prefix} CLI Reference\n"] + if level == 1: + doc = [f"{prefix} CLI Reference\n"] + else: + doc = [f"{prefix} {command.name}\n"] # Add command help docstring if command.help: From 1fa564a357ae9ba6b4d449967fded39751a45833 Mon Sep 17 00:00:00 2001 From: Xi Yan Date: Mon, 10 Mar 2025 11:57:20 -0700 Subject: [PATCH 08/10] better --- docs/cli_reference.md | 86 ------------------------------------------ scripts/gen_cli_doc.py | 1 - 2 files changed, 87 deletions(-) diff --git a/docs/cli_reference.md b/docs/cli_reference.md index b1251740..76d7d032 100644 --- a/docs/cli_reference.md +++ b/docs/cli_reference.md @@ -2,8 +2,6 @@ Welcome to the llama-stack-client CLI - a command-line interface for interacting with Llama Stack -### Usage - ``` Usage: llama-stack-client [OPTIONS] COMMAND [ARGS]... ``` @@ -54,8 +52,6 @@ Usage: llama-stack-client [OPTIONS] COMMAND [ARGS]... Configure Llama Stack Client CLI. -### Usage - ``` Usage: llama-stack-client configure [OPTIONS] ``` @@ -74,8 +70,6 @@ Usage: llama-stack-client configure [OPTIONS] Manage datasets. -### Usage - ``` Usage: llama-stack-client datasets [OPTIONS] COMMAND [ARGS]... ``` @@ -96,8 +90,6 @@ Usage: llama-stack-client datasets [OPTIONS] COMMAND [ARGS]... Show available datasets on distribution endpoint -### Usage - ``` Usage: llama-stack-client datasets list [OPTIONS] ``` @@ -112,8 +104,6 @@ Usage: llama-stack-client datasets list [OPTIONS] Create a new dataset -### Usage - ``` Usage: llama-stack-client datasets register [OPTIONS] ``` @@ -142,8 +132,6 @@ Usage: llama-stack-client datasets register [OPTIONS] Run evaluation tasks. -### Usage - ``` Usage: llama-stack-client eval [OPTIONS] COMMAND [ARGS]... ``` @@ -164,8 +152,6 @@ Usage: llama-stack-client eval [OPTIONS] COMMAND [ARGS]... Run a evaluation benchmark task -### Usage - ``` Usage: llama-stack-client eval run-benchmark [OPTIONS] BENCHMARK_IDS... ``` @@ -200,8 +186,6 @@ Usage: llama-stack-client eval run-benchmark [OPTIONS] BENCHMARK_IDS... Run scoring from application datasets -### Usage - ``` Usage: llama-stack-client eval run-scoring [OPTIONS] SCORING_FUNCTION_IDS... ``` @@ -232,8 +216,6 @@ Usage: llama-stack-client eval run-scoring [OPTIONS] SCORING_FUNCTION_IDS... Manage evaluation tasks. -### Usage - ``` Usage: llama-stack-client eval-tasks [OPTIONS] COMMAND [ARGS]... ``` @@ -254,8 +236,6 @@ Usage: llama-stack-client eval-tasks [OPTIONS] COMMAND [ARGS]... Show available eval tasks on distribution endpoint -### Usage - ``` Usage: llama-stack-client eval-tasks list [OPTIONS] ``` @@ -270,8 +250,6 @@ Usage: llama-stack-client eval-tasks list [OPTIONS] Register a new eval task -### Usage - ``` Usage: llama-stack-client eval-tasks register [OPTIONS] ``` @@ -298,8 +276,6 @@ Usage: llama-stack-client eval-tasks register [OPTIONS] Inference (chat). -### Usage - ``` Usage: llama-stack-client inference [OPTIONS] COMMAND [ARGS]... 
``` @@ -318,8 +294,6 @@ Usage: llama-stack-client inference [OPTIONS] COMMAND [ARGS]... Show available inference chat completion endpoints on distribution endpoint -### Usage - ``` Usage: llama-stack-client inference chat-completion [OPTIONS] ``` @@ -342,8 +316,6 @@ Usage: llama-stack-client inference chat-completion [OPTIONS] Inspect server configuration. -### Usage - ``` Usage: llama-stack-client inspect [OPTIONS] COMMAND [ARGS]... ``` @@ -362,8 +334,6 @@ Usage: llama-stack-client inspect [OPTIONS] COMMAND [ARGS]... Show available providers on distribution endpoint -### Usage - ``` Usage: llama-stack-client inspect version [OPTIONS] ``` @@ -378,8 +348,6 @@ Usage: llama-stack-client inspect version [OPTIONS] Manage GenAI models. -### Usage - ``` Usage: llama-stack-client models [OPTIONS] COMMAND [ARGS]... ``` @@ -404,8 +372,6 @@ Usage: llama-stack-client models [OPTIONS] COMMAND [ARGS]... Show available llama models at distribution endpoint -### Usage - ``` Usage: llama-stack-client models get [OPTIONS] MODEL_ID ``` @@ -424,8 +390,6 @@ Usage: llama-stack-client models get [OPTIONS] MODEL_ID Show available llama models at distribution endpoint -### Usage - ``` Usage: llama-stack-client models list [OPTIONS] ``` @@ -440,8 +404,6 @@ Usage: llama-stack-client models list [OPTIONS] Register a new model at distribution endpoint -### Usage - ``` Usage: llama-stack-client models register [OPTIONS] MODEL_ID ``` @@ -466,8 +428,6 @@ Usage: llama-stack-client models register [OPTIONS] MODEL_ID Unregister a model from distribution endpoint -### Usage - ``` Usage: llama-stack-client models unregister [OPTIONS] MODEL_ID ``` @@ -486,8 +446,6 @@ Usage: llama-stack-client models unregister [OPTIONS] MODEL_ID Post-training. -### Usage - ``` Usage: llama-stack-client post-training [OPTIONS] COMMAND [ARGS]... ``` @@ -514,8 +472,6 @@ Usage: llama-stack-client post-training [OPTIONS] COMMAND [ARGS]... Get the training artifacts of a specific post training job -### Usage - ``` Usage: llama-stack-client post-training artifacts [OPTIONS] ``` @@ -532,8 +488,6 @@ Usage: llama-stack-client post-training artifacts [OPTIONS] Cancel the training job -### Usage - ``` Usage: llama-stack-client post-training cancel [OPTIONS] ``` @@ -550,8 +504,6 @@ Usage: llama-stack-client post-training cancel [OPTIONS] Show the list of available post training jobs -### Usage - ``` Usage: llama-stack-client post-training list [OPTIONS] ``` @@ -566,8 +518,6 @@ Usage: llama-stack-client post-training list [OPTIONS] Show the status of a specific post training job -### Usage - ``` Usage: llama-stack-client post-training status [OPTIONS] ``` @@ -584,8 +534,6 @@ Usage: llama-stack-client post-training status [OPTIONS] Kick off a supervised fine tune job -### Usage - ``` Usage: llama-stack-client post-training supervised_fine_tune [OPTIONS] @@ -611,8 +559,6 @@ Usage: llama-stack-client post-training supervised_fine_tune Manage API providers. -### Usage - ``` Usage: llama-stack-client providers [OPTIONS] COMMAND [ARGS]... ``` @@ -631,8 +577,6 @@ Usage: llama-stack-client providers [OPTIONS] COMMAND [ARGS]... Show available providers on distribution endpoint -### Usage - ``` Usage: llama-stack-client providers list [OPTIONS] ``` @@ -647,8 +591,6 @@ Usage: llama-stack-client providers list [OPTIONS] Manage scoring functions. -### Usage - ``` Usage: llama-stack-client scoring-functions [OPTIONS] COMMAND [ARGS]... ``` @@ -669,8 +611,6 @@ Usage: llama-stack-client scoring-functions [OPTIONS] COMMAND [ARGS]... 
Show available scoring functions on distribution endpoint -### Usage - ``` Usage: llama-stack-client scoring-functions list [OPTIONS] ``` @@ -685,8 +625,6 @@ Usage: llama-stack-client scoring-functions list [OPTIONS] Register a new scoring function -### Usage - ``` Usage: llama-stack-client scoring-functions register [OPTIONS] ``` @@ -713,8 +651,6 @@ Usage: llama-stack-client scoring-functions register [OPTIONS] Manage safety shield services. -### Usage - ``` Usage: llama-stack-client shields [OPTIONS] COMMAND [ARGS]... ``` @@ -735,8 +671,6 @@ Usage: llama-stack-client shields [OPTIONS] COMMAND [ARGS]... Show available safety shields on distribution endpoint -### Usage - ``` Usage: llama-stack-client shields list [OPTIONS] ``` @@ -751,8 +685,6 @@ Usage: llama-stack-client shields list [OPTIONS] Register a new safety shield -### Usage - ``` Usage: llama-stack-client shields register [OPTIONS] ``` @@ -775,8 +707,6 @@ Usage: llama-stack-client shields register [OPTIONS] Manage available tool groups. -### Usage - ``` Usage: llama-stack-client toolgroups [OPTIONS] COMMAND [ARGS]... ``` @@ -801,8 +731,6 @@ Usage: llama-stack-client toolgroups [OPTIONS] COMMAND [ARGS]... Show available llama toolgroups at distribution endpoint -### Usage - ``` Usage: llama-stack-client toolgroups get [OPTIONS] TOOLGROUP_ID ``` @@ -821,8 +749,6 @@ Usage: llama-stack-client toolgroups get [OPTIONS] TOOLGROUP_ID Show available llama toolgroups at distribution endpoint -### Usage - ``` Usage: llama-stack-client toolgroups list [OPTIONS] ``` @@ -837,8 +763,6 @@ Usage: llama-stack-client toolgroups list [OPTIONS] Register a new toolgroup at distribution endpoint -### Usage - ``` Usage: llama-stack-client toolgroups register [OPTIONS] TOOLGROUP_ID ``` @@ -865,8 +789,6 @@ Usage: llama-stack-client toolgroups register [OPTIONS] TOOLGROUP_ID Unregister a toolgroup from distribution endpoint -### Usage - ``` Usage: llama-stack-client toolgroups unregister [OPTIONS] TOOLGROUP_ID ``` @@ -885,8 +807,6 @@ Usage: llama-stack-client toolgroups unregister [OPTIONS] TOOLGROUP_ID Manage vector databases. -### Usage - ``` Usage: llama-stack-client vector-dbs [OPTIONS] COMMAND [ARGS]... ``` @@ -909,8 +829,6 @@ Usage: llama-stack-client vector-dbs [OPTIONS] COMMAND [ARGS]... 
Show available vector dbs on distribution endpoint -### Usage - ``` Usage: llama-stack-client vector-dbs list [OPTIONS] ``` @@ -925,8 +843,6 @@ Usage: llama-stack-client vector-dbs list [OPTIONS] Create a new vector db -### Usage - ``` Usage: llama-stack-client vector-dbs register [OPTIONS] VECTOR_DB_ID ``` @@ -953,8 +869,6 @@ Usage: llama-stack-client vector-dbs register [OPTIONS] VECTOR_DB_ID Delete a vector db -### Usage - ``` Usage: llama-stack-client vector-dbs unregister [OPTIONS] VECTOR_DB_ID ``` diff --git a/scripts/gen_cli_doc.py b/scripts/gen_cli_doc.py index 6d507744..f20ef0ab 100644 --- a/scripts/gen_cli_doc.py +++ b/scripts/gen_cli_doc.py @@ -27,7 +27,6 @@ def generate_markdown_docs(command, parent=None, level=1): doc.append(f"{command.help}\n") # Add usage - doc.append("### Usage\n") doc.append(f"```\n{command.get_usage(ctx)}\n```\n") # Add options if present From f5a1c5fea782020e176dd523becbecca33301a97 Mon Sep 17 00:00:00 2001 From: Xi Yan Date: Mon, 10 Mar 2025 11:58:33 -0700 Subject: [PATCH 09/10] less header --- scripts/gen_cli_doc.py | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/scripts/gen_cli_doc.py b/scripts/gen_cli_doc.py index f20ef0ab..859b88d2 100644 --- a/scripts/gen_cli_doc.py +++ b/scripts/gen_cli_doc.py @@ -34,7 +34,7 @@ def generate_markdown_docs(command, parent=None, level=1): for param in command.get_params(ctx): if isinstance(param, click.Option): if not has_options: - doc.append("### Options\n") + doc.append("**Options**\n") has_options = True opts = ", ".join(param.opts) help_text = param.help or "" @@ -46,14 +46,14 @@ def generate_markdown_docs(command, parent=None, level=1): for param in command.get_params(ctx): if isinstance(param, click.Argument): if not has_arguments: - doc.append("### Arguments\n") + doc.append("**Arguments**\n") has_arguments = True name = param.name.upper() doc.append(f"* **{name}**\n") # If this is a group with commands, add subcommands if isinstance(command, click.Group): - doc.append("### Commands\n") + doc.append("**Commands**\n") for cmd_name in command.list_commands(ctx): cmd = command.get_command(ctx, cmd_name) cmd_help = cmd.get_short_help_str() if cmd else "" From 23f2a1eac0e4b9a6678ec60ca7bd24091f255b53 Mon Sep 17 00:00:00 2001 From: Xi Yan Date: Mon, 10 Mar 2025 12:35:49 -0700 Subject: [PATCH 10/10] less header --- docs/cli_reference.md | 132 +++++++++++++++++++++--------------------- 1 file changed, 66 insertions(+), 66 deletions(-) diff --git a/docs/cli_reference.md b/docs/cli_reference.md index 76d7d032..d3e403b7 100644 --- a/docs/cli_reference.md +++ b/docs/cli_reference.md @@ -6,7 +6,7 @@ Welcome to the llama-stack-client CLI - a command-line interface for interacting Usage: llama-stack-client [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -18,7 +18,7 @@ Usage: llama-stack-client [OPTIONS] COMMAND [ARGS]... * **--config**: Path to config file -### Commands +**Commands** * **configure**: Configure Llama Stack Client CLI. @@ -56,7 +56,7 @@ Configure Llama Stack Client CLI. Usage: llama-stack-client configure [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -74,11 +74,11 @@ Manage datasets. Usage: llama-stack-client datasets [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **list**: Show available datasets on distribution... 
@@ -94,7 +94,7 @@ Show available datasets on distribution endpoint Usage: llama-stack-client datasets list [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -108,7 +108,7 @@ Create a new dataset Usage: llama-stack-client datasets register [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -136,11 +136,11 @@ Run evaluation tasks. Usage: llama-stack-client eval [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **run-benchmark**: Run a evaluation benchmark task @@ -156,7 +156,7 @@ Run a evaluation benchmark task Usage: llama-stack-client eval run-benchmark [OPTIONS] BENCHMARK_IDS... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -176,7 +176,7 @@ Usage: llama-stack-client eval run-benchmark [OPTIONS] BENCHMARK_IDS... * **--visualize**: Visualize evaluation results after completion [default: False] -### Arguments +**Arguments** * **BENCHMARK_IDS** @@ -190,7 +190,7 @@ Run scoring from application datasets Usage: llama-stack-client eval run-scoring [OPTIONS] SCORING_FUNCTION_IDS... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -206,7 +206,7 @@ Usage: llama-stack-client eval run-scoring [OPTIONS] SCORING_FUNCTION_IDS... * **--visualize**: Visualize evaluation results after completion [default: False] -### Arguments +**Arguments** * **SCORING_FUNCTION_IDS** @@ -220,11 +220,11 @@ Manage evaluation tasks. Usage: llama-stack-client eval-tasks [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **list**: Show available eval tasks on distribution... @@ -240,7 +240,7 @@ Show available eval tasks on distribution endpoint Usage: llama-stack-client eval-tasks list [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -254,7 +254,7 @@ Register a new eval task Usage: llama-stack-client eval-tasks register [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -280,11 +280,11 @@ Inference (chat). Usage: llama-stack-client inference [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **chat-completion**: Show available inference chat completion... @@ -298,7 +298,7 @@ Show available inference chat completion endpoints on distribution endpoint Usage: llama-stack-client inference chat-completion [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -320,11 +320,11 @@ Inspect server configuration. Usage: llama-stack-client inspect [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **version**: Show available providers on distribution... @@ -338,7 +338,7 @@ Show available providers on distribution endpoint Usage: llama-stack-client inspect version [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -352,11 +352,11 @@ Manage GenAI models. Usage: llama-stack-client models [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. 
[default: False] -### Commands +**Commands** * **get**: Show available llama models at distribution... @@ -376,11 +376,11 @@ Show available llama models at distribution endpoint Usage: llama-stack-client models get [OPTIONS] MODEL_ID ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Arguments +**Arguments** * **MODEL_ID** @@ -394,7 +394,7 @@ Show available llama models at distribution endpoint Usage: llama-stack-client models list [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -408,7 +408,7 @@ Register a new model at distribution endpoint Usage: llama-stack-client models register [OPTIONS] MODEL_ID ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -418,7 +418,7 @@ Usage: llama-stack-client models register [OPTIONS] MODEL_ID * **--metadata**: JSON metadata for the model -### Arguments +**Arguments** * **MODEL_ID** @@ -432,11 +432,11 @@ Unregister a model from distribution endpoint Usage: llama-stack-client models unregister [OPTIONS] MODEL_ID ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Arguments +**Arguments** * **MODEL_ID** @@ -450,11 +450,11 @@ Post-training. Usage: llama-stack-client post-training [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **artifacts**: Get the training artifacts of a specific post... @@ -476,7 +476,7 @@ Get the training artifacts of a specific post training job Usage: llama-stack-client post-training artifacts [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -492,7 +492,7 @@ Cancel the training job Usage: llama-stack-client post-training cancel [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -508,7 +508,7 @@ Show the list of available post training jobs Usage: llama-stack-client post-training list [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -522,7 +522,7 @@ Show the status of a specific post training job Usage: llama-stack-client post-training status [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -539,7 +539,7 @@ Usage: llama-stack-client post-training supervised_fine_tune [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -563,11 +563,11 @@ Manage API providers. Usage: llama-stack-client providers [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **list**: Show available providers on distribution... @@ -581,7 +581,7 @@ Show available providers on distribution endpoint Usage: llama-stack-client providers list [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -595,11 +595,11 @@ Manage scoring functions. Usage: llama-stack-client scoring-functions [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **list**: Show available scoring functions on... 
@@ -615,7 +615,7 @@ Show available scoring functions on distribution endpoint Usage: llama-stack-client scoring-functions list [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -629,7 +629,7 @@ Register a new scoring function Usage: llama-stack-client scoring-functions register [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -655,11 +655,11 @@ Manage safety shield services. Usage: llama-stack-client shields [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **list**: Show available safety shields on distribution... @@ -675,7 +675,7 @@ Show available safety shields on distribution endpoint Usage: llama-stack-client shields list [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -689,7 +689,7 @@ Register a new safety shield Usage: llama-stack-client shields register [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -711,11 +711,11 @@ Manage available tool groups. Usage: llama-stack-client toolgroups [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **get**: Show available llama toolgroups at... @@ -735,11 +735,11 @@ Show available llama toolgroups at distribution endpoint Usage: llama-stack-client toolgroups get [OPTIONS] TOOLGROUP_ID ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Arguments +**Arguments** * **TOOLGROUP_ID** @@ -753,7 +753,7 @@ Show available llama toolgroups at distribution endpoint Usage: llama-stack-client toolgroups list [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -767,7 +767,7 @@ Register a new toolgroup at distribution endpoint Usage: llama-stack-client toolgroups register [OPTIONS] TOOLGROUP_ID ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -779,7 +779,7 @@ Usage: llama-stack-client toolgroups register [OPTIONS] TOOLGROUP_ID * **--args**: JSON args for the toolgroup -### Arguments +**Arguments** * **TOOLGROUP_ID** @@ -793,11 +793,11 @@ Unregister a toolgroup from distribution endpoint Usage: llama-stack-client toolgroups unregister [OPTIONS] TOOLGROUP_ID ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Arguments +**Arguments** * **TOOLGROUP_ID** @@ -811,11 +811,11 @@ Manage vector databases. Usage: llama-stack-client vector-dbs [OPTIONS] COMMAND [ARGS]... ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Commands +**Commands** * **list**: Show available vector dbs on distribution... @@ -833,7 +833,7 @@ Show available vector dbs on distribution endpoint Usage: llama-stack-client vector-dbs list [OPTIONS] ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] @@ -847,7 +847,7 @@ Create a new vector db Usage: llama-stack-client vector-dbs register [OPTIONS] VECTOR_DB_ID ``` -### Options +**Options** * **-h, --help**: Show this message and exit. 
[default: False] @@ -859,7 +859,7 @@ Usage: llama-stack-client vector-dbs register [OPTIONS] VECTOR_DB_ID * **--embedding-dimension**: Embedding dimension (for vector type) [default: 384] -### Arguments +**Arguments** * **VECTOR_DB_ID** @@ -873,10 +873,10 @@ Delete a vector db Usage: llama-stack-client vector-dbs unregister [OPTIONS] VECTOR_DB_ID ``` -### Options +**Options** * **-h, --help**: Show this message and exit. [default: False] -### Arguments +**Arguments** * **VECTOR_DB_ID**
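
A note on the generator itself: patches 07 through 10 keep reshaping the output of `scripts/gen_cli_doc.py`, but the script is only ever visible here as diff hunks. The sketch below is a minimal, self-contained approximation of where those hunks leave it, useful for checking that `docs/cli_reference.md` matches what the code would emit. The header logic, the dropped `### Usage` header, and the `**Options**`/`**Arguments**`/`**Commands**` labels come straight from the hunks; the import path of the root `llama_stack_client` group, the output path, and the `[default: ...]` rule are assumptions not shown in the patches.

```python
# gen_cli_doc_sketch.py - standalone approximation of scripts/gen_cli_doc.py as it
# stands after patches 07-10. Import path of the root click group and the output
# location are assumptions; the header/label logic mirrors the hunks above.
import click

# Assumed import path, based on src/llama_stack_client/lib/cli/llama_stack_client.py
from llama_stack_client.lib.cli.llama_stack_client import llama_stack_client

OUTPUT_PATH = "docs/cli_reference.md"  # assumed; only the generated file is shown


def generate_markdown_docs(command, parent=None, level=1):
    """Render a click command (or group) and its subcommands as markdown lines."""
    ctx = click.Context(command, info_name=command.name, parent=parent)
    prefix = "#" * level

    # Patch 07: the literal "CLI Reference" header is reserved for the root;
    # nested sections are titled with the command's own name.
    if level == 1:
        doc = [f"{prefix} CLI Reference\n"]
    else:
        doc = [f"{prefix} {command.name}\n"]

    if command.help:
        doc.append(f"{command.help}\n")

    # Patch 08: no separate "### Usage" header, just the fenced usage block.
    doc.append(f"```\n{command.get_usage(ctx)}\n```\n")

    options = [p for p in command.get_params(ctx) if isinstance(p, click.Option)]
    if options:
        doc.append("**Options**\n")  # patch 09: bold label instead of a "###" header
        for opt in options:
            help_text = opt.help or ""
            if opt.default is not None:
                # Inferred rule: the generated doc shows "[default: ...]" only when
                # the option actually has a default (including False and "").
                help_text += f" [default: {opt.default}]"
            doc.append(f"* **{', '.join(opt.opts)}**: {help_text}\n")

    arguments = [p for p in command.get_params(ctx) if isinstance(p, click.Argument)]
    if arguments:
        doc.append("**Arguments**\n")
        for arg in arguments:
            doc.append(f"* **{arg.name.upper()}**\n")

    if isinstance(command, click.Group):
        doc.append("**Commands**\n")
        for cmd_name in command.list_commands(ctx):
            cmd = command.get_command(ctx, cmd_name)
            doc.append(f"* **{cmd_name}**: {cmd.get_short_help_str() if cmd else ''}\n")
        # Recurse one heading level deeper for each subcommand.
        for cmd_name in command.list_commands(ctx):
            cmd = command.get_command(ctx, cmd_name)
            if cmd is not None:
                doc.extend(generate_markdown_docs(cmd, parent=ctx, level=level + 1))

    return doc


if __name__ == "__main__":
    with open(OUTPUT_PATH, "w") as f:
        # Entries already end in "\n"; joining on "\n" yields the blank-line
        # separated layout seen in docs/cli_reference.md.
        f.write("\n".join(generate_markdown_docs(llama_stack_client)))
```

Under those assumptions, regenerating the reference is a matter of running the script and committing the resulting `docs/cli_reference.md`; diffing the script's output against the committed file is also a quick way to confirm the doc stays in sync with the CLI.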