Commit 6a38b39

[1/n] improve github workflows for client repo (#77)
# What

- add pre-commit workflow
- add issue / PR template
- next: add testpypi package workflows as upstream for llamastack/llama-stack#734
1 parent 2183dc9 commit 6a38b39

File tree

9 files changed: +135 −59 lines changed

.github/ISSUE_TEMPLATE/bug.yml

Lines changed: 77 additions & 0 deletions (new file)

`````yaml
name: 🐛 Bug Report
description: Create a report to help us reproduce and fix the bug

body:
  - type: markdown
    attributes:
      value: >
        #### Before submitting a bug, please make sure the issue hasn't been already addressed by searching through [the
        existing and past issues](https://github.com/meta-llama/llama-stack/issues).

  - type: textarea
    id: system-info
    attributes:
      label: System Info
      description: |
        Please share your system info with us. You can use the following command to capture your environment information
        python -m "torch.utils.collect_env"

      placeholder: |
        PyTorch version, CUDA version, GPU type, #num of GPUs...
    validations:
      required: true

  - type: checkboxes
    id: information-scripts-examples
    attributes:
      label: Information
      description: 'The problem arises when using:'
      options:
        - label: "The official example scripts"
        - label: "My own modified scripts"

  - type: textarea
    id: bug-description
    attributes:
      label: 🐛 Describe the bug
      description: |
        Please provide a clear and concise description of what the bug is.

        Please also paste or describe the results you observe instead of the expected results.
      placeholder: |
        A clear and concise description of what the bug is.

        ```llama stack
        # Command that you used for running the examples
        ```
        Description of the results
    validations:
      required: true

  - type: textarea
    attributes:
      label: Error logs
      description: |
        If you observe an error, please paste the error message including the **full** traceback of the exception. It may be relevant to wrap error messages in ```` ```triple quotes blocks``` ````.

      placeholder: |
        ```
        The error message you got, with the full traceback.
        ```

    validations:
      required: true


  - type: textarea
    id: expected-behavior
    validations:
      required: true
    attributes:
      label: Expected behavior
      description: "A clear and concise description of what you would expect to happen."

  - type: markdown
    attributes:
      value: >
        Thanks for contributing 🎉!
`````
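The "System Info" field above suggests `python -m "torch.utils.collect_env"` for a full PyTorch environment report. As an illustration of the kind of information that field asks for, here is a minimal stdlib-only sketch (`collect_basic_env` is a hypothetical helper, not part of the template or the repo):

```python
# Illustrative only: a stdlib sketch of a basic environment report,
# along the lines of what the System Info field asks for. The template
# itself recommends `python -m "torch.utils.collect_env"` instead.
import platform
import sys


def collect_basic_env() -> dict:
    """Gather a few environment facts that are useful in a bug report."""
    return {
        "python_version": sys.version.split()[0],
        "platform": platform.platform(),
        "machine": platform.machine(),
    }


if __name__ == "__main__":
    for key, value in collect_basic_env().items():
        print(f"{key}: {value}")
```

The real `torch.utils.collect_env` additionally reports CUDA version, GPU type, and installed package versions, matching the placeholder text in the form.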

.github/PULL_REQUEST_TEMPLATE.md

Lines changed: 27 additions & 0 deletions (new file)

```markdown
# What does this PR do?

In short, provide a summary of what this PR does and why. Usually, the relevant context should be present in a linked issue.

- [ ] Addresses issue (#issue)


## Test Plan

Please describe:
- tests you ran to verify your changes with result summaries.
- provide instructions so it can be reproduced.


## Sources

Please link relevant resources if necessary.


## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Ran pre-commit to handle lint / formatting issues.
- [ ] Read the [contributor guideline](https://github.com/meta-llama/llama-stack/blob/main/CONTRIBUTING.md),
      Pull Request section?
- [ ] Updated relevant documentation.
- [ ] Wrote necessary unit or integration tests.
```

.github/workflows/ci.yml

Lines changed: 0 additions & 52 deletions
This file was deleted.

.github/workflows/pre-commit.yml

Lines changed: 25 additions & 0 deletions (new file)

```yaml
name: Pre-commit

on:
  pull_request:
  push:
    branches: [main]

jobs:
  pre-commit:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@d632683dd7b4114ad314bca15554477dd762a938 # v4.2.0

      - name: Set up Python
        uses: actions/setup-python@f677139bbe7f9c59b41e40162b753c062f5d49a3 # v5.2.0
        with:
          python-version: '3.11.10'
          cache: pip
          cache-dependency-path: |
            **/requirements*.txt
            .pre-commit-config.yaml

      - uses: pre-commit/action@v3.0.1
```
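The `cache-dependency-path` setting above tells `setup-python` which files feed the pip cache key: it hashes the matched files (via the `hashFiles()` expression), so editing any of them invalidates the cache. The idea can be sketched in Python; `hash_dependency_files` is a hypothetical helper for illustration, not part of any GitHub Actions API:

```python
# Illustrative only: mimic how a content-hash cache key is derived from
# dependency files, similar in spirit to GitHub Actions' hashFiles().
import hashlib
from pathlib import Path


def hash_dependency_files(paths: list[str]) -> str:
    """Return one combined SHA-256 digest over the given files' contents."""
    digest = hashlib.sha256()
    for p in sorted(paths):  # sort so the key is order-independent
        digest.update(Path(p).read_bytes())
    return digest.hexdigest()
```

Identical file contents always yield the same key (a cache hit); changing `requirements*.txt` or `.pre-commit-config.yaml` yields a new key, forcing a fresh install.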

.pre-commit-config.yaml

Lines changed: 1 addition & 1 deletion

```diff
@@ -5,7 +5,7 @@ default_language_version:

 repos:
 - repo: https://github.com/pre-commit/pre-commit-hooks
-  rev: 6306a48f7dae5861702d573c9c247e4e9498e867
+  rev: v5.0.0
   hooks:
   - id: trailing-whitespace
   - id: check-ast
```
Lines changed: 0 additions & 1 deletion

```diff
@@ -1,4 +1,3 @@
 from .inspect import inspect

 __all__ = ["inspect"]
-
```

src/llama_stack_client/lib/cli/inspect/version.py

Lines changed: 1 addition & 2 deletions

```diff
@@ -1,6 +1,5 @@
 import click
 from rich.console import Console
-from rich.table import Table

 from ..common.utils import handle_client_errors

@@ -13,4 +12,4 @@ def inspect_version(ctx):
     client = ctx.obj["client"]
     console = Console()
     version_response = client.inspect.version()
-    console.print(version_response)
+    console.print(version_response)
```

src/llama_stack_client/lib/cli/llama_stack_client.py

Lines changed: 3 additions & 2 deletions

```diff
@@ -5,26 +5,27 @@
 # the root directory of this source tree.

 import os
+from importlib.metadata import version

 import click
 import yaml

 from llama_stack_client import LlamaStackClient
-from importlib.metadata import version

 from .configure import configure
 from .constants import get_config_file_path
 from .datasets import datasets
 from .eval import eval
 from .eval_tasks import eval_tasks
 from .inference import inference
+from .inspect import inspect
 from .memory_banks import memory_banks
 from .models import models
 from .post_training import post_training
 from .providers import providers
 from .scoring_functions import scoring_functions
 from .shields import shields
-from .inspect import inspect
+

 @click.group()
 @click.version_option(version=version("llama-stack-client"), prog_name="llama-stack-client")
```
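The last line of the diff above reads the CLI's version from package metadata rather than hardcoding it. A minimal sketch of that pattern with the stdlib's `importlib.metadata` (the `get_version` wrapper and its fallback are illustrative, not part of the repo):

```python
# Illustrative only: read an installed package's version from its
# distribution metadata, falling back to "unknown" when the package
# isn't installed. This mirrors the pattern used by the CLI diff above.
from importlib.metadata import PackageNotFoundError, version


def get_version(package: str) -> str:
    """Return the installed version of `package`, or 'unknown'."""
    try:
        return version(package)
    except PackageNotFoundError:
        return "unknown"
```

The advantage over a hardcoded `__version__` string is that the reported version always matches whatever distribution pip actually installed.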

tests/test_client.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -1631,7 +1631,7 @@ def test_get_platform(self) -> None:
         import threading

         from llama_stack_client._utils import asyncify
-        from llama_stack_client._base_client import get_platform
+        from llama_stack_client._base_client import get_platform

         async def test_main() -> None:
             result = await asyncify(get_platform)()
```
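The test above awaits `asyncify(get_platform)()`, a helper from `llama_stack_client._utils` that makes a blocking function awaitable. Its actual implementation isn't shown in this commit; a common stdlib-only way to build such a wrapper (here named `asyncify_sketch` to avoid implying it matches the library's code) is `asyncio.to_thread`:

```python
# Illustrative only: one way to implement an asyncify-style wrapper with
# the stdlib. The real llama_stack_client._utils.asyncify may differ.
import asyncio
import functools
from typing import Any, Callable


def asyncify_sketch(func: Callable[..., Any]) -> Callable[..., Any]:
    """Wrap a blocking function so it can be awaited; runs it in a thread."""
    @functools.wraps(func)
    async def wrapper(*args: Any, **kwargs: Any) -> Any:
        # to_thread offloads the call to the default executor thread pool,
        # keeping the event loop free while the blocking call runs.
        return await asyncio.to_thread(func, *args, **kwargs)
    return wrapper


if __name__ == "__main__":
    import platform
    print(asyncio.run(asyncify_sketch(platform.system)()))
```

Running blocking calls in a thread this way is why the surrounding test imports `threading`: it can assert the wrapped call executed off the main event-loop thread.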
