## Changes
Add a minimal README that will be used for the PyPI package.
## Why
We need to give the PyPI package a README in case it gets discovered
before the CLI.
Python for Databricks Asset Bundles extends [Databricks Asset Bundles](https://docs.databricks.com/aws/en/dev-tools/bundles/) so that you can:
- Define jobs and pipelines as Python code, as shown in the example below. These jobs can coexist with jobs defined in YAML.
- Dynamically create jobs and pipelines using metadata.
- Modify jobs and pipelines defined in YAML or Python during bundle deployment.
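
For example, a job can be declared as ordinary Python data. The following is a minimal sketch based on the reference documentation linked below: `Job.from_dict` comes from the `databricks.bundles.jobs` module, and the job name, task key, notebook paths, and table names are illustrative placeholders.

```python
from databricks.bundles.jobs import Job

# A sketch of a job defined in Python. The field names mirror the YAML job
# schema; the name, task key, and notebook path are placeholders.
hello_job = Job.from_dict(
    {
        "name": "hello_job",
        "tasks": [
            {
                "task_key": "hello_task",
                "notebook_task": {"notebook_path": "src/hello.ipynb"},
            },
        ],
    }
)

# Jobs can also be created dynamically from metadata by building the same
# objects in a loop. The table names here are hypothetical; how the resulting
# jobs are registered with the bundle is covered in the reference
# documentation linked below.
ingest_jobs = [
    Job.from_dict(
        {
            "name": f"ingest_{table}",
            "tasks": [
                {
                    "task_key": "ingest",
                    "notebook_task": {"notebook_path": f"src/ingest_{table}.ipynb"},
                },
            ],
        }
    )
    for table in ["orders", "customers"]
]
```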
Documentation is available at https://docs.databricks.com/dev-tools/cli/databricks-cli.html.
Reference documentation is available at https://databricks.github.io/cli/experimental/python/
## Getting started
To use `databricks-bundles`, you must first:
1. Install the [Databricks CLI](https://github.com/databricks/cli), version 0.247.1 or above
2. Authenticate to your Databricks workspace if you have not done so already:
```bash
databricks configure
```
3. To create a new project, initialize a bundle using the `experimental-jobs-as-code` template:
```bash
databricks bundle init experimental-jobs-as-code
```
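
Once a project is set up, jobs defined in YAML or Python can also be adjusted during deployment with a mutator. The sketch below assumes the `job_mutator` decorator and `Bundle` type exposed by `databricks.bundles.core` as described in the reference documentation; the `max_concurrent_runs` override is only an example.

```python
from dataclasses import replace

from databricks.bundles.core import Bundle, job_mutator
from databricks.bundles.jobs import Job

# A sketch of a mutator applied to every job during `databricks bundle deploy`.
# The decorator, parameter types, and the `max_concurrent_runs` field follow
# the reference documentation and the Jobs API schema; treat them as assumptions.
@job_mutator
def limit_concurrency(bundle: Bundle, job: Job) -> Job:
    # Job is a dataclass, so dataclasses.replace returns an updated copy
    # without modifying the original definition.
    return replace(job, max_concurrent_runs=1)
```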
## Privacy Notice
Databricks CLI use is subject to the [Databricks License](https://github.com/databricks/cli/blob/main/LICENSE) and [Databricks Privacy Notice](https://www.databricks.com/legal/privacynotice), including any Usage Data provisions.