2 changes: 1 addition & 1 deletion .codegen/_openapi_sha
@@ -1 +1 @@
a8f547d3728fba835fbdda301e846829c5cbbef5
033bcb9242b006001e2cf3956896711681de1a8c
3 changes: 2 additions & 1 deletion .gitattributes
@@ -38,6 +38,7 @@ cmd/account/workspace-assignment/workspace-assignment.go linguist-generated=true
cmd/account/workspace-network-configuration/workspace-network-configuration.go linguist-generated=true
cmd/account/workspaces/workspaces.go linguist-generated=true
cmd/workspace/access-control/access-control.go linguist-generated=true
cmd/workspace/ai-builder/ai-builder.go linguist-generated=true
cmd/workspace/aibi-dashboard-embedding-access-policy/aibi-dashboard-embedding-access-policy.go linguist-generated=true
cmd/workspace/aibi-dashboard-embedding-approved-domains/aibi-dashboard-embedding-approved-domains.go linguist-generated=true
cmd/workspace/alerts-legacy/alerts-legacy.go linguist-generated=true
@@ -63,7 +64,6 @@ cmd/workspace/consumer-providers/consumer-providers.go linguist-generated=true
cmd/workspace/credentials-manager/credentials-manager.go linguist-generated=true
cmd/workspace/credentials/credentials.go linguist-generated=true
cmd/workspace/current-user/current-user.go linguist-generated=true
cmd/workspace/custom-llms/custom-llms.go linguist-generated=true
cmd/workspace/dashboard-email-subscriptions/dashboard-email-subscriptions.go linguist-generated=true
cmd/workspace/dashboard-widgets/dashboard-widgets.go linguist-generated=true
cmd/workspace/dashboards/dashboards.go linguist-generated=true
@@ -78,6 +78,7 @@ cmd/workspace/enable-results-downloading/enable-results-downloading.go linguist-
cmd/workspace/enhanced-security-monitoring/enhanced-security-monitoring.go linguist-generated=true
cmd/workspace/experiments/experiments.go linguist-generated=true
cmd/workspace/external-locations/external-locations.go linguist-generated=true
cmd/workspace/feature-store/feature-store.go linguist-generated=true
cmd/workspace/forecasting/forecasting.go linguist-generated=true
cmd/workspace/functions/functions.go linguist-generated=true
cmd/workspace/genie/genie.go linguist-generated=true
3 changes: 3 additions & 0 deletions NEXT_CHANGELOG.md
@@ -13,3 +13,6 @@
](https://docs.databricks.com/dev-tools/bundles/python). ([#3102](https://github.com/databricks/cli/pull/3102))

### API Changes
* Removed `databricks custom-llms` command group.
* Added `databricks ai-builder` command group.
* Added `databricks feature-store` command group.
3 changes: 3 additions & 0 deletions acceptance/help/output.txt
@@ -124,6 +124,9 @@ Clean Rooms
clean-room-task-runs Clean room task runs are the executions of notebooks in a clean room.
clean-rooms A clean room uses Delta Sharing and serverless compute to provide a secure and privacy-protecting environment where multiple parties can work together on sensitive enterprise data without direct access to each other’s data.

Database
database Database Instances provide access to a database via REST API or direct SQL.

Quality Monitor v2
quality-monitor-v2 Manage data quality of UC objects (currently support schema).

44 changes: 43 additions & 1 deletion bundle/internal/schema/annotations_openapi.yml
@@ -192,6 +192,9 @@ github.com/databricks/cli/bundle/config/resources.Cluster:
"policy_id":
"description": |-
The ID of the cluster policy used to create the cluster if applicable.
"remote_disk_throughput":
"description": |-
If set, what the configurable throughput (in Mb/s) for the remote disk is. Currently only supported for GCP HYPERDISK_BALANCED disks.
"runtime_engine":
"description": |-
Determines the cluster's runtime engine, either standard or Photon.
@@ -232,6 +235,9 @@ github.com/databricks/cli/bundle/config/resources.Cluster:
SSH public key contents that will be added to each Spark node in this cluster. The
corresponding private keys can be used to login with the user name `ubuntu` on port `2200`.
Up to 10 keys can be specified.
"total_initial_remote_disk_size":
"description": |-
If set, what the total initial volume size (in GB) of the remote disks should be. Currently only supported for GCP HYPERDISK_BALANCED disks.
"use_ml_runtime":
"description": |-
This field can only be used when `kind = CLASSIC_PREVIEW`.
@@ -477,6 +483,11 @@ github.com/databricks/cli/bundle/config/resources.Pipeline:
"edition":
"description": |-
Pipeline product edition.
"environment":
"description": |-
Environment specification for this pipeline used to install dependencies.
"x-databricks-preview": |-
PRIVATE
"event_log":
"description": |-
Event log configuration for this pipeline
@@ -1300,6 +1311,9 @@ github.com/databricks/databricks-sdk-go/service/compute.ClusterSpec:
"policy_id":
"description": |-
The ID of the cluster policy used to create the cluster if applicable.
"remote_disk_throughput":
"description": |-
If set, what the configurable throughput (in Mb/s) for the remote disk is. Currently only supported for GCP HYPERDISK_BALANCED disks.
"runtime_engine":
"description": |-
Determines the cluster's runtime engine, either standard or Photon.
@@ -1340,6 +1354,9 @@ github.com/databricks/databricks-sdk-go/service/compute.ClusterSpec:
SSH public key contents that will be added to each Spark node in this cluster. The
corresponding private keys can be used to login with the user name `ubuntu` on port `2200`.
Up to 10 keys can be specified.
"total_initial_remote_disk_size":
"description": |-
If set, what the total initial volume size (in GB) of the remote disks should be. Currently only supported for GCP HYPERDISK_BALANCED disks.
"use_ml_runtime":
"description": |-
This field can only be used when `kind = CLASSIC_PREVIEW`.
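The hunks above add the same pair of GCP HYPERDISK_BALANCED settings to both the `resources.Cluster` and `compute.ClusterSpec` schemas. As a rough illustration only, a bundle job cluster could set them as below; everything except `remote_disk_throughput` and `total_initial_remote_disk_size` is placeholder configuration, not part of this change.

```yaml
# Hypothetical Databricks Asset Bundle fragment. Only the two new fields come
# from this PR; the surrounding job/cluster settings are illustrative.
resources:
  jobs:
    example_job:
      name: example-job
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./notebooks/main.ipynb
          new_cluster:
            spark_version: 15.4.x-scala2.12
            node_type_id: n2-highmem-4
            num_workers: 2
            # New in this change; currently only supported for GCP
            # HYPERDISK_BALANCED disks.
            remote_disk_throughput: 500          # configurable throughput, Mb/s
            total_initial_remote_disk_size: 100  # total initial volume size, GB
```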
@@ -1788,12 +1805,22 @@ github.com/databricks/databricks-sdk-go/service/jobs.DashboardTask:
Optional: The warehouse id to execute the dashboard with for the schedule.
If not specified, the default warehouse of the dashboard will be used.
github.com/databricks/databricks-sdk-go/service/jobs.DbtCloudTask:
"_":
"description": |-
Deprecated in favor of DbtPlatformTask
"connection_resource_name":
"description": |-
The resource name of the UC connection that authenticates the dbt Cloud for this task
"dbt_cloud_job_id":
"description": |-
Id of the dbt Cloud job to be triggered
github.com/databricks/databricks-sdk-go/service/jobs.DbtPlatformTask:
"connection_resource_name":
"description": |-
The resource name of the UC connection that authenticates the dbt platform for this task
"dbt_platform_job_id":
"description": |-
Id of the dbt platform job to be triggered. Specified as a string for maximum compatibility with clients.
github.com/databricks/databricks-sdk-go/service/jobs.DbtTask:
"catalog":
"description": |-
@@ -2549,7 +2576,12 @@ github.com/databricks/databricks-sdk-go/service/jobs.Task:
The task refreshes a dashboard and sends a snapshot to subscribers.
"dbt_cloud_task":
"description": |-
Task type for dbt cloud
Task type for dbt cloud, deprecated in favor of the new name dbt_platform_task
"deprecation_message": |-
This field is deprecated
"x-databricks-preview": |-
PRIVATE
"dbt_platform_task":
"x-databricks-preview": |-
PRIVATE
"dbt_task":
@@ -3083,6 +3115,16 @@ github.com/databricks/databricks-sdk-go/service/pipelines.PipelineLibrary:
github.com/databricks/databricks-sdk-go/service/pipelines.PipelineTrigger:
"cron": {}
"manual": {}
github.com/databricks/databricks-sdk-go/service/pipelines.PipelinesEnvironment:
"_":
"description": |-
The environment entity used to preserve serverless environment side panel, jobs' environment for non-notebook task, and DLT's environment for classic and serverless pipelines.
In this minimal environment spec, only pip dependencies are supported.
"dependencies":
"description": |-
List of pip dependencies, as supported by the version of pip in this environment.
Each dependency is a pip requirement file line https://pip.pypa.io/en/stable/reference/requirements-file-format/
Allowed dependency could be <requirement specifier>, <archive url/path>, <local project path>(WSFS or Volumes in Databricks), <vcs project url>
github.com/databricks/databricks-sdk-go/service/pipelines.ReportSpec:
"destination_catalog":
"description": |-
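The pipeline `environment` block introduced above (also a private preview in the schema) carries only pip dependencies, per the `PipelinesEnvironment` description. A speculative bundle pipeline showing the dependency forms that description allows; the pipeline settings, paths, and versions are placeholders:

```yaml
# Hypothetical bundle fragment. `environment.dependencies` is the new field;
# the rest of the pipeline definition is illustrative.
resources:
  pipelines:
    example_pipeline:
      name: example-pipeline
      catalog: main
      target: example_schema
      libraries:
        - notebook:
            path: ./pipelines/transform.ipynb
      environment:
        dependencies:
          - "pandas==2.2.2"                                            # requirement specifier
          - "/Volumes/main/default/libs/my_pkg-0.1-py3-none-any.whl"   # archive path (Volumes)
          - "git+https://github.com/example-org/example-repo"          # VCS project URL
```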
3 changes: 3 additions & 0 deletions bundle/internal/schema/annotations_openapi_overrides.yml
@@ -665,6 +665,9 @@ github.com/databricks/databricks-sdk-go/service/jobs.SubscriptionSubscriber:
"description": |-
PLACEHOLDER
github.com/databricks/databricks-sdk-go/service/jobs.Task:
"dbt_platform_task":
"description": |-
PLACEHOLDER
"gen_ai_compute_task":
"description": |-
PLACEHOLDER
73 changes: 72 additions & 1 deletion bundle/schema/jsonschema.json

Some generated files are not rendered by default.