Merged
2 changes: 1 addition & 1 deletion .codegen/_openapi_sha
@@ -1 +1 @@
c4784cea599325a13472b1455e7434d639362d8b
e2018bb00cba203508f8afe5a6d41bd49789ba25
1 change: 1 addition & 0 deletions NEXT_CHANGELOG.md
@@ -5,6 +5,7 @@
### Notable Changes

### CLI
* Remove previously added flags from the `jobs create` and `pipelines create` commands. ([#3870](https://github.com/databricks/cli/pull/3870))

### Dependency updates

Expand Down
22 changes: 6 additions & 16 deletions acceptance/bundle/refschema/out.fields.txt
@@ -5,7 +5,7 @@ resources.alerts.*.display_name string ALL
resources.alerts.*.effective_run_as *sql.AlertV2RunAs ALL
resources.alerts.*.effective_run_as.service_principal_name string ALL
resources.alerts.*.effective_run_as.user_name string ALL
resources.alerts.*.evaluation *sql.AlertV2Evaluation ALL
resources.alerts.*.evaluation sql.AlertV2Evaluation ALL
resources.alerts.*.evaluation.comparison_operator sql.ComparisonOperator ALL
resources.alerts.*.evaluation.empty_result_state sql.AlertEvaluationState ALL
resources.alerts.*.evaluation.last_evaluated_at string ALL
@@ -16,7 +16,7 @@ resources.alerts.*.evaluation.notification.subscriptions []sql.AlertV2Subscripti
resources.alerts.*.evaluation.notification.subscriptions[*] sql.AlertV2Subscription ALL
resources.alerts.*.evaluation.notification.subscriptions[*].destination_id string ALL
resources.alerts.*.evaluation.notification.subscriptions[*].user_email string ALL
resources.alerts.*.evaluation.source *sql.AlertV2OperandColumn ALL
resources.alerts.*.evaluation.source sql.AlertV2OperandColumn ALL
resources.alerts.*.evaluation.source.aggregation sql.Aggregation ALL
resources.alerts.*.evaluation.source.display string ALL
resources.alerts.*.evaluation.source.name string ALL
@@ -33,7 +33,7 @@ resources.alerts.*.evaluation.threshold.value.string_value string ALL
resources.alerts.*.id string ALL
resources.alerts.*.lifecycle resources.Lifecycle INPUT
resources.alerts.*.lifecycle.prevent_destroy bool INPUT
resources.alerts.*.lifecycle_state sql.LifecycleState ALL
resources.alerts.*.lifecycle_state sql.AlertLifecycleState ALL
resources.alerts.*.modified_status string INPUT
resources.alerts.*.owner_user_name string ALL
resources.alerts.*.parent_path string ALL
@@ -48,7 +48,7 @@ resources.alerts.*.run_as *sql.AlertV2RunAs ALL
resources.alerts.*.run_as.service_principal_name string ALL
resources.alerts.*.run_as.user_name string ALL
resources.alerts.*.run_as_user_name string ALL
resources.alerts.*.schedule *sql.CronSchedule ALL
resources.alerts.*.schedule sql.CronSchedule ALL
resources.alerts.*.schedule.pause_status sql.SchedulePauseStatus ALL
resources.alerts.*.schedule.quartz_cron_schedule string ALL
resources.alerts.*.schedule.timezone_id string ALL
@@ -1523,12 +1523,6 @@ resources.jobs.*.settings.trigger.pause_status jobs.PauseStatus REMOTE
resources.jobs.*.settings.trigger.periodic *jobs.PeriodicTriggerConfiguration REMOTE
resources.jobs.*.settings.trigger.periodic.interval int REMOTE
resources.jobs.*.settings.trigger.periodic.unit jobs.PeriodicTriggerConfigurationTimeUnit REMOTE
resources.jobs.*.settings.trigger.table *jobs.TableUpdateTriggerConfiguration REMOTE
resources.jobs.*.settings.trigger.table.condition jobs.Condition REMOTE
resources.jobs.*.settings.trigger.table.min_time_between_triggers_seconds int REMOTE
resources.jobs.*.settings.trigger.table.table_names []string REMOTE
resources.jobs.*.settings.trigger.table.table_names[*] string REMOTE
resources.jobs.*.settings.trigger.table.wait_after_last_change_seconds int REMOTE
resources.jobs.*.settings.trigger.table_update *jobs.TableUpdateTriggerConfiguration REMOTE
resources.jobs.*.settings.trigger.table_update.condition jobs.Condition REMOTE
resources.jobs.*.settings.trigger.table_update.min_time_between_triggers_seconds int REMOTE
@@ -2194,12 +2188,6 @@ resources.jobs.*.trigger.pause_status jobs.PauseStatus INPUT STATE
resources.jobs.*.trigger.periodic *jobs.PeriodicTriggerConfiguration INPUT STATE
resources.jobs.*.trigger.periodic.interval int INPUT STATE
resources.jobs.*.trigger.periodic.unit jobs.PeriodicTriggerConfigurationTimeUnit INPUT STATE
resources.jobs.*.trigger.table *jobs.TableUpdateTriggerConfiguration INPUT STATE
resources.jobs.*.trigger.table.condition jobs.Condition INPUT STATE
resources.jobs.*.trigger.table.min_time_between_triggers_seconds int INPUT STATE
resources.jobs.*.trigger.table.table_names []string INPUT STATE
resources.jobs.*.trigger.table.table_names[*] string INPUT STATE
resources.jobs.*.trigger.table.wait_after_last_change_seconds int INPUT STATE
resources.jobs.*.trigger.table_update *jobs.TableUpdateTriggerConfiguration INPUT STATE
resources.jobs.*.trigger.table_update.condition jobs.Condition INPUT STATE
resources.jobs.*.trigger.table_update.min_time_between_triggers_seconds int INPUT STATE
@@ -2869,6 +2857,7 @@ resources.pipelines.*.spec.trigger.cron *pipelines.CronTrigger REMOTE
resources.pipelines.*.spec.trigger.cron.quartz_cron_schedule string REMOTE
resources.pipelines.*.spec.trigger.cron.timezone_id string REMOTE
resources.pipelines.*.spec.trigger.manual *pipelines.ManualTrigger REMOTE
resources.pipelines.*.spec.usage_policy_id string REMOTE
resources.pipelines.*.state pipelines.PipelineState REMOTE
resources.pipelines.*.storage string INPUT STATE
resources.pipelines.*.tags map[string]string INPUT STATE
@@ -2880,6 +2869,7 @@ resources.pipelines.*.trigger.cron.quartz_cron_schedule string INPUT STATE
resources.pipelines.*.trigger.cron.timezone_id string INPUT STATE
resources.pipelines.*.trigger.manual *pipelines.ManualTrigger INPUT STATE
resources.pipelines.*.url string INPUT
resources.pipelines.*.usage_policy_id string INPUT STATE
resources.pipelines.*.permissions.object_id string ALL
resources.pipelines.*.permissions.permissions []iam.AccessControlRequest ALL
resources.pipelines.*.permissions.permissions[*] iam.AccessControlRequest ALL
3 changes: 3 additions & 0 deletions acceptance/cmd/account/account-help/output.txt
@@ -7,8 +7,11 @@ Usage:

Identity and Access Management
access-control These APIs manage access rules on resources in an account.
groups Groups simplify identity management, making it easier to assign access to Databricks account, data, and other securable objects.
groups-v2 Groups simplify identity management, making it easier to assign access to Databricks account, data, and other securable objects.
service-principals Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
service-principals-v2 Identities for use with jobs, automated tools, and systems such as scripts, apps, and CI/CD platforms.
users User identities recognized by Databricks and represented by email addresses.
users-v2 User identities recognized by Databricks and represented by email addresses.
workspace-assignment The Workspace Permission Assignment API allows you to manage workspace permissions for principals in your account.

2 changes: 2 additions & 0 deletions acceptance/help/output.txt
@@ -87,6 +87,7 @@ Unity Catalog
Delta Sharing
providers A data provider is an object representing the organization in the real world who shares the data.
recipient-activation The Recipient Activation API is only applicable in the open sharing model where the recipient object has the authentication type of TOKEN.
recipient-federation-policies The Recipient Federation Policies APIs are only applicable in the open sharing model where the recipient object has the authentication type of OIDC_RECIPIENT, enabling data sharing from Databricks to non-Databricks recipients.
recipients A recipient is an object you create using :method:recipients/create to represent an organization which you want to allow access shares.
shares A share is a container instantiated with :method:shares/create.

@@ -154,6 +155,7 @@ Additional Commands:
auth Authentication related commands
completion Generate the autocompletion script for the specified shell
configure Configure authentication
data-quality Manage the data quality of Unity Catalog objects (currently supports schemas and tables).
help Help about any command
labs Manage Databricks Labs installations
tag-policies The Tag Policy API allows you to manage policies for governed tags in Databricks.
2 changes: 2 additions & 0 deletions bundle/direct/dresources/pipeline.go
@@ -58,6 +58,7 @@ func (*ResourcePipeline) RemapState(p *pipelines.GetPipelineResponse) *pipelines
Tags: spec.Tags,
Target: spec.Target,
Trigger: spec.Trigger,
UsagePolicyId: spec.UsagePolicyId,
ForceSendFields: filterFields[pipelines.CreatePipeline](spec.ForceSendFields, "AllowDuplicateNames", "DryRun", "RunAs", "Id"),
}
}
@@ -106,6 +107,7 @@ func (r *ResourcePipeline) DoUpdate(ctx context.Context, id string, config *pipe
Tags: config.Tags,
Target: config.Target,
Trigger: config.Trigger,
UsagePolicyId: config.UsagePolicyId,
PipelineId: id,
ForceSendFields: filterFields[pipelines.EditPipeline](config.ForceSendFields),
}
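Both `pipeline.go` hunks follow the same field-by-field remap pattern, and the fix is simply adding `UsagePolicyId` to the copied fields in `RemapState` and `DoUpdate`. A minimal sketch of that pattern — using hypothetical trimmed-down structs, not the real `databricks-sdk-go` types — shows why a forgotten field silently drops data:

```go
package main

import "fmt"

// Hypothetical, minimal stand-ins for the SDK's pipeline types. The real
// types carry many more fields; only the copy pattern is illustrated here.
type PipelineSpec struct {
	Name          string
	Catalog       string
	UsagePolicyId string
}

type CreatePipeline struct {
	Name          string
	Catalog       string
	UsagePolicyId string
}

// remapState mirrors ResourcePipeline.RemapState: each field of the fetched
// spec must be copied explicitly, so a new field like UsagePolicyId is lost
// until a line is added for it — which is exactly what this PR does.
func remapState(spec *PipelineSpec) *CreatePipeline {
	return &CreatePipeline{
		Name:          spec.Name,
		Catalog:       spec.Catalog,
		UsagePolicyId: spec.UsagePolicyId, // the newly copied field
	}
}

func main() {
	out := remapState(&PipelineSpec{Name: "p", Catalog: "main", UsagePolicyId: "policy-123"})
	fmt.Println(out.UsagePolicyId)
}
```

Explicit per-field remapping (rather than copying the whole struct) is what makes `filterFields`-style exclusion of request-only fields possible, at the cost of having to touch this code whenever the API grows a field.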
32 changes: 18 additions & 14 deletions bundle/internal/schema/annotations_openapi.yml
@@ -589,8 +589,6 @@ github.com/databricks/cli/bundle/config/resources.Pipeline:
"budget_policy_id":
"description": |-
Budget policy of this pipeline.
"x-databricks-preview": |-
PRIVATE
"catalog":
"description": |-
A catalog in Unity Catalog to publish data from this pipeline to. If `target` is specified, tables in this pipeline are published to a `target` schema inside `catalog` (for example, `catalog`.`target`.`table`). If `target` is not specified, no data is published to Unity Catalog.
@@ -687,6 +685,11 @@ github.com/databricks/cli/bundle/config/resources.Pipeline:
Which pipeline trigger to use. Deprecated: Use `continuous` instead.
"deprecation_message": |-
This field is deprecated
"usage_policy_id":
"description": |-
Usage policy of this pipeline.
"x-databricks-preview": |-
PRIVATE
github.com/databricks/cli/bundle/config/resources.QualityMonitor:
"assets_dir":
"description": |-
@@ -2462,6 +2465,10 @@ github.com/databricks/databricks-sdk-go/service/jobs.AuthenticationMethod:
- |-
PAT
github.com/databricks/databricks-sdk-go/service/jobs.CleanRoomsNotebookTask:
"_":
"description": |-
Clean Rooms notebook task for V1 Clean Room service (GA).
Replaces the deprecated CleanRoomNotebookTask (defined above) which was for V0 service.
"clean_room_name":
"description": |-
The clean room that the notebook belongs to.
@@ -3060,6 +3067,8 @@ github.com/databricks/databricks-sdk-go/service/jobs.RunJobTask:
"dbt_commands":
"description": |-
An array of commands to execute for jobs with the dbt task, for example `"dbt_commands": ["dbt deps", "dbt seed", "dbt deps", "dbt seed", "dbt run"]`

⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.
"deprecation_message": |-
This field is deprecated
"x-databricks-preview": |-
@@ -3072,7 +3081,7 @@
jar_params cannot be specified in conjunction with notebook_params.
The JSON representation of this field (for example `{"jar_params":["john doe","35"]}`) cannot exceed 10,000 bytes.

Use [Task parameter variables](https://docs.databricks.com/jobs.html#parameter-variables) to set parameters containing information about job runs.
⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.
"deprecation_message": |-
This field is deprecated
"x-databricks-preview": |-
@@ -3092,7 +3101,7 @@

notebook_params cannot be specified in conjunction with jar_params.

Use [Task parameter variables](https://docs.databricks.com/jobs.html#parameter-variables) to set parameters containing information about job runs.
⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.

The JSON representation of this field (for example `{"notebook_params":{"name":"john doe","age":"35"}}`) cannot exceed 10,000 bytes.
"deprecation_message": |-
@@ -3114,7 +3123,7 @@
the parameters specified in job setting. The JSON representation of this field (for example `{"python_params":["john doe","35"]}`)
cannot exceed 10,000 bytes.

Use [Task parameter variables](https://docs.databricks.com/jobs.html#parameter-variables) to set parameters containing information about job runs.
⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.

Important

@@ -3131,7 +3140,7 @@
parameters specified in job setting. The JSON representation of this field (for example `{"python_params":["john doe","35"]}`)
cannot exceed 10,000 bytes.

Use [Task parameter variables](https://docs.databricks.com/jobs.html#parameter-variables) to set parameters containing information about job runs
⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.

Important

@@ -3144,6 +3153,8 @@
"sql_params":
"description": |-
A map from keys to values for jobs with SQL task, for example `"sql_params": {"name": "john doe", "age": "35"}`. The SQL alert task does not support custom parameters.

⚠ **Deprecation note** Use [job parameters](https://docs.databricks.com/jobs/job-parameters.html#job-parameter-pushdown) to pass information down to tasks.
"deprecation_message": |-
This field is deprecated
"x-databricks-preview": |-
@@ -3511,13 +3522,6 @@ github.com/databricks/databricks-sdk-go/service/jobs.TriggerSettings:
"periodic":
"description": |-
Periodic trigger settings.
"table":
"description": |-
Old table trigger settings name. Deprecated in favor of `table_update`.
"deprecation_message": |-
This field is deprecated
"x-databricks-preview": |-
PRIVATE
"table_update": {}
github.com/databricks/databricks-sdk-go/service/jobs.Webhook:
"id": {}
@@ -3566,7 +3570,7 @@ github.com/databricks/databricks-sdk-go/service/pipelines.CronTrigger:
github.com/databricks/databricks-sdk-go/service/pipelines.DayOfWeek:
"_":
"description": |-
Days of week in which the restart is allowed to happen (within a five-hour window starting at start_hour).
Days of week in which the window is allowed to happen.
If not specified all days of the week will be used.
"enum":
- |-
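Several of the `RunJobTask` description hunks above replace the old "Task parameter variables" guidance with the same deprecation note steering users from `jar_params`, `notebook_params`, `python_params`, and `sql_params` toward job parameters. As a hedged illustration of what that recommendation looks like in a bundle — the `parameters` field follows the documented bundle schema, but this exact snippet is an assumption, not taken from the PR:

```yaml
# Hypothetical databricks.yml fragment: job-level parameters are pushed down
# to all tasks, replacing the deprecated per-task *_params fields.
resources:
  jobs:
    example_job:
      name: example-job
      parameters:
        - name: customer_age   # referenced in tasks as {{job.parameters.customer_age}}
          default: "35"
```

Defined this way, one value reaches every task type uniformly instead of being encoded separately per task flavor, which is why the docs now repeat the note across all four fields.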
1 change: 0 additions & 1 deletion bundle/internal/validation/generated/enum_fields.go

