* Added `GENERATED_SQL_QUERY_TOO_LONG_EXCEPTION` and `MISSING_SQL_QUERY_EXCEPTION` enum values for `com.databricks.sdk.service.dashboards.MessageErrorType`.
* Added `BALANCED` enum value for `com.databricks.sdk.service.jobs.PerformanceTarget`.
* Added `LISTING_RESOURCE` enum value for `com.databricks.sdk.service.marketplace.FileParentType`.
* Added `APP` enum value for `com.databricks.sdk.service.marketplace.MarketplaceFileType`.
* Added `CUSTOM` enum value for `com.databricks.sdk.service.serving.ExternalModelProvider`.
* Added `ARCLIGHT_MULTI_TENANT_AZURE_EXCHANGE_TOKEN` and `ARCLIGHT_MULTI_TENANT_AZURE_EXCHANGE_TOKEN_WITH_USER_DELEGATION_KEY` enum values for `com.databricks.sdk.service.settings.TokenType`.
* [Breaking] Changed `createExperiment()` method for `workspaceClient.forecasting()` service with new required argument order.
* Changed `instanceTypeId` field for `com.databricks.sdk.service.compute.NodeInstanceType` to be required.
* Changed `category` field for `com.databricks.sdk.service.compute.NodeType` to be required.
* [Breaking] Changed `functions` field for `com.databricks.sdk.service.sharing.ListProviderShareAssetsResponse` to type `com.databricks.sdk.service.sharing.DeltaSharingFunctionList` class.
* [Breaking] Removed `executionDetails` and `script` fields for `com.databricks.sdk.service.compute.InitScriptInfoAndExecutionDetails`.
* [Breaking] Removed `supportsElasticDisk` field for `com.databricks.sdk.service.compute.NodeType`.
* [Breaking] Removed `dataGranularityQuantity` and `dataGranularityUnit` fields for `com.databricks.sdk.service.ml.CreateForecastingExperimentRequest`.
**NEXT_CHANGELOG.md** _(2 additions, 1 deletion)_
# NEXT CHANGELOG

## Release v0.44.0 _(bumped from v0.43.0)_

### New Features and Improvements

* Introduce support for Databricks Workload Identity Federation in GitHub workflows ([423](https://github.com/databricks/databricks-sdk-java/pull/423)).
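The WIF feature above authenticates through GitHub's ambient OIDC token rather than a stored secret. A minimal workflow sketch, under stated assumptions: the host, client ID, and build step below are placeholders, not taken from the changelog.

```yaml
# Hypothetical GitHub Actions workflow using Databricks WIF.
name: wif-example
on: push
permissions:
  id-token: write   # lets GitHub mint the OIDC ID token the SDK exchanges
  contents: read
jobs:
  run-sdk-app:
    runs-on: ubuntu-latest
    env:
      DATABRICKS_HOST: https://my-workspace.cloud.databricks.com  # placeholder
      DATABRICKS_CLIENT_ID: my-service-principal-application-id   # placeholder
      DATABRICKS_AUTH_TYPE: github-oidc
    steps:
      - uses: actions/checkout@v4
      - run: ./gradlew run   # placeholder step that invokes the Java SDK
```

The `id-token: write` permission is what makes the ambient OIDC token available to the job; without it the SDK has no ID token to exchange.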
… environment variables set may see their authentication start failing due to the order in which the SDK tries different authentication methods.

### Bug Fixes

* Fix issue deserializing HTTP responses with an empty body ([#426](https://github.com/databricks/databricks-sdk-java/pull/426)).
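The fix above concerns responses whose body is zero-length. As an illustration only — this is not the SDK's actual implementation — a guard of this shape short-circuits before the JSON deserializer ever sees empty input:

```java
import java.nio.charset.StandardCharsets;

public class EmptyBodyGuard {
    /**
     * Hypothetical helper: returns null for a null or empty HTTP body
     * instead of passing zero-length input to a JSON deserializer,
     * which would throw a parse error.
     */
    public static String deserialize(byte[] body) {
        if (body == null || body.length == 0) {
            return null; // empty body: nothing to deserialize
        }
        return new String(body, StandardCharsets.UTF_8);
    }
}
```

The point of the pattern is that "no body" is a legal outcome for many endpoints, so it should map to an absent value rather than a deserialization exception.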
**README.md** _(4 additions, 4 deletions)_
Depending on the Databricks authentication method, the SDK uses the following information.

### Databricks native authentication

By default, the Databricks SDK for Java initially tries [Databricks token authentication](https://docs.databricks.com/dev-tools/api/latest/authentication.html) (`auth_type='pat'` argument). If the SDK is unsuccessful, it then tries Databricks Workload Identity Federation (WIF) authentication using OIDC (`auth_type="github-oidc"` argument). _(Previously the fallback was Databricks basic username/password authentication, `auth_type="basic"`.)_

- For Databricks token authentication, you must provide `host` and `token`; or their environment variable or `.databrickscfg` file field equivalents.
- For Databricks OIDC authentication, you must provide the `host`, `client_id`, and `token_audience` _(optional)_ either directly, through the corresponding environment variables, or in your `.databrickscfg` configuration file. _(This bullet replaces the former basic-authentication bullet, which required `host`, `username`, and `password`, plus `account_id` for account-level operations.)_
| Field | Description | Environment variable |
|---|---|---|
| `host` | _(String)_ The Databricks host URL for either the Databricks workspace endpoint or the Databricks accounts endpoint. | `DATABRICKS_HOST` |
| `account_id` | _(String)_ The Databricks account ID for the Databricks accounts endpoint. Only has effect when `Host` is either `https://accounts.cloud.databricks.com/` _(AWS)_, `https://accounts.azuredatabricks.net/` _(Azure)_, or `https://accounts.gcp.databricks.com/` _(GCP)_. | `DATABRICKS_ACCOUNT_ID` |
| `token` | _(String)_ The Databricks personal access token (PAT) _(AWS, Azure, and GCP)_ or Azure Active Directory (Azure AD) token _(Azure)_. | `DATABRICKS_TOKEN` |
| `client_id` | _(String)_ The Databricks Service Principal Application ID. | `DATABRICKS_CLIENT_ID` |
| `token_audience` | _(String)_ When using Workload Identity Federation, the audience to specify when fetching an ID token from the ID token supplier. | `TOKEN_AUDIENCE` |

_(The `username` and `password` rows for basic authentication were removed.)_
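The OIDC fields above map onto a `.databrickscfg` profile. A minimal sketch, assuming the standard profile syntax; the profile name and every value here are placeholders:

```ini
; Hypothetical profile; replace all values with your own.
[wif]
host           = https://my-workspace.cloud.databricks.com
client_id      = 12345678-abcd-ef00-0000-000000000000
auth_type      = github-oidc
token_audience = https://my-audience   ; optional
```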
For example, to use Databricks token authentication:
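The README's actual example is not included in this diff. As a sketch only, assuming the SDK's standard `DatabricksConfig` and `WorkspaceClient` entry points, with placeholder credentials:

```java
import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.core.DatabricksConfig;

public class TokenAuthExample {
    public static void main(String[] args) {
        // Placeholder host and token; supply real values, or omit this
        // block entirely and rely on the DATABRICKS_HOST / DATABRICKS_TOKEN
        // environment variables listed in the table above.
        DatabricksConfig cfg = new DatabricksConfig()
            .setHost("https://my-workspace.cloud.databricks.com")
            .setToken("dapi-example-token");
        WorkspaceClient w = new WorkspaceClient(cfg);
        // ... use w to call workspace-level APIs ...
    }
}
```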