Commit 4582126

denik and pietern authored
Log diagnostics immediately after mutator ends (#3175)
## Changes

- New package: `logdiag`, with:
  - `logdiag.LogDiag(ctx, d)`: renders and logs the Diagnostic object on stderr right away, and counts diagnostic types (error, warning, recommendation) on a context variable.
  - `logdiag.HasError(ctx)`: returns true if any errors have been seen so far.
- New functions `bundle.ApplyContext`/`bundle.ApplySeqContext` (to replace `bundle.Apply`/`bundle.ApplySeq`). These functions do not return diagnostics; instead they call `LogDiag` on every diagnostic returned by the mutator.
- `bundle.Apply` and `bundle.ApplySeq` are still available as wrappers around the Context counterparts. Their main use case is unit tests.

Note: with this PR, mutators may choose whether to use `LogDiag` to log diagnostics right away or to return them as before. This has exactly the same effect (except that returned diagnostics are delayed slightly). For new code, `LogDiag` should be preferred; eventually the mutator interface will be modified to not return diagnostics. Even if you use `LogDiag`, `bundle.Apply` will still return the diagnostics that were logged.

Other behavior changes:

- Diagnostics are now always logged to stderr; previously "bundle summary" non-JSON output would print them to stdout.
- Slashes in the filenames in diagnostics are normalized to forward slashes, to help with acceptance tests.
- The summary command won't print recommendations (but it keeps printing warnings).

## Why

Collecting diagnostics and printing them at the very end is a poor user and developer experience.

User experience:

- Users will miss diagnostics if the process crashes or panics.
- For long-running deployments, useful information is not printed right away.
- For deploys with interactive approval, diagnostics may show issues that should be fixed before the deployment. Since they are not shown, users cannot make an informed choice.

Developer experience:

- High accidental complexity to propagate the diagnostics properly.
- Various bugs over time where diagnostics were lost due to logic or threading bugs (#2057, #2927, #3123).

## Tests

Existing tests.

Note, we have no coverage for shell completion in bundle.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
1 parent 1dc97c8 commit 4582126


67 files changed: +1204 −1173 lines

NEXT_CHANGELOG.md

Lines changed: 4 additions & 0 deletions

```diff
@@ -3,6 +3,10 @@
 ## Release v0.259.0
 
 ### Notable Changes
+* Diagnostic messages are no longer buffered to be printed at the end of the command; they are flushed after every mutator ([#3175](https://github.com/databricks/cli/pull/3175))
+* Diagnostics are now always rendered with forward slashes in file paths, even on Windows ([#3175](https://github.com/databricks/cli/pull/3175))
+* "bundle summary" now prints diagnostics to stderr instead of stdout in text output mode ([#3175](https://github.com/databricks/cli/pull/3175))
+* "bundle summary" no longer prints recommendations; it will only print warnings and errors ([#3175](https://github.com/databricks/cli/pull/3175))
 
 ### Dependency updates
 
```

acceptance/auth/bundle_and_profile/output.txt

Lines changed: 1 addition & 0 deletions

```diff
@@ -21,6 +21,7 @@ Exit code: 1
 >>> errcode [CLI] current-user me -t dev -p profile_name
 Error: cannot resolve bundle auth configuration: the host in the profile (https://non-existing-subdomain.databricks.com) doesn’t match the host configured in the bundle ([DATABRICKS_TARGET]). The profile "DEFAULT" has host="[DATABRICKS_TARGET]" that matches host in the bundle. To select it, pass "-p DEFAULT"
 
+
 Exit code: 1
 
 === Bundle commands load bundle configuration when no flags, validation OK
```

acceptance/bundle/apps/config_section/output.txt

Lines changed: 4 additions & 4 deletions

```diff
@@ -13,11 +13,11 @@ Workspace:
 Found 1 warning
 
 >>> [CLI] bundle deploy
-Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files...
-Deploying resources...
-Updating deployment state...
-Deployment complete!
 Warning: App config section detected
 
 remove 'config' from app resource 'myapp' section and use app.yml file in the root of this app instead
 
+Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle/default/files...
+Deploying resources...
+Updating deployment state...
+Deployment complete!
```

acceptance/bundle/deploy/jobs/shared-root-path/output.txt

Lines changed: 6 additions & 2 deletions

```diff
@@ -1,15 +1,19 @@
 
 >>> [CLI] bundle deploy
+Warning: the bundle root path /Workspace/Shared/[USERNAME]/.bundle/[UNIQUE_NAME] is writable by all workspace users
+
+The bundle is configured to use /Workspace/Shared, which will give read/write access to all users. If this is intentional, add CAN_MANAGE for 'group_name: users' permission to your bundle configuration. If the deployment should be restricted, move it to a restricted folder such as /Workspace/Users/<username or principal name>.
+
 Uploading bundle files to /Workspace/Shared/[USERNAME]/.bundle/[UNIQUE_NAME]/files...
 Deploying resources...
 Updating deployment state...
 Deployment complete!
+
+>>> [CLI] bundle destroy --auto-approve
 Warning: the bundle root path /Workspace/Shared/[USERNAME]/.bundle/[UNIQUE_NAME] is writable by all workspace users
 
 The bundle is configured to use /Workspace/Shared, which will give read/write access to all users. If this is intentional, add CAN_MANAGE for 'group_name: users' permission to your bundle configuration. If the deployment should be restricted, move it to a restricted folder such as /Workspace/Users/<username or principal name>.
 
-
->>> [CLI] bundle destroy --auto-approve
 The following resources will be deleted:
 delete job foo
 
```

acceptance/bundle/deploy/mlops-stacks/output.txt

Lines changed: 20 additions & 4 deletions

```diff
@@ -69,10 +69,6 @@ Workspace:
 Found 2 warnings
 
 >>> [CLI] bundle deploy
-Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/dev/files...
-Deploying resources...
-Updating deployment state...
-Deployment complete!
 Warning: expected a string value, found null
 at targets.dev.workspace.host
 in databricks.yml:34:12
@@ -81,6 +77,10 @@ Warning: unknown field: description
 at resources.experiments.experiment
 in resources/ml-artifacts-resource.yml:21:7
 
+Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/project_name_[UNIQUE_NAME]/dev/files...
+Deploying resources...
+Updating deployment state...
+Deployment complete!
 
 >>> [CLI] bundle summary -o json
 Warning: expected a string value, found null
@@ -107,11 +107,27 @@ Warning: unknown field: description
 at resources.experiments.experiment
 in resources/ml-artifacts-resource.yml:21:7
 
+Warning: expected a string value, found null
+at targets.dev.workspace.host
+in databricks.yml:34:12
+
+Warning: unknown field: description
+at resources.experiments.experiment
+in resources/ml-artifacts-resource.yml:21:7
+
 {
   "name": "[dev [USERNAME]] dev-project_name_[UNIQUE_NAME]-batch-inference-job"
 }
 
 >>> [CLI] bundle destroy --auto-approve
+Warning: expected a string value, found null
+at targets.dev.workspace.host
+in databricks.yml:34:12
+
+Warning: unknown field: description
+at resources.experiments.experiment
+in resources/ml-artifacts-resource.yml:21:7
+
 The following resources will be deleted:
 delete job batch_inference_job
 delete job model_training_job
```

acceptance/bundle/deployment/bind/job/job-abort-bind/output.txt

Lines changed: 1 addition & 1 deletion

```diff
@@ -3,7 +3,7 @@
 Created job with ID: [NUMID]
 
 === Expect binding to fail without an auto-approve flag:
-Error: failed to bind the resource, err: This bind operation requires user confirmation, but the current console does not support prompting. Please specify --auto-approve if you would like to skip prompts and proceed.
+Error: This bind operation requires user confirmation, but the current console does not support prompting. Please specify --auto-approve if you would like to skip prompts and proceed.
 
 === Deploy bundle:
 >>> [CLI] bundle deploy --force-lock
```

acceptance/bundle/telemetry/deploy-mode/output.txt

Lines changed: 3 additions & 3 deletions

```diff
@@ -5,15 +5,15 @@ Deploying resources...
 Deployment complete!
 
 >>> [CLI] bundle deploy -t prod
-Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle/prod/files...
-Deploying resources...
-Deployment complete!
 Recommendation: target with 'mode: production' should set 'workspace.root_path' to make sure only one copy is deployed
 
 A common practice is to use a username or principal name in this path, i.e. use
 
 root_path: /Workspace/Users/[USERNAME]/.bundle/${bundle.name}/${bundle.target}
 
+Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/test-bundle/prod/files...
+Deploying resources...
+Deployment complete!
 
 >>> cat out.requests.txt
 {
```

acceptance/bundle/variables/host/output.txt

Lines changed: 1 addition & 1 deletion

```diff
@@ -55,7 +55,7 @@ Exit code: 1
 >>> errcode [CLI] bundle summary
 Warning: Variable interpolation is not supported for fields that configure authentication
 at workspace.host
-in [TEST_TMP_DIR]/databricks.yml:10:9
+in databricks.yml:10:9
 
 Interpolation is not supported for the field workspace.host. Please set
 the DATABRICKS_HOST environment variable if you wish to configure this field at runtime.
```
Lines changed: 4 additions & 4 deletions

```diff
@@ -1,11 +1,11 @@
 
 >>> [PIPELINES] deploy
-Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/render-diagnostics-warning-pipeline/default/files...
-Deploying resources...
-Updating deployment state...
-Deployment complete!
 Warning: unknown field: unknown_property
 at resources.pipelines.test-pipeline
 in databricks.yml:8:7
 
+Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/render-diagnostics-warning-pipeline/default/files...
+Deploying resources...
+Updating deployment state...
+Deployment complete!
 <EOL>
```

bundle/artifacts/build.go

Lines changed: 18 additions & 17 deletions

```diff
@@ -13,6 +13,7 @@ import (
     "github.com/databricks/cli/libs/diag"
     "github.com/databricks/cli/libs/exec"
     "github.com/databricks/cli/libs/log"
+    "github.com/databricks/cli/libs/logdiag"
     "github.com/databricks/cli/libs/patchwheel"
     "github.com/databricks/cli/libs/utils"
 )
@@ -28,11 +29,9 @@ func (m *build) Name() string {
 }
 
 func (m *build) Apply(ctx context.Context, b *bundle.Bundle) diag.Diagnostics {
-    var diags diag.Diagnostics
-
     cacheDir, err := b.CacheDir(ctx)
     if err != nil {
-        diags = append(diags, diag.Diagnostic{
+        logdiag.LogDiag(ctx, diag.Diagnostic{
             Severity: diag.Warning,
             Summary:  "Failed to set up cache directory",
         })
@@ -44,20 +43,20 @@ func (m *build) Apply(ctx context.Context, b *bundle.Bundle) diag.Diagnostics {
         if a.BuildCommand != "" {
             err := doBuild(ctx, artifactName, a)
             if err != nil {
-                diags = diags.Extend(diag.FromErr(err))
+                logdiag.LogError(ctx, err)
                 break
             }
 
             // We need to expand glob reference after build mutator is applied because
             // if we do it before, any files that are generated by build command will
             // not be included into artifact.Files and thus will not be uploaded.
             // We only do it if BuildCommand was specified because otherwise it should have been done already by artifacts.Prepare()
-            diags = diags.Extend(bundle.Apply(ctx, b, expandGlobs{name: artifactName}))
+            bundle.ApplyContext(ctx, b, expandGlobs{name: artifactName})
 
-            // After bundle.Apply is called, all of b.Config is recreated and all pointers are invalidated (!)
+            // After bundle.ApplyContext is called, all of b.Config is recreated and all pointers are invalidated (!)
             a = b.Config.Artifacts[artifactName]
 
-            if diags.HasError() {
+            if logdiag.HasError(ctx) {
                 break
             }
 
@@ -66,20 +65,19 @@ func (m *build) Apply(ctx context.Context, b *bundle.Bundle) diag.Diagnostics {
         if a.Type == "whl" && a.DynamicVersion && cacheDir != "" {
             b.Metrics.AddBoolValue(metrics.ArtifactDynamicVersionIsSet, true)
             for ind, artifactFile := range a.Files {
-                patchedWheel, extraDiags := makePatchedWheel(ctx, cacheDir, artifactName, artifactFile.Source)
+                patchedWheel := makePatchedWheel(ctx, cacheDir, artifactName, artifactFile.Source)
                 log.Debugf(ctx, "Patching ind=%d artifactName=%s Source=%s patchedWheel=%s", ind, artifactName, artifactFile.Source, patchedWheel)
-                diags = append(diags, extraDiags...)
                 if patchedWheel != "" {
                     a.Files[ind].Patched = patchedWheel
                 }
-                if extraDiags.HasError() {
+                if logdiag.HasError(ctx) {
                     break
                 }
             }
         }
     }
 
-    return diags
+    return nil
 }
 
 func doBuild(ctx context.Context, artifactName string, a *config.Artifact) error {
@@ -106,26 +104,29 @@ func doBuild(ctx context.Context, artifactName string, a *config.Artifact) error
     return nil
 }
 
-func makePatchedWheel(ctx context.Context, cacheDir, artifactName, wheel string) (string, diag.Diagnostics) {
+func makePatchedWheel(ctx context.Context, cacheDir, artifactName, wheel string) string {
     msg := "Failed to patch wheel with dynamic version"
     info, err := patchwheel.ParseWheelFilename(wheel)
     if err != nil {
-        return "", []diag.Diagnostic{{
+        logdiag.LogDiag(ctx, diag.Diagnostic{
             Severity: diag.Warning,
             Summary:  msg,
             Detail:   fmt.Sprintf("When parsing filename \"%s\" encountered an error: %s", wheel, err),
-        }}
+        })
+        return ""
     }
 
     dir := filepath.Join(cacheDir, "patched_wheels", artifactName+"_"+info.Distribution)
     createAndCleanupDirectory(ctx, dir)
     patchedWheel, isBuilt, err := patchwheel.PatchWheel(wheel, dir)
     if err != nil {
-        return "", []diag.Diagnostic{{
+        logdiag.LogDiag(ctx, diag.Diagnostic{
             Severity: diag.Warning,
             Summary:  msg,
             Detail:   fmt.Sprintf("When patching %s encountered an error: %s", wheel, err),
-        }}
+        })
+        return ""
+
     }
 
     if isBuilt {
@@ -134,7 +135,7 @@ func makePatchedWheel(ctx context.Context, cacheDir, artifactName, wheel string)
         log.Infof(ctx, "Patched wheel (cache) %s -> %s", wheel, patchedWheel)
     }
 
-    return patchedWheel, nil
+    return patchedWheel
 }
 
 func createAndCleanupDirectory(ctx context.Context, dir string) {
```
