Commit Graph

525 Commits

Author SHA1 Message Date
Gleb Kanterov 490259a14a
Refactor jobs path translation (#1782)
## Changes
Extract a package that other modules can use to transform different kinds of
paths in job resources.

## Tests
Unit tests
2024-09-24 13:51:54 +00:00
Andrew Nester 56ed9bebf3
Added support for creating all-purpose clusters (#1698)
## Changes
Added support for creating all-purpose clusters

Example of configuration

```
bundle:
  name: clusters

resources:
  clusters:
    test_cluster:
      cluster_name: "Test Cluster"
      num_workers: 2
      node_type_id: "i3.xlarge"
      autoscale:
        min_workers: 2
        max_workers: 7
      spark_version: "13.3.x-scala2.12"
      spark_conf:
        "spark.executor.memory": "2g"

  jobs:
    test_job:
      name: "Test Job"
      tasks:
        - task_key: test_task
          existing_cluster_id: ${resources.clusters.test_cluster.id}
          notebook_task:
            notebook_path: "./src/test.py"

targets:
    development:
      mode: development
      compute_id: ${resources.clusters.test_cluster.id}

```

## Tests
Added unit, config and E2E tests
2024-09-23 10:42:34 +00:00
Ilia Babanov ac80d3dfcb
Add verbose flag to the "bundle deploy" command (#1774)
## Changes
- Extract sync output logic from `cmd/sync` into `lib/sync`
- Add hidden `verbose` flag to the `bundle deploy` command, it's false
by default and hidden from the `--help` output
- Pass output handler to the `deploy/files/upload` mutator if the
verbose option is true

There was an idea to use in-place output, overwriting each past file sync
event in the output, but that won't work for the extension, since it
doesn't display deploy logs in the terminal.

Example output:
```
~/tmp/defpy: ~/cli/cli bundle deploy --sync-progress
Building defpy...
Uploading defpy-0.0.1+20240917.112755-py3-none-any.whl...
Uploading bundle files to /Users/ilia.babanov@databricks.com/.bundle/defpy/dev/files...
Action: PUT: requirements-dev.txt, resources/defpy_pipeline.yml, pytest.ini, src/defpy/main.py, src/defpy/__init__.py, src/dlt_pipeline.ipynb, tests/main_test.py, src/notebook.ipynb, setup.py, resources/defpy_job.yml, .vscode/extensions.json, .vscode/settings.json, fixtures/.gitkeep, .vscode/__builtins__.pyi, README.md, .gitignore, databricks.yml
Uploaded tests
Uploaded resources
Uploaded fixtures
Uploaded .vscode
Uploaded src/defpy
Uploaded requirements-dev.txt
Uploaded .gitignore
Uploaded fixtures/.gitkeep
Uploaded src/defpy/__init__.py
Uploaded databricks.yml
Uploaded README.md
Uploaded setup.py
Uploaded .vscode/__builtins__.pyi
Uploaded .vscode/extensions.json
Uploaded src/dlt_pipeline.ipynb
Uploaded .vscode/settings.json
Uploaded resources/defpy_job.yml
Uploaded pytest.ini
Uploaded src/defpy/main.py
Uploaded tests/main_test.py
Uploaded resources/defpy_pipeline.yml
Uploaded src/notebook.ipynb
Initial Sync Complete
Deploying resources...
Updating deployment state...
Deployment complete!
```

Output example in the extension:
![Screenshot 2024-09-19 at 11 07 48](https://github.com/user-attachments/assets/0fafd095-cdc6-44b8-b482-27a38ada0330)


## Tests
Manually for the `sync` and `bundle deploy` commands + vscode extension
sync and deploy flows
2024-09-23 10:09:11 +00:00
Andrew Nester cf989a7e10
Upgrade to TF provider 1.52 (#1781)
## Changes
Upgrade to TF provider 1.52

We also temporarily skip generating plugin framework structs to unblock the
upgrade, as generation does not work yet and needs to be fixed separately
2024-09-19 11:21:32 +00:00
Andrew Nester bcab6ca37b
Fixed detecting full syntax variable override which includes type field (#1775)
## Changes
Fixes #1773 
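
For context, a sketch of the override shape this fix targets; the variable
name and values are illustrative:

```
variables:
  cluster:
    type: complex
    default:
      spark_version: "13.3.x-scala2.12"
      num_workers: 2

targets:
  dev:
    variables:
      cluster:
        # Full syntax: the override carries the 'type' field alongside
        # 'default', which previously was not detected as an override.
        type: complex
        default:
          spark_version: "14.2.x-scala2.12"
          num_workers: 4
```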

## Tests
Confirmed manually
2024-09-18 10:23:07 +00:00
Lennart Kats (databricks) e220f9ddd6
Use the friendly name of service principals when shortening their name (#1770)
## Summary

Use the friendly name of service principals when shortening their name.

This change is helpful for the prefix in development mode. Instead of
adding a prefix like `[dev 1706906c-c0a2-4c25-9f57-3a7aa3cb8123]`, we'll
prefix like `[dev my_principal]`.
2024-09-16 18:35:07 +00:00
Andrew Nester 66307134c1
Fixed generated YAML missing 'default' for empty values (#1765)
## Changes
Fixed generated YAML missing 'default' for empty values

## Tests
Added unit test
2024-09-11 09:49:58 +00:00
shreyas-goenka c61358407f
Add end to end integration tests for bundle JSON schema (#1726) 2024-09-11 09:15:56 +00:00
shreyas-goenka 5d2c0e3885
Alias variables block in the `Target` struct (#1748)
## Changes
This PR aliases and overrides the schema associated with the variables
block in `target` to allow for directly specifying a variable value in
the JSON schema (without any levels of nesting). This is needed because
this direct value is resolved by dynamically parsing the configuration
tree.

ca6332a5a4/bundle/config/root.go (L424)
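
For illustration, both of these target-level forms are intended to validate
against the aliased schema; a sketch with made-up names:

```
variables:
  warehouse_id:
    description: "SQL warehouse to use"
    default: "some-warehouse-id"

targets:
  prod:
    variables:
      # Direct value, without any levels of nesting.
      warehouse_id: "prod-warehouse-id"
  dev:
    variables:
      # Equivalent full form with an explicit 'default' key.
      warehouse_id:
        default: "dev-warehouse-id"
```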

## Tests
Existing unit tests.
2024-09-10 14:49:34 +00:00
shreyas-goenka 28b39cd3f7
Make bundle JSON schema modular with `$defs` (#1700)
## Changes
This PR makes sweeping changes to the way we generate and test the
bundle JSON schema. The main benefits are:

1. More modular JSON schema. Every definition in the schema now is one
level deep and points to references instead of inlining the entire
schema for a field. This unblocks PyDABs from taking a dependency on the
JSON schema.

2. Generate the JSON schema during CLI code generation. Directly stream
it instead of computing it at runtime whenever a user calls `databricks
bundle schema`. This is nice because we no longer need to embed a
partial OpenAPI spec in the CLI. Down the line, we can add a `Schema()`
method to every struct in the Databricks Go SDK and remove the
dependency on the OpenAPI spec altogether. It'll become more important
once we decouple Go SDK structs and methods from the underlying APIs.

3. Add enum values for Go SDK fields in the JSON schema. Better
autocompletion and validation for these fields. As a follow-up, we can
add enum values for non-Go SDK enums as well (created internal ticket to
track).

4. Use "packageName.structName" as a key to read JSON schemas from the
OpenAPI spec for Go SDK structs. Before, we would use an unrolled
presentation of the JSON schema (stored in `bundle_descriptions.json`),
which was complex to parse and include in the final JSON schema output.
This also means loading values from the OpenAPI spec for `target` schema
works automatically and no longer needs custom code.
5. Support recursive types (e.g. `for_each_task`). With us now using
$refs everywhere, it's trivial to support.
6. Using complex variables would be invalid according to the schema
generated before this PR. Now that bug is fixed. In the future, adding
more custom rules will be easier as well due to the single-level nature
of the JSON schema.


Since this is a complete change of approach in how we generate the JSON
schema, there are a few (very minor) regressions worth calling out.
1. We'll lose a few custom descriptions for non-Go SDK structs that were
a part of `bundle_descriptions.json`. Support for those can be added in
the future as a follow-up.
2. Since the final JSON schema is now a static artefact, we lose some
lead time for the signal that JSON schema integration tests are failing.
It's okay though, since we have a lot of coverage via the existing unit
tests.

## Tests
Unit tests. End to end tests are being added in this PR:
https://github.com/databricks/cli/pull/1726

Previous unit tests were all deleted because they were bloated. Effort
was made to make the new unit tests provide (almost) equivalent
coverage.
2024-09-10 13:55:18 +00:00
dependabot[bot] d3e221a116
Bump github.com/databricks/databricks-sdk-go from 0.45.0 to 0.46.0 (#1760)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.45.0 to 0.46.0.

Release notes (sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases)):

**v0.46.0**

Bug Fixes:

- Fail fast when authenticating if host is not configured ([#1033](https://redirect.github.com/databricks/databricks-sdk-go/pull/1033)).
- Improve non-JSON error handling ([#1031](https://redirect.github.com/databricks/databricks-sdk-go/pull/1031)).

Internal Changes:

- Add TestAccCreateOboTokenOnAws to flaky test list ([#1029](https://redirect.github.com/databricks/databricks-sdk-go/pull/1029)).
- Add workflows manage integration tests checks ([#1032](https://redirect.github.com/databricks/databricks-sdk-go/pull/1032)).
- Fix TestMwsAccWorkspaces cleanup ([#1028](https://redirect.github.com/databricks/databricks-sdk-go/pull/1028)).
- Improve integration test comment ([#1035](https://redirect.github.com/databricks/databricks-sdk-go/pull/1035)).
- Temporary ignore Metastore test failures ([#1027](https://redirect.github.com/databricks/databricks-sdk-go/pull/1027)).
- Update test to support new accounts ([#1026](https://redirect.github.com/databricks/databricks-sdk-go/pull/1026)).
- Use statuses instead of checks ([#1036](https://redirect.github.com/databricks/databricks-sdk-go/pull/1036)).

API Changes:

- Added `RegenerateDashboard` method for [w.QualityMonitors](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#QualityMonitorsAPI) workspace-level service.
- Added [catalog.RegenerateDashboardRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#RegenerateDashboardRequest) and [catalog.RegenerateDashboardResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#RegenerateDashboardResponse).
- Added [jobs.QueueDetails](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#QueueDetails), [jobs.QueueDetailsCodeCode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#QueueDetailsCodeCode), [jobs.RunLifecycleStateV2State](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunLifecycleStateV2State), [jobs.RunStatus](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunStatus), [jobs.TerminationCodeCode](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TerminationCodeCode), [jobs.TerminationDetails](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TerminationDetails) and [jobs.TerminationTypeType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TerminationTypeType).
- Added `Status` field for [jobs.BaseRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun).
- Added `Status` field for [jobs.RepairHistoryItem](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RepairHistoryItem).
- Added `Status` field for [jobs.Run](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run).
- Added `Status` field for [jobs.RunTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunTask).
- Added `MaxProvisionedThroughput` and `MinProvisionedThroughput` fields for [serving.ServedModelInput](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedModelInput).
- Added `ColumnsToSync` field for [vectorsearch.DeltaSyncVectorIndexSpecRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#DeltaSyncVectorIndexSpecRequest).
- Changed `WorkloadSize` field for [serving.ServedModelInput](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedModelInput) to no longer be required.

OpenAPI SHA: d05898328669a3f8ab0c2ecee37db2673d3ea3f7, Date: 2024-09-04

Commits:

- `37cb031019` [Release] Release v0.46.0 ([#1037](https://redirect.github.com/databricks/databricks-sdk-go/issues/1037))
- `34f37f9e4c` [Internal] Use statuses instead of checks ([#1036](https://redirect.github.com/databricks/databricks-sdk-go/issues/1036))
- `590d597046` [Internal] Improve integration test comment ([#1035](https://redirect.github.com/databricks/databricks-sdk-go/issues/1035))
- `6ab81eed78` [Internal] Add workflows manage integration tests checks ([#1032](https://redirect.github.com/databricks/databricks-sdk-go/issues/1032))
- `4886afe312` [Fix] Fail fast when authenticating if host is not configured ([#1033](https://redirect.github.com/databricks/databricks-sdk-go/issues/1033))
- `796dae1674` [Fix] Handle non-JSON errors ([#1031](https://redirect.github.com/databricks/databricks-sdk-go/issues/1031))
- `a24a158b34` [Internal] Add TestAccCreateOboTokenOnAws to flaky test list ([#1029](https://redirect.github.com/databricks/databricks-sdk-go/issues/1029))
- `9ab8b42bc4` [Fix] Fix TestMwsAccWorkspaces cleanup ([#1028](https://redirect.github.com/databricks/databricks-sdk-go/issues/1028))
- `cc22621c96` [Internal] Temporary ignore Metastore test failures ([#1027](https://redirect.github.com/databricks/databricks-sdk-go/issues/1027))
- `8dbaaf2767` [Fix] Update test to support new accounts ([#1026](https://redirect.github.com/databricks/databricks-sdk-go/issues/1026))
- See full diff in [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.45.0...v0.46.0)

Most Recent Ignore Conditions Applied to This Pull Request:

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |



---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-09-09 14:49:39 +00:00
Pieter Noordhuis b451905b6e
Expand library globs relative to the sync root (#1756)
## Changes

Library glob expansion happens during deployment. Before that, all
entries that refer to local paths in resource definitions are made
relative to the _sync root_. Before #1694, they were made relative to
the _bundle root_. That PR didn't update the library glob expansion code
to use the sync root path.

If you were using the sync paths setting with library globs, the CLI
would fail to expand the globs because the code was using the wrong path
to anchor those globs.

This change fixes the issue.
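
As an illustration (a sketch; names and paths are made up), a bundle that
combines sync paths with a library glob now has that glob anchored at the
sync root:

```
sync:
  paths:
    - ../common
    - .

resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          python_wheel_task:
            package_name: my_lib
            entry_point: main
          libraries:
            # Expanded relative to the sync root rather than the bundle root.
            - whl: ./dist/*.whl
```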

## Tests

Manually confirmed that this fixes the issue reported in #1755.
2024-09-09 09:56:16 +00:00
Andrew Nester 02e83877f4
Added listing cluster filtering for cluster lookups (#1754)
## Changes
We added a custom resolver for clusters that filters on cluster source
when we list all clusters.

Without the filtering, listing could take a very long time (5-10 minutes),
which leads to lookup timeouts.

## Tests
Existing unit tests passing
2024-09-06 11:34:57 +00:00
Pieter Noordhuis ceefa80d72
Pass copy of `dyn.Path` to callback function (#1747)
## Changes

Some call sites hold on to the `dyn.Path` provided to them by the
callback. It must therefore never be mutated after the callback returns,
or these mutations leak out into unknown scope.

This change means it is no longer possible for this failure mode to
happen.

## Tests

Unit test.
2024-09-05 11:05:16 +00:00
Andrew Nester 72030844c5
Fixed variable override in target with full variable syntax (#1749)
## Changes
This PR makes sure that both of these override syntaxes for variables work
correctly:
```
targets:
  dev:
    variables:
      cluster1:
        spark_version: "14.2.x-scala2.11"
        node_type_id: "Standard_DS3_v2"
        num_workers: 4
        spark_conf:
          spark.speculation: false
          spark.databricks.delta.retentionDurationCheck.enabled: false
      cluster2:
        default:
          spark_version: "14.2.x-scala2.11"
          node_type_id: "Standard_DS3_v2"
          num_workers: 4
          spark_conf:
            spark.speculation: false
            spark.databricks.delta.retentionDurationCheck.enabled: false
```
## Tests
Added regression test

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-09-04 17:16:40 +00:00
Andrew Nester ca6332a5a4
Fixed complex variables are not being correctly merged from include files (#1746)
## Changes
Fixes an `Error: no value assigned to required variable <variable>.` that
occurred when the main complex variable definition is in one file but the
target override is in a separate file that is included from the main one.
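
A sketch of the failing layout; file and variable names are illustrative:

```
# databricks.yml
bundle:
  name: my_bundle

include:
  - overrides.yml

variables:
  cluster:
    type: complex
    default:
      spark_version: "13.3.x-scala2.12"
```

```
# overrides.yml
targets:
  dev:
    variables:
      cluster:
        default:
          spark_version: "14.2.x-scala2.12"
```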

## Tests
Added regression test
2024-09-04 11:24:55 +00:00
shreyas-goenka a27c24a397
Add prompt when a pipeline recreation happens (#1672)
## Changes
DLT pipeline recreations are destructive. They can lead to lost history
of previous updates, temporary outages of the underlying tables, and are
potentially computationally expensive. Thus we make a breaking change
where a prompt is shown to the user if their configuration changes will
lead to a DLT recreation.

Users can skip the prompt by specifying the `--auto-approve` flag.

This PR also fixes an issue with our test runner where logs from the
cmdio.Logger would not get propagated to the reader returned by our
cobra test runner.

## Tests
Manually, and new unit and integration tests.

```
➜  bundle-playground-3 cli bundle deploy
Uploading bundle files to /Users/63ec021d-b0c6-49c0-93a0-5123953a1cb2/.bundle/test/development/files...
The following DLT pipelines will be recreated. Underlying tables will be unavailable for a transient period until the newly recreated pipelines are run once successfully. History of previous pipeline update runs will be lost because of recreation:
  recreate pipeline foo

Would you like to proceed? [y/n]: n
Deployment cancelled!
```
2024-09-04 11:11:47 +00:00
shreyas-goenka 096123674a
Fix streaming of stdout, stdin, stderr in cobra test runner (#1742)
## Changes
We were not using the readers and writers set in the test fixtures in
the progress logger. This PR fixes that. It also modifies
`TestAccAbortBind`, which was implicitly relying on the bug.

I encountered this bug while working on
https://github.com/databricks/cli/pull/1672.

## Tests
Manually. 

From non-tty:
```
Error: failed to bind the resource, err: This bind operation requires user confirmation, but the current console does not support prompting. Please specify --auto-approve if you would like to skip prompts and proceed.
```

From tty, bind works as expected.
```
Confirm import changes? Changes will be remotely applied only after running 'bundle deploy'. [y/n]: y
Updating deployment state...
Successfully bound databricks_pipeline with an id '9d2dedbb-f522-4503-96ba-4bc4d5bfa77d'. Run 'bundle deploy' to deploy changes to your workspace
```
2024-09-02 13:43:17 +00:00
Gleb Kanterov ed448815b4
PythonMutator: explain missing package error (#1736)
## Changes
Explain the error when the `databricks-pydabs` package is not installed
or the Python environment isn't correctly activated.

Example output:

```
Error: python mutator process failed: ".venv/bin/python3 -m databricks.bundles.build --phase load --input .../input.json --output .../output.json --diagnostics .../diagnostics.json: exit status 1", use --debug to enable logging

.../.venv/bin/python3: Error while finding module specification for 'databricks.bundles.build' (ModuleNotFoundError: No module named 'databricks')

Explanation: 'databricks-pydabs' library is not installed in the Python environment.

If using Python wheels, ensure that 'databricks-pydabs' is included in the dependencies, 
and that the wheel is installed in the Python environment:

  $ .venv/bin/pip install -e .

If using a virtual environment, ensure it is specified as the venv_path property in databricks.yml, 
or activate the environment before running CLI commands:

  experimental:
    pydabs:
      venv_path: .venv
```

## Tests
Unit tests
2024-09-02 09:49:30 +00:00
Andrew Nester 582558cac2
Do not suppress normalisation diagnostics for resolving variables (#1740)
## Changes

Tested on the following bundle configuration

```
bundle:
  name: clusters
  mode: development

variables:
  webhook_notifications:
    description: Webhook URL for notifications
    type: complex
    default:
      on_failure:
        id: 6a6c04c1-389c-4534-95af-b68b62a9dbe6

resources:
  jobs:
    test_job:
      name: "Andrew Nester Test Job"
      tasks:
        - task_key: test_task
          notebook_task:
            notebook_path: "./src/test.py"
          new_cluster:
            num_workers: 2
            node_type_id: "i3.xlarge"
            autoscale:
              min_workers: 2
              max_workers: 7
            spark_version: "12.2.x-scala2.12"
            spark_conf:
              "spark.executor.memory": "2g"
      webhook_notifications: ${var.webhook_notifications}

```

bundle validate output is below

```
andrew.nester@HFW9Y94129 wheel % databricks bundle validate
Warning: expected sequence, found map
  at resources.jobs.test_job.webhook_notifications.on_failure
  in bundle.yml:11:9

Name: clusters
Target: default
Workspace:
  User: andrew.nester@databricks.com
  Path: /Users/andrew.nester@databricks.com/.bundle/clusters/default
```

**Note** that the warning correctly points to the variable
2024-09-02 09:17:18 +00:00
shreyas-goenka 5d9910c8e0
Make lock optional in the JSON schema (#1738)
Fixes https://github.com/databricks/cli/issues/1561
2024-09-02 08:39:08 +00:00
Gleb Kanterov 70ce802518
PythonMutator: preserve normalize diagnostics (#1735)
## Changes
Preserve diagnostics if there are any errors or warnings when
PythonMutator normalizes output. If anything goes wrong during
conversion, diagnostics contain the relevant location and path.

## Tests
Unit tests
2024-08-30 13:29:00 +00:00
Pieter Noordhuis 5fac7edcdf
Pass along $AZURE_CONFIG_FILE to Terraform process (#1734)
## Changes

This ensures that the CLI and Terraform can both use an Azure CLI
session configured under a non-standard path. This is the default
behavior on Azure DevOps when using the AzureCLI@2 task.

Fixes #1722.

## Tests

Unit test.
2024-08-29 14:41:12 +00:00
Andrew Nester 43ace69bb9
Consider serverless clusters as compatible for Python wheel tasks (#1733)
## Changes
Consider serverless clusters as compatible for Python wheel tasks.

Fixes a `Python wheel tasks require compute with DBR 13.3+ to include
local libraries` warning shown for serverless clusters
2024-08-29 12:47:44 +00:00
Lennart Kats (databricks) 85459c1963
Improve error handling for /Volumes paths in mode: development (#1716)
## Changes
* Provide a more helpful error when using an artifact_path based on
/Volumes
* Allow the use of short_names in /Volumes paths

## Example cases

Example of a valid /Volumes artifact_path:
* `artifact_path:
/Volumes/catalog/schema/${workspace.current_user.short_name}/libs`

Example of an invalid /Volumes path (when using `mode: development`):
* `artifact_path: /Volumes/catalog/schema/libs`
* Resulting error: `artifact_path should contain the current username or
${workspace.current_user.short_name} to ensure uniqueness when using
'mode: development'`
2024-08-28 12:14:19 +00:00
Andrew Nester 70363836d5
Correctly mark PyPI package name specs with multiple specifiers as remote libraries (#1725)
Correctly mark PyPI package name specs with multiple specifiers as
remote libraries.

Fixes https://github.com/databricks/cli/issues/1728
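
A sketch of the dependency specs in question, using a serverless job
environment; names and versions are illustrative:

```
resources:
  jobs:
    my_job:
      environments:
        - environment_key: default
          spec:
            client: "1"
            dependencies:
              # A PyPI name with multiple version specifiers is now
              # correctly treated as a remote library.
              - "urllib3>=1.26,<2.0"
              # A local path is still treated as a local library.
              - ./dist/my_lib-0.1.0-py3-none-any.whl
```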
2024-08-28 11:39:06 +00:00
dependabot[bot] 056d203236
Bump github.com/databricks/databricks-sdk-go from 0.44.0 to 0.45.0 (#1719)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.44.0 to 0.45.0.

Release notes (sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases)):

**v0.45.0**

Bug Fixes:

- Add INVALID_STATE to error code mapping ([#1014](https://redirect.github.com/databricks/databricks-sdk-go/pull/1014)).
- Do not specify `--tenant` flag when fetching managed identity access token from the CLI ([#1021](https://redirect.github.com/databricks/databricks-sdk-go/pull/1021)).

Internal Changes:

- Add terraform aliases to Entity ([#1017](https://redirect.github.com/databricks/databricks-sdk-go/pull/1017)).
- Added Service.NamedIdMap ([#1016](https://redirect.github.com/databricks/databricks-sdk-go/pull/1016)).
- Fix billing test for budget configuration update ([#1019](https://redirect.github.com/databricks/databricks-sdk-go/pull/1019)).

API Changes:

- Added [w.PolicyComplianceForClusters](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#PolicyComplianceForClustersAPI) workspace-level service.
- Added [w.PolicyComplianceForJobs](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PolicyComplianceForJobsAPI) workspace-level service.
- Added [w.ResourceQuotas](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ResourceQuotasAPI) workspace-level service.
- Added [catalog.GetQuotaRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetQuotaRequest), [catalog.GetQuotaResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#GetQuotaResponse), [catalog.ListQuotasRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListQuotasRequest), [catalog.ListQuotasResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListQuotasResponse) and [catalog.QuotaInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#QuotaInfo).
- Added [compute.ClusterCompliance](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterCompliance), [compute.ClusterSettingsChange](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterSettingsChange), [compute.EnforceClusterComplianceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EnforceClusterComplianceRequest), [compute.EnforceClusterComplianceResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EnforceClusterComplianceResponse), [compute.GetClusterComplianceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#GetClusterComplianceRequest), [compute.GetClusterComplianceResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#GetClusterComplianceResponse), [compute.ListClusterCompliancesRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ListClusterCompliancesRequest) and [compute.ListClusterCompliancesResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ListClusterCompliancesResponse).
- Added [jobs.EnforcePolicyComplianceForJobResponseJobClusterSettingsChange](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceForJobResponseJobClusterSettingsChange), [jobs.EnforcePolicyComplianceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceRequest), [jobs.EnforcePolicyComplianceResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#EnforcePolicyComplianceResponse), [jobs.GetPolicyComplianceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetPolicyComplianceRequest), [jobs.GetPolicyComplianceResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetPolicyComplianceResponse), [jobs.JobCompliance](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobCompliance), [jobs.ListJobComplianceForPolicyResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListJobComplianceForPolicyResponse) and [jobs.ListJobComplianceRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#ListJobComplianceRequest).
- Added `Fallback` field for [catalog.CreateExternalLocation](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CreateExternalLocation).
- Added `Fallback` field for [catalog.ExternalLocationInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ExternalLocationInfo).
- Added `Fallback` field for [catalog.UpdateExternalLocation](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateExternalLocation).
- Added `JobRunId` field for [jobs.BaseRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun).
- Added `JobRunId` field for [jobs.Run](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run).
- Added `IncludeMetrics` field for [sql.ListQueryHistoryRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#ListQueryHistoryRequest).
- Added `StatementIds` field for [sql.QueryFilter](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QueryFilter).
- Removed [sql.ContextFilter](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#ContextFilter).
- Removed `ContextFilter` field for [sql.QueryFilter](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QueryFilter).
- Removed `PipelineId` and `PipelineUpdateId` fields for [sql.QuerySource](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sql#QuerySource).

OpenAPI SHA: 3eae49b444cac5a0118a3503e5b7ecef7f96527a, Date: 2024-08-21

Commits:

- `6d867882d0` [Release] Release v0.45.0 ([#1023](https://redirect.github.com/databricks/databricks-sdk-go/issues/1023))
- `ba4489b946` [Fix] Do not specify `--tenant` flag when fetching managed identity access to...
- `f6248097d1` [Internal] Fix billing test for budget configuration update ([#1019](https://redirect.github.com/databricks/databricks-sdk-go/issues/1019))
- `27a5055609` [Internal] Add terraform aliases to Entity ([#1017](https://redirect.github.com/databricks/databricks-sdk-go/issues/1017))
- `382a38d380` [Internal] Added Service.NamedIdMap ([#1016](https://redirect.github.com/databricks/databricks-sdk-go/issues/1016))
- `1ef9931dc9` [Fix] Add INVALID_STATE to error code mapping ([#1014](https://redirect.github.com/databricks/databricks-sdk-go/issues/1014))
- See full diff in [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.44.0...v0.45.0)

Most Recent Ignore Conditions Applied to This Pull Request:

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |



---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-08-27 08:54:05 +00:00
Andrew Nester 783e05c939
Do not treat empty path as a local path (#1717)
## Changes
Fixes an issue introduced in https://github.com/databricks/cli/pull/1699
where PyPI packages were treated as local libraries.

The reason is that `libraryPath` returns an empty string as the path for
PyPI packages, and `IsLibraryLocal` then treated the empty string as a
local path.

Both of these functions are fixed in this PR.

## Tests
Added regression test
2024-08-26 10:03:56 +00:00
Lennart Kats (databricks) 84b47745e4
Ignore CLI version check on development builds of the CLI (#1714)
## Changes

This change makes sure we ignore the CLI version check on development
builds of the CLI.

Before:

```
$ cat databricks.yml | grep cli_version
  databricks_cli_version: ">= 0.223.1"
$ cli bundle deploy
Error: Databricks CLI version constraint not satisfied. Required: >= 0.223.1, current: 0.0.0-dev+06b169284737
```

After:

```
...
$ cli bundle deploy
...
Warning: Ignoring Databricks CLI version constraint for development build. Required: >= 0.223.1, current: 0.0.0-dev+d52d6f08fcd5
```


2024-08-23 10:13:21 +00:00
Pieter Noordhuis 6e8cd835a3
Add paths field to bundle sync configuration (#1694)
## Changes

This field allows a user to configure paths to synchronize to the
workspace.

Allowed values are relative paths to files and directories anchored at
the directory where the field is set. If one or more values traverse up
the directory tree (to an ancestor of the bundle root directory), the
CLI will dynamically determine the root path to use to ensure that the
file tree structure remains intact.

For example, given a `databricks.yml` in `my_bundle` that includes:

```yaml
sync:
  paths:
    - ../common
    - .
```

Then upon synchronization, the workspace will look like:
```
.
├── common
│   └── lib.py
└── my_bundle
    ├── databricks.yml
    └── notebook.py
```

If not set, the behavior remains identical.

## Tests

* Newly added unit tests for the mutators and under `bundle/tests`.
* Manually confirmed a bundle without this configuration works the same.
* Manually confirmed a bundle with this configuration works.
2024-08-21 15:33:25 +00:00
shreyas-goenka f5df211320
Fix prefix preset used for UC schemas (#1704)
## Changes
In https://github.com/databricks/cli/pull/1490 we regressed and started
using the development mode prefix for UC schemas regardless of the mode
of the bundle target.

This PR fixes the regression and adds a regression test

## Tests
Failing integration tests pass now.
2024-08-21 12:53:54 +00:00
Witold Czaplewski 192f33bb13
[DAB] Add support for requirements libraries in Job Tasks (#1543)
## Changes
While experimenting with DAB I discovered that requirements libraries
are being ignored.

One thing worth mentioning is that `bundle validate` runs successfully,
but `bundle deploy` fails. This PR only covers the second part.
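
A sketch of a task that references a requirements file; paths are
illustrative:

```
resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/notebook.py
          libraries:
            # Previously ignored on deploy; now handled like other
            # local library references.
            - requirements: ./requirements.txt
```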


## Tests
Added a unit test
2024-08-21 10:03:56 +00:00
Andrew Nester c775d251ed
Improves detection of PyPI package names in environment dependencies (#1699)
## Changes
Improves detection of PyPI package names in environment dependencies

## Tests
Added unit tests
2024-08-21 08:22:35 +00:00
Gleb Kanterov 44902fa350
Make `pydabs/venv_path` optional (#1687)
## Changes
Make `pydabs/venv_path` optional. When not specified, CLI detects the
Python interpreter using `python.DetectExecutable`, the same way as for
`artifacts`. `python.DetectExecutable` works correctly if a virtual
environment is activated or `python3` is available on PATH through other
means.

Extract the venv detection code from PyDABs into `libs/python/detect`.
This code will be used when we implement the `python/venv_path` section
in `databricks.yml`.

## Tests
Unit tests and manually

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-08-20 13:26:57 +00:00
Pieter Noordhuis af5048e73e
Share test initializer in common helper function (#1695)
## Changes

These tests inadvertently re-ran mutators, the first time through
`loadTarget` and the second time by running `phases.Initialize()`
themselves. Some of the mutators that are executed in
`phases.Initialize()` are also run as part of `loadTarget`. This is
overdue for a refactor to make it unambiguous what runs when. Until then,
this removes the duplicated execution.

## Tests

Unit tests pass.
2024-08-20 12:54:56 +00:00
Andrew Nester 6771ba09a6
Correctly mark package names with versions as remote libraries (#1697)
## Changes
Fixes https://github.com/databricks/setup-cli/issues/124

## Tests
Added regression test
2024-08-20 09:33:03 +00:00
shreyas-goenka 242d4b51ed
Report all empty resources present in error diagnostic (#1685)
## Changes
This PR addresses post-merge feedback from
https://github.com/databricks/cli/pull/1673.

## Tests
Unit tests, and manually.
```
Error: experiment undefined-experiment is not defined
  at resources.experiments.undefined-experiment
  in databricks.yml:11:26

Error: job undefined-job is not defined
  at resources.jobs.undefined-job
  in databricks.yml:6:19

Error: pipeline undefined-pipeline is not defined
  at resources.pipelines.undefined-pipeline
  in databricks.yml:14:24

Name: undefined-job
Target: default

Found 3 errors
```
2024-08-20 00:22:00 +00:00
Lennart Kats (databricks) 78d0ac5c6a
Add configurable presets for name prefixes, tags, etc. (#1490)
## Changes

This adds configurable transformations based on the transformations
currently seen in `mode: development`.

Example databricks.yml showcasing how some transformations:

```
bundle:
  name: my_bundle

targets:
  dev:
    presets:
      prefix: "myprefix_"          # prefix all resource names with myprefix_
      pipelines_development: true  # set development to true by default for pipelines
      trigger_pause_status: PAUSED # set pause_status to PAUSED by default for all triggers and schedules
      jobs_max_concurrent_runs: 10 # set max_concurrent runs to 10 by default for all jobs
      tags:
        dev: true
```

## Tests

* Existing process_target_mode tests that were adapted to use this new
code
* Unit tests specific for the new mutator
* Unit tests for config loading and merging
* Manual e2e testing
2024-08-19 18:18:50 +00:00
Lennart Kats (databricks) 07627023f5
Pause continuous pipelines when 'mode: development' is used (#1590)
## Changes

This makes it so that the pipelines `continuous` property is set to
false by default when using `mode: development`.
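
A sketch of the effect; the pipeline name is illustrative:

```
targets:
  dev:
    mode: development

resources:
  pipelines:
    my_pipeline:
      name: my_pipeline
      # 'continuous' is not set here; with 'mode: development' it now
      # defaults to false, so the pipeline will not run continuously.
```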
2024-08-19 16:27:57 +00:00
Pieter Noordhuis 2b8cbc31cf
Pass through paths argument to libs/sync (#1689)
## Changes

Requires #1684. 

## Tests

Ran the sync integration tests.
2024-08-19 15:41:02 +00:00
Pieter Noordhuis 7de7583b37
Make fileset take optional list of paths to list (#1684)
## Changes

Before this change, the fileset library would take a single root path
and list all files in it. To support an allowlist of paths to list (much
like a Git `pathspec` without patterns; see [pathspec]), this
change introduces an optional argument to `fileset.New` where the caller
can specify paths to list. If not specified, this argument defaults to
list `.` (i.e. list all files in the root).

The motivation for this change is that we wish to expose this pattern in
bundles. Users should be able to specify which paths to synchronize
instead of always only synchronizing the bundle root directory.

[pathspec]:
https://git-scm.com/docs/gitglossary#Documentation/gitglossary.txt-aiddefpathspecapathspec

## Tests

New and existing unit tests.
2024-08-19 15:15:14 +00:00
Gleb Kanterov ab4e8099fb
Add `import` option for PyDABs (#1693)
## Changes
Add 'import' option for PyDABs
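
A sketch of where the option lives; the value shape (a list of Python
modules for PyDABs to import) is an assumption, not confirmed by this log:

```
experimental:
  pydabs:
    venv_path: .venv
    # Assumed shape: modules whose resource definitions PyDABs imports.
    import:
      - my_package.resources
```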

## Tests
Manually
2024-08-19 13:24:56 +00:00
Andrew Nester 54799a1918
Upgrade Go SDK to 0.44.0 (#1679)
## Changes
Upgrade Go SDK to 0.44.0

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-08-15 13:23:07 +00:00
Pieter Noordhuis 6b3d33a846
Upgrade TF provider to 1.50.0 (#1681)
## Changes

See
https://github.com/databricks/terraform-provider-databricks/pull/3900

## Tests

* Manually test on a bundle with a pipeline and a schema
* Integration tests pass
2024-08-15 12:43:39 +00:00
Renaud Hartert 7aaaee2512
[Internal] Remove dependency to the `openapi` package of the Go SDK (#1676)
## Changes

This PR removes the dependency on the `databricks-sdk-go/openapi`
package by copying the struct and functions that are needed in a new
`schema/spec.go` file.

The reason to remove this dependency is that it is being deprecated.
Copying the code in the `cli` repo seems reasonable given that it only
uses a couple of very small structs.

## Tests

Verified that CLI code can be properly generated after this change.
2024-08-14 15:59:55 +00:00
Andrew Nester 48ff18e5fc
Upload local libraries even if they don't have artifact defined (#1664)
## Changes
Previously, DABs required a corresponding artifact section for every
library referenced in the configuration. This is neither necessary nor
flexible, because local libraries might be built outside of the DABs
context. It also created hard-to-follow logic in the code, where
libraries were back-referenced to artifacts.


This PR does 3 things:
1. Allows all local libraries referenced in DABs config to be uploaded
to remote
2. Simplifies upload and glob reference expansion logic by doing it in
a single place
3. Speeds things up by uploading each library only once and doing so in
parallel (see the sketch below)
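
A self-contained sketch of points 1 and 3, deduplicating library
references and uploading in parallel (the `upload` callback is a
hypothetical stand-in for the actual upload logic):

```go
package main

import (
	"fmt"
	"sync"
)

// uploadAll uploads each distinct library path exactly once, in parallel.
func uploadAll(libraries []string, upload func(string)) {
	seen := make(map[string]struct{})
	var wg sync.WaitGroup
	for _, lib := range libraries {
		if _, ok := seen[lib]; ok {
			continue // duplicate reference: already scheduled
		}
		seen[lib] = struct{}{}
		wg.Add(1)
		go func(path string) {
			defer wg.Done()
			upload(path)
		}(lib)
	}
	wg.Wait()
}

func main() {
	libs := []string{"./dist/a.whl", "./dist/a.whl", "./dist/b.whl"}
	uploadAll(libs, func(path string) { fmt.Println("uploading", path) })
}
```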

## Tests
Added unit + integration tests + made sure that change is backward
compatible (no changes in existing tests)

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-08-14 09:03:44 +00:00
shreyas-goenka 7ae80de351
Stop tracking file path locations in bundle resources (#1673)
## Changes
Since locations are already tracked in the dynamic value tree, we no
longer need to track them at the resource/artifact level. This PR:
1. Removes use of `paths.Paths`. Uses dyn.Location instead.
2. Refactors the validation of resources not being empty valued to be
generic across all resource types.
  
## Tests
Existing unit tests.
2024-08-13 12:50:15 +00:00
shreyas-goenka 1b984b4f62
Skip pushing Terraform state after destroy (#1667)
## Changes
Following up
https://github.com/databricks/cli/pull/1583#discussion_r1681126323.

We can skip pushing because it happens right after `root_path` is
deleted, which makes it effectively a no-op.
 
## Tests
2024-08-12 09:19:54 +00:00
Pieter Noordhuis d3d828d175
Fix glob expansion after running a generic build command (#1662)
## Changes

This didn't work as expected because the generic build mutator called
into the type-specific build mutator in the middle of the function. This
invalidated the `config.Artifact` pointer that was being mutated later
on, effectively hiding these mutations from its caller.

To fix this, I turned glob expansion into its own mutator. It now works
as expected, _and_ produces better errors if the glob patterns are
invalid or do not match files.

## Tests

Unit tests.

Manual verification:
```
% databricks bundle deploy
Building sbt_example...

Error: target/scala-2.12/sbt-e[xam22ple*.jar: syntax error in pattern
  at artifacts.sbt_example.files[1].source
  in databricks.yml:15:17
```
2024-08-07 14:47:03 +00:00
Pieter Noordhuis f3ffded3bf
Merge job parameters based on their name (#1659)
## Changes

This change enables overriding the default value of job parameters in
target overrides.

This is the same approach we already take for job clusters and job
tasks.

Closes #1620.
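
A minimal sketch of the name-based merge, with a hypothetical
`Parameter` type standing in for the job parameter definition:

```go
package main

import "fmt"

type Parameter struct {
	Name    string
	Default string
}

// mergeByName merges override parameters into base parameters keyed by
// name, mirroring the approach used for job clusters and job tasks.
func mergeByName(base, override []Parameter) []Parameter {
	index := make(map[string]int)
	for i, p := range base {
		index[p.Name] = i
	}
	for _, p := range override {
		if i, ok := index[p.Name]; ok {
			base[i] = p // same name: the target override wins
		} else {
			base = append(base, p)
		}
	}
	return base
}

func main() {
	base := []Parameter{{"env", "dev"}}
	override := []Parameter{{"env", "prod"}}
	fmt.Println(mergeByName(base, override)) // [{env prod}]
}
```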

## Tests

Mutator unit tests and lightweight end-to-end tests.
2024-08-06 16:12:18 +00:00
Andrew Nester d26f3f4863
Fixed incorrectly cleaning up python wheel dist folder (#1656)
## Changes
In https://github.com/databricks/cli/pull/1618 we introduced a prepare
step in which the Python wheel folder was cleaned. The folder was then
cleaned on every run instead of only when there is a build command, as
it used to work.

This PR fixes it by only cleaning up dist folder when there is a build
command for wheels.

Fixes #1638 

## Tests
Added regression test
2024-08-06 09:54:58 +00:00
Andrew Nester 809c67b675
Expand and upload local wheel libraries for all task types (#1649)
## Changes
Fixes #1553 

## Tests
Added regression test
2024-08-05 14:44:23 +00:00
shreyas-goenka c454c2fd10
Use precomputed terraform plan for `bundle deploy` (#1640)
## Changes
With https://github.com/databricks/cli/pull/1413 we started to compute
and partially print the plan if it contained deletion of UC schemas.
This PR uses the precomputed plan to avoid planning a second time when
actually applying the changes with Terraform.

This fixes a performance regression introduced in
https://github.com/databricks/cli/pull/1413.

## Tests

Tested manually.
1. Verified bundle deployment still works and deploys resources.
2. Verified that the precomputed plan is indeed being used by attaching
a debugger and removing the plan file right before the terraform apply
process is spawned and asserting that terraform apply fails because the
plan is not found.
2024-07-31 14:07:25 +00:00
Andrew Nester 1fb8e324d5
Added test for negation pattern in sync include exclude section (#1637)
## Changes
Added test for negation pattern in sync include exclude section
2024-07-31 13:42:23 +00:00
shreyas-goenka 89c0af5bdc
Add resource for UC schemas to DABs (#1413)
## Changes
This PR adds support for UC Schemas to DABs. This allows users to define
schemas for tables and other assets their pipelines/workflows create as
part of the DAB, thus managing the life-cycle in the DAB.

The first version has a couple of intentional limitations:
1. The owner of the schema will be the deployment user. Changing the
owner of the schema is not allowed (yet). `run_as` will not be
restricted for DABs containing UC schemas. Let's limit the scope of
run_as to the compute identity used instead of ownership of data assets
like UC schemas.
2. API fields that are present in the update API but not in the create
API are not supported. For example: enabling predictive optimization is
not supported in the create schema API and thus is not available in
DABs at the moment.

## Tests
Manually and integration test. Manually verified the following work:
1. Development mode adds a "dev_" prefix.
2. Modified status is correctly computed in the `bundle summary`
command.
3. Grants work as expected, for assigning privileges.
4. Variable interpolation works for the schema ID.
2024-07-31 12:16:28 +00:00
Alex Moschos ecba875fe5
Regenerate TF schema (#1635)
## Changes
- Regenerate TF schema for CLI. Due to an issue the previous generation
missed some TF changes.
2024-07-30 10:13:05 +00:00
shreyas-goenka a52b188e99
Use dynamic walking to validate unique resource keys (#1614)
## Changes
This PR:
1. Uses dynamic walking (via the `dyn.MapByPattern` func) to validate
that no two resources have the same resource key (see the sketch
below). This allows us to remove this validation at merge time.
2. Modifies `dyn.Mapping` to always return a sorted slice of pairs. This
makes traversal functions like `dyn.Walk` or `dyn.MapByPattern`
deterministic.
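
A self-contained sketch of the idea: walk all resource types, sorted
for deterministic output (item 2), and report any key seen more than
once. The CLI performs this walk over the dynamic value tree with
`dyn.MapByPattern`:

```go
package main

import (
	"fmt"
	"sort"
)

func main() {
	resources := map[string]map[string]any{
		"jobs":      {"foo": nil},
		"pipelines": {"foo": nil}, // duplicate key "foo" across types
	}

	// Sort resource types so the validation output is deterministic,
	// mirroring the sorted traversal added to dyn.Mapping.
	types := make([]string, 0, len(resources))
	for t := range resources {
		types = append(types, t)
	}
	sort.Strings(types)

	seen := map[string]string{}
	for _, t := range types {
		for key := range resources[t] {
			if prev, ok := seen[key]; ok {
				fmt.Printf("duplicate resource key %q used by %s and %s\n", key, prev, t)
				continue
			}
			seen[key] = t
		}
	}
}
```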

## Tests
Unit tests. Also manually.
2024-07-29 13:04:02 +00:00
shreyas-goenka 37b9df96e6
Support multiple paths for diagnostics (#1616)
## Changes
Some diagnostics can have multiple paths associated with them. For
instance, ensuring that unique resource keys are used across all
resources. This PR extends `diag.Diagnostic` to accept multiple paths.

This PR is symmetrical to
https://github.com/databricks/cli/pull/1610/files

## Tests
Unit tests
2024-07-25 15:16:27 +00:00
Andrew Nester 90aaf2d20f
Upgrade TF provider to 1.49.1 (#1626)
## Changes
Upgrade TF provider to 1.49.1
2024-07-25 14:18:49 +00:00
shreyas-goenka e6241e196f
Move to a single prompt during bundle destroy (#1583)
## Changes
Right now we ask users for two confirmations when destroying a bundle.
One to destroy the resources and one to delete the files. This PR
consolidates the two prompts into one.

## Tests
Manually

Destroying a bundle with no resources:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
All files and directories at the following location will be deleted: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Would you like to proceed? [y/n]: y
No resources to destroy
Updating deployment state...
Deleting files...
Destroy complete!
```

Destroying a bundle with no remote state:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
No active deployment found to destroy!
```

When a user cancels the destroy:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
The following resources will be deleted:
  delete job job_1
  delete job job_2
  delete pipeline foo

All files and directories at the following location will be deleted: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Would you like to proceed? [y/n]: n
Destroy cancelled!
```

When a user destroys resources:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
The following resources will be deleted:
  delete job job_1
  delete job job_2
  delete pipeline foo

All files and directories at the following location will be deleted: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Would you like to proceed? [y/n]: y
Updating deployment state...
Deleting files...
Destroy complete!
```
2024-07-24 13:02:19 +00:00
Andrew Nester 39fc86e83b
Split artifact cleanup into prepare step before build (#1618)
## Changes
Now the prepare stage, which does cleanup, is executed once before
every build, so artifacts built into the same folder are correctly kept

Fixes workaround 2 from this issue #1602

## Tests
Added unit test
2024-07-24 09:13:49 +00:00
shreyas-goenka 4bf88b4209
Support multiple locations for diagnostics (#1610)
## Changes
This PR changes `diag.Diagnostics` to allow including multiple locations
associated with the diagnostic message. The diagnostics that now return
multiple locations with this PR are:
1. Warning for unknown keys in config.
2. Use of experimental.run_as
3. Accidental `sync.excludes` entries that exclude all files.

## Tests
Existing unit tests pass. New unit test case to assert on error message
when multiple locations are included.

Example output:
```
➜  bundle-playground-2 ~/cli2/cli/cli bundle validate              
Warning: You are using the legacy mode of run_as. The support for this mode is experimental and might be removed in a future release of the CLI. In order to run the DLT pipelines in your DAB as the run_as user this mode changes the owners of the pipelines to the run_as identity, which requires the user deploying the bundle to be a workspace admin, and also a Metastore admin if the pipeline target is in UC.
  at experimental.use_legacy_run_as
  in resources.yml:10:22
     databricks.yml:13:22

Name: fix run_if
Target: default
Workspace:
  User: shreyas.goenka@databricks.com
  Path: /Users/shreyas.goenka@databricks.com/.bundle/fix run_if/default

Found 1 warning
```
2024-07-23 17:20:11 +00:00
Pieter Noordhuis 52ca599cd5
Upgrade TF provider to 1.49.0 (#1617)
## Changes

This includes a fix for model serving endpoints.

See
https://github.com/databricks/terraform-provider-databricks/pull/3690.

## Tests

n/a
2024-07-23 16:15:02 +00:00
Pieter Noordhuis 2aeea5e384
Remove unused package bundle/deployer (#1607)
## Changes

This has been superseded by individual mutators under
`bundle/deploy/terraform`.

## Tests

n/a
2024-07-18 14:57:31 +00:00
Pieter Noordhuis 6953a5d5af
Add read-only mode for extension aware workspace filer (#1609)
## Changes

By default, construct a read/write instance. If constructed in read-only
mode, the underlying filer is wrapped in a readahead cache.

## Tests

* Filer integration tests pass.
* Manual test that caching is enabled when running on WSFS.
2024-07-18 14:17:42 +00:00
shreyas-goenka 5b65358146
Use local Terraform state only when lineage match (#1588)
## Changes
DABs deployments should be isolated if `root_path` and workspace host
are different. This PR fixes a bug where local terraform state gets
piggybacked if the same cwd is used to deploy two isolated deployments
for the same bundle target. This can happen if:
1. A user switches to a different identity on the same machine. 
2. The workspace host URL the bundle/target points to is changed.
3. A user changes the `root_path` while doing bundle development.

To solve this problem we rely on the lineage field available in the
terraform state, which is a uuid identifying unique terraform
deployments. There's a 1:1 mapping between a terraform deployment and a
bundle deployment.

For more details on how lineage works in terraform, see:
https://developer.hashicorp.com/terraform/language/state/backends#manual-state-pull-push
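
A simplified sketch of the decision this enables, assuming hypothetical
`state` structs with the lineage and serial fields that Terraform state
carries (not necessarily the CLI's exact rule):

```go
package main

import "fmt"

type state struct {
	Lineage string // UUID identifying a unique Terraform deployment
	Serial  int    // monotonically increasing state version
}

// pickState reuses the local state only when its lineage matches the
// remote state's lineage; otherwise it falls back to the remote state.
func pickState(local, remote *state) *state {
	if local != nil && remote != nil && local.Lineage == remote.Lineage {
		if local.Serial >= remote.Serial {
			return local // same deployment and at least as new: safe to reuse
		}
	}
	return remote // different deployment (or no local copy): use remote
}

func main() {
	local := &state{Lineage: "aaaa", Serial: 4}
	remote := &state{Lineage: "bbbb", Serial: 2}
	fmt.Println(pickState(local, remote) == remote) // true: lineages differ
}
```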

## Tests
Manually verified that changing the identity no longer results in the
incorrect terraform state being used. Also, new unit tests are added.
2024-07-18 09:47:59 +00:00
shreyas-goenka c6c2692368
Attribute Terraform API requests to the CLI (#1598)
## Changes
This PR adds `cli` to the user agent sent downstream to the Databricks
Terraform provider when invoked via DABs.
 
## Tests
Unit tests. Based on the comment here
(10fe02075f/bundle/config/mutator/verify_cli_version_test.go (L113))
we don't need to set the version to make the test assertion work
correctly. This is likely because we use `go test` to run the tests
while the CLI is compiled and the version is set via `goreleaser`.
2024-07-18 09:38:09 +00:00
Pieter Noordhuis e1474a38f9
Upgrade TF provider to 1.48.3 (#1600)
## Changes

This includes a fix for using periodic triggers.

## Tests

Manually confirmed this works with
https://github.com/databricks/bundle-examples/pull/32.
2024-07-17 08:49:19 +00:00
shreyas-goenka 8ed9964482
Track multiple locations associated with a `dyn.Value` (#1510)
## Changes
This PR changes the location metadata associated with a `dyn.Value` to a
slice of locations. This will allow us to keep track of location
metadata across merges and overrides.

The convention is to treat the first location in the slice as the
primary location. Also, the semantics are the same as before if there's
only one location associated with a value, that is:
1. For complex values (maps, sequences), the location of v1 is primary
in Merge(v1, v2)
2. For primitive values, the location of v2 is primary in Merge(v1, v2)
(see the sketch below)
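
A self-contained sketch of the primitive-value case, where the merged
value keeps all locations but v2's becomes primary (simplified
stand-ins, not the `dyn` package's actual types):

```go
package main

import "fmt"

type Location struct {
	File string
	Line int
}

type Value struct {
	v         any
	locations []Location // locations[0] is the primary location
}

// merge models the primitive case: v2 wins, its location becomes
// primary, and v1's locations are retained after it.
func merge(v1, v2 Value) Value {
	return Value{v: v2.v, locations: append(v2.locations, v1.locations...)}
}

func main() {
	a := Value{v: 1, locations: []Location{{"databricks.yml", 3}}}
	b := Value{v: 2, locations: []Location{{"override.yml", 7}}}
	m := merge(a, b)
	fmt.Println(m.v, m.locations) // 2 [{override.yml 7} {databricks.yml 3}]
}
```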

## Tests
Modifying existing merge unit tests. Other existing unit tests and
integration tests pass.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-07-16 11:27:27 +00:00
shreyas-goenka 39c2633773
Add UUID to uniquely identify a deployment state (#1595)
## Changes
We need a mechanism to invalidate the locally cached deployment state if
a user uses the same working directory to deploy to multiple distinct
deployments (separate targets, root_paths or even hosts).

This PR just adds the UUID to the deployment state in preparation for
invalidating this cache. The actual invalidation will follow up at a
later date (tracked in internal backlog).

## Tests
Unit test. Manually checked the deployment state is actually being
written.
2024-07-16 10:01:58 +00:00
Andrew Nester 434bcbb018
Allow artifacts (JARs, wheels) to be uploaded to UC Volumes (#1591)
## Changes
This change allows specifying a UC Volumes path as the artifact path,
so all artifacts (JARs, wheels) are uploaded to UC Volumes.

Example configuration is here:
```
bundle:
  name: jar-bundle

workspace:
  host: https://foo.com
  artifact_path: /Volumes/main/default/foobar

artifacts:
  my_java_code:
    path: ./sample-java
    build: "javac PrintArgs.java && jar cvfm PrintArgs.jar META-INF/MANIFEST.MF PrintArgs.class"
    files:
      - source: ./sample-java/PrintArgs.jar

resources:
  jobs:
    jar_job:
      name: "Test Spark Jar Job"
      tasks:
        - task_key: TestSparkJarTask
          new_cluster:
            num_workers: 1
            spark_version: "14.3.x-scala2.12"
            node_type_id: "i3.xlarge"
          spark_jar_task:
            main_class_name: PrintArgs
          libraries:
            - jar: ./sample-java/PrintArgs.jar
```
## Tests
Manually + added E2E test for Java jobs

E2E test is temporarily skipped until auth-related issues for UC tests
are resolved
2024-07-16 08:57:04 +00:00
Gleb Kanterov af975ca64b
Print diagnostics in 'bundle deploy' (#1579)
## Changes
Print diagnostics in 'bundle deploy' similar to 'bundle validate'. This
way if a bundle has any errors or warnings, they are going to be easy to
notice.

NB: due to how we render errors, there is one extra trailing new line in
output, preserved in examples below

## Example: No errors or warnings

```
% databricks bundle deploy
Building default...
Deploying resources...
Updating deployment state...
Deployment complete!
```

## Example: Error on load

```
% databricks bundle deploy
Error: Databricks CLI version constraint not satisfied. Required: >= 1337.0.0, current: 0.0.0-dev

```

## Example: Warning on load

```
% databricks bundle deploy
Building default...
Deploying resources...
Updating deployment state...
Deployment complete!
Warning: unknown field: foo
  in databricks.yml:6:1

```

## Example: Error + warning on load

```
% databricks bundle deploy
Warning: unknown field: foo
  in databricks.yml:6:1

Error: something went wrong

```

## Example: Warning on load + error in init

```
% databricks bundle deploy
Warning: unknown field: foo
  in databricks.yml:6:1

Error: Failed to xxx
  in yyy.yml

Detailed explanation
in multiple lines

```

## Tests
Tested manually
2024-07-10 11:14:57 +00:00
shreyas-goenka 5bc5c3c26a
Return early in bundle destroy if no deployment exists (#1581)
## Changes
This PR:
1. Moves the if mutator to the bundle package, to live with all-time
greats such as `bundle.Seq` and `bundle.Defer`. Also adds unit tests.
2. `bundle destroy` now returns early if `root_path` does not exist. We
do this by leveraging a `bundle.If` condition (sketched below).
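
A self-contained sketch of what an `If` mutator can look like, with
simplified stand-in types rather than the bundle package's actual ones:

```go
package main

import (
	"context"
	"fmt"
)

type Bundle struct{ RootPathExists bool }

type Mutator func(context.Context, *Bundle) error

// If runs onTrue when the condition holds and onFalse otherwise.
func If(cond func(context.Context, *Bundle) (bool, error), onTrue, onFalse Mutator) Mutator {
	return func(ctx context.Context, b *Bundle) error {
		ok, err := cond(ctx, b)
		if err != nil {
			return err
		}
		if ok {
			return onTrue(ctx, b)
		}
		return onFalse(ctx, b)
	}
}

func main() {
	destroy := If(
		func(_ context.Context, b *Bundle) (bool, error) { return b.RootPathExists, nil },
		func(context.Context, *Bundle) error { fmt.Println("destroying resources..."); return nil },
		func(context.Context, *Bundle) error { fmt.Println("No active deployment found to destroy!"); return nil },
	)
	_ = destroy(context.Background(), &Bundle{RootPathExists: false})
}
```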

## Tests
Unit tests and manually.

Here's an example of what it'll look like once the bundle is destroyed.

```
➜  bundle-playground git:(master) ✗ cli bundle destroy
No active deployment found to destroy!
```

I would have added some e2e coverage for this as well, but the
`cobraTestRunner.Run()` method does not seem to return stdout/stderr
logs correctly. We can probably punt looking into it.
2024-07-09 15:08:38 +00:00
Andrew Nester 8b468b423f
Change SetVariables mutator to mutate dynamic configuration instead (#1573)
## Changes
Previously the `SetVariables` mutator mutated the typed configuration
by using `v.Set` for variables. This led to the variables' `value`
field not having location information.

By using dynamic configuration mutation, we keep the same functionality
but also preserve location information for value when it's set from
default.

Fixes #1568 #1538

## Tests
Added unit tests
2024-07-09 11:12:42 +00:00
Andrew Nester 3d8446bbdb
Rewrite local path for libraries in foreach tasks (#1569)
## Changes
Now local library paths in the `libraries` section of for-each tasks
are correctly replaced with the remote path for the library when it's
uploaded to Databricks

## Tests
Added unit test
2024-07-05 10:58:28 +00:00
Andrew Nester 040b374430
Override complex variables with target overrides instead of merging (#1567)
## Changes
At the moment we merge values of complex variables, while the expected
behaviour is to override the value with the target one.
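
A small sketch contrasting the two behaviours for a map-valued
variable:

```go
package main

import "fmt"

func main() {
	def := map[string]any{"spark_version": "13.2.x-scala2.11", "num_workers": 2}
	target := map[string]any{"num_workers": 4}

	// Old behaviour: merge keys, keeping spark_version from the default.
	merged := map[string]any{}
	for k, v := range def {
		merged[k] = v
	}
	for k, v := range target {
		merged[k] = v
	}

	// New behaviour: the target override wins wholesale.
	overridden := target

	fmt.Println(merged)     // map[num_workers:4 spark_version:13.2.x-scala2.11]
	fmt.Println(overridden) // map[num_workers:4]
}
```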

## Tests
Added unit test
2024-07-04 11:57:29 +00:00
Pieter Noordhuis f14dded946
Replace `vfs.Path` with extension-aware filer when running on DBR (#1556)
## Changes

The FUSE mount of the workspace file system on DBR doesn't include file
extensions for notebooks. When these notebooks are checked into a
repository, they do have an extension. PR #1457 added a filer type that
is aware of this disparity and makes these notebooks show up as if they
do have these extensions.

This change swaps out the native `vfs.Path` with one that uses this
filer when running on DBR.

Follow up: consolidate between interfaces exported by `filer.Filer` and
`vfs.Path`.

## Tests

* Unit tests pass
* (Manually ran a snapshot build on DBR against a bundle with notebooks)

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-07-03 11:55:42 +00:00
Pieter Noordhuis b3c044c461
Use `vfs.Path` for filesystem interaction (#1554)
## Changes

Note: this doesn't cover _all_ filesystem interaction.

To intercept calls that read or stat files to determine their type, we
need a layer between our code and the `os` package calls that interact
with the local file system. Interception is necessary to accommodate
differences between a regular local file system and the FUSE-mounted
Workspace File System when running the CLI on DBR.

This change makes use of #1452 in the bundle struct.

It uses #1525 to access the bundle variable in path rewriting.

## Tests

* Unit tests pass.
* Integration tests pass.
2024-07-03 10:13:22 +00:00
Gleb Kanterov 4787edba36
PythonMutator: allow inserting 'resources' and 'resources.jobs' (#1555)
## Changes
Allow inserting 'resources' and 'resources.jobs' because they can be
absent in the incoming bundle.

## Tests
Unit tests
2024-07-03 08:33:23 +00:00
Gleb Kanterov b9e3c98723
PythonMutator: support omitempty in PyDABs (#1513)
## Changes
PyDABs output can omit empty sequences/mappings because we don't track
them as optional. There is no semantic difference between empty and
missing, which makes omitting correct. Without special handling, the
CLI falsely detects that we modified the input resources by deleting
all empty collections.

To handle that, we extend `dyn.Override` to allow visitors to ignore
certain deletes. If we see that an empty sequence or mapping was
deleted, we revert the delete.
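
A sketch of the visitor's decision, assuming a hypothetical
`visitDelete` hook that returns whether to keep the old value:

```go
package main

import "fmt"

// visitDelete reports whether a delete of the old value should be
// reverted: deletes of empty sequences or mappings are ignored, since
// "empty" and "missing" are semantically equivalent here.
func visitDelete(old any) (keep bool) {
	switch v := old.(type) {
	case []any:
		return len(v) == 0 // empty sequence: revert the delete
	case map[string]any:
		return len(v) == 0 // empty mapping: revert the delete
	default:
		return false // real deletion: let it through
	}
}

func main() {
	fmt.Println(visitDelete([]any{}))                // true: revert delete
	fmt.Println(visitDelete(map[string]any{"a": 1})) // false: genuine delete
}
```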

## Tests
Unit tests

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-07-03 07:22:03 +00:00
Gleb Kanterov 5a0a6d7334
PythonMutator: add diagnostics (#1531)
## Changes
Allow PyDABs to report `dyn.Diagnostics` by writing to
`diagnostics.json` supplied as an argument, similar to `input.json` and
`output.json`

Such errors are not yet properly printed in `databricks bundle
validate`, which will be fixed in a follow-up PR.

## Tests
Unit tests
2024-07-02 15:10:53 +00:00
Andrew Nester 3d2f7622bc
Fixed bundle not loading when empty variable is defined (#1552)
## Changes
Fixes #1544

## Tests
Added regression test
2024-07-02 12:40:39 +00:00
Andrew Nester 0d64975d36
Fixed resolving variable references inside slice variable (#1550)
## Changes
Fixes #1541 

## Tests
Added regression unit test

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-07-02 11:45:16 +00:00
Pieter Noordhuis a0df54ac41
Add extra tests for the sync block (#1548)
## Changes

Issue #1545 describes how a nil entry in the sync block caused an error.

The fix for this issue is in #1547. This change adds end-to-end test
coverage.

## Tests

New test passes on top of #1547.
2024-07-01 13:08:50 +00:00
Gleb Kanterov e8b76a7f13
Improve `bundle validate` output (#1532)
## Changes
This combination of changes allows pretty-printing errors happening
during the "load" and "init" phases, including their locations.

Move the rendering code into a separate module dedicated to rendering
`diag.Diagnostics` in a human-readable format. This will be used for the
`bundle deploy` command.

Preserve the "bundle" value if an error occurs in mutators. Rewrite the
Go templates to handle the case where the bundle isn't yet loaded
because an error occurred during loading, which is now possible.

Improve rendering for errors and warnings:
- don't render empty locations
- render "details" for errors if they exist

Add `root.ErrAlreadyPrinted` indicating that the error was already
printed, and the CLI entry point shouldn't print it again.

## Tests
Add tests for output that are especially handy for detecting extra newlines
2024-07-01 09:01:10 +00:00
Gleb Kanterov aee3910f3d
PythonMutator: register product in user agent extra (#1533)
## Changes
Register user agent product following RFC 9110.

See
https://github.com/databricks/terraform-provider-databricks/pull/3520
for Terraform change.

## Tests
Unit tests
2024-07-01 07:46:37 +00:00
shreyas-goenka 4d8eba04cd
Compare `.Kind()` instead of direct equality checks on a `dyn.Value` (#1520)
## Changes

This PR makes two changes:

1. In https://github.com/databricks/cli/pull/1510 we'll be adding
multiple associated location metadata with a dyn.Value. The Go compiler
does not allow comparing structs if they contain slice values
(presumably due to multiple possible definitions for equality). In
anticipation for adding a `[]dyn.Location` type field to `dyn.Value`
this PR removes all direct comparisons of `dyn.Value` and instead relies
on the kind.

2. Retain location metadata for values in convert.FromTyped. The change
diff is exactly the same as https://github.com/databricks/cli/pull/1523.
It's been combined with this PR because they both depend on each other
to prevent test failures (forming a test failure deadlock).

Go patch used:
```
@@
var x expression
@@
-x == dyn.InvalidValue
+x.Kind() == dyn.KindInvalid

@@
var x expression
@@
-x != dyn.InvalidValue
+x.Kind() != dyn.KindInvalid

@@
var x expression
@@
-x == dyn.NilValue
+x.Kind() == dyn.KindNil

@@
var x expression
@@
-x != dyn.NilValue
+x.Kind() != dyn.KindNil
```
 

## Tests
Unit tests and integration tests pass.
2024-06-27 13:28:19 +00:00
Andrew Nester 5f42791609
Added support for complex variables (#1467)
## Changes
Added support for complex variables

Now it's possible to add and use complex variables as shown below

```
bundle:
  name: complex-variables

resources:
  jobs:
    my_job:
      job_clusters:
        - job_cluster_key: key
          new_cluster: ${var.cluster}
      tasks:
      - task_key: test
        job_cluster_key: key

variables:
  cluster:
    description: "A cluster definition"
    type: complex
    default:
      spark_version: "13.2.x-scala2.11"
      node_type_id: "Standard_DS3_v2"
      num_workers: 2
      spark_conf:
        spark.speculation: true
        spark.databricks.delta.retentionDurationCheck.enabled: false
```

Fixes #1298

- [x] Support for complex variables
- [x] Allow variable overrides (with shortcut) in targets
- [x] Don't allow to provide complex variables via flag or env variable
- [x] Fail validation if complex value is used but not `type: complex`
provided
- [x] Support using variables inside complex variables 

## Tests
Added unit tests

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2024-06-26 10:25:32 +00:00
Pieter Noordhuis ce5a3f2ce6
Upgrade TF provider to 1.48.0 (#1527)
## Changes

This includes a fix for library order not being respected.

## Tests

Manually confirmed the fix works in
https://github.com/databricks/bundle-examples/pull/29.
2024-06-26 09:29:46 +00:00
dependabot[bot] 8468878eed
Bump github.com/databricks/databricks-sdk-go from 0.42.0 to 0.43.0 (#1522)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.42.0 to 0.43.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/releases">github.com/databricks/databricks-sdk-go's
releases</a>.</em></p>
<blockquote>
<h2>v0.43.0</h2>
<p>Major Changes and Improvements:</p>
<ul>
<li>Support partners in user agent for SDK (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/925">#925</a>).</li>
<li>Add <code>serverless_compute_id</code> field to the config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/952">#952</a>).</li>
</ul>
<p>Other Changes:</p>
<ul>
<li>Generate from latest spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/944">#944</a>)
and (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/947">#947</a>).</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Changed <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CatalogInfo">catalog.CatalogInfo</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CatalogIsolationMode">catalog.CatalogIsolationMode</a>.</li>
<li>Added <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ExternalLocationInfo">catalog.ExternalLocationInfo</a>.</li>
<li>Added <code>MaxResults</code> and <code>PageToken</code> fields for
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListCatalogsRequest">catalog.ListCatalogsRequest</a>.</li>
<li>Added <code>NextPageToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ListCatalogsResponse">catalog.ListCatalogsResponse</a>.</li>
<li>Added <code>TableServingUrl</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#OnlineTable">catalog.OnlineTable</a>.</li>
<li>Added <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#StorageCredentialInfo">catalog.StorageCredentialInfo</a>.</li>
<li>Changed <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateCatalog">catalog.UpdateCatalog</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CatalogIsolationMode">catalog.CatalogIsolationMode</a>.</li>
<li>Added <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateExternalLocation">catalog.UpdateExternalLocation</a>.</li>
<li>Added <code>IsolationMode</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#UpdateStorageCredential">catalog.UpdateStorageCredential</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#CatalogIsolationMode">catalog.CatalogIsolationMode</a>.</li>
<li>Added <code>CreateSchedule</code>, <code>CreateSubscription</code>,
<code>DeleteSchedule</code>, <code>DeleteSubscription</code>,
<code>GetSchedule</code>, <code>GetSubscription</code>,
<code>List</code>, <code>ListSchedules</code>,
<code>ListSubscriptions</code> and <code>UpdateSchedule</code> methods
for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#LakeviewAPI">w.Lakeview</a>
workspace-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#CreateScheduleRequest">dashboards.CreateScheduleRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#CreateSubscriptionRequest">dashboards.CreateSubscriptionRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#CronSchedule">dashboards.CronSchedule</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#DashboardView">dashboards.DashboardView</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#DeleteScheduleRequest">dashboards.DeleteScheduleRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#DeleteSubscriptionRequest">dashboards.DeleteSubscriptionRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#GetScheduleRequest">dashboards.GetScheduleRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#GetSubscriptionRequest">dashboards.GetSubscriptionRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListDashboardsRequest">dashboards.ListDashboardsRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListDashboardsResponse">dashboards.ListDashboardsResponse</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListSchedulesRequest">dashboards.ListSchedulesRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListSchedulesResponse">dashboards.ListSchedulesResponse</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListSubscriptionsRequest">dashboards.ListSubscriptionsRequest</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#ListSubscriptionsResponse">dashboards.ListSubscriptionsResponse</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#Schedule">dashboards.Schedule</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#SchedulePauseStatus">dashboards.SchedulePauseStatus</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#Subscriber">dashboards.Subscriber</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#Subscription">dashboards.Subscription</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#SubscriptionSubscriberDestination">dashboards.SubscriptionSubscriberDestination</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#SubscriptionSubscriberUser">dashboards.SubscriptionSubscriberUser</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#UpdateScheduleRequest">dashboards.UpdateScheduleRequest</a>
structs.</li>
<li>Added <code>OnStreamingBacklogExceeded</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#JobEmailNotifications">jobs.JobEmailNotifications</a>.</li>
<li>Added <code>EnvironmentKey</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunTask">jobs.RunTask</a>.</li>
<li>Removed <code>ConditionTask</code>, <code>DbtTask</code>,
<code>NotebookTask</code>, <code>PipelineTask</code>,
<code>PythonWheelTask</code>, <code>RunJobTask</code>,
<code>SparkJarTask</code>, <code>SparkPythonTask</code>,
<code>SparkSubmitTask</code>, <code>SqlTask</code> and
<code>Environments</code> fields for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun">jobs.SubmitRun</a>.</li>
<li>Added <code>DbtTask</code> and <code>EnvironmentKey</code> field for
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitTask">jobs.SubmitTask</a>.</li>
<li>Added <code>OnStreamingBacklogExceeded</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TaskEmailNotifications">jobs.TaskEmailNotifications</a>.</li>
<li>Added <code>Periodic</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#TriggerSettings">jobs.TriggerSettings</a>.</li>
<li>Added <code>OnStreamingBacklogExceeded</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#WebhookNotifications">jobs.WebhookNotifications</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PeriodicTriggerConfiguration">jobs.PeriodicTriggerConfiguration</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#PeriodicTriggerConfigurationTimeUnit">jobs.PeriodicTriggerConfigurationTimeUnit</a>.</li>
<li>Added <code>ProviderSummary</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#Listing">marketplace.Listing</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#ProviderIconFile">marketplace.ProviderIconFile</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#ProviderIconType">marketplace.ProviderIconType</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/marketplace#ProviderListingSummaryInfo">marketplace.ProviderListingSummaryInfo</a>.</li>
<li>Added <code>Start</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsDataPlaneAPI">w.ServingEndpointsDataPlane</a>
workspace-level service.</li>
<li>Added <code>ServicePrincipalId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Added <code>ServicePrincipalName</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#StartAppRequest">serving.StartAppRequest</a>.</li>
<li>Added <code>QueryNextPage</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#VectorSearchIndexesAPI">w.VectorSearchIndexes</a>
workspace-level service.</li>
<li>Added <code>QueryType</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#QueryVectorIndexRequest">vectorsearch.QueryVectorIndexRequest</a>.</li>
<li>Added <code>NextPageToken</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#QueryVectorIndexResponse">vectorsearch.QueryVectorIndexResponse</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/vectorsearch#QueryVectorIndexNextPageRequest">vectorsearch.QueryVectorIndexNextPageRequest</a>.</li>
</ul>
<p>OpenAPI SHA: 7437dabb9dadee402c1fc060df4c1ce8cc5369f0, Date:
2024-06-25</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="3e419132ea"><code>3e41913</code></a>
Release v0.43.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/955">#955</a>)</li>
<li><a
href="ce3dc984f7"><code>ce3dc98</code></a>
Add <code>serverless_compute_id</code> field to the config (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/952">#952</a>)</li>
<li><a
href="00b1d09b24"><code>00b1d09</code></a>
Update OpenAPI spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/947">#947</a>)</li>
<li><a
href="d098b1a3e7"><code>d098b1a</code></a>
Support partners in SDK (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/925">#925</a>)</li>
<li><a
href="490bc13c0e"><code>490bc13</code></a>
Generate from latest spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/944">#944</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.42.0...v0.43.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.42.0&new-version=0.43.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-06-25 12:51:17 +00:00
Pieter Noordhuis 100a0516d4
Add context type and value to path rewriting (#1525)
## Changes

For a future change where the inner rewriting functions need access to
the underlying bundle, this change makes preparations.

All values were passed via the stack before and adding yet another value
would make the code less readable.

## Tests

Unit tests pass.
2024-06-25 10:04:22 +00:00
Gleb Kanterov 5ff06578ac
PythonMutator: replace stdin/stdout with files (#1512)
## Changes
Replace stdin/stdout with files in `PythonMutator`. Files are created in
a temporary directory.

Rename `ApplyPythonMutator` to `PythonMutator`.

Add test for `dyn.Location` behavior during the "load" stage.

## Tests
Unit tests
2024-06-24 07:47:41 +00:00
shreyas-goenka 068c7cfc2d
Return `dyn.InvalidValue` instead of `dyn.NilValue` when errors happen (#1514)
## Changes
With https://github.com/databricks/cli/pull/1507 and
https://github.com/databricks/cli/pull/1511 we are clarifying the
semantics associated with `dyn.InvalidValue` and `dyn.NilValue`. An
invalid value is the default zero value and is used to signal the
complete absence of a value.

A nil value, on the other hand, is a valid value for a piece of
configuration and signals explicitly setting a key to nil in the
configuration tree. In keeping with that theme, this PR returns
`dyn.InvalidValue` instead of `dyn.NilValue` at error sites. This change
is not expected to have a material change in behaviour and is being done
to set the right convention since we have well-defined semantics
associated with both `NilValue` and `InvalidValue`.

## Tests
Unit tests and integration tests pass. Also manually scanned the changes
and the associated call sites to verify the `NilValue` value itself was
not being relied upon.
2024-06-21 14:22:42 +00:00
Pieter Noordhuis 446a9d0c52
Properly deal with nil values in `convert.FromTyped` (#1511)
## Changes

When a configuration defines:
```yaml
run_as:
```

It first showed up as `run_as -> nil` in the dynamic configuration only
to later be converted to `run_as -> {}` while going through typed
conversion. We were using the presence of a key to initialize an empty
value. This is incorrect and it should have remained a nil value.

This conversion was happening in `convert.FromTyped` where any struct
always returned a map value. Instead, it should only return a map value
in any one of these cases: 1) the struct has elements, 2) the struct was
originally a map in the dynamic configuration, or 3) the struct was
initialized to a non-empty pointer value.

Stacked on top of #1516 and #1518.
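
A self-contained sketch of the nil-pointer rule using a hypothetical
`RunAs` struct:

```go
package main

import "fmt"

type RunAs struct {
	UserName string
}

// fromTyped sketches the decision: a nil pointer stays nil, while a
// non-nil pointer (even to an empty struct) converts to a map value.
func fromTyped(v *RunAs) any {
	if v == nil {
		return nil // "run_as:" with no body remains a nil value
	}
	m := map[string]any{}
	if v.UserName != "" {
		m["user_name"] = v.UserName
	}
	return m
}

func main() {
	fmt.Println(fromTyped(nil))                       // <nil>
	fmt.Println(fromTyped(&RunAs{}))                  // map[]
	fmt.Println(fromTyped(&RunAs{UserName: "marie"})) // map[user_name:marie]
}
```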

## Tests

* Unit tests pass.
* Integration tests pass.
* Manually ran through bundle CRUD with a bundle without resources.
2024-06-21 13:43:21 +00:00
Pieter Noordhuis 01adef666a
Set bool pointer to disable lock (#1516)
## Changes

This cherry-picks from #1490 to address an issue that came up in #1511.

The function `dyn.SetByPath` requires intermediate values to be present.
If they are not, it returns an error that it cannot index a map. This is
not an issue on main, where the intermediate maps are always created,
even if they are not present in the dynamic configuration tree. As of
#1511, we'll no longer populate empty maps for empty structs if they are
not explicitly set (i.e., a non-nil pointer). This change writes a bool
pointer to avoid this issue altogether.
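
A minimal sketch of the pointer-based approach:

```go
package main

import "fmt"

type Lock struct {
	// A pointer distinguishes "explicitly disabled" from "not set".
	Enabled *bool
}

func main() {
	var l Lock
	fmt.Println(l.Enabled == nil) // true: not set, defaults apply

	enabled := false
	l.Enabled = &enabled    // explicitly disable the lock
	fmt.Println(*l.Enabled) // false
}
```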

## Tests

Unit tests pass.
2024-06-21 11:14:33 +00:00
Gleb Kanterov 57a5a65f87
Add ApplyPythonMutator (#1430)
## Changes
Add ApplyPythonMutator, which will fork a Python subprocess and pipe
the bundle configuration through it.

It's enabled through `experimental` section, for example:

```yaml
experimental:
  pydabs: 
    enable: true
    venv_path: .venv
```

For now, it's limited to two phases in the mutator pipeline:

- `load`: adds new jobs
- `init`: adds new jobs, or modifies existing ones

It's enforced that no jobs are modified in `load` and no jobs are
deleted in `load`/`init`, because otherwise it would break existing
assumptions.

## Tests
Unit tests
2024-06-20 08:43:08 +00:00
Pieter Noordhuis b2c03ea54c
Use `dyn.InvalidValue` to indicate absence (#1507)
## Changes

Previously, the functions `Get` and `Index` returned `dyn.NilValue` to
indicate that a map key or sequence index wasn't found. This is a valid
value, so we need to differentiate between actual absence and a real
`dyn.NilValue`. We do this with the zero value of a `dyn.Value` (also
captured in the constant `dyn.InvalidValue`).
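
A self-contained sketch of the zero-value-as-absence convention
(simplified stand-ins, not the `dyn` package's actual types):

```go
package main

import "fmt"

type Kind int

const (
	KindInvalid Kind = iota // zero value: the value is absent
	KindNil                 // an explicit nil in the configuration
	KindString
)

type Value struct {
	kind Kind
	v    any
}

// get returns the zero Value (KindInvalid) when the key is missing,
// which is distinguishable from an explicit nil entry.
func get(m map[string]Value, key string) Value {
	return m[key]
}

func main() {
	m := map[string]Value{"run_as": {kind: KindNil}}
	fmt.Println(get(m, "run_as").kind == KindNil)      // true: explicitly nil
	fmt.Println(get(m, "missing").kind == KindInvalid) // true: absent
}
```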

## Tests

* Unit tests.
* Renamed `Get` and `Index` to find and update all call sites.
2024-06-19 15:24:57 +00:00
Lennart Kats (databricks) deb3e365cd
Pause quality monitors when "mode: development" is used (#1481)
## Changes

Similar to scheduled jobs, quality monitors should be paused when in
development mode (in line with the [behavior for scheduled
jobs](https://docs.databricks.com/en/dev-tools/bundles/deployment-modes.html)).
@aravind-segu @arpitjasa-db please take a look and verify this behavior.

- [x] Followup: documentation changes. If we make this change we should
update
https://docs.databricks.com/dev-tools/bundles/deployment-modes.html.

## Tests
Unit tests
2024-06-19 13:54:35 +00:00
Andrew Nester 663aa9ab8c
Override variables with lookup value even if they have a default value set (#1504)
## Changes

This PR fixes the behaviour where variables were not overridden with
the lookup value from targets if these variables had a default value
set in the default target.

Fixes #1449 

## Tests
Added regression test
2024-06-19 08:03:06 +00:00
shreyas-goenka 553fdd1e81
Serialize dynamic value for `bundle validate` output (#1499)
## Changes
Using dynamic values allows us to retain references like
`${resources.jobs...}` even when the field's type is not a string (e.g.
the job ID in `run_job_task`), and in general for values that do not
map to the Go type of a field.

## Tests
Integration test
2024-06-18 15:04:20 +00:00