Commit Graph

1155 Commits

Author SHA1 Message Date
Andrew Nester 630a56e41e
Release v0.225.0 (#1642)
Bundles:
* Add resource for UC schemas to DABs
([#1413](https://github.com/databricks/cli/pull/1413)).

Internal:
* Use dynamic walking to validate unique resource keys
([#1614](https://github.com/databricks/cli/pull/1614)).
* Regenerate TF schema
([#1635](https://github.com/databricks/cli/pull/1635)).
* Add upgrade and upgrade eager flags to pip install call
([#1636](https://github.com/databricks/cli/pull/1636)).
* Added test for negation pattern in sync include exclude section
([#1637](https://github.com/databricks/cli/pull/1637)).
* Use precomputed terraform plan for `bundle deploy`
([#1640](https://github.com/databricks/cli/pull/1640)).
2024-07-31 16:47:00 +00:00
shreyas-goenka c454c2fd10
Use precomputed terraform plan for `bundle deploy` (#1640)
# Changes
With https://github.com/databricks/cli/pull/1413 we started to compute
and partially print the plan if it contained deletion of UC schemas.
This PR reuses that precomputed plan so we no longer plan twice when the
deploy subsequently runs terraform apply.

This fixes a performance regression introduced in
https://github.com/databricks/cli/pull/1413.
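
A minimal sketch of the plan-once, apply-the-saved-plan pattern using the
hashicorp/terraform-exec library; whether this mirrors the CLI's exact wiring
is an assumption, and the function and paths below are illustrative:

```
package deploy

import (
	"context"
	"log"

	"github.com/hashicorp/terraform-exec/tfexec"
)

// deploy plans once, saves the plan to disk, and reuses that same plan
// file for apply instead of letting apply re-plan implicitly.
func deploy(ctx context.Context, workDir, terraformBin, planPath string) error {
	tf, err := tfexec.NewTerraform(workDir, terraformBin)
	if err != nil {
		return err
	}

	// Compute the plan once and persist it. This is the point where
	// destructive changes (e.g. UC schema deletions) can be surfaced.
	hasChanges, err := tf.Plan(ctx, tfexec.Out(planPath))
	if err != nil {
		return err
	}
	if !hasChanges {
		log.Println("nothing to deploy")
		return nil
	}

	// Apply the precomputed plan file; no second planning pass.
	return tf.Apply(ctx, tfexec.DirOrPlan(planPath))
}
```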

# Tests

Tested manually.
1. Verified bundle deployment still works and deploys resources.
2. Verified that the precomputed plan is indeed being used by attaching
a debugger and removing the plan file right before the terraform apply
process is spawned and asserting that terraform apply fails because the
plan is not found.
2024-07-31 14:07:25 +00:00
Andrew Nester 1fb8e324d5
Added test for negation pattern in sync include exclude section (#1637)
## Changes
Added a test for negation patterns in the sync include/exclude section.
2024-07-31 13:42:23 +00:00
shreyas-goenka 89c0af5bdc
Add resource for UC schemas to DABs (#1413)
## Changes
This PR adds support for UC Schemas to DABs. This allows users to define
schemas for tables and other assets their pipelines/workflows create as
part of the DAB, thus managing their lifecycle in the DAB.

The first version has a couple of intentional limitations:
1. The owner of the schema will be the deployment user. Changing the
owner of the schema is not allowed (yet). `run_as` will not be
restricted for DABs containing UC schemas; we limit the scope of
`run_as` to the compute identity used, rather than to ownership of data
assets like UC schemas.
2. API fields that are present in the update API but not the create API
are not supported. For example, enabling predictive optimization is not
supported in the create schema API and thus is not available in DABs at
the moment.

## Tests
Manual and integration tests. Manually verified that the following work:
1. Development mode adds a "dev_" prefix.
2. Modified status is correctly computed in the `bundle summary`
command.
3. Grants for assigning privileges work as expected.
4. Variable interpolation works for the schema ID.
2024-07-31 12:16:28 +00:00
Cor 5afcc25d27
Add upgrade and upgrade eager flags to pip install call (#1636)
## Changes
Add the upgrade and upgrade-eager flags to the pip install call for
Databricks Labs projects. See [this
documentation](https://pip.pypa.io/en/stable/cli/pip_install/#cmdoption-U)
for more information about the flags.
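
A hedged sketch of what such a call can look like from Go; the assumption
here (based on the linked pip docs) is that the flags in question are
`--upgrade` and `--upgrade-strategy eager`, and the helper below is
illustrative rather than the CLI's actual labs installer:

```
package labs

import (
	"context"
	"os"
	"os/exec"
)

// installWithUpgrade runs pip so that already-installed dependencies are
// upgraded too, and transitive dependencies are upgraded eagerly.
// Whether the CLI passes exactly these flags is an assumption in this sketch.
func installWithUpgrade(ctx context.Context, pythonExec, requirementsFile string) error {
	cmd := exec.CommandContext(ctx, pythonExec, "-m", "pip", "install",
		"--upgrade",
		"--upgrade-strategy", "eager",
		"-r", requirementsFile)
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	return cmd.Run()
}
```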

Resolves #1634

## Tests
- [x] Manually
2024-07-31 09:35:06 +00:00
Alex Moschos ecba875fe5
Regenerate TF schema (#1635)
## Changes
- Regenerate the TF schema for the CLI. Due to an issue, the previous
generation missed some TF changes.
2024-07-30 10:13:05 +00:00
shreyas-goenka a52b188e99
Use dynamic walking to validate unique resource keys (#1614)
## Changes
This PR:
1. Uses dynamic walking (via the `dyn.MapByPattern` func) to validate that no
two resources have the same resource key (sketched after this list). This
allows us to remove this validation at merge time.
2. Modifies `dyn.Mapping` to always return a sorted slice of pairs. This
makes traversal functions like `dyn.Walk` or `dyn.MapByPattern`
deterministic.
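
A self-contained sketch of the duplicate-key check from point 1; it uses
plain maps instead of the real `dyn` package, so the shapes and helper below
are simplified stand-ins:

```
package validate

import "fmt"

// checkUniqueResourceKeys walks resources[<type>][<key>] (the shape a
// pattern like "resources.*.*" would visit) and errors if the same key is
// used by two different resource types, e.g. a job and a pipeline both
// named "foo".
func checkUniqueResourceKeys(resources map[string]map[string]any) error {
	seen := map[string]string{} // resource key -> resource type it was first seen under
	for typ, byKey := range resources {
		for key := range byKey {
			if prev, ok := seen[key]; ok {
				return fmt.Errorf("multiple resources with the same key %q: %s and %s", key, prev, typ)
			}
			seen[key] = typ
		}
	}
	return nil
}
```

Point 2 is what makes the real walk deterministic: with `dyn.Mapping`
returning a sorted slice of pairs, traversal functions like `dyn.Walk` and
`dyn.MapByPattern` visit keys in a stable order.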

## Tests
Unit tests. Also manually.
2024-07-29 13:04:02 +00:00
Andrew Nester 383d580917
Release v0.224.1 (#1627)
Bundles:
* Add UUID function to bundle template functions
([#1612](https://github.com/databricks/cli/pull/1612)).
* Upgrade TF provider to 1.49.0
([#1617](https://github.com/databricks/cli/pull/1617)).
* Upgrade TF provider to 1.49.1
([#1626](https://github.com/databricks/cli/pull/1626)).
* Support multiple locations for diagnostics
([#1610](https://github.com/databricks/cli/pull/1610)).
* Split artifact cleanup into prepare step before build
([#1618](https://github.com/databricks/cli/pull/1618)).
* Move to a single prompt during bundle destroy
([#1583](https://github.com/databricks/cli/pull/1583)).

Internal:
* Add tests for the Workspace API readahead cache
([#1605](https://github.com/databricks/cli/pull/1605)).
* Update Python dependencies before install when upgrading a labs
project ([#1624](https://github.com/databricks/cli/pull/1624)).
2024-07-26 09:33:36 +00:00
shreyas-goenka 37b9df96e6
Support multiple paths for diagnostics (#1616)
## Changes
Some diagnostics can have multiple paths associated with them. For
instance, ensuring that unique resource keys are used across all
resources. This PR extends `diag.Diagnostic` to accept multiple paths.

This PR is symmetrical to
https://github.com/databricks/cli/pull/1610/files

## Tests
Unit tests
2024-07-25 15:16:27 +00:00
Andrew Nester 90aaf2d20f
Upgrade TF provider to 1.49.1 (#1626)
## Changes
Upgrade TF provider to 1.49.1
2024-07-25 14:18:49 +00:00
Cor 9dbb58e821
Update Python dependencies before install when upgrading a labs project (#1624)
The install script might require up-to-date Python dependencies, as
explained in more detail in the referenced issue below.

Fixes #1623 

## Tests
! Need support with testing !
2024-07-25 08:51:37 +00:00
shreyas-goenka e6241e196f
Move to a single prompt during bundle destroy (#1583)
## Changes
Right now we ask users for two confirmations when destroying a bundle.
One to destroy the resources and one to delete the files. This PR
consolidates the two prompts into one.

## Tests
Manually

Destroying a bundle with no resources:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
All files and directories at the following location will be deleted: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Would you like to proceed? [y/n]: y
No resources to destroy
Updating deployment state...
Deleting files...
Destroy complete!
```

Destroying a bundle with no remote state:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
No active deployment found to destroy!
```

When a user cancels the destroy:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
The following resources will be deleted:
  delete job job_1
  delete job job_2
  delete pipeline foo

All files and directories at the following location will be deleted: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Would you like to proceed? [y/n]: n
Destroy cancelled!
```

When a user destroys resources:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
The following resources will be deleted:
  delete job job_1
  delete job job_2
  delete pipeline foo

All files and directories at the following location will be deleted: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Would you like to proceed? [y/n]: y
Updating deployment state...
Deleting files...
Destroy complete!
```
2024-07-24 13:02:19 +00:00
Andrew Nester 39fc86e83b
Split artifact cleanup into prepare step before build (#1618)
## Changes
The prepare stage, which does cleanup, is now executed once before every
build, so artifacts built into the same folder are correctly kept.

Fixes workaround 2 from this issue #1602

## Tests
Added unit test
2024-07-24 09:13:49 +00:00
shreyas-goenka 4bf88b4209
Support multiple locations for diagnostics (#1610)
## Changes
This PR changes `diag.Diagnostics` to allow including multiple locations
associated with the diagnostic message. The diagnostics that now return
multiple locations with this PR are:
1. Warning for unknown keys in config.
2. Use of `experimental.run_as`.
3. Accidental `sync.excludes` that exclude all files.
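
A rough, simplified stand-in for what carrying several locations on one
diagnostic can look like; this is not the actual `diag.Diagnostic`
definition:

```
package diag

import "fmt"

// Location points into a configuration file.
type Location struct {
	File   string
	Line   int
	Column int
}

// Diagnostic carries a message plus every location that triggered it, so a
// single warning can point at e.g. resources.yml and databricks.yml.
type Diagnostic struct {
	Summary   string
	Path      string     // config path, e.g. "experimental.use_legacy_run_as"
	Locations []Location // one entry per file/position involved
}

func (d Diagnostic) String() string {
	out := fmt.Sprintf("Warning: %s\n  at %s\n", d.Summary, d.Path)
	for i, l := range d.Locations {
		prefix := "  in "
		if i > 0 {
			prefix = "     " // align subsequent locations, as in the example output below
		}
		out += fmt.Sprintf("%s%s:%d:%d\n", prefix, l.File, l.Line, l.Column)
	}
	return out
}
```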

## Tests
Existing unit tests pass. New unit test case to assert on error message
when multiple locations are included.

Example output:
```
➜  bundle-playground-2 ~/cli2/cli/cli bundle validate              
Warning: You are using the legacy mode of run_as. The support for this mode is experimental and might be removed in a future release of the CLI. In order to run the DLT pipelines in your DAB as the run_as user this mode changes the owners of the pipelines to the run_as identity, which requires the user deploying the bundle to be a workspace admin, and also a Metastore admin if the pipeline target is in UC.
  at experimental.use_legacy_run_as
  in resources.yml:10:22
     databricks.yml:13:22

Name: fix run_if
Target: default
Workspace:
  User: shreyas.goenka@databricks.com
  Path: /Users/shreyas.goenka@databricks.com/.bundle/fix run_if/default

Found 1 warning
```
2024-07-23 17:20:11 +00:00
Pieter Noordhuis 52ca599cd5
Upgrade TF provider to 1.49.0 (#1617)
## Changes

This includes a fix for model serving endpoints.

See
https://github.com/databricks/terraform-provider-databricks/pull/3690.

## Tests

n/a
2024-07-23 16:15:02 +00:00
Arpit Jasapara 15ca7fe62d
Add UUID function to bundle template functions (#1612)
## Changes

Add support for google/uuid.New() to DAB templates.

This is needed to generate UUIDs in downstream templates like MLOps
Stacks.

## Tests

Unit tests.
2024-07-19 11:38:20 +00:00
Pieter Noordhuis 0448307b14
Add tests for the Workspace API readahead cache (#1605)
## Changes

Backfill unit tests for #1582.

## Tests

New tests pass.
2024-07-19 07:03:25 +00:00
Andrew Nester c8ce18ffa1
Release v0.224.0 (#1604)
CLI:
* Do not buffer files in memory when downloading
([#1599](https://github.com/databricks/cli/pull/1599)).

Bundles:
* Allow artifacts (JARs, wheels) to be uploaded to UC Volumes
([#1591](https://github.com/databricks/cli/pull/1591)).
* Upgrade TF provider to 1.48.3
([#1600](https://github.com/databricks/cli/pull/1600)).
* Fixed job name normalisation for bundle generate
([#1601](https://github.com/databricks/cli/pull/1601)).

Internal:
* Add UUID to uniquely identify a deployment state
([#1595](https://github.com/databricks/cli/pull/1595)).
* Track multiple locations associated with a `dyn.Value`
([#1510](https://github.com/databricks/cli/pull/1510)).
* Attribute Terraform API requests to the CLI
([#1598](https://github.com/databricks/cli/pull/1598)).
* Implement readahead cache for Workspace API calls
([#1582](https://github.com/databricks/cli/pull/1582)).
* Use local Terraform state only when lineage matches
([#1588](https://github.com/databricks/cli/pull/1588)).
* Add read-only mode for extension aware workspace filer
([#1609](https://github.com/databricks/cli/pull/1609)).


Dependency updates:
* Bump github.com/databricks/databricks-sdk-go from 0.43.0 to 0.43.2
([#1594](https://github.com/databricks/cli/pull/1594)).
2024-07-18 15:07:43 +00:00
Pieter Noordhuis 2aeea5e384
Remove unused package bundle/deployer (#1607)
## Changes

This has been superseded by individual mutators under
`bundle/deploy/terraform`.

## Tests

n/a
2024-07-18 14:57:31 +00:00
Pieter Noordhuis 6953a5d5af
Add read-only mode for extension aware workspace filer (#1609)
## Changes

By default, construct a read/write instance. If constructed in read-only
mode, the underlying filer is wrapped in a readahead cache.

## Tests

* Filer integration tests pass.
* Manual test that caching is enabled when running on WSFS.
2024-07-18 14:17:42 +00:00
shreyas-goenka 5b65358146
Use local Terraform state only when lineage matches (#1588)
## Changes
DAB deployments should be isolated if `root_path` or the workspace host
differ. This PR fixes a bug where the local terraform state got reused
when the same cwd was used to deploy two isolated deployments of the
same bundle target. This can happen if:
1. A user switches to a different identity on the same machine.
2. The workspace host URL the bundle/target points to is changed.
3. A user changes the `root_path` while doing bundle development.

To solve this problem we rely on the lineage field available in the
terraform state, which is a UUID identifying a unique terraform
deployment. There's a 1:1 mapping between a terraform deployment and a
bundle deployment.

For more details on how lineage works in terraform, see:
https://developer.hashicorp.com/terraform/language/state/backends#manual-state-pull-push
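
A self-contained sketch of the lineage comparison; the state-file handling
is simplified and the helper below is not the CLI's actual implementation:

```
package statemgr

import (
	"encoding/json"
	"os"
)

// stateLineage is the subset of a terraform state file we care about:
// "lineage" is a UUID assigned when the state is first created.
type stateLineage struct {
	Lineage string `json:"lineage"`
}

// canUseLocalState reports whether the cached local state file belongs to
// the same terraform deployment as the remote state, by comparing lineage.
// If the lineage differs (different identity, host, or root_path), the
// local state must be discarded and the remote state pulled instead.
func canUseLocalState(localPath string, remote []byte) (bool, error) {
	local, err := os.ReadFile(localPath)
	if err != nil {
		return false, err
	}
	var l, r stateLineage
	if err := json.Unmarshal(local, &l); err != nil {
		return false, err
	}
	if err := json.Unmarshal(remote, &r); err != nil {
		return false, err
	}
	return l.Lineage != "" && l.Lineage == r.Lineage, nil
}
```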

## Tests
Manually verified that changing the identity no longer results in the
incorrect terraform state being used. Also, new unit tests are added.
2024-07-18 09:47:59 +00:00
Pieter Noordhuis af0114a5a6
Implement readahead cache for Workspace API calls (#1582)
## Changes

The reason this readahead cache exists is that we frequently need to
recursively find all files in the bundle root directory, especially for
sync include and exclude processing. By caching the response for every
file/directory and frontloading the latency cost of these calls, we
significantly improve performance and eliminate redundant operations.
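
A minimal sketch of the readahead idea: wrap directory listing in a cache
and walk the tree once up front so later lookups are cache hits. The
interface and names are simplified stand-ins for the CLI's filer types:

```
package readahead

import (
	"context"
	"io/fs"
	"path"
	"sync"
)

// lister is the single operation we cache; the real filer has more methods.
type lister interface {
	ReadDir(ctx context.Context, name string) ([]fs.DirEntry, error)
}

// cachingLister memoizes ReadDir results, frontloading the latency cost of
// the Workspace API calls.
type cachingLister struct {
	underlying lister
	mu         sync.Mutex
	cache      map[string][]fs.DirEntry
}

func newCachingLister(underlying lister) *cachingLister {
	return &cachingLister{underlying: underlying, cache: map[string][]fs.DirEntry{}}
}

func (c *cachingLister) ReadDir(ctx context.Context, name string) ([]fs.DirEntry, error) {
	c.mu.Lock()
	if entries, ok := c.cache[name]; ok {
		c.mu.Unlock()
		return entries, nil
	}
	c.mu.Unlock()

	entries, err := c.underlying.ReadDir(ctx, name)
	if err != nil {
		return nil, err
	}
	c.mu.Lock()
	c.cache[name] = entries
	c.mu.Unlock()
	return entries, nil
}

// warm recursively lists everything under root so later lookups during
// include/exclude processing never hit the API again.
func (c *cachingLister) warm(ctx context.Context, root string) error {
	entries, err := c.ReadDir(ctx, root)
	if err != nil {
		return err
	}
	for _, e := range entries {
		if e.IsDir() {
			if err := c.warm(ctx, path.Join(root, e.Name())); err != nil {
				return err
			}
		}
	}
	return nil
}
```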

## Tests

* [ ] Working on unit tests
2024-07-18 09:45:10 +00:00
shreyas-goenka c6c2692368
Attribute Terraform API requests to the CLI (#1598)
## Changes
This PR adds `cli` to the user agent sent downstream to the Databricks
Terraform provider when invoked via DABs.
 
## Tests
Unit tests. Based on the comment here
(10fe02075f/bundle/config/mutator/verify_cli_version_test.go (L113))
we don't need to set the version to make the test assertion work
correctly. This is likely because we use `go test` to run the tests
while the CLI is compiled and the version is set via `goreleaser`.
2024-07-18 09:38:09 +00:00
Andrew Nester 6d710a411a
Fixed job name normalisation for bundle generate (#1601)
## Changes
Fixes #1537 

## Tests
Added unit test
2024-07-17 12:33:49 +00:00
Pieter Noordhuis e1474a38f9
Upgrade TF provider to 1.48.3 (#1600)
## Changes

This includes a fix for using periodic triggers.

## Tests

Manually confirmed this works with
https://github.com/databricks/bundle-examples/pull/32.
2024-07-17 08:49:19 +00:00
Renaud Hartert 235973e7b1
[Fix] Do not buffer files in memory when downloading (#1599)
## Changes

This PR fixes a performance bug that led downloaded files (e.g. with
`databricks fs cp dbfs:/Volumes/.../somefile .`) to be buffered in
memory before being written.

Results from profiling the download of a ~100MB file:

Before:
```
Type: alloc_space
Showing nodes accounting for 374.02MB, 98.50% of 379.74MB total
```

After:
```
Type: alloc_space
Showing nodes accounting for 3748.67kB, 100% of 3748.67kB total
```

Note that this fix is temporary. A longer term solution should be to use
the API provided by the Go SDK rather than making an HTTP request
directly from the CLI.
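
The essence of the fix is streaming the response body straight to disk
rather than accumulating it in memory. A generic stdlib sketch of that
pattern (not the CLI's actual code path):

```
package download

import (
	"fmt"
	"io"
	"net/http"
	"os"
)

// downloadToFile streams the response body to dst with io.Copy, so memory
// usage stays roughly constant regardless of file size (no in-memory
// buffering of the whole payload).
func downloadToFile(url, dst string) error {
	resp, err := http.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("unexpected status: %s", resp.Status)
	}

	f, err := os.Create(dst)
	if err != nil {
		return err
	}
	defer f.Close()

	// io.Copy uses a small fixed-size buffer internally.
	_, err = io.Copy(f, resp.Body)
	return err
}
```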

fix #1575 

## Tests

Verified that the CLI properly downloads the file when doing the
profiling.
2024-07-17 07:14:02 +00:00
dependabot[bot] 10fe02075f
Bump github.com/databricks/databricks-sdk-go from 0.43.0 to 0.43.2 (#1594)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.43.0 to 0.43.2.
Release notes (sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases)):

**v0.43.2**

Internal Changes:
* Enforce Tag on PRs ([#969](https://redirect.github.com/databricks/databricks-sdk-go/pull/969)).
* Generate SDK for `apierr` changes ([#970](https://redirect.github.com/databricks/databricks-sdk-go/pull/970)).
* Add Release tag and Workflow Fix ([#972](https://redirect.github.com/databricks/databricks-sdk-go/pull/972)).

**v0.43.1**

Major Changes and Improvements:
* Add a credentials provider for Github Azure OIDC ([#965](https://redirect.github.com/databricks/databricks-sdk-go/pull/965)).
* Add DataPlane API Support ([#936](https://redirect.github.com/databricks/databricks-sdk-go/pull/936)).
* Added more error messages for retriable errors (timeouts, etc.) ([#963](https://redirect.github.com/databricks/databricks-sdk-go/pull/963)).

Internal Changes:
* Add ChangelogConfig to Generator struct ([#967](https://redirect.github.com/databricks/databricks-sdk-go/pull/967)).
* Improve Changelog by grouping changes ([#962](https://redirect.github.com/databricks/databricks-sdk-go/pull/962)).
* Parse API Error messages with `int` error codes ([#960](https://redirect.github.com/databricks/databricks-sdk-go/pull/960)).
Commits:
* f0825efd08: [Release] v0.43.2 ([#971](https://redirect.github.com/databricks/databricks-sdk-go/issues/971))
* f111a28962: [Internal] Add Release tag and Workflow Fix ([#972](https://redirect.github.com/databricks/databricks-sdk-go/issues/972))
* f20ef58444: Generate SDK for `apierr` changes ([#970](https://redirect.github.com/databricks/databricks-sdk-go/issues/970))
* a6222c809f: [Internal] Enforce Tag on PRs ([#969](https://redirect.github.com/databricks/databricks-sdk-go/issues/969))
* 82b07c84a7: Release v0.43.1 ([#968](https://redirect.github.com/databricks/databricks-sdk-go/issues/968))
* c67dc8abba: Added more error messages for retriable errors (timeouts, etc.) ([#963](https://redirect.github.com/databricks/databricks-sdk-go/issues/963))
* 951c091b5b: [Internal] Improve Changelog by grouping changes ([#962](https://redirect.github.com/databricks/databricks-sdk-go/issues/962))
* f54345a3fc: Add ChangelogConfig to Generator struct ([#967](https://redirect.github.com/databricks/databricks-sdk-go/issues/967))
* df99404994: Add DataPlane API Support ([#936](https://redirect.github.com/databricks/databricks-sdk-go/issues/936))
* 78b367a02e: Add a credentials provider for Github Azure OIDC ([#965](https://redirect.github.com/databricks/databricks-sdk-go/issues/965))
* Additional commits viewable in the [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.43.0...v0.43.2)

Most Recent Ignore Conditions Applied to This Pull Request:

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |




Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-16 11:40:12 +00:00
shreyas-goenka 8ed9964482
Track multiple locations associated with a `dyn.Value` (#1510)
## Changes
This PR changes the location metadata associated with a `dyn.Value` to a
slice of locations. This will allow us to keep track of location
metadata across merges and overrides.

The convention is to treat the first location in the slice as the
primary location. The merge semantics are the same as before when there is
only one location associated with a value, that is:
1. For complex values (maps, sequences) the location of v1 is
primary in Merge(v1, v2).
2. For primitive values the location of v2 is primary in Merge(v1, v2),
as sketched below.
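
A simplified stand-in illustrating the location-ordering convention above;
the data merge itself is elided, and the real `dyn.Value` and merge code are
more involved:

```
package dynsketch

// Location identifies where a value was defined.
type Location struct {
	File string
	Line int
}

// value mimics a dyn.Value carrying a slice of locations; index 0 is primary.
type value struct {
	data      any
	locations []Location
}

func (v value) isComplex() bool {
	switch v.data.(type) {
	case map[string]any, []any:
		return true
	}
	return false
}

// merge illustrates only the location ordering: for complex values v1's
// location stays primary, for primitives v2's does. The actual merging of
// data is elided (represented here by simply taking v2's data).
func merge(v1, v2 value) value {
	out := value{data: v2.data}
	if v1.isComplex() && v2.isComplex() {
		out.locations = append(append([]Location{}, v1.locations...), v2.locations...)
	} else {
		out.locations = append(append([]Location{}, v2.locations...), v1.locations...)
	}
	return out
}
```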

## Tests
Modifying existing merge unit tests. Other existing unit tests and
integration tests pass.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-07-16 11:27:27 +00:00
shreyas-goenka 39c2633773
Add UUID to uniquely identify a deployment state (#1595)
## Changes
We need a mechanism to invalidate the locally cached deployment state if
a user uses the same working directory to deploy to multiple distinct
deployments (separate targets, root_paths or even hosts).

This PR just adds the UUID to the deployment state in preparation for
invalidating this cache. The actual invalidation will follow at a
later date (tracked in the internal backlog).
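
A minimal sketch of stamping a state file with a UUID via google/uuid; the
struct fields below are illustrative and not the actual deployment state
schema:

```
package deploystate

import (
	"encoding/json"
	"os"

	"github.com/google/uuid"
)

// DeploymentState is an illustrative subset of a locally cached state file.
type DeploymentState struct {
	// ID uniquely identifies this deployment; a mismatch with the remote
	// state is what could later be used to invalidate the local cache.
	ID      uuid.UUID `json:"id"`
	Seq     int64     `json:"seq"`
	Version int       `json:"version"`
}

// newState assigns a fresh UUID the first time a state file is written.
func newState() DeploymentState {
	return DeploymentState{ID: uuid.New(), Version: 1}
}

func writeState(path string, s DeploymentState) error {
	b, err := json.MarshalIndent(s, "", "  ")
	if err != nil {
		return err
	}
	return os.WriteFile(path, b, 0o600)
}
```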

## Tests
Unit test. Manually checked the deployment state is actually being
written.
2024-07-16 10:01:58 +00:00
Andrew Nester 434bcbb018
Allow artifacts (JARs, wheels) to be uploaded to UC Volumes (#1591)
## Changes
This change allows specifying a UC Volumes path as the artifact path so
that all artifacts (JARs, wheels) are uploaded to UC Volumes.

Example configuration is here:
```
bundle:
  name: jar-bundle

workspace:
  host: https://foo.com
  artifact_path: /Volumes/main/default/foobar

artifacts:
  my_java_code:
    path: ./sample-java
    build: "javac PrintArgs.java && jar cvfm PrintArgs.jar META-INF/MANIFEST.MF PrintArgs.class"
    files:
      - source: ./sample-java/PrintArgs.jar

resources:
  jobs:
    jar_job:
      name: "Test Spark Jar Job"
      tasks:
        - task_key: TestSparkJarTask
          new_cluster:
            num_workers: 1
            spark_version: "14.3.x-scala2.12"
            node_type_id: "i3.xlarge"
          spark_jar_task:
            main_class_name: PrintArgs
          libraries:
            - jar: ./sample-java/PrintArgs.jar
```
## Tests
Manually + added E2E test for Java jobs

The E2E test is temporarily skipped until auth-related issues for UC
tests are resolved.
2024-07-16 08:57:04 +00:00
Andrew Nester 61cb0f2695
Release v0.223.2 (#1587)
Bundles:
* Override complex variables with target overrides instead of merging
([#1567](https://github.com/databricks/cli/pull/1567)).
* Rewrite local path for libraries in foreach tasks
([#1569](https://github.com/databricks/cli/pull/1569)).
* Change SetVariables mutator to mutate dynamic configuration instead
([#1573](https://github.com/databricks/cli/pull/1573)).
* Return early in bundle destroy if no deployment exists
([#1581](https://github.com/databricks/cli/pull/1581)).
* Let notebook detection code use underlying metadata if available
([#1574](https://github.com/databricks/cli/pull/1574)).
* Remove schema override for variable default value
([#1536](https://github.com/databricks/cli/pull/1536)).
* Print diagnostics in 'bundle deploy'
([#1579](https://github.com/databricks/cli/pull/1579)).

Internal:
* Update actions/upload-artifact to v4
([#1559](https://github.com/databricks/cli/pull/1559)).
* Use Go 1.22 to build and test
([#1562](https://github.com/databricks/cli/pull/1562)).
* Move bespoke status call to main workspace files filer
([#1570](https://github.com/databricks/cli/pull/1570)).
* Add new template
([#1578](https://github.com/databricks/cli/pull/1578)).
* Add regression tests for CLI error output
([#1566](https://github.com/databricks/cli/pull/1566)).

Dependency updates:
* Bump golang.org/x/mod from 0.18.0 to 0.19.0
([#1576](https://github.com/databricks/cli/pull/1576)).
* Bump golang.org/x/term from 0.21.0 to 0.22.0
([#1577](https://github.com/databricks/cli/pull/1577)).
2024-07-10 12:04:59 +00:00
Gleb Kanterov af975ca64b
Print diagnostics in 'bundle deploy' (#1579)
## Changes
Print diagnostics in 'bundle deploy' similar to 'bundle validate'. This
way if a bundle has any errors or warnings, they are going to be easy to
notice.

NB: due to how we render errors, there is one extra trailing newline in
the output, preserved in the examples below.

## Example: No errors or warnings

```
% databricks bundle deploy
Building default...
Deploying resources...
Updating deployment state...
Deployment complete!
```

## Example: Error on load

```
% databricks bundle deploy
Error: Databricks CLI version constraint not satisfied. Required: >= 1337.0.0, current: 0.0.0-dev

```

## Example: Warning on load

```
% databricks bundle deploy
Building default...
Deploying resources...
Updating deployment state...
Deployment complete!
Warning: unknown field: foo
  in databricks.yml:6:1

```

## Example: Error + warning on load

```
% databricks bundle deploy
Warning: unknown field: foo
  in databricks.yml:6:1

Error: something went wrong

```

## Example: Warning on load + error in init

```
% databricks bundle deploy
Warning: unknown field: foo
  in databricks.yml:6:1

Error: Failed to xxx
  in yyy.yml

Detailed explanation
in multiple lines

```

## Tests
Tested manually
2024-07-10 11:14:57 +00:00
shreyas-goenka 1da04a4318
Remove schema override for variable default value (#1536)
## Changes

This PR:
1. Removes the custom schema override added in
https://github.com/databricks/cli/pull/1396/files for
`variables.*.default`. It's no longer needed because with complex
variables (https://github.com/databricks/cli/pull/1467) `default` has a
type of `any`.
2. Retains and extends the override on `targets.*.variables.*`. Target

## Tests
Manually

Before:
Only primitive types were allowed.

<img width="436" alt="Screenshot 2024-06-27 at 3 58 34 PM"
src="https://github.com/databricks/cli/assets/88374338/de55be5c-9236-4ccc-b529-97ce7201b8e8">

After:
An empty JSON schema is generated. All YAML values are acceptable.

<img width="453" alt="Screenshot 2024-06-27 at 3 57 15 PM"
src="https://github.com/databricks/cli/assets/88374338/7534b770-563f-4efc-b9c6-0af526dcc705">
2024-07-10 06:57:27 +00:00
Gleb Kanterov 25737bbb5d
Add regression tests for CLI error output (#1566)
## Changes
Add regression tests for https://github.com/databricks/cli/issues/1563

We test 2 code paths:
- if there is an error, we can print to stderr
- if there is a valid output, we can print to stdout

We should also consider adding black-box tests that will run the CLI
binary as a black box and inspect its output to stderr/stdout.

## Tests
Unit tests
2024-07-10 06:38:06 +00:00
Pieter Noordhuis 8f56ca39a2
Let notebook detection code use underlying metadata if available (#1574)
## Changes

If we're using a `vfs.Path` backed by a workspace filesystem filer, we
have access to the `workspace.ObjectInfo` value for every file. By
providing access to this value we can use it directly and avoid reading
the first line of the underlying file.

A follow-up change will implement the interface defined in this change
for the workspace filesystem filer.
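
A self-contained sketch of the detection strategy: trust object metadata
when the file info exposes it, otherwise fall back to sniffing the first
line. The interface below is a simplified stand-in, not the real
`workspace.ObjectInfo` plumbing:

```
package notebook

import (
	"bufio"
	"io/fs"
	"strings"
)

// objectTyped is implemented by FileInfo values that already know, from
// Workspace API metadata, whether the underlying object is a notebook.
type objectTyped interface {
	IsNotebook() bool
}

// detect returns true if name is a notebook. If the fs.FileInfo exposes
// workspace metadata we use it directly; otherwise we read the first line
// and look for the "Databricks notebook source" header.
func detect(fsys fs.FS, name string) (bool, error) {
	info, err := fs.Stat(fsys, name)
	if err != nil {
		return false, err
	}
	if ot, ok := info.(objectTyped); ok {
		return ot.IsNotebook(), nil
	}

	f, err := fsys.Open(name)
	if err != nil {
		return false, err
	}
	defer f.Close()
	scanner := bufio.NewScanner(f)
	if !scanner.Scan() {
		return false, scanner.Err()
	}
	return strings.Contains(scanner.Text(), "Databricks notebook source"), nil
}
```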

## Tests

Unit tests.
2024-07-10 06:37:47 +00:00
shreyas-goenka 5bc5c3c26a
Return early in bundle destroy if no deployment exists (#1581)
## Changes
This PR:
1. Moves the if mutator to the bundle package, to live with all-time
greats such as `bundle.Seq` and `bundle.Defer`. Also adds unit tests.
2. `bundle destroy` now returns early if `root_path` does not exist. We
do this by leveraging a `bundle.If` condition, as sketched below.
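
A simplified stand-in showing the shape of such an `If` combinator; the real
`bundle.If` and mutator interfaces differ in their details:

```
package bundlesketch

import "context"

// Bundle and Mutator are pared-down stand-ins for the real types.
type Bundle struct {
	RootPathExists bool
}

type Mutator interface {
	Apply(ctx context.Context, b *Bundle) error
}

// ifMutator runs "then" only when the condition holds, so a destroy can
// return early (run nothing) if there is no deployment to destroy.
type ifMutator struct {
	condition func(*Bundle) bool
	then      Mutator
}

func If(condition func(*Bundle) bool, then Mutator) Mutator {
	return &ifMutator{condition: condition, then: then}
}

func (m *ifMutator) Apply(ctx context.Context, b *Bundle) error {
	if !m.condition(b) {
		return nil
	}
	return m.then.Apply(ctx, b)
}
```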

## Tests
Unit tests and manually.

Here's an example of what it'll look like once the bundle is destroyed.

```
➜  bundle-playground git:(master) ✗ cli bundle destroy
No active deployment found to destroy!
```

I would have added some e2e coverage for this as well, but the
`cobraTestRunner.Run()` method does not seem to return stdout/stderr
logs correctly. We can probably punt looking into it.
2024-07-09 15:08:38 +00:00
Andrew Nester 8b468b423f
Change SetVariables mutator to mutate dynamic configuration instead (#1573)
## Changes
Previously the `SetVariables` mutator mutated the typed configuration by
using `v.Set` for variables. This led to the variables' `value` field not
having location information.

By using dynamic configuration mutation, we keep the same functionality
but also preserve location information for the value when it's set from
the default.

Fixes #1568 #1538

## Tests
Added unit tests
2024-07-09 11:12:42 +00:00
dependabot[bot] 4d13c7fbe3
Bump golang.org/x/term from 0.21.0 to 0.22.0 (#1577)
Bumps [golang.org/x/term](https://github.com/golang/term) from 0.21.0 to
0.22.0.
Commits:
* c976cb1d70: go.mod: update golang.org/x dependencies
* See full diff in the [compare view](https://github.com/golang/term/compare/v0.21.0...v0.22.0)




Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 09:01:30 +00:00
dependabot[bot] 056e2af743
Bump golang.org/x/mod from 0.18.0 to 0.19.0 (#1576)
Bumps [golang.org/x/mod](https://github.com/golang/mod) from 0.18.0 to
0.19.0.
Commits:
* d58be1cb16: sumdb/tlog: set the hash of the empty tree according to RFC 6962
* 232e49f555: Revert "module: add COM0 and LPT0 to badWindowsNames"
* See full diff in the [compare view](https://github.com/golang/mod/compare/v0.18.0...v0.19.0)




Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 09:01:11 +00:00
Gleb Kanterov d30c4c730d
Add new template (#1578)
## Changes
Add a new hidden experimental template 

## Tests
Tested manually
2024-07-08 13:32:56 +00:00
Pieter Noordhuis 869576e144
Move bespoke status call to main workspace files filer (#1570)
## Changes

This consolidates the two separate status calls into one.

The extension-aware filer now doesn't need the direct API client anymore
and fully relies on the underlying filer.

## Tests

* Unit tests.
* Ran the filer integration tests manually.
2024-07-05 11:32:29 +00:00
Andrew Nester 3d8446bbdb
Rewrite local path for libraries in foreach tasks (#1569)
## Changes
Local library paths in the `libraries` section of for-each tasks are now
correctly replaced with the remote path for the library when it's
uploaded to Databricks.

## Tests
Added unit test
2024-07-05 10:58:28 +00:00
Andrew Nester 040b374430
Override complex variables with target overrides instead of merging (#1567)
## Changes
At the moment we merge the values of complex variables, while the
expected behaviour is to override the value with the target one.

## Tests
Added unit test
2024-07-04 11:57:29 +00:00
Pieter Noordhuis 8c3be30093
Use different Go cache key for goreleaser jobs (#1558)
## Changes

The goreleaser jobs perform a cross-platform build of the main binary
without test files. It should use a different cache than the jobs that
run tests for a single platform.

This change also updates the `release-snapshot` job to use the latest
goreleaser action, as was done in #1477.

## Tests

Ran `release-snapshot` job from this PR.
2024-07-04 11:39:55 +00:00
Pieter Noordhuis 80136dea5f
Use Go 1.22 to build and test (#1562)
## Changes

This has been released for a while. Blog post:
https://go.dev/blog/go1.22.

## Tests

None besides the unit tests.
2024-07-04 06:54:41 +00:00
Pieter Noordhuis 324fa2e18b
Update actions/upload-artifact to v4 (#1559)
## Changes

This addresses a deprecation warning in our GHA output.

Full release notes of v4 at
https://github.com/actions/upload-artifact/releases/tag/v4.0.0
2024-07-04 06:54:30 +00:00
Pieter Noordhuis bf275428b6
Release v0.223.1 (#1565)
This bugfix release fixes missing error messages in v0.223.0.

CLI:
* Fix logic error in
[#1532](https://github.com/databricks/cli/pull/1532)
([#1564](https://github.com/databricks/cli/pull/1564)).
2024-07-03 16:41:55 +00:00
Pieter Noordhuis 7d2aa35738
Fix logic error in #1532 (#1564)
## Changes

This snuck into #1532 right before merging. The result is that error
output is no longer logged. This includes actual execution errors as
well as help output if arguments or flags are incorrectly specified.

We don't have test coverage for the `root.Execute` function. This is to
be fixed later.

## Tests

Manually confirmed we observe error output again.
2024-07-03 16:23:19 +00:00
Pieter Noordhuis 2a73d7788b
Release v0.223.0 (#1557)
Bundles:

As of this release you can interact with bundles when running the CLI on
DBR (e.g. via the Web Terminal).

* Fix non-default project names not working in dbt-sql template
([#1500](https://github.com/databricks/cli/pull/1500)).
* Improve `bundle validate` output
([#1532](https://github.com/databricks/cli/pull/1532)).
* Fixed resolving variable references inside slice variable
([#1550](https://github.com/databricks/cli/pull/1550)).
* Fixed bundle not loading when empty variable is defined
([#1552](https://github.com/databricks/cli/pull/1552)).
* Use `vfs.Path` for filesystem interaction
([#1554](https://github.com/databricks/cli/pull/1554)).
* Replace `vfs.Path` with extension-aware filer when running on DBR
([#1556](https://github.com/databricks/cli/pull/1556)).

Internal:
* merge.Override: Fix handling of dyn.NilValue
([#1530](https://github.com/databricks/cli/pull/1530)).
* Compare `.Kind()` instead of direct equality checks on a `dyn.Value`
([#1520](https://github.com/databricks/cli/pull/1520)).
* PythonMutator: register product in user agent extra
([#1533](https://github.com/databricks/cli/pull/1533)).
* Ignore `dyn.NilValue` when traversing value from `dyn.Map`
([#1547](https://github.com/databricks/cli/pull/1547)).
* Add extra tests for the sync block
([#1548](https://github.com/databricks/cli/pull/1548)).
* PythonMutator: add diagnostics
([#1531](https://github.com/databricks/cli/pull/1531)).
* PythonMutator: support omitempty in PyDABs
([#1513](https://github.com/databricks/cli/pull/1513)).
* PythonMutator: allow insert 'resources' and 'resources.jobs'
([#1555](https://github.com/databricks/cli/pull/1555)).
2024-07-03 12:24:42 +00:00
Pieter Noordhuis f14dded946
Replace `vfs.Path` with extension-aware filer when running on DBR (#1556)
## Changes

The FUSE mount of the workspace file system on DBR doesn't include file
extensions for notebooks. When these notebooks are checked into a
repository, they do have an extension. PR #1457 added a filer type that
is aware of this disparity and makes these notebooks show up as if they
do have these extensions.

This change swaps out the native `vfs.Path` with one that uses this
filer when running on DBR.

Follow-up: consolidate the interfaces exported by `filer.Filer` and
`vfs.Path`.

## Tests

* Unit tests pass
* (Manually ran a snapshot build on DBR against a bundle with notebooks)

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-07-03 11:55:42 +00:00