Commit Graph

489 Commits

Author SHA1 Message Date
shreyas-goenka 242d4b51ed
Report all empty resources present in error diagnostic (#1685)
## Changes
This PR addresses post-merge feedback from
https://github.com/databricks/cli/pull/1673.

## Tests
Unit tests, and manually.
```
Error: experiment undefined-experiment is not defined
  at resources.experiments.undefined-experiment
  in databricks.yml:11:26

Error: job undefined-job is not defined
  at resources.jobs.undefined-job
  in databricks.yml:6:19

Error: pipeline undefined-pipeline is not defined
  at resources.pipelines.undefined-pipeline
  in databricks.yml:14:24

Name: undefined-job
Target: default

Found 3 errors
```
2024-08-20 00:22:00 +00:00
Lennart Kats (databricks) 78d0ac5c6a
Add configurable presets for name prefixes, tags, etc. (#1490)
## Changes

This adds configurable transformations based on the transformations
currently seen in `mode: development`.

Example databricks.yml showcasing some of these transformations:

```
bundle:
  name: my_bundle

targets:
  dev:
    presets:
      prefix: "myprefix_"          # prefix all resource names with myprefix_
      pipelines_development: true  # set development to true by default for pipelines
      trigger_pause_status: PAUSED # set pause_status to PAUSED by default for all triggers and schedules
      jobs_max_concurrent_runs: 10 # set max_concurrent_runs to 10 by default for all jobs
      tags:
        dev: true
```

## Tests

* Existing process_target_mode tests that were adapted to use this new
code
* Unit tests specific for the new mutator
* Unit tests for config loading and merging
* Manual e2e testing
2024-08-19 18:18:50 +00:00
Lennart Kats (databricks) 07627023f5
Pause continuous pipelines when 'mode: development' is used (#1590)
## Changes

This makes it so that the pipeline `continuous` property is set to
false by default when using `mode: development`.
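
For example (a sketch; names are illustrative), a pipeline that doesn't
set `continuous` explicitly is deployed with `continuous: false` through
a development target:

```
targets:
  dev:
    mode: development

resources:
  pipelines:
    my_pipeline:
      name: my_pipeline
      # 'continuous' is not set, so development mode defaults it to false
```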
2024-08-19 16:27:57 +00:00
Pieter Noordhuis 2b8cbc31cf
Pass through paths argument to libs/sync (#1689)
## Changes

Requires #1684. 

## Tests

Ran the sync integration tests.
2024-08-19 15:41:02 +00:00
Pieter Noordhuis 7de7583b37
Make fileset take optional list of paths to list (#1684)
## Changes

Before this change, the fileset library would take a single root path
and list all files in it. To support an allowlist of paths to list (much
like a Git `pathspec` without patterns; see [pathspec]), this
change introduces an optional argument to `fileset.New` where the caller
can specify paths to list. If not specified, this argument defaults to
list `.` (i.e. list all files in the root).

The motivation for this change is that we wish to expose this pattern in
bundles. Users should be able to specify which paths to synchronize
instead of always only synchronizing the bundle root directory.

[pathspec]:
https://git-scm.com/docs/gitglossary#Documentation/gitglossary.txt-aiddefpathspecapathspec
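
As a sketch of how this could eventually surface in bundle
configuration (hypothetical here; this PR only changes the fileset
library), an allowlist of paths to synchronize might look like:

```
sync:
  paths:
    - ./src
    - ./resources
```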

## Tests

New and existing unit tests.
2024-08-19 15:15:14 +00:00
Gleb Kanterov ab4e8099fb
Add `import` option for PyDABs (#1693)
## Changes
Add 'import' option for PyDABs
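
A sketch of how the new option might be configured alongside the
existing `experimental.pydabs` settings (the exact shape of `import` is
an assumption here):

```
experimental:
  pydabs:
    enabled: true
    venv_path: .venv
    import:
      - my_project.resources   # Python modules for PyDABs to import
```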

## Tests
Manually
2024-08-19 13:24:56 +00:00
Andrew Nester 54799a1918
Upgrade Go SDK to 0.44.0 (#1679)
## Changes
Upgrade Go SDK to 0.44.0

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-08-15 13:23:07 +00:00
Pieter Noordhuis 6b3d33a846
Upgrade TF provider to 1.50.0 (#1681)
## Changes

See
https://github.com/databricks/terraform-provider-databricks/pull/3900

## Tests

* Manually test on a bundle with a pipeline and a schema
* Integration tests pass
2024-08-15 12:43:39 +00:00
Renaud Hartert 7aaaee2512
[Internal] Remove dependency to the `openapi` package of the Go SDK (#1676)
## Changes

This PR removes the dependency to the `databricks-sdk-go/openapi`
package by copying the struct and functions that are needed in a new
`schema/spec.go` file.

The reason to remove this dependency is that it is being deprecated.
Copying the code in the `cli` repo seems reasonable given that it only
uses a couple of very small structs.

## Tests

Verified that CLI code can be properly generated after this change.
2024-08-14 15:59:55 +00:00
Andrew Nester 48ff18e5fc
Upload local libraries even if they don't have artifact defined (#1664)
## Changes
Previously, for every library referenced in the configuration, DABs made
sure there was a corresponding artifact section. This is neither
necessary nor flexible, because local libraries might be built outside
of the DABs context. It also created hard-to-follow logic that
back-referenced libraries to artifacts.


This PR does 3 things:
1. Allows all local libraries referenced in DABs config to be uploaded
to remote (see the sketch below)
2. Simplifies the upload and glob reference expansion logic by doing it
in a single place
3. Speeds things up by uploading each library only once, in parallel
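
A sketch of a configuration this unlocks (paths and names are
illustrative): a prebuilt wheel referenced directly, with no `artifacts`
section.

```
resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          python_wheel_task:
            package_name: my_package
            entry_point: main
          libraries:
            - whl: ./dist/my_package-*.whl   # uploaded even without a matching artifact entry
```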

## Tests
Added unit + integration tests + made sure that change is backward
compatible (no changes in existing tests)

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-08-14 09:03:44 +00:00
shreyas-goenka 7ae80de351
Stop tracking file path locations in bundle resources (#1673)
## Changes
Since locations are already tracked in the dynamic value tree, we no
longer need to track them at the resource/artifact level. This PR:
1. Removes use of `paths.Paths`. Uses dyn.Location instead.
2. Refactors the validation of resources not being empty valued to be
generic across all resource types.
  
## Tests
Existing unit tests.
2024-08-13 12:50:15 +00:00
shreyas-goenka 1b984b4f62
Skip pushing Terraform state after destroy (#1667)
## Changes
Following up
https://github.com/databricks/cli/pull/1583#discussion_r1681126323.

We can skip the push because it happens right after `root_path` is
deleted, making it effectively a no-op.
 
## Tests
2024-08-12 09:19:54 +00:00
Pieter Noordhuis d3d828d175
Fix glob expansion after running a generic build command (#1662)
## Changes

This didn't work as expected because the generic build mutator called
into the type-specific build mutator in the middle of the function. This
invalidated the `config.Artifact` pointer that was being mutated later
on, effectively hiding these mutations from its caller.

To fix this, I turned glob expansion into its own mutator. It now works
as expected, _and_ produces better errors if the glob patterns are
invalid or do not match files.

## Tests

Unit tests.

Manual verification:
```
% databricks bundle deploy
Building sbt_example...

Error: target/scala-2.12/sbt-e[xam22ple*.jar: syntax error in pattern
  at artifacts.sbt_example.files[1].source
  in databricks.yml:15:17
```
2024-08-07 14:47:03 +00:00
Pieter Noordhuis f3ffded3bf
Merge job parameters based on their name (#1659)
## Changes

This change enables overriding the default value of job parameters in
target overrides.

This is the same approach we already take for job clusters and job
tasks.

Closes #1620.
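
A sketch of such an override (job and parameter names are illustrative);
entries are matched by `name`:

```
resources:
  jobs:
    my_job:
      parameters:
        - name: environment
          default: dev

targets:
  prod:
    resources:
      jobs:
        my_job:
          parameters:
            - name: environment   # merged with the base entry by name
              default: prod
```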

## Tests

Mutator unit tests and lightweight end-to-end tests.
2024-08-06 16:12:18 +00:00
Andrew Nester d26f3f4863
Fixed incorrectly cleaning up python wheel dist folder (#1656)
## Changes
In https://github.com/databricks/cli/pull/1618 we introduced a prepare
step in which the Python wheel folder was cleaned. It was cleaned every
time instead of only when there is a build command, as it used to work.

This PR fixes it by only cleaning up dist folder when there is a build
command for wheels.
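
For reference, a minimal artifact section with a build command (a
sketch; the path and command are illustrative); only in this case is the
dist folder cleaned:

```
artifacts:
  my_wheel:
    type: whl
    path: .
    build: python3 setup.py bdist_wheel
```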

Fixes #1638 

## Tests
Added regression test
2024-08-06 09:54:58 +00:00
Andrew Nester 809c67b675
Expand and upload local wheel libraries for all task types (#1649)
## Changes
Fixes #1553 

## Tests
Added regression test
2024-08-05 14:44:23 +00:00
shreyas-goenka c454c2fd10
Use precomputed terraform plan for `bundle deploy` (#1640)
## Changes
With https://github.com/databricks/cli/pull/1413 we started to compute
and partially print the plan if it contained deletion of UC schemas.
This PR uses the precomputed plan to avoid planning twice when actually
applying the changes with terraform.

This fixes a performance regression introduced in
https://github.com/databricks/cli/pull/1413.

## Tests

Tested manually.
1. Verified bundle deployment still works and deploys resources.
2. Verified that the precomputed plan is indeed being used by attaching
a debugger and removing the plan file right before the terraform apply
process is spawned and asserting that terraform apply fails because the
plan is not found.
2024-07-31 14:07:25 +00:00
Andrew Nester 1fb8e324d5
Added test for negation pattern in sync include exclude section (#1637)
## Changes
Added test for negation pattern in sync include exclude section
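
A sketch of the kind of configuration the test covers (patterns are
illustrative; negation is assumed to follow gitignore-style `!` syntax):

```
sync:
  exclude:
    - ./build/*
    - "!./build/keep.txt"   # '!' negates the pattern, re-including the file
```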
2024-07-31 13:42:23 +00:00
shreyas-goenka 89c0af5bdc
Add resource for UC schemas to DABs (#1413)
## Changes
This PR adds support for UC Schemas to DABs. This allows users to define
schemas for tables and other assets their pipelines/workflows create as
part of the DAB, thus managing the life-cycle in the DAB.

The first version has a couple of intentional limitations:
1. The owner of the schema will be the deployment user. Changing the
owner of the schema is not allowed (yet). `run_as` will not be
restricted for DABs containing UC schemas. Let's limit the scope of
run_as to the compute identity used instead of ownership of data assets
like UC schemas.
2. API fields that are present in the update API but not the create API
are not supported. For example, enabling predictive optimization is not
supported in the create schema API and is thus not available in DABs at
the moment.
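
A minimal schema resource might look like the following sketch (names
are illustrative; the fields mirror the UC create-schema API):

```
resources:
  schemas:
    my_schema:
      catalog_name: main
      name: dev_assets
      comment: Managed by DABs
      grants:
        - principal: data-engineers
          privileges:
            - USE_SCHEMA
```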

## Tests
Manually and integration test. Manually verified the following work:
1. Development mode adds a "dev_" prefix.
2. Modified status is correctly computed in the `bundle summary`
command.
3. Grants work as expected, for assigning privileges.
4. Variable interpolation works for the schema ID.
2024-07-31 12:16:28 +00:00
Alex Moschos ecba875fe5
Regenerate TF schema (#1635)
## Changes
- Regenerate the TF schema for the CLI. Due to an issue, the previous
generation missed some TF changes.
2024-07-30 10:13:05 +00:00
shreyas-goenka a52b188e99
Use dynamic walking to validate unique resource keys (#1614)
## Changes
This PR:
1. Uses dynamic walking (via the `dyn.MapByPattern` func) to validate no
two resources have the same resource key. This allows us to remove this
validation at merge time.
2. Modifies `dyn.Mapping` to always return a sorted slice of pairs. This
makes traversal functions like `dyn.Walk` or `dyn.MapByPattern`
deterministic.

## Tests
Unit tests. Also manually.
2024-07-29 13:04:02 +00:00
shreyas-goenka 37b9df96e6
Support multiple paths for diagnostics (#1616)
## Changes
Some diagnostics can have multiple paths associated with them. For
instance, ensuring that unique resource keys are used across all
resources. This PR extends `diag.Diagnostic` to accept multiple paths.

This PR is symmetrical to
https://github.com/databricks/cli/pull/1610/files

## Tests
Unit tests
2024-07-25 15:16:27 +00:00
Andrew Nester 90aaf2d20f
Upgrade TF provider to 1.49.1 (#1626)
## Changes
Upgrade TF provider to 1.49.1
2024-07-25 14:18:49 +00:00
shreyas-goenka e6241e196f
Move to a single prompt during bundle destroy (#1583)
## Changes
Right now we ask users for two confirmations when destroying a bundle.
One to destroy the resources and one to delete the files. This PR
consolidates the two prompts into one.

## Tests
Manually

Destroying a bundle with no resources:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
All files and directories at the following location will be deleted: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Would you like to proceed? [y/n]: y
No resources to destroy
Updating deployment state...
Deleting files...
Destroy complete!
```

Destroying a bundle with no remote state:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
No active deployment found to destroy!
```

When a user cancels the destroy:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
The following resources will be deleted:
  delete job job_1
  delete job job_2
  delete pipeline foo

All files and directories at the following location will be deleted: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Would you like to proceed? [y/n]: n
Destroy cancelled!
```

When a user destroys resources:
```
➜  bundle-playground git:(master) ✗ cli bundle destroy
The following resources will be deleted:
  delete job job_1
  delete job job_2
  delete pipeline foo

All files and directories at the following location will be deleted: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Would you like to proceed? [y/n]: y
Updating deployment state...
Deleting files...
Destroy complete!
```
2024-07-24 13:02:19 +00:00
Andrew Nester 39fc86e83b
Split artifact cleanup into prepare step before build (#1618)
## Changes
The prepare stage, which does the cleanup, is now executed once before
every build, so artifacts built into the same folder are correctly kept.

Fixes workaround 2 from issue #1602

## Tests
Added unit test
2024-07-24 09:13:49 +00:00
shreyas-goenka 4bf88b4209
Support multiple locations for diagnostics (#1610)
## Changes
This PR changes `diag.Diagnostics` to allow including multiple locations
associated with the diagnostic message. The diagnostics that now return
multiple locations with this PR are:
1. Warning for unknown keys in config.
2. Use of experimental.run_as
3. Accidental sync.excludes that exclude all files.

## Tests
Existing unit tests pass. New unit test case to assert on error message
when multiple locations are included.

Example output:
```
➜  bundle-playground-2 ~/cli2/cli/cli bundle validate              
Warning: You are using the legacy mode of run_as. The support for this mode is experimental and might be removed in a future release of the CLI. In order to run the DLT pipelines in your DAB as the run_as user this mode changes the owners of the pipelines to the run_as identity, which requires the user deploying the bundle to be a workspace admin, and also a Metastore admin if the pipeline target is in UC.
  at experimental.use_legacy_run_as
  in resources.yml:10:22
     databricks.yml:13:22

Name: fix run_if
Target: default
Workspace:
  User: shreyas.goenka@databricks.com
  Path: /Users/shreyas.goenka@databricks.com/.bundle/fix run_if/default

Found 1 warning
```
2024-07-23 17:20:11 +00:00
Pieter Noordhuis 52ca599cd5
Upgrade TF provider to 1.49.0 (#1617)
## Changes

This includes a fix for model serving endpoints.

See
https://github.com/databricks/terraform-provider-databricks/pull/3690.

## Tests

n/a
2024-07-23 16:15:02 +00:00
Pieter Noordhuis 2aeea5e384
Remove unused package bundle/deployer (#1607)
## Changes

This has been superseded by individual mutators under
`bundle/deploy/terraform`.

## Tests

n/a
2024-07-18 14:57:31 +00:00
Pieter Noordhuis 6953a5d5af
Add read-only mode for extension aware workspace filer (#1609)
## Changes

By default, construct a read/write instance. If constructed in read-only
mode, the underlying filer is wrapped in a readahead cache.

## Tests

* Filer integration tests pass.
* Manual test that caching is enabled when running on WSFS.
2024-07-18 14:17:42 +00:00
shreyas-goenka 5b65358146
Use local Terraform state only when lineage match (#1588)
## Changes
DABs deployments should be isolated if `root_path` and workspace host
are different. This PR fixes a bug where local terraform state gets
piggybacked if the same cwd is used to deploy two isolated deployments
for the same bundle target. This can happen if:
1. A user switches to a different identity on the same machine. 
2. The workspace host URL the bundle/target points to is changed.
3. A user changes the `root_path` while doing bundle development.

To solve this problem we rely on the lineage field available in the
terraform state, which is a uuid identifying unique terraform
deployments. There's a 1:1 mapping between a terraform deployment and a
bundle deployment.

For more details on how lineage works in terraform, see:
https://developer.hashicorp.com/terraform/language/state/backends#manual-state-pull-push

## Tests
Manually verified that changing the identity no longer results in the
incorrect terraform state being used. Also, new unit tests are added.
2024-07-18 09:47:59 +00:00
shreyas-goenka c6c2692368
Attribute Terraform API requests to the CLI (#1598)
## Changes
This PR adds `cli` to the user agent sent downstream to the Databricks
Terraform provider when invoked via DABs.
 
## Tests
Unit tests. Based on the comment here
(10fe02075f/bundle/config/mutator/verify_cli_version_test.go (L113))
we don't need to set the version to make the test assertion work
correctly. This is likely because we use `go test` to run the tests
while the CLI is compiled and the version is set via `goreleaser`.
2024-07-18 09:38:09 +00:00
Pieter Noordhuis e1474a38f9
Upgrade TF provider to 1.48.3 (#1600)
## Changes

This includes a fix for using periodic triggers.
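
For reference, a periodic trigger in a job definition looks roughly like
this (a sketch; values are illustrative):

```
resources:
  jobs:
    my_job:
      trigger:
        pause_status: UNPAUSED
        periodic:
          interval: 1
          unit: HOURS
```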

## Tests

Manually confirmed this works with
https://github.com/databricks/bundle-examples/pull/32.
2024-07-17 08:49:19 +00:00
shreyas-goenka 8ed9964482
Track multiple locations associated with a `dyn.Value` (#1510)
## Changes
This PR changes the location metadata associated with a `dyn.Value` to a
slice of locations. This will allow us to keep track of location
metadata across merges and overrides.

The convention is to treat the first location in the slice as the
primary location. Also, the semantics are the same as before if there's
only one location associated with a value, that is:
1. For complex values (maps, sequences) the location of v1 is primary
in Merge(v1, v2)
2. For primitive values the location of v2 is primary in Merge(v1, v2)

## Tests
Modifying existing merge unit tests. Other existing unit tests and
integration tests pass.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-07-16 11:27:27 +00:00
shreyas-goenka 39c2633773
Add UUID to uniquely identify a deployment state (#1595)
## Changes
We need a mechanism to invalidate the locally cached deployment state if
a user uses the same working directory to deploy to multiple distinct
deployments (separate targets, root_paths or even hosts).

This PR just adds the UUID to the deployment state in preparation for
invalidating this cache. The actual invalidation will follow up at a
later date (tracked in internal backlog).

## Tests
Unit test. Manually checked the deployment state is actually being
written.
2024-07-16 10:01:58 +00:00
Andrew Nester 434bcbb018
Allow artifacts (JARs, wheels) to be uploaded to UC Volumes (#1591)
## Changes
This change allows specifying a UC Volumes path as the artifact path so
that all artifacts (JARs, wheels) are uploaded to UC Volumes.

Example configuration is here:
```
bundle:
  name: jar-bundle

workspace:
  host: https://foo.com
  artifact_path: /Volumes/main/default/foobar

artifacts:
  my_java_code:
    path: ./sample-java
    build: "javac PrintArgs.java && jar cvfm PrintArgs.jar META-INF/MANIFEST.MF PrintArgs.class"
    files:
      - source: ./sample-java/PrintArgs.jar

resources:
  jobs:
    jar_job:
      name: "Test Spark Jar Job"
      tasks:
        - task_key: TestSparkJarTask
          new_cluster:
            num_workers: 1
            spark_version: "14.3.x-scala2.12"
            node_type_id: "i3.xlarge"
          spark_jar_task:
            main_class_name: PrintArgs
          libraries:
            - jar: ./sample-java/PrintArgs.jar
```
## Tests
Manually + added E2E test for Java jobs

The E2E test is temporarily skipped until auth-related issues for UC in
tests are resolved
2024-07-16 08:57:04 +00:00
Gleb Kanterov af975ca64b
Print diagnostics in 'bundle deploy' (#1579)
## Changes
Print diagnostics in 'bundle deploy' similar to 'bundle validate'. This
way if a bundle has any errors or warnings, they are going to be easy to
notice.

NB: due to how we render errors, there is one extra trailing newline in
the output, preserved in the examples below

## Example: No errors or warnings

```
% databricks bundle deploy
Building default...
Deploying resources...
Updating deployment state...
Deployment complete!
```

## Example: Error on load

```
% databricks bundle deploy
Error: Databricks CLI version constraint not satisfied. Required: >= 1337.0.0, current: 0.0.0-dev

```

## Example: Warning on load

```
% databricks bundle deploy
Building default...
Deploying resources...
Updating deployment state...
Deployment complete!
Warning: unknown field: foo
  in databricks.yml:6:1

```

## Example: Error + warning on load

```
% databricks bundle deploy
Warning: unknown field: foo
  in databricks.yml:6:1

Error: something went wrong

```

## Example: Warning on load + error in init

```
% databricks bundle deploy
Warning: unknown field: foo
  in databricks.yml:6:1

Error: Failed to xxx
  in yyy.yml

Detailed explanation
in multiple lines

```

## Tests
Tested manually
2024-07-10 11:14:57 +00:00
shreyas-goenka 5bc5c3c26a
Return early in bundle destroy if no deployment exists (#1581)
## Changes
This PR:
1. Moves the if mutator to the bundle package, to live with all-time
greats such as `bundle.Seq` and `bundle.Defer`. Also adds unit tests.
2. `bundle destroy` now returns early if `root_path` does not exist. We
do this by leveraging a `bundle.If` condition.

## Tests
Unit tests and manually.

Here's an example of what it'll look like once the bundle is destroyed.

```
➜  bundle-playground git:(master) ✗ cli bundle destroy
No active deployment found to destroy!
```

I would have added some e2e coverage for this as well, but the
`cobraTestRunner.Run()` method does not seem to return stdout/stderr
logs correctly. We can probably punt looking into it.
2024-07-09 15:08:38 +00:00
Andrew Nester 8b468b423f
Change SetVariables mutator to mutate dynamic configuration instead (#1573)
## Changes
Previously the `SetVariables` mutator mutated the typed configuration by
using `v.Set` for variables. This led to the variables' `value` field
not having location information.

By using dynamic configuration mutation, we keep the same functionality
but also preserve location information for value when it's set from
default.

Fixes #1568 #1538

## Tests
Added unit tests
2024-07-09 11:12:42 +00:00
Andrew Nester 3d8446bbdb
Rewrite local path for libraries in foreach tasks (#1569)
## Changes
Local library paths in the `libraries` section of for-each tasks are now
correctly replaced with the remote path for the library when it's
uploaded to Databricks
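
A sketch of the case this covers (assuming the jobs API `for_each_task`
shape; names are illustrative):

```
resources:
  jobs:
    my_job:
      tasks:
        - task_key: loop
          for_each_task:
            inputs: "[1, 2, 3]"
            task:
              task_key: process
              python_wheel_task:
                package_name: my_package
                entry_point: main
              libraries:
                - whl: ./dist/my_package-*.whl   # local path rewritten to the uploaded location
```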

## Tests
Added unit test
2024-07-05 10:58:28 +00:00
Andrew Nester 040b374430
Override complex variables with target overrides instead of merging (#1567)
## Changes
At the moment we merge the values of complex variables, while the
expected behaviour is to override the value with the target one.
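
A sketch of the new behaviour (names and values are illustrative): the
target value below replaces the default wholesale rather than being
merged into it.

```
variables:
  cluster:
    type: complex
    default:
      spark_version: "13.3.x-scala2.12"
      node_type_id: "i3.xlarge"
      num_workers: 2

targets:
  prod:
    variables:
      cluster:
        spark_version: "13.3.x-scala2.12"
        node_type_id: "i3.2xlarge"
        num_workers: 8   # the whole mapping replaces the default; nothing is merged in
```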

## Tests
Added unit test
2024-07-04 11:57:29 +00:00
Pieter Noordhuis f14dded946
Replace `vfs.Path` with extension-aware filer when running on DBR (#1556)
## Changes

The FUSE mount of the workspace file system on DBR doesn't include file
extensions for notebooks. When these notebooks are checked into a
repository, they do have an extension. PR #1457 added a filer type that
is aware of this disparity and makes these notebooks show up as if they
do have these extensions.

This change swaps out the native `vfs.Path` with one that uses this
filer when running on DBR.

Follow up: consolidate between interfaces exported by `filer.Filer` and
`vfs.Path`.

## Tests

* Unit tests pass
* (Manually ran a snapshot build on DBR against a bundle with notebooks)

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-07-03 11:55:42 +00:00
Pieter Noordhuis b3c044c461
Use `vfs.Path` for filesystem interaction (#1554)
## Changes

Note: this doesn't cover _all_ filesystem interaction.

To intercept calls where read or stat files to determine their type, we
need a layer between our code and the `os` package calls that interact
with the local file system. Interception is necessary to accommodate
differences between a regular local file system and the FUSE-mounted
Workspace File System when running the CLI on DBR.

This change makes use of #1452 in the bundle struct.

It uses #1525 to access the bundle variable in path rewriting.

## Tests

* Unit tests pass.
* Integration tests pass.
2024-07-03 10:13:22 +00:00
Gleb Kanterov 4787edba36
PythonMutator: allow insert 'resources' and 'resources.jobs' (#1555)
## Changes
Allow inserting 'resources' and 'resources.jobs' because they can be
absent in the incoming bundle.

## Tests
Unit tests
2024-07-03 08:33:23 +00:00
Gleb Kanterov b9e3c98723
PythonMutator: support omitempty in PyDABs (#1513)
## Changes
PyDABs output can omit empty sequences/mappings because we don't track
them as optional. There is no semantic difference between empty and
missing, which makes omitting correct. However, the CLI would detect
that we falsely modified input resources by deleting all empty
collections.

To handle that, we extend `dyn.Override` to allow visitors to ignore
certain deletes. If we see that an empty sequence or mapping is deleted,
we revert such delete.

## Tests
Unit tests

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-07-03 07:22:03 +00:00
Gleb Kanterov 5a0a6d7334
PythonMutator: add diagnostics (#1531)
## Changes
Allow PyDABs to report `dyn.Diagnostics` by writing to
`diagnostics.json` supplied as an argument, similar to `input.json` and
`output.json`

Such errors are not yet properly printed in `databricks bundle
validate`, which will be fixed in a follow-up PR.

## Tests
Unit tests
2024-07-02 15:10:53 +00:00
Andrew Nester 3d2f7622bc
Fixed bundle not loading when empty variable is defined (#1552)
## Changes
Fixes #1544
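
A sketch of the failing shape (assumed from the issue): a variable
declared without any value.

```
variables:
  catalog:   # declared with no default or description; loading this no longer fails
```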

## Tests
Added regression test
2024-07-02 12:40:39 +00:00
Andrew Nester 0d64975d36
Fixed resolving variable references inside slice variable (#1550)
## Changes
Fixes #1541 
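
A sketch of the shape this fixes (assumed from the issue title; names
are illustrative): a complex variable holding a list whose elements
contain `${...}` references.

```
variables:
  jar_path:
    default: ./dist/app.jar
  libraries:
    type: complex
    default:
      - jar: ${var.jar_path}   # a reference inside a slice element now resolves correctly
```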

## Tests
Added regression unit test

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-07-02 11:45:16 +00:00
Pieter Noordhuis a0df54ac41
Add extra tests for the sync block (#1548)
## Changes

Issue #1545 describes how a nil entry in the sync block caused an error.

The fix for this issue is in #1547. This change adds end-to-end test
coverage.

## Tests

New test passes on top of #1547.
2024-07-01 13:08:50 +00:00
Gleb Kanterov e8b76a7f13
Improve `bundle validate` output (#1532)
## Changes
This combination of changes allows pretty-printing errors happening
during the "load" and "init" phases, including their locations.

Move the render code into a separate module dedicated to rendering
`diag.Diagnostics` in a human-readable format. This will be used for the
`bundle deploy` command.

Preserve the "bundle" value if an error occurs in mutators. Rewrite the
Go templates to handle the case where the bundle isn't loaded yet
because an error occurred during loading, which is possible now.

Improve rendering for errors and warnings:
- don't render empty locations
- render "details" for errors if they exist

Add `root.ErrAlreadyPrinted` indicating that the error was already
printed, and the CLI entry point shouldn't print it again.

## Tests
Add tests for the output; they are especially handy for detecting extra newlines
2024-07-01 09:01:10 +00:00
Gleb Kanterov aee3910f3d
PythonMutator: register product in user agent extra (#1533)
## Changes
Register user agent product following RFC 9110.

See
https://github.com/databricks/terraform-provider-databricks/pull/3520
for Terraform change.

## Tests
Unit tests
2024-07-01 07:46:37 +00:00
shreyas-goenka 4d8eba04cd
Compare `.Kind()` instead of direct equality checks on a `dyn.Value` (#1520)
## Changes

This PR makes two changes:

1. In https://github.com/databricks/cli/pull/1510 we'll be adding
multiple associated location metadata with a dyn.Value. The Go compiler
does not allow comparing structs if they contain slice values
(presumably due to multiple possible definitions for equality). In
anticipation for adding a `[]dyn.Location` type field to `dyn.Value`
this PR removes all direct comparisons of `dyn.Value` and instead relies
on the kind.

2. Retain location metadata for values in convert.FromTyped. The change
diff is exactly the same as https://github.com/databricks/cli/pull/1523.
It's been combined with this PR because they both depend on each other
to prevent test failures (forming a test failure deadlock).

Go patch used:
```
@@
var x expression
@@
-x == dyn.InvalidValue
+x.Kind() == dyn.KindInvalid

@@
var x expression
@@
-x != dyn.InvalidValue
+x.Kind() != dyn.KindInvalid

@@
var x expression
@@
-x == dyn.NilValue
+x.Kind() == dyn.KindNil

@@
var x expression
@@
-x != dyn.NilValue
+x.Kind() != dyn.KindNil
```
 

## Tests
Unit tests and integration tests pass.
2024-06-27 13:28:19 +00:00
Andrew Nester 5f42791609
Added support for complex variables (#1467)
## Changes
Added support for complex variables

Now it's possible to add and use complex variables as shown below

```
bundle:
  name: complex-variables

resources:
  jobs:
    my_job:
      job_clusters:
        - job_cluster_key: key
          new_cluster: ${var.cluster}
      tasks:
      - task_key: test
        job_cluster_key: key

variables:
  cluster:
    description: "A cluster definition"
    type: complex
    default:
      spark_version: "13.2.x-scala2.11"
      node_type_id: "Standard_DS3_v2"
      num_workers: 2
      spark_conf:
        spark.speculation: true
        spark.databricks.delta.retentionDurationCheck.enabled: false
```

Fixes #1298

- [x] Support for complex variables
- [x] Allow variable overrides (with shortcut) in targets
- [x] Don't allow to provide complex variables via flag or env variable
- [x] Fail validation if complex value is used but not `type: complex`
provided
- [x] Support using variables inside complex variables 

## Tests
Added unit tests

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2024-06-26 10:25:32 +00:00
Pieter Noordhuis ce5a3f2ce6
Upgrade TF provider to 1.48.0 (#1527)
## Changes

This includes a fix for library order not being respected.

## Tests

Manually confirmed the fix works in
https://github.com/databricks/bundle-examples/pull/29.
2024-06-26 09:29:46 +00:00
dependabot[bot] 8468878eed
Bump github.com/databricks/databricks-sdk-go from 0.42.0 to 0.43.0 (#1522)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.42.0 to 0.43.0.

Release notes, sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases):

**v0.43.0**

Major Changes and Improvements:

* Support partners in user agent for SDK ([#925](https://redirect.github.com/databricks/databricks-sdk-go/pull/925)).
* Add `serverless_compute_id` field to the config ([#952](https://redirect.github.com/databricks/databricks-sdk-go/pull/952)).

Other Changes:

* Generate from latest spec ([#944](https://redirect.github.com/databricks/databricks-sdk-go/pull/944) and [#947](https://redirect.github.com/databricks/databricks-sdk-go/pull/947)).

API Changes (all types are documented under https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service):

* Changed `IsolationMode` field for `catalog.CatalogInfo` to `catalog.CatalogIsolationMode`.
* Added `IsolationMode` field for `catalog.ExternalLocationInfo`.
* Added `MaxResults` and `PageToken` fields for `catalog.ListCatalogsRequest`.
* Added `NextPageToken` field for `catalog.ListCatalogsResponse`.
* Added `TableServingUrl` field for `catalog.OnlineTable`.
* Added `IsolationMode` field for `catalog.StorageCredentialInfo`.
* Changed `IsolationMode` field for `catalog.UpdateCatalog` to `catalog.CatalogIsolationMode`.
* Added `IsolationMode` field for `catalog.UpdateExternalLocation`.
* Added `IsolationMode` field for `catalog.UpdateStorageCredential`.
* Added `catalog.CatalogIsolationMode`.
* Added `CreateSchedule`, `CreateSubscription`, `DeleteSchedule`, `DeleteSubscription`, `GetSchedule`, `GetSubscription`, `List`, `ListSchedules`, `ListSubscriptions` and `UpdateSchedule` methods for the `w.Lakeview` workspace-level service.
* Added `dashboards.CreateScheduleRequest`, `dashboards.CreateSubscriptionRequest`, `dashboards.CronSchedule`, `dashboards.DashboardView`, `dashboards.DeleteScheduleRequest`, `dashboards.DeleteSubscriptionRequest`, `dashboards.GetScheduleRequest`, `dashboards.GetSubscriptionRequest`, `dashboards.ListDashboardsRequest`, `dashboards.ListDashboardsResponse`, `dashboards.ListSchedulesRequest`, `dashboards.ListSchedulesResponse`, `dashboards.ListSubscriptionsRequest`, `dashboards.ListSubscriptionsResponse`, `dashboards.Schedule`, `dashboards.SchedulePauseStatus`, `dashboards.Subscriber`, `dashboards.Subscription`, `dashboards.SubscriptionSubscriberDestination`, `dashboards.SubscriptionSubscriberUser` and `dashboards.UpdateScheduleRequest` structs.
* Added `OnStreamingBacklogExceeded` field for `jobs.JobEmailNotifications`.
* Added `EnvironmentKey` field for `jobs.RunTask`.
* Removed `ConditionTask`, `DbtTask`, `NotebookTask`, `PipelineTask`, `PythonWheelTask`, `RunJobTask`, `SparkJarTask`, `SparkPythonTask`, `SparkSubmitTask`, `SqlTask` and `Environments` fields for `jobs.SubmitRun`.
* Added `DbtTask` and `EnvironmentKey` fields for `jobs.SubmitTask`.
* Added `OnStreamingBacklogExceeded` field for `jobs.TaskEmailNotifications`.
* Added `Periodic` field for `jobs.TriggerSettings`.
* Added `OnStreamingBacklogExceeded` field for `jobs.WebhookNotifications`.
* Added `jobs.PeriodicTriggerConfiguration`.
* Added `jobs.PeriodicTriggerConfigurationTimeUnit`.
* Added `ProviderSummary` field for `marketplace.Listing`.
* Added `marketplace.ProviderIconFile`.
* Added `marketplace.ProviderIconType`.
* Added `marketplace.ProviderListingSummaryInfo`.
* Added `Start` method for the `w.Apps` workspace-level service.
* Added the `w.ServingEndpointsDataPlane` workspace-level service.
* Added `ServicePrincipalId` field for `serving.App`.
* Added `ServicePrincipalName` field for `serving.App`.
* Added `serving.StartAppRequest`.
* Added `QueryNextPage` method for the `w.VectorSearchIndexes` workspace-level service.
* Added `QueryType` field for `vectorsearch.QueryVectorIndexRequest`.
* Added `NextPageToken` field for `vectorsearch.QueryVectorIndexResponse`.
* Added `vectorsearch.QueryVectorIndexNextPageRequest`.

OpenAPI SHA: 7437dabb9dadee402c1fc060df4c1ce8cc5369f0, Date: 2024-06-25

Commits:

* `3e41913` Release v0.43.0 ([#955](https://redirect.github.com/databricks/databricks-sdk-go/issues/955))
* `ce3dc98` Add `serverless_compute_id` field to the config ([#952](https://redirect.github.com/databricks/databricks-sdk-go/issues/952))
* `00b1d09` Update OpenAPI spec ([#947](https://redirect.github.com/databricks/databricks-sdk-go/issues/947))
* `d098b1a` Support partners in SDK ([#925](https://redirect.github.com/databricks/databricks-sdk-go/issues/925))
* `490bc13` Generate from latest spec ([#944](https://redirect.github.com/databricks/databricks-sdk-go/issues/944))
* See full diff in [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.42.0...v0.43.0)

Most Recent Ignore Conditions Applied to This Pull Request:

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.42.0&new-version=0.43.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-06-25 12:51:17 +00:00
Pieter Noordhuis 100a0516d4
Add context type and value to path rewriting (#1525)
## Changes

This change makes preparations for a future change where the inner
rewriting functions need access to the underlying bundle.

All values were passed via the stack before and adding yet another value
would make the code less readable.

## Tests

Unit tests pass.
2024-06-25 10:04:22 +00:00
Gleb Kanterov 5ff06578ac
PythonMutator: replace stdin/stdout with files (#1512)
## Changes
Replace stdin/stdout with files in `PythonMutator`. Files are created in
a temporary directory.

Rename `ApplyPythonMutator` to `PythonMutator`.

Add test for `dyn.Location` behavior during the "load" stage.

## Tests
Unit tests
2024-06-24 07:47:41 +00:00
shreyas-goenka 068c7cfc2d
Return `dyn.InvalidValue` instead of `dyn.NilValue` when errors happen (#1514)
## Changes
With https://github.com/databricks/cli/pull/1507 and
https://github.com/databricks/cli/pull/1511 we are clarifying the
semantics associated with `dyn.InvalidValue` and `dyn.NilValue`. An
invalid value is the default zero value and is used to signal the
complete absence of the value.

A nil value, on the other hand, is a valid value for a piece of
configuration and signals explicitly setting a key to nil in the
configuration tree. In keeping with that theme, this PR returns
`dyn.InvalidValue` instead of `dyn.NilValue` at error sites. This change
is not expected to have a material change in behaviour and is being done
to set the right convention since we have well-defined semantics
associated with both `NilValue` and `InvalidValue`.

## Tests
Unit tests and integration tests pass. Also manually scanned the changes
and the associated call sites to verify the `NilValue` value itself was
not being relied upon.
2024-06-21 14:22:42 +00:00
Pieter Noordhuis 446a9d0c52
Properly deal with nil values in `convert.FromTyped` (#1511)
## Changes

When a configuration defines:
```yaml
run_as:
```

It first showed up as `run_as -> nil` in the dynamic configuration only
to later be converted to `run_as -> {}` while going through typed
conversion. We were using the presence of a key to initialize an empty
value. This is incorrect and it should have remained a nil value.

This conversion was happening in `convert.FromTyped` where any struct
always returned a map value. Instead, it should only return a map value
in any one of these cases: 1) the struct has elements, 2) the struct was
originally a map in the dynamic configuration, or 3) the struct was
initialized to a non-empty pointer value.

Stacked on top of #1516 and #1518.

## Tests

* Unit tests pass.
* Integration tests pass.
* Manually ran through bundle CRUD with a bundle without resources.
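
A sketch of the resulting rule with a hypothetical helper (the real
`convert.FromTyped` operates on reflected values and `dyn.Value`s):

```go
package convertsketch

// structValue decides how a struct is represented in the dynamic tree,
// per the three cases above: return a map only if (1) the struct has
// elements, (2) it was originally a map in the dynamic configuration,
// or (3) it was a non-nil pointer in the typed configuration.
func structValue(fields map[string]any, wasMapInConfig, isNonNilPtr bool) any {
	if len(fields) > 0 || wasMapInConfig || isNonNilPtr {
		return fields
	}
	// e.g. `run_as:` with no keys remains nil instead of becoming {}.
	return nil
}
```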
2024-06-21 13:43:21 +00:00
Pieter Noordhuis 01adef666a
Set bool pointer to disable lock (#1516)
## Changes

This cherry-picks from #1490 to address an issue that came up in #1511.

The function `dyn.SetByPath` requires intermediate values to be present.
If they are not, it returns an error that it cannot index a map. This is
not an issue on main, where the intermediate maps are always created,
even if they are not present in the dynamic configuration tree. As of
#1511, we'll no longer populate empty maps for empty structs if they are
not explicitly set (i.e., a non-nil pointer). This change writes a bool
pointer to avoid this issue altogether.

## Tests

Unit tests pass.
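
The pointer makes the tri-state explicit. A toy illustration, where
`Lock` is a hypothetical stand-in for the deployment lock configuration:

```go
package main

import "fmt"

// Lock is a hypothetical stand-in for the deployment lock configuration.
type Lock struct {
	Enabled *bool // nil means "not set"; distinct from an explicit false
}

func main() {
	var l Lock
	fmt.Println(l.Enabled == nil) // true: not set, nothing to write

	enabled := false
	l.Enabled = &enabled          // explicitly disable the lock
	fmt.Println(*l.Enabled)       // false
}
```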
2024-06-21 11:14:33 +00:00
Gleb Kanterov 57a5a65f87
Add ApplyPythonMutator (#1430)
## Changes
Add ApplyPythonMutator, which forks a Python subprocess and pipes the
bundle configuration through it.

It's enabled through `experimental` section, for example:

```yaml
experimental:
  pydabs: 
    enable: true
    venv_path: .venv
```

For now, it's limited to two phases in the mutator pipeline:

- `load`: adds new jobs
- `init`: adds new jobs, or modifies existing ones

It's enforced that no jobs are modified in `load` and no jobs are
deleted in `load/init`, because otherwise existing assumptions would
break.

## Tests
Unit tests
2024-06-20 08:43:08 +00:00
Pieter Noordhuis b2c03ea54c
Use `dyn.InvalidValue` to indicate absence (#1507)
## Changes

Previously, the functions `Get` and `Index` returned `dyn.NilValue` to
indicate that a map key or sequence index wasn't found. This is a valid
value, so we need to differentiate between actual absence and a real
`dyn.NilValue`. We do this with the zero value of a `dyn.Value` (also
captured in the constant `dyn.InvalidValue`).

## Tests

* Unit tests.
* Renamed `Get` and `Index` to find and update all call sites.
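
The distinction can be modeled as follows; this is a toy model of the
semantics, not the real `libs/dyn` API:

```go
package main

import "fmt"

// Kind is a toy model of a dynamic value's kind.
type Kind int

const (
	KindInvalid Kind = iota // zero value: the value is absent
	KindNil                 // an explicit null in the configuration
	KindString              // a regular value
)

// Value's zero value has KindInvalid, mirroring dyn.InvalidValue.
type Value struct {
	Kind Kind
	Str  string
}

var (
	InvalidValue = Value{}              // complete absence
	NilValue     = Value{Kind: KindNil} // explicitly set to nil
)

// get mirrors the new behavior of Get/Index: absence yields InvalidValue.
func get(m map[string]Value, key string) Value {
	v, ok := m[key]
	if !ok {
		return InvalidValue
	}
	return v
}

func main() {
	cfg := map[string]Value{"run_as": NilValue}
	fmt.Println(get(cfg, "run_as") == NilValue)      // true: explicit null
	fmt.Println(get(cfg, "missing") == InvalidValue) // true: absent
}
```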
2024-06-19 15:24:57 +00:00
Lennart Kats (databricks) deb3e365cd
Pause quality monitors when "mode: development" is used (#1481)
## Changes

Similar to scheduled jobs, quality monitors should be paused when in
development mode (in line with the [behavior for scheduled
jobs](https://docs.databricks.com/en/dev-tools/bundles/deployment-modes.html)).
@aravind-segu @arpitjasa-db please take a look and verify this behavior.

- [x] Followup: documentation changes. If we make this change we should
update
https://docs.databricks.com/dev-tools/bundles/deployment-modes.html.

## Tests
Unit tests
2024-06-19 13:54:35 +00:00
Andrew Nester 663aa9ab8c
Override variables with lookup value even if values has default value set (#1504)
## Changes

This PR fixes the behaviour where variables were not overridden with
the lookup value from targets if those variables had a default value set
in the default target.

Fixes #1449 

## Tests
Added regression test
2024-06-19 08:03:06 +00:00
shreyas-goenka 553fdd1e81
Serialize dynamic value for `bundle validate` output (#1499)
## Changes
Using dynamic values allows us to retain references like
`${resources.jobs...}` even when the Go type of the field is not a
string (e.g. in `run_job_task`), or in general for values that do not
map to the Go types for a field.

## Tests
Integration test
2024-06-18 15:04:20 +00:00
shreyas-goenka 274688d8a2
Clean up unused code (#1502)
## Changes
1. Removes `DefaultMutatorsForTarget` which is no longer used anywhere
2. Makes SnapshotPath a private field. It's no longer needed by data
structures outside its package.

FYI, I also tried finding other instances of dead code but I could not
find anything else that was safe to remove. I used
https://go.dev/blog/deadcode to search for them, and the other instances
either implemented an interface, increased test coverage for some of our
other code paths, or had some other reason I could not remove them
(such as autogenerated functions or test-only helpers).

Good sign our codebase is mostly clean (at least superficially).
2024-06-18 14:14:27 +00:00
shreyas-goenka 44e3928d6a
Avoid multiple file tree traversals on bundle deploy (#1493)
## Changes
To run bundle deploy from DBR we use an abstraction over the workspace
import / export APIs to create a `filer.Filer` and abstract the file
system. Walking the file tree in such a filer is expensive and requires
multiple API calls. This PR removes the two duplicate file tree walks
by caching the result of the first walk.
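
A sketch of the caching approach, with hypothetical types (the actual
change caches inside the deploy mutators):

```go
package main

import (
	"fmt"
	"sync"
)

// cachedWalker wraps an expensive listing function and runs it only once.
type cachedWalker struct {
	once  sync.Once
	files []string
	err   error
	walk  func() ([]string, error)
}

// All returns the cached result, performing the walk on first use.
func (c *cachedWalker) All() ([]string, error) {
	c.once.Do(func() {
		c.files, c.err = c.walk()
	})
	return c.files, c.err
}

func main() {
	calls := 0
	w := &cachedWalker{walk: func() ([]string, error) {
		calls++ // stands in for many workspace API calls
		return []string{"a.py", "b.py"}, nil
	}}
	w.All()
	w.All()
	fmt.Println(calls) // 1: the tree is only traversed once
}
```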
2024-06-17 09:48:52 +00:00
Pieter Noordhuis 311dfa4642
Upgrade TF provider to 1.47.0 (#1476)
## Changes

This includes a bugfix for provisioning jobs with `num_workers = 0`.

Fixes #1472.

## Tests

Manually tested this fixes the issue.
2024-06-05 11:33:43 +00:00
Pieter Noordhuis 70fd8ad3d7
Update OpenAPI spec (#1466)
## Changes

Notable changes:

* Pagination of account-level storage credentials
* Rename app deployment method

Go SDK release notes:
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.42.0

## Tests

* Nightlies pass.
2024-06-03 14:14:48 +00:00
Pieter Noordhuis c9b4f11947
Update error checks that use the `os` package to use `errors.Is` (#1461)
## Changes

From the [documentation](https://pkg.go.dev/os#IsNotExist) on the
functions in the `os` package:
> This function predates errors.Is. It only supports errors returned by
the os package.
> New code should use errors.Is(err, fs.ErrNotExist).

This issue surfaced while working on using a different `vfs.Path`
implementation that uses errors from the `fs` package. Calls to
`os.IsNotExist` didn't return true for errors that wrap
`fs.ErrNotExist`.

## Tests

n/a
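
For illustration, the difference shows up as follows:

```go
package main

import (
	"errors"
	"fmt"
	"io/fs"
	"os"
)

func main() {
	_, err := os.Open("does-not-exist.txt")

	// Old style: only recognizes errors created by the os package.
	fmt.Println(os.IsNotExist(err)) // true here, but false for wrapped fs errors

	// New style: unwraps the error chain, so it also matches errors
	// from fs.FS implementations that wrap fs.ErrNotExist.
	fmt.Println(errors.Is(err, fs.ErrNotExist)) // true
}
```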
2024-06-03 12:39:36 +00:00
Pieter Noordhuis 30fd84893f
Generate bundle schema placeholder for quality monitors (#1465)
## Changes

Generated with default generation command.

The team is making a fix to ensure the proper comments are included
later.

## Tests

n/a
2024-06-03 12:39:27 +00:00
Aravind Segu a33d0c8bf9
Add support for Lakehouse monitoring in bundles (#1307)
## Changes

This change adds support for Lakehouse monitoring in bundles.

The associated resource type name is "quality monitor".

## Testing

Unit tests.

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
Co-authored-by: Arpit Jasapara <87999496+arpitjasa-db@users.noreply.github.com>
2024-05-31 09:42:25 +00:00
Pieter Noordhuis 364a609ea7
Upgrade TF provider to 1.46.0 (#1460)
## Changes

Release notes in
https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.46.0

Notable changes since 1.43.0:
* The job resource has been migrated to the Go SDK. More fields are now
passed through from DABs into TF.
* Improved zero-value handling.

## Tests

n/a
2024-05-31 07:13:43 +00:00
Pieter Noordhuis 424499ec1d
Abstract over filesystem interaction with libs/vfs (#1452)
## Changes

Introduce `libs/vfs` for an implementation of `fs.FS` and friends that
_includes_ the absolute path it is anchored to.

This is needed for:
1. Intercepting file operations to inject custom logic (e.g., logging,
access control).
2. Traversing directories to find specific leaf directories (e.g.,
`.git`).
3. Converting virtual paths to OS-native paths.

Options 2 and 3 are not possible with the standard `fs.FS` interface.
They are needed such that we can provide an instance to the sync package
and still detect the containing `.git` directory and convert paths to
native paths.
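
A sketch of the shape such an interface can take (illustrative names,
not the exact `libs/vfs` API):

```go
package vfssketch

import "io/fs"

// Path is an fs.FS that also remembers the absolute path it is anchored to.
type Path interface {
	fs.FS

	// Native converts a slash-separated relative path into an OS-native
	// absolute path (need 3 above).
	Native(name string) string

	// Parent returns the parent directory as a Path, or nil at the root,
	// enabling upward traversal to find e.g. a `.git` directory (need 2).
	Parent() Path
}
```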

This change focuses on making the following packages use `vfs.Path`:
* libs/fileset
* libs/git
* libs/sync

All entries returned by `fileset.All` are now slash-separated. This has
2 consequences:
* The sync snapshot now always uses slash-separated paths
* We don't need to call `filepath.FromSlash` as much as we did

## Tests

* All unit tests pass
* All integration tests pass
* Manually confirmed that a deployment made on Windows by a previous
version of the CLI can be deployed by a new version of the CLI while
retaining the validity of the local sync snapshot as well as the remote
deployment state.
2024-05-30 07:41:50 +00:00
Pieter Noordhuis 63ceede335
Update Go SDK to v0.41.0 (#1445)
## Changes

Release notes at
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.41.0.

## Tests

n/a
2024-05-22 07:41:32 +00:00
Andrew Nester 3f8036f2df
Fixed seg fault when specifying environment key for tasks (#1443)
## Changes
Fixed seg fault when specifying environment key for tasks
2024-05-21 10:00:04 +00:00
Andrew Nester a014d50a6a
Fixed panic when loading incorrectly defined jobs (#1402)
## Changes
If only a key was defined for a job in the YAML config, `validate`
previously failed with a segfault.

This PR validates that jobs are correctly defined and returns an error
if not.

## Tests
Added regression test
2024-05-17 10:10:17 +00:00
Pieter Noordhuis dd94107853
Remove dependency on `ConfigFilePath` from path translation mutator (#1437)
## Changes

This is one step toward removing the `path.Paths` struct embedding from
resource types.

Going forward, we'll exclusively use the `dyn.Value` tree for location
information.

## Tests

Existing unit tests that cover path resolution with fallback behavior
pass.
2024-05-17 09:26:09 +00:00
Miles Yucht f7d4b272f4
Improve token refresh flow (#1434)
## Changes
Currently, there are a number of issues with the non-happy-path flows
for token refresh in the CLI.

If the token refresh fails, the raw error message is presented to the
user, as seen below. This message is very difficult for users to
interpret and doesn't give any clear direction on how to resolve this
issue.
```
Error: token refresh: Post "https://adb-<WSID>.azuredatabricks.net/oidc/v1/token": http 400: {"error":"invalid_request","error_description":"Refresh token is invalid"}
```

When logging in again, I've noticed that the timeout for logging in is
very short, only 45 seconds. If a user is using a password manager and
needs to log in to that first, or needs to do MFA, 45 seconds may not be
enough time. Additionally, when logging in to an account-level profile,
it is quite frustrating for users to need to re-enter account ID
information when that information is already stored in the user's
`.databrickscfg` file.

This PR tackles these two issues. First, the presentation of error
messages from `databricks auth token` is improved substantially by
converting the `error` into a human-readable message. When the refresh
token is invalid, it will present a command for the user to run to
reauthenticate. If the token fetching failed for some other reason, that
reason will be presented in a nice way, providing front-line debugging
steps and ultimately redirecting users to file a ticket at this repo if
they can't resolve the issue themselves. After this PR, the new error
message is:
```
Error: a new access token could not be retrieved because the refresh token is invalid. To reauthenticate, run `.databricks/databricks auth login --host https://adb-<WSID>.azuredatabricks.net`
```

To improve the login flow, this PR modifies `databricks auth login` to
auto-complete the account ID from the profile when present.
Additionally, it increases the login timeout from 45 seconds to 1 hour
to give the user sufficient time to login as needed.

To test this change, I needed to refactor some components of the CLI
around profile management, the token cache, and the API client used to
fetch OAuth tokens. These are now settable in the context, and a
demonstration of how they can be set and used is found in
`auth_test.go`.

Separately, this also demonstrates a sort-of integration test of the CLI
by executing the Cobra command for `databricks auth token` from tests,
which may be useful for testing other end-to-end functionality in the
CLI. In particular, I believe this is necessary in order to set flag
values (like the `--profile` flag in this case) for use in testing.

## Tests
Unit tests cover the unhappy and happy paths using the mocked API
client, token cache, and profiler.

Manually tested

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-05-16 10:22:09 +00:00
dependabot[bot] 216d2b058a
Bump github.com/databricks/databricks-sdk-go from 0.39.0 to 0.40.1 (#1431)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.39.0 to 0.40.1.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>0.40.1</h2>
<ul>
<li>Fixed codecov for repository (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/909">#909</a>).</li>
<li>Add traceparent header to enable distributed tracing. (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/914">#914</a>).</li>
<li>Log cancelled and failed requests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/919">#919</a>).</li>
</ul>
<p>Dependency updates:</p>
<ul>
<li>Bump golang.org/x/net from 0.22.0 to 0.24.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/884">#884</a>).</li>
<li>Bump golang.org/x/net from 0.17.0 to 0.23.0 in /examples/zerolog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/896">#896</a>).</li>
<li>Bump golang.org/x/net from 0.21.0 to 0.23.0 in /examples/slog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/897">#897</a>).</li>
</ul>
<h2>0.40.0</h2>
<ul>
<li>Allow unlimited timeouts in retries (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/904">#904</a>).
By setting RETRY_TIMEOUT_SECONDS to a negative value, WorkspaceClient
and AccountClient will retry retriable failures indefinitely. As a
reminder, without setting this parameter, the default retry timeout is 5
minutes.</li>
</ul>
<p>API Changes:</p>
<ul>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service. New request type is <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#CreateAppRequest">serving.CreateAppRequest</a>.</li>
<li>Changed <code>Create</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Removed <code>DeleteApp</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetApp</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetAppDeploymentStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetApps</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <code>GetEvents</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>CreateDeployment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Delete</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Get</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>GetDeployment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>GetEnvironment</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>List</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>ListDeployments</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Stop</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Added <code>Update</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppsAPI">w.Apps</a>
workspace-level service.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppEvents">serving.AppEvents</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppManifest">serving.AppManifest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppServiceStatus">serving.AppServiceStatus</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeleteAppResponse">serving.DeleteAppResponse</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeployAppRequest">serving.DeployAppRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeploymentStatus">serving.DeploymentStatus</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DeploymentStatusState">serving.DeploymentStatusState</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetAppDeploymentStatusRequest">serving.GetAppDeploymentStatusRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetAppResponse">serving.GetAppResponse</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetEventsRequest">serving.GetEventsRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ListAppEventsResponse">serving.ListAppEventsResponse</a>.</li>
<li>Changed <code>Apps</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ListAppsResponse">serving.ListAppsResponse</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppList">serving.AppList</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#App">serving.App</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeployment">serving.AppDeployment</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeploymentState">serving.AppDeploymentState</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AppDeploymentStatus">serving.AppDeploymentStatus</a>.</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="72334ef449"><code>72334ef</code></a>
Release v0.40.1 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/920">#920</a>)</li>
<li><a
href="63343c8c66"><code>63343c8</code></a>
Log cancelled and failed requests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/919">#919</a>)</li>
<li><a
href="27d08a67df"><code>27d08a6</code></a>
Add traceparent header to enable distributed tracing. (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/914">#914</a>)</li>
<li><a
href="b5322db1a7"><code>b5322db</code></a>
Bump golang.org/x/net from 0.21.0 to 0.23.0 in /examples/slog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/897">#897</a>)</li>
<li><a
href="5edb2dfc92"><code>5edb2df</code></a>
Bump golang.org/x/net from 0.17.0 to 0.23.0 in /examples/zerolog (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/896">#896</a>)</li>
<li><a
href="0e8dba0f70"><code>0e8dba0</code></a>
Bump golang.org/x/net from 0.22.0 to 0.24.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/884">#884</a>)</li>
<li><a
href="b6e76fc44b"><code>b6e76fc</code></a>
Fixed codecov for repository (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/909">#909</a>)</li>
<li><a
href="70aa8eea19"><code>70aa8ee</code></a>
Release v0.40.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/908">#908</a>)</li>
<li><a
href="dae1642c5d"><code>dae1642</code></a>
Add ToLibraryList() for ClusterStatusResponse (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/906">#906</a>)</li>
<li><a
href="c6516aa754"><code>c6516aa</code></a>
Bump version of mockery (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/907">#907</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.39.0...v0.40.1">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.39.0&new-version=0.40.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-05-16 09:04:58 +00:00
Ilia Babanov 2035516fde
Don't merge-in remote resources during deployments (#1432)
## Changes
`check_running_resources` now pulls the remote state without modifying
the bundle state, similar to how it did before. This avoids a problem
where we failed to compute deployment metadata for a deleted job
(which we shouldn't do in the first place).

`deploy_then_remove_resources_test` now also deploys and deletes a job
(in addition to a pipeline), which catches the error that this PR fixes.

## Tests
Unit and integ tests
2024-05-15 12:41:44 +00:00
Andrew Nester 0a21428a48
Upgrade to 1.43 terraform provider (#1429)
## Changes
Upgrade to 1.43 terraform provider
2024-05-14 12:19:34 +00:00
shreyas-goenka 63617253bd
Assert custom marshalling is implemented for resources (#1425)
## Changes
This PR ensures every resource implements a custom marshaller /
unmarshaller. This is required because we directly embed Go SDK structs,
which implement custom marshalling overrides. Since the struct is
embedded, the [custom marshalling
overrides](https://pkg.go.dev/encoding/json#example-package-CustomMarshalJSON)
are promoted to the top level. If the embedded struct itself is nil,
then JSON marshal / unmarshal will panic because it tries to call
`MarshalJSON` / `UnmarshalJSON` on a nil object.

Fixing this issue at the Go SDK level does not seem possible. Discussed
with @hectorcast-db.
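
A minimal reproduction of the failure mode, with stand-in types rather
than the actual SDK structs:

```go
package main

import "encoding/json"

// Inner stands in for an embedded Go SDK struct with a custom marshaller.
type Inner struct {
	Name string
}

// MarshalJSON dereferences its receiver, so it panics when called on nil.
func (i *Inner) MarshalJSON() ([]byte, error) {
	return json.Marshal(i.Name)
}

// Resource embeds the pointer, promoting MarshalJSON to the top level.
type Resource struct {
	*Inner
}

func main() {
	// Panics: the promoted MarshalJSON runs with a nil *Inner receiver.
	_, _ = json.Marshal(Resource{})
}
```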
2024-05-14 10:30:48 +00:00
shreyas-goenka 95bbe2ece1
Fix flaky tests for the parallel mutator (#1426)
## Changes
Around 0.5% to 1% of the time, the tests would fail due to concurrent
access to the underlying slice in the mutator. This PR makes the test
thread safe, preventing race conditions.

Example of failed run:
https://github.com/databricks/cli/actions/runs/9004657555/job/24738145829
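
A typical fix for this class of flake, as a sketch (hypothetical names;
the actual test differs):

```go
package main

import (
	"fmt"
	"sync"
)

// recorder collects values from concurrently running mutators.
// The mutex serializes appends; a bare slice would race.
type recorder struct {
	mu    sync.Mutex
	items []int
}

func (r *recorder) add(v int) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.items = append(r.items, v)
}

func main() {
	r := &recorder{}
	var wg sync.WaitGroup
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			r.add(i)
		}(i)
	}
	wg.Wait()
	fmt.Println(len(r.items)) // 100, with no data race under `go test -race`
}
```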
2024-05-13 12:16:43 +00:00
Andrew Nester a393c87ed9
Upgrade TF provider to 1.42.0 (#1418)
## Changes
Upgrade TF provider to 1.42.0

Also fixes #1258
2024-05-06 11:41:37 +00:00
shreyas-goenka 30215860e7
Fix description memoization in bundle schema (#1409)
## Changes
Fixes https://github.com/databricks/cli/issues/559

The CLI generation is now stable and does not produce a diff for the
`bundle_descriptions.json` file.

Before, a pointer to the schema was stored in the memo, which would be
mutated later to include the description. This led to duplicate
documentation for schema components that were used in multiple places.
This PR fixes this issue.

Eg: Before all references of `pause_status` would have the same
description.

## Tests
Added regression test.
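
The bug boils down to memoizing a pointer and mutating it afterwards.
A toy illustration (hypothetical schema type):

```go
package main

import "fmt"

// Schema is a stand-in for a JSON schema node.
type Schema struct {
	Description string
}

func main() {
	memo := map[string]*Schema{}

	s := &Schema{}
	memo["pause_status"] = s // buggy: the memo shares the pointer

	s.Description = "description for one call site"
	fmt.Println(memo["pause_status"].Description) // mutated too!

	// Fix: store a copy, so later mutation doesn't leak into the memo.
	cp := *s
	memo["pause_status"] = &cp
	s.Description = "another call site"
	fmt.Println(memo["pause_status"].Description) // unchanged
}
```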
2024-05-01 11:04:37 +00:00
shreyas-goenka 507053ee50
Annotate DLT pipelines when deployed using DABs (#1410)
## Changes
This PR annotates any pipelines that were deployed using DABs to have
`deployment.kind` set to "BUNDLE", mirroring the annotation for Jobs
(similar PR for jobs FYI: https://github.com/databricks/cli/pull/880).

Breakglass UI is not yet available for pipelines, so this annotation
will just be used for revenue attribution ATM.

Note: The API field has been deployed in all regions including GovCloud.

## Tests
Unit tests and manually.

Manually verified that the kind and metadata_file_path are being set by
DABs, and are returned by a GET API to a pipeline deployed using a DAB.
Example:
```
    "deployment": {
      "kind":"BUNDLE",
      "metadata_file_path":"/Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default/state/metadata.json"
    },
```
2024-05-01 08:37:03 +00:00
Ilia Babanov 153141d3ea
Don't fail while parsing outdated terraform state (#1404)
`terraform show -json` (`terraform.Show()`) fails if the state file
contains resources with fields that no longer conform to the provider
schemas.

This can happen when you deploy a bundle with one version of the CLI,
then update the CLI to a version that uses a different databricks
terraform provider, and try to run `bundle run` or `bundle summary`.
Those commands don't recreate local terraform state (only `terraform
apply` or `plan` do) and terraform itself fails while parsing it.
[Terraform
docs](https://developer.hashicorp.com/terraform/language/state#format)
point out that it's best to use `terraform show` after successful
`apply` or `plan`.

Here we parse the state ourselves. The state file format is internal to
terraform, but it's more stable than our resource schemas. We only parse
a subset of fields from the state, and only update ID and ModifiedStatus
of bundle resources in the `terraform.Load` mutator.
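
A sketch of parsing only the needed subset, assuming the public state
file layout (the field selection here is illustrative):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Only the fields we need; everything else in the state is ignored,
// so schema drift in resource attributes can't break parsing.
type state struct {
	Resources []struct {
		Type      string `json:"type"`
		Name      string `json:"name"`
		Instances []struct {
			Attributes struct {
				ID string `json:"id"`
			} `json:"attributes"`
		} `json:"instances"`
	} `json:"resources"`
}

func main() {
	raw := []byte(`{"resources":[{"type":"databricks_job","name":"my_job",
		"instances":[{"attributes":{"id":"123","unknown_field":true}}]}]}`)

	var s state
	if err := json.Unmarshal(raw, &s); err != nil {
		panic(err)
	}
	fmt.Println(s.Resources[0].Type, s.Resources[0].Instances[0].Attributes.ID)
}
```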
2024-05-01 08:22:35 +00:00
dependabot[bot] 781688c9cb
Bump github.com/databricks/databricks-sdk-go from 0.38.0 to 0.39.0 (#1405)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.38.0 to 0.39.0.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databricks/databricks-sdk-go/blob/main/CHANGELOG.md">github.com/databricks/databricks-sdk-go's
changelog</a>.</em></p>
<blockquote>
<h2>0.39.0</h2>
<ul>
<li>Ignored flaky integration tests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/894">#894</a>).</li>
<li>Added retries for &quot;worker env WorkerEnvId(workerenv-XXXXX) not
found&quot; (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/890">#890</a>).</li>
<li>Updated SDK to OpenAPI spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/pull/899">#899</a>).</li>
</ul>
<p>Note: This release contains breaking changes, please see the API
changes below for more details.</p>
<p>API Changes:</p>
<ul>
<li>Added <code>IngestionDefinition</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineSpec">pipelines.PipelineSpec</a>.</li>
<li>Added <code>Deployment</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline">pipelines.CreatePipeline</a>,
<a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline">pipelines.EditPipeline</a>
and <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineSpec">pipelines.PipelineSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatus">compute.ClusterStatus</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusResponse">compute.ClusterStatusResponse</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryInstallStatus">compute.LibraryInstallStatus</a>.</li>
<li>Added <code>WarehouseId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#NotebookTask">jobs.NotebookTask</a>.</li>
<li>Added <code>RunAs</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SubmitRun">jobs.SubmitRun</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DeploymentKind">pipelines.DeploymentKind</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#IngestionConfig">pipelines.IngestionConfig</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#ManagedIngestionPipelineDefinition">pipelines.ManagedIngestionPipelineDefinition</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#PipelineDeployment">pipelines.PipelineDeployment</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#SchemaSpec">pipelines.SchemaSpec</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#TableSpec">pipelines.TableSpec</a>.</li>
<li>Added <code>GetOpenApi</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI">w.ServingEndpoints</a>
workspace-level service.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetOpenApiRequest">serving.GetOpenApiRequest</a>.</li>
<li>Added <code>SchemaId</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#SchemaInfo">catalog.SchemaInfo</a>.</li>
<li>Added <code>Operation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Added <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultOperation">catalog.ValidationResultOperation</a>.</li>
<li>Added <code>Requirements</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#Library">compute.Library</a>.</li>
<li>Removed <code>AwsOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <code>AzureOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <code>GcpOperation</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResult">catalog.ValidationResult</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultAwsOperation">catalog.ValidationResultAwsOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultAzureOperation">catalog.ValidationResultAzureOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/catalog#ValidationResultGcpOperation">catalog.ValidationResultGcpOperation</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusRequest">compute.ClusterStatusRequest</a>.</li>
<li>Removed <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryFullStatusStatus">compute.LibraryFullStatusStatus</a>.</li>
<li>Changed <code>ClusterStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibrariesAPI">w.Libraries</a>
workspace-level service. New request type is <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatus">compute.ClusterStatus</a>.</li>
<li>Changed <code>ClusterStatus</code> method for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibrariesAPI">w.Libraries</a>
workspace-level service to return <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#ClusterStatusResponse">compute.ClusterStatusResponse</a>.</li>
<li>Changed <code>Status</code> field for <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryFullStatus">compute.LibraryFullStatus</a>
to <a
href="https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#LibraryInstallStatus">compute.LibraryInstallStatus</a>.</li>
</ul>
<p>OpenAPI SHA: 21f9f1482f9d0d15228da59f2cd9f0863d2a6d55, Date:
2024-04-23</p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7672dece38"><code>7672dec</code></a>
Release v0.39.0 (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/901">#901</a>)</li>
<li><a
href="2f56ab8431"><code>2f56ab8</code></a>
Update SDK to OpenAPI spec (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/899">#899</a>)</li>
<li><a
href="fa3a5d24eb"><code>fa3a5d2</code></a>
Add retries for &quot;worker env WorkerEnvId(workerenv-XXXXX) not
found&quot; (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/890">#890</a>)</li>
<li><a
href="219975c53f"><code>219975c</code></a>
Ignore flaky integration tests (<a
href="https://redirect.github.com/databricks/databricks-sdk-go/issues/894">#894</a>)</li>
<li>See full diff in <a
href="https://github.com/databricks/databricks-sdk-go/compare/v0.38.0...v0.39.0">compare
view</a></li>
</ul>
</details>
<br />

<details>
<summary>Most Recent Ignore Conditions Applied to This Pull
Request</summary>

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |
</details>


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.38.0&new-version=0.39.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-04-30 14:41:24 +00:00
shreyas-goenka d949f2b4f2
Fix bundle schema for variables (#1396)
## Changes
This PR fixes the variable schema to:
1. Allow non-string values in the "default" value of a variable.
2. Allow non-string overrides in a target for a variable. 

## Tests
Manually. There are no longer squiggly lines. 

Before:
<img width="329" alt="Screenshot 2024-04-24 at 3 26 43 PM"
src="https://github.com/databricks/cli/assets/88374338/43be02c2-80a4-4f80-bd79-0f3e1e93ee17">


After:
<img width="361" alt="Screenshot 2024-04-24 at 3 26 10 PM"
src="https://github.com/databricks/cli/assets/88374338/2c1fb892-a2a2-478b-8d2e-9bda6d844b54">
2024-04-25 11:23:50 +00:00
shreyas-goenka e652333103
Fix variable overrides in targets for non-string variables (#1397)
Previously, variable overrides in a target that were not strings did not
work. This PR fixes that. Tested manually and via a unit test.
2024-04-25 11:21:10 +00:00
shreyas-goenka 6fd581d173
Allow variable references in non-string fields in the JSON schema (#1398)
## Tests
Verified manually.

Before:
<img width="373" alt="Screenshot 2024-04-24 at 7 18 44 PM"
src="https://github.com/databricks/cli/assets/88374338/b4aef51f-0c16-4589-9d47-cdec9ab91158">

After:
<img width="364" alt="Screenshot 2024-04-24 at 7 18 31 PM"
src="https://github.com/databricks/cli/assets/88374338/3d8e412e-77ee-4641-943d-f99eab26ba02">
<img width="356" alt="Screenshot 2024-04-24 at 7 16 54 PM"
src="https://github.com/databricks/cli/assets/88374338/2aed369a-3c6a-4754-9c76-0969423f319e">

Manually verified the schema diff is sane. Example:
```
<                               "type": "boolean",
<                               "description": "If inference tables are enabled or not. NOTE: If you have already disabled payload logging once, you cannot enable again."
---
>                               "description": "If inference tables are enabled or not. NOTE: If you have already disabled payload logging once, you cannot enable again.",
>                               "anyOf": [
>                                 {
>                                   "type": "boolean"
>                                 },
>                                 {
>                                   "type": "string",
>                                   "pattern": "\\$\\{([a-zA-Z]+([-_]?[a-zA-Z0-9]+)*(\\.[a-zA-Z]+([-_]?[a-zA-Z0-9]+)*)*)\\}"
>                                 }
>                               ]
```
2024-04-25 11:20:45 +00:00
Lennart Kats (databricks) 60122f6035
Show a better error message for using wheel tasks with older DBR versions (#1373)
## Changes

This is a minor improvement to the error about wheel tasks with older
DBR versions, since we get questions about it every now and then. It
also adds a pointer to the docs that were added since the original
message was committed.

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-04-23 19:36:25 +00:00
shreyas-goenka 1d9bf4b2c4
Add legacy option for `run_as` (#1384)
## Changes
This PR partially reverts the changes in
https://github.com/databricks/cli/pull/1233 and puts the old code under
an "experimental.use_legacy_run_as" configuration. This gives customers
who ran into the breaking change made in the PR a way out.
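
Opting into the legacy mode looks like this (a minimal sketch; the user
name is a placeholder):

```yaml
experimental:
  use_legacy_run_as: true

run_as:
  # pipelines are deployed with this identity as their owner
  user_name: user@example.com
```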


## Tests
Both manually and via unit tests.

Manually verified that run_as now works for pipelines. To use this
feature, the user deploying the bundle needs to be both a Metastore
admin and a workspace admin.

---------

Error when the deploying user is a workspace admin but not a metastore
admin:
```
Error: terraform apply: exit status 1

Error: cannot update permissions: User is not a metastore admin for Metastore 'deco-uc-prod-aws-us-east-1'.

  with databricks_permissions.pipeline_foo,
  on bundle.tf.json line 23, in resource.databricks_permissions.pipeline_foo:
  23:       }
```

--------

Output of bundle validate:
```
➜  bundle-playground git:(master) ✗ cli bundle validate
Warning: You are using the legacy mode of run_as. The support for this mode is experimental and might be removed in a future release of the CLI. In order to run the DLT pipelines in your DAB as the run_as user this mode changes the owners of the pipelines to the run_as identity, which requires the user deploying the bundle to be a workspace admin, and also a Metastore admin if the pipeline target is in UC.
  at experimental.use_legacy_run_as
  in databricks.yml:13:22

Name: bundle-playground
Target: default
Workspace:
  Host: https://dbc-a39a1eb1-ef95.cloud.databricks.com
  User: shreyas.goenka@databricks.com
  Path: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default

Found 1 warning
```
2024-04-22 11:51:41 +00:00
Pieter Noordhuis 3108883a8f
Processing and completion of positional args to bundle run (#1120)
## Changes

With this change, both job parameters and task parameters can be
specified as positional arguments to bundle run. How the positional
arguments are interpreted depends on the configuration of the job.

### Examples:

For a job that has job parameters configured a user can specify:

```
databricks bundle run my_job -- --param1=value1 --param2=value2
```

And the run is kicked off with job parameters set to:
```json
{
  "param1": "value1",
  "param2": "value2"
}
```

Similarly, for a job that doesn't use job parameters and only has
`notebook_task` tasks, a user can specify:

```
databricks bundle run my_notebook_job -- --param1=value1 --param2=value2
```

And the run is kicked off with task level `notebook_params` configured
as:
```json
{
  "param1": "value1",
  "param2": "value2"
}
```

For a job that doesn't use job parameters and only has either
`spark_python_task` or `python_wheel_task` tasks, a user can specify:

```
databricks bundle run my_python_file_job -- --flag=value other arguments
```

And the run is kicked off with task level `python_params` configured as:
```json
[
  "--flag=value",
  "other",
  "arguments"
]
```

The same is applied to jobs with only `spark_jar_task` or
`spark_submit_task` tasks.

## Tests

Unit tests. Tested the completions manually.
2024-04-22 11:50:13 +00:00
Andrew Nester 1872aa12b3
Added support for job environments (#1379)
## Changes
The main changes are:
1. Don't link artifacts to libraries anymore; instead, iterate over all
jobs and tasks when uploading artifacts and update local paths to their
remote counterparts
2. Iterate over `jobs.environments` to check for local libraries and
verify that they exist locally (see the sketch below)
3. Add tests to check that environments are handled correctly
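
For example, an environment that references a local wheel (a sketch; keys
follow the jobs API `environments` shape, and the paths and names are
illustrative) is now detected and its library uploaded:

```yaml
resources:
  jobs:
    my_job:
      environments:
        - environment_key: default
          spec:
            client: "1"
            dependencies:
              # local path, rewritten to the remote location after upload
              - ./dist/my_package-0.1.0-py3-none-any.whl
      tasks:
        - task_key: main
          environment_key: default
          python_wheel_task:
            package_name: my_package
            entry_point: main
```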

An end-to-end test will follow.

## Tests
Added regression test, existing tests (including integration one) pass
2024-04-22 11:44:34 +00:00
Lennart Kats (databricks) 000a7fef8c
Enable job queueing by default (#1385)
## Changes

This enables queueing for jobs by default, following the behavior of
API 2.2+. Queueing is a best practice and will be the default in API 2.2.
Since we're still using API 2.1, which has queueing disabled by default,
this PR enables queueing using a mutator.

Customers can manually turn off queueing for any job by adding the
following to their job spec:

```
queue:
  enabled: false
```
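
Conversely, jobs that don't set `queue` at all are deployed as if they
included the following (a sketch of the default the mutator applies):

```yaml
queue:
  enabled: true
```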

## Tests

Unit tests, and manual confirmation of the property after deployment.

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-04-22 10:36:39 +00:00
Pieter Noordhuis cd675ded9a
Update `testutil` helpers to return path (#1383)
## Changes

I spotted a few call sites where the path of a test file was synthesized
multiple times. It is easier to capture the path as a variable and reuse
it.
2024-04-19 15:05:36 +00:00
shreyas-goenka 6ca57a7e68
Add docs URL for `run_as` in error message (#1381) 2024-04-19 14:09:33 +00:00
shreyas-goenka e008c2bd8c
Cleanup remote file path on bundle destroy (#1374)
## Changes
The sync struct initialization would recreate the deleted `file_path`.
This PR avoids initializing the sync object when deleting the
snapshot, thus fixing the lingering `file_path` after `bundle destroy`.

## Tests
Manually, and an integration test to prevent regression.
2024-04-19 11:48:04 +00:00
shreyas-goenka 3c14204e98
Followup improvements to the Docker setup script (#1369)
## Changes
This PR:
1. Uses bash to run the setup.sh script instead of the native BusyBox sh
shipped with Alpine.
2. Verifies the checksums of the installed Terraform CLI binaries.
 
## Tests
Manually. The Docker image builds successfully.

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-04-18 20:52:11 +00:00