Commit Graph

307 Commits

Author SHA1 Message Date
Pieter Noordhuis db84a707cd
Fix bundle documentation URL (#1399)
Closes #1395.
2024-04-25 11:25:26 +00:00
shreyas-goenka d949f2b4f2
Fix bundle schema for variables (#1396)
## Changes
This PR fixes the variable schema to:
1. Allow non-string values in the "default" value of a variable.
2. Allow non-string overrides in a target for a variable. 

## Tests
Manually. There are no longer squiggly lines. 

Before:
<img width="329" alt="Screenshot 2024-04-24 at 3 26 43 PM"
src="https://github.com/databricks/cli/assets/88374338/43be02c2-80a4-4f80-bd79-0f3e1e93ee17">


After:
<img width="361" alt="Screenshot 2024-04-24 at 3 26 10 PM"
src="https://github.com/databricks/cli/assets/88374338/2c1fb892-a2a2-478b-8d2e-9bda6d844b54">
2024-04-25 11:23:50 +00:00
Jim Idle 4c71f8cac4
Ensure that python dependencies are installed during upgrade (#1390)
## Changes
The installer.Upgrade() processing did not install Python dependencies.
This resulted in errors such as:

```
ModuleNotFoundError: No module named 'databricks.labs.blueprint'
```

Any new dependencies are now installed during the upgrade process.
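
A hypothetical sketch of the fix (the installer's actual code differs; the `python3` binary and `requirements.txt` path here are illustrative): the upgrade path now reinstalls the project's Python dependencies instead of only replacing the code.

```go
package installer

import (
	"context"
	"os/exec"
)

// installPythonDependencies shells out to pip to install the project's
// dependency list. Illustrative only; the binary and file path are assumptions.
func installPythonDependencies(ctx context.Context, pythonPath, requirementsFile string) error {
	cmd := exec.CommandContext(ctx, pythonPath, "-m", "pip", "install", "-r", requirementsFile)
	return cmd.Run()
}

// upgrade sketches the fixed flow: after fetching the new release, install
// its dependencies so modules like databricks.labs.blueprint are importable.
func upgrade(ctx context.Context) error {
	// ... download and unpack the new release ...
	return installPythonDependencies(ctx, "python3", "requirements.txt")
}
```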

Resolves: databrickslabs/ucx#1276

## Tests

The TestUpgraderWorksForReleases test now checks to see if the upgrade
process resulted in the dependencies being installed.

---------

Signed-off-by: Jim.Idle <jimi@idle.ws>
2024-04-24 17:34:09 +00:00
Kartik Gupta 1c02224902
Pass `DATABRICKS_CONFIG_FILE` env var to sdk config during `auth profiles` (#1394)
## Changes
* Currently, when we run the `auth profiles` command with the
`DATABRICKS_CONFIG_FILE` env var set, the file pointed to by the env var
is ONLY used for loading the profile names (ini file sections). It is
not passed to the Go SDK config object. We also don't use the env variable
loader in the Go SDK config object, so this env var is ignored by the
config and only the default file is read.
* This PR explicitly sets the config file path in the Go SDK config
object (see the sketch below).
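
A minimal sketch of the change, assuming the Go SDK's `config.Config` type and its `ConfigFile` field; the function name is illustrative.

```go
package auth

import (
	"os"

	"github.com/databricks/databricks-sdk-go/config"
)

// newProfileConfig builds an SDK config for a profile and propagates
// DATABRICKS_CONFIG_FILE so the SDK reads the same file that was used
// to enumerate the profile names.
func newProfileConfig(profile string) *config.Config {
	return &config.Config{
		Profile:    profile,
		ConfigFile: os.Getenv("DATABRICKS_CONFIG_FILE"),
	}
}
```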

## Tests
* integration tests in vscode
2024-04-24 09:18:13 +00:00
Pieter Noordhuis 3108883a8f
Processing and completion of positional args to bundle run (#1120)
## Changes

With this change, both job parameters and task parameters can be
specified as positional arguments to bundle run. How the positional
arguments are interpreted depends on the configuration of the job.

### Examples:

For a job that has job parameters configured a user can specify:

```
databricks bundle run my_job -- --param1=value1 --param2=value2
```

And the run is kicked off with job parameters set to:
```json
{
  "param1": "value1",
  "param2": "value2"
}
```

Similarly, for a job that doesn't use job parameters and only has
`notebook_task` tasks, a user can specify:

```
databricks bundle run my_notebook_job -- --param1=value1 --param2=value2
```

And the run is kicked off with task level `notebook_params` configured
as:
```json
{
  "param1": "value1",
  "param2": "value2"
}
```

For a job that doesn't use job parameters and only has either
`spark_python_task` or `python_wheel_task` tasks, a user can specify:

```
databricks bundle run my_python_file_job -- --flag=value other arguments
```

And the run is kicked off with task level `python_params` configured as:
```json
[
  "--flag=value",
  "other",
  "arguments"
]
```

The same is applied to jobs with only `spark_jar_task` or
`spark_submit_task` tasks.

## Tests

Unit tests. Tested the completions manually.
2024-04-22 11:50:13 +00:00
shreyas-goenka 331313ea5f
Print host in `bundle validate` when passed via profile or environment variables (#1378)
## Changes
Fixed `bundle validate` to get the host from the workspace client rather
than only printing the host when it's configured in the bundle config.

## Tests
Manually. When a profile was specified for auth.

Before:
```
➜  bundle-playground git:(master) ✗ cli bundle validate
Name: bundle-playground
Target: default
Workspace:
  Host: 
  User: shreyas.goenka@databricks.com
  Path: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default
```


After:
```
➜  bundle-playground git:(master) ✗ cli bundle validate
Name: bundle-playground
Target: default
Workspace:
  Host: https://e2-dogfood.staging.cloud.databricks.com
  User: shreyas.goenka@databricks.com
  Path: /Users/shreyas.goenka@databricks.com/.bundle/bundle-playground/default
```
2024-04-19 11:43:50 +00:00
Andrew Nester 27f51c760f
Added validate mutator to surface additional bundle warnings (#1352)
## Changes
All these validators will return warnings as part of a `bundle validate`
run.

Added 2 mutators:
1. To check that if tasks use `job_cluster_key` it is actually defined
2. To check if there are any files to sync as part of deployment

Also added `bundle.Parallel` to run them in parallel.

To make sure mutators under `bundle.Parallel` do not mutate the config, we
introduced the new `ReadOnlyMutator`, `ReadOnlyBundle` and `ReadOnlyConfig`
types, sketched below.
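
A rough, self-contained sketch of the shape of these types (the real definitions in the `bundle` package differ in detail): read-only mutators receive a wrapper that exposes only getters, which makes running them concurrently safe.

```go
package bundle

import (
	"context"
	"sync"
)

type Diagnostics []string

// ReadOnlyBundle exposes only read access to the underlying config.
type ReadOnlyBundle struct{ /* getters over ReadOnlyConfig */ }

type ReadOnlyMutator interface {
	Name() string
	Apply(ctx context.Context, rb ReadOnlyBundle) Diagnostics
}

// Parallel applies all mutators concurrently and merges their diagnostics.
// This is safe because no mutator can modify the shared config.
func Parallel(ctx context.Context, rb ReadOnlyBundle, ms ...ReadOnlyMutator) Diagnostics {
	var (
		mu  sync.Mutex
		wg  sync.WaitGroup
		all Diagnostics
	)
	for _, m := range ms {
		wg.Add(1)
		go func(m ReadOnlyMutator) {
			defer wg.Done()
			diags := m.Apply(ctx, rb)
			mu.Lock()
			all = append(all, diags...)
			mu.Unlock()
		}(m)
	}
	wg.Wait()
	return all
}
```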

Example 

```
databricks bundle validate -p deco-staging
Warning: unknown field: new_cluster
  at resources.jobs.my_job
  in bundle.yml:24:7

Warning: job_cluster_key high_cpu_workload_job_cluster is not defined
  at resources.jobs.my_job.tasks[0].job_cluster_key
  in bundle.yml:35:28

Warning: There are no files to sync, please check your .gitignore and sync.exclude configuration
  at sync.exclude
  in bundle.yml:18:5

Name: test
Target: default
Workspace:
  Host: https://acme.databricks.com
  User: andrew.nester@databricks.com
  Path: /Users/andrew.nester@databricks.com/.bundle/test/default

Found 3 warnings
```

## Tests
Added unit tests
2024-04-18 15:13:16 +00:00
shreyas-goenka eb9665d2ee
Add better documentation for the `auth login` command (#1366)
This PR improves the documentation for the `auth login` command,
accounting for the various ways this command can be used.

---------

Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-04-18 11:55:42 +00:00
dependabot[bot] c949655f9f
Bump github.com/databricks/databricks-sdk-go from 0.37.0 to 0.38.0 (#1361)
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.37.0&new-version=0.38.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-04-16 12:03:21 +00:00
shreyas-goenka b71f853649
Do not prefill https:// in prompt for Databricks Host (#1364)
## Changes
This PR is a minor UX improvement. By not prefilling the https://
prefix in the Databricks Host prompt, we allow users to copy-paste
directly from their browser.

UX:
```
➜  cli git:(fix/copy-host) cli auth login
Databricks Profile Name: my-profile
Databricks Host (e.g. https://<databricks-instance>.cloud.databricks.com): https://foobar.cloud.databricks.com
Profile my-profile was successfully saved
```

## Tests
Manually.
2024-04-15 17:31:00 +00:00
shreyas-goenka 1f1fe4c6a8
Add URLs for authentication documentation to the auth command help (#1365)
```
➜  cli git:(fix/better-auth-docs) ✗ cli auth -h
Authentication related commands. For more information regarding how
authentication for the Databricks CLI and SDKs work please refer to the documentation
linked below.

AWS: https://docs.databricks.com/en/dev-tools/auth/index.html
Azure: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth
GCP: https://docs.gcp.databricks.com/en/dev-tools/auth/index.html
```
2024-04-15 16:43:46 +00:00
Andrew Nester 60a4a347f9
Fixed typo in error template for auth describe (#1341)
## Changes
Fixed typo in error template for auth describe

## Tests
Manually + added integration test
2024-04-08 11:19:13 +00:00
Ilia Babanov 338fe1fe62
Don't attempt auth in `auth profiles --skip-validate` (#1282)
This makes the command almost instant, no matter how many profiles the
config file has. One downside is that we don't set AuthType for profiles
that don't have it defined.

We can technically infer AuthType based on ConfigAttributes tags, but
their names are different from the names of actual auth providers (and
some tags cover multiple providers at the same time).
2024-04-05 10:19:54 +00:00
Andrew Nester 5a7405e606
Fixed message for successful auth describe run (#1336)
## Changes
Fixed message for successful auth describe run
2024-04-03 15:47:45 +00:00
Pieter Noordhuis 04cbc7171e
Make bundle validation print text output by default (#1335)
## Changes

It now shows human-readable warnings and validation status.

## Tests

* Manual tests against many examples.
* Errors still return immediately.
2024-04-03 15:33:43 +00:00
dependabot[bot] f28a9d7107
Bump github.com/databricks/databricks-sdk-go from 0.36.0 to 0.37.0 (#1326)
[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.36.0&new-version=0.37.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-04-03 10:39:53 +00:00
Andrew Nester 8c144a2de4
Added `auth describe` command (#1244)
## Changes
This command provides details on the auth configuration the user is using,
as well as the authenticated user and the auth mechanism used.

Relies on https://github.com/databricks/databricks-sdk-go/pull/838
(tests will fail until merged)

Examples of output

```
Workspace: https://test.com
User: andrew.nester@databricks.com
Authenticated with: pat
-----
Configuration:
  ✓ auth_type: pat
  ✓ host: https://test.com (from bundle)
  ✓ profile: DEFAULT (from --profile flag)
  ✓ token: ******** (from /Users/andrew.nester/.databrickscfg config file)
```

```
DATABRICKS_AUTH_TYPE=azure-msi databricks auth describe -p "Azure 2"
Unable to authenticate: inner token: Post "https://foobar.com/oauth2/token": AADSTS900023: Specified tenant identifier foobar_aaaaaaa' is neither a valid DNS name, nor a valid external domain. See https://login.microsoftonline.com/error?code=900023
-----
Configuration:
  ✓ auth_type: azure-msi (from DATABRICKS_AUTH_TYPE environment variable)
  ✓ azure_client_id: 8470f3ba-aaaa-bbbb-cccc-xxxxyyyyzzzz (from /Users/andrew.nester/.databrickscfg config file)
  ~ azure_client_secret: ******** (from /Users/andrew.nester/.databrickscfg config file, not used for auth type azure-msi)
  ~ azure_tenant_id: foobar_aaaaaaa (from /Users/andrew.nester/.databrickscfg config file, not used for auth type azure-msi)
  ✓ azure_use_msi: true (from /Users/andrew.nester/.databrickscfg config file)
  ✓ host: https://foobar.com (from /Users/andrew.nester/.databrickscfg config file)
  ✓ profile: Azure 2 (from --profile flag)
```

For account

```
Unable to authenticate: default auth: databricks-cli: cannot get access token: Error: token refresh: Post "https://xxxxxxx.com/v1/token": http 400: {"error":"invalid_request","error_description":"Refresh token is invalid"}
. Config: host=https://xxxxxxx.com, account_id=ed0ca3c5-fae5-4619-bb38-eebe04a4af4b, profile=ACCOUNT-ed0ca3c5-fae5-4619-bb38-eebe04a4af4b
-----
Configuration:
  ✓ account_id: ed0ca3c5-fae5-4619-bb38-eebe04a4af4b (from /Users/andrew.nester/.databrickscfg config file)
  ✓ auth_type: databricks-cli (from /Users/andrew.nester/.databrickscfg config file)
  ✓ host: https://xxxxxxxxx.com (from /Users/andrew.nester/.databrickscfg config file)
  ✓ profile: ACCOUNT-ed0ca3c5-fae5-4619-bb38-eebe04a4af4b
```

## Tests
Added unit tests

---------

Co-authored-by: Julia Crawford (Databricks) <julia.crawford@databricks.com>
2024-04-03 08:14:04 +00:00
Ilia Babanov 079c416f8d
Add `bundle debug terraform` command (#1294)
- Add `bundle debug terraform` command. It prints the versions of
Terraform and the Databricks Terraform provider. In text mode it
also explains how to set up the CLI in environments with restricted
internet access.
- Use the `DATABRICKS_TF_EXEC_PATH` env var to point the Databricks CLI to
the Terraform binary. The CLI only uses it if `DATABRICKS_TF_VERSION`
matches the currently used Terraform version.
- Use the `DATABRICKS_TF_CLI_CONFIG_FILE` env var to point to a Terraform CLI
config that points to the filesystem mirror for the Databricks provider.
The CLI only uses it if `DATABRICKS_TF_PROVIDER_VERSION` matches the
currently used provider version.


Relevant PR on the VSCode extension side:
https://github.com/databricks/databricks-vscode/pull/1147

Example output of the `databricks bundle debug terraform`:
```
Terraform version: 1.5.5
Terraform URL: https://releases.hashicorp.com/terraform/1.5.5

Databricks Terraform Provider version: 1.38.0
Databricks Terraform Provider URL: https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.38.0

Databricks CLI downloads its Terraform dependencies automatically.

If you run the CLI in an air-gapped environment, you can download the dependencies manually and set these environment variables:

  DATABRICKS_TF_VERSION=1.5.5
  DATABRICKS_TF_EXEC_PATH=/path/to/terraform/binary
  DATABRICKS_TF_PROVIDER_VERSION=1.38.0
  DATABRICKS_TF_CLI_CONFIG_FILE=/path/to/terraform/cli/config.tfrc

Here is an example *.tfrc configuration file:

  disable_checkpoint = true
  provider_installation {
    filesystem_mirror {
      path = "/path/to/a/folder/with/databricks/terraform/provider"
    }
  }

The filesystem mirror path should point to the folder with the Databricks Terraform Provider. The folder should have this structure: /registry.terraform.io/databricks/databricks/terraform-provider-databricks_1.38.0_ARCH.zip

For more information about filesystem mirrors, see the Terraform documentation: https://developer.hashicorp.com/terraform/cli/config/config-file#filesystem_mirror
```

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2024-04-02 12:56:27 +00:00
Pieter Noordhuis eea34b2504
Return diagnostics from `config.Load` (#1324)
## Changes

We no longer need to store load diagnostics on the `config.Root` type
itself and instead can return them from the `config.Load` call directly.
It is up to the caller of this function to append them to previous
diagnostics, if any.
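
A schematic of the new flow (a simplified sketch; the real types live in `bundle/config` and `libs/diag`): diagnostics come back from `Load`, and the caller appends them to any gathered earlier.

```go
package config

type Root struct{ /* ... */ }

type Diagnostics []string

// Load parses the configuration file at path. Any diagnostics gathered
// while loading are returned instead of being stored on Root.
func Load(path string) (*Root, Diagnostics) {
	var diags Diagnostics
	// ... parse the file, appending warnings/errors to diags ...
	return &Root{}, diags
}

// Caller side: accumulate onto previously gathered diagnostics.
func loadAndAppend(path string, prev Diagnostics) (*Root, Diagnostics) {
	root, diags := Load(path)
	return root, append(prev, diags...)
}
```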

Background: previous commits moved configuration loading of the entry
point into a mutator, so now all diagnostics naturally flow from
applying mutators.

This PR depends on #1319.

## Tests

Unit and manual validation of the debug statements in the validate
command.
2024-03-28 10:59:03 +00:00
Pieter Noordhuis b21e3c81cd
Make bundle loaders return diagnostics (#1319)
## Changes

The function signature of Cobra's `PreRunE` function has an `error`
return value. We'd like to start returning `diag.Diagnostics` after
loading a bundle, so this is incompatible. This change replaces every use
of `PreRunE` for bundle loading with an inline function call in the
command's `RunE` function.

## Tests

* Unit tests pass.
* Integration tests pass.
2024-03-28 10:32:34 +00:00
Pieter Noordhuis ca534d596b
Load bundle configuration from mutator (#1318)
## Changes

Prior to this change, the bundle configuration entry point was loaded
from the function `bundle.Load`. Other configuration files were only
loaded once the caller applied the first set of mutators. This
separation was unnecessary and not ideal in light of gathering
diagnostics while loading _any_ configuration file, not just the ones
from the includes.

This change:
* Updates `bundle.Load` to only verify that the specified path is a
valid bundle root.
* Moves mutators that perform loading to `bundle/config/loader`.
* Adds a "load" phase that takes the place of applying
`DefaultMutators`.

Follow ups:
* Rename `bundle.Load` -> `bundle.Find` (because it no longer performs
loading)

This change depends on #1316 and #1317.

## Tests

Tests pass.
2024-03-27 10:49:05 +00:00
Pieter Noordhuis 00d76d5afa
Move path field to bundle type (#1316)
## Changes

The bundle path was previously stored on the `config.Root` type under
the assumption that the first configuration file being loaded would set
it. This is slightly counterintuitive and we know what the path is upon
construction of the bundle. The new location for this property reflects
this.

## Tests

Unit tests pass.
2024-03-27 09:03:24 +00:00
Pieter Noordhuis ed194668db
Return `diag.Diagnostics` from mutators (#1305)
## Changes

This diagnostics type allows us to capture multiple warnings as well as
errors in the return value. This is a preparation for returning
additional warnings from mutators in case we detect non-fatal problems.

* All return statements that previously returned an error now return
`diag.FromErr`
* All return statements that previously returned `fmt.Errorf` now return
`diag.Errorf`
* All `err != nil` checks now use `diags.HasError()` or `diags.Error()`
(see the sketch below)
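
A minimal sketch of the convention, assuming the `diag.FromErr` and `diag.Errorf` helpers from `libs/diag` named above; the mutator body itself is illustrative.

```go
package mutator

import (
	"fmt"

	"github.com/databricks/cli/libs/diag"
)

func checkReserved(name string) error {
	if name == "default" {
		return fmt.Errorf("%q is a reserved name", name)
	}
	return nil
}

// validateName shows the rewritten return statements.
func validateName(name string) diag.Diagnostics {
	if name == "" {
		// Previously: return fmt.Errorf("bundle name is not set")
		return diag.Errorf("bundle name is not set")
	}
	if err := checkReserved(name); err != nil {
		// Previously: return err
		return diag.FromErr(err)
	}
	return nil
}
```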

## Tests

* Existing tests pass.
* I confirmed no call site under `./bundle` or `./cmd/bundle` uses
`errors.Is` on the return value from mutators. This is relevant because
we cannot wrap errors with `%w` when calling `diag.Errorf` (like
`fmt.Errorf`; context in https://github.com/golang/go/issues/47641).
2024-03-25 14:18:47 +00:00
Pieter Noordhuis fd8dbff631
Update Go SDK to v0.36.0 (#1304)
## Changes

SDK release:
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.36.0

No notable differences other than a few type name changes.

## Tests

Tests pass.
2024-03-22 13:15:54 +00:00
Pieter Noordhuis 0ef93c2502
Update Go SDK to v0.35.0 (#1300)
## Changes

SDK release:
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.35.0

## Tests

Tests pass.
2024-03-20 13:57:53 +00:00
Andrew Nester 1b0ac61093
Added deployment state for bundles (#1267)
## Changes
This PR introduces a new structure (and a file) that is used locally and
synced remotely to the Databricks workspace to track bundle deployment
related metadata.

The state is pulled from remote, updated and pushed back remotely as
part of `bundle deploy` command.

This state can be used for deployment sequencing, as its `Version` field
is monotonically increasing on each deployment.

Currently, it only tracks files being synced as part of the deployment.
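
A hypothetical sketch of the state's shape (the actual struct and its serialization may differ): a monotonically increasing version plus the list of synced files.

```go
package deploy

// File is one entry in the list of synced files. The field name is
// illustrative, not the actual serialization format.
type File struct {
	LocalPath string `json:"local_path"`
}

// DeploymentState is pulled from the workspace, updated, and pushed back
// as part of `bundle deploy`.
type DeploymentState struct {
	// Version increases monotonically on each deployment, enabling
	// deployment sequencing.
	Version int64 `json:"version"`
	// Files tracks what was synced as part of the deployment.
	Files []File `json:"files"`
}
```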

This helps fix the issue with files not being removed during deployments
on CI/CD, as the sync snapshot was never present there.

Fixes #943 

## Tests
Added E2E (regression) test for files removal on CI/CD

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-03-18 14:41:58 +00:00
Andrew Nester c7818560ca
Add usage string when command fails with incorrect arguments (#1276)
## Changes
Add usage string when command fails with incorrect arguments

Fixes #1119

## Tests
Example output

```
> databricks libraries cluster-status 
Error: accepts 1 arg(s), received 0

Usage:
  databricks libraries cluster-status CLUSTER_ID [flags]

Flags:
  -h, --help   help for cluster-status

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
```
2024-03-12 14:12:34 +00:00
Pieter Noordhuis 74b1e05ed7
Update Go SDK to v0.34.0 (#1256)
## Changes

SDK release
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.34.0

This incorporates two changes to the generation code:
* Use explicit empty check for response types (see
https://github.com/databricks/databricks-sdk-go/pull/831)
* Support subservices for the settings commands (see
https://github.com/databricks/databricks-sdk-go/pull/826)

As part of the subservices support, this change also updates how methods
are registered with their services. This used to be done with `init`
functions and now through inline function calls. This should have a
(negligible) positive impact on binary start time because we no longer
have to call as many `init` functions.

## Tests

tbd
2024-03-06 09:53:44 +00:00
Pieter Noordhuis e1407038d3
Configure cobra.NoArgs for bundle commands where applicable (#1250)
## Changes

Return an error if unused arguments are passed to these commands.
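
A minimal sketch of the change using Cobra's built-in positional-argument validator; the command shown is illustrative.

```go
package main

import "github.com/spf13/cobra"

func newSchemaCommand() *cobra.Command {
	return &cobra.Command{
		Use:  "schema",
		Args: cobra.NoArgs, // error out if any positional arguments are passed
		RunE: func(cmd *cobra.Command, args []string) error {
			// ... command implementation ...
			return nil
		},
	}
}
```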

## Tests

n/a
2024-03-01 15:50:20 +00:00
Ilia Babanov d12f88e24d
Fix summary command when internal terraform config doesn't exist (#1242)
Create `bundle.tf.json` if it doesn't exist before executing
`terraform init` (inside `terraform.Load`).

Fixes a problem where `terraform.Load` fails with:
```
Error: Failed to load plugin schemas

Error while loading schemas for plugin components: Failed to obtain provider
schema: Could not load the schema for provider
registry.terraform.io/databricks/databricks: failed to instantiate provider
"registry.terraform.io/databricks/databricks" to obtain schema: unavailable
provider "registry.terraform.io/databricks/databricks"..
```
2024-03-01 08:25:12 +00:00
Andrew Nester 1b4a774609
Only set ComputeID value when `--compute-id` flag provided (#1229)
## Changes
Fixes an issue where `compute_id` defined in the bundle config was
correctly replaced by the `validate` command but not used by the `deploy`
command.

## Tests
Manually
2024-02-22 15:14:06 +00:00
Miles Yucht b65ce75c1f
Use Go SDK Iterators when listing resources with the CLI (#1202)
## Changes
Currently, when the CLI runs a list API call (like list jobs), it uses
the `List*All` methods from the SDK, which list all resources in the
collection. This is very slow for large collections: if you need to list
all jobs from a workspace that has 10,000+ jobs, you'll be waiting for
at least 100 RPCs to complete before seeing any output.

The SDK recently added an iterator data structure that allows traversing
the collection without needing to completely list it first. New pages are
fetched lazily if the next requested item belongs to the next page. Using
the `List()` methods that return these iterators, the CLI can proactively
print out some of the response before the complete collection has been
fetched.
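
A minimal sketch of consuming one of these iterators, assuming the `HasNext`/`Next` iterator API from the Go SDK's `listing` package; error handling is trimmed for brevity.

```go
package main

import (
	"context"
	"fmt"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/jobs"
)

func main() {
	ctx := context.Background()
	w := databricks.Must(databricks.NewWorkspaceClient())

	// List returns an iterator; pages are fetched lazily as it advances,
	// so the first results can be printed before the full listing exists.
	it := w.Jobs.List(ctx, jobs.ListJobsRequest{})
	for it.HasNext(ctx) {
		job, err := it.Next(ctx)
		if err != nil {
			panic(err)
		}
		fmt.Println(job.JobId)
	}
}
```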

This involves a pretty major rewrite of the rendering logic in `cmdio`.
The idea there is to define custom rendering logic based on the type of
the provided resource. There are three renderer interfaces:

1. textRenderer: supports printing something in a textual format (i.e.
not JSON, and not templated).
2. jsonRenderer: supports printing something in a pretty-printed JSON
format.
3. templateRenderer: supports printing something using a text template.

There are also three renderer implementations:

1. readerRenderer: supports printing a reader. This only implements the
textRenderer interface.
2. iteratorRenderer: supports printing a `listing.Iterator` from the Go
SDK. This implements jsonRenderer and templateRenderer, buffering 20
resources at a time before writing them to the output.
3. defaultRenderer: supports printing arbitrary resources (the previous
implementation).

Callers use `cmdio.Render()` for rendering individual resources or an
`io.Reader`, and `cmdio.RenderIterator()` for rendering an iterator. This
separate method is needed to safely be able to match on the type of the
iterator, since Go does not allow runtime type matches on generic types
with an existential type parameter.

One other change that needs to happen is to split the templates used for
text representation of list resources into a header template and a row
template. The template is now executed multiple times for List API
calls, but the header should only be printed once. To support this, I
have added `headerTemplate` to `cmdIO`, and I have also changed
`RenderWithTemplate` to include a `headerTemplate` parameter everywhere.

## Tests
- [x] Unit tests for text rendering logic
- [x] Unit test for reflection-based iterator construction.

---------

Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-21 14:16:36 +00:00
Andrew Nester 5309e0fc2a
Improved error message when no .databrickscfg (#1223)
## Changes
Fixes #1060
2024-02-21 14:15:26 +00:00
shreyas-goenka 5ba0aaa5c5
Add support for UC Volumes to the `databricks fs` commands (#1209)
## Changes
```
shreyas.goenka@THW32HFW6T cli % databricks fs -h
Commands to do file system operations on DBFS and UC Volumes.

Usage:
  databricks fs [command]

Available Commands:
  cat         Show file content.
  cp          Copy files and directories.
  ls          Lists files.
  mkdir       Make directories.
  rm          Remove files and directories.
```

This PR adds support for UC Volumes to the fs commands. The fs commands
for UC volumes work the same as they currently do for DBFS. This is
ensured by running the same test matrix across both the DBFS and UC
Volumes versions of the fs commands.

## Tests
Support for UC volumes is tested by running the same tests as we did
originally for DBFS commands. The tests require a `main` catalog to
exist in the workspace, which it does in our test workspace environments,
which have the `TEST_METASTORE_ID` environment variable set.

For the Files API filer, we do the same by running mostly common tests
to ensure the filers for "local", "wsfs", "dbfs" and "files API" are
consistent.

The tests are also made to all run in parallel to reduce the time taken.
To ensure the separation of the tests, each test creates its own UC
schema (for UC volumes tests) or DBFS directories (for DBFS tests).
2024-02-20 16:14:37 +00:00
dependabot[bot] d9f34e6b22
Bump github.com/databricks/databricks-sdk-go from 0.32.0 to 0.33.0 (#1222)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.32.0 to 0.33.0.
Release notes for v0.33.0 (sourced from github.com/databricks/databricks-sdk-go's releases):

Internal Changes:

* Add helper function to get header fields ([#822](https://redirect.github.com/databricks/databricks-sdk-go/pull/822)).
* Add Int64 to header type injection ([#819](https://redirect.github.com/databricks/databricks-sdk-go/pull/819)).

API Changes:

* Changed `Update` method for `w.LakehouseMonitors` workspace-level service with new required argument order.
* Added `w.OnlineTables` workspace-level service.
* Removed `AssetsDir` field for `catalog.UpdateMonitor`.
* Added `catalog.ContinuousUpdateStatus`, `catalog.DeleteOnlineTableRequest`, `catalog.FailedStatus`, `catalog.GetOnlineTableRequest`, `catalog.OnlineTable`, `catalog.OnlineTableSpec`, `catalog.OnlineTableState`, `catalog.OnlineTableStatus`, `catalog.PipelineProgress`, `catalog.ProvisioningStatus`, `catalog.TriggeredUpdateStatus`, and `catalog.ViewData`.
* Added `ContentLength`, `ContentType`, and `LastModified` fields for `files.DownloadResponse`.
* Changed `LastModified` field for `files.GetMetadataResponse` to `files.LastModifiedHttpDate`; added `files.LastModifiedHttpDate`.
* Removed `Config` field for `serving.ExternalModel`; added `Ai21labsConfig`, `AnthropicConfig`, `AwsBedrockConfig`, `CohereConfig`, `DatabricksModelServingConfig`, `OpenaiConfig`, and `PalmConfig` fields.
* Removed `serving.ExternalModelConfig`.
* Added `MaxProvisionedThroughput` and `MinProvisionedThroughput` fields for `serving.ServedEntityInput` and `serving.ServedEntityOutput`.

OpenAPI SHA: cdd76a98a4fca7008572b3a94427566dd286c63b, Date: 2024-02-19
Commits:

* [`eba5c8b`](eba5c8b3ae) Release v0.33.0 ([#823](https://redirect.github.com/databricks/databricks-sdk-go/issues/823))
* [`6846045`](6846045a98) Add Int64 to header type injection ([#819](https://redirect.github.com/databricks/databricks-sdk-go/issues/819))
* [`c6a803a`](c6a803ae18) Add helper function to get header fields ([#822](https://redirect.github.com/databricks/databricks-sdk-go/issues/822))
* See full diff in [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.32.0...v0.33.0)
Most recent ignore conditions applied to this pull request:

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=github.com/databricks/databricks-sdk-go&package-manager=go_modules&previous-version=0.32.0&new-version=0.33.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-19 14:30:06 +00:00
Lennart Kats (databricks) 162b115e19
Add an experimental default-sql template (#1051)
## Changes

This adds a `default-sql` template! 

In this latest revision, I've hidden the new template from the list so
we can merge it, iterate over it, and properly release the template at
the right time.

- [x] WorkspaceFS support for .sql files is in prod
- [x] SQL extension is preconfigured based on extension settings (if
possible)
- [ ] Streaming tables support is either ungated or the template
provides instructions about signup
- _Mitigation for now: this template is hidden from the list of
templates._
- [x] Support non-UC workspaces

## Tests
- [x] Unit tests
- [x] Manual testing
- [x] More manual testing
- [x] Reviewer testing

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
Co-authored-by: PaulCornellDB <paul.cornell@databricks.com>
2024-02-19 12:01:11 +00:00
Lennart Kats (databricks) 1c680121c8
Add an experimental dbt-sql template (#1059)
## Changes

This adds a new dbt-sql template. This work requires the new WorkspaceFS
support for dbt tasks.

In this latest revision, I've hidden the new template from the list so
we can merge it, iterate over it, and properly release the template at
the right time.

Blockers:
- [x] WorkspaceFS support for dbt projects is in prod
- [x] Move dbt files into a subdirectory
- [ ] Wait until the next (>1.7.4) release of the dbt plugin which will
have major improvements!
- _Rather than wait, this template is hidden from the list of
templates._
- [x] SQL extension is preconfigured based on extension settings (if
possible)
- MV / streaming tables:
  - [x] Add to template
- [x] Fix https://github.com/databricks/dbt-databricks/issues/535 (to be
released in 1.7.4)
- [x] Merge https://github.com/databricks/dbt-databricks/pull/338 (to be
released in 1.7.4)
- [ ] Fix "too many 503 errors" issue
(https://github.com/databricks/dbt-databricks/issues/570, internal
tracker: ES-1009215, ES-1014138)
  - [x] Support ANSI mode in the template
- [ ] Streaming tables support is either ungated or the template
provides instructions about signup
- _Mitigation for now: this template is hidden from the list of
templates._
- [x] Support non-workspace-admin deployment
- [x] Make sure `data_security_mode: SINGLE_USER` works on non-UC
workspaces (it's required to be explicitly specified on UC workspaces
with single-node clusters)
- [x] Support non-UC workspaces

## Tests

- [x] Unit tests
- [x] Manual testing
- [x] More manual testing
- [ ] Reviewer manual testing
  - _I'd like to do a small bug bash post-merging._
2024-02-19 09:15:17 +00:00
Pieter Noordhuis 87dd46a3f8
Use dynamic configuration model in bundles (#1098)
## Changes

This is a fundamental change to how we load and process bundle
configuration. We now depend on the configuration being represented as a
`dyn.Value`. This representation is functionally equivalent to Go's
`any` (it is a variant type) and allows us to capture metadata associated
with a value, such as where it was defined (e.g. file, line, and column).
It also allows us to represent Go's zero values properly (e.g. empty
string, integer equal to 0, or boolean false).
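
A hypothetical, self-contained sketch of the idea (the real API lives in `libs/dyn` and differs): a variant value that carries its source location and can represent zero values faithfully.

```go
package dyn

// Location records where a value was defined in the configuration.
type Location struct {
	File   string
	Line   int
	Column int
}

// Value is a variant: it can hold a string, bool, int64, float64,
// map[string]Value, []Value, or nil, alongside its location.
type Value struct {
	v   any
	loc Location
}

func NewValue(v any, loc Location) Value { return Value{v: v, loc: loc} }

func (v Value) Location() Location { return v.loc }

// AsString reports the string value and whether one is present. Note that
// an empty string is still "present", unlike with plain typed structs.
func (v Value) AsString() (string, bool) {
	s, ok := v.v.(string)
	return s, ok
}
```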

Using this representation allows us to let the configuration model
deviate from the typed structure we have been relying on so far
(`config.Root`). We need to deviate from these types when using
variables for fields that are not a string themselves. For example,
using `${var.num_workers}` for an integer `workers` field was impossible
until now (though not implemented in this change).

The loader for a `dyn.Value` includes functionality to capture any and
all type mismatches between the user-defined configuration and the
expected types. These mismatches can be surfaced as validation errors in
future PRs.

Given that many mutators expect the typed struct to be the source of
truth, this change converts between the dynamic representation and the
typed representation on mutator entry and exit. Existing mutators can
continue to modify the typed representation and these modifications are
reflected in the dynamic representation (see `MarkMutatorEntry` and
`MarkMutatorExit` in `bundle/config/root.go`).

Required changes included in this change:
* The existing interpolation package is removed in favor of
`libs/dyn/dynvar`.
* Functionality to merge job clusters, job tasks, and pipeline clusters
are now all broken out into their own mutators.

To be implemented later:
* Allow variable references for non-string types.
* Surface diagnostics about the configuration provided by the user in
the validation output.
* Some mutators use a resource's configuration file path to resolve
related relative paths. These depend on `bundle/config/paths.Path` being
set and populated through `ConfigureConfigFilePath`. Instead, they
should interact with the dynamically typed configuration directly. Doing
this also unlocks being able to differentiate different base paths used
within a job (e.g. a task override with a relative path defined in a
directory other than the base job).

## Tests

* Existing unit tests pass (some have been modified to accommodate)
* Integration tests pass
2024-02-16 19:41:58 +00:00
Andrew Nester e474948a4b
Generate correct YAML if custom_tags or spark_conf is used for pipeline or job cluster configuration (#1210)
These fields (keys and values) need to be double-quoted for the YAML
loader to read, parse, and unmarshal them into a Go struct correctly,
because these fields are of type `map[string]string`.
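
A minimal sketch of one way to force double-quoted scalars, using `gopkg.in/yaml.v3` node styles (the CLI's actual implementation may differ).

```go
package main

import (
	"fmt"

	"gopkg.in/yaml.v3"
)

// quoted returns a scalar node that always renders double-quoted.
func quoted(s string) *yaml.Node {
	return &yaml.Node{
		Kind:  yaml.ScalarNode,
		Style: yaml.DoubleQuotedStyle,
		Value: s,
	}
}

func main() {
	// Emit a map entry with both key and value double-quoted so the YAML
	// loader unmarshals it into map[string]string unambiguously.
	root := &yaml.Node{Kind: yaml.MappingNode}
	root.Content = append(root.Content, quoted("spark.executor.memory"), quoted("4g"))

	out, err := yaml.Marshal(root)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out)) // "spark.executor.memory": "4g"
}
```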

## Tests
Added regression unit and E2E tests
2024-02-15 15:03:19 +00:00
dependabot[bot] 299e9b56a6
Bump github.com/databricks/databricks-sdk-go from 0.30.1 to 0.32.0 (#1199)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.30.1 to 0.32.0.

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-02-15 14:52:17 +00:00
Andrew Nester 80670eceed
Added `bundle deployment bind` and `unbind` command (#1131)
## Changes
Added `bundle deployment bind` and `unbind` command.

This command allows binding bundle-defined resources to existing
resources in the Databricks workspace so they become DABs-managed.

## Tests
Manually + added E2E test
2024-02-14 18:04:45 +00:00
Miles Yucht e8b0698e19
Regenerate the CLI using the same OpenAPI spec as the SDK (#1205)
## Changes
The OpenAPI spec used to generate the CLI doesn't match the version used
for the SDK version that the CLI currently depends on. This PR
regenerates the CLI based on the same version of the OpenAPI spec used
by the SDK on v0.30.1.

## Tests
2024-02-13 14:33:59 +00:00
Andrew Nester bc30c9ed4a
Added `--restart` flag for `bundle run` command (#1191)
## Changes
Added `--restart` flag for `bundle run` command

When running with this flag, `bundle run` will cancel all existing runs
before starting a new one

## Tests
Manually
2024-02-09 14:33:14 +00:00
shreyas-goenka d638262665
Add spinner when downloading templates for bundle init (#1188)
## Changes
Templates can take a long time to download. This PR adds a spinner to
give feedback to users.

## Tests
Manually


https://github.com/databricks/cli/assets/88374338/b453982c-3233-40f4-8d6f-f31606ff0195
2024-02-08 12:52:53 +00:00
Pieter Noordhuis a835a3e564
Ignore environment variables for `auth profiles` (#1189)
## Changes

If environment variables related to unified authentication are set and a
user runs `auth profiles`, the environment variables will interfere with
the output. This change only takes profile data into account for the
output.

## Tests

Added a unit test.
2024-02-08 12:25:51 +00:00
Pieter Noordhuis b1b5ad8acd
Log time it takes for profile to load (#1186)
## Changes

Aids debugging why `auth profiles` may take longer than expected.

## Tests

Confirmed manually that timing information shows up in the log output.
2024-02-08 11:10:52 +00:00
Pieter Noordhuis 8e58e04e8f
Move folders package into libs (#1184)
## Changes

This is the last top-level package that doesn't need to be top-level.
2024-02-07 16:33:18 +00:00
Andrew Nester 6edab93233
Added warning when trying to deploy bundle with `--fail-if-running` and running resources (#1163)
## Changes
Deploying a bundle while bundle resources are running can be disruptive
for jobs and pipelines in progress.

With this change, during the deployment phase (before uploading any
resources), if `--fail-if-running` is specified, DABs will check whether
any resources are running and, if so, fail the deployment.
## Tests
Manual + added tests
2024-02-07 11:17:17 +00:00
Andrew Nester 2bbb644749
Group bundle run flags by job and pipeline types (#1174)
## Changes
Group bundle run flags by job and pipeline types

## Tests
```
Run a resource (e.g. a job or a pipeline)

Usage:
  databricks bundle run [flags] KEY

Job Flags:
      --dbt-commands strings                 A list of commands to execute for jobs with DBT tasks.
      --jar-params strings                   A list of parameters for jobs with Spark JAR tasks.
      --notebook-params stringToString       A map from keys to values for jobs with notebook tasks. (default [])
      --params stringToString                comma separated k=v pairs for job parameters (default [])
      --pipeline-params stringToString       A map from keys to values for jobs with pipeline tasks. (default [])
      --python-named-params stringToString   A map from keys to values for jobs with Python wheel tasks. (default [])
      --python-params strings                A list of parameters for jobs with Python tasks.
      --spark-submit-params strings          A list of parameters for jobs with Spark submit tasks.
      --sql-params stringToString            A map from keys to values for jobs with SQL tasks. (default [])

Pipeline Flags:
      --full-refresh strings   List of tables to reset and recompute.
      --full-refresh-all       Perform a full graph reset and recompute.
      --refresh strings        List of tables to update.
      --refresh-all            Perform a full graph update.

Flags:
  -h, --help      help for run
      --no-wait   Don't wait for the run to complete.

Global Flags:
      --debug            enable debug logging
  -o, --output type      output type: text or json (default text)
  -p, --profile string   ~/.databrickscfg profile
  -t, --target string    bundle target to use (if applicable)
      --var strings      set values for variables defined in bundle config. Example: --var="foo=bar"
   ```
2024-02-06 14:51:02 +00:00
Andrew Nester b28432afed
Add `--key` flag for generate commands to specify resource key (#1165)
## Changes
Added the `--key` flag for generate commands to specify the resource key.

Also, resource config files are no longer prefixed.

## Tests
Integration tests passed

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-01-31 10:23:35 +00:00