Commit Graph

525 Commits

Author SHA1 Message Date
shreyas-goenka b323703c1b
Add validation for single node clusters (#1909)
## Changes
This PR adds a warning that validates the configuration of single node
clusters for interactive, job, job-task, and pipeline clusters.

Note: We skip the validation if a cluster policy is configured because
the policy is likely to configure `spark_conf` / `custom_tags` itself.

Note: Terraform originally only had this validation for interactive, job, and
job-task clusters. Extending the validation to pipeline clusters in this PR
is new.

This PR follows the same logic as we used to have in Terraform. The
validation was removed from Terraform because we had no way to demote
the error to a warning:
https://github.com/databricks/terraform-provider-databricks/pull/4222

### Background
Single-node clusters require `spark_conf` and `custom_tags` to be
correctly set in the cluster definition for them to function optimally.
The cluster will be created even if incorrectly configured, but its
performance will not be great.

For example, if neither `spark_conf` nor `custom_tags` is set and
`num_workers` is 0, then only the driver process will be launched on the
cluster compute instance, leading to sub-optimal utilization of
available compute resources and no parallelization across worker
processes when processing a Spark query.

### Issue

This PR addresses some issues reported in
https://github.com/databricks/cli/issues/1546

## Tests
Unit tests and manually.

Example output of the warning:
```
➜  bundle-playground git:(master) ✗ cli bundle validate
Warning: Single node cluster is not correctly configured
  at resources.pipelines.bar.clusters[0]
  in databricks.yml:29:11

num_workers should be 0 only for single-node clusters. To create a
valid single node cluster please ensure that the following properties
are correctly set in the cluster specification:

  spark_conf:
    spark.databricks.cluster.profile: singleNode
    spark.master: local[*]

  custom_tags:
    ResourceClass: SingleNode
  

Name: foobar
Target: default
Workspace:
  User: shreyas.goenka@databricks.com
  Path: /Workspace/Users/shreyas.goenka@databricks.com/.bundle/foobar/default

Found 1 warning
```
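For reference, a job cluster that satisfies this validation could look like the following sketch, based on the properties listed in the warning (the Spark version and node type are placeholders; the policy-based variant is not shown):

```yaml
resources:
  jobs:
    foo:
      job_clusters:
        - job_cluster_key: single_node
          new_cluster:
            num_workers: 0
            spark_version: 15.4.x-scala2.12  # placeholder
            node_type_id: i3.xlarge          # placeholder
            spark_conf:
              spark.databricks.cluster.profile: singleNode
              spark.master: local[*]
            custom_tags:
              ResourceClass: SingleNode
```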
2024-11-22 15:48:09 +00:00
Ilya Kuznetsov 490dd058aa
Extended message for warning when source-linked mode is used outside of the workspace (#1929)
## Changes

Added the path and locations to the warning that is displayed when
source-linked mode is used outside of the workspace.
2024-11-22 14:44:33 +00:00
Pieter Noordhuis abfd1713e0
Skip sync warning if no sync paths are defined (#1926)
## Changes

Users can configure the bundle to not synchronize any files with:
```yaml
sync:
  paths: []
```

If it is explicitly configured as an empty list, the validate command
must not warn about not having any files to synchronize. The warning
exists to alert users who are unintentionally not synchronizing any
files (they might have a `.gitignore` pattern that matches everything).

Closes #1663.

## Tests

* New unit test.
2024-11-21 15:03:13 +00:00
Pieter Noordhuis a3cea07c9e
Support lookup by name of notification destinations (#1922)
## Changes

Add support for notification destinations in variable lookups.

More information:
https://docs.databricks.com/en/admin/workspace-settings/notification-destinations.html

Depends on #1921.
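As a rough sketch, such a lookup could be declared in bundle configuration as follows (assuming the lookup field is named `notification_destination`; the destination name is illustrative):

```yaml
variables:
  ops_destination:
    description: Resolved by display name at deploy time
    lookup:
      notification_destination: "Ops Team"
```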

## Tests

* New unit test
* Manually confirmed that the lookup works
2024-11-21 15:52:14 +01:00
shreyas-goenka c2e2abcc35
Extend "notebook not found" error to warn about missing extension (#1920)
## Changes
The full workspace path for a notebook does not contain the notebook's
extension. If a user converts that file path to a relative path (like
`/Workspace/bundle_root/bar/nb` -> `./bar/nb`), they can be confused as
to why the new file path does not work.

The changes in this PR nudge them to add the appropriate file extension
(e.g., `./bar/nb.py` or `./bar/nb.ipynb`).

One common way users can end up in this scenario is by using the "view
job as YAML" functionality in the Databricks UI.

## Tests
Unit test and manually.

```
(.venv) ➜  bundle-playground git:(master) ✗ cli bundle validate 
Error: notebook ./foo not found. Local notebook references are expected
to contain one of the following file extensions: [.py, .r, .scala, .sql, .ipynb]
```
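A corrected reference spells out the extension of the local notebook, e.g. (a sketch; the task key and path are illustrative):

```yaml
resources:
  jobs:
    example:
      tasks:
        - task_key: nb
          notebook_task:
            notebook_path: ./foo.py
```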
2024-11-21 16:21:21 +05:30
Pieter Noordhuis 14fe03dcb9
Breakout variable lookup into separate files and tests (#1921)
## Changes

While looking into adding variable lookups for notification destinations
([API][API]), I found the codegen approach for different classes of
variable lookups a bit complex. The template had a custom field override
(for service principals), the package had an override for the cluster
lookup, and it didn't produce tests.

The notification destinations API uses a default page size of 20 for
listing. I want to use a larger page size to limit the number of API
calls, so that would imply another customization on the template or a
manual override.

This code being rather mechanical, I used copilot to produce all
instances of the resolvers and their tests (after writing one of them
manually).

[api]: https://docs.databricks.com/api/workspace/notificationdestinations

## Tests

* Unit tests pass
* Manual confirmation that lookups of warehouses still work
2024-11-21 11:28:50 +01:00
Ilya Kuznetsov 756e55fabc
Source-linked deployments for bundles in the workspace (#1884)
## Changes

This change adds a preset for source-linked deployments. It is enabled
by default for targets in `development` mode **if** the Databricks CLI
is running from the `/Workspace` directory on DBR. It does not have an
effect when running the CLI anywhere else.

Key highlights:
1. Files in this mode won't be uploaded to workspace
2. Created resources will use references to source files instead of
their workspace copies
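As a sketch, opting into this behavior explicitly could look like the following, assuming the preset is exposed as `source_linked_deployment` under `presets` (the field name is an assumption, not confirmed by this description):

```yaml
targets:
  dev:
    mode: development
    presets:
      source_linked_deployment: true  # assumed preset name
```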

## Tests
1. Apply preset unit test covering conditional logic
2. High-level process target mode unit test for testing integration
between mutators

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-11-20 13:22:27 +01:00
Andrew Nester 7f3fb10c4a
Do not prepend paths starting with ~ or variable reference (#1905)
## Changes
Fixes #1904 

## Tests
Added regression test
2024-11-15 15:03:59 +00:00
Pieter Noordhuis 1db384018c
Make `TableName` field part of quality monitor schema (#1903)
## Changes

This field was special-cased in #1307 because it's not part of the JSON
payload in the SDK struct.

This approach, while pragmatic, meant it didn't show up in the JSON
schema. While debugging an issue with quality monitors in #1900, I
couldn't figure out why I was getting schema errors on this field, or
how it was passed through to the TF representation. This commit removes
the special case and makes it behave like everything else.
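For illustration, a quality monitor definition now declares `table_name` alongside its other fields, e.g. (a sketch; the values and the fields other than `table_name` are illustrative):

```yaml
resources:
  quality_monitors:
    my_monitor:
      table_name: main.sandbox.my_table
      assets_dir: /Workspace/Users/someone@example.com/monitoring
      output_schema_name: main.monitoring
      snapshot: {}
```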

## Tests

* Unit tests pass.
* Confirmed that the updated schema failed validation before this
change.
2024-11-14 17:39:38 +00:00
Pieter Noordhuis 1508d65c4c
Extract functionality to detect if the CLI is running on DBR (#1889)
## Changes

Whether or not the CLI is running on DBR can be detected once and stored
in the command's context.

By storing it in the context, it can easily be mocked for testing.

This builds on the simpler approach and conversation in #1744. It
unblocks testing of the DBR-specific paths while not compromising on the
checks we can perform to test if the CLI is running on DBR.

## Tests

* Unit tests for the new `dbr` package
* New unit test for the `ConfigureWSFS` mutator
2024-11-14 16:10:45 +00:00
Pieter Noordhuis 21d27885dc
Upgrade TF provider to 1.58.0 (#1900)
## Changes

Notable changes:
* Adds support for `restart_window` for pipelines.
* Fix drift for pipelines where `catalog` contains uppercase characters.
* Better error message if single-node job clusters are incorrectly configured.

See:
* https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.58.0
* https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.57.0
* https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.56.0
* https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.55.0

## Tests

Integration tests pass.
2024-11-14 14:00:15 +01:00
dependabot[bot] 25838ee0af
Bump github.com/databricks/databricks-sdk-go from 0.49.0 to 0.51.0 (#1878)
Known issues:

- [ ] _(non-blocking with a command override)_ `apps.Update` requires 2
`name` params (one from path, one from request body)
- [ ] _(non-blocking)_ `lakeview.Create` does not require positional
argument `display_name` anymore because it's not marked as required in
request body

Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.49.0 to 0.51.0.

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-11-13 13:40:53 +00:00
Pieter Noordhuis 26afab2ccb
Fix relative path resolution for dashboards on Windows (#1881)
## Changes

The file presence check for dashboard files was missing a
`filepath.ToSlash`.

This means it didn't work on Windows unless the dashboard was located at
a path without slashes (i.e. the bundle root).

Closes #1875.

## Tests

* Added a unit test to cover this case (failed before the fix).
* Manually ran a dashboard deployment on Windows.
2024-11-05 09:53:53 +00:00
Andrew Nester ac71d2e5ce
Fixed adding /Workspace prefix for resource paths (#1866)
## Changes
`/Workspace` prefix needs to be added to `resource_path` as well.

Fixes the issue mentioned here:
https://github.com/databricks/cli/pull/1822#issuecomment-2447697498

Fixes #1867 

## Tests
Added regression test
2024-10-30 17:34:11 +00:00
Andrew Nester f018daf413
Use SetPermissions instead of UpdatePermissions when setting folder permissions based on top-level ones (#1822)
## Changes
Changed to use SetPermissions() to configure the permissions, which
removes any other permissions on the deployment folders.

## Tests
Added unit test
2024-10-29 12:06:38 +00:00
Pieter Noordhuis 1896b09350
Add bundle generate variant for dashboards (#1847)
## Changes

This change adds the `databricks bundle generate dashboard` command.

The command requires one of three flags:
* `--existing-id` to generate configuration for an existing dashboard by
its ID.
* `--existing-path` to generate configuration for an existing dashboard
by its path in the workspace file system.
* `--resource` to generate the `.lvdash.json` dashboard file for a
dashboard that's already defined in the bundle. This option does not
impact the YAML configuration.

A typical workflow could look like this:
1. Use the command with `--existing-id` or `--existing-path` for a
starting point
2. Run `bundle deploy` to deploy a copy of the dashboard
3. Run `bundle open` to open this copy in your browser
4. Navigate to the draft mode and make modifications
5. Run `bundle generate dashboard` with `--resource` to update the local
`.lvdash.json` file with the remote modifications

## Tests

* Unit tests.
* Manual walkthrough as documented in the [Dashboard for NYC Taxi Trip
Analysis
example](https://github.com/databricks/bundle-examples/tree/main/knowledge_base/dashboard_nyc_taxi).
2024-10-29 11:51:59 +00:00
Pieter Noordhuis 11f75fd320
Add support for AI/BI dashboards (#1743)
## Changes

This change adds support for modeling [AI/BI dashboards][docs] in DABs.


[Example bundle configuration][example] is located in the
`bundle-examples` repository.

[docs]: https://docs.databricks.com/en/dashboards/index.html#dashboards
[example]:
https://github.com/databricks/bundle-examples/tree/main/knowledge_base/dashboard_nyc_taxi
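A minimal sketch of what a dashboard resource might look like in bundle configuration (field names and values here are illustrative; the linked example bundle is authoritative):

```yaml
resources:
  dashboards:
    nyc_taxi_trip_analysis:
      display_name: "NYC Taxi Trip Analysis"
      file_path: ./src/nyc_taxi_trip_analysis.lvdash.json
      warehouse_id: ${var.warehouse_id}
```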

## Tests

* Added unit tests for self-contained parts
* Integration test for e2e dashboard deployment and remote change
modification
2024-10-29 09:11:08 +00:00
Pieter Noordhuis ed84a33b0a
Reuse resource resolution code for the run command (#1858)
## Changes

As of #1846 we have a generalized package for doing resource lookups and
completion.

This change updates the run command to use this instead of more specific
code under `bundle/run`.

## Tests

* Unit tests pass
* Manually confirmed that completion and prompting works
2024-10-24 13:24:30 +00:00
Andrew Nester eaea308254
Added validator for folder permissions (#1824)
## Changes
This validator compares the permissions defined in the top-level bundle
config with the permissions set in the workspace for the folders the bundle
is deployed to. It raises a warning if permissions applied in the workspace
are not defined in the bundle.

This validator is executed only during `bundle validate` command.

## Tests

```
Warning: untracked permissions apply to target workspace path

The following permissions apply to the workspace folder at "/Workspace/Users/andrew.nester@databricks.com/.bundle/clusters/default" but are not configured in the bundle:
- level: CAN_MANAGE, user_name: andrew.nester@databricks.com
```
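For reference, the corresponding top-level permission entry that would cover the warning above can be declared like this (a sketch reusing the principal shown in the warning):

```yaml
permissions:
  - level: CAN_MANAGE
    user_name: andrew.nester@databricks.com
```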

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-10-24 12:36:17 +00:00
Pieter Noordhuis 89ee7d8a99
Add command to open a resource in the browser (#1846)
## Changes

This builds on the functionality added in #1731 that produces a URL for
every resource.

Adds `bundle/resources` package to deal with resource lookups and
command completion. The new functionality is similar to the lookup and
command completion functionality located in `bundle/run`. It differs in
that it doesn't gracefully deal with ambiguous references to resources,
now that we explicitly validate this doesn't occur in the bundle
configuration. It still allows resources to be looked up with their
fully qualified key, `<plural type>.<key>`.

## Tests

* Added unit tests for resource lookup and completion
* Manually confirmed that `bundle open` prompts, accepts a key argument,
and opens a browser
2024-10-24 12:20:33 +00:00
shreyas-goenka 3bab21e72e
Fix race condition when restarting continuous jobs (#1849)
## Changes
We don't need to cancel existing runs when the job is continuous and
unpaused. The `/jobs/run-now` command will cancel the existing run and
trigger a new one automatically.

Cancelling the job manually can cause a race condition where both the
manual trigger from the CLI and the continuous trigger from the job
configuration happen at the same time. This PR prevents that from
happening.
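For context, the affected configuration is a job with an unpaused continuous trigger, e.g. (a sketch; only the relevant fields are shown and the task is illustrative):

```yaml
resources:
  jobs:
    my_continuous_job:
      continuous:
        pause_status: UNPAUSED
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./main.py
```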

## Tests
Unit tests and manually
2024-10-22 14:59:17 +00:00
Andrew Nester 68d69d6e0b
Upgrade TF provider to 1.54.0 (#1852)
## Changes
Upgrade TF provider to 1.54.0
2024-10-22 10:43:43 +00:00
dependabot[bot] f8bb3a8d72
Bump github.com/databricks/databricks-sdk-go from 0.48.0 to 0.49.0 (#1843)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.48.0 to 0.49.0.
Release notes (sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases)):

v0.49.0

API Changes:
* Added `w.DisableLegacyDbfs` workspace-level service.
* Added `UnityCatalogProvisioningState` field for `catalog.OnlineTable`.
* Added `IsTruncated` field for `dashboards.Result`.
* Added `EffectiveBudgetPolicyId` field for `jobs.BaseJob`.
* Added `BudgetPolicyId` field for `jobs.CreateJob`.
* Added `EffectiveBudgetPolicyId` field for `jobs.Job`.
* Added `BudgetPolicyId` field for `jobs.JobSettings`.
* Added `BudgetPolicyId` field for `jobs.SubmitRun`.
* Added `Report` field for `pipelines.IngestionConfig`.
* Added `SequenceBy` field for `pipelines.TableSpecificConfig`.
* Added `NotifyOnOk` field for `sql.Alert`.
* Added `NotifyOnOk` field for `sql.CreateAlertRequestAlert`.
* Added `NotifyOnOk` field for `sql.ListAlertsResponseAlert`.
* Added `NotifyOnOk` field for `sql.UpdateAlertRequestAlert`.

OpenAPI SHA: cf9c61453990df0f9453670f2fe68e1b128647a2, Date: 2024-10-14

Commits: [b47ecac](b47ecac9ea) [Release] Release v0.49.0 ([#1062](https://redirect.github.com/databricks/databricks-sdk-go/issues/1062)); see the [full diff](https://github.com/databricks/databricks-sdk-go/compare/v0.48.0...v0.49.0).


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-10-22 09:37:01 +00:00
Andrew Nester ffdbec87cc
Added support for pip options in environment dependencies (#1842)
## Changes
Added support for specifying pip options such as `--extra-index-url` in
environment dependencies.

```
environments:
  - environment_key: Default
    spec:
      client: "1"
      dependencies:
        - --extra-index-url https://foo@bar.com/packages/smth somepackage
        - json==1.0.0
```

## Tests
Added regression test

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-10-21 11:45:39 +00:00
Pieter Noordhuis 3b1fb6d2df
Remove Terraform conversion function that's no longer used (#1840)
## Changes

In #1218, the `BundleToTerraform` function was replaced by a version
based on the dynamic configuration tree. Since then, it has only been
used in tests to confirm that the output of the old function was equal
to the output of the new function. We no longer need this and can
exclusively rely on the version based on the dynamic configuration tree.

## Tests

Tests pass.
2024-10-18 15:38:10 +00:00
Andrew Nester 0c9c90208f
Added a warning when incorrect permissions used for `/Workspace/Shared` bundle root (#1821)
## Changes
Added a warning when incorrect permissions are used for a `/Workspace/Shared`
bundle root.

## Tests
Added unit test

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-10-18 15:37:16 +00:00
Lennart Kats (databricks) c5043c3d9d
Add `bundle summary` to display URLs for deployed resources (#1731)
## Changes

Adds a textual output to the `databricks bundle summary` command, which
includes URLs of deployed resources.

Example usage:

```
$ databricks bundle summary
Name: my_pipeline
Target: dev
Workspace:
  Host: https://domain.databricks.com
  User: user@databricks.com
  Path: /Users/user@databricks.com/.bundle/my_pipeline/dev
Resources:
  Jobs:
    my_project_job:
      Name: [dev lennart] my_project_job
      URL:  https://domain.databricks.com/jobs/206899209187287?o=6051921418418893
  Pipelines:
    my_project_pipeline:
      Name: [dev lennart] my_project_pipeline
      URL:  https://domain.databricks.com/pipelines/3f849fd5-ba7d-47fa-a34c-c6bf034b4f58?o=6051921418418893
```

Notes:
* The top headers of the output are the same as those from the existing
`bundle validate` command
* URLs are colored light blue in the output
* For resources that haven't been deployed yet, we show `(not deployed)`
in place of the URL

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-10-18 06:45:47 +00:00
Pieter Noordhuis e4d039a1aa
Handle normalization of `dyn.KindTime` into an any type (#1836)
## Changes

The issue reported in #1828 illustrates how using a YAML timestamp-like
value (a date in this case) causes an issue during conversion to and
from the typed configuration tree.

We use the `AsAny()` function on the `dyn.Value` when normalizing for
the `any` type. We only use the `any` type for variable values, because
they can assume every type. The `AsAny()` function returns a `time.Time`
for the time value during conversion **to** the typed configuration
tree. Upon conversion **from** the typed configuration tree back into
the dynamic configuration tree, we cannot distinguish a `time.Time`
struct from any other struct.

To address this, we use the underlying string value of the time value
when we normalize for the `any` type.

Fixes #1828.
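A minimal configuration that reproduces the issue is a variable whose default is a YAML date, e.g. (a sketch; the variable name is illustrative):

```yaml
variables:
  snapshot_date:
    description: Parsed by YAML as a timestamp-like value
    default: 2024-10-17
```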

## Tests

Existing unit tests pass
2024-10-17 10:00:40 +00:00
Andrew Nester f0e2981596
Added JSON input validation for CLI commands (#1771)
## Changes
Added JSON input validation for CLI commands. Now, when invalid JSON is
passed as a payload to CLI commands, the CLI performs input normalisation
and detects mismatches such as incorrect types and unknown fields.

This diagnostic information is printed in standard error output and does
not block command execution, so the change is backward compatible.

Fixes #1769 #1764 #1625 #1560


## Tests
Added unit tests

```
andrew.nester@HFW9Y94129 ~ % databricks jobs create --json '{"seeti}'
Error: error decoding JSON at (inline):1:2: unexpected EOF


andrew.nester@HFW9Y94129 ~ % databricks jobs create --json '{"seeti": true}'
Warning: unknown field: seeti
  in (inline):1:9

Error: Job settings must be specified.
```

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-10-11 14:39:53 +00:00
Andrew Nester 845d23ac21
Fixed typo in converting cluster permissions (#1826)
## Changes
Fixed typo in converting cluster permissions
2024-10-10 14:10:16 +00:00
Pieter Noordhuis b9085de533
Remove unused `IS_OWNER` constant (#1823)
## Changes

Leftover from #1386.

## Tests

All tests pass (indicating it really wasn't used).
2024-10-10 13:43:21 +00:00
Pieter Noordhuis 3270afaff4
Move utility functions dealing with IAM to libs/iamutil (#1820)
## Changes

The two functions `GetShortUserName` and `IsServicePrincipal` are
unrelated to auth or the purpose of the auth package. This change moves
them into their own package and updates `IsServicePrincipal` to take an
`*iam.User` argument instead of a string username.

## Tests

Tests pass.
2024-10-10 13:02:25 +00:00
Lennart Kats (databricks) e885794722
Show actionable errors for collaborative deployment scenarios (#1386)
## Changes

This adds diagnostics for collaborative (production) deployment
scenarios, including:

- Bob deploys a bundle that is normally deployed by Alice, but this
fails because Bob can't write to `/Users/Alice/.bundle`.
- Charlie deploys a bundle that is normally deployed by Alice, but this
fails because he can't create a new pipeline where Alice would be the
owner.
- Alice deploys a bundle where she didn't list herself as one of the
CAN_MANAGE users in permissions. That can work, but is probably a
mistake.

## Tests

Unit tests, manual testing.
2024-10-10 11:18:23 +00:00
Pieter Noordhuis ae568743c5
Upgrade TF provider to 1.53.0 (#1815)
## Changes

See
https://github.com/databricks/terraform-provider-databricks/releases/tag/v1.53.0

## Tests

Integration tests pass.
2024-10-08 11:41:32 +00:00
shreyas-goenka 88318d384a
Remove deprecated or readonly fields from the bundle schema (#1809)
## Changes
We do not need the user to specify these fields in their bundle
configuration, so we remove them from the JSON schema.

## Tests
Manually and end to end tests. The JSON schema has also been regenerated
after these changes.
2024-10-07 15:13:42 +00:00
dependabot[bot] 7a2141fc5b
Bump github.com/databricks/databricks-sdk-go from 0.47.0 to 0.48.0 (#1810)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.47.0 to 0.48.0.
Release notes (sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases)):

v0.48.0

Internal Changes:
* Update SDK to latest OpenAPI spec ([#1057](https://redirect.github.com/databricks/databricks-sdk-go/pull/1057)).

Note: This release contains breaking changes, please see the API changes below for more details.

API Changes:
* Added `DefaultSourceCodePath` and `Resources` fields for `apps.App`.
* Added `Resources` field for `apps.CreateAppRequest`.
* Added `Resources` field for `apps.UpdateAppRequest`.
* Added `Schema` field for `pipelines.CreatePipeline`.
* Added `Schema` field for `pipelines.EditPipeline`.
* Added `Schema` field for `pipelines.PipelineSpec`.
* [Breaking] Changed `Create` method for `w.GitCredentials` workspace-level service. New request type is `workspace.CreateCredentialsRequest`.
* [Breaking] Changed `Create` method for `w.GitCredentials` workspace-level service to type `Create` method for `w.GitCredentials` workspace-level service.
* [Breaking] Changed `Delete` method for `w.GitCredentials` workspace-level service. New request type is `workspace.DeleteCredentialsRequest`.
* [Breaking] Changed `Delete` method for `w.GitCredentials` workspace-level service to return `any`.
* Changed `Delete` method for `w.GitCredentials` workspace-level service to type `Delete` method for `w.GitCredentials` workspace-level service.
* [Breaking] Changed `Get` method for `w.GitCredentials` workspace-level service. New request type is `workspace.GetCredentialsRequest`.
* Changed `Get` method for `w.GitCredentials` workspace-level service to type `Get` method for `w.GitCredentials` workspace-level service.
* [Breaking] Changed `Get` method for `w.GitCredentials` workspace-level service to return `workspace.GetCredentialsResponse`.
* [Breaking] Changed `List` method for `w.GitCredentials` workspace-level service to return `workspace.ListCredentialsResponse`.
* Changed `List` method for `w.GitCredentials` workspace-level service to type `List` method for `w.GitCredentials` workspace-level service.
* Changed `Update` method for `w.GitCredentials` workspace-level service to type `Update` method for `w.GitCredentials` workspace-level service.
* [Breaking] Changed `Update` method for `w.GitCredentials` workspace-level service to return `any`.
* [Breaking] Changed `Update` method for `w.GitCredentials` workspace-level service. New request type is `workspace.UpdateCredentialsRequest`.
* Changed `Create` method for `w.Repos` workspace-level service to type `Create` method for `w.Repos` workspace-level service.
* [Breaking] Changed `Create` method for `w.Repos` workspace-level service. New request type is `workspace.CreateRepoRequest`.
* [Breaking] Changed `Create` method for `w.Repos` workspace-level service to return `workspace.CreateRepoResponse`.
* [Breaking] Changed `Delete` method for `w.Repos` workspace-level service to return `any`.
* Changed `Delete` method for `w.Repos` workspace-level service to type `Delete` method for `w.Repos` workspace-level service.
* [Breaking] Changed `Get` method for `w.Repos` workspace-level service to return `workspace.GetRepoResponse`.
* Changed `Get` method for `w.Repos` workspace-level service to type `Get` method for `w.Repos` workspace-level service.
* Changed `Update` method for `w.Repos` workspace-level service to return `any`.
* [Breaking] Changed `Update` method for `w.Repos` workspace-level service. New request type is `workspace.UpdateRepoRequest`.
* Changed `Update` method for `w.Repos` workspace-level service to type `Update` method for `w.Repos` workspace-level service.
* [Breaking] Changed `CredentialId` and `GitProvider` fields for `workspace.CreateCredentialsResponse` to be required.
* Changed `CredentialId` field for `workspace.CredentialInfo` to be required.
* Changed `CredentialId` field for `workspace.GetCredentialsResponse` to be required.
* Changed `Patterns` field for `workspace.SparseCheckout` to type `workspace.List`.
* Changed `Patterns` field for `workspace.SparseCheckoutUpdate` to type `workspace.List`.
* [Breaking] Changed `GitProvider` field for `workspace.UpdateCredentialsRequest` to be required.

OpenAPI SHA: 0c86ea6dbd9a730c24ff0d4e509603e476955ac5, Date: 2024-10-02

Commits: [80c9636](80c963696b) [Release] Release v0.48.0 ([#1058](https://redirect.github.com/databricks/databricks-sdk-go/issues/1058)); [6cecc22](6cecc224cb) [Internal] Update SDK to latest OpenAPI spec ([#1057](https://redirect.github.com/databricks/databricks-sdk-go/issues/1057)); see the [full diff](https://github.com/databricks/databricks-sdk-go/compare/v0.47.0...v0.48.0).


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2024-10-07 13:21:05 +00:00
shreyas-goenka bca9c2eda4
Add validation for files with a `.(resource-name).yml` extension (#1780)
## Changes
We want to encourage a pattern of specifying only a single resource in a
YAML file when the `.(resource-type).yml` extension is used (for
example, `.job.yml`). This convention could allow us to bijectively map
a resource YAML file to its corresponding resource in the Databricks
workspace.

This PR:
1. Emits a recommendation diagnostic when we detect this convention is
being violated. We can promote this to a warning when we want to
encourage this pattern more strongly.
2. Visualises the recommendation diagnostics in the `bundle validate`
command.

**NOTE:** While this PR also shows the recommendation for `.yaml` files,
we do not encourage users to use this extension. We only support it here
since it's part of the YAML standard and some existing users might
already be using `.yaml`.

## Tests
Unit tests and manually. Here's what an example output looks like:
```
Recommendation: define a single job in a file with the .job.yml extension.
  at resources.jobs.bar
     resources.jobs.foo
  in foo.job.yml:13:7
     foo.job.yml:5:7

The following resources are defined or configured in this file:
  - bar (job)
  - foo (job)
```
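Following the recommendation, a file named `foo.job.yml` would then define only that single job, e.g. (a sketch):

```yaml
resources:
  jobs:
    foo:
      name: foo
```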

---------

Co-authored-by: Lennart Kats (databricks) <lennart.kats@databricks.com>
2024-10-07 09:16:20 +00:00
Andrew Nester a8cff48c0b
Always prepend bundle remote paths with /Workspace (#1724)
## Changes
Due to platform changes, all library, notebook, and other file paths used in
Databricks must start with either the /Workspace or /Volumes prefix.

This PR makes sure that all bundle paths are correctly prefixed.

Note: this is a breaking change if a user previously configured and
used the `/Workspace/Workspace` folder in their workspace file system, or
has the `/Workspace/${workspace.root_path}...` pattern configured
anywhere in their bundle config.

Fixes: #1751

AI:
- [x] Scan DABs config and error out on
`/Workspace/${workspace.root_path}...` pattern usage

## Tests
Added unit tests

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-10-02 15:34:00 +00:00
Pieter Noordhuis 80d55f4540
Add resource path field to bundle workspace configuration (#1800)
## Changes

Default workspace path for resources with a presence in the workspace
tree.

Note: this path is **not** created automatically (yet). We need this
only for dashboards (so far), so we can take care of creation if one or
more dashboards are part of a deployment. This saves an API call for
deployments where this is not necessary.

## Tests

Expanded existing tests.
2024-10-02 13:55:40 +00:00
Andrew Nester 044a00c7f9
Add an error if state files grow bigger than the export limit (#1795)
## Changes
Currently, the API limit on exporting files from workspaces is 10
MB, while the upload limit is 500 MB. We want to prevent users from running
into a deadlock where they can no longer pull the state file, so we
prevent uploading large state files (over 10 MB) to the Databricks
workspace.
2024-10-02 13:53:24 +00:00
Andrew Nester 84fc1ed131
Upgrade to Go SDK 0.47.0 (#1799)
## Changes
Upgrade to Go SDK 0.47.0

## Tests
<!-- How is this tested? -->
2024-10-01 10:44:47 +00:00
Lennart Kats (databricks) da3b4f7c72
Fix panic in `apply_presets.go` (#1796)
## Changes

This fixes the user-reported panic in `apply_presets.go`. I'm still
unsure how to reproduce this, since the CLI just reports `ob broken_job
is not defined` when I try to use `bundle deploy` with an empty job.
That said — we may as well be defensive here and I see we have lots of
checks for empty job/cluster/etc. settings scattered throughout our code
base so at least we're somewhat consistent.
2024-09-29 14:08:10 +00:00
Pieter Noordhuis 1d1aa0a416
Rename `RootPath` -> `BundleRootPath` (#1792)
## Changes

After introducing the `SyncRootPath` field on the bundle (#1694), the
previous `RootPath` became ambiguous. Does it mean the bundle root path
or the sync root path? This PR renames to field to `BundleRootPath` to
remove the ambiguity.

## Tests

n/a

---------

Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2024-09-27 10:03:05 +00:00
Pieter Noordhuis 56cd96cb93
Move trampoline code into trampoline package (#1793)
## Changes

Doing this to make room for PyDABs under `bundle/python`.

## Tests

n/a
2024-09-27 09:32:54 +00:00
Pieter Noordhuis a1dca56abf
Trim trailing whitespace (#1794)
## Changes

Trailing whitespace is trimmed per the VS Code settings for this
repository.

## Tests

n/a
2024-09-27 09:30:39 +00:00
shreyas-goenka 4e8e027380
Sort tasks by `task_key` before generating the Terraform configuration (#1776)
## Changes
Sort the tasks in the resultant `bundle.tf.json`. This is important
because configuration from one task can leak into another if the tasks
are not sorted.

For more details see:
1.
https://github.com/databricks/terraform-provider-databricks/issues/3951
2.
https://github.com/databricks/terraform-provider-databricks/issues/4011

## Tests
Unit test and manually.

For manual testing I used the following configuration:
```
resources:
  jobs:
    foo:
      tasks: 
        - task_key: task-Z
          notebook_task: 
            notebook_path: nb.py
            source: GIT
          existing_cluster_id: 0715-133738-ju0ma84z

        - task_key: task-1
          notebook_task: 
            notebook_path: ${workspace.file_path}/local.py
            source: WORKSPACE
          existing_cluster_id: 0715-133738-ju0ma84z
          depends_on: 
            - task_key: task-Z


      git_source: 
        git_provider: gitHub
        git_url: https://github.com/shreyas-goenka/job-source-tmp.git
        git_branch: main
```

Steps (1):
1. Deploy this bundle.
2. Comment out "source: GIT"
3. Deploy again

Before:
Deploying this bundle twice would fail. This is because the "source:
GIT" would carry over to the next deployment.

After:
There was no error on the subsequent deployment.

Steps (2):
1. Deploy once
2. Deploy again

Before:
Works correctly but leads to an update API call every time.

After:
No diff is detected by terraform.
2024-09-26 13:22:22 +00:00
Andrew Nester 66f2ba64a8
Simplified isFullVariableOverrideDef implementation (#1791)
## Changes
Simplified isFullVariableOverrideDef implementation

Follow up on https://github.com/databricks/cli/pull/1787
2024-09-26 12:55:07 +00:00
shreyas-goenka 495040e4cd
Modify SetLocation test utility to take full locations as argument (#1788)
I plan to use this in https://github.com/databricks/cli/pull/1780, to
set the line and column numbers as well for the locations.

gopatch file used:
```
@@
var x expression
var y expression
var z expression
@@
-bundletest.SetLocation(x, y, z)
+bundletest.SetLocation(x, y, []dyn.Location{{File: z}})
```
2024-09-25 16:13:48 +00:00
Andrew Nester b3a3071086
Fixed full variable override detection (#1787)
## Changes
Fixes #1786 

## Tests
All valid override combinations are added as test cases
2024-09-25 12:35:16 +00:00
Gleb Kanterov 3d9decdda9
Add JobTaskClusterSpec validate mutator (#1784)
## Changes
Add JobTaskClusterSpec validate mutator. It catches the case when tasks
don't specify which cluster to use.

For example, we can get this error with minor modifications to
`default-python` template:

```yaml
      tasks:
        - task_key: python_file_task
          spark_python_task:
            python_file: ../src/my_project_10/main.py
```

```
 % databricks bundle validate
Error: Missing required cluster or environment settings
  at resources.jobs.my_project_10_job.tasks[0]
  in resources/my_project_10_job.yml:17:11

Task "print_github_stars" requires a cluster or an environment to run.
Specify one of the following fields: job_cluster_key, environment_key, existing_cluster_id, new_cluster.
```
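One way to resolve the error is to set one of the listed fields on the task, for example by pointing it at a job cluster (a sketch; the cluster key is illustrative and must exist under `job_clusters`):

```yaml
      tasks:
        - task_key: python_file_task
          job_cluster_key: default_cluster
          spark_python_task:
            python_file: ../src/my_project_10/main.py
```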

We implicitly rely on "one of" validation, which does not exist. Many
bundle fields can't co-exist, for instance, specifying:
`JobTask.{existing_cluster_id,job_cluster_key}`, `Library.{whl,pypi}`,
`JobTask.{notebook_task,python_wheel_task}`, etc.

## Tests
Unit tests

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-09-25 11:30:14 +00:00