Compare commits

...

22 Commits

Author SHA1 Message Date
Shreyas Goenka 4373e7b513
make unit test package separate 2024-09-26 17:57:24 +02:00
Shreyas Goenka 28917ebdab
- 2024-09-26 17:55:09 +02:00
Shreyas Goenka 3203858032
add comment 2024-09-26 17:48:15 +02:00
Shreyas Goenka fec73db111
address comments 2024-09-26 17:47:07 +02:00
Shreyas Goenka 4cc0dd1fa1
Merge remote-tracking branch 'origin' into detect-convention 2024-09-26 17:33:44 +02:00
Shreyas Goenka 6040e58262
complete info to recommendation 2024-09-26 17:23:51 +02:00
Shreyas Goenka d9cf582287
fix failing tests 2024-09-26 17:14:43 +02:00
Shreyas Goenka 4734249854
convert inline tests to files 2024-09-26 16:51:46 +02:00
Shreyas Goenka d14b1e2cca
directly pass config value 2024-09-26 16:07:32 +02:00
Shreyas Goenka 5308ad8bf3
split into summary and detail 2024-09-26 16:03:49 +02:00
Shreyas Goenka fd01824ee1
- 2024-09-26 15:39:41 +02:00
shreyas-goenka 4e8e027380
Sort tasks by `task_key` before generating the Terraform configuration (#1776)
## Changes
Sort the tasks in the resultant `bundle.tf.json`. This is important
because configuration from one task can leak into another if the tasks
are not sorted.

For more details see:
1.
https://github.com/databricks/terraform-provider-databricks/issues/3951
2.
https://github.com/databricks/terraform-provider-databricks/issues/4011

## Tests
Unit tests and manual testing.

For manual testing I used the following configuration:
```
resources:
  jobs:
    foo:
      tasks: 
        - task_key: task-Z
          notebook_task: 
            notebook_path: nb.py
            source: GIT
          existing_cluster_id: 0715-133738-ju0ma84z

        - task_key: task-1
          notebook_task: 
            notebook_path: ${workspace.file_path}/local.py
            source: WORKSPACE
          existing_cluster_id: 0715-133738-ju0ma84z
          depends_on: 
            - task_key: task-Z


      git_source: 
        git_provider: gitHub
        git_url: https://github.com/shreyas-goenka/job-source-tmp.git
        git_branch: main
```

Steps (1):
1. Deploy this bundle.
2. Comment out "source: GIT"
3. Deploy again

Before:
Deploying this bundle twice would fail because the "source: GIT" setting
would carry over to the next deployment.

After:
The subsequent deployment succeeds without error.

Steps (2):
1. Deploy once
2. Deploy again

Before:
Works correctly, but triggers an update API call on every deployment.

After:
No diff is detected by terraform.
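
A minimal sketch of the idea, assuming a hypothetical `sortJobTasks` helper applied to the job settings before they are converted into `bundle.tf.json` (the actual change lives in the Terraform conversion code):

```go
package main

import (
	"sort"

	"github.com/databricks/databricks-sdk-go/service/jobs"
)

// sortJobTasks orders tasks deterministically by task_key so that the
// generated bundle.tf.json is stable and settings from one task cannot
// shift onto a neighbouring task between deployments.
func sortJobTasks(tasks []jobs.Task) {
	sort.Slice(tasks, func(i, j int) bool {
		return tasks[i].TaskKey < tasks[j].TaskKey
	})
}
```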
2024-09-26 13:22:22 +00:00
Andrew Nester 66f2ba64a8
Simplified isFullVariableOverrideDef implementation (#1791)
## Changes
Simplified isFullVariableOverrideDef implementation

Follow up on https://github.com/databricks/cli/pull/1787
2024-09-26 12:55:07 +00:00
dependabot[bot] 94d8c3ba1e
Bump github.com/hashicorp/hc-install from 0.7.0 to 0.9.0 (#1772)
Bumps [github.com/hashicorp/hc-install](https://github.com/hashicorp/hc-install) from 0.7.0 to 0.9.0.

Release notes (sourced from [github.com/hashicorp/hc-install's releases](https://github.com/hashicorp/hc-install/releases)):

**v0.9.0**

- Finish Release of 0.8.1 by updating VERSION by @mutahhir in hashicorp/hc-install#248
- build(deps): bump golang.org/x/mod from 0.20.0 to 0.21.0 by @dependabot in hashicorp/hc-install#242
- docs: Update release instructions by @radeksimko in hashicorp/hc-install#249
- Prepare for v0.9.0 release by @mutahhir in hashicorp/hc-install#250

Full changelog: https://github.com/hashicorp/hc-install/compare/v0.8.1...v0.9.0

**v0.8.1**

- Add artifacts manifest (automatically generated) by @jeanneryan in hashicorp/hc-install#235
- build(deps): bump golang.org/x/mod from 0.19.0 to 0.20.0 by @dependabot in hashicorp/hc-install#236
- build(deps): Bump workflows to latest trusted versions by @hashicorp-tsccr in hashicorp/hc-install#237
- build(deps): Bump workflows to latest trusted versions by @hashicorp-tsccr in hashicorp/hc-install#238
- build(deps): bump hashicorp/action-setup-bob from 2.1.0 to 2.1.1 in the github-actions-backward-compatible group by @dependabot in hashicorp/hc-install#241
- httpclient: Reuse existing configured logger by @radeksimko in hashicorp/hc-install#240
- build(deps): Bump workflows to latest trusted versions by @hashicorp-tsccr in hashicorp/hc-install#243
- Nightly and PR builds: fix "no space left on device" on macOS runner by @kmoe in hashicorp/hc-install#245
- Update CONTRIBUTING.md to add clean up step by @mutahhir in hashicorp/hc-install#246

New contributors: @jeanneryan made their first contribution in hashicorp/hc-install#235; @mutahhir made their first contribution in hashicorp/hc-install#246.

Full changelog: https://github.com/hashicorp/hc-install/compare/v0.8.0...v0.8.1

**v0.8.0**

ENHANCEMENTS:

- Add retries for HTTP operations by @james0209 in hashicorp/hc-install#218
- Allow `LicenseDir` field for non-enterprise usage by @james0209 in hashicorp/hc-install#214

BUG FIXES:

- [fix] include custom url's "path" when creating Archive URL by @james0209 in hashicorp/hc-install#234

INTERNAL:

- [chore] Remove unused variable by @james0209 in hashicorp/hc-install#215
- build(deps): bump github.com/hashicorp/go-retryablehttp from 0.7.6 to 0.7.7 by @dependabot in hashicorp/hc-install#221
- build(deps): bump github.com/hashicorp/go-version from 1.6.0 to 1.7.0 by @dependabot in hashicorp/hc-install#216
- build(deps): bump golang.org/x/mod from 0.17.0 to 0.18.0 by @dependabot in hashicorp/hc-install#223
- build(deps): bump golang.org/x/mod from 0.18.0 to 0.19.0 by @dependabot in hashicorp/hc-install#229
- build(deps): bump hashicorp/action-setup-bob from 2.0.0 to 2.0.3 in the github-actions-backward-compatible group by @dependabot in hashicorp/hc-install#220
- build(deps): bump hashicorp/action-setup-bob from 2.0.3 to 2.1.0 in the github-actions-backward-compatible group by @dependabot in hashicorp/hc-install#222
- build(deps): bump hashicorp/actions-packaging-linux from 1.7 to 1.8 in the github-actions-backward-compatible group by @dependabot in hashicorp/hc-install#224
- build(deps): Bump workflows to latest trusted versions by @hashicorp-tsccr in hashicorp/hc-install#219
- build(deps): Bump workflows to latest trusted versions by @hashicorp-tsccr in hashicorp/hc-install#226

... (truncated)

Commits:

- 157a802: Merge pull request hashicorp/hc-install#250 from hashicorp/release-0.9.0
- 4c734fc: Prepare for v0.9.0 release
- d78b328: Merge pull request hashicorp/hc-install#249 from hashicorp/d-contributing-md-update
- 34f38b0: docs: Update release instructions
- 6a5aa83: build(deps): bump golang.org/x/mod from 0.20.0 to 0.21.0 (hashicorp/hc-install#242)
- 1784fcc: Merge pull request hashicorp/hc-install#248 from hashicorp/revert-version-contents
- ea2c69b: Finish Release of 0.8.1 by updating VERSION
- 4f3e00e: Releasing 0.8.1
- c6d1ced: Merge pull request hashicorp/hc-install#246 from hashicorp/update-contributing
- eea12f1: Update CONTRIBUTING.md to add clean up step
- Additional commits viewable in the [compare view](https://github.com/hashicorp/hc-install/compare/v0.7.0...v0.9.0)

Most recent ignore conditions applied to this pull request:

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/hashicorp/hc-install | [>= 0.8.a, < 0.9] |



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-26 06:29:34 +00:00
dependabot[bot] 875b112f80
Bump golang.org/x/mod from 0.20.0 to 0.21.0 (#1758)
Bumps [golang.org/x/mod](https://github.com/golang/mod) from 0.20.0 to 0.21.0.

Commits:

- 46a3137: zip: set GIT_DIR in test when using bare repositories
- 3afcd4e: go.mod: set go version to 1.22.0
- b1d336c: go.mod: update required go version to go1.22
- See full diff in the [compare view](https://github.com/golang/mod/compare/v0.20.0...v0.21.0)



---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2024-09-26 06:07:01 +00:00
shreyas-goenka 495040e4cd
Modify SetLocation test utility to take full locations as argument (#1788)
I plan to use this in https://github.com/databricks/cli/pull/1780 to also
set the line and column numbers for the locations.

gopatch file used:
```
@@
var x expression
var y expression
var z expression
@@
-bundletest.SetLocation(x, y, z)
+bundletest.SetLocation(x, y, []dyn.Location{{File: z}})
```
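
After the rewrite, a call site looks like this (assuming a `*bundle.Bundle` named `b`; the path and position values are illustrative):

```go
bundletest.SetLocation(b, "resources.jobs.job1", []dyn.Location{
	{File: "resources/job1.yml", Line: 4, Column: 7},
})
```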
2024-09-25 16:13:48 +00:00
Pieter Noordhuis 7f1121d8d8
Pin Go toolchain to 1.22.7 (#1790)
## Changes

Relates to https://github.com/databricks/cli/pull/1758.

More information about toolchains:
* https://go.dev/blog/toolchain
* https://go.dev/doc/toolchain

We need to specify the toolchain as we need to bump Go to 1.22.0 for the
`mod` upgrade and want to use the latest toolchain on the 1.22 series.

## Tests

The previous release was made with Go 1.22.7 so we should continue to
use it.
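
For illustration, the pinning amounts to a `toolchain` directive next to the `go` directive in `go.mod`:

```
go 1.22.0

// Builds use exactly this toolchain, even when a newer Go version is installed locally.
toolchain go1.22.7
```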
2024-09-25 15:45:28 +00:00
shreyas-goenka a4ba0bbe9f
Add sub-extension to resource files in built-in templates (#1777)
## Changes
We want to encourage a pattern of only specifying a single resource in a
YAML file when an `.<resource-type>.yml` (like `.job.yml`) is used. This
convention could allow us to bijectively map a resource YAML file to
its corresponding resource in the Databricks workspace.

This PR simply makes the built-in templates compliant with this format.

## Tests
Existing tests.
2024-09-25 12:58:14 +00:00
Andrew Nester b3a3071086
Fixed full variable override detection (#1787)
## Changes
Fixes #1786 

## Tests
All valid override combinations are added as test cases
2024-09-25 12:35:16 +00:00
Gleb Kanterov 3d9decdda9
Add JobTaskClusterSpec validate mutator (#1784)
## Changes
Add JobTaskClusterSpec validate mutator. It catches the case where tasks
don't specify which cluster to use.

For example, we can get this error with minor modifications to
`default-python` template:

```yaml
      tasks:
        - task_key: python_file_task
          spark_python_task:
            python_file: ../src/my_project_10/main.py
```

```
 % databricks bundle validate
Error: Missing required cluster or environment settings
  at resources.jobs.my_project_10_job.tasks[0]
  in resources/my_project_10_job.yml:17:11

Task "print_github_stars" requires a cluster or an environment to run.
Specify one of the following fields: job_cluster_key, environment_key, existing_cluster_id, new_cluster.
```

We implicitly rely on "one of" validation, which does not exist. Many
bundle fields can't co-exist, for instance, specifying:
`JobTask.{existing_cluster_id,job_cluster_key}`, `Library.{whl,pypi}`,
`JobTask.{notebook_task,python_wheel_task}`, etc.
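
A minimal sketch of the underlying check, assuming a simplified view of a task (the function name is illustrative, not the mutator's actual code):

```go
package main

import "github.com/databricks/databricks-sdk-go/service/jobs"

// hasComputeConfigured reports whether a task specifies at least one of the
// mutually alternative settings that determine where it runs. The mutator
// emits a diagnostic at the task's location when all of them are empty.
func hasComputeConfigured(t jobs.Task) bool {
	return t.JobClusterKey != "" ||
		t.EnvironmentKey != "" ||
		t.ExistingClusterId != "" ||
		t.NewCluster != nil
}
```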

## Tests
Unit tests

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2024-09-25 11:30:14 +00:00
Gleb Kanterov 490259a14a
Refactor jobs path translation (#1782)
## Changes
Extract package for other modules to transform different kinds of paths
in job resources.

## Tests
Unit tests
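
Based on the `VisitJobPaths` and `VisitFunc` signatures introduced by this PR (visible in the `paths` package diff further down), a sketch of how a caller can walk every path-like field in job resources; the surrounding function is illustrative:

```go
package main

import (
	"fmt"

	"github.com/databricks/cli/bundle/config/mutator/paths"
	"github.com/databricks/cli/libs/dyn"
)

// printJobPaths lists every path referenced by job resources in the given
// configuration value, leaving the configuration unchanged.
func printJobPaths(root dyn.Value) error {
	_, err := paths.VisitJobPaths(root, func(p dyn.Path, kind paths.PathKind, v dyn.Value) (dyn.Value, error) {
		fmt.Printf("%s (kind %d): %s\n", p.String(), kind, v.MustString())
		return v, nil
	})
	return err
}
```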
2024-09-24 13:51:54 +00:00
shreyas-goenka 0cc35ca056
Assert tokens are redacted in origin URL when username is not specified (#1785)
TSIA
2024-09-23 12:42:30 +00:00
46 changed files with 1310 additions and 549 deletions

View File

@@ -33,7 +33,7 @@ jobs:
       - name: Setup Go
         uses: actions/setup-go@v5
         with:
-          go-version: 1.22.x
+          go-version: 1.22.7

       - name: Setup Python
         uses: actions/setup-python@v5
@@ -68,7 +68,7 @@ jobs:
       - name: Setup Go
         uses: actions/setup-go@v5
         with:
-          go-version: 1.22.x
+          go-version: 1.22.7
           # No need to download cached dependencies when running gofmt.
           cache: false
@@ -100,7 +100,7 @@ jobs:
       - name: Setup Go
         uses: actions/setup-go@v5
         with:
-          go-version: 1.22.x
+          go-version: 1.22.7
       # Github repo: https://github.com/ajv-validator/ajv-cli
       - name: Install ajv-cli

View File

@@ -21,7 +21,7 @@ jobs:
       - name: Setup Go
         uses: actions/setup-go@v5
         with:
-          go-version: 1.22.x
+          go-version: 1.22.7
           # The default cache key for this action considers only the `go.sum` file.
           # We include .goreleaser.yaml here to differentiate from the cache used by the push action

View File

@@ -22,7 +22,7 @@ jobs:
       - name: Setup Go
         uses: actions/setup-go@v5
         with:
-          go-version: 1.22.x
+          go-version: 1.22.7
           # The default cache key for this action considers only the `go.sum` file.
           # We include .goreleaser.yaml here to differentiate from the cache used by the push action

View File

@@ -3,6 +3,7 @@ package loader
 import (
 	"context"
 	"fmt"
+	"slices"
 	"sort"
 	"strings"
@@ -10,26 +11,14 @@ import (
 	"github.com/databricks/cli/bundle/config"
 	"github.com/databricks/cli/libs/diag"
 	"github.com/databricks/cli/libs/dyn"
-	"golang.org/x/exp/maps"
 )

-var resourceTypes = []string{
-	"job",
-	"pipeline",
-	"model",
-	"experiment",
-	"model_serving_endpoint",
-	"registered_model",
-	"quality_monitor",
-	"schema",
-	"cluster",
-}
-
-func validateFileFormat(r *config.Root, filePath string) diag.Diagnostics {
-	for _, typ := range resourceTypes {
-		for _, ext := range []string{fmt.Sprintf(".%s.yml", typ), fmt.Sprintf(".%s.yaml", typ)} {
+func validateFileFormat(configRoot dyn.Value, filePath string) diag.Diagnostics {
+	for _, resourceDescription := range config.SupportedResources() {
+		singularName := resourceDescription.SingularName
+		for _, ext := range []string{fmt.Sprintf(".%s.yml", singularName), fmt.Sprintf(".%s.yaml", singularName)} {
 			if strings.HasSuffix(filePath, ext) {
-				return validateSingleResourceDefined(r, ext, typ)
+				return validateSingleResourceDefined(configRoot, ext, singularName)
 			}
 		}
 	}
@@ -37,7 +26,7 @@ func validateFileFormat(r *config.Root, filePath string) diag.Diagnostics {
 	return nil
 }

-func validateSingleResourceDefined(r *config.Root, ext, typ string) diag.Diagnostics {
+func validateSingleResourceDefined(configRoot dyn.Value, ext, typ string) diag.Diagnostics {
 	type resource struct {
 		path dyn.Path
 		value dyn.Value
@@ -46,16 +35,17 @@ func validateSingleResourceDefined(r *config.Root, ext, typ string) diag.Diagnos
 	}

 	resources := []resource{}
+	supportedResources := config.SupportedResources()

 	// Gather all resources defined in the resources block.
 	_, err := dyn.MapByPattern(
-		r.Value(),
+		configRoot,
 		dyn.NewPattern(dyn.Key("resources"), dyn.AnyKey(), dyn.AnyKey()),
 		func(p dyn.Path, v dyn.Value) (dyn.Value, error) {
 			// The key for the resource. Eg: "my_job" for jobs.my_job.
 			k := p[2].Key()
 			// The type of the resource. Eg: "job" for jobs.my_job.
-			typ := strings.TrimSuffix(p[1].Key(), "s")
+			typ := supportedResources[p[1].Key()].SingularName

 			resources = append(resources, resource{path: p, value: v, typ: typ, key: k})
 			return v, nil
@@ -66,13 +56,13 @@ func validateSingleResourceDefined(r *config.Root, ext, typ string) diag.Diagnos
 	// Gather all resources defined in a target block.
 	_, err = dyn.MapByPattern(
-		r.Value(),
+		configRoot,
 		dyn.NewPattern(dyn.Key("targets"), dyn.AnyKey(), dyn.Key("resources"), dyn.AnyKey(), dyn.AnyKey()),
 		func(p dyn.Path, v dyn.Value) (dyn.Value, error) {
 			// The key for the resource. Eg: "my_job" for jobs.my_job.
 			k := p[4].Key()
 			// The type of the resource. Eg: "job" for jobs.my_job.
-			typ := strings.TrimSuffix(p[3].Key(), "s")
+			typ := supportedResources[p[3].Key()].SingularName

 			resources = append(resources, resource{path: p, value: v, typ: typ, key: k})
 			return v, nil
@@ -93,28 +83,26 @@ func validateSingleResourceDefined(r *config.Root, ext, typ string) diag.Diagnos
 		seenKeys[rr.key] = struct{}{}
 	}

-	// Format matches. There's less than or equal to one resource defined in the file.
+	// Format matches. There's at most one resource defined in the file.
 	// The resource is also of the correct type.
 	if typeMatch && len(seenKeys) <= 1 {
 		return nil
 	}

-	msg := strings.Builder{}
-	msg.WriteString(fmt.Sprintf("We recommend only defining a single %s in a file with the %s extension.\n", typ, ext))
-
-	// Dedup the list of resources before adding them the diagnostic message. This
-	// is needed because we do not dedup earlier when gathering the resources and
-	// it's valid to define the same resource in both the resources and targets block.
-	msg.WriteString("The following resources are defined or configured in this file:\n")
-	setOfLines := map[string]struct{}{}
+	detail := strings.Builder{}
+	detail.WriteString("The following resources are defined or configured in this file:\n")
+	lines := []string{}
 	for _, r := range resources {
-		setOfLines[fmt.Sprintf(" - %s (%s)\n", r.key, r.typ)] = struct{}{}
+		lines = append(lines, fmt.Sprintf(" - %s (%s)\n", r.key, r.typ))
 	}

-	// Sort the lines to print to make the output deterministic.
-	listOfLines := maps.Keys(setOfLines)
-	sort.Strings(listOfLines)
-	for _, l := range listOfLines {
-		msg.WriteString(l)
+	// Sort the lines to print to make the output deterministic.
+	sort.Strings(lines)
+	// Compact the lines before writing them to the message to remove any duplicate lines.
+	// This is needed because we do not dedup earlier when gathering the resources
+	// and it's valid to define the same resource in both the resources and targets block.
+	lines = slices.Compact(lines)
+	for _, l := range lines {
+		detail.WriteString(l)
 	}

 	locations := []dyn.Location{}
@@ -133,8 +121,9 @@ func validateSingleResourceDefined(r *config.Root, ext, typ string) diag.Diagnos
 	return diag.Diagnostics{
 		{
-			Severity:  diag.Info,
-			Summary:   msg.String(),
+			Severity:  diag.Recommendation,
+			Summary:   fmt.Sprintf("We recommend only defining a single %s in a file with the %s extension.", typ, ext),
+			Detail:    detail.String(),
 			Locations: locations,
 			Paths:     paths,
 		},
@@ -165,7 +154,7 @@ func (m *processInclude) Apply(_ context.Context, b *bundle.Bundle) diag.Diagnos
 	}

 	// Add any diagnostics associated with the file format.
-	diags = append(diags, validateFileFormat(this, m.relPath)...)
+	diags = append(diags, validateFileFormat(this.Value(), m.relPath)...)
 	if diags.HasError() {
 		return diags
 	}

View File

@@ -1,16 +1,13 @@
-package loader
+package loader_test

 import (
 	"context"
 	"path/filepath"
-	"reflect"
-	"strings"
 	"testing"

 	"github.com/databricks/cli/bundle"
 	"github.com/databricks/cli/bundle/config"
-	"github.com/databricks/cli/bundle/config/resources"
-	"github.com/databricks/cli/bundle/internal/bundletest"
+	"github.com/databricks/cli/bundle/config/loader"
 	"github.com/databricks/cli/libs/diag"
 	"github.com/databricks/cli/libs/dyn"
 	"github.com/stretchr/testify/assert"
@@ -27,7 +24,7 @@ func TestProcessInclude(t *testing.T) {
 		},
 	}

-	m := ProcessInclude(filepath.Join(b.RootPath, "host.yml"), "host.yml")
+	m := loader.ProcessInclude(filepath.Join(b.RootPath, "host.yml"), "host.yml")
 	assert.Equal(t, "ProcessInclude(host.yml)", m.Name())

 	// Assert the host value prior to applying the mutator
@@ -39,338 +36,137 @@ func TestProcessInclude(t *testing.T) {
 	assert.Equal(t, "bar", b.Config.Workspace.Host)
 }

-func TestProcessIncludeValidatesFileFormat(t *testing.T) {
-	b := &bundle.Bundle{
-		RootPath: "testdata/format",
-		Config: config.Root{
-			Bundle: config.Bundle{
-				Name: "format_test",
-			},
-		},
-	}
-
-	m := ProcessInclude(filepath.Join(b.RootPath, "foo.job.yml"), "foo.job.yml")
-	diags := bundle.Apply(context.Background(), b, m)
-	require.NoError(t, diags.Error())
-
-	// Assert that the diagnostics contain the expected information
-	assert.Len(t, diags, 1)
-	assert.Equal(t, diag.Diagnostics{
-		{
-			Severity: diag.Info,
-			Summary:  "We recommend only defining a single job in a file with the .job.yml extension.\nThe following resources are defined or configured in this file:\n - bar (job)\n - foo (job)\n",
-			Locations: []dyn.Location{
-				{File: filepath.FromSlash("testdata/format/foo.job.yml"), Line: 4, Column: 7},
-				{File: filepath.FromSlash("testdata/format/foo.job.yml"), Line: 7, Column: 7},
-			},
-			Paths: []dyn.Path{
-				dyn.MustPathFromString("resources.jobs.bar"),
-				dyn.MustPathFromString("resources.jobs.foo"),
-			},
-		},
-	}, diags)
-}
-
-func TestResourceNames(t *testing.T) {
-	names := []string{}
-	typ := reflect.TypeOf(config.Resources{})
-	for i := 0; i < typ.NumField(); i++ {
-		field := typ.Field(i)
-		jsonTags := strings.Split(field.Tag.Get("json"), ",")
-		singularName := strings.TrimSuffix(jsonTags[0], "s")
-		names = append(names, singularName)
-	}
-
-	// Assert the contents of the two lists are equal. Please add the singular
-	// name of your resource to resourceNames global if you are adding a new
-	// resource.
-	assert.Equal(t, len(resourceTypes), len(names))
-	for _, name := range names {
-		assert.Contains(t, resourceTypes, name)
-	}
-}
-
-func TestValidateFileFormat(t *testing.T) {
-	onlyJob := config.Root{
-		Resources: config.Resources{
-			Jobs: map[string]*resources.Job{
-				"job1": {},
-			},
-		},
-		Targets: map[string]*config.Target{
-			"target1": {
-				Resources: &config.Resources{
-					Jobs: map[string]*resources.Job{
-						"job1": {},
-					},
-				},
-			},
-		},
-	}
-	onlyJobBundle := bundle.Bundle{Config: onlyJob}
-
-	onlyPipeline := config.Root{
-		Resources: config.Resources{
-			Pipelines: map[string]*resources.Pipeline{
-				"pipeline1": {},
-			},
-		},
-	}
-	onlyPipelineBundle := bundle.Bundle{Config: onlyPipeline}
-
-	bothJobAndPipeline := config.Root{
-		Resources: config.Resources{
-			Jobs: map[string]*resources.Job{
-				"job1": {},
-			},
-		},
-		Targets: map[string]*config.Target{
-			"target1": {
-				Resources: &config.Resources{
-					Pipelines: map[string]*resources.Pipeline{
-						"pipeline1": {},
-					},
-				},
-			},
-		},
-	}
-	bothJobAndPipelineBundle := bundle.Bundle{Config: bothJobAndPipeline}
-
-	twoJobs := config.Root{
-		Resources: config.Resources{
-			Jobs: map[string]*resources.Job{
-				"job1": {},
-				"job2": {},
-			},
-		},
-	}
-	twoJobsBundle := bundle.Bundle{Config: twoJobs}
-
-	twoJobsTopLevelAndTarget := config.Root{
-		Resources: config.Resources{
-			Jobs: map[string]*resources.Job{
-				"job1": {},
-			},
-		},
-		Targets: map[string]*config.Target{
-			"target1": {
-				Resources: &config.Resources{
-					Jobs: map[string]*resources.Job{
-						"job2": {},
-					},
-				},
-			},
-		},
-	}
-	twoJobsTopLevelAndTargetBundle := bundle.Bundle{Config: twoJobsTopLevelAndTarget}
-
-	twoJobsInTarget := config.Root{
-		Targets: map[string]*config.Target{
-			"target1": {
-				Resources: &config.Resources{
-					Jobs: map[string]*resources.Job{
-						"job1": {},
-						"job2": {},
-					},
-				},
-			},
-		},
-	}
-	twoJobsInTargetBundle := bundle.Bundle{Config: twoJobsInTarget}
-
-	tcases := []struct {
-		name      string
-		bundle    *bundle.Bundle
-		expected  diag.Diagnostics
-		fileName  string
-		locations map[string]dyn.Location
-	}{
-		{
-			name:     "single job",
-			bundle:   &onlyJobBundle,
-			expected: nil,
-			fileName: "foo.job.yml",
-			locations: map[string]dyn.Location{
-				"resources.jobs.job1": {File: "foo.job.yml", Line: 1, Column: 1},
-			},
-		},
-		{
-			name:     "single pipeline",
-			bundle:   &onlyPipelineBundle,
-			expected: nil,
-			fileName: "foo.pipeline.yml",
-			locations: map[string]dyn.Location{
-				"resources.pipelines.pipeline1": {File: "foo.pipeline.yaml", Line: 1, Column: 1},
-			},
-		},
-		{
-			name:   "single job but extension is pipeline",
-			bundle: &onlyJobBundle,
-			expected: diag.Diagnostics{
-				{
-					Severity: diag.Info,
-					Summary:  "We recommend only defining a single pipeline in a file with the .pipeline.yml extension.\nThe following resources are defined or configured in this file:\n - job1 (job)\n",
-					Locations: []dyn.Location{
-						{File: "foo.pipeline.yml", Line: 1, Column: 1},
-						{File: "foo.pipeline.yml", Line: 2, Column: 2},
-					},
-					Paths: []dyn.Path{
-						dyn.MustPathFromString("resources.jobs.job1"),
-						dyn.MustPathFromString("targets.target1.resources.jobs.job1"),
-					},
-				},
-			},
-			fileName: "foo.pipeline.yml",
-			locations: map[string]dyn.Location{
-				"resources.jobs.job1":                 {File: "foo.pipeline.yml", Line: 1, Column: 1},
-				"targets.target1.resources.jobs.job1": {File: "foo.pipeline.yml", Line: 2, Column: 2},
-			},
-		},
-		{
-			name:     "job and pipeline",
-			bundle:   &bothJobAndPipelineBundle,
-			expected: nil,
-			fileName: "foo.yml",
-			locations: map[string]dyn.Location{
-				"resources.jobs.job1":                           {File: "foo.yml", Line: 1, Column: 1},
-				"targets.target1.resources.pipelines.pipeline1": {File: "foo.yml", Line: 2, Column: 2},
-			},
-		},
-		{
-			name:   "job and pipeline but extension is job",
-			bundle: &bothJobAndPipelineBundle,
-			expected: diag.Diagnostics{
-				{
-					Severity: diag.Info,
-					Summary:  "We recommend only defining a single job in a file with the .job.yml extension.\nThe following resources are defined or configured in this file:\n - job1 (job)\n - pipeline1 (pipeline)\n",
-					Locations: []dyn.Location{
-						{File: "foo.job.yml", Line: 1, Column: 1},
-						{File: "foo.job.yml", Line: 2, Column: 2},
-					},
-					Paths: []dyn.Path{
-						dyn.MustPathFromString("resources.jobs.job1"),
-						dyn.MustPathFromString("targets.target1.resources.pipelines.pipeline1"),
-					},
-				},
-			},
-			fileName: "foo.job.yml",
-			locations: map[string]dyn.Location{
-				"resources.jobs.job1":                           {File: "foo.job.yml", Line: 1, Column: 1},
-				"targets.target1.resources.pipelines.pipeline1": {File: "foo.job.yml", Line: 2, Column: 2},
-			},
-		},
-		{
-			name:   "job and pipeline but extension is experiment",
-			bundle: &bothJobAndPipelineBundle,
-			expected: diag.Diagnostics{
-				{
-					Severity: diag.Info,
-					Summary:  "We recommend only defining a single experiment in a file with the .experiment.yml extension.\nThe following resources are defined or configured in this file:\n - job1 (job)\n - pipeline1 (pipeline)\n",
-					Locations: []dyn.Location{
-						{File: "foo.experiment.yml", Line: 1, Column: 1},
-						{File: "foo.experiment.yml", Line: 2, Column: 2},
-					},
-					Paths: []dyn.Path{
-						dyn.MustPathFromString("resources.jobs.job1"),
-						dyn.MustPathFromString("targets.target1.resources.pipelines.pipeline1"),
-					},
-				},
-			},
-			fileName: "foo.experiment.yml",
-			locations: map[string]dyn.Location{
-				"resources.jobs.job1":                           {File: "foo.experiment.yml", Line: 1, Column: 1},
-				"targets.target1.resources.pipelines.pipeline1": {File: "foo.experiment.yml", Line: 2, Column: 2},
-			},
-		},
-		{
-			name:   "two jobs",
-			bundle: &twoJobsBundle,
-			expected: diag.Diagnostics{
-				{
-					Severity: diag.Info,
-					Summary:  "We recommend only defining a single job in a file with the .job.yml extension.\nThe following resources are defined or configured in this file:\n - job1 (job)\n - job2 (job)\n",
-					Locations: []dyn.Location{
-						{File: "foo.job.yml", Line: 1, Column: 1},
-						{File: "foo.job.yml", Line: 2, Column: 2},
-					},
-					Paths: []dyn.Path{
-						dyn.MustPathFromString("resources.jobs.job1"),
-						dyn.MustPathFromString("resources.jobs.job2"),
-					},
-				},
-			},
-			fileName: "foo.job.yml",
-			locations: map[string]dyn.Location{
-				"resources.jobs.job1": {File: "foo.job.yml", Line: 1, Column: 1},
-				"resources.jobs.job2": {File: "foo.job.yml", Line: 2, Column: 2},
-			},
-		},
-		{
-			name:     "two jobs but extension is simple yaml",
-			bundle:   &twoJobsBundle,
-			expected: nil,
-			fileName: "foo.yml",
-			locations: map[string]dyn.Location{
-				"resources.jobs.job1": {File: "foo.yml", Line: 1, Column: 1},
-				"resources.jobs.job2": {File: "foo.yml", Line: 2, Column: 2},
-			},
-		},
-		{
-			name:   "two jobs in top level and target",
-			bundle: &twoJobsTopLevelAndTargetBundle,
-			expected: diag.Diagnostics{
-				{
-					Severity: diag.Info,
-					Summary:  "We recommend only defining a single job in a file with the .job.yml extension.\nThe following resources are defined or configured in this file:\n - job1 (job)\n - job2 (job)\n",
-					Locations: []dyn.Location{
-						{File: "foo.job.yml", Line: 1, Column: 1},
-						{File: "foo.job.yml", Line: 2, Column: 2},
-					},
-					Paths: []dyn.Path{
-						dyn.MustPathFromString("resources.jobs.job1"),
-						dyn.MustPathFromString("targets.target1.resources.jobs.job2"),
-					},
-				},
-			},
-			fileName: "foo.job.yml",
-			locations: map[string]dyn.Location{
-				"resources.jobs.job1":                 {File: "foo.job.yml", Line: 1, Column: 1},
-				"targets.target1.resources.jobs.job2": {File: "foo.job.yml", Line: 2, Column: 2},
-			},
-		},
-		{
-			name:   "two jobs in target",
-			bundle: &twoJobsInTargetBundle,
-			expected: diag.Diagnostics{
-				{
-					Severity: diag.Info,
-					Summary:  "We recommend only defining a single job in a file with the .job.yml extension.\nThe following resources are defined or configured in this file:\n - job1 (job)\n - job2 (job)\n",
-					Locations: []dyn.Location{
-						{File: "foo.job.yml", Line: 1, Column: 1},
-						{File: "foo.job.yml", Line: 2, Column: 2},
-					},
-					Paths: []dyn.Path{
-						dyn.MustPathFromString(("targets.target1.resources.jobs.job1")),
-						dyn.MustPathFromString("targets.target1.resources.jobs.job2"),
-					},
-				},
-			},
-			fileName: "foo.job.yml",
-			locations: map[string]dyn.Location{
-				"targets.target1.resources.jobs.job1": {File: "foo.job.yml", Line: 1, Column: 1},
-				"targets.target1.resources.jobs.job2": {File: "foo.job.yml", Line: 2, Column: 2},
-			},
-		},
-	}
-
-	for _, tc := range tcases {
-		t.Run(tc.name, func(t *testing.T) {
-			for k, v := range tc.locations {
-				bundletest.SetLocation(tc.bundle, k, []dyn.Location{v})
-			}
-
-			diags := validateFileFormat(&tc.bundle.Config, tc.fileName)
-			assert.Equal(t, tc.expected, diags)
+func TestProcessIncludeFormatPass(t *testing.T) {
+	for _, fileName := range []string{
+		"one_job.job.yml",
+		"one_pipeline.pipeline.yaml",
+		"two_job.yml",
+		"job_and_pipeline.yml",
+	} {
+		t.Run(fileName, func(t *testing.T) {
+			b := &bundle.Bundle{
+				RootPath: "testdata/format_pass",
+				Config: config.Root{
+					Bundle: config.Bundle{
+						Name: "format_test",
+					},
+				},
+			}
+
+			m := loader.ProcessInclude(filepath.Join(b.RootPath, fileName), fileName)
+			diags := bundle.Apply(context.Background(), b, m)
+			assert.Empty(t, diags)
+		})
+	}
+}
+
+func TestProcessIncludeFormatFail(t *testing.T) {
+	for fileName, expectedDiags := range map[string]diag.Diagnostics{
+		"single_job.pipeline.yaml": {
+			{
+				Severity: diag.Recommendation,
+				Summary:  "We recommend only defining a single pipeline in a file with the .pipeline.yaml extension.",
+				Detail:   "The following resources are defined or configured in this file:\n - job1 (job)\n",
+				Locations: []dyn.Location{
+					{File: filepath.FromSlash("testdata/format_fail/single_job.pipeline.yaml"), Line: 11, Column: 11},
+					{File: filepath.FromSlash("testdata/format_fail/single_job.pipeline.yaml"), Line: 4, Column: 7},
+				},
+				Paths: []dyn.Path{
+					dyn.MustPathFromString("resources.jobs.job1"),
+					dyn.MustPathFromString("targets.target1.resources.jobs.job1"),
+				},
+			},
+		},
+		"job_and_pipeline.job.yml": {
+			{
+				Severity: diag.Recommendation,
+				Summary:  "We recommend only defining a single job in a file with the .job.yml extension.",
+				Detail:   "The following resources are defined or configured in this file:\n - job1 (job)\n - pipeline1 (pipeline)\n",
+				Locations: []dyn.Location{
+					{File: filepath.FromSlash("testdata/format_fail/job_and_pipeline.job.yml"), Line: 11, Column: 11},
+					{File: filepath.FromSlash("testdata/format_fail/job_and_pipeline.job.yml"), Line: 4, Column: 7},
+				},
+				Paths: []dyn.Path{
+					dyn.MustPathFromString("resources.pipelines.pipeline1"),
+					dyn.MustPathFromString("targets.target1.resources.jobs.job1"),
+				},
+			},
+		},
+		"job_and_pipeline.experiment.yml": {
+			{
+				Severity: diag.Recommendation,
+				Summary:  "We recommend only defining a single experiment in a file with the .experiment.yml extension.",
+				Detail:   "The following resources are defined or configured in this file:\n - job1 (job)\n - pipeline1 (pipeline)\n",
+				Locations: []dyn.Location{
+					{File: filepath.FromSlash("testdata/format_fail/job_and_pipeline.experiment.yml"), Line: 11, Column: 11},
+					{File: filepath.FromSlash("testdata/format_fail/job_and_pipeline.experiment.yml"), Line: 4, Column: 7},
+				},
+				Paths: []dyn.Path{
+					dyn.MustPathFromString("resources.pipelines.pipeline1"),
+					dyn.MustPathFromString("targets.target1.resources.jobs.job1"),
+				},
+			},
+		},
+		"two_jobs.job.yml": {
+			{
+				Severity: diag.Recommendation,
+				Summary:  "We recommend only defining a single job in a file with the .job.yml extension.",
+				Detail:   "The following resources are defined or configured in this file:\n - job1 (job)\n - job2 (job)\n",
+				Locations: []dyn.Location{
+					{File: filepath.FromSlash("testdata/format_fail/two_jobs.job.yml"), Line: 4, Column: 7},
+					{File: filepath.FromSlash("testdata/format_fail/two_jobs.job.yml"), Line: 7, Column: 7},
+				},
+				Paths: []dyn.Path{
+					dyn.MustPathFromString("resources.jobs.job1"),
+					dyn.MustPathFromString("resources.jobs.job2"),
+				},
+			},
+		},
+		"second_job_in_target.job.yml": {
+			{
+				Severity: diag.Recommendation,
+				Summary:  "We recommend only defining a single job in a file with the .job.yml extension.",
+				Detail:   "The following resources are defined or configured in this file:\n - job1 (job)\n - job2 (job)\n",
+				Locations: []dyn.Location{
+					{File: filepath.FromSlash("testdata/format_fail/second_job_in_target.job.yml"), Line: 11, Column: 11},
+					{File: filepath.FromSlash("testdata/format_fail/second_job_in_target.job.yml"), Line: 4, Column: 7},
+				},
+				Paths: []dyn.Path{
+					dyn.MustPathFromString("resources.jobs.job1"),
+					dyn.MustPathFromString("targets.target1.resources.jobs.job2"),
+				},
+			},
+		},
+		"two_jobs_in_target.job.yml": {
+			{
+				Severity: diag.Recommendation,
+				Summary:  "We recommend only defining a single job in a file with the .job.yml extension.",
+				Detail:   "The following resources are defined or configured in this file:\n - job1 (job)\n - job2 (job)\n",
+				Locations: []dyn.Location{
+					{File: filepath.FromSlash("testdata/format_fail/two_jobs_in_target.job.yml"), Line: 6, Column: 11},
+					{File: filepath.FromSlash("testdata/format_fail/two_jobs_in_target.job.yml"), Line: 8, Column: 11},
+				},
+				Paths: []dyn.Path{
+					dyn.MustPathFromString("targets.target1.resources.jobs.job1"),
+					dyn.MustPathFromString("targets.target1.resources.jobs.job2"),
+				},
+			},
+		},
+	} {
+		t.Run(fileName, func(t *testing.T) {
+			b := &bundle.Bundle{
+				RootPath: "testdata/format_fail",
+				Config: config.Root{
+					Bundle: config.Bundle{
+						Name: "format_test",
+					},
+				},
+			}
+
+			m := loader.ProcessInclude(filepath.Join(b.RootPath, fileName), fileName)
+			diags := bundle.Apply(context.Background(), b, m)
+			require.Len(t, diags, 1)
+			assert.Equal(t, expectedDiags, diags)
 		})
 	}
 }

View File

@@ -0,0 +1,11 @@
resources:
  pipelines:
    pipeline1:
      name: pipeline1

targets:
  target1:
    resources:
      jobs:
        job1:
          name: job1

View File

@@ -0,0 +1,11 @@
resources:
  pipelines:
    pipeline1:
      name: pipeline1

targets:
  target1:
    resources:
      jobs:
        job1:
          name: job1

View File

@@ -0,0 +1,11 @@
resources:
  jobs:
    job1:
      name: job1

targets:
  target1:
    resources:
      jobs:
        job2:
          name: job2

View File

@@ -0,0 +1,11 @@
resources:
  jobs:
    job1:
      name: job1

targets:
  target1:
    resources:
      jobs:
        job1:
          description: job1

View File

@@ -0,0 +1,7 @@
resources:
  jobs:
    job1:
      name: job1

    job2:
      name: job2

View File

@@ -0,0 +1,8 @@
targets:
  target1:
    resources:
      jobs:
        job1:
          description: job1
        job2:
          description: job2

View File

@@ -0,0 +1,11 @@
resources:
  pipelines:
    pipeline1:
      name: pipeline1

targets:
  target1:
    resources:
      jobs:
        job1:
          name: job1

View File

@@ -0,0 +1,11 @@
resources:
  jobs:
    job1:
      name: job1

targets:
  target1:
    resources:
      jobs:
        job1:
          description: job1

View File

@@ -0,0 +1,5 @@
# TODO: Remove all the schema inlined references
resources:
  pipelines:
    pipeline1:
      name: pipeline1

View File

@@ -0,0 +1,7 @@
resources:
  jobs:
    job1:
      name: job1

    job2:
      name: job2

View File

@@ -0,0 +1,115 @@
package paths

import (
	"github.com/databricks/cli/bundle/libraries"
	"github.com/databricks/cli/libs/dyn"
)

type jobRewritePattern struct {
	pattern     dyn.Pattern
	kind        PathKind
	skipRewrite func(string) bool
}

func noSkipRewrite(string) bool {
	return false
}

func jobTaskRewritePatterns(base dyn.Pattern) []jobRewritePattern {
	return []jobRewritePattern{
		{
			base.Append(dyn.Key("notebook_task"), dyn.Key("notebook_path")),
			PathKindNotebook,
			noSkipRewrite,
		},
		{
			base.Append(dyn.Key("spark_python_task"), dyn.Key("python_file")),
			PathKindWorkspaceFile,
			noSkipRewrite,
		},
		{
			base.Append(dyn.Key("dbt_task"), dyn.Key("project_directory")),
			PathKindDirectory,
			noSkipRewrite,
		},
		{
			base.Append(dyn.Key("sql_task"), dyn.Key("file"), dyn.Key("path")),
			PathKindWorkspaceFile,
			noSkipRewrite,
		},
		{
			base.Append(dyn.Key("libraries"), dyn.AnyIndex(), dyn.Key("whl")),
			PathKindLibrary,
			noSkipRewrite,
		},
		{
			base.Append(dyn.Key("libraries"), dyn.AnyIndex(), dyn.Key("jar")),
			PathKindLibrary,
			noSkipRewrite,
		},
		{
			base.Append(dyn.Key("libraries"), dyn.AnyIndex(), dyn.Key("requirements")),
			PathKindWorkspaceFile,
			noSkipRewrite,
		},
	}
}

func jobRewritePatterns() []jobRewritePattern {
	// Base pattern to match all tasks in all jobs.
	base := dyn.NewPattern(
		dyn.Key("resources"),
		dyn.Key("jobs"),
		dyn.AnyKey(),
		dyn.Key("tasks"),
		dyn.AnyIndex(),
	)

	// Compile list of patterns and their respective rewrite functions.
	jobEnvironmentsPatterns := []jobRewritePattern{
		{
			dyn.NewPattern(
				dyn.Key("resources"),
				dyn.Key("jobs"),
				dyn.AnyKey(),
				dyn.Key("environments"),
				dyn.AnyIndex(),
				dyn.Key("spec"),
				dyn.Key("dependencies"),
				dyn.AnyIndex(),
			),
			PathKindWithPrefix,
			func(s string) bool {
				return !libraries.IsLibraryLocal(s)
			},
		},
	}

	taskPatterns := jobTaskRewritePatterns(base)
	forEachPatterns := jobTaskRewritePatterns(base.Append(dyn.Key("for_each_task"), dyn.Key("task")))
	allPatterns := append(taskPatterns, jobEnvironmentsPatterns...)
	allPatterns = append(allPatterns, forEachPatterns...)
	return allPatterns
}

// VisitJobPaths visits all paths in job resources and applies a function to each path.
func VisitJobPaths(value dyn.Value, fn VisitFunc) (dyn.Value, error) {
	var err error
	var newValue = value

	for _, rewritePattern := range jobRewritePatterns() {
		newValue, err = dyn.MapByPattern(newValue, rewritePattern.pattern, func(p dyn.Path, v dyn.Value) (dyn.Value, error) {
			if rewritePattern.skipRewrite(v.MustString()) {
				return v, nil
			}

			return fn(p, rewritePattern.kind, v)
		})

		if err != nil {
			return dyn.InvalidValue, err
		}
	}

	return newValue, nil
}

View File

@@ -0,0 +1,168 @@
package paths

import (
	"testing"

	"github.com/databricks/cli/bundle/config"
	"github.com/databricks/cli/bundle/config/resources"
	"github.com/databricks/cli/libs/dyn"
	assert "github.com/databricks/cli/libs/dyn/dynassert"
	"github.com/databricks/databricks-sdk-go/service/compute"
	"github.com/databricks/databricks-sdk-go/service/jobs"
	"github.com/stretchr/testify/require"
)

func TestVisitJobPaths(t *testing.T) {
	task0 := jobs.Task{
		NotebookTask: &jobs.NotebookTask{
			NotebookPath: "abc",
		},
	}
	task1 := jobs.Task{
		SparkPythonTask: &jobs.SparkPythonTask{
			PythonFile: "abc",
		},
	}
	task2 := jobs.Task{
		DbtTask: &jobs.DbtTask{
			ProjectDirectory: "abc",
		},
	}
	task3 := jobs.Task{
		SqlTask: &jobs.SqlTask{
			File: &jobs.SqlTaskFile{
				Path: "abc",
			},
		},
	}
	task4 := jobs.Task{
		Libraries: []compute.Library{
			{Whl: "dist/foo.whl"},
		},
	}
	task5 := jobs.Task{
		Libraries: []compute.Library{
			{Jar: "dist/foo.jar"},
		},
	}
	task6 := jobs.Task{
		Libraries: []compute.Library{
			{Requirements: "requirements.txt"},
		},
	}

	job0 := &resources.Job{
		JobSettings: &jobs.JobSettings{
			Tasks: []jobs.Task{
				task0,
				task1,
				task2,
				task3,
				task4,
				task5,
				task6,
			},
		},
	}

	root := config.Root{
		Resources: config.Resources{
			Jobs: map[string]*resources.Job{
				"job0": job0,
			},
		},
	}

	actual := visitJobPaths(t, root)
	expected := []dyn.Path{
		dyn.MustPathFromString("resources.jobs.job0.tasks[0].notebook_task.notebook_path"),
		dyn.MustPathFromString("resources.jobs.job0.tasks[1].spark_python_task.python_file"),
		dyn.MustPathFromString("resources.jobs.job0.tasks[2].dbt_task.project_directory"),
		dyn.MustPathFromString("resources.jobs.job0.tasks[3].sql_task.file.path"),
		dyn.MustPathFromString("resources.jobs.job0.tasks[4].libraries[0].whl"),
		dyn.MustPathFromString("resources.jobs.job0.tasks[5].libraries[0].jar"),
		dyn.MustPathFromString("resources.jobs.job0.tasks[6].libraries[0].requirements"),
	}

	assert.ElementsMatch(t, expected, actual)
}

func TestVisitJobPaths_environments(t *testing.T) {
	environment0 := jobs.JobEnvironment{
		Spec: &compute.Environment{
			Dependencies: []string{
				"dist_0/*.whl",
				"dist_1/*.whl",
			},
		},
	}
	job0 := &resources.Job{
		JobSettings: &jobs.JobSettings{
			Environments: []jobs.JobEnvironment{
				environment0,
			},
		},
	}

	root := config.Root{
		Resources: config.Resources{
			Jobs: map[string]*resources.Job{
				"job0": job0,
			},
		},
	}

	actual := visitJobPaths(t, root)
	expected := []dyn.Path{
		dyn.MustPathFromString("resources.jobs.job0.environments[0].spec.dependencies[0]"),
		dyn.MustPathFromString("resources.jobs.job0.environments[0].spec.dependencies[1]"),
	}

	assert.ElementsMatch(t, expected, actual)
}

func TestVisitJobPaths_foreach(t *testing.T) {
	task0 := jobs.Task{
		ForEachTask: &jobs.ForEachTask{
			Task: jobs.Task{
				NotebookTask: &jobs.NotebookTask{
					NotebookPath: "abc",
				},
			},
		},
	}
	job0 := &resources.Job{
		JobSettings: &jobs.JobSettings{
			Tasks: []jobs.Task{
				task0,
			},
		},
	}

	root := config.Root{
		Resources: config.Resources{
			Jobs: map[string]*resources.Job{
				"job0": job0,
			},
		},
	}

	actual := visitJobPaths(t, root)
	expected := []dyn.Path{
		dyn.MustPathFromString("resources.jobs.job0.tasks[0].for_each_task.task.notebook_task.notebook_path"),
	}

	assert.ElementsMatch(t, expected, actual)
}

func visitJobPaths(t *testing.T, root config.Root) []dyn.Path {
	var actual []dyn.Path
	err := root.Mutate(func(value dyn.Value) (dyn.Value, error) {
		return VisitJobPaths(value, func(p dyn.Path, kind PathKind, v dyn.Value) (dyn.Value, error) {
			actual = append(actual, p)
			return v, nil
		})
	})
	require.NoError(t, err)
	return actual
}

View File

@@ -0,0 +1,26 @@
package paths

import "github.com/databricks/cli/libs/dyn"

type PathKind int

const (
	// PathKindLibrary is a path to a library file
	PathKindLibrary = iota

	// PathKindNotebook is a path to a notebook file
	PathKindNotebook

	// PathKindWorkspaceFile is a path to a regular workspace file,
	// notebooks are not allowed because they are uploaded a special
	// kind of workspace object.
	PathKindWorkspaceFile

	// PathKindWithPrefix is a path that starts with './'
	PathKindWithPrefix

	// PathKindDirectory is a path to directory
	PathKindDirectory
)

type VisitFunc func(path dyn.Path, kind PathKind, value dyn.Value) (dyn.Value, error)

View File

@@ -4,97 +4,11 @@ import (
 	"fmt"
 	"slices"

-	"github.com/databricks/cli/bundle/libraries"
+	"github.com/databricks/cli/bundle/config/mutator/paths"
 	"github.com/databricks/cli/libs/dyn"
 )

-type jobRewritePattern struct {
-	pattern     dyn.Pattern
-	fn          rewriteFunc
-	skipRewrite func(string) bool
-}
-
-func noSkipRewrite(string) bool {
-	return false
-}
-
-func rewritePatterns(t *translateContext, base dyn.Pattern) []jobRewritePattern {
-	return []jobRewritePattern{
-		{
-			base.Append(dyn.Key("notebook_task"), dyn.Key("notebook_path")),
-			t.translateNotebookPath,
-			noSkipRewrite,
-		},
-		{
-			base.Append(dyn.Key("spark_python_task"), dyn.Key("python_file")),
-			t.translateFilePath,
-			noSkipRewrite,
-		},
-		{
-			base.Append(dyn.Key("dbt_task"), dyn.Key("project_directory")),
-			t.translateDirectoryPath,
-			noSkipRewrite,
-		},
-		{
-			base.Append(dyn.Key("sql_task"), dyn.Key("file"), dyn.Key("path")),
-			t.translateFilePath,
-			noSkipRewrite,
-		},
-		{
-			base.Append(dyn.Key("libraries"), dyn.AnyIndex(), dyn.Key("whl")),
-			t.translateNoOp,
-			noSkipRewrite,
-		},
-		{
-			base.Append(dyn.Key("libraries"), dyn.AnyIndex(), dyn.Key("jar")),
-			t.translateNoOp,
-			noSkipRewrite,
-		},
-		{
-			base.Append(dyn.Key("libraries"), dyn.AnyIndex(), dyn.Key("requirements")),
-			t.translateFilePath,
-			noSkipRewrite,
-		},
-	}
-}
-
-func (t *translateContext) jobRewritePatterns() []jobRewritePattern {
-	// Base pattern to match all tasks in all jobs.
-	base := dyn.NewPattern(
-		dyn.Key("resources"),
-		dyn.Key("jobs"),
-		dyn.AnyKey(),
-		dyn.Key("tasks"),
-		dyn.AnyIndex(),
-	)
-
-	// Compile list of patterns and their respective rewrite functions.
-	jobEnvironmentsPatterns := []jobRewritePattern{
-		{
-			dyn.NewPattern(
-				dyn.Key("resources"),
-				dyn.Key("jobs"),
-				dyn.AnyKey(),
-				dyn.Key("environments"),
-				dyn.AnyIndex(),
-				dyn.Key("spec"),
-				dyn.Key("dependencies"),
-				dyn.AnyIndex(),
-			),
-			t.translateNoOpWithPrefix,
-			func(s string) bool {
-				return !libraries.IsLibraryLocal(s)
-			},
-		},
-	}
-
-	taskPatterns := rewritePatterns(t, base)
-	forEachPatterns := rewritePatterns(t, base.Append(dyn.Key("for_each_task"), dyn.Key("task")))
-	allPatterns := append(taskPatterns, jobEnvironmentsPatterns...)
-	allPatterns = append(allPatterns, forEachPatterns...)
-	return allPatterns
-}
-
 func (t *translateContext) applyJobTranslations(v dyn.Value) (dyn.Value, error) {
 	var err error
@@ -111,30 +25,41 @@ func (t *translateContext) applyJobTranslations(v dyn.Value) (dyn.Value, error)
 		}
 	}

-	for _, rewritePattern := range t.jobRewritePatterns() {
-		v, err = dyn.MapByPattern(v, rewritePattern.pattern, func(p dyn.Path, v dyn.Value) (dyn.Value, error) {
-			key := p[2].Key()
+	return paths.VisitJobPaths(v, func(p dyn.Path, kind paths.PathKind, v dyn.Value) (dyn.Value, error) {
+		key := p[2].Key()

-			// Skip path translation if the job is using git source.
-			if slices.Contains(ignore, key) {
-				return v, nil
-			}
+		// Skip path translation if the job is using git source.
+		if slices.Contains(ignore, key) {
+			return v, nil
+		}

-			dir, err := v.Location().Directory()
-			if err != nil {
-				return dyn.InvalidValue, fmt.Errorf("unable to determine directory for job %s: %w", key, err)
-			}
+		dir, err := v.Location().Directory()
+		if err != nil {
+			return dyn.InvalidValue, fmt.Errorf("unable to determine directory for job %s: %w", key, err)
+		}

-			sv := v.MustString()
-			if rewritePattern.skipRewrite(sv) {
-				return v, nil
-			}
-
-			return t.rewriteRelativeTo(p, v, rewritePattern.fn, dir, fallback[key])
-		})
+		rewritePatternFn, err := t.getRewritePatternFn(kind)
 		if err != nil {
 			return dyn.InvalidValue, err
 		}
+
+		return t.rewriteRelativeTo(p, v, rewritePatternFn, dir, fallback[key])
+	})
+}
+
+func (t *translateContext) getRewritePatternFn(kind paths.PathKind) (rewriteFunc, error) {
+	switch kind {
+	case paths.PathKindLibrary:
+		return t.translateNoOp, nil
+	case paths.PathKindNotebook:
+		return t.translateNotebookPath, nil
+	case paths.PathKindWorkspaceFile:
+		return t.translateFilePath, nil
+	case paths.PathKindDirectory:
+		return t.translateDirectoryPath, nil
+	case paths.PathKindWithPrefix:
+		return t.translateNoOpWithPrefix, nil
 	}

-	return v, nil
+	return nil, fmt.Errorf("unsupported path kind: %d", kind)
 }

View File

@ -59,3 +59,22 @@ func (r *Resources) FindResourceByConfigKey(key string) (ConfigResource, error)
return found[0], nil
}
type ResourceDescription struct {
SingularName string
}
// The keys of the map correspond to the resource keys in the bundle configuration.
func SupportedResources() map[string]ResourceDescription {
return map[string]ResourceDescription{
"jobs": {SingularName: "job"},
"pipelines": {SingularName: "pipeline"},
"models": {SingularName: "model"},
"experiments": {SingularName: "experiment"},
"model_serving_endpoints": {SingularName: "model_serving_endpoint"},
"registered_models": {SingularName: "registered_model"},
"quality_monitors": {SingularName: "quality_monitor"},
"schemas": {SingularName: "schema"},
"clusters": {SingularName: "cluster"},
}
}
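As a usage sketch, a caller can turn this map into a sorted, human-readable list; the helper function and its output format below are illustrative only:

```go
package main

import (
	"fmt"
	"sort"

	"github.com/databricks/cli/bundle/config"
)

// resourceTypeNames lists every supported resource type with its singular
// form, sorted for stable output.
func resourceTypeNames() []string {
	var names []string
	for plural, desc := range config.SupportedResources() {
		names = append(names, fmt.Sprintf("%s (singular: %s)", plural, desc.SingularName))
	}
	sort.Strings(names)
	return names
}

func main() {
	for _, n := range resourceTypeNames() {
		fmt.Println(n)
	}
}
```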

View File

@ -3,6 +3,7 @@ package config
import (
"encoding/json"
"reflect"
"strings"
"testing"
"github.com/stretchr/testify/assert"
@ -61,3 +62,18 @@ func TestCustomMarshallerIsImplemented(t *testing.T) {
}, "Resource %s does not have a custom unmarshaller", field.Name) }, "Resource %s does not have a custom unmarshaller", field.Name)
} }
} }
func TestSupportedResources(t *testing.T) {
expected := map[string]ResourceDescription{}
typ := reflect.TypeOf(Resources{})
for i := 0; i < typ.NumField(); i++ {
field := typ.Field(i)
jsonTags := strings.Split(field.Tag.Get("json"), ",")
singularName := strings.TrimSuffix(jsonTags[0], "s")
expected[jsonTags[0]] = ResourceDescription{SingularName: singularName}
}
// Please add your resource to the SupportedResources() function in resources.go
// if you are adding a new resource.
assert.Equal(t, expected, SupportedResources())
}

View File

@ -406,7 +406,14 @@ func (r *Root) MergeTargetOverrides(name string) error {
return r.updateWithDynamicValue(root)
}
var allowedVariableDefinitions = []([]string){
{"default", "type", "description"},
{"default", "type"},
{"default", "description"},
{"lookup", "description"},
{"default"},
{"lookup"},
}
// isFullVariableOverrideDef checks if the given value is a full syntax variable override.
// A full syntax variable override is a map whose keys match one of the allowed combinations above.
@ -418,26 +425,26 @@ func isFullVariableOverrideDef(v dyn.Value) bool {
return false
}
// If the map has more than 3 keys, it is not a full variable override.
if mv.Len() > 3 {
return false
}
for _, keys := range allowedVariableDefinitions {
if len(keys) != mv.Len() {
continue
}
// Check if the keys are the same.
match := true
for _, key := range keys {
if _, ok := mv.GetByString(key); !ok {
match = false
break
}
}
if match {
return true
}
}

View File

@ -6,6 +6,7 @@ import (
"testing" "testing"
"github.com/databricks/cli/bundle/config/variable" "github.com/databricks/cli/bundle/config/variable"
"github.com/databricks/cli/libs/dyn"
"github.com/stretchr/testify/assert" "github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require" "github.com/stretchr/testify/require"
) )
@ -169,3 +170,87 @@ func TestRootMergeTargetOverridesWithVariables(t *testing.T) {
assert.Equal(t, "complex var", root.Variables["complex"].Description) assert.Equal(t, "complex var", root.Variables["complex"].Description)
} }
func TestIsFullVariableOverrideDef(t *testing.T) {
testCases := []struct {
value dyn.Value
expected bool
}{
{
value: dyn.V(map[string]dyn.Value{
"type": dyn.V("string"),
"default": dyn.V("foo"),
"description": dyn.V("foo var"),
}),
expected: true,
},
{
value: dyn.V(map[string]dyn.Value{
"type": dyn.V("string"),
"lookup": dyn.V("foo"),
"description": dyn.V("foo var"),
}),
expected: false,
},
{
value: dyn.V(map[string]dyn.Value{
"type": dyn.V("string"),
"default": dyn.V("foo"),
}),
expected: true,
},
{
value: dyn.V(map[string]dyn.Value{
"type": dyn.V("string"),
"lookup": dyn.V("foo"),
}),
expected: false,
},
{
value: dyn.V(map[string]dyn.Value{
"description": dyn.V("string"),
"default": dyn.V("foo"),
}),
expected: true,
},
{
value: dyn.V(map[string]dyn.Value{
"description": dyn.V("string"),
"lookup": dyn.V("foo"),
}),
expected: true,
},
{
value: dyn.V(map[string]dyn.Value{
"default": dyn.V("foo"),
}),
expected: true,
},
{
value: dyn.V(map[string]dyn.Value{
"lookup": dyn.V("foo"),
}),
expected: true,
},
{
value: dyn.V(map[string]dyn.Value{
"type": dyn.V("string"),
}),
expected: false,
},
{
value: dyn.V(map[string]dyn.Value{
"type": dyn.V("string"),
"default": dyn.V("foo"),
"description": dyn.V("foo var"),
"lookup": dyn.V("foo"),
}),
expected: false,
},
}
for i, tc := range testCases {
assert.Equal(t, tc.expected, isFullVariableOverrideDef(tc.value), "test case %d", i)
}
}

View File

@ -0,0 +1,161 @@
package validate
import (
"context"
"fmt"
"strings"
"github.com/databricks/cli/bundle"
"github.com/databricks/cli/libs/diag"
"github.com/databricks/cli/libs/dyn"
"github.com/databricks/databricks-sdk-go/service/jobs"
)
// JobTaskClusterSpec validates that job tasks have a cluster spec defined
// if the task requires a cluster.
func JobTaskClusterSpec() bundle.ReadOnlyMutator {
return &jobTaskClusterSpec{}
}
type jobTaskClusterSpec struct {
}
func (v *jobTaskClusterSpec) Name() string {
return "validate:job_task_cluster_spec"
}
func (v *jobTaskClusterSpec) Apply(ctx context.Context, rb bundle.ReadOnlyBundle) diag.Diagnostics {
diags := diag.Diagnostics{}
jobsPath := dyn.NewPath(dyn.Key("resources"), dyn.Key("jobs"))
for resourceName, job := range rb.Config().Resources.Jobs {
resourcePath := jobsPath.Append(dyn.Key(resourceName))
for taskIndex, task := range job.Tasks {
taskPath := resourcePath.Append(dyn.Key("tasks"), dyn.Index(taskIndex))
diags = diags.Extend(validateJobTask(rb, task, taskPath))
}
}
return diags
}
func validateJobTask(rb bundle.ReadOnlyBundle, task jobs.Task, taskPath dyn.Path) diag.Diagnostics {
diags := diag.Diagnostics{}
var specified []string
var unspecified []string
if task.JobClusterKey != "" {
specified = append(specified, "job_cluster_key")
} else {
unspecified = append(unspecified, "job_cluster_key")
}
if task.EnvironmentKey != "" {
specified = append(specified, "environment_key")
} else {
unspecified = append(unspecified, "environment_key")
}
if task.ExistingClusterId != "" {
specified = append(specified, "existing_cluster_id")
} else {
unspecified = append(unspecified, "existing_cluster_id")
}
if task.NewCluster != nil {
specified = append(specified, "new_cluster")
} else {
unspecified = append(unspecified, "new_cluster")
}
if task.ForEachTask != nil {
forEachTaskPath := taskPath.Append(dyn.Key("for_each_task"), dyn.Key("task"))
diags = diags.Extend(validateJobTask(rb, task.ForEachTask.Task, forEachTaskPath))
}
if isComputeTask(task) && len(specified) == 0 {
if task.NotebookTask != nil {
// notebook tasks without a cluster spec will use the notebook environment
} else {
// the path alone might not be very helpful; adding the user-specified task key clarifies the context
detail := fmt.Sprintf(
"Task %q requires a cluster or an environment to run.\nSpecify one of the following fields: %s.",
task.TaskKey,
strings.Join(unspecified, ", "),
)
diags = diags.Append(diag.Diagnostic{
Severity: diag.Error,
Summary: "Missing required cluster or environment settings",
Detail: detail,
Locations: rb.Config().GetLocations(taskPath.String()),
Paths: []dyn.Path{taskPath},
})
}
}
return diags
}
// isComputeTask returns true if the task runs on a cluster or serverless GC
func isComputeTask(task jobs.Task) bool {
if task.NotebookTask != nil {
// if warehouse_id is set, it's a SQL notebook that doesn't need a cluster or serverless GC
if task.NotebookTask.WarehouseId != "" {
return false
} else {
// task settings don't require specifying a cluster/serverless GC, but the task itself can run on one
// we handle that case separately in validateJobTask
return true
}
}
if task.PythonWheelTask != nil {
return true
}
if task.DbtTask != nil {
return true
}
if task.SparkJarTask != nil {
return true
}
if task.SparkSubmitTask != nil {
return true
}
if task.SparkPythonTask != nil {
return true
}
if task.SqlTask != nil {
return false
}
if task.PipelineTask != nil {
// while pipelines use clusters, pipeline tasks don't, they only trigger pipelines
return false
}
if task.RunJobTask != nil {
return false
}
if task.ConditionTask != nil {
return false
}
// for_each_task doesn't use clusters itself, though the underlying task(s) can
if task.ForEachTask != nil {
return false
}
return false
}
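To make the contract concrete, here is a hedged sketch of the two cases the validator distinguishes; the task key and cluster key values are illustrative, not taken from this change:

```go
package main

import "github.com/databricks/databricks-sdk-go/service/jobs"

func main() {
	// A compute task with none of the cluster/environment fields set:
	// the validator above reports "Missing required cluster or environment
	// settings" for it.
	bad := jobs.Task{
		TaskKey:         "my_task",
		PythonWheelTask: &jobs.PythonWheelTask{},
	}

	// Setting any one of job_cluster_key, environment_key,
	// existing_cluster_id, or new_cluster satisfies the check.
	good := bad
	good.JobClusterKey = "cluster1"

	_, _ = bad, good
}
```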

View File

@ -0,0 +1,203 @@
package validate
import (
"context"
"testing"
"github.com/databricks/cli/bundle"
"github.com/databricks/cli/bundle/config"
"github.com/databricks/cli/bundle/config/resources"
"github.com/databricks/databricks-sdk-go/service/compute"
"github.com/databricks/databricks-sdk-go/service/jobs"
"github.com/stretchr/testify/assert"
)
func TestJobTaskClusterSpec(t *testing.T) {
expectedSummary := "Missing required cluster or environment settings"
type testCase struct {
name string
task jobs.Task
errorPath string
errorDetail string
errorSummary string
}
testCases := []testCase{
{
name: "valid notebook task",
task: jobs.Task{
// while a cluster is needed, it will use the notebook environment to create one
NotebookTask: &jobs.NotebookTask{},
},
},
{
name: "valid notebook task (job_cluster_key)",
task: jobs.Task{
JobClusterKey: "cluster1",
NotebookTask: &jobs.NotebookTask{},
},
},
{
name: "valid notebook task (new_cluster)",
task: jobs.Task{
NewCluster: &compute.ClusterSpec{},
NotebookTask: &jobs.NotebookTask{},
},
},
{
name: "valid notebook task (existing_cluster_id)",
task: jobs.Task{
ExistingClusterId: "cluster1",
NotebookTask: &jobs.NotebookTask{},
},
},
{
name: "valid SQL notebook task",
task: jobs.Task{
NotebookTask: &jobs.NotebookTask{
WarehouseId: "warehouse1",
},
},
},
{
name: "valid python wheel task",
task: jobs.Task{
JobClusterKey: "cluster1",
PythonWheelTask: &jobs.PythonWheelTask{},
},
},
{
name: "valid python wheel task (environment_key)",
task: jobs.Task{
EnvironmentKey: "environment1",
PythonWheelTask: &jobs.PythonWheelTask{},
},
},
{
name: "valid dbt task",
task: jobs.Task{
JobClusterKey: "cluster1",
DbtTask: &jobs.DbtTask{},
},
},
{
name: "valid spark jar task",
task: jobs.Task{
JobClusterKey: "cluster1",
SparkJarTask: &jobs.SparkJarTask{},
},
},
{
name: "valid spark submit",
task: jobs.Task{
NewCluster: &compute.ClusterSpec{},
SparkSubmitTask: &jobs.SparkSubmitTask{},
},
},
{
name: "valid spark python task",
task: jobs.Task{
JobClusterKey: "cluster1",
SparkPythonTask: &jobs.SparkPythonTask{},
},
},
{
name: "valid SQL task",
task: jobs.Task{
SqlTask: &jobs.SqlTask{},
},
},
{
name: "valid pipeline task",
task: jobs.Task{
PipelineTask: &jobs.PipelineTask{},
},
},
{
name: "valid run job task",
task: jobs.Task{
RunJobTask: &jobs.RunJobTask{},
},
},
{
name: "valid condition task",
task: jobs.Task{
ConditionTask: &jobs.ConditionTask{},
},
},
{
name: "valid for each task",
task: jobs.Task{
ForEachTask: &jobs.ForEachTask{
Task: jobs.Task{
JobClusterKey: "cluster1",
NotebookTask: &jobs.NotebookTask{},
},
},
},
},
{
name: "invalid python wheel task",
task: jobs.Task{
PythonWheelTask: &jobs.PythonWheelTask{},
TaskKey: "my_task",
},
errorPath: "resources.jobs.job1.tasks[0]",
errorDetail: `Task "my_task" requires a cluster or an environment to run.
Specify one of the following fields: job_cluster_key, environment_key, existing_cluster_id, new_cluster.`,
errorSummary: expectedSummary,
},
{
name: "invalid for each task",
task: jobs.Task{
ForEachTask: &jobs.ForEachTask{
Task: jobs.Task{
PythonWheelTask: &jobs.PythonWheelTask{},
TaskKey: "my_task",
},
},
},
errorPath: "resources.jobs.job1.tasks[0].for_each_task.task",
errorDetail: `Task "my_task" requires a cluster or an environment to run.
Specify one of the following fields: job_cluster_key, environment_key, existing_cluster_id, new_cluster.`,
errorSummary: expectedSummary,
},
}
for _, tc := range testCases {
t.Run(tc.name, func(t *testing.T) {
job := &resources.Job{
JobSettings: &jobs.JobSettings{
Tasks: []jobs.Task{tc.task},
},
}
b := createBundle(map[string]*resources.Job{"job1": job})
diags := bundle.ApplyReadOnly(context.Background(), bundle.ReadOnly(b), JobTaskClusterSpec())
if tc.errorPath != "" || tc.errorDetail != "" || tc.errorSummary != "" {
assert.Len(t, diags, 1)
assert.Len(t, diags[0].Paths, 1)
diag := diags[0]
assert.Equal(t, tc.errorPath, diag.Paths[0].String())
assert.Equal(t, tc.errorSummary, diag.Summary)
assert.Equal(t, tc.errorDetail, diag.Detail)
} else {
assert.ElementsMatch(t, []string{}, diags)
}
})
}
}
func createBundle(jobs map[string]*resources.Job) *bundle.Bundle {
return &bundle.Bundle{
Config: config.Root{
Resources: config.Resources{
Jobs: jobs,
},
},
}
}

View File

@ -34,6 +34,7 @@ func (v *validate) Apply(ctx context.Context, b *bundle.Bundle) diag.Diagnostics
JobClusterKeyDefined(),
FilesToSync(),
ValidateSyncPatterns(),
JobTaskClusterSpec(),
))
}

View File

@ -4,6 +4,7 @@ import (
"context" "context"
"encoding/json" "encoding/json"
"fmt" "fmt"
"sort"
"github.com/databricks/cli/bundle/config" "github.com/databricks/cli/bundle/config"
"github.com/databricks/cli/bundle/config/resources" "github.com/databricks/cli/bundle/config/resources"
@ -82,6 +83,10 @@ func BundleToTerraform(config *config.Root) *schema.Root {
conv(src, &dst) conv(src, &dst)
if src.JobSettings != nil { if src.JobSettings != nil {
sort.Slice(src.JobSettings.Tasks, func(i, j int) bool {
return src.JobSettings.Tasks[i].TaskKey < src.JobSettings.Tasks[j].TaskKey
})
for _, v := range src.Tasks { for _, v := range src.Tasks {
var t schema.ResourceJobTask var t schema.ResourceJobTask
conv(v, &t) conv(v, &t)

View File

@ -3,6 +3,7 @@ package tfdyn
import (
"context"
"fmt"
"sort"
"github.com/databricks/cli/bundle/internal/tf/schema"
"github.com/databricks/cli/libs/dyn"
@ -19,8 +20,38 @@ func convertJobResource(ctx context.Context, vin dyn.Value) (dyn.Value, error) {
log.Debugf(ctx, "job normalization diagnostic: %s", diag.Summary)
}
// Sort the tasks of each job in the bundle by task key. Sorting
// the task keys ensures that the diff computed by terraform is correct and avoids
// recreates. For more details see the NOTE at
// https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/job#example-usage
// and https://github.com/databricks/terraform-provider-databricks/issues/4011
// and https://github.com/databricks/cli/pull/1776
vout := vin
var err error
tasks, ok := vin.Get("tasks").AsSequence()
if ok {
sort.Slice(tasks, func(i, j int) bool {
// We sort the tasks by their task key. Tasks without task keys are ordered
// before tasks with task keys. We do not error for those tasks,
// since the presence of a task_key is validated by the Jobs backend.
tk1, ok := tasks[i].Get("task_key").AsString()
if !ok {
return true
}
tk2, ok := tasks[j].Get("task_key").AsString()
if !ok {
return false
}
return tk1 < tk2
})
vout, err = dyn.Set(vin, "tasks", dyn.V(tasks))
if err != nil {
return dyn.InvalidValue, err
}
}
// Modify top-level keys.
vout, err = renameKeys(vout, map[string]string{
"tasks": "task",
"job_clusters": "job_cluster",
"parameters": "parameter",

View File

@ -42,8 +42,8 @@ func TestConvertJob(t *testing.T) {
},
Tasks: []jobs.Task{
{
TaskKey: "task_key_b",
JobClusterKey: "job_cluster_key_b",
Libraries: []compute.Library{
{
Pypi: &compute.PythonPyPiLibrary{
@ -55,6 +55,17 @@ func TestConvertJob(t *testing.T) {
},
},
},
{
TaskKey: "task_key_a",
JobClusterKey: "job_cluster_key_a",
},
{
TaskKey: "task_key_c",
JobClusterKey: "job_cluster_key_c",
},
{
Description: "missing task key 😱",
},
},
},
Permissions: []resources.Permission{
@ -100,8 +111,15 @@ func TestConvertJob(t *testing.T) {
},
"task": []any{
map[string]any{
"description": "missing task key 😱",
},
map[string]any{
"task_key": "task_key_a",
"job_cluster_key": "job_cluster_key_a",
},
map[string]any{
"task_key": "task_key_b",
"job_cluster_key": "job_cluster_key_b",
"library": []any{ "library": []any{
map[string]any{ map[string]any{
"pypi": map[string]any{ "pypi": map[string]any{
@ -113,6 +131,10 @@ func TestConvertJob(t *testing.T) {
},
},
},
map[string]any{
"task_key": "task_key_c",
"job_cluster_key": "job_cluster_key_c",
},
},
}, out.Job["my_job"])

View File

@ -56,7 +56,7 @@ const warningTemplate = `{{ "Warning" | yellow }}: {{ .Summary }}
`
const infoTemplate = `{{ "Recommendation" | blue }}: {{ .Summary }}
{{- range $index, $element := .Paths }}
{{ if eq $index 0 }}at {{else}} {{ end}}{{ $element.String | green }}
{{- end }}
@ -108,12 +108,18 @@ func buildTrailer(diags diag.Diagnostics) string {
if warnings := len(diags.Filter(diag.Warning)); warnings > 0 {
parts = append(parts, color.YellowString(pluralize(warnings, "warning", "warnings")))
}
if recommendations := len(diags.Filter(diag.Recommendation)); recommendations > 0 {
parts = append(parts, color.BlueString(pluralize(recommendations, "recommendation", "recommendations")))
}
switch {
case len(parts) >= 2:
first := strings.Join(parts[:len(parts)-1], ", ")
last := parts[len(parts)-1]
return fmt.Sprintf("Found %s and %s", first, last)
case len(parts) == 1:
return fmt.Sprintf("Found %s", parts[0])
default:
// No diagnostics to print.
return color.GreenString("Validation OK!")
}
}
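A tiny self-contained sketch of the trailer formatting above, showing how the parts are joined with commas plus a final "and"; the parts slice here is sample data:

```go
package main

import (
	"fmt"
	"strings"
)

// trailer mirrors the switch in buildTrailer above, on sample data.
func trailer(parts []string) string {
	switch {
	case len(parts) >= 2:
		first := strings.Join(parts[:len(parts)-1], ", ")
		return fmt.Sprintf("Found %s and %s", first, parts[len(parts)-1])
	case len(parts) == 1:
		return fmt.Sprintf("Found %s", parts[0])
	default:
		return "Validation OK!"
	}
}

func main() {
	fmt.Println(trailer([]string{"1 error", "2 warnings", "2 recommendations"}))
	// Output: Found 1 error, 2 warnings and 2 recommendations
}
```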
@ -147,7 +153,7 @@ func renderSummaryTemplate(out io.Writer, b *bundle.Bundle, diags diag.Diagnosti
func renderDiagnostics(out io.Writer, b *bundle.Bundle, diags diag.Diagnostics) error {
errorT := template.Must(template.New("error").Funcs(renderFuncMap).Parse(errorTemplate))
warningT := template.Must(template.New("warning").Funcs(renderFuncMap).Parse(warningTemplate))
recommendationT := template.Must(template.New("info").Funcs(renderFuncMap).Parse(infoTemplate))
// Print errors and warnings.
for _, d := range diags {
@ -157,8 +163,8 @@ func renderDiagnostics(out io.Writer, b *bundle.Bundle, diags diag.Diagnostics)
t = errorT
case diag.Warning:
t = warningT
case diag.Recommendation:
t = recommendationT
}
for i := range d.Locations {

View File

@ -46,17 +46,17 @@ func TestRenderTextOutput(t *testing.T) {
"Found 1 error\n", "Found 1 error\n",
}, },
{ {
name: "nil bundle and 1 info", name: "nil bundle and 1 recommendation",
diags: diag.Diagnostics{ diags: diag.Diagnostics{
{ {
Severity: diag.Info, Severity: diag.Recommendation,
Summary: "info", Summary: "recommendation",
}, },
}, },
opts: RenderOptions{RenderSummaryTable: true}, opts: RenderOptions{RenderSummaryTable: true},
expected: "Info: info\n" + expected: "Recommendation: recommendation\n" +
"\n" + "\n" +
"Found 1 info\n", "Found 1 recommendation\n",
}, },
{ {
name: "bundle during 'load' and 1 error", name: "bundle during 'load' and 1 error",
@ -97,7 +97,7 @@ func TestRenderTextOutput(t *testing.T) {
"Found 2 warnings\n", "Found 2 warnings\n",
}, },
{ {
name: "bundle during 'load' and 2 errors, 1 warning and 1 info with details", name: "bundle during 'load' and 2 errors, 1 warning and 1 recommendation with details",
bundle: loadingBundle, bundle: loadingBundle,
diags: diag.Diagnostics{ diags: diag.Diagnostics{
diag.Diagnostic{ diag.Diagnostic{
@ -119,8 +119,8 @@ func TestRenderTextOutput(t *testing.T) {
Locations: []dyn.Location{{File: "foo.py", Line: 3, Column: 1}}, Locations: []dyn.Location{{File: "foo.py", Line: 3, Column: 1}},
}, },
diag.Diagnostic{ diag.Diagnostic{
Severity: diag.Info, Severity: diag.Recommendation,
Summary: "info (4)", Summary: "recommendation (4)",
Detail: "detail (4)", Detail: "detail (4)",
Locations: []dyn.Location{{File: "foo.py", Line: 4, Column: 1}}, Locations: []dyn.Location{{File: "foo.py", Line: 4, Column: 1}},
},
@ -141,7 +141,7 @@ func TestRenderTextOutput(t *testing.T) {
"\n" + "\n" +
"detail (3)\n" + "detail (3)\n" +
"\n" + "\n" +
"Info: info (4)\n" + "Recommendation: recommendation (4)\n" +
" in foo.py:4:1\n" + " in foo.py:4:1\n" +
"\n" + "\n" +
"detail (4)\n" + "detail (4)\n" +
@ -149,7 +149,73 @@ func TestRenderTextOutput(t *testing.T) {
"Name: test-bundle\n" + "Name: test-bundle\n" +
"Target: test-target\n" + "Target: test-target\n" +
"\n" + "\n" +
"Found 2 errors and 1 warning and 1 info\n", "Found 2 errors, 1 warning and 1 recommendation\n",
},
{
name: "bundle during 'load' and 1 errors, 2 warning and 2 recommendations with details",
bundle: loadingBundle,
diags: diag.Diagnostics{
diag.Diagnostic{
Severity: diag.Error,
Summary: "error (1)",
Detail: "detail (1)",
Locations: []dyn.Location{{File: "foo.py", Line: 1, Column: 1}},
},
diag.Diagnostic{
Severity: diag.Warning,
Summary: "warning (2)",
Detail: "detail (2)",
Locations: []dyn.Location{{File: "foo.py", Line: 2, Column: 1}},
},
diag.Diagnostic{
Severity: diag.Warning,
Summary: "warning (3)",
Detail: "detail (3)",
Locations: []dyn.Location{{File: "foo.py", Line: 3, Column: 1}},
},
diag.Diagnostic{
Severity: diag.Recommendation,
Summary: "recommendation (4)",
Detail: "detail (4)",
Locations: []dyn.Location{{File: "foo.py", Line: 4, Column: 1}},
},
diag.Diagnostic{
Severity: diag.Recommendation,
Summary: "recommendation (5)",
Detail: "detail (5)",
Locations: []dyn.Location{{File: "foo.py", Line: 5, Column: 1}},
},
},
opts: RenderOptions{RenderSummaryTable: true},
expected: "Error: error (1)\n" +
" in foo.py:1:1\n" +
"\n" +
"detail (1)\n" +
"\n" +
"Warning: warning (2)\n" +
" in foo.py:2:1\n" +
"\n" +
"detail (2)\n" +
"\n" +
"Warning: warning (3)\n" +
" in foo.py:3:1\n" +
"\n" +
"detail (3)\n" +
"\n" +
"Recommendation: recommendation (4)\n" +
" in foo.py:4:1\n" +
"\n" +
"detail (4)\n" +
"\n" +
"Recommendation: recommendation (5)\n" +
" in foo.py:5:1\n" +
"\n" +
"detail (5)\n" +
"\n" +
"Name: test-bundle\n" +
"Target: test-target\n" +
"\n" +
"Found 1 error, 2 warnings and 2 recommendations\n",
},
{
name: "bundle during 'init'",
@ -182,7 +248,7 @@ func TestRenderTextOutput(t *testing.T) {
"Validation OK!\n", "Validation OK!\n",
}, },
{ {
name: "nil bundle without summary with 1 error, 1 warning and 1 info", name: "nil bundle without summary with 1 error, 1 warning and 1 recommendation",
bundle: nil, bundle: nil,
diags: diag.Diagnostics{ diags: diag.Diagnostics{
diag.Diagnostic{ diag.Diagnostic{
@ -198,8 +264,8 @@ func TestRenderTextOutput(t *testing.T) {
Locations: []dyn.Location{{File: "foo.py", Line: 3, Column: 1}}, Locations: []dyn.Location{{File: "foo.py", Line: 3, Column: 1}},
}, },
diag.Diagnostic{ diag.Diagnostic{
Severity: diag.Info, Severity: diag.Recommendation,
Summary: "info (3)", Summary: "recommendation (3)",
Detail: "detail (3)", Detail: "detail (3)",
Locations: []dyn.Location{{File: "foo.py", Line: 5, Column: 1}}, Locations: []dyn.Location{{File: "foo.py", Line: 5, Column: 1}},
}, },
@ -215,7 +281,7 @@ func TestRenderTextOutput(t *testing.T) {
"\n" + "\n" +
"detail (2)\n" + "detail (2)\n" +
"\n" + "\n" +
"Info: info (3)\n" + "Recommendation: recommendation (3)\n" +
" in foo.py:5:1\n" + " in foo.py:5:1\n" +
"\n" + "\n" +
"detail (3)\n" + "detail (3)\n" +
@ -340,10 +406,10 @@ func TestRenderDiagnostics(t *testing.T) {
"'name' is required\n\n", "'name' is required\n\n",
}, },
{ {
name: "info with multiple paths and locations", name: "recommendation with multiple paths and locations",
diags: diag.Diagnostics{ diags: diag.Diagnostics{
{ {
Severity: diag.Info, Severity: diag.Recommendation,
Summary: "summary", Summary: "summary",
Detail: "detail", Detail: "detail",
Paths: []dyn.Path{ Paths: []dyn.Path{
@ -356,7 +422,7 @@ func TestRenderDiagnostics(t *testing.T) {
},
},
},
expected: "Recommendation: summary\n" +
" at resources.jobs.xxx\n" +
" resources.jobs.yyy\n" +
" in foo.yaml:1:2\n" +

go.mod
View File

@ -1,6 +1,8 @@
module github.com/databricks/cli
go 1.22.0
toolchain go1.22.7
require (
github.com/Masterminds/semver/v3 v3.3.0 // MIT
@ -10,7 +12,7 @@ require (
github.com/ghodss/yaml v1.0.0 // MIT + NOTICE
github.com/google/uuid v1.6.0 // BSD-3-Clause
github.com/hashicorp/go-version v1.7.0 // MPL 2.0
github.com/hashicorp/hc-install v0.9.0 // MPL 2.0
github.com/hashicorp/terraform-exec v0.21.0 // MPL 2.0
github.com/hashicorp/terraform-json v0.22.1 // MPL 2.0
github.com/manifoldco/promptui v0.9.0 // BSD-3-Clause
@ -22,7 +24,7 @@ require (
github.com/spf13/pflag v1.0.5 // BSD-3-Clause
github.com/stretchr/testify v1.9.0 // MIT
golang.org/x/exp v0.0.0-20240222234643-814bf88cf225
golang.org/x/mod v0.21.0
golang.org/x/oauth2 v0.23.0
golang.org/x/sync v0.8.0
golang.org/x/term v0.24.0
@ -49,6 +51,7 @@ require (
github.com/google/s2a-go v0.1.7 // indirect
github.com/googleapis/enterprise-certificate-proxy v0.3.2 // indirect
github.com/hashicorp/go-cleanhttp v0.5.2 // indirect
github.com/hashicorp/go-retryablehttp v0.7.7 // indirect
github.com/inconshreveable/mousetrap v1.1.0 // indirect
github.com/mattn/go-colorable v0.1.13 // indirect
github.com/pmezard/go-difflib v1.0.0 // indirect

go.sum generated
View File

@ -99,10 +99,14 @@ github.com/googleapis/gax-go/v2 v2.12.4 h1:9gWcmF85Wvq4ryPFvGFaOgPIs1AQX0d0bcbGw
github.com/googleapis/gax-go/v2 v2.12.4/go.mod h1:KYEYLorsnIGDi/rPC8b5TdlB9kbKoFubselGIoBMCwI=
github.com/hashicorp/go-cleanhttp v0.5.2 h1:035FKYIWjmULyFRBKPs8TBQoi0x6d9G4xc9neXJWAZQ=
github.com/hashicorp/go-cleanhttp v0.5.2/go.mod h1:kO/YDlP8L1346E6Sodw+PrpBSV4/SoxCXGY6BqNFT48=
github.com/hashicorp/go-hclog v1.6.3 h1:Qr2kF+eVWjTiYmU7Y31tYlP1h0q/X3Nl3tPGdaB11/k=
github.com/hashicorp/go-hclog v1.6.3/go.mod h1:W4Qnvbt70Wk/zYJryRzDRU/4r0kIg0PVHBcfoyhpF5M=
github.com/hashicorp/go-retryablehttp v0.7.7 h1:C8hUCYzor8PIfXHa4UrZkU4VvK8o9ISHxT2Q8+VepXU=
github.com/hashicorp/go-retryablehttp v0.7.7/go.mod h1:pkQpWZeYWskR+D1tR2O5OcBFOxfA7DoAO6xtkuQnHTk=
github.com/hashicorp/go-version v1.7.0 h1:5tqGy27NaOTB8yJKUZELlFAS/LTKJkrmONwQKeRZfjY=
github.com/hashicorp/go-version v1.7.0/go.mod h1:fltr4n8CU8Ke44wwGCBoEymUuxUHl09ZGVZPK5anwXA=
github.com/hashicorp/hc-install v0.9.0 h1:2dIk8LcvANwtv3QZLckxcjyF5w8KVtiMxu6G6eLhghE=
github.com/hashicorp/hc-install v0.9.0/go.mod h1:+6vOP+mf3tuGgMApVYtmsnDoKWMDcFXeTxCACYZ8SFg=
github.com/hashicorp/terraform-exec v0.21.0 h1:uNkLAe95ey5Uux6KJdua6+cv8asgILFVWkd/RG0D2XQ=
github.com/hashicorp/terraform-exec v0.21.0/go.mod h1:1PPeMYou+KDUSSeRE9szMZ/oHf4fYUmB923Wzbq1ICg=
github.com/hashicorp/terraform-json v0.22.1 h1:xft84GZR0QzjPVWs4lRUwvTcPnegqlyS7orfb5Ltvec=
@ -180,8 +184,8 @@ golang.org/x/exp v0.0.0-20240222234643-814bf88cf225/go.mod h1:CxmFvTBINI24O/j8iY
golang.org/x/lint v0.0.0-20181026193005-c67002cb31c3/go.mod h1:UVdnD1Gm6xHRNCYTkRU2/jEulfH38KcIWyp/GAMgvoE=
golang.org/x/lint v0.0.0-20190227174305-5b3e6a55c961/go.mod h1:wehouNa3lNwaWXcvxsM5YxQ5yQlVC4a0KAMCusXpPoU=
golang.org/x/lint v0.0.0-20190313153728-d0100b6bd8b3/go.mod h1:6SW0HCj/g11FgYtHlgUYUwCkIfeOF89ocIRzGO/8vkc=
golang.org/x/mod v0.21.0 h1:vvrHzRwRfVKSiLrG+d4FMl/Qi4ukBCE6kZlTUkDYRT0=
golang.org/x/mod v0.21.0/go.mod h1:6SkKJ3Xj0I0BrPOZoBy3bdMptDDU9oJrpohJ3eWZ1fY=
golang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20180826012351-8a410e7b638d/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20190213061140-3a22650c66bd/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=

View File

@ -6,4 +6,5 @@ const (
Error Severity = iota
Warning
Info
Recommendation
)
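With the new severity in place, a mutator can emit a recommendation like any other diagnostic. A minimal sketch; the summary text is illustrative:

```go
package main

import (
	"fmt"

	"github.com/databricks/cli/libs/diag"
)

func main() {
	// A diagnostic using the new severity; rendered by the template above
	// as "Recommendation: use serverless compute".
	d := diag.Diagnostic{
		Severity: diag.Recommendation,
		Summary:  "use serverless compute", // illustrative text
	}
	fmt.Println(d.Summary)
}
```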

View File

@ -209,7 +209,26 @@ func TestRepositoryGitConfigWhenNotARepo(t *testing.T) {
}
func TestRepositoryOriginUrlRemovesUserCreds(t *testing.T) {
tcases := []struct {
url string
expected string
}{
{
url: "https://username:token@github.com/databricks/foobar.git",
expected: "https://github.com/databricks/foobar.git",
},
{
// Note: The token is still considered and parsed as a username here.
// However, credential integrations by Git providers like GitHub
// allow setting a PAT token as the username.
url: "https://token@github.com/databricks/foobar.git",
expected: "https://github.com/databricks/foobar.git",
},
}
for _, tc := range tcases {
repo := newTestRepository(t)
repo.addOriginUrl(tc.url)
repo.assertOriginUrl(tc.expected)
}
}
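The production code that strips the credentials is not part of this diff; below is a minimal sketch of the behavior the test asserts, using net/url, and not necessarily how the CLI implements it:

```go
package main

import (
	"fmt"
	"net/url"
)

// stripUserCreds removes the userinfo section (username, password, or a
// PAT passed as the username) from an origin URL.
func stripUserCreds(origin string) string {
	u, err := url.Parse(origin)
	if err != nil {
		return origin // leave unparseable URLs untouched
	}
	u.User = nil
	return u.String()
}

func main() {
	fmt.Println(stripUserCreds("https://username:token@github.com/databricks/foobar.git"))
	// Output: https://github.com/databricks/foobar.git
}
```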

View File

@ -121,7 +121,7 @@ You can find that job by opening your workpace and clicking on **Workflows**.
You can also deploy to your production target directly from the command-line.
The warehouse, catalog, and schema for that target are configured in databricks.yml.
When deploying to this target, note that the default job at resources/{{.project_name}}.job.yml
has a schedule set that runs every day. The schedule is paused when deploying in development mode
(see https://docs.databricks.com/dev-tools/bundles/deployment-modes.html).

View File

@ -18,7 +18,7 @@ This file only template directives; it is skipped for the actual output.
{{if $notDLT}}
{{skip "{{.project_name}}/src/dlt_pipeline.ipynb"}}
{{skip "{{.project_name}}/resources/{{.project_name}}.pipeline.yml"}}
{{end}}
{{if $notNotebook}}
@ -26,7 +26,7 @@ This file only template directives; it is skipped for the actual output.
{{end}}
{{if (and $notDLT $notNotebook $notPython)}}
{{skip "{{.project_name}}/resources/{{.project_name}}.job.yml"}}
{{else}}
{{skip "{{.project_name}}/resources/.gitkeep"}}
{{end}}

View File

@ -29,7 +29,7 @@ The '{{.project_name}}' project was generated by using the default-python templa
```
Note that the default job from the template has a schedule that runs every day
(defined in resources/{{.project_name}}.job.yml). The schedule
is paused when deploying in development mode (see
https://docs.databricks.com/dev-tools/bundles/deployment-modes.html).

View File

@ -40,7 +40,7 @@ resources:
- task_key: notebook_task
{{- end}}
pipeline_task:
{{- /* TODO: we should find a way that doesn't use magics for the below, like ./{{project_name}}.pipeline.yml */}}
pipeline_id: ${resources.pipelines.{{.project_name}}_pipeline.id}
{{end -}}
{{- if (eq .include_python "yes") }}

View File

@ -14,7 +14,7 @@
"source": [ "source": [
"# DLT pipeline\n", "# DLT pipeline\n",
"\n", "\n",
"This Delta Live Tables (DLT) definition is executed using a pipeline defined in resources/{{.project_name}}_pipeline.yml." "This Delta Live Tables (DLT) definition is executed using a pipeline defined in resources/{{.project_name}}.pipeline.yml."
] ]
}, },
{ {

View File

@ -14,7 +14,7 @@
"source": [ "source": [
"# Default notebook\n", "# Default notebook\n",
"\n", "\n",
"This default notebook is executed using Databricks Workflows as defined in resources/{{.project_name}}_job.yml." "This default notebook is executed using Databricks Workflows as defined in resources/{{.project_name}}.job.yml."
] ]
}, },
{ {

View File

@ -1,4 +1,4 @@
-- This query is executed using Databricks Workflows (see resources/{{.project_name}}_sql.job.yml)
USE CATALOG {{"{{"}}catalog{{"}}"}};
USE IDENTIFIER({{"{{"}}schema{{"}}"}});

View File

@ -1,4 +1,4 @@
-- This query is executed using Databricks Workflows (see resources/{{.project_name}}_sql.job.yml)
--
-- The streaming table below ingests all JSON files in /databricks-datasets/retail-org/sales_orders/
-- See also https://docs.databricks.com/sql/language-manual/sql-ref-syntax-ddl-create-streaming-table.html