Shreyas Goenka
de5a348b4e
Merge remote-tracking branch 'origin' into bundle-exec
2025-03-05 11:42:30 +01:00
Andrew Nester
4dba35dff4
Upgrade TF provider to 1.68.0 ( #2426 )
...
## Changes
Upgrade TF provider to 1.68.0
- Added support for Volumes as a destination for cluster log conf (see the sketch below)
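For illustration, a minimal hypothetical sketch of a bundle cluster definition using the new Volumes destination; the job key, node type, Spark version, and volume path are placeholders rather than values from this PR:
```
resources:
  jobs:
    example_job:
      job_clusters:
        - job_cluster_key: main
          new_cluster:
            spark_version: 15.4.x-scala2.12
            node_type_id: i3.xlarge
            num_workers: 1
            cluster_log_conf:
              # assumed shape: Volumes destination added to ClusterLogConf in this provider release
              volumes:
                destination: /Volumes/main/default/cluster_logs
```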
2025-03-05 10:20:55 +00:00
Andrew Nester
294db2ecca
Upgrade Go SDK to 0.59.0 ( #2425 )
...
## Changes
- Added `service-principal-secrets` command
- Added `budget-policy-id` for apps
- `experiments.log-inputs` now requires `ID` parameter as an input
- Added `genie.get-space` command
- Added `providers.list-provider-share-assets` command
For the full list of SDK changes, see:
https://github.com/databricks/databricks-sdk-go/releases/tag/v0.59.0
2025-03-05 10:20:51 +00:00
Shreyas Goenka
150a51c11a
merge
2025-03-05 10:29:21 +01:00
Andrew Nester
41961226be
Switch to use GET workspaces-files/{name} instead of workspace/export for state files ( #2423 )
...
## Changes
Switch to use GET workspaces-files/{name} instead of workspace/export
for state files.
## Why
`/api/2.0/workspaces-files/{name}` has a higher limit, which allows exporting state files larger than 10 MB (the current limit for `workspace/export`). We don't use the same API for reads in other places, and we don't fully replace the existing Filer, because it doesn't correctly get the file content for notebooks and returns a "File Not Found" error instead.
## Tests
All existing tests pass
2025-03-04 15:03:51 +00:00
Anton Nekipelov
c0f5436a28
Add support for schemas in deployment bind/unbind commands ( #2406 )
...
## Changes
1. Changed `FindResourceByConfigKey` to return schema resources
2. Implemented the `Exists` method for schema resources
## Why
This PR adds support for schema resources in deployment operations,
enabling users to:
- Bind schemas using `databricks bundle deployment bind <myschema_key>
<schema_full_name>`
- Unbind schemas using `databricks bundle deployment unbind
<myschema_key>`
Where:
- `myschema_key` is a resource key defined in the bundle's .yml file
- `schema_full_name` references an existing schema in the Databricks
workspace
These capabilities allow for more flexible resource management of
schemas within bundles.
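As a rough sketch (the resource key and schema names below are invented for illustration), the bundle side of such a bind could look like:
```
# databricks.yml (illustrative)
resources:
  schemas:
    myschema_key:
      catalog_name: main
      name: my_schema
```
Running `databricks bundle deployment bind myschema_key main.my_schema` would then attach the existing workspace schema to the `myschema_key` resource, and `databricks bundle deployment unbind myschema_key` would detach it again.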
## Tests
Added a new integration test that tests bind and unbind methods together
with bundle deployment and destruction.
2025-03-04 12:46:39 +00:00
Shreyas Goenka
0c9fbf7b23
-
2025-03-03 19:58:14 +01:00
Shreyas Goenka
898b2c1cc3
merge
2025-03-03 19:21:18 +01:00
Shreyas Goenka
58d28a9c92
add more tests for profile and target
2025-03-03 18:26:17 +01:00
Andrew Nester
010f88f84e
Added a warning when `config` section is used in apps ( #2416 )
...
## Changes
Added a warning when `config` section is used in apps
## Why
To avoid confusion between using apps in DABs and outside of DABs, we want to provide only one way of configuring an app's runtime configuration: an `app.yml` file in the root of the app.
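For reference, a minimal sketch of such an `app.yml`; the command and environment variable below are illustrative placeholders, not values taken from this PR:
```
# app.yml in the root of the app (illustrative)
command:
  - python
  - app.py
env:
  - name: LOG_LEVEL
    value: INFO
```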
## Tests
Added acceptance tests
2025-03-03 16:40:28 +00:00
Andrew Nester
3b07265113
Restrict same name libraries check for only whl and jar types ( #2401 )
...
## Changes
The same-name libraries check is only valid for local libraries. Local libraries are only supported for whl and jar types, hence we can restrict the matching pattern to these library types.
## Tests
Existing acceptance tests pass
2025-03-03 15:34:41 +00:00
Denis Bilenko
e4cd782852
Remove bundle.{Parallel,ReadOnlyBundle} ( #2414 )
...
## Changes
- Remove bundle.Parallel & bundle.ReadOnlyBundle.
- Add bundle.ApplyParallel, as a helper to migrate from bundle.Parallel.
- Keep ReadOnlyMutator as a separate type, but it's now a subtype of Mutator so it works on a regular *Bundle. Having it as a separate type prevents non-readonly mutators from being passed to ApplyParallel.
- validate.Validate becomes a function (was Mutator).
## Why
This is a follow-up to #2390, where we removed most of the tools to construct chains of mutators. The same motivation applies here.
When it comes to read-only bundles, it's a leaky abstraction: since it's a shallow copy, it does not actually guarantee or enforce read-only access to the bundle. A better approach would be to run parallel operations on independent, narrowly focused, deep-copied structs with just enough information to carry out the task (this is not implemented here, but it is the eventual goal). Now that we can write regular code in phases and are not limited to the mutator interface, we can switch to that approach.
## Tests
Existing tests.
---------
Co-authored-by: shreyas-goenka <88374338+shreyas-goenka@users.noreply.github.com>
2025-03-03 13:35:36 +00:00
shreyas-goenka
25a7e0a72b
Do not use `.MustString()` in auth interpolation warning ( #2412 )
...
## Why
Addresses post-merge feedback from
https://github.com/databricks/cli/pull/2399#discussion_r1975091730
## Tests
N/A
2025-03-03 12:03:46 +00:00
Denis Bilenko
8f8f24c3a9
Convert python wheel tests to acceptance ( #2396 )
...
## Changes
Rewrite bundle/tests/python_wheel_test.go into acceptance tests. The same configs are used, but the test now runs 'bundle deploy' and, in addition to checking the files on the file system, also checks that the files were uploaded and records the jobs/create request.
There is a new test helper, bin/find.py, which filters paths based on a regex and asserts on the number of expected results. I added it because 'find' on Windows behaves differently, so this helps avoid cross-platform differences.
2025-03-03 11:09:25 +00:00
shreyas-goenka
03e4bb2575
Update warning for includes outside root to only mention databricks.yml ( #2411 )
...
## Why
Addresses feedback from this thread:
https://github.com/databricks/cli/pull/2389#discussion_r1975760462 .
2025-03-03 09:21:39 +00:00
Andrew Nester
6a07e05e9b
Raise an error when there are multiple local libraries with the same basename used ( #2382 )
...
## Changes
It can happen that multiple artifacts defined in the bundle build, and therefore deploy, wheel packages with the same name. This leads to a conflict between these packages: they will overwrite each other, so they should have different names instead. A hypothetical sketch of the conflict is shown below.
Fixes https://github.com/databricks/cli/issues/1674
The previous attempt (https://github.com/databricks/cli/pull/2297 +
https://github.com/databricks/cli/pull/2341 ) led to a breakage; this PR fixes both issues.
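Concretely (artifact names and paths invented for illustration): two artifacts that both build a wheel with the same basename end up overwriting each other on upload:
```
artifacts:
  pkg_a:
    type: whl
    path: ./packages/a   # builds dist/my_lib-0.1.0-py3-none-any.whl
  pkg_b:
    type: whl
    path: ./packages/b   # also builds dist/my_lib-0.1.0-py3-none-any.whl
```
With this change, the deploy fails with an error instead of silently replacing one wheel with the other.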
## Tests
Added acceptance test
2025-02-27 16:32:50 +00:00
shreyas-goenka
bc299cafb8
Add warning when variable interpolation is used for auth fields ( #2399 )
...
## Changes
This PR adds a warning which gives users clear guidance when they try to
use variable interpolation for an auth field.
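A minimal sketch of configuration that would now trigger the warning, assuming the interpolated field is an auth-related field such as `workspace.host` (the variable name is illustrative):
```
variables:
  workspace_host:
    description: Workspace URL
workspace:
  host: ${var.workspace_host}  # interpolating an auth field like this now produces a warning
```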
## Tests
Modify existing acceptance test.
2025-02-27 15:58:48 +00:00
shreyas-goenka
eb57dbd844
Add warning when include is used in config files other than databricks.yml ( #2389 )
...
## Changes
Defining an include section in config files other than the main `databricks.yml` file fails silently. With this PR, users will get a warning when they try this.
## Tests
Acceptance test.
2025-02-27 14:59:00 +00:00
Denis Bilenko
e2db0cd0e2
Remove bundle.{Seq,If,Defer,newPhase,logString}, switch to regular functions ( #2390 )
...
## Changes
- Instead of constructing chains of mutators and then executing them,
execute them directly.
- Remove functionality related to chain-building: Seq, If, Defer,
newPhase, logString.
- Phases become functions that apply the changes directly rather than
construct mutator chains that will be called later.
- Add a helper ApplySeq to call multiple mutators, use it where
Apply+Seq were used before.
This is intended to be a refactoring without functional changes, but
there are a few behaviour changes:
- Since defer() is used to call unlock instead of bundle.Defer(), unlocking will now happen even in case of panics.
- In --debug, the phase names are still logged once before the start of the phase, but each entry no longer has 'seq' or the phase name in it.
- The message "Deployment complete!" was printed even if
terraform.Apply() mutator had an error. It no longer does that.
## Motivation
The use of chains was necessary when mutators were returning a list of other mutators instead of calling them directly. But that has since been removed, so the chain machinery no longer has any purpose.
Using direct functions simplifies the logic and makes bugs more apparent and easier to fix.
Other improvements that this unlocks:
- Simpler stacktraces/debugging (breakpoints).
- Use of functions with a narrowly scoped API: instead of mutators that receive the full bundle config, we can use focused functions that only deal with the sections they care about, e.g. prepareGitSettings(currentGitSection) -> updatedGitSection. This makes the data flow more apparent.
- Parallel computations across mutators (within a phase): launch goroutines fetching data from APIs at the beginning, then process them once they are ready.
## Tests
Existing tests.
2025-02-27 11:41:58 +00:00
Andrew Nester
cdea775bd2
Fixed spark version check for clusters defined in the same bundle ( #2374 )
...
## Changes
Previously, using Python wheel tasks with compute referring to an interactive cluster defined in the same bundle would produce a warning like the one below:
```
GET /api/2.1/clusters/get?cluster_id=${resources.clusters.development_cluster.id}
< HTTP/2.0 400 Bad Request
< {
< "error_code": "INVALID_PARAMETER_VALUE",
< "message": "Cluster ${resources.clusters.development_cluster.id} does not exist"
< } pid=14465 mutator=seq mutator=initialize mutator=seq mutator=PythonWrapperWarning sdk=true
```
This PR fixes it by making sure that we check the Spark version for such clusters based on their bundle configuration and don't make API calls.
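For context, a rough sketch of the triggering shape (resource names are placeholders): a Python wheel task whose compute references a cluster defined in the same bundle, so the cluster ID is still an unresolved `${resources.clusters...}` reference when the check runs:
```
resources:
  clusters:
    development_cluster:
      cluster_name: development_cluster
      spark_version: 15.4.x-scala2.12
      node_type_id: i3.xlarge
      num_workers: 1
  jobs:
    wheel_job:
      tasks:
        - task_key: main
          existing_cluster_id: ${resources.clusters.development_cluster.id}
          python_wheel_task:
            package_name: my_package
            entry_point: main
          libraries:
            - whl: ./dist/*.whl
```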
## Tests
Added acceptance test
2025-02-26 13:04:45 +01:00
Denis Bilenko
14f0292598
Simplify whl artifact autodetection code ( #2371 )
...
## Changes
- Get rid of artifacts.DetectPackages, which is a thin wrapper around artifacts/whl.DetectPackage.
- Get rid of parsing the name out of setup.py. Do not randomize it either; use a static one.
## Tests
Existing tests.
2025-02-25 13:10:25 +00:00
dependabot[bot]
764142978c
build(deps): bump github.com/databricks/databricks-sdk-go from 0.57.0 to 0.58.1 ( #2357 )
...
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.57.0 to 0.58.1.
Release notes (sourced from github.com/databricks/databricks-sdk-go's releases):
v0.58.1 Internal Changes:
- Do not send ForceSendFields as query parameters.
v0.58.0 New Features and Improvements:
- Enable async refreshes for OAuth tokens (#1143).
v0.58.0 Internal Changes:
- Add support for asynchronous data plane token refreshes (#1142).
- Introduce new TokenSource interface that takes a `context.Context` (#1141).
v0.58.0 API Changes:
- Added `GetMessageQueryResultByAttachment` method for `w.Genie` workspace-level service.
- Added `Id` field for `apps.App`.
- Added `LimitConfig` field for `billing.UpdateBudgetPolicyRequest`.
- Added `Volumes` field for `compute.ClusterLogConf`.
- Removed `ReviewState`, `Reviews` and `RunnerCollaborators` fields for `cleanrooms.CleanRoomAssetNotebook`.
OpenAPI SHA: 99f644e72261ef5ecf8d74db20f4b7a1e09723cc, Date: 2025-02-11
---------
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2025-02-25 11:20:29 +00:00
Julia Crawford (Databricks)
864f0abc4b
Populate missing bundle config reference doc descriptions and fix resource reference doc template ( #2314 )
...
## Changes
- Added some missing descriptions to annotations.yml
- Fixed links in the resource reference doc template
## Tests
2025-02-24 18:59:12 +00:00
Lennart Kats (databricks)
ce7e64062b
Change warning about incomplete permissions section into a recommendation ( #2043 )
...
## Changes
Changes the warning about an incomplete / implicit permissions section
into a recommendation, and does a minor bit of cleanup.
## Tests
New unit test.
2025-02-24 09:39:03 +00:00
Lennart Kats (databricks)
6e18d94264
Refine `mode: production` diagnostic output ( #2236 )
...
## Changes
This refines the `mode: production` diagnostic output now that the
`Detail` property is rendered as output. This is a follow-up to
https://github.com/databricks/cli/pull/1712 .
2025-02-24 08:33:13 +00:00
Ilya Kuznetsov
874a05a27b
Add escaping for links and headers in docsgen ( #2330 )
...
## Changes
To avoid warnings and errors in the docs build, we need to escape symbols that are treated as syntax elements.
## Tests
2025-02-18 16:12:49 +00:00
Pieter Noordhuis
96302c7415
Revert changes related to basename check for local libraries ( #2345 )
...
## Changes
These changes break the use of non-local libraries (such as PyPI
libraries).
This reverts the set of changes so we can cut a patch release and take a closer look later.
Original PRs are #2297 and #2341 .
Issue reported in #2343 .
## Tests
Manually confirmed that a bundle with a PyPI package in its libraries now deploys fine.
2025-02-12 20:05:49 +01:00
Andrew Nester
ac439f8c1a
Fix for regression deploying resources with PyPi and Maven library types ( #2341 )
...
## Changes
The CheckForSameNameLibraries mutator incorrectly assumed that all resource libraries are defined as paths of the `string` type, but some library types, such as PyPI and Maven, are defined as objects.
This PR addresses this issue. It was introduced in #2297 .
## Tests
Added regression test.
2025-02-12 17:14:30 +01:00
Andrew Nester
f7a45d0c7e
Upgrade to TF provider 1.65.1 ( #2328 )
...
## Changes
Upgrade to TF provider 1.65.1
Notable changes:
- It is now possible to use the `run_as` field in `pipelines` definitions
- Added support for `performance_target` for `jobs`
2025-02-10 14:06:02 +00:00
dependabot[bot]
047691dd91
Bump github.com/databricks/databricks-sdk-go from 0.56.1 to 0.57.0 ( #2321 )
...
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.56.1 to 0.57.0.
Release notes (sourced from github.com/databricks/databricks-sdk-go's releases):
v0.57.0 New Features and Improvements:
- Add support for async OAuth token refreshes (#1135).
v0.57.0 API Changes:
- Added `a.BudgetPolicy` account-level service.
- Added `a.EnableIpAccessLists` account-level service.
- Added `w.LakeviewEmbedded` and `w.QueryExecution` workspace-level services.
- Added `w.RedashConfig` workspace-level service.
- Added `GcpOauthToken` field for `catalog.TemporaryCredentials`.
- Added `Options` field for `catalog.UpdateCatalog`.
- Added `StatementId` field for `dashboards.QueryAttachment`.
- Added `EffectivePerformanceTarget` field for `jobs.BaseRun` and `jobs.Run`.
- Added `PerformanceTarget` field for `jobs.CreateJob`, `jobs.JobSettings` and `jobs.RunNow`.
- Added `Disabled` and `EffectivePerformanceTarget` fields for `jobs.RunTask`.
- Added `UserAuthorizedScopes` field for `oauth2.CreateCustomAppIntegration`, `oauth2.GetCustomAppIntegrationOutput` and `oauth2.UpdateCustomAppIntegration`.
- Added `Contents` field for `serving.HttpRequestResponse`.
- Changed `HttpRequest` method for `w.ServingEndpoints` workspace-level service to return `serving.HttpRequestResponse`.
- Removed `SecurableKind` field for `catalog.CatalogInfo` and `catalog.ConnectionInfo`.
- Removed `StatusCode` and `Text` fields for `serving.ExternalFunctionResponse`.
OpenAPI SHA: c72c58f97b950fcb924a90ef164bcb10cfcd5ece, Date: 2025-02-03
---------
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2025-02-10 12:50:52 +00:00
Andrew Nester
f8aaa7fce3
Added support to generate Git based jobs ( #2304 )
...
## Changes
This will generate bundle YAML configuration for Git-based jobs but won't download any related files, as they are in the Git repo.
Fixes #1423
## Tests
Added unit test
---------
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2025-02-07 18:37:03 +00:00
Andrew Nester
2a97dcaa45
Raise an error when there are multiple local libraries with the same basename used ( #2297 )
...
## Changes
Raise an error when there are multiple local libraries with the same
basename used
Fixes #1674
## Tests
Added a unit test
2025-02-07 17:55:16 +00:00
Andrew Nester
5aa89230e9
Use CreatePipeline instead of PipelineSpec for resources.Pipeline struct ( #2287 )
...
## Changes
`CreatePipeline` is a more complete structure (a superset of `PipelineSpec`), which enables support for additional fields such as `run_as` and `allow_duplicate_names` in DABs configuration. Note: these fields require support in the TF provider in order to work correctly.
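For illustration, a rough sketch of a pipeline definition using one of the newly supported fields; the pipeline name, notebook path, and service principal ID are placeholders, and as noted above the field also needs TF provider support:
```
resources:
  pipelines:
    example_pipeline:
      name: example_pipeline
      catalog: main
      target: example_schema
      libraries:
        - notebook:
            path: ./pipelines/example.py
      run_as:
        service_principal_name: "00000000-0000-0000-0000-000000000000"
```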
## Tests
Existing tests pass + no fields are removed from JSON schema
2025-02-07 17:22:51 +00:00
Denis Bilenko
54e16d5f62
Always print warnings and errors; clean up format ( #2213 )
...
## Changes
- Print warnings and errors by default.
- Fix ErrAlreadyPrinted not to be logged at Error level.
- Format log messages as "Warn: message" instead of "WARN" to make them more readable and in line with the rest of the output.
- Only print attributes (pid, mutator, etc) and time when the overall
level is debug (so --debug output has not changed much).
## Tests
- Existing acceptance tests show how warning messages appear in various test cases.
- Added a new test for `--debug` output.
- Added a sort_lines.py helper to avoid a dependency on 'sort', which is locale-sensitive.
2025-02-07 11:29:40 +00:00
Ilya Kuznetsov
27eb0c4072
Allow 'any' examples in JSON schema ( #2289 )
...
## Changes
1. Allow `any` examples in the JSON schema type, since we have many of them in the OpenAPI spec
2. Fix an issue with missing override annotations when re-generating the schema
## Tests
2025-02-06 19:27:55 +00:00
Ilya Kuznetsov
1678503cb0
Fix docs template ( #2283 )
...
## Changes
A comment breaks the markdown front matter, so the description cannot be read.
## Tests
2025-02-05 09:01:51 +00:00
rikjansen-hu
dcc61cd763
Fix env variable for AzureCli local config ( #2248 )
...
## Changes
Solves #1722 (current solution passes wrong variable)
## Tests
None, this is a simple find-and-replace on a previous PR.
Proof that this is the correct
[variable](https://learn.microsoft.com/en-us/cli/azure/azure-cli-configuration#cli-configuration-file ).
This just passes the variable along to the Terraform environment, which
[should](https://github.com/hashicorp/terraform/issues/25416 ) be picked
up by Terraform.
Co-authored-by: Rik Jansen <rik.jansen2@nl.abnamro.com>
2025-02-04 19:30:02 +01:00
Simon Poltier
84b694f2a1
accept JSON includes ( #2265 )
...
#2201 disabled using JSON as part of a bundle definition. I believe this
was not intended.
## Changes
Accept JSON files as includes, just like YAML files.
## Tests
Covered by the tests in #2201
2025-02-04 19:28:19 +01:00
Denis Bilenko
38efedcd73
Remove bundle.git.inferred ( #2258 )
...
The only use case for it was to emit a warning, and based on the discussion here
https://github.com/databricks/cli/pull/2213/files#r1933558087 the warning is not useful, and logging it with reduced severity is also not useful.
2025-01-29 14:15:52 +00:00
Gleb Kanterov
13596eb605
PythonMutator: Fix relative path error ( #2253 )
...
## Changes
Fix relative path errors in the Python mutator that were causing deployments to fail since v0.239.1.
Before that:
```
% databricks bundle deploy
Deploying resources...
Updating deployment state...
Error: failed to compute relative path for job jobs_as_code_project_job: Rel: can't make resources/jobs_as_code_project_job.py relative to /Users/$USER/jobs_as_code_project
```
As a result, the bundle was deployed, but the deployment state wasn't
updated.
## Tests
Unit tests, adding acceptance tests in
https://github.com/databricks/cli/pull/2254
2025-01-29 13:56:57 +00:00
Ilya Kuznetsov
708c4fbb7a
Autogenerated documentation for bundle config ( #2033 )
...
## Changes
A documentation autogeneration tool. This tool uses the same annotations_*.yml files as the JSON schema.
The result will go [there](https://docs.databricks.com/en/dev-tools/bundles/reference.html) and [there](https://docs.databricks.com/en/dev-tools/bundles/resources.html#cluster)
## Tests
Manually
2025-01-29 12:14:21 +00:00
shreyas-goenka
884b5f26ed
Set bundle auth configuration in command context ( #2195 )
...
## Changes
This change is required to enable tracking execution time telemetry for
bundle commands. In order to track execution time for the command
generally, we need to have the databricks auth configuration available
at this section of the code:
41bbd89257/cmd/root/root.go (L99)
In order to do this we can rely on the `configUsed` context key.
Most commands rely on the `root.MustWorkspaceClient` function which
automatically sets the client config in the `configUsed` context key.
Bundle commands, however, do not do so. They instead store their
workspace clients in the `&bundle.Bundle{}` object.
With this PR, the `configUsed` context key will be set for all `bundle`
commands. Functionally nothing changes.
## Tests
Existing tests. Also manually verified that either
`root.MustConfigureBundle` or `utils.ConfigureBundleWithVariables` is
called for all bundle commands (except `bundle init`) thus ensuring this
context key would be set for all bundle commands.
refs for the functions:
1. `root.MustConfigureBundle`:
41bbd89257/cmd/root/bundle.go (L88)
2. `utils.ConfigureBundleWithVariables`:
41bbd89257/cmd/bundle/utils/utils.go (L19)
---------
Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2025-01-29 11:02:08 +00:00
Andrew Nester
413ca5c134
Do not wait for app compute to start on `bundle deploy` ( #2144 )
...
## Changes
This allows DABs to avoid waiting for the compute to start when an app is initially created as part of `bundle deploy`, which significantly improves deploy time.
Always set `no_compute` to true for apps.
## Tests
Covered by `TestDeployBundleWithApp`, currently fails until TF provider
is upgraded to the version supporting `no_compute` option
2025-01-28 17:17:37 +00:00
Andrew Nester
099e9bed0f
Upgrade TF provider to 1.64.1 ( #2247 )
...
## Changes
- Added support for `no_compute` in Apps
- Added support for `run_as_repl` for job tasks
2025-01-28 14:34:44 +00:00
Denis Bilenko
be908ee1a1
Add acceptance test for 'experimental.scripts' ( #2240 )
2025-01-27 15:28:33 +00:00
dependabot[bot]
4595c6f1b5
Bump github.com/databricks/databricks-sdk-go from 0.55.0 to 0.56.1 ( #2238 )
...
Bumps [github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go) from 0.55.0 to 0.56.1.
Release notes (sourced from github.com/databricks/databricks-sdk-go's releases):
v0.56.1 Bug Fixes:
- Do not send query parameters when set to zero value (#1136).
v0.56.0 Bug Fixes:
- Support Query parameters for all HTTP operations (#1124).
v0.56.0 Internal Changes:
- Add download target to MakeFile (#1125).
- Delete examples/mocking module (#1126).
- Scope the traversing directory in the Recursive list workspace test (#1120).
v0.56.0 API Changes:
- Added `w.AccessControl` workspace-level service.
- Added `HttpRequest` method for `w.ServingEndpoints` workspace-level service.
- Added `ReviewState`, `Reviews` and `RunnerCollaborators` fields for `cleanrooms.CleanRoomAssetNotebook`.
- Added `CleanRoomsNotebookOutput` field for `jobs.RunOutput`.
- Added `RunAsRepl` field for `jobs.SparkJarTask`.
- Added `Scopes` field for `oauth2.UpdateCustomAppIntegration`.
- Added `Contents` field for `serving.GetOpenApiResponse`.
- Added `Activated`, `ActivationUrl`, `AuthenticationType`, `Cloud`, `Comment`, `CreatedAt`, `CreatedBy`, `DataRecipientGlobalMetastoreId`, `IpAccessList`, `MetastoreId`, `Name`, `Owner`, `PropertiesKvpairs`, `Region`, `SharingCode`, `Tokens`, `UpdatedAt` and `UpdatedBy` fields for `sharing.RecipientInfo`.
- Added `ExpirationTime` field for `sharing.RecipientInfo`.
- Added `Pending` enum value for `cleanrooms.CleanRoomAssetStatusEnum`.
- Added `AddNodesFailed`, `AutomaticClusterUpdate`, `AutoscalingBackoff` and `AutoscalingFailed` enum values for `compute.EventType`.
- Added `PendingWarehouse` enum value for `dashboards.MessageStatus`.
- Added `Cpu`, `GpuLarge`, `GpuMedium`, `GpuSmall` and `MultigpuMedium` enum values for `serving.ServingModelWorkloadType`.
- Changed `Update` method for `w.Recipients` workspace-level service to return `sharing.RecipientInfo`; its return type is no longer empty.
- Changed `Create` method for `w.ServingEndpoints` workspace-level service with a new required argument order.
- Changed `GetOpenApi` method for `w.ServingEndpoints` workspace-level service return type to become non-empty.
- Changed `Patch` method for `w.ServingEndpoints` workspace-level service to return `serving.EndpointTags`.
- Changed `CollaboratorAlias` field for `cleanrooms.CleanRoomCollaborator` to be required.
- Changed `Behavior` field for `serving.AiGatewayGuardrailPiiBehavior` to no longer be required.
- Changed `Config` field for `serving.CreateServingEndpoint` to no longer be required.
- Changed `ProjectId` and `Region` fields for `serving.GoogleCloudVertexAiConfig` to be required.
- Changed `WorkloadType` field for `serving.ServedEntityInput` to type `serving.ServingModelWorkloadType`.
- Changed `WorkloadType` field for `serving.ServedEntityOutput` to type `serving.ServingModelWorkloadType`.
... (truncated)
---------
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2025-01-27 13:11:07 +00:00
Ilya Kuznetsov
0487e816cc
Reading variables from file ( #2171 )
...
## Changes
A new source of default values for variables: the variable file `.databricks/bundle/<target>/variable-overrides.json`.
The CLI tries to stat and read that file every time during the variable initialisation phase.
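As an illustrative sketch (variable names and values are invented), the file simply maps variable names to the default values to use for that target:
```
{
  "catalog": "main",
  "notifications": ["first@company.com", "second@company.com"]
}
```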
## Tests
Acceptance tests
2025-01-23 14:35:33 +00:00
Andrew Nester
8af9efaa62
Show an error when non-yaml files used in include section ( #2201 )
...
## Changes
`include` section is used only to include other bundle configuration
YAML files. If any other file type is used, raise an error and guide
users to use `sync.include` instead
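A short hypothetical example of the distinction (paths are illustrative):
```
include:
  - resources/*.yml   # only bundle configuration YAML files belong here

sync:
  include:
    - scripts/**      # arbitrary non-YAML files to upload go under sync.include
```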
## Tests
Added acceptance test
---------
Co-authored-by: Julia Crawford (Databricks) <julia.crawford@databricks.com>
2025-01-23 13:58:18 +00:00
Andrew Nester
6153423c56
Revert "Upgrade Go SDK to 0.56.0 ( #2214 )" ( #2217 )
...
This reverts commit 798189eb96.
2025-01-23 13:21:59 +00:00
Denis Bilenko
ddd45e25ee
Pass USE_SDK_V2_{RESOURCES,DATA_SOURCES} to terraform ( #2207 )
...
## Changes
- Propagate the env vars USE_SDK_V2_RESOURCES and USE_SDK_V2_DATA_SOURCES to terraform
- These are troubleshooting helpers for resources migrated to the new plugin framework, recommended here:
https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/troubleshooting#plugin-framework-migration-problems
- This currently unblocks deploying quality monitors, see
https://github.com/databricks/terraform-provider-databricks/issues/4229#issuecomment-2520344690
## Tests
Manually tested that I can deploy a quality monitor after this change with `USE_SDK_V2_RESOURCES="databricks_quality_monitor"` set.
### Main branch:
```
~/work/databricks_quality_monitor_repro % USE_SDK_V2_RESOURCES="databricks_quality_monitor" ../cli/cli-main bundle deploy
Uploading bundle files to /Workspace/Users/denis.bilenko@databricks.com/.bundle/quality_monitor_bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
Error: terraform apply: exit status 1
Error: Provider produced inconsistent result after apply
When applying changes to databricks_quality_monitor.monitor_trips, provider
"provider[\"registry.terraform.io/databricks/databricks\"]" produced an
unexpected new value: .data_classification_config: block count changed from 0
to 1.
This is a bug in the provider, which should be reported in the provider's own
issue tracker.
```
### This branch:
```
~/work/databricks_quality_monitor_repro % USE_SDK_V2_RESOURCES="databricks_quality_monitor" ../cli/cli bundle deploy
Uploading bundle files to /Workspace/Users/denis.bilenko@databricks.com/.bundle/quality_monitor_bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
```
### Config:
```
~/work/databricks_quality_monitor_repro % cat databricks.yml
bundle:
  name: quality_monitor_bundle

resources:
  quality_monitors:
    monitor_trips:
      table_name: main.denis-bilenko-cuj-pe34.trips_sanitized_1
      output_schema_name: main.denis-bilenko-cuj-pe34
      assets_dir: /Workspace/Users/${workspace.current_user.userName}/quality_monitor_issue
      snapshot: {}
```
2025-01-23 12:48:47 +00:00