Commit Graph

623 Commits

Author SHA1 Message Date
Shreyas Goenka acd64fa296
merge 2025-01-29 16:02:09 +01:00
Denis Bilenko 38efedcd73
Remove bundle.git.inferred (#2258)
The only use case for it was to emit a warning, and based on the
discussion here
https://github.com/databricks/cli/pull/2213/files#r1933558087 the
warning is not useful, and logging it with reduced severity is not
useful either.
2025-01-29 14:15:52 +00:00
Gleb Kanterov 13596eb605
PythonMutator: Fix relative path error (#2253)
## Changes

Fix a relative path error in the Python mutator that caused deployments
to fail since v0.239.1.

Before this fix:

```
% databricks bundle deploy  
Deploying resources...
Updating deployment state...
Error: failed to compute relative path for job jobs_as_code_project_job: Rel: can't make resources/jobs_as_code_project_job.py relative to /Users/$USER/jobs_as_code_project
```

As a result, the bundle was deployed, but the deployment state wasn't
updated.
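
Below is a minimal Go sketch, not the mutator's actual code, showing why `filepath.Rel` produces this class of error: it cannot relate a relative target path to an absolute base path (the user path is illustrative).

```go
package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	// A relative target against an absolute base makes Rel fail.
	_, err := filepath.Rel("/Users/user/jobs_as_code_project", "resources/jobs_as_code_project_job.py")
	fmt.Println(err)
	// Rel: can't make resources/jobs_as_code_project_job.py relative to /Users/user/jobs_as_code_project
}
```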

## Tests

Unit tests, adding acceptance tests in
https://github.com/databricks/cli/pull/2254
2025-01-29 13:56:57 +00:00
Ilya Kuznetsov 708c4fbb7a
Autogenerated documentation for bundle config (#2033)
## Changes

Documentation autogeneration tool. This tool uses the same
annotations_*.yml files as the JSON schema.

The output will be published
[here](https://docs.databricks.com/en/dev-tools/bundles/reference.html)
and
[here](https://docs.databricks.com/en/dev-tools/bundles/resources.html#cluster).

## Tests
Manually
2025-01-29 12:14:21 +00:00
shreyas-goenka 884b5f26ed
Set bundle auth configuration in command context (#2195)
## Changes

This change is required to enable tracking execution time telemetry for
bundle commands. In order to track execution time for the command
generally, we need to have the databricks auth configuration available
at this point in the code:


41bbd89257/cmd/root/root.go (L99)

In order to do this we can rely on the `configUsed` context key.   

Most commands rely on the `root.MustWorkspaceClient` function which
automatically sets the client config in the `configUsed` context key.
Bundle commands, however, do not do so. They instead store their
workspace clients in the `&bundle.Bundle{}` object.

With this PR, the `configUsed` context key will be set for all `bundle`
commands. Functionally nothing changes.
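
A hedged sketch of the context-key pattern described above; the actual key type and accessors in `cmd/root` are internal, so the names here are illustrative only.

```go
package main

import (
	"context"
	"fmt"

	"github.com/databricks/databricks-sdk-go/config"
)

// configUsedKey is a hypothetical unexported key type, mirroring the
// usual Go pattern for context keys.
type configUsedKey struct{}

func withConfigUsed(ctx context.Context, cfg *config.Config) context.Context {
	return context.WithValue(ctx, configUsedKey{}, cfg)
}

func configUsed(ctx context.Context) (*config.Config, bool) {
	cfg, ok := ctx.Value(configUsedKey{}).(*config.Config)
	return cfg, ok
}

func main() {
	ctx := withConfigUsed(context.Background(), &config.Config{Host: "https://example.cloud.databricks.com"})
	if cfg, ok := configUsed(ctx); ok {
		fmt.Println(cfg.Host)
	}
}
```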

## Tests
Existing tests. Also manually verified that either
`root.MustConfigureBundle` or `utils.ConfigureBundleWithVariables` is
called for all bundle commands (except `bundle init`), thus ensuring this
context key is set for all bundle commands.

refs for the functions:
1. `root.MustConfigureBundle`:
41bbd89257/cmd/root/bundle.go (L88)
2. `utils.ConfigureBundleWithVariables`:
41bbd89257/cmd/bundle/utils/utils.go (L19)

---------

Co-authored-by: Pieter Noordhuis <pieter.noordhuis@databricks.com>
2025-01-29 11:02:08 +00:00
Andrew Nester 413ca5c134
Do not wait for app compute to start on `bundle deploy` (#2144)
## Changes
This allows DABs to avoid waiting for the compute to start when an app is
initially created as part of `bundle deploy`, which significantly
improves deploy time.

We now always set `no_compute` to true for apps.
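
A short sketch of what that looks like against the Go SDK; the `NoCompute` field on `apps.CreateAppRequest` is confirmed by the SDK release notes further down this log, while the surrounding call is illustrative and other request fields are elided.

```go
package main

import (
	"context"

	"github.com/databricks/databricks-sdk-go"
	"github.com/databricks/databricks-sdk-go/service/apps"
)

func main() {
	w := databricks.Must(databricks.NewWorkspaceClient())
	// NoCompute: create the app without waiting for its compute to start.
	_, _ = w.Apps.Create(context.Background(), apps.CreateAppRequest{
		NoCompute: true,
	})
}
```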

## Tests
Covered by `TestDeployBundleWithApp`; currently fails until the TF
provider is upgraded to a version supporting the `no_compute` option.
2025-01-28 17:17:37 +00:00
Andrew Nester 099e9bed0f
Upgrade TF provider to 1.64.1 (#2247)
## Changes
- Added support for `no_compute` in Apps
- Added support for `run_as_repl` for job tasks
2025-01-28 14:34:44 +00:00
Denis Bilenko be908ee1a1
Add acceptance test for 'experimental.scripts' (#2240) 2025-01-27 15:28:33 +00:00
dependabot[bot] 4595c6f1b5
Bump github.com/databricks/databricks-sdk-go from 0.55.0 to 0.56.1 (#2238)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.55.0 to 0.56.1.
Release notes, sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases):

## v0.56.1

### Bug Fixes

- Do not send query parameters when set to zero value ([#1136](https://redirect.github.com/databricks/databricks-sdk-go/pull/1136)).

## v0.56.0

### Bug Fixes

- Support Query parameters for all HTTP operations ([#1124](https://redirect.github.com/databricks/databricks-sdk-go/pull/1124)).

### Internal Changes

- Add download target to MakeFile ([#1125](https://redirect.github.com/databricks/databricks-sdk-go/pull/1125)).
- Delete examples/mocking module ([#1126](https://redirect.github.com/databricks/databricks-sdk-go/pull/1126)).
- Scope the traversing directory in the Recursive list workspace test ([#1120](https://redirect.github.com/databricks/databricks-sdk-go/pull/1120)).

### API Changes

- Added [w.AccessControl](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/iam#AccessControlAPI) workspace-level service.
- Added `HttpRequest` method for [w.ServingEndpoints](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI) workspace-level service.
- Added `ReviewState`, `Reviews` and `RunnerCollaborators` fields for [cleanrooms.CleanRoomAssetNotebook](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomAssetNotebook).
- Added `CleanRoomsNotebookOutput` field for [jobs.RunOutput](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#RunOutput).
- Added `RunAsRepl` field for [jobs.SparkJarTask](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#SparkJarTask).
- Added `Scopes` field for [oauth2.UpdateCustomAppIntegration](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateCustomAppIntegration).
- Added `Contents` field for [serving.GetOpenApiResponse](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GetOpenApiResponse).
- Added `Activated`, `ActivationUrl`, `AuthenticationType`, `Cloud`, `Comment`, `CreatedAt`, `CreatedBy`, `DataRecipientGlobalMetastoreId`, `IpAccessList`, `MetastoreId`, `Name`, `Owner`, `PropertiesKvpairs`, `Region`, `SharingCode`, `Tokens`, `UpdatedAt` and `UpdatedBy` fields for [sharing.RecipientInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientInfo).
- Added `ExpirationTime` field for [sharing.RecipientInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientInfo).
- Added `Pending` enum value for [cleanrooms.CleanRoomAssetStatusEnum](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomAssetStatusEnum).
- Added `AddNodesFailed`, `AutomaticClusterUpdate`, `AutoscalingBackoff` and `AutoscalingFailed` enum values for [compute.EventType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/compute#EventType).
- Added `PendingWarehouse` enum value for [dashboards.MessageStatus](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/dashboards#MessageStatus).
- Added `Cpu`, `GpuLarge`, `GpuMedium`, `GpuSmall` and `MultigpuMedium` enum values for [serving.ServingModelWorkloadType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingModelWorkloadType).
- Changed `Update` method for [w.Recipients](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientsAPI) workspace-level service to return [sharing.RecipientInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/sharing#RecipientInfo); its return type is no longer empty.
- Changed `Create` method for [w.ServingEndpoints](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI) workspace-level service with new required argument order.
- Changed `GetOpenApi` method for [w.ServingEndpoints](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI) workspace-level service; its return type is no longer empty.
- Changed `Patch` method for [w.ServingEndpoints](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingEndpointsAPI) workspace-level service to return [serving.EndpointTags](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#EndpointTags).
- Changed [serving.EndpointTagList](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#EndpointTagList).
- Changed `CollaboratorAlias` field for [cleanrooms.CleanRoomCollaborator](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/cleanrooms#CleanRoomCollaborator) to be required.
- Changed `Behavior` field for [serving.AiGatewayGuardrailPiiBehavior](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#AiGatewayGuardrailPiiBehavior) to no longer be required.
- Changed `Config` field for [serving.CreateServingEndpoint](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#CreateServingEndpoint) to no longer be required.
- Changed `ProjectId` and `Region` fields for [serving.GoogleCloudVertexAiConfig](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#GoogleCloudVertexAiConfig) to be required.
- Changed `WorkloadType` field for [serving.ServedEntityInput](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityInput) to type [serving.ServingModelWorkloadType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingModelWorkloadType).
- Changed `WorkloadType` field for [serving.ServedEntityOutput](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServedEntityOutput) to type [serving.ServingModelWorkloadType](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#ServingModelWorkloadType).

... (truncated)
Commits:

- `bf617bb7a6` [Release] Release v0.56.1 ([#1137](https://redirect.github.com/databricks/databricks-sdk-go/issues/1137))
- `18cebf1d5c` [Fix] Do not send query parameters when set to zero value ([#1136](https://redirect.github.com/databricks/databricks-sdk-go/issues/1136))
- `28ff749ee2` [Release] Release v0.56.0 ([#1134](https://redirect.github.com/databricks/databricks-sdk-go/issues/1134))
- `113454080f` [Internal] Add download target to MakeFile ([#1125](https://redirect.github.com/databricks/databricks-sdk-go/issues/1125))
- `e079db96f3` [Fix] Support Query parameters for all HTTP operations ([#1124](https://redirect.github.com/databricks/databricks-sdk-go/issues/1124))
- `1045fb9697` [Internal] Delete examples/mocking module ([#1126](https://redirect.github.com/databricks/databricks-sdk-go/issues/1126))
- `914ab6b7e8` [Internal] Scope the traversing directory in the Recursive list workspace tes...
- See full diff in [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.55.0...v0.56.1)

Most Recent Ignore Conditions Applied to This Pull Request:

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |



---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2025-01-27 13:11:07 +00:00
Ilya Kuznetsov 0487e816cc
Reading variables from file (#2171)
## Changes

Adds a new source of default values for variables: the variable file
`.databricks/bundle/<target>/variable-overrides.json`.

The CLI tries to stat and read that file on every run during the
variable initialization phase.
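
A minimal Go sketch of the lookup described above; the path layout comes from the commit message, while the content shape (a flat JSON object mapping variable names to values) is an assumption for illustration.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"path/filepath"
)

// loadVariableOverrides stats and reads the per-target overrides file.
func loadVariableOverrides(root, target string) (map[string]any, error) {
	path := filepath.Join(root, ".databricks", "bundle", target, "variable-overrides.json")
	if _, err := os.Stat(path); err != nil {
		return nil, err // file absent: no overrides to apply
	}
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, err
	}
	var overrides map[string]any
	if err := json.Unmarshal(data, &overrides); err != nil {
		return nil, err
	}
	return overrides, nil
}

func main() {
	overrides, err := loadVariableOverrides(".", "default")
	fmt.Println(overrides, err)
}
```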

## Tests

Acceptance tests
2025-01-23 14:35:33 +00:00
Andrew Nester 8af9efaa62
Show an error when non-yaml files used in include section (#2201)
## Changes
The `include` section is used only to include other bundle configuration
YAML files. If any other file type is used, an error is raised guiding
users to use `sync.include` instead.
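
A minimal sketch of the extension check this implies; the exact error wording in the CLI may differ.

```go
package main

import (
	"fmt"
	"path/filepath"
)

// validateInclude rejects include entries that are not YAML files.
func validateInclude(path string) error {
	switch filepath.Ext(path) {
	case ".yml", ".yaml":
		return nil
	default:
		return fmt.Errorf("file %s in the 'include' section is not a YAML file; use 'sync.include' to sync other files", path)
	}
}

func main() {
	fmt.Println(validateInclude("notebook.py"))
}
```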

## Tests
Added acceptance test

---------

Co-authored-by: Julia Crawford (Databricks) <julia.crawford@databricks.com>
2025-01-23 13:58:18 +00:00
Andrew Nester 6153423c56
Revert "Upgrade Go SDK to 0.56.0 (#2214)" (#2217)
This reverts commit 798189eb96.
2025-01-23 13:21:59 +00:00
Shreyas Goenka 01d63dd20e
Merge remote-tracking branch 'origin' into implement-async-logger 2025-01-23 14:19:46 +01:00
Denis Bilenko ddd45e25ee
Pass USE_SDK_V2_{RESOURCES,DATA_SOURCES} to terraform (#2207)
## Changes
- Propagate the env vars USE_SDK_V2_RESOURCES and USE_SDK_V2_DATA_SOURCES
to terraform (see the sketch below)
- These are troubleshooting helpers for resources migrated to the new
plugin framework, recommended here:
https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/troubleshooting#plugin-framework-migration-problems
- This currently unblocks deploying quality monitors, see
https://github.com/databricks/terraform-provider-databricks/issues/4229#issuecomment-2520344690
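
A minimal sketch of the forwarding described above; the actual plumbing in `bundle/deploy/terraform` builds the subprocess environment differently, so this is illustrative only.

```go
package main

import (
	"fmt"
	"os"
)

// terraformEnv appends the troubleshooting variables to a base
// environment when they are set in the CLI's own environment.
func terraformEnv(base []string) []string {
	for _, k := range []string{"USE_SDK_V2_RESOURCES", "USE_SDK_V2_DATA_SOURCES"} {
		if v, ok := os.LookupEnv(k); ok {
			base = append(base, k+"="+v)
		}
	}
	return base
}

func main() {
	fmt.Println(terraformEnv([]string{"TF_IN_AUTOMATION=1"})) // illustrative base entry
}
```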

## Tests
Manually tested that I can deploy a quality monitor after this change
with `USE_SDK_V2_RESOURCES="databricks_quality_monitor"` set

### Main branch:
```
~/work/databricks_quality_monitor_repro % USE_SDK_V2_RESOURCES="databricks_quality_monitor" ../cli/cli-main bundle deploy
Uploading bundle files to /Workspace/Users/denis.bilenko@databricks.com/.bundle/quality_monitor_bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
Error: terraform apply: exit status 1

Error: Provider produced inconsistent result after apply

When applying changes to databricks_quality_monitor.monitor_trips, provider
"provider[\"registry.terraform.io/databricks/databricks\"]" produced an
unexpected new value: .data_classification_config: block count changed from 0
to 1.

This is a bug in the provider, which should be reported in the provider's own
issue tracker.
```

### This branch:
```
~/work/databricks_quality_monitor_repro % USE_SDK_V2_RESOURCES="databricks_quality_monitor" ../cli/cli bundle deploy
Uploading bundle files to /Workspace/Users/denis.bilenko@databricks.com/.bundle/quality_monitor_bundle/default/files...
Deploying resources...
Updating deployment state...
Deployment complete!
```

### Config:
```
~/work/databricks_quality_monitor_repro % cat databricks.yml
bundle:
  name: quality_monitor_bundle

resources:
  quality_monitors:
    monitor_trips:
      table_name: main.denis-bilenko-cuj-pe34.trips_sanitized_1
      output_schema_name: main.denis-bilenko-cuj-pe34
      assets_dir: /Workspace/Users/${workspace.current_user.userName}/quality_monitor_issue
      snapshot: {}
```
2025-01-23 12:48:47 +00:00
Andrew Nester 798189eb96
Upgrade Go SDK to 0.56.0 (#2214)
## Changes

Upgrade Go SDK to 0.56.0

Relevant changes:
- Support Query parameters for all HTTP operations
(https://github.com/databricks/databricks-sdk-go/pull/1124).
2025-01-23 11:17:52 +00:00
Ilya Kuznetsov f60ad32f07
Allow yaml-anchors in schema (#2200)
## Changes

Allow custom untyped fields in the root config in the JSON schema so it
doesn't highlight errors when YAML anchors are used.

Example use case:

```
tags: &job-tags
  environment: ${bundle.target}


resources:
  jobs:
    db1:
      tags:
        <<: *job-tags
    db2:
      tags:
        <<: *job-tags
```
  
One downside is that we don't highlight any unknown top-level properties
anymore (but they will still fail during CLI validation)

## Tests

Manually checked the behavior in VSCode: it doesn't show a validation
error. Also checked that other typed properties are still suggested
2025-01-23 11:11:44 +00:00
Shreyas Goenka 3964d8d454
[WIP] In process telemetry logger 2025-01-22 22:03:13 +01:00
Gleb Kanterov 3d91691f25
PythonMutator: propagate source locations (#1783)
## Changes
Add a mechanism to load Python source locations in the Python mutator.
Previously, locations pointed to the generated YAML. Now they point to
the Python sources instead. The Python process outputs "locations.json"
containing the locations of bundle paths, for example:

```json
{"path": "resources.jobs.job_0", "file": "resources/job_0.py", "line": 3, "column": 5}
{"path": "resources.jobs.job_0.tasks[0].task_key", "file": "resources/job_0.py", "line": 10, "column": 5}
{"path": "resources.jobs.job_1", "file": "resources/job_1.py", "line": 5, "column": 7}
```

Such locations form a tree, and we assign locations of the closest
ancestor to each `dyn.Value` based on its path. For example,
`resources.jobs.job_0.tasks[0].task_key` is located at `job_0.py:10:5`
and `resources.jobs.job_0.tasks[0].email_notifications` is located at
`job_0.py:3:5`, because we use the location of the job as the most
precise approximation.

This feature is only enabled if `experimental/python` is used.

Note: for now, we don't update locations with relative paths, because
doing so has a side effect of changing how these paths are resolved
## Example
```
% databricks bundle validate

Warning: job_cluster_key abc is not defined
  at resources.jobs.examples.tasks[0].job_cluster_key
  in resources/example.py:10:1
```

## Tests
Unit tests and manually
2025-01-22 15:37:37 +00:00
Denis Bilenko 54a470837c
Fix context propagation in bundle/deploy/terraform (#2208)
https://github.com/databricks/cli/pull/747#discussion_r1925248116
2025-01-22 13:28:13 +00:00
Denis Bilenko 667302b61b
Refactor env forwarding function in terraform (#2206)
No functional changes, just making it easier to add variables.
2025-01-22 12:51:17 +00:00
shreyas-goenka 6c3ddbd921
Add `auth.Env` function (#2204)
## Changes
`auth.Env` is a generic function that we can use for authenticated tools
downstream of the CLI.
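
A hedged sketch of the idea; the real function lives in the CLI's auth package and its exact signature is not shown here, so this mapping is an assumption for illustration (only two well-known variables shown).

```go
package main

import (
	"fmt"

	"github.com/databricks/databricks-sdk-go/config"
)

// env derives the environment variables a downstream tool needs from a
// resolved SDK config.
func env(cfg *config.Config) map[string]string {
	out := map[string]string{}
	if cfg.Host != "" {
		out["DATABRICKS_HOST"] = cfg.Host
	}
	if cfg.Token != "" {
		out["DATABRICKS_TOKEN"] = cfg.Token
	}
	return out
}

func main() {
	fmt.Println(env(&config.Config{Host: "https://example.cloud.databricks.com"}))
}
```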

## Tests
Unit test.
2025-01-22 12:14:54 +00:00
Denis Bilenko e9902036b8
Set WorktreeRoot to sync root outside git repo (#2197)
## Changes
If git is not detected, set the default worktree root to the sync root.
Otherwise NewFileSet/View raises an error about the worktree root being
outside the view root in acceptance/bundle/sync-paths-dotdot.

This behavior was introduced in
https://github.com/databricks/cli/pull/1945

Stacked on https://github.com/databricks/cli/pull/2202

## Tests
Existing tests.
2025-01-22 10:50:13 +00:00
Ilya Kuznetsov c224be5c1f
Allow using variables in enum fields (#2199)
## Changes

It is possible to pass variables to enum fields, but the JSON schema
doesn't accept them. This PR adds `oneOf` for enum types that includes
the `${var-*}` pattern

## Tests

Manually checked in VSCode
2025-01-22 10:30:17 +00:00
Denis Bilenko 41bbd89257
Clean up unnecessary cleanup of inferred flag (#2193)
## Changes
The SelectTarget mutator (part of the Load phase) clears the
bundle.git.inferred flag, but the flag is not set until later, in the
Initialize phase (LoadGitDetails mutator).

## Tests
Existing tests.
2025-01-20 17:21:34 +00:00
Denis Bilenko ee4a4b4c24
Migrate quality_monitor_test.go to acceptance test (#2192) 2025-01-20 16:33:03 +00:00
Ilya Kuznetsov 84a73052d2
fix: Detailed message for using source-linked deployment with file_path specified (#2119)
## Changes

Resolves remaining comments from here
https://github.com/databricks/cli/pull/2046


[This](https://github.com/databricks/cli/pull/2046#discussion_r1907121844)
and
[this](https://github.com/databricks/cli/pull/2046#discussion_r1908928239)
are on hold until Pieter's response

## Tests
2025-01-20 16:16:51 +00:00
shreyas-goenka e6982d09ac
Add doc string for `bundle.uuid` (#2170)
Co-authored-by: Julia Crawford (Databricks) <julia.crawford@databricks.com>
2025-01-20 14:52:22 +00:00
shreyas-goenka 41a21af556
Refactor `bundle init` (#2074)
## Summary of changes
This PR introduces three new abstractions: 
1. `Resolver`: Resolves which reader and writer to use for a template.
2. `Writer`: Writes a template project to disk. Prompts the user if
necessary.
3. `Reader`: Reads a template specification from disk, built into the
CLI or from GitHub.

Introducing these abstractions helps decouple reading a template from
writing it. When I tried adding telemetry for the `bundle init` command,
I noticed that the code in `cmd/init.go` was getting convoluted and hard
to test. A future change could have accidentally logged PII when a user
initialised a custom template.

Hedging against that risk is important here because we use a generic
untyped `map<string, string>` representation in the backend to log
telemetry for `databricks bundle init`. Otherwise, we risk accidentally
breaking compliance with our centralization requirements.

### Details

After this PR there are two classes of templates that can be
initialized:
1. A `databricks` template: This could be a builtin template or a
template outside the CLI like mlops-stacks, which is still owned and
managed by Databricks. These templates log their telemetry arguments and
template name.
2. A `custom` template: These are templates created and managed by the
end user. For these templates we do not log the template name and args;
instead, a generic placeholder string of "custom" is logged in our
telemetry system.

NOTE: The functionality of the `databricks bundle init` command remains
the same after this PR. Only the internal abstractions used are changed.

## Tests
New unit tests. Existing golden and unit tests. Also a fair bit of
manual testing.
2025-01-20 12:09:28 +00:00
Andrew Nester 0c088d4050
Fixed an apps message order and added output test (#2174)
## Changes
Fixed an apps message order and added output test
2025-01-17 14:42:39 +00:00
Pieter Noordhuis 89eb556318
Migrate path translation tests to acceptance tests (#2122)
## Changes

The assertions previously made on the output are now captured in the `output.*`
files. These don't capture intent like actual assertions do, but we
still have regular test coverage in the path translation tests under
`bundle/config/mutator`.

## Tests

Tests pass.
2025-01-17 10:22:49 +00:00
Pieter Noordhuis 9061635789
Default to forward slash-separated paths for path translation (#2145)
## Changes

This came up in #2122 where relative library paths showed up with
backslashes on Windows. It's hard to run acceptance tests where paths
may be in either form. This change updates path translation logic to
always use forward slash-separated paths, including for absolute paths.
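
A minimal sketch of the normalization; `filepath.ToSlash` converts platform-native separators to forward slashes, which is the standard-library primitive this kind of change relies on.

```go
package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	p := filepath.Join("resources", "jobs", "my_job.yml") // native separators on Windows
	fmt.Println(filepath.ToSlash(p))                      // always "resources/jobs/my_job.yml"
}
```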

## Tests

* Unit tests pass.
* Confirmed that code where library paths are used uses the `filepath`
package for path manipulation. The functions in this package always
normalize their inputs to be platform-native paths.
* Confirmed that code that uses absolute paths works with forward
slash-separated paths on Windows.
2025-01-17 09:38:01 +00:00
Pieter Noordhuis 2cd0d88bdd
Format Python code with ruff (#2166)
## Changes

The materialized templates included in #2146 include Python code that we
require to be formatted. Instead of running ruff as part of the
testcase, we can enforce that all Python code in the repository is
formatted. It won't be possible to have a passing acceptance test for
template initialization with unformatted code.
2025-01-17 07:38:47 +00:00
Denis Bilenko 2e70558dc1
Resolve variables in a loop (#2164)
## Changes
- Instead of doing 2 passes of variable resolution, loop until there are
no more updates (or we reach a count of 100); see the sketch below.
- Stacked on top of #2163, which adds a regression test for this:
acceptance/bundle/variables/complex-transitive-deep
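
A minimal sketch of the fixed-point loop, assuming a hypothetical `resolveOnce` pass that reports whether it substituted anything; the cap of 100 comes from the commit message.

```go
package main

import "errors"

// resolveOnce is a hypothetical single resolution pass over the config;
// it reports whether any variable reference was substituted.
func resolveOnce(cfg map[string]string) (bool, error) { return false, nil }

// resolveAll loops until a pass makes no more updates, capped at 100
// iterations as a cycle guard.
func resolveAll(cfg map[string]string) error {
	for i := 0; i < 100; i++ {
		updated, err := resolveOnce(cfg)
		if err != nil {
			return err
		}
		if !updated {
			return nil
		}
	}
	return errors.New("variable resolution did not converge after 100 iterations")
}

func main() { _ = resolveAll(map[string]string{}) }
```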

## Tests
Existing tests, new regression tests.

These tests already passed before, added for completeness:
- acceptance/bundle/variables/cycle
- acceptance/bundle/variables/complex-cross-ref
2025-01-16 14:39:54 +00:00
shreyas-goenka f2bba632cb
Patch references to UC schemas to capture dependencies automatically (#1989)
## Changes
Fixes https://github.com/databricks/cli/issues/1977.  

This PR modifies the bundle configuration to capture the dependency that
a UC Volume or a DLT pipeline might have on a UC schema at deployment
time. It does so by replacing the schema name with a reference of the
form `${resources.schemas.foo.name}`.

For example:
The following UC Volume definition depends on the UC schema with the
name `schema_name`. This mutator converts this configuration

from:
```
resources:
  volumes:
    bar:
      catalog_name: catalog_name
      name: volume_name
      schema_name: schema_name

  schemas:
    foo:
      catalog_name: catalog_name
      name: schema_name
```

to:

```
resources:
  volumes:
    bar:
      catalog_name: catalog_name
      name: volume_name
      schema_name: ${resources.schemas.foo.name}

  schemas:
    foo:
      catalog_name: catalog_name
      name: schema_name
```
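
A hedged Go sketch of the substitution rule: if a volume's schema (with matching catalog) is defined as a schema resource in the same bundle, rewrite the name as a reference. The types are simplified stand-ins for the bundle config structs.

```go
package main

import "fmt"

type schema struct{ Catalog, Name string }

// patchSchemaName returns a ${resources.schemas.<key>.name} reference
// when a matching schema resource exists, else the literal name.
func patchSchemaName(catalog, name string, schemas map[string]schema) string {
	for key, s := range schemas {
		if s.Catalog == catalog && s.Name == name {
			return fmt.Sprintf("${resources.schemas.%s.name}", key)
		}
	}
	return name
}

func main() {
	schemas := map[string]schema{"foo": {Catalog: "catalog_name", Name: "schema_name"}}
	fmt.Println(patchSchemaName("catalog_name", "schema_name", schemas))
	// ${resources.schemas.foo.name}
}
```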


## Tests
Unit tests and manually.
2025-01-16 13:27:00 +00:00
Andrew Nester fa87f22706
Changed warning message for apps (#2165)
## Changes
Changed warning message for apps

Original warning message added here:
https://github.com/databricks/cli/pull/2161
2025-01-16 13:03:35 +00:00
Andrew Nester 98244606b3
Upgrade TF provider to 1.63.0 (#2162)
## Changes
No significant changes to call out for DABs.
2025-01-16 12:04:00 +00:00
Andrew Nester 8f34fc7961
Added output message to warn about slower deployments with apps (#2161)
## Changes
When users create one or more Databricks apps in their bundle, the
initial bundle deployment can be slower because apps need to wait until
their compute is fully provisioned and started.

This PR adds a message to warn about it. The message will be removed
when the `no_compute` option becomes available in the TF provider and is
used in DABs (https://github.com/databricks/cli/pull/2144)
2025-01-16 11:39:59 +00:00
Denis Bilenko b273dc5942
Enable linter 'copyloopvar' and fix the issues (#2160)
## Changes
- Remove all unnecessary copies of the loop variable; they have not been
necessary since Go 1.22 (https://go.dev/blog/loopvar-preview). See the
sketch below.
- Enable the linter that catches this issue:
https://github.com/karamaru-alpha/copyloopvar
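
A before/after sketch of what the linter flags: since Go 1.22, each loop iteration gets a fresh variable, so the manual copy is redundant.

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	for _, v := range []int{1, 2, 3} {
		v := v // redundant since Go 1.22; copyloopvar reports this line
		wg.Add(1)
		go func() {
			defer wg.Done()
			fmt.Println(v)
		}()
	}
	wg.Wait()
}
```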

## Tests
Existing tests.
2025-01-16 11:20:50 +00:00
Denis Bilenko 30dec59781
Improve resolution of complex variables within complex variables (#2157)
## Changes
- Remove ResolveVariableReferencesInComplexVariables; it blocked
complex-within-complex resolution for no good reason.
- Repeat regular resolution twice; this helps with a couple of test
cases we have.

There may be a case for running it 3 times or more in a loop, but there
is no test case for that, so this PR is a simple incremental improvement.

## Tests
Existing acceptance tests. Previously, all unit tests for complex
variables were converted to acceptance tests to capture this change and
ensure nothing breaks.
2025-01-15 18:03:43 +01:00
Denis Bilenko 39b03592d7
Migrate TestResolveComplexVariableWithVarReference (#2156)
This is the last test referencing
ResolveVariableReferencesInComplexVariables, allowing removal of that
mutator.
2025-01-15 17:52:17 +01:00
Denis Bilenko 581565a1c4
Migrate more variable tests to acceptance (#2154) 2025-01-15 15:59:42 +01:00
Andrew Nester dd554412a6
Retry app deployment if there is an active deployment in progress (#2153)
## Changes
If, before running an app, the app was stopped with an active deployment,
the Apps backend starts it and auto-deploys the last active deployment.
In such cases the StartApp API won't return any active or pending
deployments in its response, but doing the deploy immediately after the
start might result in the error `Cannot deploy app *** as there is an
active deployment in progress`.

From the DABs side, we have to do a new deployment on every `bundle run`
(the command which starts the app and does the deployment) because local
files in the bundle might have changed and users expect the app to run
with the new code.

Thus this PR works around the error by catching the "deployment in
progress" error, getting any active / pending deployments, waiting for
them to finish, and then starting the new deployment again. If the second
attempt fails, the whole command fails.
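
A hedged Go sketch of the retry flow; `deploy`, `waitForActiveDeployments`, and the error check are illustrative stand-ins for the real Apps API calls.

```go
package main

import (
	"context"
	"strings"
)

// Stubs standing in for the real Apps API calls.
func deploy(ctx context.Context) error                   { return nil }
func waitForActiveDeployments(ctx context.Context) error { return nil }

func isInProgressErr(err error) bool {
	return err != nil && strings.Contains(err.Error(), "active deployment in progress")
}

// deployWithRetry mirrors the flow above: on an "in progress" error,
// wait for running deployments to finish and try exactly once more.
func deployWithRetry(ctx context.Context) error {
	err := deploy(ctx)
	if err == nil || !isInProgressErr(err) {
		return err
	}
	if err := waitForActiveDeployments(ctx); err != nil {
		return err
	}
	return deploy(ctx) // a second failure fails the whole command
}

func main() { _ = deployWithRetry(context.Background()) }
```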

## Tests
Added unit test.
2025-01-15 12:51:06 +01:00
Denis Bilenko b76eee0e8c
Migrate resolution tests to acceptance tests (#2143) 2025-01-15 11:22:23 +01:00
dependabot[bot] 72e677d0ac
Bump github.com/databricks/databricks-sdk-go from 0.54.0 to 0.55.0 (#2126)
Bumps
[github.com/databricks/databricks-sdk-go](https://github.com/databricks/databricks-sdk-go)
from 0.54.0 to 0.55.0.
Release notes, sourced from [github.com/databricks/databricks-sdk-go's releases](https://github.com/databricks/databricks-sdk-go/releases):

## v0.55.0

### Internal Changes

- Bump staticcheck to 0.5.1 and add go 1.23 test coverage ([#1106](https://redirect.github.com/databricks/databricks-sdk-go/pull/1106)).
- Bump x/net, x/crypto dependencies ([#1107](https://redirect.github.com/databricks/databricks-sdk-go/pull/1107)).
- Create custom codeql.yml ([#1114](https://redirect.github.com/databricks/databricks-sdk-go/pull/1114)).
- Decouple serving and oauth2 package ([#1110](https://redirect.github.com/databricks/databricks-sdk-go/pull/1110)).
- Migrate workflows that need write access to use hosted runners ([#1112](https://redirect.github.com/databricks/databricks-sdk-go/pull/1112)).
- Move package credentials in config ([#1115](https://redirect.github.com/databricks/databricks-sdk-go/pull/1115)).
- Update Queries test ([#1104](https://redirect.github.com/databricks/databricks-sdk-go/pull/1104)).

### API Changes

- Added `NoCompute` field for [apps.CreateAppRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/apps#CreateAppRequest).
- Added `HasMore` field for [jobs.BaseJob](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseJob).
- Added `HasMore` field for [jobs.BaseRun](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#BaseRun).
- Added `PageToken` field for [jobs.GetJobRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#GetJobRequest).
- Added `HasMore` and `NextPageToken` fields for [jobs.Job](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Job).
- Added `HasMore` field for [jobs.Run](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/jobs#Run).
- Added `RunAs` field for [pipelines.CreatePipeline](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#CreatePipeline).
- Added `RunAs` field for [pipelines.EditPipeline](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#EditPipeline).
- Added `AuthorizationDetails` and `EndpointUrl` fields for [serving.DataPlaneInfo](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/serving#DataPlaneInfo).
- [Breaking] Changed `Update` method for [a.AccountFederationPolicy](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#AccountFederationPolicyAPI) account-level service with new required argument order.
- [Breaking] Changed `Update` method for [a.ServicePrincipalFederationPolicy](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#ServicePrincipalFederationPolicyAPI) account-level service with new required argument order.
- Changed `UpdateMask` field for [oauth2.UpdateAccountFederationPolicyRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateAccountFederationPolicyRequest) to no longer be required.
- Changed `UpdateMask` field for [oauth2.UpdateServicePrincipalFederationPolicyRequest](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/oauth2#UpdateServicePrincipalFederationPolicyRequest) to no longer be required.
- [Breaking] Changed `DaysOfWeek` field for [pipelines.RestartWindow](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#RestartWindow) to type [pipelines.DayOfWeekList](https://pkg.go.dev/github.com/databricks/databricks-sdk-go/service/pipelines#DayOfWeekList).

OpenAPI SHA: 779817ed8d63031f5ea761fbd25ee84f38feec0d, Date: 2025-01-08
Commits:

- `b83a7262d5` [Release] Release v0.55.0 ([#1117](https://redirect.github.com/databricks/databricks-sdk-go/issues/1117))
- `23d9c1ea2b` [Internal] Move package credentials in config ([#1115](https://redirect.github.com/databricks/databricks-sdk-go/issues/1115))
- `adc94cabf7` [Internal] Decouple serving and oauth2 package ([#1110](https://redirect.github.com/databricks/databricks-sdk-go/issues/1110))
- `83db3cbdab` [Internal] Create custom codeql.yml ([#1114](https://redirect.github.com/databricks/databricks-sdk-go/issues/1114))
- `2b55375727` [Internal] Migrate workflows that need write access to use hosted runners ([#1](https://redirect.github.com/databricks/databricks-sdk-go/issues/1)...)
- `03fb2681fa` [Internal] Bump x/net, x/crypto dependencies ([#1107](https://redirect.github.com/databricks/databricks-sdk-go/issues/1107))
- `28e1a698ab` [Internal] Bump staticcheck to 0.5.1 and add go 1.23 test coverage ([#1106](https://redirect.github.com/databricks/databricks-sdk-go/issues/1106))
- `2399d721fe` [Internal] Update Queries test ([#1104](https://redirect.github.com/databricks/databricks-sdk-go/issues/1104))
- See full diff in [compare view](https://github.com/databricks/databricks-sdk-go/compare/v0.54.0...v0.55.0)

Most Recent Ignore Conditions Applied to This Pull Request:

| Dependency Name | Ignore Conditions |
| --- | --- |
| github.com/databricks/databricks-sdk-go | [>= 0.28.a, < 0.29] |



---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Andrew Nester <andrew.nester@databricks.com>
2025-01-14 16:02:34 +00:00
Denis Bilenko 2b452973f3
Enable linter 'unconvert' and fix the issues found (#2136) 2025-01-14 10:56:38 +00:00
Pieter Noordhuis 5d9bc3b553
Allow artifact path to be located outside the sync root (#2128)
## Changes

We perform a check during path translation that the path being
referenced is contained in the bundle's sync root. If it isn't, it's not
a valid remote reference. However, this doesn't apply to paths that are
_always_ local, such as the artifact path. An artifact's build command
is executed in its path. Files created by the artifact build (e.g.
wheels or JARs) don't need to be in the sync root because they have a
dedicated and different upload path into `${workspace.artifact_path}`.

Therefore, this check that a path is contained in the bundle's sync root
doesn't apply to artifact paths. This change modifies the structure of
path translation to allow opting out of this check.

Fixes #1927.
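
A minimal sketch of the containment check and the opt-out for always-local paths; the `filepath.Rel`-based test is illustrative of the idea rather than the CLI's exact code.

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

func insideSyncRoot(root, path string) bool {
	rel, err := filepath.Rel(root, path)
	return err == nil && rel != ".." && !strings.HasPrefix(rel, ".."+string(filepath.Separator))
}

// validatePath skips the sync-root check for always-local paths such as
// artifact paths.
func validatePath(root, path string, alwaysLocal bool) error {
	if alwaysLocal {
		return nil
	}
	if !insideSyncRoot(root, path) {
		return fmt.Errorf("path %s is not contained in sync root %s", path, root)
	}
	return nil
}

func main() {
	fmt.Println(validatePath("/repo", "/elsewhere/dist/lib.whl", true))  // <nil>
	fmt.Println(validatePath("/repo", "/elsewhere/dist/lib.whl", false)) // error
}
```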

## Tests

* Existing and new tests pass.
* Manually confirmed that building and using a wheel built outside the
sync root path works as expected.
* No acceptance tests because we don't run build as part of validate.
2025-01-14 08:34:55 +00:00
Andrew Nester 913e10a037
Added support for Databricks Apps in DABs (#1928)
## Changes
Now it's possible to configure a new `app` resource in a bundle and point
it to the custom `source_code_path` location where the Databricks App
code is defined.

On `databricks bundle deploy`, DABs will create the app. All subsequent
`databricks bundle deploy` executions will update the existing app if
there are any updates.

On `databricks bundle run <my_app>`, DABs will execute the app
deployment. If the app is not started yet, it will start the app first.

### Bundle configuration

```
bundle:
  name: apps

variables:
  my_job_id:
    description: "ID of job to run app"
    lookup:
      job: "My Job"
  databricks_name:
    description: "Name for app user"
  additional_flags:
    description: "Additional flags to run command app"
    default: ""
  my_app_config:
    type: complex
    description: "Configuration for my Databricks App"
    default:
      command:
        - flask
        - --app
        - hello
        - run
        - ${var.additional_flags}
      env:
        - name: DATABRICKS_NAME
          value: ${var.databricks_name}

resources:
  apps:
    my_app:
      name: "anester-app" # required and has to be unique
      description: "My App"
      source_code_path: ./app # required and points to location of app code
      config: ${var.my_app_config}
      resources:
        - name: "my-job"
          description: "A job for app to be able to run"
          job:
            id: ${var.my_job_id}
            permission: "CAN_MANAGE_RUN"
      permissions:
        - user_name: "foo@bar.com"
          level: "CAN_VIEW"
        - service_principal_name: "my_sp"
          level: "CAN_MANAGE"

targets:
  dev:
    variables:
      databricks_name: "Andrew (from dev)"
      additional_flags: --debug
  
  prod:
    variables:
      databricks_name: "Andrew (from prod)"
```

### Execution
1. `databricks bundle deploy -t dev`
2. `databricks bundle run my_app -t dev`

**If app is started**
```
✓ Getting the status of the app my-app
✓ App is in RUNNING state
✓ Preparing source code for new app deployment.
✓ Deployment is pending
✓ Starting app with command: flask --app hello run --debug
✓ App started successfully
You can access the app at <app-url>
```

**If app is not started**
```
✓ Getting the status of the app my-app
✓ App is in UNAVAILABLE state
✓ Starting the app my-app
✓ App is starting...
....
✓ App is starting...
✓ App is started!
✓ Preparing source code for new app deployment.
✓ Downloading source code from /Workspace/Users/...
✓ Starting app with command: flask --app hello run --debug
✓ App started successfully
You can access the app at <app-url>
```

## Tests
Added unit and config tests + manual test.

```
--- PASS: TestAccDeployBundleWithApp (404.59s)
PASS
coverage: 36.8% of statements in ./...
ok      github.com/databricks/cli/internal/bundle       405.035s        coverage: 36.8% of statements in ./...
```
2025-01-13 16:43:48 +00:00
Denis Bilenko 1ead1b2e36
Move merge fix-ups after variable resolution (#2125)
## Changes
Move the mutator.Merge{JobClusters,JobParameters,JobTasks,PipelineClusters}
mutators after variable resolution. This helps with the case when a key
contains a variable.

@pietern mentioned here
https://github.com/databricks/cli/pull/2101#pullrequestreview-2539168762
it should be safe.

## Tests
The existing acceptance test that was capturing the bug is updated with
corrected output.
2025-01-13 13:01:31 +00:00
Lennart Kats (databricks) 3e40a0c2f1
Encourage the use of root_path in production to ensure single deployment (#1712)
## Changes

This updates `mode: production` to allow `root_path` to indicate
uniqueness. Historically, we required `run_as` for this, which isn't
actually very effective for that purpose. `run_as` also had the problem
that it doesn't work for pipelines.

This is a cherry-pick from https://github.com/databricks/cli/pull/1387

---------

Co-authored-by: Pieter Noordhuis <pcnoordhuis@gmail.com>
2025-01-13 12:19:12 +00:00
Gleb Kanterov f8f804fe17
PythonMutator: update instrumentation (#2124)
## Changes
Update instrumentation for PythonMutator to handle `experimental/python`
config.

## Tests
Unit tests
2025-01-13 09:16:29 +00:00